# Reconsidering Continuity: The Hidden Nature of 'Nothing'
## Chapter 1: The Myth of Continuity
In my previous article, I made a bold assertion: the concept of 'nothing' as a seamless and continuous entity does not exist—neither in theory nor in the physical world. This applies to space, time, and even the realm of real numbers. I discussed the shaky foundations of mathematics and how our cognitive biases can mislead us, leaving readers eager for an explanation of why this perspective is crucial for scientific advancement. In this piece, I will delve deeper into that notion and illustrate how it could potentially ignite a new revolution in our comprehension of the cosmos.
To embark on this intellectual journey, we must examine calculus, the highly esteemed mathematical discipline that scientists cherish, students dread, and many struggle to grasp. This powerful tool has been instrumental in shaping our models of physical reality since it was developed by Newton and Leibniz in the 17th century.
For a glimpse into the profound impact of calculus on civilization, one could turn to Steven Strogatz's remarkable book, Infinite Powers. It reveals how calculus has influenced our understanding of various phenomena—from fluid dynamics to economic models, from sound waves to the technology you are using to read this article. Its capabilities are truly extraordinary.
It's easy to conclude that calculus must reflect a fundamentally smooth universe, especially given its historical successes. This was indeed the prevailing belief among scientists in the 19th century. They assumed that matter existed in continuous forms, evident in their conviction that one could pour out half of a cup of water, then half of what remained, and so on forever, without ever exhausting the supply.
However, we now recognize that water, like all matter, is composed of atoms. That realization came remarkably late: the notion of continuous matter persisted stubbornly into the early 20th century, because the equations at scientists' disposal were beautifully simple and matched nearly all observable phenomena, leaving little reason to challenge the status quo.
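To put a number on how badly the halving intuition fails, here is a back-of-the-envelope estimate of my own (the 250 mL cup and the figures below are illustrative assumptions, not historical data): a cup of water holds roughly 8 × 10²⁴ molecules, so the halving game ends after only about 83 pours.

```python
import math

# Rough estimate: how many times can you halve a cup of water
# before you are down to a single molecule?
cup_grams = 250.0    # ~250 mL of water weighs ~250 g
molar_mass = 18.015  # grams per mole of H2O
avogadro = 6.022e23  # molecules per mole

molecules = cup_grams / molar_mass * avogadro
print(f"molecules in a cup: {molecules:.2e}")                         # ~8.4e24
print(f"halvings down to one molecule: {math.log2(molecules):.0f}")  # ~83
```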
A pivotal change occurred following Einstein's 1905 paper on Brownian motion, which prompted a reassessment of these continuous models. The resistance to atomism had dire consequences. Ludwig Boltzmann, whose statistical mechanics laid the groundwork for Einstein's insight, took his own life in 1906, a death often attributed in part to the years of harsh criticism he endured from contemporaries who ridiculed the notion of unseen, erratic particles. His groundbreaking contributions nevertheless paved the way for quantum mechanics, a field that would reshape our understanding of reality.
Quantum mechanics unequivocally reveals that certain natural quantities come in discrete amounts, such as the defined energy levels of electrons within atoms. Despite this acceptance, intellectual progress in this direction stagnated. Physics advanced only as far as quantum field theory, shaped in large part by Richard Feynman's work on quantum electrodynamics in the late 1940s, and little has shifted since.
This stagnation is unfortunate because the assumption of a continuous backdrop of space, which quantum field theory depends upon, is fundamentally flawed. Yes, you read that correctly. This framework, which is rooted in calculus and arguably the most successful scientific theory to date, does not withstand critical examination.
It's important to clarify that I am not a trained mathematician or physicist; rather, I am an enthusiastic amateur who has invested considerable time in these subjects. Feel free to dismiss my ideas or engage with them in the comments; I welcome informed dialogue as one of life's great pleasures.
To clarify further, calculus allows us to model systems that seem continuous by presuming that effects at minuscule scales are negligible. We essentially reduce these effects to zero and discard them mathematically. While this may sound peculiar, it is effective.
For instance, if you wish to determine the area of an irregular shape, you can approximate it by laying narrow planks across its surface and summing their areas. While this method might overlook some edges, the accuracy improves as the planks become thinner and more numerous, eventually revealing the correct area as their width approaches zero. This is the essence of integration; differentiation, its counterpart, measures how a function's output changes as its input varies, which is to say, the slope of its curve.
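Here is a minimal sketch of the plank method in code, an illustration of my own (the quarter circle y = √(1 − x²) is an arbitrary choice; its true area is π/4). As the planks narrow, the estimate closes in on the exact value:

```python
import math

def plank_area(f, a, b, n):
    """Approximate the area under f on [a, b] using n planks of equal width."""
    width = (b - a) / n
    # Sample each plank's height at its midpoint, then sum the plank areas.
    return sum(f(a + (i + 0.5) * width) * width for i in range(n))

quarter_circle = lambda x: math.sqrt(1.0 - x * x)

for n in (10, 100, 1000, 10000):
    estimate = plank_area(quarter_circle, 0.0, 1.0, n)
    print(f"{n:>6} planks: {estimate:.6f}   (exact: {math.pi / 4:.6f})")
```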
However, this process relies on avoiding what is known as a completed infinity. If we were to shrink the width of the planks to exactly zero, each plank would enclose zero area, and no sum of zero-area planks can recover the region; zero width times infinitely many planks is an indeterminate quantity, rendering the calculation meaningless. Instead, we must treat the width of the planks as arbitrarily small but never actually zero, employing a technique mathematicians refer to as taking a limit.
To take a limit, we observe how our area estimates evolve as we employ increasingly narrow planks in ever greater quantities. The estimates converge toward a single value, and that value, never actually attained at any finite stage, is what we declare the area to be.
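As a concrete worked case (standard textbook material, not original to this article), here is the limit behind the area under y = x² between 0 and 1, using n planks of width 1/n:

$$\int_0^1 x^2 \, dx \;=\; \lim_{n \to \infty} \sum_{i=1}^{n} \left(\frac{i}{n}\right)^2 \frac{1}{n} \;=\; \lim_{n \to \infty} \frac{n(n+1)(2n+1)}{6n^3} \;=\; \frac{1}{3}.$$

At no stage does the calculation handle infinitely many planks at once; it only asks what the finite estimates are heading toward.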
Nevertheless, this approach differs significantly from Georg Cantor's treatment of real numbers, which I discussed in my earlier article. In this scenario, the number of planks represents a countable infinity, as the limit process does not accommodate anything beyond that. We literally increase our plank count and analyze the function's behavior.
In Cantor's interpretation of continuity, every point on the number line is occupied by a number, yielding an uncountable infinity of them. This raises the possibility that calculus could return incorrect results for functions that look smooth yet fluctuate below the threshold of the computable numbers; let's refer to these as 'naughty functions.' If we were to develop an analogous system faithful to Cantor's approach in order to handle these naughty functions, we would require a fundamentally different methodology.
Thus, calculus is not inherently linked to 'true' continuity; rather, it serves as an approximation.
"Surely, Lebesgue integration addresses this!" a mathematician might interject. "These concepts have been explored in measure theory and are well-established!"
"What in the world is Lebesgue integration?" another voice might reply.
This technique measures areas by slicing horizontally rather than vertically: instead of partitioning the input axis into planks, it partitions the output axis and asks how much of the domain falls within each slice. Because the sets being measured can be uncountable, this permits a Cantor-style treatment. Yet it still grapples with the same issue. We remain within the realm of limit approximations, merely along a different axis. Measure theory handles uncountable sets by assigning them lengths wholesale, which is the best approach available, since uncountable sets lack many of the numerical properties that would make them workable point by point.
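For readers who want the standard formulation (again, textbook material rather than anything unique to this argument), the Lebesgue integral of a non-negative function can be written as a 'layer cake' of horizontal slices:

$$\int f \, d\mu \;=\; \int_0^\infty \mu\bigl(\{x : f(x) > t\}\bigr) \, dt,$$

where μ measures how much of the domain each horizontal slice covers. Notice that the outer integral over t is itself still evaluated as a limit: the axis has changed, but the limiting process has not.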
While Lebesgue integration can address some problematic discontinuous functions, such as the Cantor function, it still struggles with our naughty functions, which exhibit smooth fluctuations beneath the scale of computable values. The fundamental process employed to determine integrals remains unchanged.
Ultimately, what we are doing with calculus is examining the behavior of a system when the scale of its foundational elements is deemed negligible. We concentrate on proportional changes. This aligns with how calculus is utilized in practice. All the examples I mentioned earlier—sound waves, economies, fluids, and voltages—are not genuinely continuous; they simply represent systems that are vastly larger than the units that dictate their behavior. Nonetheless, calculus can describe them exceptionally well.
In fact, these are the only systems for which we have empirical evidence supporting the efficacy of calculus. There is no proof that calculus operates on actual continua, as such evidence is unattainable. This is acceptable, since, to my knowledge, no one has ever exhibited a naughty function; I'm uncertain how one might even articulate such a construct.
Thus, calculus serves as a tool for measuring large phenomena. For practical purposes, it is a method for mapping one set of computable numbers to another through well-behaved functions. It can only process computable numbers as inputs and yield computable numbers as outputs. It cannot accommodate non-computable constants. Furthermore, every operation it performs is itself a computation. It is an exceptional tool for describing extensive systems, but that is its limit. Our belief in calculus's connection to genuine continuity is a cultural construct we have adopted because it feels intuitively right, despite the lack of supporting evidence.
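To make the point concrete, here is a small sketch of my own (using Python's exact rational arithmetic, so nothing hides inside floating-point rounding): a derivative estimated by finite differences, where every intermediate value is a computable, indeed rational, number.

```python
from fractions import Fraction

def slope_estimate(f, x, h):
    """Finite-difference slope (f(x + h) - f(x)) / h, computed exactly."""
    return (f(x + h) - f(x)) / h

square = lambda x: x * x      # f(x) = x^2, so the true derivative is 2x

x = Fraction(1, 3)            # estimate the slope at x = 1/3
for k in range(1, 6):
    h = Fraction(1, 10 ** k)  # ever-smaller rational step sizes
    print(f"h = {h}: slope = {slope_estimate(square, x, h)}")

# The estimates equal 2/3 + h exactly and march toward the true answer 2/3;
# every single step is an ordinary computation on rational numbers.
```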
The only way to relate calculus to Cantor's real numbers is by asserting that it must inherently do so. If we could feed non-computable numbers into calculus, we would presumably receive other logical non-computable outcomes. However, we can make the same claim with all sorts of numerical types! I could invent a new category of numbers, super-fun numbers, in which every value corresponds to a distinct smiley face drawn from an uncountable collection, and which transform nicely under all differentiable functions. That would not imply that calculus inherently contains super-fun numbers within its framework.
Ultimately, this implies that physics reliant on calculus can only portray systems composed of computable elements. It exclusively addresses computable points; nothing else has ever entered its equations. Given this understanding, we can engage in some productive reasoning.
Either A: no physical theory will ever adequately model nature, because nature is a true continuum and no computable description can capture one. (There is no supporting evidence for this assertion.)
Or B: Physics does function (as we might hope), implying that there must be a means to describe space or spacetime using a countable set of components. If such a countable set exists, then some discrete representation of nature's smallest scales must always suffice.
If we adopt B, then either B1: there is a fundamental scale to nature, or B2: there is no single fundamental scale, yet for any given circumstance some finite scale suffices to characterize the phenomena, with the necessary representation shifting from situation to situation.
B2 suggests a universe in which a symbolic representation of nature, employing differential equations, is genuinely the best we can achieve. But how could this be realized? It implies a computable mechanism governing the universe that remains frustratingly unknowable. This represents a weak, perhaps even pathological, foundation from which to comprehend nature.
Conversely, if we embrace B1, we unlock a realm of possibilities. We can construct new models, establishing constraints on physical theories based on the behaviors permitted at the fundamental scale. This empowers us to relegate theories like string theory, which are built atop a smooth continuum, to the periphery of our candidate theories. It encourages us to concentrate on alternative systems, such as loop quantum gravity or the causal set program, with renewed confidence that they provide a more robust mathematical framework for understanding nature than their rivals.
We can confront the persistent divide between quantum mechanics and relativity that has endured for a century, recognizing that a resolution is necessary. This positions us for models of nature that can sustain realism, given that a smooth manifold-like backdrop to the universe must emerge from the system under investigation. It sets the stage for scientific triumph after decades of conceptual stagnation. All we need to do is discard our outdated 19th-century notions. In my next article, I will endeavor to clarify what I mean.
## Chapter 2: The Nature of Nothing and Its Mathematical Implications
The video "NOTHING: The Science of Emptiness" explores the philosophical and scientific interpretations of 'nothingness,' delving into its implications across various fields.