With all the buzz around space, space technology, the ISS, and SpaceX, I have been pondering more and more about my place in the universe. A friend of mine posted a great read on Facebook about intelligent life. It sparked a discussion… blah. Here is the article she posted – http://gizmodo.com/the-fermi-paradox-where-the-hell-are-the-other-earths-1580345495
I read this (the repost, that is) a while back, and being a developer I think it is a great read. I don’t necessarily subscribe to the “programmed” view of the nature of our reality, but considering we’re not even a Type I civilization, maybe this fellow is on to something.
Extrapolation is a technique for projecting a trend into the future. It has been used liberally by economists, futurists, and other assorted big thinkers for many years, to project population growth, food supply, market trends, singularities, technology directions, skirt lengths, and other important trends. It goes something like this:
If a city’s population has been growing by 10% per year for many years, one can safely predict that it will be around 10% higher next year, 21% higher in two years, and so on. Or, if chip density has been increasing by a factor of 2 every two years (as it has for the past 40), one can predict that it will be 8 times greater than today in six years (Moore’s Law). Ray Kurzweil and other Singularity fans extrapolate technology trends to conclude that our world as we know it will come to an end in 2045 in the form of a technological singularity. Of course, there are always unknown and unexpected events that can cause these predictions to be too low or too high, but given the information that is known today, it is still a useful technique.
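As a quick illustration, compound growth like this is a one-liner. The figures below are just the hypothetical ones from the examples above:

```python
def extrapolate(current, rate, years):
    """Project a value forward assuming a fixed annual growth rate."""
    return current * (1 + rate) ** years

# A city growing 10% per year: ~10% higher next year, ~21% in two.
print(extrapolate(100, 0.10, 1))  # ~110
print(extrapolate(100, 0.10, 2))  # ~121

def moores_law(years):
    """Density doubles every two years, so the factor is 2**(years/2)."""
    return 2 ** (years / 2)

print(moores_law(6))  # 8x after six years
```

The same two lines of arithmetic underlie every extrapolation in this post: pick a growth rate, raise it to the number of periods.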
To my knowledge, extrapolation has not really been applied to the problem that I am about to present, but I see no reason why it couldn’t give an interesting projection…
…for the nature of matter.
In ancient Greece, Democritus put forth the idea that solid objects were composed of atoms of that element or material, either jammed tightly together, as in the case of a solid object, or separated by a void (space). These atoms were thought to be little indivisible billiard-ball-like objects made of some sort of “stuff.” Thinking this through a bit, it was apparent that if atoms were spherical and crammed together in an optimal fashion, then matter was essentially 74% of the space that it takes up, the rest being empty space. So, for example, a solid bar of gold was really only 74% gold “stuff,” at most.
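The 74% figure is the densest possible packing of equal spheres (face-centered cubic), whose filling fraction is π/√18. A quick check:

```python
import math

# Densest packing of equal spheres (FCC/HCP) fills pi/sqrt(18) of space --
# the source of the "74%" figure for billiard-ball atoms.
packing_fraction = math.pi / math.sqrt(18)
print(f"{packing_fraction:.4f}")  # ~0.7405
```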
That view of matter was resurrected by John Dalton in the early 1800s and revised once J. J. Thomson discovered electrons. At that point, atoms were thought to look like plum pudding, with electrons embedded in the proton pudding. Still, the density of “stuff” didn’t change, at least until the early 1900s when Ernest Rutherford determined that atoms were actually composed of a tiny dense nucleus and a shell of electrons. Further measurements revealed that these subatomic particles (protons, electrons, and later, neutrons) were actually very tiny compared to the overall atom and, in fact, most of the atom was empty space. That model, coupled with a realization that atoms in a solid actually had to have some distance between them, completely changed our view on how dense matter was. It turned out that in our gold bar only 1 part in 10^15 was “stuff.”
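A back-of-the-envelope sketch of where that ratio comes from, assuming illustrative order-of-magnitude radii of about 10⁻¹⁰ m for an atom and 10⁻¹⁵ m for its nucleus:

```python
# Rough volume fraction of an atom occupied by its nucleus.
# Illustrative order-of-magnitude radii (assumptions, not measurements):
atom_radius = 1e-10      # meters, ~1 angstrom
nucleus_radius = 1e-15   # meters, ~1 femtometer

# Volume scales with the cube of the linear size,
# so a 1e5 size ratio becomes a 1e15 volume ratio.
fraction = (nucleus_radius / atom_radius) ** 3
print(fraction)  # ~1e-15: one part in 10^15 is "stuff"
```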
That was, until the mid-1960s, when quark theory was proposed, which said that protons and neutrons were actually composed of three quarks each. As the theory (aka QCD) is now fairly accepted and some measurement estimates have been made of quark sizes, one can calculate that since quarks are between a thousand and a million times smaller than the subatomic particles they make up, and volume scales as the cube of size, matter is now 10^9 to 10^18 times more tenuous than previously thought. Hence our gold bar is now only about 1 part in 10^30 (give or take a few orders of magnitude) “stuff,” the rest empty space. By way of comparison, about 1.3×10^32 grains of sand would fit inside the earth. So matter is roughly as dense with “stuff” as one grain of sand is to our entire planet.
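The cube-of-size step is worth seeing explicitly, since it is what turns a modest size ratio into those enormous tenuousness factors:

```python
# If quarks are 1e3 to 1e6 times smaller (linearly) than the nucleons
# they compose, the volume they occupy shrinks by the cube of that factor.
for size_ratio in (1e3, 1e6):
    volume_ratio = size_ratio ** 3
    print(f"{size_ratio:.0e} smaller -> {volume_ratio:.0e}x more tenuous")
```

Cubing 10³–10⁶ gives the 10⁹–10¹⁸ range, which pushes the gold bar from 1 part in 10¹⁵ down to roughly 1 part in 10³⁰.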
So now we have three data points to start our extrapolation. Since the percentage of “stuff” that matter is made of is shrinking exponentially over time, we can’t plot our trend on normal scales, but need to plot the fraction on a logarithmic scale.
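One way to sketch that extrapolation: fit a straight line to the logarithm of the “stuff” fraction against the year each model appeared. The dates and fractions below are rough assumptions pulled from the narrative above, so treat the result as a toy projection, not a prediction:

```python
import math

# Rough data points (year, fraction of matter that is "stuff") --
# dates and values are illustrative assumptions from the narrative.
points = [
    (1810, 0.74),    # Dalton-era billiard-ball atoms
    (1911, 1e-15),   # Rutherford's nuclear atom
    (1964, 1e-30),   # quark model
]

# Least-squares fit of log10(fraction) vs year: a straight line here
# means the fraction shrinks exponentially over time.
n = len(points)
xs = [year for year, _ in points]
ys = [math.log10(frac) for _, frac in points]
x_mean = sum(xs) / n
y_mean = sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# Extrapolate: at this pace, when does the fraction reach 1 part in 10^52?
year_1e52 = (-52 - intercept) / slope
print(f"~10^{slope:.2f} per year; hits 1e-52 around {year_1e52:.0f}")
```

The fit loses dozens of orders of magnitude per century, so the string-theory figure lands within roughly a century of today on this toy trend line.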
And now, of course, we have string theory, which says that all subatomic particles are really just bits of string vibrating at specific frequencies, each string possibly having a width of the Planck length. If so, that would make subatomic particles nearly all empty space, only about 1 part in 10^38 being “stuff,” leaving our gold bar with just 1 part in 10^52 of “stuff.”
Gets kind of ridiculous, doesn’t it? Doesn’t anyone see where this is headed?
In fact, if particles are made of strings, why do we even need the idea of “stuff”? Isn’t it enough to define the different types of matter by a single number – the frequency at which the string vibrates?
What is matter anyway? It is a number assigned to a type of object that has to do with how that object behaves in a gravitational field. In other words, it is just a rule.
We don’t really experience matter. What we experience is electromagnetic radiation influenced by some object that we call matter (visual), and the effect of the electromagnetic force rule: the repulsion of charges between the electron shells of the atoms in our fingers and those of the atoms in the object (tactile).
In other words, rules.
In any case, if you extrapolate our scientific progress, it is easy to see that the ratio of “stuff” to “space” is trending toward zero. Which means what?
That matter is most likely just data. And the forces that cause us to experience matter the way we do are just rules about how data interacts with itself.
Data and Rules – that’s all there is.
Oh yeah, and Consciousness.