Physics takes pride in being the foundational science—an anchor of mathematical laws that describe the functioning of the physical world and that relate one phenomenon to another.
Albert Einstein’s 1905 equation E = mc², for example, is elegant in its simple statement of the relationship between energy and mass via the square of the constant c, the speed of light. Because mass is the source of gravity, the general theory of relativity, which describes how gravity operates at cosmic distances, is deeply connected to this equation. If new observations and measurements over the past century had required regular tinkering with the equation by adding new factors and adjustments, we could easily conclude that the theory attached to it was at best incomplete.
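The simplicity the author praises can be made concrete. A minimal sketch, using the exact SI value of c (the numbers here are standard physical constants, not figures from the article):

```python
# Illustrative only: the energy equivalent of 1 kg of mass via E = m * c^2.
c = 299_792_458  # speed of light in m/s (exact, by SI definition)
m = 1.0          # mass in kilograms

E = m * c**2     # energy in joules
print(f"E = {E:.3e} J")  # roughly 9e16 joules from a single kilogram
```

One equation, one constant, no adjustable parameters: that is the standard of economy against which the article measures big bang cosmology.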
The term “theory” in science is lavished only on the big ideas that appear to have predictive power. Atomic, nuclear and even evolutionary theory are held in high regard, because each presents a worldview that is consistent with a wide range of observations. Unfortunately, as astrophysicist J.V. Narlikar states, “you can always fit theory to data by adding a large number of parameters.” But doing science this way is like a magician’s secret-number trick: by having an audience member apply a certain sequence of arithmetic operations to a secret number, the magician “amazingly” arrives at the hidden number every time.
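Narlikar’s warning has a precise mathematical face: a polynomial with as many free parameters as data points passes through any data set exactly, however the “observations” were generated. A minimal sketch (the data values below are arbitrary stand-ins, not real measurements):

```python
# With enough free parameters, any data can be "fit" perfectly.
# Lagrange interpolation builds a degree-(n-1) polynomial through
# n arbitrary points -- an exact fit with zero predictive power.
def lagrange_fit(xs, ys):
    """Return a polynomial function passing exactly through (xs, ys)."""
    def p(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)  # basis polynomial factor
            total += term
        return total
    return p

xs = [0, 1, 2, 3, 4]
ys = [2.3, -1.1, 0.7, 5.0, -3.4]   # arbitrary numbers standing in for data
p = lagrange_fit(xs, ys)
print([round(p(x), 6) for x in xs])  # reproduces every data point exactly
```

The fit is perfect by construction, which is exactly why it proves nothing: agreement with data bought by adding parameters is not evidence for the model.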
With regard to the big bang theory, research astronomer Tom Van Flandern finds it wanting on several fronts, but what concerns him most is its preservation through the invention of new mathematical factors or parameters. These, he believes, simply shave the square pegs of new observations to fit the round holes of the big bang model. Van Flandern argues that, like the magician who has established an arithmetic sequence that will invariably yield the answer he’s looking for, the big bang proponent makes the observation fit the theory through mathematical sleight of hand.
“Big bang models now use an ever increasing variety of free parameters to maintain consistency with various observational constraints,” he writes on his MetaResearch website. “Related to origin and expansion conditions alone, we now have the Hubble constant h (= expansion rate); the cosmological constant Λ (= pressure resisting gravity); the cosmic deceleration parameter q0 (= expansion deceleration); the density parameter Ω (= ratio of actual matter density to density needed for flat universe), subdivided into the density for ordinary matter and that for invisible dark matter; and the bias parameter b (= measures lumpiness of matter distribution). The hypothetical dark matter is itself a fudge factor required to obtain agreement with observations that were not in accord with big bang expectations. . . .
“If the field of astronomy were not presently over-invested in the expanding universe paradigm, it is clear that modern observations would now compel us to adopt a static universe model as the basis of any sound cosmological theory.”
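For readers unfamiliar with the density parameter Ω mentioned in the quoted list: in standard cosmology it is defined as the ratio of the actual matter density to the critical density ρ_c = 3H²/(8πG) required for a spatially flat universe. A minimal sketch using textbook constants (the Hubble constant value of 70 km/s/Mpc is a conventional round figure, not taken from the article):

```python
import math

# Critical density of the universe: rho_c = 3 H^2 / (8 pi G).
# Omega is then (actual matter density) / rho_c; Omega = 1 means flat.
G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
H0 = 70 * 1000 / 3.086e22    # ~70 km/s/Mpc converted to 1/s

rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(f"critical density ~ {rho_crit:.2e} kg/m^3")  # on the order of 1e-26
```

Note that Ω itself inherits the uncertainty of H0 through this definition, which is part of why Van Flandern counts it among the theory’s adjustable knobs.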