|Unity and Disunity in Science|
The term unity covers a wide range of loosely connected ideas in science, analyzed differently by different interpreters. Generally, these are expressions, or echoes, of the idea that science can succeed in providing one consistent, integrated, simple, and comprehensive description of the world.
This entry will provide a historical perspective on such ways of thinking about unity in science. (Readers should bear in mind that the real history is much more complex and interesting than the following microsketch, which is intended only to introduce the leading ideas.)
The scientific revolution of the seventeenth century involved consolidation of the “mechanical (or corpuscularian) philosophy” according to which natural phenomena are to be understood in terms of shaped matter in motion, with the natural world likened to a giant mechanism.
|Mechanisms and Laws|
Natural philosophy could look for unity in this regard by thinking of the parts of the world machine as all governed by the same simple set of rules or laws. Isaac Newton’s mechanics could be seen in this regard as a paradigm of unification, showing how the same laws covered motion in both the heavens and on Earth.
But there was a monkey wrench in this mechanist paradigm: Newton’s law of gravity involved “action at a distance,” which most seventeenth-century interpreters rejected as a legitimate mechanical principle. Mechanism required contact action. Newton’s official response was that “I feign no hypotheses” (hypotheses non fingo), that is, no hypotheses or speculations about what the underlying real mechanism of gravity might be.
Instead, he presented his mechanics as “mathematical only,” that is, mathematical principles by which motions can be reliably and accurately described but with no pretense to describing what makes things move as they do.
Accordingly, some of Newton’s successors thought of unity in theory and in science in terms of a simple set of general, mathematical laws that integrate, by covering, a wide range of phenomena that otherwise might seem independent, and all this without any thought of underlying mechanisms. This will be referred to as the “nomological attitude.”
These two ideas, seeing disparate phenomena as manifestation of one underlying mechanism or covered by one set of simple laws, interacted and intertwined during the eighteenth and nineteenth centuries.
For example, James Clerk Maxwell set out to treat electric and magnetic effects and then discovered he could also cover optical phenomena. He first thought of all of these as manifestations of one underlying mechanism, developed the laws that might govern such a mechanism, and then let go of the postulated underlying mechanism as unverifiable speculation in favor of the general laws that had emerged.
Heinrich Rudolf Hertz maintained that Maxwell’s theory just is Maxwell’s equations, and eventually Albert Einstein’s special relativity did away with the speculated stuff of electromagnetic mechanisms, the luminiferous aether.
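The equations Hertz singled out can be stated without reference to any mechanical medium, which is just his point. In modern vector notation (the standard vacuum, SI-unit form, not Maxwell’s or Hertz’s own notation):

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0,
\]
\[
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J}
  + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
```

Nothing in these laws mentions an aether; the fields themselves carry the whole explanatory burden.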
The opposition of mechanisms versus laws also played out, with the opposite result, during the second half of the nineteenth century over the issue of atoms. The predictive and explanatory success of chemistry, as well as the nascent kinetic theory (statistical mechanics), emboldened some to see atoms and molecules as real cogs in the cosmic machine.
Others scoffed at postulation of things too small to see or individually detect as “metaphysics,” not science. Continuum mechanics and even contact action presented severe problems for an atomistic theory. The speculated indivisibility of atoms, though mentioned by some, was not really the issue.
Rather, it was whether one could correctly think of the underlying order in terms of discrete parts interacting in something like the mechanist tradition or whether this should be seen, at best, as a kind of pretty imaginative picture, while scientific truth was exhausted by mathematical laws in the nomological tradition.
The issue of atoms came to a head in the first decade of the twentieth century in the work augmented and integrated by Jean-Baptiste Perrin. Perrin catalogued the astonishingly numerous and diverse facts that could be encompassed by postulating atoms: constant ratios in chemistry, relative atomic weights, diffusion and other fluctuation phenomena, osmotic pressure, the behavior of electrolytes, specific heats, the behavior of thin materials, even why the sky is blue. Perrin tabulated sixteen independent ways of reaching the same estimate of Avogadro’s number.
Einstein’s theory of Brownian motion proved especially effective—in a sense one could “see” the causal effects of individual molecular collisions. A vast range of otherwise diverse observable phenomena were unified in the sense of interpreting them as the manifestation of the properties and behavior of atoms. By 1913 most of the physics community accepted atoms as real.
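To illustrate one of these routes to Avogadro’s number: Einstein’s 1905 analysis ties the observable mean squared displacement of a suspended particle to molecular quantities through what is now called the Stokes–Einstein relation (given here in a standard textbook form, not Perrin’s own notation):

```latex
\langle x^2 \rangle = 2Dt, \qquad
D = \frac{RT}{N_A}\,\frac{1}{6\pi\eta r}
\quad\Longrightarrow\quad
N_A = \frac{RT\,t}{3\pi\eta r\,\langle x^2 \rangle},
```

where \(\eta\) is the fluid’s viscosity, \(r\) the particle’s radius, \(T\) the temperature, and \(R\) the gas constant. Measuring the jiggling of visible particles under a microscope thus yields a value for \(N_A\), a number about invisible molecules.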
Electric, magnetic, and optical phenomena were unified by Maxwell’s laws; Perrin’s diverse phenomena were unified by the postulation of atoms. Though they are in some ways polar attitudes, mechanistic and nomological thinking really cannot operate without one another.
To provide unifying explanations, mechanisms need to be governed by laws; and laws, if they are to do more than exhaustively list superficially observable phenomena, must at least have the form of describing some conceptually more economical structure.
The nineteenth century saw explosive development of the natural sciences, emboldening some toward the end of the century to speculate that physics was almost completed with little left to do but to work out the applications to other natural phenomena.
Contrary to what one might have imagined, the shocks of relativity and quantum mechanics in the first quarter of the twentieth century initially encouraged rather than tempered such scientific utopian attitudes.
Some strands of positivism in the second quarter of the century described unity of science in terms of unity of language and methods; others took the spirit of unification to its logical extreme, emphasizing axiomatic formulation and developing the idea of reduction of all natural phenomena to “fundamental physics” in the spirit of the logicists’ hope of reducing all of mathematics to logic.
By the 1950s and 1960s reductionistic thinking had taken a deep hold on much thinking in both philosophy and science, no doubt encouraged by advances within science in subjects such as quantum chemistry and microbiology.
Unity now took the form of (expected) chains of reductive definitions, identifying not just complex physical, but biological, psychological, and social phenomena with the behavior of physical parts, everything ultimately to be described in terms of the laws of fundamental physics.
Again a monkey wrench, or this time two, brought the reductionist juggernaut to a halt. In the 1970s and 1980s philosophy of science became acutely aware of difficulties with the whole reductionist program. The reversal began with the collapse of its two showcases: the claimed deductive reduction of thermodynamics to statistical mechanics and of Mendelian to molecular genetics.
Temperature is in fact realized by mechanisms in addition to mean kinetic energy, and in principle could be realized in indefinitely many ways. There is no neat one trait–one gene correlation, and the developmental effects of any one bit of DNA depend not just on its genetic context but on its overall environmental context.
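The classic identification behind the claimed reduction holds for a monatomic ideal gas in classical statistical mechanics:

```latex
\left\langle \tfrac{1}{2} m v^{2} \right\rangle = \tfrac{3}{2} k_B T
\quad\Longleftrightarrow\quad
T = \frac{m \langle v^{2} \rangle}{3 k_B},
```

where \(k_B\) is Boltzmann’s constant. The force of the multiple-realizability objection is that this identification is specific to that kind of system: the temperature of a solid, a photon gas, or a spin system is realized by quite different microphysical quantities.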
If temperature and genes are multiply realizable by disparate physical constructs, then surely also, for example, are mental states. Higher level objects and phenomena may still all be physically realized, but in such diverse ways that the program of reduction by definitions and deduction loses plausibility. Unity no longer seems such an apt term.
This first basis for some kind of disunity was followed in the 1980s and 1990s by a second. Nancy Cartwright, Ronald N. Giere, and others have pointed out that, whatever the ultimate aims of science or of some scientists might be, the science we actually have, now or any time in the foreseeable future, hardly follows the pattern of calculation of phenomena from universally applicable, exact, true laws or of description in terms of mechanisms known or even believed to operate exactly as described.
Rather, science uses laws in the construction of idealized models, always limited in scope, and even where they apply never exactly correct. Rather than providing descriptions that set out exactly what the phenomena are, the laws of science are only true, or at least only exactly true, of the idealized models that in turn enable us to understand phenomena and their hidden sources in terms of the idealizations to which the phenomena are similar.
For the puny minds of even the best physicists, to understand the fluid properties of water we need to resort to continuum hydrodynamical models, while to understand dispersive phenomena we turn to the discontinuous models of statistical mechanics.
“Foundational” theories fare no better. Quantum field theory and general relativity each idealize away from the phenomena of the other, are mutually inconsistent, and have no humanly accessible direct application to most phenomena of human interest. The science we have displays disunity on a grand scale.
Or does it? Few would dispute the claims just listed about science and idealized models. But many challenge the interpretation of these facts as constituting disunity in any weighty sense. Since unity and disunity have no well-established univocal usage and are susceptible to expropriation as rhetorical weapons by advocates of one or another larger position, we have difficulty in saying just what the issue really is, let alone in resolving it.
Yet there are interesting and important issues here, issues that, it is suggested here, we do not understand at all well. For elaboration let us, with hindsight, revisit the unification afforded by the postulation of atoms.
None of the phenomena described as manifestations of the existence and behavior of atoms follows from the bare postulation of atoms alone. We require assumptions not only about the properties and behavior of atoms but also, for many of the phenomena, about a great deal else.
The accounts based on the postulation of atoms hardly constitute the deductions imagined by the reductionists. Rather, they work, often fortuitously, by appealing to a helter-skelter of plausible assumptions, phenomenological observations, disconnected results from other accounts, and a wide range of approximative mathematical methods and experimental techniques from independently practiced fields.
Nonetheless, all these accounts have at their core the assumption that material is composed of relatively stable and discrete parts with properties that admit of systematic investigation. In all the admittedly disunified messy process of science the postulation of atoms is doing real and systematic work—we would not have this body of accounts without the postulation of atoms.
This kind of intertheoretic asymmetry occurs broadly. Quantum theory plays a role in understanding chemistry that chemistry does not play in understanding quantum theory, and similarly for chemistry and biology, biology and psychology, and many other pairs of theories and theoretical domains. Clearly, such asymmetry has to do with the circumstance that parts of an object or process play crucial roles in the behavior of the containing whole.
But the nature of such intertheoretic relations is not yet at all clearly understood; reductionism was a vast oversimplification. The mirage of some kind of simple unity was the artifact of imagining that the human mind could get its head around all of the natural world, exactly and, at least potentially, in all its detail.
This will not happen, at least not until long after this encyclopedia has become hopelessly out of date. In the meantime we face the interesting challenge of charting the complex interplay of elements of unity and disunity in the science we know.