We have discussed inductive and deductive science; now here is more to chew on: how excess instrumentation and data gathering can make science spin its wheels.
Philip Ball, a professional writer on science issues, discusses in sequence the new expansion of instrumentation and its impact on science.
Just a few tidbits from this thoughtful and insightful discussion.
. . . The tools of science are so specialised that we accept them as a kind of occult machinery for producing knowledge. We figure that they must know how it all works. Likewise, histories of science focus on ideas rather than methods — for the most part, readers just want to know what the discoveries were. Even so, most historians these days recognise that the relationship between scientists and their instruments is an essential part of the story. It isn’t simply that the science is dependent on the devices; the devices actually determine what is known. You explore the things that you have the means to explore, planning your questions accordingly.
Then a fascinating tale of the great Rutherford and his lab glassblower.
Take the work of the New Zealand physicist Ernest Rutherford, perhaps the finest experimental scientist of the 20th century. It was at a humble benchtop with cheap, improvised equipment that he discovered the structure of the atom, then proceeded to split it. Rather than being limited by someone else’s view of what one needed to know, Rutherford devised an apparatus to tell him precisely what he wanted to find out. His experiments emerged organically from his ideas: they almost seem like theories constructed out of glass and metal foil.
In one of his finest moments, at Manchester University in 1908, Rutherford and his colleagues figured out that the alpha particles spewed out during radioactive decay were the nuclei of helium atoms. The natural way to test the hypothesis is to collect the particles and see if they behave like helium. Rutherford ordered his glassblower, Otto Baumbach, to make a glass capillary tube with extraordinarily thin walls such that the alpha particles emitted from radium could pass right through. Once the particles had accumulated in an outer chamber, Rutherford connected up the apparatus to become a gas-discharge tube. As electrodes converted the atoms in the gas into charged ions, they would emit light at a wavelength that depended on their chemical identity. Thus he revealed the trapped alpha particles to be helium, disclosed by the signature wavelength of their glow. It was an exceedingly rare example of a piece of apparatus that answers a well-defined question — are alpha particles helium? — with a simple yes/no answer, almost literally by whether a light switches on or not.
In sequence, Ball talks of the data overload and “industrialization” of research that is driven by instruments; how Bacon himself was overcome by empiricism; and how biological (for example, genomic) and high-energy physics (think CERN) research is data/instrument driven, with scientific discovery and hypothesis generation in some cases trailing slowly behind the production of mountains of data.
We creep forward in our understanding. For example, we know far less about the genome than the hype would suggest.
Popper emphasized falsifiability and experimentation to test the reliability of a premise, and Ball's discussion looks at the problem of too much data and not enough analysis, with an ample review of the tension between induction and deduction. Inductive science has a role in the process, but we must avoid becoming slaves to data collection. It's still about finding truth and testing theories, yet also about properly framing the question so we can use the data wisely.
Cargo Cult Science forgets caution and skepticism in favor of intellectual passion and sometimes outright fallacious and biased work. As Feynman said, scientists have to be their own most severe critics, which starts with prudence and integrity and rides on honesty, a most important social virtue.
Like any theory, Popper's has to be taken with some caveats.
http://aeon.co/magazine/nature-and-cosmos/science-is-becoming-a-cult-of-hi-tech-instruments/
Many problems stem from the bizarre psychological bias of assuming that a thing did not exist prior to being observed. The most notable recent example, of course, is the hole in the ozone layer. How would we have known to be frightened of it without the new equipment revealing its existence?
Then there’s NASA’s approach to dealing with the problem of data overload: just discard (or accidentally lose) the data, at least if it is inconvenient.
Consider sea ice. Most of the graphs of sea ice extent you’ll see start with Nimbus-7 data in 1979. But Nimbus-5, Nimbus-6, and Seasat-1 all made sea ice measurements via passive microwave radiometry prior to 1979.
We still have good quality Nimbus-5 ESMR (passive microwave) measurement data of sea ice from December 11, 1972 through May 16, 1977, but NASA discarded or lost the Nimbus-6 and Seasat-1 measurements of sea ice extent. (Nimbus 6 was active from June 1975 until March, 1983; Seasat-1 was active from June 1978 until October 1978.)
Nimbus-5’s ESMR instrument continued to operate in a degraded mode through March 1983, but the 1977–1983 data doesn’t seem to be available online; perhaps it has been discarded or lost, too.
The inconvenient truth is that those early satellite measurements showed that sea ice extent was increasing, and peaked in the late 1970s, a fact which was reported in the first two IPCC Climate Assessment Reports, but has been “scrubbed” from the last three:
http://citebite.com/e1u8r0l1m6tnb
http://tinyurl.com/SAR-seaice-79peak
I can’t help wondering whether NASA is less careful to preserve data which contradicts the CAGW narrative than data which supports it.