Thank you for excellent reader responses to eyeonmiami. One reader noted the failure of scientific models to form the basis of public policies necessary to protect our health, welfare, and environment.
One of the unintended legacies of keystone federal legislation like the Clean Water Act, NEPA, and the Clean Air Act has been the deterioration of judgment among elected officials beholden to a campaign finance system that rewards predetermined outcomes funded by industry and special-interest contributions.
While there have been documented achievements in improving the nation's air and water, a closer examination of the balance sheet for our quality of life and environment looks like the same obscure, indecipherable gibberish as an annual report from Enron.
Consider, just for one second, the manifest failures of the US Army Corps of Engineers, despite one lawsuit after another to influence a demented permitting process and planning missions that turn the best minds and technologies into Rube Goldberg-style designs.
Science, based on models that are open to interpretation and litigation, is a convenient crutch for a crippled democracy.
The nation's premier science organizations, like the USGS or the National Academy of Sciences, are diligent to a fault in avoiding any contamination of their work by political interference. But the result of their studies, produced at great expense, is just the beginning of a process in which the political elite use science for their own purposes.
In addition to expense, then, there is also the matter of time.
The most overt example is the Bush White House red-lining and editing EPA draft climate change reports to keep Exxon Mobil and the fossil fuel lobby in place, as top predators. Literally decades have been lost while carbon dioxide emissions have poured into the atmosphere, absent any coherent US policy based on precaution.
But there are literally thousands of examples of how science is used as a substitute for judgment.
To this point, one of our readers submitted the following: "Why Mathematical Models Just Don't Add Up". It's worth reading...
Why Mathematical Models Just Don't Add Up
By ORRIN H. PILKEY and LINDA PILKEY-JARVIS
Assurances by scientists that the outcome of nature's dynamic processes can be predicted by quantitative mathematical models have created the delusion that we can calculate our way out of our environmental crises. The common use of such models has, in fact, damaged society in a number of ways.
For instance, the 500-year-old cod fishery in the Grand Banks, off Newfoundland, was destroyed by overfishing. That happened in large part because politicians, unable to make painful decisions on their own to reduce fishing and throw thousands of people out of work, shielded themselves behind models of the cod population — models that turned out to be faulty. Now the fish are gone, and so are the jobs.
Other predictive models led administrators in the U.S. Bureau of Land Management — who did not fully understand the models — to permit open-pit mines that could eventually become sources of serious pollution. A 1998 article in Time referred to one of the mines as a "giant cup of poison."
Yet another model, called the Bruun Rule, is used to predict how much a shoreline will retreat as sea level rises. Although the model has no discernible basis in reality, it continues to be cited around the world because no other model even attempts to answer that important question.
And the Army Corps of Engineers uses quantitative mathematical models to predict how long an artificial beach will last, even though that prediction clearly requires knowing when the next big storm will occur. Of course, no one has that knowledge far ahead of time.
The predictions about artificial beaches involve obviously absurd models. The reason coastal engineers and geologists go through the futile exercise is that the federal government will not authorize the corps to build beaches without a calculation of the cost-benefit ratio, and that requires a prediction of the beaches' durability. The practice continues despite many failed predictions, which are explained away as cases of unusual and unexpected storms. No one ever admits that the models were flawed. Would we accept such excuses from engineers who built a bridge that collapsed?
At the other end of the spectrum are the models used by the Intergovernmental Panel on Climate Change, established by the World Meteorological Organization and the United Nations Environment Programme to predict the future of global climate change. In great detail, the IPCC lists and evaluates the assumptions, uncertainties, and simplifications behind the predictive models in its publications.
That level of openness is laudable, and the models have considerable value as qualitative indicators: Combined with field observations, they make it clear that global warming is upon us, that we are in part responsible for it, and that we need to reduce our output of carbon dioxide. But the models, like other academic and applied quantitative models, produce inaccurate quantitative predictions. Because of the many factors involved, accurately predicting the outcome of natural processes on the surface of the earth is impossible.
To use a simple example, a common quantitative mathematical model involves predicting the amount of sand that will be transported annually by waves breaking on a beach. In our new book, Useless Arithmetic, we list 49 surf-zone parameters, of varying importance, that might contribute to sand transport. The model uses only six of the most important parameters. But even if we could include all 49, and even if we understood them and their interactions fully, we could never know the order in which they will occur on a given beach over the next year, or the year after that. We can never know precisely the direction from which the next storm's waves will come, their intensity, or their duration.
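To make the authors' point concrete, here is a hedged sketch of what such a quantitative transport formula looks like in practice. The function name and the coefficient k are illustrative placeholders, not the corps' actual calibrated values; the form loosely follows CERC-style longshore transport formulas, which use breaking wave height and wave approach angle:

```python
import math

def longshore_transport_cubic_yards(wave_height_m: float,
                                    approach_angle_deg: float,
                                    k: float = 1.0e6) -> float:
    """Toy quantitative model: annual sand transport estimated from only
    two of the 49 surf-zone parameters (breaking wave height and wave
    approach angle). The coefficient k is an illustrative placeholder,
    not a calibrated engineering value."""
    return k * wave_height_m ** 2.5 * math.sin(2 * math.radians(approach_angle_deg))

# The decimal precision of the output is illusory: the real answer depends
# on the sequence of future storms, which no one can know in advance.
print(longshore_transport_cubic_yards(1.2, 10.0))
```

The sketch illustrates the mismatch the authors describe: a formula this simple yields a single confident number, while the parameters it omits (and the unknowable ordering of future storms) dominate the real outcome.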
Furthermore, quantitative modeling cannot improve in any basic way, or become more accurate with time and experience. It is a problem that cannot be solved with even the most sophisticated mathematics. We call the problem "ordering complexity."
Ordering complexity is also why so-called hindcasting — demonstrating the validity of a model by using it to "predict" an event that has already occurred — does not work. Because we cannot solve the problem of ordering complexity, a quantitative model's successful "prediction" of the past has little bearing on the same model's ability to predict the future.
Qualitative models, on the other hand, can be very useful. Returning to the example of predicting sand transport on beaches, one need worry only about the major processes that are always important, such as wave height, angle of wave approach to the shoreline, and the size of the grains of beach sand. One can ignore the other parameters, even though they will sometimes be important. Thus the answer is never expected to be precise. Qualitative modelers attempt to determine the net direction of sand transport and perhaps the order of magnitude of its volume: Is it large (on the scale of a million cubic yards of sand per year) or small (closer to 10,000 cubic yards of sand per year)?
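The qualitative approach described above can be sketched as a function that reports only a net direction and a rough order of magnitude. This is a toy illustration under my own assumptions; the thresholds, direction labels, and function name are not the authors':

```python
def qualitative_transport(net_volume_cubic_yards: float) -> str:
    """Qualitative answer: net direction plus order of magnitude only,
    with no pretense of precision. Thresholds here are illustrative,
    matching the large (~1,000,000 yd^3/yr) vs. small (~10,000 yd^3/yr)
    scales mentioned in the text."""
    direction = "downcoast" if net_volume_cubic_yards >= 0 else "upcoast"
    magnitude = abs(net_volume_cubic_yards)
    if magnitude >= 1_000_000:
        scale = "large (on the order of a million cubic yards per year)"
    elif magnitude <= 10_000:
        scale = "small (on the order of 10,000 cubic yards per year)"
    else:
        scale = "intermediate"
    return f"net {direction}, {scale}"
```

The design choice is the point: by refusing to emit a precise number, a qualitative model cannot create the false confidence the authors blame for failed beach and fishery predictions.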
Quantitative models generally answer the questions of where, when, and how much: How much will the sea level rise in the next 100 years? If we reduce carbon-dioxide output, when will the temperatures begin to go down?
Qualitative models answer the questions of how, why, and what if: How will sea level change in the next 100 years — will it rise or fall? If we reduce carbon-dioxide output, will temperatures fall?
Even though qualitative models produce better results, our society as a whole remains overconfident about quantitative modeling. Accustomed to firm predictions — even if they turn out to be wrong — people find qualitative models insufficient.
In 2004 a federal court declared that high-level radioactive waste could not be stored at the Yucca Mountain Nuclear Waste Repository until the site could be certified to be safe for a million years. Over the next million years at Yucca Mountain, there will be several ice ages, other vast climate changes, and perhaps an earthquake or two and even a volcanic eruption. No one can guarantee that the radioactive waste would remain in the repository through those events, or predict where it might flow if it escaped, or how much damage it would do to people in the region.
Yet we have come to the point where mathematical models that cannot accurately predict the outcomes of natural processes are widely used and accepted without question by our society. Such models are considered to be state of the art. Surely the federal court would not have imposed such an impossible requirement at Yucca Mountain if it had not been convinced by scientists that models could accurately predict the future.
We suggest applying the embarrassment test. If it would be embarrassing to state out loud a simplified version of a model's parameters or processes, then the model cannot accurately portray the process.
In the example of beach-sand transport, quantitative modelers assume that all waves are the same wavelength, all waves come from the same direction, only the highest one-third of each wave moves sand, and the size of sand grains and shape of the beach remain constant. A scientist who stated those assumptions in a public lecture would be hooted off the podium. But buried deep within a model, such absurdities are considered valid.
Mathematical models are wooden and inflexible compared with the beautifully complex and dynamic nature of the earth. In the 1960s and 1970s — with the arrival of powerful computers, governmental requirements for environmental-impact statements, and widespread applications of mathematical models — scientists thought that quantitative models would be the bridge to a better, more secure future in our relationship with the environment. But they have proved to be a bridge too far.
We now know that there are no precise answers to many of the important questions we must ask about the future of human interaction with our planet. We must use more-qualitative ways to answer them.
Predictive quantitative models should be relegated to the dustbin of failed ideas.
Orrin H. Pilkey is an emeritus professor of geology at Duke University's Nicholas School of the Environment and Earth Sciences. Linda Pilkey-Jarvis is manager of the preparedness section of the spills program of Washington State's Department of Ecology. They are co-authors of Useless Arithmetic: Why Environmental Scientists Can't Predict the Future (Columbia University Press, 2007).
http://chronicle.com
Section: The Chronicle Review
Volume 53, Issue 38, Page B12
3 comments:
we have great readers, don't we?
Since many religious leaders and national politicians do not believe in evolution and think the earth is 6,000 years old, is it any wonder that real scientific studies and evidence are dismissed when they do not support religious dogma and governmental policies?
The whole religious thing makes me feel very uncomfortable. I have a BA in a religious field and I don't admit it. Too much bad stuff has come out of religion. It is painful for me when people use GOD and religion to argue a point. GOD doesn't need us to defend him/her. He/she is fully capable of enforcing his/her own wishes, one way or another. And I don't want anyone else speaking for me when it comes to determining GOD's intent.