Saturday, May 26, 2007

Nuclear Power and The Real Estate Bust by Geniusofdespair


Memorial Day Weekend and nothing to do and drawing a blank on what to write about. The Herald ran its usual story on the bad real estate market — it was in the business section this time. Esslinger-Wooten-Maxwell reported that unsold inventory for condos in Miami-Dade is at 53%; I am glad I dumped mine when this downturn started. The advice in the article: Sellers must lower their prices if they want to sell.

The Herald reported ("Nuclear plant eyed for Dade") on a third nuclear reactor proposed at Turkey Point, which would join the two reactors already there, along with the oil-fired and gas-fired plants on the site. If it is such a good idea, why does FP&L need 9 lobbyists to push it?

2 comments:

Anonymous said...

Given the major life-sustaining issues we are trying to solve in Miami-Dade, it is really important that we closely examine all the expert studies used to support these huge and expensive projects. A recent article from The Chronicle of Higher Education, reproduced below, speaks to this very issue.
---------------------

Why Mathematical Models Just Don't Add Up
By ORRIN H. PILKEY and LINDA PILKEY-JARVIS

Assurances by scientists that the outcome of nature's dynamic processes can be predicted by quantitative mathematical models have created the delusion that we can calculate our way out of our environmental crises. The common use of such models has, in fact, damaged society in a number of ways.

For instance, the 500-year-old cod fishery in the Grand Banks, off Newfoundland, was destroyed by overfishing. That happened in large part because politicians, unable to make painful decisions on their own to reduce fishing and throw thousands of people out of work, shielded themselves behind models of the cod population — models that turned out to be faulty. Now the fish are gone, and so are the jobs.

Other predictive models led administrators in the U.S. Bureau of Land Management — who did not fully understand the models — to permit open-pit mines that could eventually become sources of serious pollution. A 1998 article in Time referred to one of the mines as a "giant cup of poison."

Yet another model, called the Bruun Rule, is used to predict how much a shoreline will retreat as sea level rises. Although the model has no discernible basis in reality, it continues to be cited around the world because no other model even attempts to answer that important question.
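For readers unfamiliar with it (the formula below is standard textbook material, not quoted from the article), the Bruun Rule is usually written as:

```latex
% Bruun Rule, in its common textbook form:
%   R = shoreline retreat
%   S = sea-level rise
%   L = cross-shore width of the active beach profile
%   B = berm (dune) height
%   h = closure depth
R = \frac{S \cdot L}{B + h}
```

The criticism is that the equilibrium-profile assumptions packed into this simple ratio rarely hold on real coasts, yet the single number R is routinely treated as a prediction.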

And the Army Corps of Engineers uses quantitative mathematical models to predict how long an artificial beach will last, even though that prediction clearly requires knowing when the next big storm will occur. Of course, no one has that knowledge far ahead of time.

The predictions about artificial beaches involve obviously absurd models. The reason coastal engineers and geologists go through the futile exercise is that the federal government will not authorize the corps to build beaches without a calculation of the cost-benefit ratio, and that requires a prediction of the beaches' durability. The practice continues despite many failed predictions, which are explained away as cases of unusual and unexpected storms. No one ever admits that the models were flawed. Would we accept such excuses from engineers who built a bridge that collapsed?

At the other end of the spectrum are the models used by the Intergovernmental Panel on Climate Change, established by the World Meteorological Organization and the United Nations Environment Programme to predict the future of global climate change. In great detail, the IPCC lists and evaluates the assumptions, uncertainties, and simplifications behind the predictive models in its publications.

That level of openness is laudable, and the models have considerable value as qualitative indicators: Combined with field observations, they make it clear that global warming is upon us, that we are in part responsible for it, and that we need to reduce our output of carbon dioxide. But the models, like other academic and applied quantitative models, produce inaccurate quantitative predictions. Because of the many factors involved, accurately predicting the outcome of natural processes on the surface of the earth is impossible.

To use a simple example, a common quantitative mathematical model involves predicting the amount of sand that will be transported annually by waves breaking on a beach. In our new book, Useless Arithmetic, we list 49 surf-zone parameters, of varying importance, that might contribute to sand transport. The model uses only six of the most important parameters. But even if we could include all 49, and even if we understood them and their interactions fully, we could never know the order in which they will occur on a given beach over the next year, or the year after that. We can never know precisely the direction from which the next storm's waves will come, their intensity, or their duration.

Furthermore, quantitative modeling cannot improve in any basic way, or become more accurate with time and experience. It is a problem that cannot be solved with even the most sophisticated mathematics. We call the problem "ordering complexity."

Ordering complexity is also why so-called hind casting — demonstrating the validity of a model by using it to "predict" an event that has already occurred — does not work. Because we cannot solve the problem of ordering complexity, a quantitative model's successful "prediction" of the past has little bearing on the same model's ability to predict the future.
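The ordering problem can be illustrated with a toy calculation (the numbers and processes here are entirely hypothetical, not taken from the book). When one event is multiplicative, such as a storm that erodes a fraction of whatever sand is present, and another is additive, such as a calm season that deposits a fixed volume, the same two events leave a different beach depending on which comes first:

```python
# Toy illustration of "ordering complexity" (hypothetical numbers):
# the same two events produce different end states depending on order.

def storm(volume):
    """A storm erodes 30% of whatever sand is present (multiplicative)."""
    return volume * 0.70

def calm_season(volume):
    """A calm season deposits a fixed 20,000 cubic yards (additive)."""
    return volume + 20_000

start = 100_000  # cubic yards of sand on a hypothetical beach

storm_first = calm_season(storm(start))  # 100,000 * 0.7 + 20,000 = 90,000
calm_first = storm(calm_season(start))   # (100,000 + 20,000) * 0.7 = 84,000

print(storm_first, calm_first)
```

Knowing the parameters perfectly is not enough; without knowing the sequence of events in advance, the quantitative answer stays out of reach.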

Qualitative models, on the other hand, can be very useful. Returning to the example of predicting sand transport on beaches, one need worry only about the major processes that are always important, such as wave height, angle of wave approach to the shoreline, and the size of the grains of beach sand. One can ignore the other parameters, even though they will sometimes be important. Thus the answer is never expected to be precise. Qualitative modelers attempt to determine the net direction of sand transport and perhaps the order of magnitude of its volume: Is it large (on the scale of a million cubic yards of sand per year) or small (closer to 10,000 cubic yards of sand per year)?

Quantitative models generally answer the questions of where, when, and how much: How much will the sea level rise in the next 100 years? If we reduce carbon-dioxide output, when will the temperatures begin to go down?

Qualitative models answer the questions of how, why, and what if: How will sea level change in the next 100 years — will it rise or fall? If we reduce carbon-dioxide output, will temperatures fall?

In spite of the fact that qualitative models produce better results, our society as a whole remains overconfident about quantitative modeling. Accustomed to firm predictions — even if they turn out to be wrong — people find qualitative models insufficient.

In 2004 a federal court declared that high-level radioactive waste could not be stored at the Yucca Mountain Nuclear Waste Repository until the site could be certified to be safe for a million years. Over the next million years at Yucca Mountain, there will be several ice ages, other vast climate changes, and perhaps an earthquake or two and even a volcanic eruption. No one can guarantee that the radioactive waste would remain in the repository through those events, or predict where it might flow if it escaped, or how much damage it would do to people in the region.

Yet we have come to the point where mathematical models that cannot accurately predict the outcomes of natural processes are widely used and accepted without question by our society. Such models are considered to be state of the art. Surely the federal court would not have imposed such an impossible requirement at Yucca Mountain if it had not been convinced by scientists that models could accurately predict the future.

We suggest applying the embarrassment test. If it would be embarrassing to state out loud a simplified version of a model's parameters or processes, then the model cannot accurately portray the process.

In the example of beach-sand transport, quantitative modelers assume that all waves are the same wavelength, all waves come from the same direction, only the highest one-third of each wave moves sand, and the size of sand grains and shape of the beach remain constant. A scientist who stated those assumptions in a public lecture would be hooted off the podium. But buried deep within a model, such absurdities are considered valid.

Mathematical models are wooden and inflexible compared with the beautifully complex and dynamic nature of the earth. In the 1960s and 1970s — with the arrival of powerful personal computers, governmental requirements for environmental-impact statements, and widespread applications of mathematical models — scientists thought that quantitative models would be the bridge to a better, more secure future in our relationship with the environment. But they have proved to be a bridge too far.

We now know that there are no precise answers to many of the important questions we must ask about the future of human interaction with our planet. We must use more-qualitative ways to answer them.

Predictive quantitative models should be relegated to the dustbin of failed ideas.

Orrin H. Pilkey is an emeritus professor of geology at Duke University's Nicholas School of the Environment and Earth Sciences. Linda Pilkey-Jarvis is manager of the preparedness section of the spills program of Washington State's Department of Ecology. They are co-authors of Useless Arithmetic: Why Environmental Scientists Can't Predict the Future (Columbia University Press, 2007).

http://chronicle.com
Section: The Chronicle Review
Volume 53, Issue 38, Page B12

Anonymous said...

There is absolutely no need for nuclear power in the US because there is a simple mature technology available that can deliver huge amounts of clean energy without any of the headaches of nuclear power.

I refer to 'concentrating solar power' (CSP), the technique of concentrating sunlight using mirrors to create heat, and then using the heat to raise steam and drive turbines and generators, just like a conventional power station. It is possible to store solar heat in melted salts so that electricity generation may continue through the night or on cloudy days. This technology has been generating electricity successfully in California since 1985 and currently provides power for about 100,000 Californian homes. CSP plants are now being planned or built in many parts of the world.

CSP works best in hot deserts, and it is feasible and economic to transmit solar electricity over very long distances using highly efficient HVDC transmission lines. With transmission losses at about 3% per 1000 km, solar electricity may be transmitted to anywhere in the US. A recent report from the American Solar Energy Society says that CSP plants in the southwestern states of the US "could provide nearly 7,000 GW of capacity, or about seven times the current total US electric capacity".
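The 3%-per-1000-km figure is easy to sanity-check. A minimal sketch (the 3,000 km example distance and the multiplicative compounding of losses are my assumptions, not the commenter's):

```python
# Back-of-envelope HVDC transmission loss, assuming ~3% loss per 1000 km
# (the per-1000-km figure comes from the comment above; compounding the
# loss multiplicatively over each 1000 km segment is my assumption).

def delivered_fraction(distance_km, loss_per_1000_km=0.03):
    """Fraction of sent power that arrives after distance_km of HVDC line."""
    return (1 - loss_per_1000_km) ** (distance_km / 1000)

# A hypothetical 3,000 km line (roughly desert Southwest to East Coast):
print(f"{delivered_fraction(3000):.3f}")  # about 0.913, i.e. ~9% total loss
```

Even over continental distances, the compounded loss stays under 10%, which is the basis for the claim that desert CSP could serve the whole country.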

In the 'TRANS-CSP' report commissioned by the German government, it is estimated that CSP electricity, imported from North Africa and the Middle East, could become one of the cheapest sources of electricity in Europe, including the cost of transmission. A large-scale HVDC transmission grid has also been proposed by Airtricity as a means of optimising the use of wind power throughout Europe.

Further information about CSP may be found at www.trec-uk.org.uk and www.trecers.net. Copies of the TRANS-CSP report may be downloaded from www.trec-uk.org.uk/reports.htm. The many problems associated with nuclear power are summarised at www.mng.org.uk/green_house/no_nukes.htm.