Sunday, March 31, 2013

AN EXXON VALDEZ EVERY YEAR

From The Telegraph


Tim Blair

Sunday, March 31, 2013 (12:54am)

According to the American Wind Energy Association, some 500 birds hit the rotors in the US every single day: 
The National Wind Coordinating Collaborative, a collaboration of government officials, conservationists and industry representatives, more accurately estimates, based on actual data collected from over 100 wind farms nationally, the loss to be 200,000 birds annually. 
That’s close to the number of birds estimated to have been killed by the 1989 Exxon Valdez oil spill – but wind fans believe this level of avian munching to be “most benign”.
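
As a quick arithmetic check (our gloss, not the Telegraph's), the two figures are consistent with each other:

```python
# Sanity check (not in the original post): does the NWCC's estimate of
# 200,000 bird deaths a year square with "some 500 birds every single day"?
print(200_000 / 365)  # ~548 per day, in line with the daily figure
```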


From The LA Times


Searing questions

Experts debate risks to birds, planes and motorists from what will be the largest solar plant of its kind.



At what temperature might a songbird vaporize?
Will the glare from five square miles of mirrors create a distraction for highway drivers?
Can plumes of superheated air create enough turbulence to flip a small airplane?
What happens if one of the Air Force's heat-seeking missiles confuses a solar power plant with a military training target?
No one knows for sure. But as state and federal governments push hard to build solar energy plants across the Mojave Desert -- there are more than 100 solar applications pending -- the military, birders, aviation officials and others are eager for answers.
When completed, a massive plant now under construction near the California-Nevada border will be the largest of its kind in the world. More than 170,000 garage-door-sized mirrors will spread across this broad valley. Every 10 seconds, computers will align the mirrors -- each equipped with its own GPS device -- to track the sun across the desert sky, bouncing radiation to the tops of three 45-story towers. Water stored inside the towers will be heated to 1,000 degrees, creating steam to generate power.
The project's whiz-bang technology has confounded government regulators' ability to analyze the facility, in part because nothing of its type and size exists anywhere else in the world.
Although it approved the permit for the plant, known as Ivanpah, in 2010, the California Energy Commission struggled to assess the public health effects that would be created by the vast field of mirrors and the volumes of hot air pushed upward by spinning turbines and condensers.
Much of the analysis came from computer modeling, most of it provided by the project owners, Oakland-based BrightSource Energy.
In extensive hearings before the Energy Commission, the firm argued that concerns about its plant were overblown and that the project posed no danger to the public. The power plant -- one of dozens being fast-tracked by the Interior Department -- is slated to open early next year.
Others have their doubts.
"It's an experiment on a grand scale," said Jeffrey Lovich, a scientist with the U.S. Geological Survey.
Most questions begin with birds, which almost certainly will die at Ivanpah, just as they do at many large outdoor industrial operations. There is already documentation linking solar power to bird deaths.
About 30 years ago, ornithologist Robert McKernan and a colleague conducted studies at the Solar One plant near Barstow. By collecting and analyzing bird carcasses, they found that some birds flying through the solar field were incinerated outright. Others perished after their feathers were singed or burned off, or when they collided with the mirror structures or the central tower.
That plant, which began producing power in 1982, had 1,800 mirrors. The Ivanpah facility has nearly 100 times that number and occupies a significantly larger portion of creosote habitat critical to migrating birds.
But BrightSource officials contend that there is less risk to birds soaring above Ivanpah, in part because the reflected heat at the new plant there will be one-third as intense as at Solar One.
Birds aren't the only flying objects at risk.
The Defense Department has expressed concern about large-scale solar plants' compatibility with aviation and weapons training at the Mojave region's nine military installations.

The test pilot school at Edwards Air Force Base said the most common problems are a result of "electromagnetic intrusion/reflection, vertical obstruction, frequency spectrum overlap, infrared footprint and glint/glare."
Maj. John G. Garza, who represents the Pentagon on a California renewable energy planning group, said potential conflicts with solar plants in the desert are not yet fully understood.
One worrisome possibility?
"The solar tower would be a heat source," Garza said. "A heat-seeking missile could confuse the source, and instead of going to a target on the range, it would go to the tower."
A buffer zone between artillery ranges and solar installations could guard against that scenario. But Garza said no one yet knows how much space would be required.
One known aviation hazard results from the plants' use of high-powered exhaust fans for steam turbines, which can create plumes of superheated air that rise skyward.
Small planes are especially vulnerable.
On the approach to the Blythe airport, for example, aircraft often fly through such superheated air from a fossil fuel power plant at the end of the runway -- causing them to buck and veer off course.
"If you hit a plume dead center, you have one wing in and one wing out of it. It would flip an airplane in a heartbeat," said Pat Wolfe, who operated the Blythe airport for 20 years.
Wolfe said he took complaints to the Federal Aviation Administration and the state Energy Commission. "They didn't care," he said. "The information the power companies gave the Energy Commission was computer-generated, non-peer-reviewed. It was a joke."

Saturday, March 30, 2013

NY’S COMMON CORE-ALIGNED LESSONS USE SCIENTOLOGY VIDEOS TO TEACH STUDENTS THEY HAVE RIGHT TO FOOD, HOUSING, CLOTHING, MEDICINE, EVEN A JOB

From The Blaze

Update: Two small adjustments to this story were made to clarify the relationship between Common Core and school districts.

A parent in upstate New York is claiming there is some disturbing information being taught to his child as a result of a Common Core-aligned lesson on government and human rights. (Common Core is the controversial standards initiative being advocated by the federal government.)
The latest example, he says, is that his daughter and her classmates are being taught a section on the 30 “universal human rights” declared by the United Nations in 1948. Those rights include:
• The right to a nationality, and to change that nationality whenever you want to.
• The right to a job for everyone who wants one.
• The right to “social security” (to be taken care of by the government when you cannot do it yourself).
• The right to food, clothing, housing and medicine.
• The right to work and join a union. (One of the rights also states that you cannot be compelled to join an association.)
• The right to play.
The father of a 5th grade student who attends public school emailed TheBlaze to alert us to the lessons being taught to his child. The dad (who happens to be an attorney) emailed his daughter’s teacher to question her about the lessons and received a call from the school’s principal to address his concerns. He said he had a phone conversation with that principal, who revealed that:
• The lessons are tied to the Common Core guidelines found on EngageNY.
• The U.N.’s Universal Declaration of Human Rights was being taught for about an hour each day over an eight-week period.
• The U.S. Constitution was part of another eight-week “government” section; however, only three weeks were spent studying it.
• The principal believes that most public schools in the state are using this program as well as others from Common Core.
In a subsequent phone conversation with TheBlaze, the father — who asked to remain anonymous to protect his child’s identity — added that the school’s principal was not happy about the curriculum mandates, but was powerless to do anything about it. All of the decisions and directions came from the state.
All the “right” things
The video that triggered the man’s initial email to the school is a 9:30 “documentary” titled, “What Are Human Rights?”
The webpage that features the above historical explanation of the U.N.’s Universal Declaration of Human Rights also contains a few other sections worth mentioning. There are 30 additional short, public-service-style videos, one each on the U.N.’s “rights.”
In this clip, called “Universal Rights,” a man is being threatened with execution and declares, “I have rights!” The bad guys laugh at him, ask him where he thinks he is and prepare to kill him. That is until a United Nations SUV rides into the frame and the bad guys immediately drop their weapons. One SUV with a U.N. logo is enough to convince these blood-thirsty killers to change their ways:
In addition to the “Universal Rights” clip, there are others. For example, one video teaches students that the U.N.’s declaration also includes a “right to work.” Under that right are four basic rules:
  1. Everyone has the right to work, to free choice of employment, to just and favourable conditions of work and to protection against unemployment.
  2. Everyone, without any discrimination, has the right to equal pay for equal work.
  3. Everyone who works has the right to just and favourable remuneration ensuring for himself and his family an existence worthy of human dignity, and supplemented, if necessary, by other means of social protection.
  4. Everyone has the right to form and to join trade unions for the protection of his interests.
If “work” is a human right, many would argue that the State has a responsibility to create work for those who do not have it.
Also among the 30 rights is the right to food and shelter. The UN defines this right very specifically:
  1. Everyone has the right to a standard of living adequate for the health and well-being of himself and of his family, including food, clothing, housing and medical care and necessary social services, and the right to security in the event of unemployment, sickness, disability, widowhood, old age or other lack of livelihood in circumstances beyond his control.
  2. Motherhood and childhood are entitled to special care and assistance. All children, whether born in or out of wedlock, shall enjoy the same social protection.
According to this right, the government would be expected to provide just about anything a person needs from cradle to grave.
The videos’ curious origins
As we researched the videos featured on the U.N.’s website and included in the Common Core-aligned homework assignment, we noticed a couple of unusual things.
  • The producer of these videos was the Church of Scientology.
  • The clips have been online since 2008; most of them were uploaded on April 3 of that year.
The non-profit Church of Scientology posted the following statement under one of the clips:
Central to our beliefs in Scientology is a conviction that all humankind is entitled to inalienable rights. So it is that for more than 50 years, Scientologists have championed the Universal Declaration of Human Rights. Today, the Church of Scientology sponsors the largest non-governmental information campaign to make the Universal Declaration of Human Rights known the world over. To date, the Church has brought its campaign to more than 600 million.
When you watch the clips via Scientology’s YouTube channel, the church has disabled the commenting section under the videos, which is slightly ironic considering rights No. 18 and No. 19 — freedom of thought and freedom of expression.
Separation of church and state?
The short summary of the story isn’t hard to follow:
• A public school in New York state is teaching an eight-week course in the U.N.’s Universal Declaration of Human Rights while the U.S. Constitution is only taught for three weeks.
• A concerned father says he talked to the local principal who told him he believes the lesson is used widely in other schools.
• Within the assignments for the U.N. study section, students are instructed to watch a video produced by the Church of Scientology and are offered additional short videos about each of the Human Rights. Our source says that his daughter watched “15 or 16 of the 30 videos in classrooms.” All of these videos are also products of Scientology.
In our discussion with the concerned father, we asked if there is a potential legal issue with a church providing educational materials for public schools. In other words, is Scientology’s involvement a possible violation of the oft-cited separation of church and state? While he was intrigued by that possibility, he was more concerned over the lesson’s political push.
Those concerns aren’t unfounded. On the same page as the first video we featured is an option to take part in a petition to implement the U.N.’s Declaration of Human Rights.
The attorney’s concern here is that a specific competing political policy (from the U.N.) is being taught in greater detail than the U.S. Constitution, and that students are then being called to action and asked to sign a petition supporting a list of rights that have no legal standing in our country.
Since the school’s principal insisted that he had no say in the matter of how Common Core was used, we contacted the media spokesperson for the New York State Education Department and sent a short list of five questions. We were referred to a basic information website for Common Core information and told, “I’ll see what I can do about answering them today; in the meantime, though, take a look at our recently-issued Common Core field memo.” No further information was sent prior to publication of this story.

From CATO


Look at No Child Left Behind. See, No Federal Control. Wait…


In what is either a case of blinders-wearing or just poor timing, today the Fordham Institute’s Kathleen Porter-Magee has an article on NRO, co-written with the Manhattan Institute’s Sol Stern, in which she and Stern take to task national curriculum standards critics who assert, among other things, that the Common Core is being pushed by President Obama. Yes, that’s the same Kathleen Porter-Magee who, it was announced a couple of days ago, will serve on a federal “technical review” panel to evaluate federally funded tests that go with the Common Core.
The ironic timing of the article alone is probably sufficient to rebut arguments suggesting that the Common Core isn’t very much a federal child. Still, let’s take apart a few of the specifics Porter-Magee and Stern offer on the federal aspect. (Other Core critics, I believe, will be addressing contentions about Common Core content).
Some argue that states were coerced into adopting Common Core by the Obama administration as a requirement for applying for its Race to the Top grant competition (and No Child Left Behind waiver program). But the administration has stated that adoption of “college and career readiness standards” doesn’t necessarily mean adoption of Common Core. At least a handful of states had K–12 content standards that were equally good, and the administration would have been hard-pressed to argue otherwise.
Ah, the power of parsing. While it is technically correct that in the Race to the Top regulations the administration did not write that states must specifically adopt the Common Core, it required that states adopt a “common set of K-12 standards,” and defined that as “a set of content standards that define what students must know and be able to do and that are substantially identical across all States in a consortium.” How many consortia met that definition at the time of RTTT? Aside, perhaps, from the New England Common Assessment Program, only one: the Common Core.
NCLB waivers, for their part, gave states an additional option – having their state college systems confirm state standards as “college and career ready” – but that came after RTTT had already pushed states to adopt Common Core, and offered only a single alternative. That’s probably why, to use Stern and Porter-Magee’s own words, “President Obama often tries to claim credit” for widespread adoption of the Core. He actually had a lot to do with it!
As for states having “equally good” standards somehow being able to get past RTTT commonality demands, well, that’s just not how it works. The rules were the rules, and states didn’t just get out of them by saying “I dare you to act like our standards aren’t super.”
Education policymaking — and 90 percent of funding — is still handled at the state and local levels. And tying strings to federal education dollars is nothing new. No Child Left Behind — George W. Bush’s signature education law — linked federal Title I dollars directly to state education policy, and states not complying risked losing millions in compensatory-education funding (that is, funding for programs for children at risk of dropping out of school).
This is a very curious, self-defeating argument. Basically, Porter-Magee and Stern are asserting that the Feds only supply a small fraction of education money, and yet all states got sucked into No Child Left Behind. Applied to Common Core, federal money needn’t be very large in percentage terms to be irresistible, illustrating the very point about compulsion that Stern and Porter-Magee hope to refute. And it’s not hard to see why relatively small bombs of federal money pack a big punch: Taxpayers – who live in states – had no choice about paying their federal taxes, and no matter how they look in relative terms, millions or billions of federal dollars seem like mammoth sums in most news stories.
Perhaps the clearest evidence that states can still set their own standards is the fact that five states have not adopted Common Core. Some that have adopted it might opt out, and they shouldn’t lose a dime if they do.
It’s true that five states have not fully signed on to Common Core (Minnesota has adopted the language arts, but not math, portions), but that’s likely in large part because Race to the Top did not put annual funding on the line, and waivers had a non-Core option. But forty-five states did sign on, suggesting that the push was still very forceful. And it is irrelevant whether Porter-Magee and Stern think that states that opt out shouldn’t lose a dime of federal money. The reality is that those that have opted out did lose a full chance to win Race to the Top money, and if Common Core and accompanying tests are made central to a reauthorized NCLB – and why wouldn’t they be, since almost every state has adopted them – then annual funding would be put at risk. Which is what Common Core supporters have probably wanted since before the Obama administration existed; in 2008 they wrote that the job of the federal government is to furnish “incentives” for state adoption of “common core” standards.
So please, do look at NCLB when thinking about possible federal control of the Common Core. It’s a clarion alarm about what’s likely coming.



Twenty-year hiatus in rising temperatures has climate scientists puzzled

From The Australian





DEBATE about the reality of a two-decade pause in global warming and what it means has made its way from the sceptical fringe to the mainstream.
In a lengthy article this week, The Economist magazine said if climate scientists were credit-rating agencies, then climate sensitivity - the way climate reacts to changes in carbon-dioxide levels - would be on negative watch but not yet downgraded.
Another paper published by leading climate scientist James Hansen, the head of NASA's Goddard Institute for Space Studies, says the lower-than-expected temperature rise between 2000 and the present could be explained by increased emissions from burning coal.
For Hansen the pause is a fact, but it's good news that probably won't last.
Intergovernmental Panel on Climate Change chairman Rajendra Pachauri recently told The Weekend Australian the hiatus would have to last 30 to 40 years "at least" to break the long-term warming trend.


But the fact that global surface temperatures have not followed the expected global warming pattern is now widely accepted.
Research by Ed Hawkins of the University of Reading shows surface temperatures since 2005 are already at the low end of the range of projections derived from 20 climate models, and if they remain flat, they will fall outside the models' range within a few years.
"The global temperature standstill shows that climate models are diverging from observations," says David Whitehouse of the Global Warming Policy Foundation.
"If we have not passed it already, we are on the threshold of global observations becoming incompatible with the consensus theory of climate change," he says.
Whitehouse argues that whatever has happened to make temperatures remain constant requires an explanation because the pause in temperature rise has occurred despite a sharp increase in global carbon emissions.
The Economist says the world has added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010, about one-quarter of all the carbon dioxide put there by humans since 1750. This mismatch between rising greenhouse gas emissions and not-rising temperatures is among the biggest puzzles in climate science just now, The Economist article says.
"But it does not mean global warming is a delusion."
The fact is temperatures between 2000 and 2010 are still almost 1C above their level in the first decade of the 20th century.
"The mismatch might mean that for some unexplained reason there has been a temporary lag between more carbon dioxide and higher temperatures in 2000-2010.
"Or it might mean that the 1990s, when temperatures were rising fast, was the anomalous period."
The magazine explores a range of possible explanations including higher emissions of sulphur dioxide, the little understood impact of clouds and the circulation of heat into the deep ocean.
But it also points to an increasing body of research that suggests it may be that climate is responding to higher concentrations of atmospheric carbon dioxide in ways that had not been properly understood before.
"This possibility, if true, could have profound significance both for climate science and for environmental and social policy," the article says.
There are now a number of studies that predict future temperature rises as a result of man-made carbon dioxide emissions at well below the IPCC best estimate of about 3C for a doubling of atmospheric carbon dioxide.
The upcoming IPCC report is expected to lift the maximum possible temperature increase to 6C.
The Research Council of Norway says in a non-peer-reviewed paper that the best estimate concludes there is a 90 per cent probability that doubling CO2 concentrations will increase temperatures by only 1.2C to 2.9C, the most likely figure being 1.9C.
Another study based on the way the climate behaved about 20,000 years ago has given a best guess of 2.3C.
Other forecasts, accepted for publication, have reanalysed work cited by the IPCC, taking account of more recent temperature data, and given a figure of between 1C and 3C.
The Economist says understanding which estimate is true is vital to getting the best response.
"If as conventional wisdom has it, global temperatures could rise by 3C or more in response to a doubling of emissions, then the correct response would be the one to which most of the world pays lip service; rein in the warming and the greenhouse gases causing it," the article says.
"If, however, temperatures are likely to rise by only 2 degrees Celsius in response to a doubling of carbon emissions (and if the likelihood of a 6 degrees Celsius is trivial) the calculation might change," it says.
"Perhaps the world should seek to adjust to (rather than stop) the greenhouse-gas splurge.
"There is no point buying earthquake insurance if you don't live in an earthquake zone."
According to The Economist, "given the hiatus in warming and all the new evidence, a small reduction in estimates of climate sensitivity would seem to be justified." On face value, Hansen agrees the slowdown in global temperature rises can be seen as "good news".
But he is not ready to recalculate the Faustian bargain that weighs the future cost to humanity of continued carbon dioxide emissions.
Hansen argues that the impact of human carbon dioxide emissions has been masked by the sharp increase in coal use, primarily in China and India.
Increased particulate and nitrogen pollution has worked in the opposite direction of rising carbon dioxide levels in the atmosphere.
Another paper, published in Geophysical Research Letters by researchers from the University of Colorado Boulder, found that small volcanoes, not more coal power stations in China, were responsible for the slowdown in global warming.
But this did not mean that climate change was not a problem.
"Emissions from volcanic gases go up and down, helping to cool or heat the planet, while greenhouse gases from human activity just continue to go up," author Ryan Neely says.
Hansen's bottom line is that increased short-term masking of greenhouse gas warming by fossil fuel particulate and nitrogen pollution represents a "doubling down" of the Faustian bargain, an increase in the stakes.
"The more we allow the Faustian debt to build, the more unmanageable the eventual consequences will be," he says.

From The Economist


A sensitive matter

The climate may be heating up less in response to greenhouse-gas emissions than was once thought. But that does not mean the problem is going away

OVER the past 15 years air temperatures at the Earth’s surface have been flat while greenhouse-gas emissions have continued to soar. The world added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010. That is about a quarter of all the CO₂ put there by humanity since 1750. And yet, as James Hansen, the head of NASA’s Goddard Institute for Space Studies, observes, “the five-year mean global temperature has been flat for a decade.”

The mismatch between rising greenhouse-gas emissions and not-rising temperatures is among the biggest puzzles in climate science just now. It does not mean global warming is a delusion. Flat though they are, temperatures in the first decade of the 21st century remain almost 1°C above their level in the first decade of the 20th. But the puzzle does need explaining.
Temperatures fluctuate over short periods, but this lack of new warming is a surprise. Ed Hawkins, of the University of Reading, in Britain, points out that surface temperatures since 2005 are already at the low end of the range of projections derived from 20 climate models (see chart 1). If they remain flat, they will fall outside the models’ range within a few years.
The mismatch might mean that—for some unexplained reason—there has been a temporary lag between more carbon dioxide and higher temperatures in 2000-10. Or it might be that the 1990s, when temperatures were rising fast, was the anomalous period. Or, as an increasing body of research is suggesting, it may be that the climate is responding to higher concentrations of carbon dioxide in ways that had not been properly understood before. This possibility, if true, could have profound significance both for climate science and for environmental and social policy.
The insensitive planet
The term scientists use to describe the way the climate reacts to changes in carbon-dioxide levels is “climate sensitivity”. This is usually defined as how much hotter the Earth will get for each doubling of CO₂ concentrations. So-called equilibrium sensitivity, the commonest measure, refers to the temperature rise after allowing all feedback mechanisms to work (but without accounting for changes in vegetation and ice sheets).
Carbon dioxide itself absorbs infra-red at a consistent rate. For each doubling of CO₂ levels you get roughly 1°C of warming. A rise in concentrations from preindustrial levels of 280 parts per million (ppm) to 560ppm would thus warm the Earth by 1°C. If that were all there was to worry about, there would, as it were, be nothing to worry about. A 1°C rise could be shrugged off. But things are not that simple, for two reasons. One is that rising CO₂ levels directly influence phenomena such as the amount of water vapour (also a greenhouse gas) and clouds that amplify or diminish the temperature rise. This affects equilibrium sensitivity directly, meaning doubling carbon concentrations would produce more than a 1°C rise in temperature. The second is that other things, such as adding soot and other aerosols to the atmosphere, add to or subtract from the effect of CO₂. All serious climate scientists agree on these two lines of reasoning. But they disagree on the size of the change that is predicted.
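
The arithmetic behind that "roughly 1°C" can be sketched with the standard logarithmic forcing approximation. A minimal illustration, assuming the common empirical fit ΔF = 5.35 ln(C/C₀) W/m² and a no-feedback response of about 0.3°C per W/m² (textbook values, not figures taken from this article):

```python
import math

def direct_warming(c_ppm, c0_ppm=280.0, planck_response=0.30):
    """Warming from CO2 alone, before any feedbacks (illustrative values)."""
    forcing = 5.35 * math.log(c_ppm / c0_ppm)  # radiative forcing, W/m^2
    return planck_response * forcing           # temperature rise, deg C

# Doubling from a pre-industrial 280ppm to 560ppm:
print(direct_warming(560))        # ~1.1 deg C, the article's "roughly 1 deg C"

# A net feedback amplification of ~2.7 (water vapour, clouds and the rest)
# would reproduce the IPCC's best estimate of about 3 deg C.
print(2.7 * direct_warming(560))  # ~3 deg C
```

The disagreement the article describes is, in effect, a disagreement over that amplification factor, not over the direct effect of CO₂.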
The Intergovernmental Panel on Climate Change (IPCC), which embodies the mainstream of climate science, reckons the answer is about 3°C, plus or minus a degree or so. In its most recent assessment (in 2007), it wrote that “the equilibrium climate sensitivity…is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C and is very unlikely to be less than 1.5°C. Values higher than 4.5°C cannot be excluded.” The IPCC’s next assessment is due in September. A draft version was recently leaked. It gave the same range of likely outcomes and added an upper limit of sensitivity of 6°C to 7°C.
A rise of around 3°C could be extremely damaging. The IPCC’s earlier assessment said such a rise could mean that more areas would be affected by drought; that up to 30% of species could be at greater risk of extinction; that most corals would face significant biodiversity losses; and that there would be likely increases of intense tropical cyclones and much higher sea levels.
New Model Army
Other recent studies, though, paint a different picture. An unpublished report by the Research Council of Norway, a government-funded body, which was compiled by a team led by Terje Berntsen of the University of Oslo, uses a different method from the IPCC’s. It concludes there is a 90% probability that doubling CO₂ concentrations will increase temperatures by only 1.2-2.9°C, with the most likely figure being 1.9°C. The top of the study’s range is well below the IPCC’s upper estimates of likely sensitivity.
This study has not been peer-reviewed; it may be unreliable. But its projections are not unique. Work by Julia Hargreaves of the Research Institute for Global Change in Yokohama, which was published in 2012, suggests a 90% chance of the actual change being in the range of 0.5-4.0°C, with a mean of 2.3°C. This is based on the way the climate behaved about 20,000 years ago, at the peak of the last ice age, a period when carbon-dioxide concentrations leapt. Nic Lewis, an independent climate scientist, got an even lower range in a study accepted for publication: 1.0-3.0°C, with a mean of 1.6°C. His calculations reanalysed work cited by the IPCC and took account of more recent temperature data. In all these calculations, the chances of climate sensitivity above 4.5°C become vanishingly small.
If such estimates were right, they would require revisions to the science of climate change and, possibly, to public policies. If, as conventional wisdom has it, global temperatures could rise by 3°C or more in response to a doubling of emissions, then the correct response would be the one to which most of the world pays lip service: rein in the warming and the greenhouse gases causing it. This is called “mitigation”, in the jargon. Moreover, if there were an outside possibility of something catastrophic, such as a 6°C rise, that could justify drastic interventions. This would be similar to taking out disaster insurance. It may seem an unnecessary expense when you are forking out for the premiums, but when you need it, you really need it. Many economists, including William Nordhaus of Yale University, have made this case.
If, however, temperatures are likely to rise by only 2°C in response to a doubling of carbon emissions (and if the likelihood of a 6°C increase is trivial), the calculation might change. Perhaps the world should seek to adjust to (rather than stop) the greenhouse-gas splurge. There is no point buying earthquake insurance if you do not live in an earthquake zone. In this case more adaptation rather than more mitigation might be the right policy at the margin. But that would be good advice only if these new estimates really were more reliable than the old ones. And different results come from different models.
One type of model—general-circulation models, or GCMs—uses a bottom-up approach. These divide the Earth and its atmosphere into a grid which generates an enormous number of calculations in order to imitate the climate system and the multiple influences upon it. The advantage of such complex models is that they are extremely detailed. Their disadvantage is that they do not respond to new temperature readings. They simulate the way the climate works over the long run, without taking account of what current observations are. Their sensitivity is based upon how accurately they describe the processes and feedbacks in the climate system.
The other type—energy-balance models—is simpler. These models are top-down, treating the Earth as a single unit or as two hemispheres, and representing the whole climate with a few equations reflecting things such as changes in greenhouse gases, volcanic aerosols and global temperatures. Such models do not try to describe the complexities of the climate. That is a drawback. But they have an advantage, too: unlike the GCMs, they explicitly use temperature data to estimate the sensitivity of the climate system, so they respond to actual climate observations.
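
To make the top-down approach concrete, here is a minimal sketch of the kind of energy-budget relation such models rest on: scale the observed warming up to a full doubling of CO₂, crediting the heat still flowing into the ocean. The input numbers below are illustrative assumptions, not values from any of the studies discussed here.

```python
F_2X = 3.7  # radiative forcing from doubled CO2, W/m^2 (standard value)
dT   = 0.8  # observed surface warming since pre-industrial, deg C
dF   = 1.9  # assumed net anthropogenic forcing to date, W/m^2
dN   = 0.5  # assumed ocean heat uptake, W/m^2

# Equilibrium sensitivity: observed warming per unit of realised forcing,
# scaled to the forcing of a full doubling of CO2.
ecs = F_2X * dT / (dF - dN)
print(round(ecs, 1))  # ~2.1 deg C with these illustrative inputs
```

Because the estimate is driven by the observed warming, a decade of flat temperatures pulls it down, which is exactly the behaviour described here.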
The IPCC’s estimates of climate sensitivity are based partly on GCMs. Because these reflect scientists’ understanding of how the climate works, and that understanding has not changed much, the models have not changed either and do not reflect the recent hiatus in rising temperatures. In contrast, the Norwegian study was based on an energy-balance model. So were earlier influential ones by Reto Knutti of the Institute for Atmospheric and Climate Science in Zurich; by Piers Forster of the University of Leeds and Jonathan Gregory of the University of Reading; by Natalia Andronova and Michael Schlesinger, both of the University of Illinois; and by Magne Aldrin of the Norwegian Computing Centre (who is also a co-author of the new Norwegian study). All these found lower climate sensitivities. The paper by Drs Forster and Gregory found a central estimate of 1.6°C for equilibrium sensitivity, with a 95% likelihood of a 1.0-4.1°C range. That by Dr Aldrin and others found a 90% likelihood of a 1.2-3.5°C range.
It might seem obvious that energy-balance models are better: do they not fit what is actually happening? Yes, but that is not the whole story. Myles Allen of Oxford University points out that energy-balance models are better at representing simple and direct climate feedback mechanisms than indirect and dynamic ones. Most greenhouse gases are straightforward: they warm the climate. The direct impact of volcanoes is also straightforward: they cool it by reflecting sunlight back. But volcanoes also change circulation patterns in the atmosphere, which can then warm the climate indirectly, partially offsetting the direct cooling. Simple energy-balance models cannot capture this indirect feedback. So they may exaggerate volcanic cooling.
This means that if, for some reason, there were factors that temporarily muffled the impact of greenhouse-gas emissions on global temperatures, the simple energy-balance models might not pick them up. They will be too responsive to passing slowdowns. In short, the different sorts of climate model measure somewhat different things.
Clouds of uncertainty
This also means the case for saying the climate is less sensitive to CO₂ emissions than previously believed cannot rest on models alone. There must be other explanations—and, as it happens, there are: individual climatic influences and feedback loops that amplify (and sometimes moderate) climate change.
Begin with aerosols, such as those from sulphates. These stop the atmosphere from warming by reflecting sunlight. Some heat it, too. But on balance aerosols offset the warming impact of carbon dioxide and other greenhouse gases. Most climate models reckon that aerosols cool the atmosphere by about 0.3-0.5°C. If that underestimated aerosols’ effects, perhaps it might explain the lack of recent warming.
Yet it does not. In fact, it may actually be an overestimate. Over the past few years, measurements of aerosols have improved enormously. Detailed data from satellites and balloons suggest their cooling effect is lower (and their warming greater, where that occurs). The leaked assessment from the IPCC (which is still subject to review and revision) suggested that aerosols’ estimated radiative “forcing”—their warming or cooling effect—had changed from minus 1.2 watts per square metre of the Earth’s surface in the 2007 assessment to minus 0.7W/m² now: ie, less cooling.
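
To put that revision in temperature terms (our arithmetic, reusing the ~0.3°C per W/m² no-feedback response assumed earlier, not a calculation from the IPCC draft):

```python
old_forcing = -1.2  # aerosol forcing in the 2007 assessment, W/m^2
new_forcing = -0.7  # aerosol forcing in the leaked draft, W/m^2

unmasked = new_forcing - old_forcing  # +0.5 W/m^2 less cooling
print(0.30 * unmasked)                # ~0.15 deg C of warming no longer
                                      # offset, before any feedbacks
```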
One of the commonest and most important aerosols is soot (also known as black carbon). This warms the atmosphere because it absorbs sunlight, as black things do. The most detailed study of soot was published in January and also found more net warming than had previously been thought. It reckoned black carbon had a direct warming effect of around 1.1W/m². Though indirect effects offset some of this, the effect is still greater than an earlier estimate by the United Nations Environment Programme of 0.3-0.6W/m².
All this makes the recent period of flat temperatures even more puzzling. If aerosols are not cooling the Earth as much as was thought, then global warming ought to be gathering pace. But it is not. Something must be reining it back. One candidate is lower climate sensitivity.
A related possibility is that general-circulation climate models may be overestimating the impact of clouds (which are themselves influenced by aerosols). In all such models, clouds amplify global warming, sometimes by a lot. But as the leaked IPCC assessment says, “the cloud feedback remains the most uncertain radiative feedback in climate models.” It is even possible that some clouds may dampen, not amplify, global warming—which may also help explain the hiatus in rising temperatures. If clouds have less of an effect, climate sensitivity would be lower.
So the explanation may lie in the air—but then again it may not. Perhaps it lies in the oceans. But here, too, facts get in the way. Over the past decade the long-term rise in surface seawater temperatures seems to have stalled (see chart 2), which suggests that the oceans are not absorbing as much heat from the atmosphere.
As with aerosols, this conclusion is based on better data from new measuring devices. But it applies only to the upper 700 metres of the sea. What is going on below that—particularly at depths of 2km or more—is obscure. A study in Geophysical Research Letters by Kevin Trenberth of America’s National Centre for Atmospheric Research and others found that 30% of the ocean warming in the past decade has occurred in the deep ocean (below 700 metres). The study says a substantial amount of global warming is going into the oceans, and the deep oceans are heating up in an unprecedented way. If so, that would also help explain the temperature hiatus.
Double-A minus
Lastly, there is some evidence that the natural (ie, non-man-made) variability of temperatures may be somewhat greater than the IPCC has thought. A recent paper by Ka-Kit Tung and Jiansong Zhou in the Proceedings of the National Academy of Sciences links temperature changes from 1750 to natural changes (such as sea temperatures in the Atlantic Ocean) and suggests that “the anthropogenic global-warming trends might have been overestimated by a factor of two in the second half of the 20th century.” It is possible, therefore, that both the rise in temperatures in the 1990s and the flattening in the 2000s have been caused in part by natural variability.
So what does all this amount to? The scientists are cautious about interpreting their findings. As Dr Knutti puts it, “the bottom line is that there are several lines of evidence, where the observed trends are pushing down, whereas the models are pushing up, so my personal view is that the overall assessment hasn’t changed much.”
But given the hiatus in warming and all the new evidence, a small reduction in estimates of climate sensitivity would seem to be justified: a downwards nudge on various best estimates from 3°C to 2.5°C, perhaps; a lower ceiling (around 4.5°C), certainly. If climate scientists were credit-rating agencies, climate sensitivity would be on negative watch. But it would not yet be downgraded.
Equilibrium climate sensitivity is a benchmark in climate science. But it is a very specific measure. It attempts to describe what would happen to the climate once all the feedback mechanisms have worked through; equilibrium in this sense takes centuries—too long for most policymakers. As Gerard Roe of the University of Washington argues, even if climate sensitivity were as high as the IPCC suggests, its effects would be minuscule under any plausible discount rate because it operates over such long periods. So it is one thing to ask how climate sensitivity might be changing; a different question is to ask what the policy consequences might be.
For that, a more useful measure is the transient climate response (TCR), the temperature you reach after doubling CO₂ gradually over 70 years. Unlike the equilibrium response, the transient one can be observed directly; there is much less controversy about it. Most estimates put the TCR at about 1.5°C, with a range of 1-2°C. Isaac Held of America’s National Oceanic and Atmospheric Administration recently calculated his “personal best estimate” for the TCR: 1.4°C, reflecting the new estimates for aerosols and natural variability.
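
The 70-year window is not arbitrary: the standard TCR experiment raises CO₂ concentrations by 1% a year, and 1% compounded for 70 years almost exactly doubles them (a quick check, added for illustration):

```python
# 1% annual growth, compounded over 70 years:
print(1.01 ** 70)  # ~2.01, i.e. concentrations double in ~70 years
```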
That sounds reassuring: the TCR is below estimates for equilibrium climate sensitivity. But the TCR captures only some of the warming that those 70 years of emissions would eventually generate because carbon dioxide stays in the atmosphere for much longer.
As a rule of thumb, global temperatures rise by about 1.5°C for each trillion tonnes of carbon put into the atmosphere. The world has pumped out half a trillion tonnes of carbon since 1750, and temperatures have risen by 0.8°C. At current rates, the next half-trillion tonnes will be emitted by 2045; the one after that before 2080.
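
Taken at face value, the rule of thumb squares with both the observed record and the projection that follows (our arithmetic, not The Economist's):

```python
WARMING_PER_TRILLION_TONNES = 1.5  # deg C per trillion tonnes of carbon

# Half a trillion tonnes emitted since 1750:
print(WARMING_PER_TRILLION_TONNES * 0.5)  # 0.75 deg C (observed: ~0.8 deg C)

# Add the two further half-trillion instalments expected by ~2045 and ~2080:
print(WARMING_PER_TRILLION_TONNES * 1.5)  # ~2.25 deg C above pre-industrial
```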
Since CO₂ accumulates in the atmosphere, this could increase temperatures compared with pre-industrial levels by around 2°C even with a lower sensitivity and perhaps nearer to 4°C at the top end of the estimates. Despite all the work on sensitivity, no one really knows how the climate would react if temperatures rose by as much as 4°C. Hardly reassuring.