Wednesday, September 2, 2009

Bjorn Lomborg on Technology

In this article, Bjorn Lomborg bemoans the reality that two decades of political posturing and treaty manufacture have accomplished essentially nothing. And let us consider that. Reducing our dependence on carbon combustion was difficult and remains difficult. Even with today's improved pricing, the alternatives still cost more, and that is the central difficulty. So long as carbon based fuels are cheaper than the alternatives, or simply more convenient, they will naturally be used first.

That a universal tax might change the economics is true, except that it has to be truly universal; no nation can stand outside such a rule. That is the ongoing problem faced by humanity: no one wants to address the need to create universal environmental commissions with the authority to develop, implement and oversee best practice.

Common action will solve just about everything else. It may sound simplistic, but valid, cheap solutions are lying around unimplemented. A lot of that is simply human inertia. After all, if your engineering career has been built on landfills, then every waste solution looks like a landfill project. What is more, you have the reputation and the political support all locked down.

The good news, provided one has patience, is that better solutions will eventually find their champions and be made operational.

I have told you about acid rain in a pipe. The science is all known. The engineering components are all known. One engineer will someday bravely get a small operation going on an individual smoke stack to learn his business. That will take three to five years before everyone is satisfied. It will then lead to a single large application being built out, which will also operate for five years, allowing proper papers to be presented at the proper engineering conferences. Then, about ten years on, several groups will adopt the technology and build out several separate plants. At that point we have a well established technology that may well become universal.

I have seen this happen with SAGD in the oil business and in many other situations. The process is so slow as to be almost invisible.

The only thing that can speed this up is crisis.

We are heading for an oil supply crisis. That means the drive to alternatives will become frantic. Some think it has already begun, although that is not yet true: somehow we are continuing to sustain present levels of oil production. I do not think we can stand to lose any more production right now.

A supply crisis has already been prepared for over the past year, as every shrewd operator has ramped up his capacity to replace hydrocarbons. It will not be enough, but the onslaught of investment in alternatives will let us turn the corner quickly.

A swift switch away from carbon dependence, which is now setting up, will make Kyoto obsolete.

Technology Can Fight Global Warming

Marine cloud whitening, and other ideas.

We have precious little to show for nearly 20 years of efforts to prevent global warming. Promises in Rio de Janeiro in 1992 to cut carbon emissions went unfulfilled. Stronger pledges in Kyoto five years later failed to keep emissions in check. The only possible lesson is that agreements to reduce carbon emissions are costly, politically arduous and ultimately ineffective.

But this is a lesson many are hell-bent on ignoring, as politicians plan to gather again—this time in Copenhagen, Denmark, in December—to negotiate a new carbon-emissions treaty. Even if they manage to bridge their differences and sign a deal, there is a strong likelihood that tomorrow's politicians will fail to deliver.

Global warming does not just require action; it requires effective action. Otherwise we are just squandering time.

To inform the debate, the Copenhagen Consensus Center has commissioned research looking at the costs and benefits of all the policy options. For example, internationally renowned climate economist Richard Tol of Ireland's Economic and Social Research Institute finds that a low carbon tax of $2 a metric ton (1.2 tons U.S.) is the only carbon reduction policy that would make economic sense. But his research demonstrates the futility of trying to use carbon cuts to keep temperature increases under 2 degrees Celsius (3.6 degrees Fahrenheit), which many argue would avoid the worst of climate change's impacts.

Some economic models find that target impossible to reach without drastic action, like cutting the world population by a third. Other models show that achieving the target by a high CO2 tax would reduce world GDP a staggering 12.9% in 2100—the equivalent of $40 trillion a year.
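
As a quick consistency check on those two figures, here is a back-of-envelope sketch; only the 12.9% and $40 trillion numbers come from the text above, and the implied baseline GDP is simply what they jointly imply.

# Back-of-envelope check of the article's figures (illustrative only).
gdp_loss_fraction = 0.129      # 12.9% of world GDP in 2100 (from the article)
gdp_loss_dollars = 40e12       # $40 trillion a year (from the article)

# Implied world GDP in 2100 that makes the two figures consistent.
implied_world_gdp_2100 = gdp_loss_dollars / gdp_loss_fraction
print(f"Implied 2100 world GDP: ${implied_world_gdp_2100 / 1e12:.0f} trillion a year")
# -> roughly $310 trillion a year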

Some may claim that global warming will be so terrible that a 12.9% reduction in GDP is a small price to pay. But consider that the majority of economic models show that unconstrained global warming would cost rich nations around 2% of GDP and poor countries around 5% by 2100.

Even those figures are an overstatement. A group of climate economists at the University of Venice led by Carlo Carraro looked closely at how people will adapt to climate change. Their research for the Copenhagen Consensus Center showed that farmers in areas with less water for agriculture could use more drip irrigation, for example, while those with more water will grow more crops.

Taking a variety of natural, so-called market adaptations into account, the Carraro research shows we will acclimatize to the negative impacts of global warming and exploit the positive changes, actually creating a 0.1% increase in GDP in 2100 among the member countries of the Organization for Economic Cooperation and Development. In poor countries, market adaptation will reduce climate change-related losses to 2.9% of GDP. This remains a significant, negative effect. The real challenge of global warming lies in tackling its impact on the Third World. Yet adaptation has other positive benefits. If we prepare societies for more ferocious hurricanes in the future, we also help them to cope better with today's extreme weather.

This does not mean, however, that we should ignore rising greenhouse-gas emissions. Research for the Copenhagen Consensus Center by Claudia Kemfert of the German Institute for Economic Research in Berlin shows that, in terms of reducing climate damage, reducing methane emissions is cheaper than reducing CO2 emissions, and, because methane is a much shorter-lived gas, its mitigation could do a lot to prevent some of the worst of short-term warming. Other research papers highlight the advantages of planting more trees and protecting the forests we have to absorb CO2 and cut greenhouse gases.

Other more speculative approaches deserve consideration. In groundbreaking research, J. Eric Bickel, an economist and engineer at the University of Texas, and Lee Lane, a researcher at the American Enterprise Institute, study the costs and benefits of climate engineering. One proposal would have boats spray seawater droplets into clouds above the sea to make them reflect more sunlight back into space—augmenting the natural process where evaporating ocean sea salt helps to provide tiny particles for clouds to form around.

Remarkably, Mr. Bickel finds that about $9 billion spent developing this so-called marine cloud whitening technology might be able to cancel out this century's global warming. The benefits—from preventing the temperature increase—would add up to about $20 trillion.

Climate engineering raises ethical concerns. But if we care most about avoiding warmer temperatures, we cannot avoid considering a simple, cost-effective approach that shows so much promise.

Nothing short of a technological revolution is required to end our reliance on fossil fuel—and we are not even close to getting this revolution started. Economists Chris Green and Isabel Galiana from McGill University point out that nonfossil sources like nuclear, wind, solar and geothermal energy will—based on today's availability—get us less than halfway toward a path of stable carbon emissions by 2050, and only a tiny fraction of the way towards stabilization by 2100.

A high carbon tax will simply hurt growth if alternative technology is not ready, making us all worse off. Mr. Green proposes that policy makers abandon carbon-reduction negotiations and make agreements to seriously invest in research and development. Mr. Green's research suggests that investing about $100 billion annually in noncarbon based energy research could result in essentially stopping global warming within a century or so.

A technology-led effort would have a much greater chance of actually tackling climate change. It would also have a much greater chance of political success, since countries that fear signing on to costly emission targets are more likely to embrace the cheaper, smarter path of innovation.

Cutting emissions of greenhouse gases is not the only answer to global warming. Next week, a group of Nobel Laureate economists will gather at Georgetown University to consider all of the new research and identify the solutions that are most effective. Hopefully, their results will influence debate and help shift decision makers away from a narrow focus on one, deeply flawed response to global warming.

Our generation will not be judged on the brilliance of our rhetoric about global warming, or on the depth of our concern. We will be judged on whether or not we stop the suffering that global warming will cause. Politicians need to stop promising the moon, and start looking at the most effective ways to help planet Earth.

Mr. Lomborg teaches at the Copenhagen Business School and is director of the Copenhagen Consensus Center. He is the author of "Cool It: The Skeptical Environmentalist's Guide to Global Warming" (Knopf, 2007.)


Tuesday, September 1, 2009

XCOR Sustained Firing Begins


This appears to be proceeding well. One of the wonders of modern design technology is that so much can be simulated in the computer before cutting metal. Thus we have astonishingly swift product development, or at least appear to. I am sure those who are hands on do not think so.


We are now headed for sustained firing. We are getting there, and we will be flying this engine soon.

XCOR Aerospace Reaches Several Significant Milestones in the Lynx 5K18 Rocket Engine Test Program




September 02, 2009, Mojave, CA: XCOR Aerospace announced today that it has reached several significant milestones in the 5K18 rocket engine test program. This is the engine that powers XCOR's Lynx suborbital spacecraft. The engine can be seen running in several newly released videos, including a video demonstrating the very stable "shock diamond" pattern visible in the engine's supersonic exhaust.

"Like all of our rocket engines, this engine has demonstrated the ability to be stopped and re-started using our safe and reliable spark torch ignition system", said XCOR CEO Jeff Greason. "The basic cooling design has also been completed and the engine is able to run continuously at thermal equilibrium.
With those milestones reached, the 5K18 test program is now moving forward into a second phase of tuning and optimization, in which we will also greatly increase our cumulative run time."



Data and test results from the Lynx engine program are being used by XCOR and certain customers to develop a deeper understanding of operationally responsive spacelift procedures. These procedures can then be applied to future rocket powered vehicles. XCOR and its customers now have important information that will aid in the development of the unique requirements of operationally responsive high performance manned and unmanned rocket systems.


Testing of the 5K18 rocket engine is continuing in parallel with several other key Lynx system components, including wind tunnel testing at AFRL facilities and development of the Lynx pressure cabin at XCOR's main facilities in Mojave, CA.

"These additional firings and milestones continue to demonstrate XCOR's ability to deliver safe and truly innovative rocket propulsion technology that will one day revolutionize space access by enhancing readiness levels for flight from years to days or even hours, and driving down costs and increasing safety by orders of magnitude," said XCOR Chief Operating Officer Andrew Nelson.

XCOR Aerospace is a California corporation located in Mojave, California. The company is in the business of developing and producing safe, reliable and reusable rocket powered vehicles, propulsion systems, advanced non-flammable composites and other enabling technologies for responsive private space flight, scientific missions, upper atmospheric research, and small satellite launch to low earth orbit. Its web address is www.xcor.com. Advance ticket sales have already commenced at www.rocketshiptours.com.

Young Ganges


Another geographic consideration arising from the book by Prithvi Raj on the historical content of Indian scriptures is the representation that the Ganges arose in its present form about a thousand years or so after the primary event I have called the Pleistocene Nonconformity.

There was a time when, confronted with such a proposition, I would have dismissed it out of hand. Then I discovered that one could get there from here.

So the question right off is this: can we reconcile the young Ganges with our model, as we have reconciled the submergence of the Maldive Archipelago in the Indian Ocean? Again, this part of the crust was accommodating compression, and the prior mountain building provided a natural fault system able to accomplish this. So it is fair to assume that the mountains rose several thousands of feet at least, if not a lot more. After all, they are presently the tallest in the world and are comparable only to their near equivalent in the Andes, on the same arc and in the same effective position.

This produced a valley sub parallel to the newly raised mountain range and possibly to the upheaved Tibetan plateau. Massive precipitation at the eastern end of the range, north of Burma, began cutting several major river systems, including the Mekong. Water began filling the valley and draining westward. This valley breaks out close to Pakistan and enters the present plain of the Ganges.

Prior to a final breakout, the valley was certainly blocked in multiple locations by massive landslides and intrusive structures. It is quite plausible that the primary blockage was a weak landslide that allowed the accumulation of water, and reasonable to assume, given the incredible terrain, that water accumulated behind this natural blockage for centuries. A thousand or so years of accumulation is entirely plausible.

I also note that the geology of the area is incredibly young. I have reviewed a photograph of a major sheer cliff in those mountains that showed little accumulation of scree, yet was formed from sediments. That is implausible for an old landscape and demands a recent genesis. In short, if those mountains were a million years old, the valleys would be choked with the material that is constantly coming off those mountains today. They are not particularly choked at all.

That such a water buildup took place appears almost inevitable. That its release would be dramatic is also inevitable, as the rushing waters would swiftly scour out the valley bottom for its full length. The amount of water held could have approached that of a small Great Lake and produced a huge flow that took weeks to subside.
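
To get a feel for the scale involved, here is a rough back-of-envelope sketch in Python; the impounded volume and the drain time are purely assumed, illustrative values, not figures drawn from any source.

# Rough scale check on a catastrophic breakout flood (all inputs assumed).
lake_volume_km3 = 500.0        # assumed impounded volume, roughly a "small Great Lake"
drain_time_weeks = 3.0         # assumed time for the flow to subside

drain_time_s = drain_time_weeks * 7 * 24 * 3600
mean_discharge_m3s = lake_volume_km3 * 1e9 / drain_time_s
print(f"Mean discharge: {mean_discharge_m3s:,.0f} m^3/s")
# ~275,000 m^3/s with these assumptions, on the order of the Amazon's mean flow,
# which would certainly scour a deep, broad valley along its full path.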

We thus have a mechanism for producing the river bottom of the Ganges as a one time event. Before this happened, the rains had already created an active riverine system flowing off the front of the Himalayas and possibly producing several rivers flowing along the Gangetic plain to its huge delta. When the impounded waters began their release, a huge torrent proceeded to scour out a deep and broad valley down through this established sedimentary plain. In short, it is possible to provide a valid explanation for the abrupt rise of the Ganges that fits the cultural record.

It should be easy to piece together the geologic record, and someone should already have had his eyes open because of the note in the cultural record.

I also noted that before the event of the nonconformity, a barrier mountain range is reported to have cut across the center of India in an east-west direction, with an equally important plain north of this barrier. If we simply look to the Himalayas as the original barrier range, its location is solved. This means that the cultural sources were describing a much larger geographic area than supposed, in which the southern portion fell south of the equator and the northern portion, plausibly just as large, fell north of the equator.

I would like to note that Prithvi Raj's historical reconstruction and my own work are arriving at the same conclusions from very different directions. Knowledge of the crustal shift, and of its impact on local crustal curvature along the arc of maximum movement, readily provides guidance in understanding the reported events. More important, the right things are happening in the right places.

It is significant to observe that populations still survived, although the coastal damage must have reached well into the hills. It also becomes plausible that herding populations quickly began migrating into these broken lands to escape the suddenly long, bitter winters of their own homelands.

Space Debris Tamed




The problem has been quantified better, and we have a doable project that is able to harvest the objects that are in fact critical, or so I at least assume. This suggests that objects not in this mix are at least survivable. Otherwise it is a good plan and puts hardware in orbit able to act as the local fire department.

This is certainly a better scenario than those previously espoused on the basis of little good data. A solution is available and it is cost effective. The losses from inaction will exceed the cost of implementation, and if that is true then a common program needs to be put to work, perhaps paid for on a per pound recovery charge so that no one can squabble over whose fault the debris is.

I suspect that it will take some time for all this to be made to happen, but the ability to charge back to the source programs will bring interest levels up. The problem is measurable and users can calculate their liability. That at least makes it a solvable problem.

Debris - Problem Solved

http://www.spacedaily.com/reports/Space_Debris_Problem_Solved_999.html

Although space debris proliferation presents a long-term challenge that will require a long-term solution, the immediate problem is quite bounded. A study of debris distribution reveals the near-term troubled zone to be a spherically symmetric region between the altitudes of 700 km and 900 km.

by Launchspace Staff

Bethesda MD (SPX) Aug 31, 2009
There is no doubt that the topic of "space debris" is hot! It is a hot subject at NASA, DARPA, Air Force Space Command, ESA and in the board rooms of all commercial satellite operators. High anxiety is running rampant among these groups. Every debris mitigation technique has been reviewed and pursued. New satellites must have the ability to either de-orbit or move out of the way at end-of-mission.

Upper stages must vent tanks to rid them of residual propellant that might later result in explosions. Many satellites are maneuvered to avoid close-conjunction events. JSpOC is beefing up its satellite and debris tracking capabilities. National and international working groups are meeting regularly to assess the threat and to recommend actions for all space-faring nations. The world is just one major satellite collision event away from panic.

Instances of close conjunction events in highly congested orbital bands have increased dramatically in the past few years. In fact, the frequency of close encounters between active satellites and large debris objects within the Iridium constellation has reached a frighteningly high level. Odds are that there will be another Iridium/Cosmos type of event in the near future.

Should such an event occur, several bad things will happen to many satellite operators. If another Iridium satellite is involved the company would be forced to replace the lost satellite. The frequency of close encounters in orbits near that of Iridium's constellation would suddenly increase to levels that would cause several operators to reassess the viability of existing space applications.

Satellite insurance providers might be forced to raise premiums on in-orbit performance to record high levels. Future launch plans for almost all low orbit satellites may be curtailed. Space-based services to the world would diminish over time. The economic impact is not even calculable. This is scary!

Not to fear. A solution is on the way.

Although space debris proliferation presents a long-term challenge that will require a long-term solution, the immediate problem is quite bounded. A study of debris distribution reveals the near-term troubled zone to be a spherically symmetric region between the altitudes of 700 km and 900 km.

This is where a great many operational satellites and large debris objects co-exist. Thus, the near-term challenge appears to be the removal of enough large debris objects in order to reduce collision risks to levels consistent with statistical times-between-debris-collisions that are much higher than expected satellite mission lifetimes.
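
One way to see what "statistical times-between-debris-collisions much higher than mission lifetimes" means in practice is a simple kinetic-gas style estimate. Every number below is an assumed, illustrative value rather than anything from the article.

import math

# Mean time between collisions for one satellite in a debris shell,
# using the simple kinetic-gas estimate: rate = n * sigma * v_rel.
# Every input here is an assumed, illustrative value.
n_objects = 1000                   # large objects in the 700-900 km shell (assumed)
r_low, r_high = 6371e3 + 700e3, 6371e3 + 900e3
shell_volume = 4.0 / 3.0 * math.pi * (r_high**3 - r_low**3)    # m^3

number_density = n_objects / shell_volume                      # objects per m^3
sigma = math.pi * 10.0**2          # collision cross-section, assumed ~10 m effective radius
v_rel = 10e3                       # typical relative speed in low orbit, ~10 km/s (assumed)

collision_rate = number_density * sigma * v_rel                # collisions per second
mean_time_years = 1.0 / collision_rate / (3600 * 24 * 365.25)
print(f"Mean time between collisions for one satellite: ~{mean_time_years:,.0f} years")
# With these assumptions the figure is on the order of a thousand years per satellite.
# It scales as 1/n, so removing large objects lengthens it proportionally.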

Sounds simple, but it is not! Seems impossible, but it is not! So, what will it take to do the job?

Simply stated, all affected parties must collaborate and contribute to create a massive new space effort. There are literally well over 1,000 large debris objects that pose an immediate threat. Every one of these can be removed, and there are a number of removal techniques. One approach, as an example, would be to develop specially designed "Debris Collection Spacecraft."

Each DCS would be capable of maneuvering and rendezvousing with several objects, one at a time. Each object may be stored for later de-orbit, or fitted with an autonomous de-orbit unit that slows the object's orbital speed. If each DCS can deal with 100 objects, assuming only 1,000 objects need to be removed, the job will require 10 DCSs. This whole removal operation must be transparent to commercial, civil and security satellite operators.

In order to be effective, the removal program needs to start yesterday, because it will take several years before actual removal operations can begin. We don't have a lot of time here. If each of 100 objects being collected by one DCS takes three days of maneuvering to reach, then each DCS would require roughly 10 months to achieve its mission. However, it is likely that the DCSs will require in-orbit refueling after each 10 rendezvous completions.

The total mission span for each DCS seems to be roughly one year. If the program is started immediately, it could be completed in about five or six years. The program cost is estimated at $3 billion, based on developing the DCS, on-orbit refueling vehicles and operations, building 10 DCSs and one to two years of ground operations. This is cheap compared to the cost of not doing it.
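
For what it is worth, the article's fleet-sizing arithmetic is easy to reproduce; the sketch below uses only the numbers quoted above.

# Reproduce the article's fleet-sizing arithmetic.
objects_to_remove = 1000
objects_per_dcs = 100
days_per_rendezvous = 3

dcs_needed = objects_to_remove // objects_per_dcs              # 10 spacecraft
maneuver_days = objects_per_dcs * days_per_rendezvous          # 300 days of maneuvering
print(f"DCS vehicles needed: {dcs_needed}")
print(f"Maneuvering time per DCS: {maneuver_days} days (~10 months)")
# Add refueling stops after every 10 rendezvous and the article's
# roughly one-year mission span per DCS follows directly.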

For all those who are concerned and interested in the space debris crisis, your first step is to get smart on the issues and possible solutions. This is where Launchspace can help. If you are involved in space flight or want to better understand the new space crisis, you will want to sign up for the "must take" seminar on the subject, October 27th in Washington, DC.

Oceanic Mascons


One day after reporting on the apparent existence of a switching mechanism in the heat flux of the ocean, we get this. It is the one major variable that we would like to see closely mapped, and here we are making that possible.

A shift in volumes of deep sea cold water from one locale to the next would be really important. That it could actually explain the known sudden switching of global climate conditions matters, because none of the other options work well at all.

Left on its own, the northern climate would warm up to a pleasant level warmer than today and stay there forever as demonstrated by the Bronze Age optimum.

So I return to the hypothesis that what we are dealing with is a mature equilibrium between the two hemispheres that became fully operational about three thousand years ago. Prior to that, two changes were working toward completion. The first was that the northern ice age was completing the process of deglaciation: the melt waters were mixing with the ocean waters while the ocean itself was achieving a stable level of warmth. This was largely done perhaps as soon as 5,000 BP, but certainly by 3,000 BP.

The second event was that the southern polar cap was expanding to a new stable configuration, possibly reached around 3,000 BP. What began then was a cycle of deep sea cold water being injected periodically into the Northern Hemisphere in order to balance the two hemispheres.

These cycles are not properly mapped as yet but the indicators show some form of minor variation that occurs over a thirty to fifty year cycle or so. If this is generated mostly in the Pacific then it could be pretty benign. I think that we may have a similarly sized event occasionally happening in the Atlantic which would be several times more pronounced.

Anyway we are much closer to having measuring tools.

New Look At Gravity Data Sheds Light On Ocean And Climate

http://www.spacedaily.com/reports/New_Look_At_Gravity_Data_Sheds_Light_On_Ocean_And_Climate_999.html


by Rosemary Sullivant

Pasadena CA (SPX) Aug 28, 2009

A discovery about the moon made in the 1960s is helping researchers unlock secrets about Earth's ocean today. By applying a method of calculating gravity that was first developed for the moon to data from NASA's Gravity Recovery and Climate Experiment, known as Grace, JPL researchers have found a way to measure the pressure at the bottom of the ocean.

Just as knowing atmospheric pressure allows meteorologists to predict winds and weather patterns, measurements of ocean bottom pressure provide oceanographers with fundamental information about currents and global ocean circulation. They also hold clues to questions about sea level and climate.

"Oceanographers have been measuring ocean bottom pressure for a long time, but the measurements have been limited to a few spots in a huge ocean for short periods of time," says JPL oceanographer Victor Zlotnicki.

Launched in 2002, the twin Grace satellites map Earth's gravity field from an orbit 500 kilometers (310 miles) above the surface. They respond to how mass is distributed in the Earth and on Earth's surface - the greater the mass in a given area, the stronger the pull of gravity from that area.

The pressure at the bottom of the ocean is determined by the amount of mass above it. "Ocean bottom pressure is the sum of the weight of the whole atmosphere and the whole ocean," says Zlotnicki. "When winds move water on the surface, ocean bottom pressure changes. When glaciers melt and add water to the ocean, the ocean's mass increases and bottom pressure increases, either at one place or globally."
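
What Zlotnicki describes is plain hydrostatics; here is a minimal sketch, assuming a uniform-density water column purely for illustration.

# Ocean bottom pressure as the weight of the overlying atmosphere plus water column.
# Uniform density is an assumption for illustration; real columns vary with temperature and salinity.
rho_seawater = 1025.0      # kg/m^3, typical value
g = 9.81                   # m/s^2
p_atmosphere = 101325.0    # Pa, mean sea-level atmospheric pressure

def bottom_pressure(depth_m: float) -> float:
    """Hydrostatic pressure at the sea floor, in pascals."""
    return p_atmosphere + rho_seawater * g * depth_m

print(f"{bottom_pressure(4000.0) / 1e6:.1f} MPa at 4 km depth")   # ~40 MPa
# Adding 1 cm of water everywhere (say, from melting land ice) raises bottom
# pressure by only rho * g * 0.01, about 100 Pa - the tiny signal Grace must detect.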

"Measuring ocean bottom pressure was one of the things we said we wanted to do from the very beginning of the mission," says Grace project scientist Michael Watkins, "but it has been a challenge. The signal is very small and hard to detect."

Gravity changes over the ocean are miniscule compared to those over land. The ocean is a fluid. It yields to pressure and spreads the effect over a vast area. Nothing in the ocean gives as big a gravity signal as a flooding Amazon River or melting glaciers in Greenland or Alaska, changes that Grace can measure fairly easily, says Watkins. "Those hydrology signals are huge in comparison," he says.

However, as the mission progressed, Watkins explains, the science team has found better ways to process Grace data. And by turning to a technique developed for the lunar world, Grace researchers are getting the precise measurements of ocean bottom pressure they were hoping for.

From the moon to the ocean bottom

In the days leading up to the Apollo missions, JPL scientists discovered that certain areas of the moon had higher concentrations of mass than others. The result of these "mass concentrations" was marked differences in the moon's gravity field.

The researchers then devised a new way to calculate the gravity field called a "mascon" (for mass concentration) solution. Mascon solutions break the gravity field into small, individual regions. The more traditional ways of computing gravity, often called harmonic solutions, smooth everything together and calculate gravity for a whole large area or body.

Recently scientists have begun developing mascon solutions for Grace data for use in a variety of studies, and they are revealing fascinating new details about Earth's gravity field. These mascon solutions are also proving to be a key to Grace's ability to measure ocean bottom pressure.

"Some of the very best harmonic solutions show some bottom pressure signals, but the mascon solutions appear to do a better job and provide much higher resolution," says Watkins.

"Using a mascon solution with Grace data is a way of weighing each little piece of the ocean," he says. The result is a new view of the gravity field - one that reveals sharp contrasts in gravity precise enough to calculate variations in ocean bottom pressure.

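A toy way to picture the mascon idea is to treat each small region as a point mass and compute its Newtonian pull at satellite altitude, rather than fitting one smooth global field. This is only a conceptual sketch with assumed numbers, not the actual JPL processing.

import math

# Toy "mascon" picture: the gravity anomaly at satellite altitude from one
# small region whose mass has changed (for example, extra water piled up by winds).
G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2

def anomaly_from_mascon(extra_mass_kg: float, distance_m: float) -> float:
    """Newtonian acceleration anomaly from a small excess mass, treated as a point."""
    return G * extra_mass_kg / distance_m**2

# Assumed example: 2 cm of extra seawater over a 300 km x 300 km block.
block_area = 300e3 * 300e3                      # m^2
extra_mass = 1025.0 * 0.02 * block_area         # kg of added seawater
satellite_altitude = 500e3                      # m, roughly Grace's altitude

print(f"{anomaly_from_mascon(extra_mass, satellite_altitude):.2e} m/s^2")
# A mascon solution sums contributions like this block by block, so sharp
# regional contrasts are preserved instead of being smoothed away.
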
A large field experiment off the coast of Japan provided an unusual and welcome opportunity to put Grace mascon estimates of ocean bottom pressure to the test. There are few places in the ocean where there are enough data on ocean bottom pressure to validate the satellite's observations.

Oceanographer Jae-Hun Park and his colleagues at the University of Rhode Island compared the Grace measurements with data collected by a large array of pressure-reading instruments stationed on the ocean bottom as part of the Kuroshio Extension System Study. This two-year observational program to study deep ocean currents and fronts ran from 2004 to 2006.

"Our site covered a very wide area of 600 by 600 kilometers (370 miles) with 43 available bottom pressure sensors," says Park. He and his colleagues found that while some of the individual sensors had very high correlations with Grace measurements, others were very low. "These low correlations were small-scale eddies that Grace cannot catch," explains Park. Grace's resolution is about 200 kilometers (125 miles).

However, when they compared the spatially averaged monthly mean ocean bottom pressure measured by the ocean sensors with the latest JPL Grace mascon solution for the center of the array, "we found a high correlation between the Grace measurements and our in-situ measurements," says Park.
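
The validation step itself is straightforward to sketch: average the sensor array spatially, form monthly means, and correlate against the Grace series. The arrays below are synthetic stand-ins, not the Kuroshio data, and NumPy is assumed.

import numpy as np

# Hypothetical arrays: rows = months, columns = the 43 bottom-pressure sensors,
# plus a matching monthly Grace mascon series for the center of the array.
rng = np.random.default_rng(0)
grace_monthly = rng.normal(size=24)                              # stand-in Grace series
sensor_monthly = grace_monthly[:, None] + rng.normal(scale=0.5, size=(24, 43))

# Spatially average the sensors each month, then correlate with Grace.
array_mean = sensor_monthly.mean(axis=1)
correlation = np.corrcoef(array_mean, grace_monthly)[0, 1]
print(f"Correlation of array-mean bottom pressure with Grace: {correlation:.2f}")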

"This experiment gave us the opportunity to validate the Grace data." The results of the study appeared last year in Geophysical Research Letters.

Grace's new ability to detect small changes in ocean mass - reflected in ocean bottom pressure - will help scientists answer ongoing questions about sea level and climate change. It will help clarify, for example, just how much of sea level change is due to differences in ocean mass, the result of evaporation, precipitation, melting land ice, or river run-off and how much is due to temperature and salinity.

"Now, for the first time with these new mascon solutions," says Zlotnicki, "Grace will allow us to measure changes in ocean bottom pressure globally for long periods of time. This is a new tool for oceanography."

Monday, August 31, 2009

Deforestation Abates


This is a welcome bit of information to act as an antidote to the more hysterical reports of past years. In a way it is unsurprising. Central governments need their tax revenue, and illegal cutting is all about operating outside government oversight and taxation. Obviously, if the local government cannot collect taxes on these logs and new fields, then there is slim chance it can hope to regulate the practice.

Thus the economic necessity of central governments is doing what all the laws and police can never quite do.

Slash and burn will continue until the farmers are encouraged to adopt biochar and are given homestead rights on that basis, at which point it will disappear in a hurry.

It was my lot to visit a site in the jungles of Borneo a couple of decades ago. It was situated on a small river with a good flow, a few miles inland. I saw a steady stream of logs tied up in small booms of perhaps several logs each, with a logger riding each boom down to the sea. In the river mouth, a tramp ship was collecting these logs and loading them. At best the local constabulary had speed boats thirty miles away and easier fish to fry. I got the distinct impression that no one asked too many questions.

I am sure that today the mill exists, that the tramp is no longer collecting logs, and that whoever comes down that river may even be paying taxes.

INTERVIEW-Global forest destruction seen overestimated
Fri Aug 21, 2009 3:46pm EDT

http://www.reuters.com/article/latestCrisis/idUSN2165866

By Stuart Grudgings

RIO DE JANEIRO, Aug 21 (Reuters) - The amount of carbon emissions caused by world forest destruction is likely far less than the 20 percent figure being widely used before global climate talks in December, said the head of the Brazilian institute that measures Amazon deforestation.

Gilberto Camara, the director of Brazil's respected National Institute for Space Research, said the 20 percent tally was based on poor science but that rich countries had no interest in questioning it because the number put more pressure on developing countries to stem greenhouse gases.

"I'm not in favor of conspiracy theories," Camara told Reuters in a telephone interview on Friday.

"But I should only state that the two people who like these figures are developed nations, who would like to overstress the contribution of developing nations to global carbon, and of course environmentalists."
A lower estimate for carbon emissions from deforestation would have an impact on the Copenhagen talks, where preserving forests is a top item on the agenda.

The summit will negotiate a follow-up to the Kyoto climate change treaty that could introduce forest credit trade to cut developing nation deforestation.

Camara, who stressed that he thought Brazil's deforestation rates remain too high, said recent calculations by his institute using detailed satellite data showed clearing of the world's biggest forest accounted for about 2.5 percent of annual global carbon emissions.

Given that the Amazon accounts for about a quarter of deforestation globally, a figure of about 10 percent for total emissions caused by forest destruction is likely to be more accurate, Camara said.
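
Camara's scaling is a single step, sketched here using only the two shares quoted above.

# Scale Camara's Amazon figure up to a global estimate (figures from the article).
amazon_share_of_global_emissions = 0.025       # Amazon clearing ~2.5% of annual emissions
amazon_share_of_global_deforestation = 0.25    # Amazon ~1/4 of deforestation worldwide

global_deforestation_share = (amazon_share_of_global_emissions
                              / amazon_share_of_global_deforestation)
print(f"Deforestation's share of global emissions: ~{global_deforestation_share:.0%}")
# ~10%, versus the 20% figure commonly cited before Copenhagen.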

The 20 percent figure used by the Intergovernmental Panel on Climate Change was based on calculations from sampling of forests by the United Nations Food and Agriculture Organization (FAO), he said.

The FAO method came up with an average annual figure of 31,000 sq km (12,000 sq miles) deforested in the Amazon from 2000-2005. But Brazil's method of using satellite images to measure deforestation "pixel by pixel" was far more accurate and showed a figure of 21,500 sq km for the period, Camara said.

DEFORESTATION HEADING LOWER

For 2005-2009, the FAO estimate was double the correct figure, Camara said.

"The FAO grossly overestimated deforestation in Brazil and there are papers that show that such overestimation is also true for many other countries, including of course Indonesia."

Indonesia is among the world's biggest deforesters.

Camara said he was skeptical of any deal involving Brazil being rewarded for "avoided deforestation" because the average rate of destruction remained far too high.

"Deforestation in 2004 was 27,000 sq km and let's say in 2009 it is 10,000 sq km. It is not fair to say that we avoided 17,000 sq km of deforestation in as much as our current level is still too much, and 90 percent of that is illegal," he said.

"The concept of avoided deforestation is a weak concept. It would not stand up to scrutiny."

Deforestation of the Amazon, which makes Brazil one of the biggest global carbon emitters, is on course to fall sharply in the August-to-July annual period in which it is measured. Satellite data shows that new, large deforested areas are about half the area they were in the previous year, when total deforestation was 12,000 sq km.

"We are hopeful that deforestation will go down. In areas where deforestation had been high in previous years, like Mato Grosso and Rondonia state, it is relatively under control," Camara said.

The government has taken steps to crack down on illegal deforestation over the past year. Falling deforestation may also be due to the fall in commodity prices over the past year, reducing the incentive for farmers and ranchers to clear land. (Editing by John O'Callaghan)

Monday, January 26, 2009

Superflare Superthreat

The only good thing about a super flare is that it is brief. This article is a reminder that they really exist. And it will still take a lot of time to recover services, particularly if all the transformers are fried.

Which raises the real question: how well is the system protected? This is not difficult, but it certainly costs money. It is surely not impossible to protect the transformers in particular, and those are the items that take time to replace. Breakers presumably protect the cables, even though most everything else is likely to be fried.

I doubt that any of our computers are protected. So while protecting the grid is a case of avoiding design negligence, the rest of the system needs regulatory standards.

This report is a loud warning that we have not done what common sense tells us to do. We need to pay attention. Why are our transformers and motors not simply wrapped in foil? Or is that just too cheap and brain dead easy? Of course most computers are in metal casings, which do most of the job.

However, the mere fact that over 350 main transformers are even vulnerable tells me that this issue is not on any design engineer's radar.

It is simple to put the rules in place to lower exposure, and simple obsolescence will resolve it all over twenty years. The only thing that requires immediate attention is the transformer inventory. There we are talking about Hurricane Katrina style negligence.

Severe Space Weather

January 21, 2009: Did you know a solar flare can make your toilet stop working?

That's the surprising conclusion of a NASA-funded study by the National Academy of Sciences entitled Severe Space Weather Events—Understanding Societal and Economic Impacts. In the 132-page report, experts detailed what might happen to our modern, high-tech society in the event of a "super solar flare" followed by an extreme geomagnetic storm. They found that almost nothing is immune from space weather—not even the water in your bathroom.

The problem begins with the electric power grid. "Electric power is modern society's cornerstone technology on which virtually all other infrastructures and services depend," the report notes. Yet it is particularly vulnerable to bad space weather. Ground currents induced during geomagnetic storms can actually melt the copper windings of transformers at the heart of many power distribution systems. Sprawling power lines act like antennas, picking up the currents and spreading the problem over a wide area. The most famous geomagnetic power outage happened during a space storm in March 1989, when six million people in Quebec lost power for 9 hours.
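
The mechanism behind the melted windings can be sketched with a very crude estimate: the storm drives a quasi-DC electric field along the ground, and a long transmission line integrates it into a current through the transformer neutrals. All values below are assumed for illustration.

# Rough geomagnetically induced current (GIC) estimate for a long power line.
# The E-field, line length and resistance are assumed values for illustration.
e_field_v_per_km = 5.0        # storm-time geoelectric field, assumed
line_length_km = 500.0        # long high-voltage line, assumed
loop_resistance_ohm = 5.0     # line plus transformer windings and grounding, assumed

driving_voltage = e_field_v_per_km * line_length_km          # volts along the line
quasi_dc_current = driving_voltage / loop_resistance_ohm     # amperes of quasi-DC GIC
print(f"Quasi-DC current through the transformer neutrals: ~{quasi_dc_current:.0f} A")
# Even a few hundred amps of quasi-DC bias can push a large transformer's core into
# half-cycle saturation, which is what overheats and can melt the windings.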

According to the report, power grids may be more vulnerable than ever. The problem is interconnectedness. In recent years, utilities have joined grids together to allow long-distance transmission of low-cost power to areas of sudden demand. On a hot summer day in California, for instance, people in Los Angeles might be running their air conditioners on power routed from Oregon. It makes economic sense—but not necessarily geomagnetic sense. Interconnectedness makes the system susceptible to wide-ranging "cascade failures."

To estimate the scale of such a failure, report co-author John Kappenmann of the Metatech Corporation looked at the great geomagnetic storm of May 1921, which produced ground currents as much as ten times stronger than the 1989 Quebec storm, and modeled its effect on the modern power grid. He found more than 350 transformers at risk of permanent damage and 130 million people without power. The loss of electricity would ripple across the social infrastructure with "water distribution affected within several hours; perishable foods and medications lost in 12-24 hours; loss of heating/air conditioning, sewage disposal, phone service, fuel re-supply and so on."

"The concept of interdependency," the report notes, "is evident in the unavailability of water due to long-term outage of electric power--and the inability to restart an electric generator without water on site."

http://science.nasa.gov/headlines/y2009/images/severespaceweather/collapse.jpg


Above: What if the May 1921 superstorm occurred today? A US map of vulnerable transformers with areas of probable system collapse encircled. A state-by-state map of transformer vulnerability is also available. Credit: National Academy of Sciences.

The strongest geomagnetic storm on record is the Carrington Event of August-September 1859, named after British astronomer Richard Carrington who witnessed the instigating solar flare with his unaided eye while he was projecting an image of the sun on a white screen. Geomagnetic activity triggered by the explosion electrified telegraph lines, shocking technicians and setting their telegraph papers on fire; Northern Lights spread as far south as Cuba and Hawaii; auroras over the Rocky Mountains were so bright, the glow woke campers who began preparing breakfast because they thought it was morning. Best estimates rank the Carrington Event as 50% or more stronger than the superstorm of May 1921.

"A contemporary repetition of the Carrington Event would cause … extensive social and economic disruptions," the report warns. Power outages would be accompanied by radio blackouts and satellite malfunctions; telecommunications, GPS navigation, banking and finance, and transportation would all be affected. Some problems would correct themselves with the fading of the storm: radio and GPS transmissions could come back online fairly quickly. Other problems would be lasting: a burnt-out multi-ton transformer, for instance, can take weeks or months to repair. The total economic impact in the first year alone could reach $2 trillion, some 20 times greater than the costs of a Hurricane Katrina or, to use a timelier example, a few TARPs.

What's the solution? The report ends with a call for infrastructure designed to better withstand geomagnetic disturbances, improved GPS codes and frequencies, and improvements in space weather forecasting. Reliable forecasting is key. If utility and satellite operators know a storm is coming, they can take measures to reduce damage—e.g., disconnecting wires, shielding vulnerable electronics, powering down critical hardware. A few hours without power is better than a few weeks.

NASA has deployed a fleet of spacecraft to study the sun and its eruptions. The Solar and Heliospheric Observatory (SOHO), the twin STEREO probes, ACE, Wind and others are on duty 24/7. NASA physicists use data from these missions to understand the underlying physics of flares and geomagnetic storms; personnel at NOAA's Space Weather Prediction Center use the findings, in turn, to hone their forecasts.

At the moment, no one knows when the next super solar storm will erupt. It could be 100 years away or just 100 days. It's something to think about the next time you flush.