Friday, September 4, 2009

Solar Energy Conversion Sprint




As mentioned, this is a sprint race in the labs to pass the fifty percent conversion mark. Forty-three percent is pretty well down the road.

What no one bothers to mention is that commercial product is still running a marathon. The best is shipping with conversion levels closer to fifteen percent. See the last story.

In fact, we cannot be too optimistic that this will improve soon. After all, we have been trying for years to make it better. We see unending lab improvements but a very slow translation into deliverable hardware.

I would like to see more focus on structured quantum waveguides to see if that can change things.

The good news is that once you set aside the effort to maximize efficiency and turn to lowering costs, we are seeing progress. Nanosolar likely achieves efficiencies of around 10 to 13%, but its print methods are bringing costs down to under $1.00 per watt. So we need 400 square miles of desert instead of 100 square miles of desert. No one will care, because the cost is right and any improvement is easily implemented on a plug-and-play basis.
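As a rough sanity check on that land-area claim, required collector area scales inversely with conversion efficiency. Here is a minimal back-of-envelope sketch in Python, assuming the 43 percent lab figure and the middle of Nanosolar's 10 to 13 percent range; the 100-square-mile baseline is simply the number quoted above, not an independent estimate.

# Collector area scales roughly as 1/efficiency for the same energy output.
# The 100-square-mile baseline is the figure quoted above for high-efficiency cells.
lab_efficiency = 0.43          # record multi-junction lab cell
printed_efficiency = 0.11      # mid-range of Nanosolar's ~10-13% printed cells
baseline_area_sq_miles = 100

ratio = lab_efficiency / printed_efficiency
print(f"Area ratio: {ratio:.1f}x")                                          # ~3.9x
print(f"Printed-cell area: {baseline_area_sq_miles * ratio:.0f} sq miles")  # ~390, i.e. roughly 400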

The real sprint race is in the real world as the race is on to supply solar power at $1.00 per watt to everyone.

We are entering a new energy world in which we have a huge national grid, solar roofing and siding perhaps, geothermal and ample wind power where needed, and small nuclear where necessary to supply a central heat supply.
There are three separate stories appended.

Australians Break Solar Power Record

August 25, 2009

In a record reminiscent of a 100-meter dash, scientists at the University of New South Wales in Sydney, Australia, have created the world's most efficient solar power cell ever...by a hair.

Professor Martin Green and his colleague Anita Ho-Baillie led a team of U.S. researchers to victory with a multi-cell combination that is able to convert 43 percent of sunlight into electricity. The previous record was 42.7 percent.

To capture light at the red and infrared end of the spectrum, the researchers threw everything into the cells--gallium, phosphorus, indium, and arsenic, plus silicon. While a bunch of the semiconductors used are expensive, the scientists did raise the efficiency bar.

Ho-Baillie and Green broke a different solar record with a silicon solar cell last October. If they continue to combine their efficient cells with technology from the folks at the National Renewable Energy Lab and Emcore, maybe they'll make ones that can convert 50 percent. I can't wait for the sunny day when that happens.

New solar cell efficiency record set

By Noel McKeegan
22:36 January 26, 2009 PST

New world record solar cell (Image: Fraunhofer ISE)

Researchers at the Fraunhofer Institute for Solar Energy Systems ISE have set a new record for solar cell efficiency. Using concentrated sunlight on a specially constructed multi-junction solar cell, the research group led by Frank Dimroth has achieved 41.1% efficiency for the conversion of sunlight into electricity.

The breakthrough, which surpasses the 40.7 percent efficiency previously demonstrated by Spectrolab, involved the use of sunlight concentrated by a factor of 454 and focused onto a small 5 mm² multi-junction solar cell made out of GaInP/GaInAs/Ge (gallium indium phosphide, gallium indium arsenide on a germanium substrate). Even at a higher sunlight concentration of 880, an efficiency of 40.4% was measured.
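For a sense of scale, the electrical output of that tiny cell can be estimated from the quoted numbers. A small illustrative calculation, assuming a standard one-sun irradiance of 1000 W/m² (a textbook test-condition value, not stated in the article):

# Estimate the output of a 5 mm^2 cell at 454x concentration and 41.1% efficiency.
# The 1000 W/m^2 one-sun irradiance is an assumed standard test value.
one_sun_w_per_m2 = 1000.0
concentration = 454
efficiency = 0.411
cell_area_m2 = 5e-6            # 5 mm^2

incident_power = one_sun_w_per_m2 * concentration * cell_area_m2
electrical_power = incident_power * efficiency
print(f"Incident power on cell: {incident_power:.2f} W")   # ~2.27 W
print(f"Electrical output: {electrical_power:.2f} W")      # ~0.93 W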

“We are elated by this breakthrough,” says Frank Dimroth, head of the group “III-V – Epitaxy and Solar Cells” at Fraunhofer ISE. “At all times the entire team believed in our concept of the metamorphic triple-junction solar cells and our success today is made possible only through their committed work over the past years.”

Multi-junction solar cells combine semiconductor compounds in layers to absorb almost all of the solar spectrum. The problem is that in combining these materials in a process known as metamorphic growth, defects occur in the lattice structure making it difficult to grow the III-V semiconductor layers with a high crystal quality. The Fraunhofer ISE researchers have overcome this issue by discovering a way to localize these defects in a region of the solar cell that is not electrically active, meaning that the active regions stay relatively defect free and higher efficiencies can be achieved.

“The high efficiencies of our solar cells are the most effective way to reduce the electricity generation costs for concentrating PV systems,” says Dr. Andreas Bett, Department Head at Fraunhofer ISE. “We want that photovoltaics becomes competitive with conventional methods of electricity production as soon as possible. With our new efficiency results, we have moved a big step further towards achieving this goal!”

Via: Fraunhofer Institute for Solar Energy Systems ISE (http://www.ise.fraunhofer.de/)

Suntech Claims New World Record in Silicon Panel Efficiency

The Fraunhofer Institute verifies that a Suntech Power multicrystalline silicon panel has beaten Sandia’s record. Suntech intends to have a 300MW capacity to produce its new Pluto cells in 2010.

Suntech Power said Wednesday it now holds the world record for producing the most efficient multicrystalline silicon panels, beating a record previously held by Sandia National Laboratories.

A panel sporting the company's newly developed Pluto cells was able to convert 15.6 percent of the sunlight that strikes it into electricity, Suntech said.

The Fraunhofer Institute of Solar Energy Systems in Germany, one of the few labs in the world whose test results are recognized by the industry, verified the efficiency of the panel. The panel rolled off a new factory line China-based Suntech set up to start shipping Pluto panels earlier this year.

The new record will be included by the science journal Progress in Photovoltaics (PIP), which periodically publishes a list of record-holding efficiencies for different types of solar cells and panels.

"Improving the conversion efficiency of multicrystalline silicon modules has proven particularly challenging and this is a very impressive achievement for such a large module from a commercial supplier," said Martin Green, research director of the ARC Photovoltaics Centre of Excellence at the University of New South Wales in Australia, in a statement.

"I can confirm that the 15.6% multicrystalline module result is the highest known conversion efficiency measured by a PIP-recognized test center," added Green, who is on the journal's committee.

Suntech's efficiency number isn't much higher than the 15.5 percent record previously held by Sandia. But Suntech contends its panel could have surpassed 16 percent if it were tested without its frame, as was the case with Sandia's panel.

The new record is a boost to Suntech's plan to market panels assembled with Pluto cells, which it developed with technology licensed from the University of New South Wales. Suntech's founder and CEO Zhengrong Shi taught at the university for years.

The university holds the world record for silicon cells made in a lab, which were tested by Sandia and yielded 25 percent efficiency. Cells made in the labs tend to be able to achieve higher efficiencies than those from commercial production lines.

The Pluto technology focuses on improving the cell's ability to trap light to boost electricity production. Pluto cells also use copper instead of silver for their collector and bus lines, which act as highways for transporting the electricity produced by the cells.

Silver is the common material for these lines, but it can be pricey. Copper has similar conductivity but is cheaper. Suntech also uses less copper to further reduce cost, said Steve Chan, Suntech's chief strategy officer, in an interview. Chan declined to disclose Pluto's manufacturing costs.

Pluto can be used to make either monocrystalline or multicrystalline cells. The technology has produced monocrystalline cells with close to 19 percent efficiency and multicrystalline cells over 17 percent, Suntech said.

In general, monocrystalline cells are more expensive to make partly because growing single-crystal silicon is more time consuming and energy intensive, but they yield higher efficiencies. Most of the silicon panels on the market today are of the multicrystalline variety.

SunPower, in San Jose, Calif., is known for producing the most efficient monocrystalline silicon cells for the market today. It is making cells with 22.5 percent efficiency. Its panels could achieve a little over 19 percent efficiency.

Suntech started shipping Pluto panels earlier this year, but the volume has been small. Suntech is producing them at about 1 megawatt to 2 megawatts per month, Chan said.

The company expects to ship 10 megawatts to 15 megawatts of Pluto panels by the end of 2009. Pluto panels have been installed in China and Australia. Suntech is waiting for IEC and UL certification to sell them in Europe and the United States.

Suntech is ramping up its production to mass produce them in 2010, when Suntech is set to have the manufacturing capacity to produce 300 megawatts of Pluto cells per year, Chan said.

Suntech already has a 1-gigawatt capacity to produce silicon cells with an older technology, making it one of the few in the world with that much production capability.

The company plans to convert its existing lines to make Pluto products, a process that would take about three years, Chan said. Suntech has historically produced mostly multicrystalline silicon cells. Chan declined to say whether the company would shift that strategy with its Pluto lines.

Suntech is scheduled to announce its second-quarter earnings on Thursday.

Thursday, September 3, 2009

Nitrogen and the Ozone Layer


I find this particular conclusion rather odd and am curious how it is supported at all. There are two issues.

The first is that the bulk of atmospheric nitrous oxide is produced by lightning and a lot of that is at a high altitude making it easy enough to support injection into the stratosphere. The only high altitude nitrous oxide we produce would come from jet engines and it is hard to see how we could get reliable measurements to describe the effect globally.

The second is that it is a heavy molecule that combines with water vapor to form acid and heads back down to earth as precipitation sooner rather than later. For this reason, it was left out of everyone's calculations when it came to treaty making. A more likely scenario is that ammonia is oxidizing somehow as it rises to the troposphere, and this factor may have been locally increased by human activity.

Beyond that, it again raises the question of why we need to be concerned. Here it is represented that it is affecting the ozone layer, but only in a way that is naturally occurring anyway and possibly going through major peaks every time a volcano blows up. Mother Nature is set up to process this particular gas. That was never true for the refrigerants that started accumulating up there.

The advent of the universal application of biochar in agriculture will lock up unused agricultural nitrogen as it is deployed over the next few decades, and this will bring all such concerns under proper control. Nitrogen will no longer escape into the hydrological system, and that which is there will be naturally consumed. Global agriculture will be able to operate with vastly less nitrogen being manufactured on a per acre basis with the biochar protocol.

If you are unfamiliar with biochar, search my blog. Otherwise, it is safe to assume that every acre of agricultural soil will eventually have a minimum of several tons of carbon in the form of biochar applied to it. The soil benefits are singular, and the carbon naturally grabs free nitrogen ions and holds them until they are taken up by the root system.

Nitrous oxide is top destroyer of ozone layer: study

http://www.terradaily.com/reports/Nitrous_oxide_is_top_destroyer_of_ozone_layer_study_999.html


by Staff Writers
Washington (AFP) Aug 31, 2009

Nitrous oxide emissions caused by human activity have become the largest contributor to ozone depletion and are likely to remain so for the rest of the 21st century, a US study has concluded.

The study by the National Oceanic and Atmospheric Administration said efforts to reduce chlorofluorocarbons (CFCs) in the atmosphere over the past two decades were "an environmental success story.

"But manmade nitrous oxide is now the elephant in the room among ozone-depleting substances," said A. R. Ravishankara, lead author of the study, which was published Friday in the journal Science.

While nitrous oxide's role in depleting the ozone layer has been known for decades, the study marks the first time that its impact has been measured using the same methods as CFCs and other ozone depleting substances.

Emissions and production of those substances are regulated under the 1987 Montreal Protocol.

But the treaty excludes nitrous oxide, which is emitted by agricultural fertilizers, livestock manure, sewage treatment, combustion and certain other industrial processes.

Since nitrous oxide is also a greenhouse gas, the scientists said reducing emissions from manmade sources would be good for the ozone layer and help temper climate change.

Market Fears

I try not to comment on markets themselves because it quickly becomes a mug's game once you begin paying attention to detail. However, we are one full year past the 2008 market break. Since then the markets themselves have largely deleveraged. After all, brokers are never fond of extending much market debt in the best of times.

The first primary bottom shows up on the charts in October of last year. After that the market deteriorated slowly toward a second bottom, reached in the spring of this year a few percentage points lower. The joy got well spread around, and during this phase markets consolidated as everyone checked their corporate health.

Since spring we have recovered back the few percentage points to the original primary bottom established during the initial collapse. At this point information is properly flowing again and corporate numbers are slowly improving. I emphasize the word slow here for the moment. The companies are still working with their banking arrangements to restore their own core liquidity.

This process needs a bit of time, and we can also expect to see a lot of corporate debt offerings being peddled as companies fill the gaps created in their balance sheets.

So the fear mongering presently being heard is a bit of too little, too late. People see a rising market and think there is a building exposure, while there is nothing of the sort. This rebound reflects a simple recovery of confidence in oversold markets. Not everyone is participating, but those with strong balance sheets certainly are.

In the meantime, the US continues to avoid resolving the rolling foreclosure crisis by simply letting it accumulate. It will naturally reach the end of the road and we will have a massive inventory unsalable to those folks who have all lost their borrowing power. It will be a long road back, and we could well have a lost decade in terms of consumer borrowing. The consumer will also be naturally cautious for a generation because of these events.

There is enough support out there to have a strong market quarter. It just needs a confidence trigger and we seem to be getting all the negatives expressed and eliminated. In short, it is time to be bullish.

Nano Diamonds Deliver Gene Therapy


Anyone following my blog knows that I am tracking anything to do with nanoparticles of carbon. I got involved with this in the early nineties when piecing together an explanation for a wide range of empirical results derived from work on superfine soot. It was all suggestive, and it awoke an appreciation of the potential of nano-sized particles in general, and of carbon in particular.

Now we can use a speck of diamond to hold a target protein and safely inject it into a cell.

This reminds me of our work in gathering hyaluronic acid around a speck of carbon to provide a super small droplet of the active ingredient. The acid is a very large organic molecule and naturally agglomerates. This made the product small enough to penetrate skin. It was actually a neat trick, and it derived from pioneering work carried out to make artificial blood during the Korean War.

We are obviously getting better at it. This is also a method of dose delivery that would plausibly reduce losses and perhaps allow fine-tuning of the dose delivery for a lot of ordinary meds.

September 01, 2009

Nanodiamonds Safely Deliver Gene Therapy with 70 times Greater Efficiency

http://nextbigfuture.com/2009/09/nanodiamonds-safely-deliver-gene.html


https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPgPM8aEPkMPNYLTb_LgKCR19FjIGdSoiUzHZGDqVxkN9xdcQrHAEJYma5wBIAROh-pK8vTZZAk_-s2BtLoOSYBDS7CDY89nW7Diq2jnqGDcGULkspJ8iGEHyH7uVXfLQJX4UIGu8DIO4/s1600-h/nanodiamond.gif



A research team engineered surface-modified nanodiamond particles that successfully and efficiently delivered DNA into mammalian cells. The delivery efficiency was 70 times greater than that of a conventional standard for gene delivery. The new hybrid material could impact many facets of nanomedicine. The title of the ACS Nano paper is "Polymer-Functionalized Nanodiamond Platforms as Vehicles for Gene Delivery."


"A low molecular weight polymer called polyethyleneimine-800 (PEI800) currently is a commercial approach for DNA delivery," said Xue-Qing Zhang, a postdoctoral researcher in Ho's group and the paper's first author. "It has good biocompatibility but unfortunately is not very efficient at delivery. Forms of high molecular weight PEI have desirable high DNA delivery efficiencies, but they are very toxic to cells."

Multiple barriers confront conventional approaches, making it difficult to integrate both high-efficiency delivery and biocompatibility into one gene delivery system. But the Northwestern researchers were able to do just that by functionalizing the nanodiamond surface with PEI800.

The combination of PEI800 and nanodiamonds produced a 70 times enhancement in delivery efficiency over PEI800 alone, and the biocompatibility of PEI800 was preserved. The process is highly scalable, which holds promise for translational capability.

The researchers used a human cervical cancer cell line called HeLa to test the efficiency of gene delivery using the functionalized nanodiamonds. Glowing green cells confirmed the delivery and insertion into the cells of a "Green Fluorescent Protein (GFP)"-encoding DNA sequence. This served as a demonstrative model of how specific disease-fighting DNA strands could be delivered to cells. As a platform, the nanodiamond system can carry a broad array of DNA strands.

Regarding toxicity measurements, cellular viability assays showed that low doses of the toxic high-molecular-weight PEI resulted in significant cell death, while doses of nanodiamond-PEI800 that were three times higher than that of the high-molecular-weight PEI revealed a highly biocompatible complex.

Ho and his research team originally demonstrated the application of nanodiamonds for chemotherapeutic delivery and subsequently discovered that the nanodiamonds also are extremely effective at delivering therapeutic proteins. Their work further has shown that nanodiamonds can sustain delivery while enhancing their specificity as well.

Gene therapy holds great promise for treating diseases ranging from inherited disorders to acquired conditions and cancers. Nonetheless, because a method of gene delivery that is both effective and safe has remained elusive, these successes were limited. Functional nanodiamonds (NDs) are rapidly emerging as promising carriers for next-generation therapeutics with demonstrated potential. Here we introduce NDs as vectors for in vitro gene delivery via surface-immobilization with 800 Da polyethyleneimine (PEI800) and covalent conjugation with amine groups. We designed PEI800-modified NDs exhibiting the high transfection efficiency of high molecular weight PEI (PEI25K), but without the high cytotoxicity inherent to PEI25K. Additionally, we demonstrated that the enhanced delivery properties were exclusively mediated by the hybrid ND−PEI800 material and not exhibited by any of the materials alone.
This platform approach represents an efficient avenue toward gene delivery via DNA-functionalized NDs, and serves as a rapid, scalable, and broadly applicable gene therapy strategy.

Wednesday, September 2, 2009

Nuclear Mass Production

We are getting further down the road to having mass-produced nuclear reactors much smaller than the large plants built decades ago. This is also good news, as we are on the verge of a massive grid build-out that will be needed to support the coming automotive power market.

Having many reactors situated in zones of high demand is becoming necessary, and this is no longer a market reserved for natural gas. Also, small reactors are far less intrusive and may even be better integrated.

The industry has obviously shaken itself of the mega project mindset and has seen its future in having many small facilities.

I will make one other observation. These devices should be built around a thermal supply market, which means not out in the country. In a city center, production of hot water can be fed directly into the surrounding infrastructure. In effect the hot water is free and the city itself provides the needed heat sink.

This is a certain improvement over a thermal plant using natural gas as is done in Vancouver city center. There a small nuclear plant would nicely provide city center power and city center heat for the large buildings. All the necessary infrastructure is already built to handle plug and play.

Again, mass production will also drive costs down, and soon cities with ambition but insufficient size will also be building.


August 31, 2009

Sandia Designing Factory Mass Producible Right Sized Reactor

http://nextbigfuture.com/2009/08/sandia-designing-factory-mass.html

https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNR5KEZhblZKI60a2izEPGR04qteI_A9aoOEY6PNr9XZwa9-RQiPTvUdYaraSFSrjpCuu3PhiZqCLGUxkpEHBUDEifJ3kIj4pTkBIGzDTDq-n1fKEPEN4GmCIpbAA5HpLVDTk65XAvtmg/s1600-h/sandia1.jpg


Tom Sanders, Vice President/President Elect American Nuclear Society, is promoting "Global Energy Needs: Defining a Role for a “Right Sized Reactor” [32 page pdf]

There is some conflicting information between the 32-page presentation and a Sandia press release. The presentation (May 2009) talks about a variety of reactor technologies including light water, gas cooled, liquid metal cooled (breeders), and molten salt. The Sandia press release talks only about a breeder reactor with a design that is 85% completed. There are no specifics about the 85% completed design other than that it is proliferation resistant and has integrated safety.

The presentation talks about a goal of $1500/kW for construction, and the press release about a financial target of 5 cents per kWh. Those are price levels that are already being achieved in China. China is already making progress toward factory-built modular nuclear reactors (230 modules for the Toshiba/Westinghouse AP1000) and factory mass-produced pebble bed reactors [a pebble bed would be right-sized at 200 MWe]. China's HTR-PM breaks ground in September 2009, and China expects to follow up with dozens of reactors as the technology and design are proven. Russia is also looking to produce a "right sized" factory mass-produced breeder reactor.


The guidelines for developing large-scale nuclear power in Russia were set out as follows early in the decade:
- Power costs not more than 3 cents/kWh
- Capital costs under US$1000/kW
- Service life at least 50 years
- Utilisation rate at least 90%

The technology future for Russia was focused on four elements:
- Serial construction of AES-2006 units
- Fast breeder BN-800
- Small and medium reactors - KLT-40 and VBER-300 (100-300 MWe)
- HTGR (High Temperature Gas Reactor)


The new reactors in China and Russia would enable low-cost manufactured nuclear reactors to be exported. There are several companies and nations proposing factory-built modular nuclear reactors.

Consistent aspects of Sandia's plan:
* Use supercritical CO2 (SC-CO2) to make a smaller turbine to convert heat to electricity
* Factory mass-produced reactors in the 100-300 MWe range
* Costs in the range of $1500/kW and 5 cents per kWh
* Proliferation resistant and exportable
* This is where the global energy market is headed
* This will displace the natural gas plants of the same size and cost

Tom led a team at Sandia which has completed 85% of the reactor core design.


A smaller scale, economically efficient nuclear reactor that could be mass-assembled in factories and supply power for a medium-size city or military base has been designed by Sandia National Laboratories. The exportable, proliferation-resistant "right-sized reactor" was conceived by a Sandia research team led by Tom Sanders. Sanders has been collaborating with numerous Sandians on advancing the small reactor concept to an integrated design that incorporates intrinsic safeguards, security and safety. This opens the way for possible exportation of the reactor to developing countries that do not have the infrastructure to support large power sources. The smaller reactor design decreases the potential need for countries to develop an advanced nuclear regulatory framework.

Incorporated into the design, said team member Gary Rochau, is what is referred to as "nuke-star," an integrated monitoring system that provides the exporters of such technologies a means of assuring the safe, secure, and legitimate use of nuclear technology.

"This small reactor would produce somewhere in the range of 100 to 300 megawatts of thermal power and could supply energy to remote areas and developing countries at lower costs and with a manufacturing turnaround period of two years as opposed to seven for its larger relatives," Sanders said. "It could also be a more practical means to implement nuclear base load capacity comparable to natural gas-fired generating stations and with more manageable financial demands than a conventional power plant."

About the size of half a fairly large office building, a right-sized reactor facility will be considerably smaller than conventional nuclear power plants in the U.S. that typically produce 3,000 megawatts of power. With approximately 85 percent of the design efforts completed for the reactor core, Sanders and his team are seeking an industry partner through a cooperative research and development agreement (CRADA). The CRADA team will be able to complete the reactor design and enhance the plant side, which is responsible for turning the steam into electricity. Team member Steve Wright is doing research using internal Sandia Laboratory Directed Research and Development (LDRD) program funding.

These smaller reactors would be factory built and mass-assembled, with potential production of 50 a year. They all would have the exact same design, allowing for quick licensing and deployment. Mass production will keep the costs down, possibly to as low as $250 million per unit. Because the right-sized reactors are breeder reactors — meaning they generate their own fuel as they operate — they are designed to have an extended operational life and only need to be refueled once every couple of decades, which helps alleviate proliferation concerns. The reactor core is replaced as a unit and "in effect is a cartridge core for which any intrusion attempt is easily monitored and detected," Sanders said. The reactor system has no need for fuel handling. Conventional nuclear power plants in the U.S. have their reactors refueled once every 18 months. The goal of the right-sized reactors is to produce electricity at less than five cents per kilowatt hour, making them economically comparable to gas turbine systems.
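The cost figures quoted above can be cross-checked against one another. Here is a minimal sketch, assuming the $1500/kW construction target and the roughly $250 million unit price refer to the same electrical rating, and borrowing the 90 percent utilisation rate from the Russian guidelines quoted earlier; none of these pairings are made explicit in the source.

# Cross-check the "right-sized reactor" cost targets quoted above.
cost_per_kw = 1500.0        # USD per kW, stated construction target
unit_cost = 250e6           # USD per mass-produced unit ("possibly as low as")

implied_rating_kw = unit_cost / cost_per_kw
print(f"Implied rating: {implied_rating_kw / 1000:.0f} MWe")   # ~167 MWe, inside the 100-300 MWe range

# Annual generation and revenue at the 5 cents/kWh goal,
# assuming a 90% capacity factor (borrowed assumption, not from the press release).
capacity_factor = 0.90
annual_kwh = implied_rating_kw * 8760 * capacity_factor
print(f"Annual generation: {annual_kwh / 1e9:.2f} billion kWh")                # ~1.31 billion kWh
print(f"Annual revenue at $0.05/kWh: ${annual_kwh * 0.05 / 1e6:.0f} million")  # ~$66 million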


https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjur6kRyIn7dy_8JKceKyYsIMXLeeWVcO6rvp0hh4WzFL5cqgiowsiwklQkmQXJG8zmqpcWWcS6x-wYYCYE7GhUB4W-V_E5m9422az6DRxOLhg_lt1JZ6zgMfUtE6bHPLAcx3Gi9GGS_ok/s1600-h/sandia5.jpg


https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiEkUWeyUWfzv6QnlSM_XE59kldhCNvR1WVSGSKlepHi8N0Xe6kecxrWvIePnJjO5eJk8svqcHDrit0uZdavf9Fom4qzTgQSnoov6JeWQXzMGfVWCd0zQzRlGbm5dPFt7KOoo7mvQkvE_M/s1600-h/sandia3.jpg



https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgafAu8iCccZqMbitf1aSbIFTGJZyXK2-81eczkltJQOJxC8rvyr4Ue98CCeZadGroRpLAVNVT9oiBDEtPiklnjI1amNWzlcK9mdm76iP8Io7uga4FMAAGj5ulBjd8ho4Xjiy3jg-v7iE/s1600-h/sandia4.jpg




Solar Linkages

Here is some more connecting of the dots in the business of climate modeling. This is hardly a reassurance that core players think the science is settled. Most important is the apparent complexity of the linked events. This is very sophisticated work and promises years of sleuthing. Surely no one thinks we are about to run out of connections to link up.

And that is the point. These types of links are seductive, yet a sudden change elsewhere might totally change all of them as we disappear into the world of fine detail.

At least the climate models must be getting more realistic.

This is still neat work and may hold up. That it shows early signs of having predictive power is very encouraging.


Scientists Uncover Solar Cycle, Stratosphere And Ocean Connections

by Staff Writers
Boulder CO (SPX) Sep 01, 2009


http://www.spacedaily.com/reports/Scientists_Uncover_Solar_Cycle_Stratosphere_And_Ocean_Connections_999.html


Subtle connections between the 11-year solar cycle, the stratosphere, and the tropical Pacific Ocean work in sync to generate periodic weather patterns that affect much of the globe, according to research appearing in the journal Science. The study can help scientists get an edge on eventually predicting the intensity of certain climate phenomena, such as the Indian monsoon and tropical Pacific rainfall, years in advance.

An international team of scientists led by the National Center for Atmospheric Research (NCAR) used more than a century of weather observations and three powerful computer models to tackle one of the more difficult questions in meteorology: if the total energy that reaches Earth from the Sun varies by only 0.1 percent across the approximately 11-year solar cycle, how can such a small variation drive major changes in weather patterns on Earth?
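To put that 0.1 percent in perspective, here is a small back-of-envelope calculation of the forcing involved, assuming a total solar irradiance of about 1361 W/m² and a planetary albedo of about 0.3; these round numbers are standard values, not figures from the article.

# Size of a 0.1% solar-cycle variation, expressed as a globally averaged forcing.
# TSI and albedo values are assumed round numbers, not from the article.
tsi = 1361.0        # W/m^2, total solar irradiance at top of atmosphere
variation = 0.001   # 0.1% peak-to-trough over the ~11-year cycle
albedo = 0.3

delta_toa = tsi * variation                      # change in incoming sunlight
delta_absorbed = delta_toa / 4.0 * (1 - albedo)  # averaged over the sphere, minus reflected share
print(f"Change at top of atmosphere: {delta_toa:.2f} W/m^2")             # ~1.36
print(f"Globally averaged absorbed change: {delta_absorbed:.2f} W/m^2")  # ~0.24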

The answer, according to the new study, has to do with the Sun's impact on two seemingly unrelated regions. Chemicals in the stratosphere and sea surface temperatures in the Pacific Ocean respond during solar maximum in a way that amplifies the Sun's influence on some aspects of air movement. This can intensify winds and rainfall, change sea surface temperatures and cloud cover over certain tropical and subtropical regions, and ultimately influence global weather.

"The Sun, the stratosphere, and the oceans are connected in ways that can influence events such as winter rainfall in North America," says NCAR scientist Gerald Meehl, the lead author. "Understanding the role of the solar cycle can provide added insight as scientists work toward predicting regional weather patterns for the next couple of decades."

The study was funded by the National Science Foundation, NCAR's sponsor, and by the Department of Energy. It builds on several recent papers by Meehl and colleagues exploring the link between the peaks in the solar cycle and events on Earth that resemble some aspects of La Nina events, but are distinct from them. The larger amplitude La Nina and El Nino patterns are associated with changes in surface pressure that together are known as the Southern Oscillation.

The connection between peaks in solar energy and cooler water in the equatorial Pacific was first discovered by Harry Van Loon of NCAR and Colorado Research Associates, who is a co-author of the new paper.

Top Down and Bottom Up

The new contribution by Meehl and his colleagues establishes how two mechanisms that physically connect changes in solar output to fluctuations in the Earth's climate can work together to amplify the response in the tropical Pacific.

The team first confirmed a theory that the slight increase in solar energy during the peak production of sunspots is absorbed by stratospheric ozone. The energy warms the air in the stratosphere over the tropics, where sunlight is most intense, while also stimulating the production of additional ozone there that absorbs even more solar energy.

Since the stratosphere warms unevenly, with the most pronounced warming occurring at lower latitudes, stratospheric winds are altered and, through a chain of interconnected processes, end up strengthening tropical precipitation.

At the same time, the increased sunlight at solar maximum causes a slight warming of ocean surface waters across the subtropical Pacific, where Sun-blocking clouds are normally scarce. That small amount of extra heat leads to more evaporation, producing additional water vapor. In turn, the moisture is carried by trade winds to the normally rainy areas of the western tropical Pacific, fueling heavier rains and reinforcing the effects of the stratospheric mechanism.

The top-down influence of the stratosphere and the bottom-up influence of the ocean work together to intensify this loop and strengthen the trade winds. As more sunshine hits drier areas, these changes reinforce each other, leading to less clouds in the subtropics, allowing even more sunlight to reach the surface, and producing a positive feedback loop that further magnifies the climate response.

These stratospheric and ocean responses during solar maximum keep the equatorial eastern Pacific even cooler and drier than usual, producing conditions similar to a La Nina event. However, the cooling of about 1-2 degrees Fahrenheit is focused farther east than in a typical La Nina, is only about half as strong, and is associated with different wind patterns in the stratosphere.

Earth's response to the solar cycle continues for a year or two following peak sunspot activity. The La Nina-like pattern triggered by the solar maximum tends to evolve into a pattern similar to El Nino as slow-moving currents replace the cool water over the eastern tropical Pacific with warmer water. The ocean response is only about half as strong as with El Nino and the lagged warmth is not as consistent as the La Nina-like pattern that occurs during peaks in the solar cycle.

Enhancing Ocean Cooling

Solar maximum could potentially enhance a true La Nina event or dampen a true El Nino event. The La Nina of 1988-89 occurred near the peak of solar maximum. That La Nina became unusually strong and was associated with significant changes in weather patterns, such as an unusually mild and dry winter in the southwestern United States.

The Indian monsoon, Pacific sea surface temperatures and precipitation, and other regional climate patterns are largely driven by rising and sinking air in Earth's tropics and subtropics. Therefore the new study could help scientists use solar-cycle predictions to estimate how that circulation, and the regional climate patterns related to it, might vary over the next decade or two.

Three Views, One Answer

To tease out the elusive mechanisms that connect the Sun and Earth, the study team needed three computer models that provided overlapping views of the climate system. One model, which analyzed the interactions between sea surface temperatures and lower atmosphere, produced a small cooling in the equatorial Pacific during solar maximum years.

The second model, which simulated the stratospheric ozone response mechanism, produced some increases in tropical precipitation but on a much smaller scale than the observed patterns. The third model contained ocean-atmosphere interactions as well as ozone. It showed, for the first time, that the two combined to produce a response in the tropical Pacific during peak solar years that was close to actual observations.

"With the help of increased computing power and improved models, as well as observational discoveries, we are uncovering more of how the mechanisms combine to connect solar variability to our weather and climate," Meehl says.

Bjorn Lomborg on Technology

In this article, Bjorn Lomborg bemoans the reality that two decades of political posturing and treaty manufacture have accomplished zero. And let us consider that. Reducing our dependence on carbon combustion was simply difficult and is still difficult. It costs more, even with today's much improved pricing. That it costs more is a major difficulty. So long as carbon-based fuels are cheaper than alternatives, or simply more convenient, they will naturally be used first.

That a universal tax might change the economics is true, except that it has to be universal. No nation can stand outside such a rule. That is the ongoing problem faced by humanity: no one wants to address the need to create universal commissions of the environment with the authority to develop, implement and oversee best practice.

Common action will solve just about everything else. It may sound simplistic, but valid cheap solutions are lying around unimplemented. A lot of it is simply human inertia. After all, if your engineering career has been about building landfills, then all waste solutions are landfill projects. What is more, you have the reputation and political support all locked down.

The good news, provided one has patience, is that better solutions will eventually find their champions and be made operational.

I have told you about acid rain in a pipe. The science is all known. The engineering components are all known. One engineer will someday bravely get a small operation going on an individual smokestack to learn his business. This will take three to five years before everyone is satisfied. This will then lead to a single large application being built out. It too will operate for five years, allowing proper papers to be presented at the proper engineering conferences. Then, about ten years on, several groups will use the technology and build out several separate plants. We then have well-established technology that possibly becomes universal.

I have seen this happen with SAGD in the oil business and in many other situations. The process is so slow as to be almost invisible.

The only thing that can speed this up is crisis.

We are heading for an oil supply crisis. That means the drive to alternatives will become frantic. Some think it has already begun, although that is not true. Somehow we are continuing to sustain present levels of oil production. I do not think that we can lose any more production right now.

A supply crisis has already been prepared for over the past year, as every shrewd operator has ramped up his capacity to replace hydrocarbons. It will not be enough, but the onslaught of investment in alternatives will help us turn the corner quickly.

A swift switch away from carbon dependence, which is now setting up, will make Kyoto obsolete.

Technology Can Fight Global Warming

Marine cloud whitening, and other ideas.

We have precious little to show for nearly 20 years of efforts to prevent global warming. Promises in Rio de Janeiro in 1992 to cut carbon emissions went unfulfilled. Stronger pledges in Kyoto five years later failed to keep emissions in check. The only possible lesson is that agreements to reduce carbon emissions are costly, politically arduous and ultimately ineffective.

But this is a lesson many are hell-bent on ignoring, as politicians plan to gather again—this time in Copenhagen, Denmark, in December—to negotiate a new carbon-emissions treaty. Even if they manage to bridge their differences and sign a deal, there is a strong likelihood that tomorrow's politicians will fail to deliver.

Global warming does not just require action; it requires effective action. Otherwise we are just squandering time.

To inform the debate, the Copenhagen Consensus Center has commissioned research looking at the costs and benefits of all the policy options. For example, internationally renowned climate economist Richard Tol of Ireland's Economic and Social Research Institute finds that a low carbon tax of $2 a metric ton (1.2 tons U.S.) is the only carbon reduction policy that would make economic sense. But his research demonstrates the futility of trying to use carbon cuts to keep temperature increases under 2 degrees Celsius (3.6 degrees Fahrenheit), which many argue would avoid the worst of climate change's impacts.

Some economic models find that target impossible to reach without drastic action, like cutting the world population by a third. Other models show that achieving the target by a high CO2 tax would reduce world GDP a staggering 12.9% in 2100—the equivalent of $40 trillion a year.
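Those two figures together imply a projected size for the 2100 world economy, which is worth making explicit; the sketch below just takes their ratio.

# The quoted GDP-loss figures imply a projected 2100 world economy size.
gdp_loss_fraction = 0.129    # 12.9% of world GDP in 2100
gdp_loss_dollars = 40e12     # $40 trillion per year

implied_world_gdp = gdp_loss_dollars / gdp_loss_fraction
print(f"Implied 2100 world GDP: ${implied_world_gdp / 1e12:.0f} trillion per year")  # ~$310 trillion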

Some may claim that global warming will be so terrible that a 12.9% reduction in GDP is a small price to pay. But consider that the majority of economic models show that unconstrained global warming would cost rich nations around 2% of GDP and poor countries around 5% by 2100.

Even those figures are an overstatement. A group of climate economists at the University of Venice led by Carlo Carraro looked closely at how people will adapt to climate change. Their research for the Copenhagen Consensus Center showed that farmers in areas with less water for agriculture could use more drip irrigation, for example, while those with more water will grow more crops.

Taking a variety of natural, so-called market adaptations into account, the Carraro research shows we will acclimatize to the negative impacts of global warming and exploit the positive changes, actually creating a 0.1% increase in GDP in 2100 among the member countries of the Organization for Economic Cooperation and Development. In poor countries, market adaptation will reduce climate change-related losses to 2.9% of GDP. This remains a significant, negative effect. The real challenge of global warming lies in tackling its impact on the Third World. Yet adaptation has other positive benefits. If we prepare societies for more ferocious hurricanes in the future, we also help them to cope better with today's extreme weather.

This does not mean, however, that we should ignore rising greenhouse-gas emissions. Research for the Copenhagen Consensus Center by Claudia Kemfert of the German Institute for Economic Research in Berlin shows that in terms of reducing climate damage, reducing methane emissions is cheaper than reducing CO2 emissions, and—because methane is a much shorter-living gas—its mitigation could do a lot to prevent some of the worst of short-term warming. Other research papers highlight the advantages of planting more trees and protecting the forests we have to absorb CO2 and cut greenhouse gases.

Other more speculative approaches deserve consideration. In groundbreaking research, J. Eric Bickel, an economist and engineer at the University of Texas, and Lee Lane, a researcher at the American Enterprise Institute, study the costs and benefits of climate engineering. One proposal would have boats spray seawater droplets into clouds above the sea to make them reflect more sunlight back into space—augmenting the natural process where evaporating ocean sea salt helps to provide tiny particles for clouds to form around.

Remarkably, Mr. Bickel finds that about $9 billion spent developing this so-called marine cloud whitening technology might be able to cancel out this century's global warming. The benefits—from preventing the temperature increase—would add up to about $20 trillion.
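Taken at face value, those two numbers imply an enormous benefit-to-cost ratio; a one-line check using only the figures quoted above:

# Benefit-to-cost ratio implied by the marine cloud whitening figures above.
research_cost = 9e9         # ~$9 billion to develop the technology
estimated_benefit = 20e12   # ~$20 trillion in avoided warming damages

print(f"Implied benefit/cost ratio: about {estimated_benefit / research_cost:,.0f} to 1")  # ~2,222 to 1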

Climate engineering raises ethical concerns. But if we care most about avoiding warmer temperatures, we cannot avoid considering a simple, cost-effective approach that shows so much promise.

Nothing short of a technological revolution is required to end our reliance on fossil fuel—and we are not even close to getting this revolution started. Economists Chris Green and Isabel Galiana from McGill University point out that nonfossil sources like nuclear, wind, solar and geothermal energy will—based on today's availability—get us less than halfway toward a path of stable carbon emissions by 2050, and only a tiny fraction of the way towards stabilization by 2100.

A high carbon tax will simply hurt growth if alternative technology is not ready, making us all worse off. Mr. Green proposes that policy makers abandon carbon-reduction negotiations and make agreements to seriously invest in research and development. Mr. Green's research suggests that investing about $100 billion annually in noncarbon based energy research could result in essentially stopping global warming within a century or so.

A technology-led effort would have a much greater chance of actually tackling climate change. It would also have a much greater chance of political success, since countries that fear signing on to costly emission targets are more likely to embrace the cheaper, smarter path of innovation.

Cutting emissions of greenhouse gases is not the only answer to global warming. Next week, a group of Nobel Laureate economists will gather at Georgetown University to consider all of the new research and identify the solutions that are most effective. Hopefully, their results will influence debate and help shift decision makers away from a narrow focus on one, deeply flawed response to global warming.

Our generation will not be judged on the brilliance of our rhetoric about global warming, or on the depth of our concern. We will be judged on whether or not we stop the suffering that global warming will cause. Politicians need to stop promising the moon, and start looking at the most effective ways to help planet Earth.

Mr. Lomborg teaches at the Copenhagen Business School and is director of the Copenhagen Consensus Center. He is the author of "Cool It: The Skeptical Environmentalist's Guide to Global Warming" (Knopf, 2007.)


Tuesday, September 1, 2009

XCOR Sustained Firing Begins


This appears to be proceeding well. One of the wonders of modern design technology is that so much can be simulated in the computer before cutting metal. Thus we have astonishingly swift product development, or at least appear to. I am sure those who are hands-on do not think so.


We are now headed for sustained firing. We are getting there, and will be flying this engine soon.

XCOR Aerospace Reaches Several Significant Milestones in the Lynx 5K18 Rocket Engine Test Program


http://origin.ih.constantcontact.com/fs046/1102502633005/img/44.jpg?a=1102689679183


September 02, 2009, Mojave, CA: XCOR Aerospace announced today that it has reached several significant milestones in the 5K18 rocket engine test program. This is the engine that powers XCOR's Lynx suborbital spacecraft. The engine can be seen running in
several newly released videos including a video demonstrating the very stable "shock diamond" pattern visible in the engine's supersonic exhaust.

"Like all of our rocket engines, this engine has demonstrated the ability to be stopped and re-started using our safe and reliable spark torch ignition system", said XCOR CEO Jeff Greason. "The basic cooling design has also been completed and the engine is able to run continuously at thermal equilibrium.


With those milestones reached, the 5K18 test program is now moving forward into a second phase of tuning and optimization, in which we will also greatly increase our cumulative run time."



Data and test results from the Lynx engine program are being used by XCOR and certain customers to develop a deeper understanding of operationally responsive spacelift procedures. These procedures can then be applied to future rocket powered vehicles. XCOR and its customers now have important information that will aid in the development of the unique requirements of operationally responsive high performance manned and unmanned rocket systems.


Testing of the 5K18 rocket engine is continuing in parallel with several other key Lynx system components, including wind tunnel testing at AFRL facilities and development of the Lynx pressure cabin at XCOR's main facilities in Mojave, CA.


"These additional firings and milestones continue to demonstrate XCOR's ability to deliver safe and truly innovative rocket propulsion technology that will one day revolutionize space access by enhancing readiness levels for flight from years to days or even hours, and driving down costs and increasing safety by orders of magnitude", said XCOR Chief Operating Officer, Andrew Nelson.XCOR Aerospace is a California corporation located in Mojave, California. The company is in the business of developing and producing safe, reliable and reusable rocket powered vehicles, propulsion systems, advanced non-flammable composites and other enabling technologies for responsive private space flight, scientific missions, upper atmospheric research, and small satellite launch to low earth orbit. Its web address is:
www.xcor.com. Advanced ticket sales have already commenced at www.rocketshiptours.com

Young Ganges


Another geographic consideration arising from the book by Prithvi Raj on the historical content of Indian scriptures is the representation that the Ganges arose in its present form about a thousand years or so after the primary event I have called the Pleistocene Nonconformity.

There was a time when, confronted with such a proposition, I would have dismissed it out of hand. Then I discovered that one could get there from here.

So the question right off is: can we reconcile this with our model, as we have reconciled the submergence of the Maldive Archipelago in the Indian Ocean? Again, this part of the crust was accommodating compression, and the prior mountain building provided a natural fault system able to accomplish this. So it is fair to assume that the mountains rose several thousands of feet at least, if not a lot more. After all, they are presently the tallest in the world and are comparable only to their near equivalent in the Andes, on the same arc and in the same effective position.

This produced a valley sub-parallel to the newly raised mountain range and the possibly upheaved Tibetan plateau. Massive precipitation at the eastern end of the range, north of Burma, began cutting several major river systems, including the Mekong. Water began filling the valley and draining westward. This valley breaks out close to Pakistan and enters the present plain of the Ganges.

Prior to a final breakout, the valley was certainly blocked in multiple locations with massive landslides and intrusive structures. It is quite plausible that the primary blockage was a weak landslide that allowed the accumulation of water. It is thus reasonable to assume that the water accumulated for centuries behind this natural blockage. In fact it is highly plausible considering the incredible terrain itself. A thousand or so years of water accumulation is entirely credible.

I also note that the geology of the area is incredibly young. I have reviewed a photograph of a major sheer cliff in those mountains that showed little accumulation of scree yet was formed from sediments. That combination was implausible for ancient terrain and demanded a recent genesis. In short, if those mountains were a million years old, the valleys would be choked with the material that is constantly coming off those mountains today. They are not particularly choked at all.

That such a water buildup took place appears almost inevitable. That its release would be dramatic is also inevitable as the rushing waters would swiftly scour out the valley bottom for its full length. The amount of water held could have approached that of a small great lake and produced a huge flow that took weeks to subside.

We thus have a mechanism for producing the river bottom of the Ganges as a one time event. Before this happened, the rains had already created an active riverine system flowing off the front of the Himalayas and possibly producing several rivers flowing along the Gangetic plain to its huge delta. When the impounded waters began their release, a huge torrent proceeded to scour out a deep and broad valley down through this established sedimentary plain. In short, it is possible to provide a valid explanation for the abrupt rise of the Ganges that fits the cultural record.

It should be easy to piece together the geologic record, and someone should already have had his eyes open because of the note in the cultural record.

I also note that before the event of the nonconformity, a barrier mountain range is reported to have cut India across the center in an east-west direction. There was an equally important plain north of this barrier. If we simply look to the Himalayas as the original barrier range, its location is solved. This means that the cultural sources were describing a much larger geographic area than supposed, where the southern portion fell south of the equator and the northern portion, plausibly as large, fell north of the equator.

I would like to note that Prithvi Raj’s historical reconstruction and my own work are arriving at the same conclusions from very different directions and are agreeing very well. The knowledge of the crustal shift and the derivative knowledge of the impact along the arc of maximum movement on local crustal curvature is easily providing guidance in understanding the reported events. More important, the right things are happening in the right place.

It is significant to observe that populations still survived although coastal damage must have climbed into the hills. It also becomes plausible that populations of herders began to migrate into these broken lands quickly to escape suddenly long bitter winters in their own homelands.

Space Debris Tamed




The problem has been quantified better, and we have a doable project that is able to harvest the objects that are in fact critical, or at least I assume so. This suggests that objects not in this mix are at least survivable. Otherwise it is a good plan and puts hardware in orbit able to act as the local fire department.

This is certainly a better scenario than the ones previously espoused, which were based on little good data. A solution is available and it is cost effective. Losses from inaction will exceed the cost of implementation, and if that is true then a common program needs to be put to work, perhaps paid for on a per-pound recovery charge so no one can squabble over whose fault it is.

I suspect that it will take some time for all this to happen, but the ability to charge costs back to the source programs will bring interest levels up. The problem is measurable and users can calculate their liability. This at least makes it a sufficiently solvable problem.

Debris - Problem Solved

http://www.spacedaily.com/reports/Space_Debris_Problem_Solved_999.html

by Launchspace Staff

Bethesda MD (SPX) Aug 31, 2009
There is no doubt that the topic of "space debris" is hot! It is a hot subject at
NASA, DARPA, Air Force Space Command, ESA and in the board rooms of all commercial satellite operators. High anxiety is running rampant among these groups. Every debris mitigation technique has been reviewed and pursued. New satellites must have the ability to either de-orbit or move out of the way at end-of-mission.

Upper stages must vent tanks to rid them of residual propellant that might later result in explosions. Many satellites are maneuvered to avoid close-conjunction events. JSpOC is beefing up its satellite and debris tracking capabilities. National and international working groups are meeting regularly to assess the threat and to recommend actions for all space-faring nations. The world is just one major satellite collision event away from panic.

Instances of close conjunction events in highly congested orbital bands have increased dramatically in the past few years. In fact, the frequency of close encounters between active satellites and large debris objects within the Iridium constellation has reached a frighteningly high level. Odds are that there will be another Iridium/Cosmos type of event in the near future.

Should such an event occur, several bad things will happen to many satellite operators. If another Iridium satellite is involved the company would be forced to replace the lost satellite. The frequency of close encounters in orbits near that of Iridium's constellation would suddenly increase to levels that would cause several operators to reassess the viability of existing space applications.

Satellite insurance providers might be forced to raise premiums on in-orbit performance to record high levels. Future launch plans for almost all low orbit satellites may be curtailed. Space-based services to the world would diminish over time. The economic impact is not even calculable. This is scary!

Not to fear. A solution is on the way.

Although space debris proliferation presents a long-term challenge that will require a long-term solution, the immediate problem is quite bounded. A study of debris distribution reveals the near-term troubled zone to be a spherically symmetric region between the altitudes of 700 km and 900 km.

This is where a great many operational satellites and large debris objects co-exist. Thus, the near-term challenge appears to be the removal of enough large debris objects in order to reduce collision risks to levels consistent with statistical times-between-debris-collisions that are much higher than expected satellite mission lifetimes.

Sounds simple, but it is not! Seems impossible, but it is not! So, what will it take to do the job?

Simply stated, all affected parties must collaborate and contribute to create a massive new space effort. There are literally well over 1,000 large debris objects that pose an immediate threat. Every one of these can be removed, and there are a number of removal techniques. One approach, as an example, would be to develop specially designed "Debris Collection Spacecraft."

Each DCS would be capable of maneuvering and rendezvousing with several objects, one at a time. Each object may be stored for later de-orbit, or fitted with an autonomous de-orbit unit that slows the object's orbital speed. If each DCS can deal with 100 objects, assuming only 1,000 objects need to be removed, the job will require 10 DCSs. This whole removal operation must be transparent to commercial, civil and security satellite operators.

In order to be effective, the removal program needs to start yesterday, because it will take several years before actual removal operations can begin. We don't have a lot of time here. If each of 100 objects being collected by one DCS takes three days of maneuvering to reach, then each DCS would require roughly 10 months to achieve its mission. However, it is likely that the DCSs will require in-orbit refueling after each 10 rendezvous completions.

The total mission span for each DCS seems to be roughly one year. If the program is started immediately, it could be completed in about five or six years. The program cost is estimated at $3 billion, based on developing the DCS, on-orbit refueling vehicles and operations, building 10 DCSs and one to two years of ground operations. This is cheap compared to the cost of not doing it.
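To make the article's arithmetic easy to check, here is a minimal back-of-the-envelope sketch in Python. The figures (1,000 objects, 100 objects per DCS, three days of maneuvering per object, refueling after every 10 rendezvous) are taken straight from the article; the days lost per refueling stop is my own assumed placeholder, included only to show how the roughly one-year mission span falls out.

# Back-of-the-envelope sizing of the debris-removal campaign described above.
# Campaign figures come from the article; REFUEL_DOWNTIME_DAYS is an assumption.
TOTAL_OBJECTS = 1_000          # large debris objects to remove
OBJECTS_PER_DCS = 100          # capacity of one Debris Collection Spacecraft
DAYS_PER_RENDEZVOUS = 3        # maneuvering time to reach each object
RENDEZVOUS_PER_REFUEL = 10     # in-orbit refueling after every 10 objects
REFUEL_DOWNTIME_DAYS = 5       # assumed days lost per refueling stop

dcs_needed = TOTAL_OBJECTS // OBJECTS_PER_DCS
maneuver_days = OBJECTS_PER_DCS * DAYS_PER_RENDEZVOUS              # 300 days
refuel_stops = OBJECTS_PER_DCS // RENDEZVOUS_PER_REFUEL            # 10 stops
mission_days = maneuver_days + refuel_stops * REFUEL_DOWNTIME_DAYS

print(f"DCS vehicles needed : {dcs_needed}")                       # 10
print(f"Maneuvering alone   : {maneuver_days} days (~10 months)")
print(f"With refueling stops: {mission_days} days (~1 year)")

Running this reproduces the article's rough numbers: ten spacecraft, about ten months of pure maneuvering each, and roughly a year per vehicle once refueling is included.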

For all those who are concerned and interested in the space debris crisis, your first step is to get smart on the issues and possible solutions. This is where Launchspace can help. If you are involved in space flight or want to better understand the new space crisis, you will want to sign up for the "must take" seminar on the subject, October 27th in Washington, DC.

Oceanic Mascons


One day after reporting on the apparent existence of a switching mechanism in the ocean's heat flux, we get this. It is the one major variable that we would like to see closely mapped, and here we see that becoming possible.

A shift in volumes of deep-sea cold water from one locale to the next would be really important. That it could actually explain the known sudden switching of global climate conditions matters because none of the other options work well at all.

Left on its own, the northern climate would warm up to a pleasant level warmer than today and stay there indefinitely, as demonstrated by the Bronze Age optimum.

So I return to the hypothesis that what we are dealing with is a mature equilibrium between the two hemispheres that became fully operational about three thousand years ago. Prior to that, two changes were working to completion. The first was that the northern ice age was completing the process of deglaciation: the melt waters were mixing with the ocean waters, and the ocean itself was reaching a stable level of warmth. This was largely done perhaps as soon as 5,000 BP, and certainly by 3,000 BP.

The second event was that the southern polar cap was expanding to a new stable configuration, possibly reached around 3,000 BP. What began then was a cycle of deep-sea cold water being injected periodically into the Northern Hemisphere in order to balance the two hemispheres.

These cycles are not properly mapped as yet, but the indicators show some form of minor variation that occurs over a thirty- to fifty-year cycle or so. If this is generated mostly in the Pacific then it could be pretty benign. I think that we may have a similarly sized event occasionally happening in the Atlantic, which would be several times more pronounced.

Anyway, we are much closer to having the measuring tools.

New Look At Gravity Data Sheds Light On Ocean And Climate

http://www.spacedaily.com/reports/New_Look_At_Gravity_Data_Sheds_Light_On_Ocean_And_Climate_999.html


by Rosemary Sullivant
Pasadena CA (SPX) Aug 28, 2009

A discovery about the moon made in the 1960s is helping researchers unlock secrets about Earth's ocean today. By applying a method of calculating gravity that was first developed for the moon to data from NASA's Gravity Recovery and Climate Experiment, known as Grace, JPL researchers have found a way to measure the pressure at the bottom of the ocean.

Just as knowing atmospheric pressure allows meteorologists to predict winds and weather patterns, measurements of ocean bottom pressure provide oceanographers with fundamental information about currents and global ocean circulation. They also hold clues to questions about sea level and climate.

"Oceanographers have been measuring ocean bottom pressure for a long time, but the measurements have been limited to a few spots in a huge ocean for short periods of time," says JPL oceanographer Victor Zlotnicki.

Launched in 2002, the twin Grace satellites map Earth's gravity field from orbit 500 kilometers (310 miles) above the surface. They respond to how mass is distributed in the Earth and on Earth's surface - the greater the mass in a given area, the stronger the pull of gravity from that area.

The pressure at the bottom of the ocean is determined by the amount of mass above it. "Ocean bottom pressure is the sum of the weight of the whole atmosphere and the whole ocean," says Zlotnicki. "When winds move water on the surface, ocean bottom pressure changes. When glaciers melt and add water to the ocean, the ocean's mass increases and bottom pressure increases, either at one place or globally."
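Zlotnicki's description amounts to a simple hydrostatic relation: bottom pressure is atmospheric pressure plus the weight per unit area of the overlying water column. A minimal Python illustration, using representative rather than measured values:

# Hydrostatic sketch of ocean bottom pressure; constants are typical values, not data.
RHO_SEAWATER = 1025.0      # kg/m^3, typical seawater density
G = 9.81                   # m/s^2, gravitational acceleration
P_ATMOSPHERE = 101_325.0   # Pa, standard sea-level atmospheric pressure

def bottom_pressure(depth_m: float) -> float:
    """Bottom pressure in pascals: atmosphere plus the weight of the water column."""
    return P_ATMOSPHERE + RHO_SEAWATER * G * depth_m

print(f"{bottom_pressure(4000.0) / 1e6:.1f} MPa at 4,000 m depth")
# Adding one meter of water (e.g. from melting land ice) raises bottom pressure by:
print(f"{RHO_SEAWATER * G * 1.0:.0f} Pa per added meter of water")

This is why shifting water around with winds, or adding meltwater globally, shows up directly as a bottom-pressure change.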

"Measuring ocean bottom pressure was one of the things we said we wanted to do from the very beginning of the mission," says Grace project scientist Michael Watkins, "but it has been a challenge. The signal is very small and hard to detect."

Gravity changes over the ocean are miniscule compared to those over land. The ocean is a fluid. It yields to pressure and spreads the effect over a vast area. Nothing in the ocean gives as big a gravity signal as a flooding Amazon River or melting glaciers in Greenland or Alaska, changes that Grace can measure fairly easily, says Watkins. "Those hydrology signals are huge in comparison," he says.

However, as the mission progressed, Watkins explains, the science team has found better ways to process Grace data. And by turning to a technique developed for the lunar world, Grace researchers are getting the precise measurements of ocean bottom pressure they were hoping for.

From the moon to the ocean bottom

In the days leading up to the Apollo missions, JPL scientists discovered that certain areas of the moon had higher concentrations of mass than others. The result of these "mass concentrations" was marked differences in the moon's gravity field.

The researchers then devised a new way to calculate the gravity field called a "mascon" (for mass concentration) solution. Mascon solutions break the gravity field into small, individual regions. The more traditional ways of computing gravity, often called harmonic solutions, smooth everything together and calculate gravity for a whole large area or body.

Recently scientists have begun developing mascon solutions for Grace data for use in a variety of studies, and they are revealing fascinating new details about Earth's gravity field. These mascon solutions are also proving to be a key to Grace's ability to measure ocean bottom pressure.

"Some of the very best harmonic solutions show some bottom pressure signals, but the mascon solutions appear to do a better job and provide much higher resolution," says Watkins.

"Using a mascon solution with Grace data is a way of weighing each little piece of the ocean," he says. The result is a new view of the gravity field - one that reveals sharp contrasts in gravity precise enough to calculate variations in ocean bottom pressure.

A large field experiment off the coast of Japan provided an unusual and welcome opportunity to put Grace mascon estimates of ocean bottom pressure to the test. There are few places in the ocean where there are enough data on ocean bottom pressure to validate the satellite's observations.

Oceanographer Jae-Hun Park and his colleagues at the University of Rhode Island compared the Grace measurements with data collected by a large array of pressure-reading instruments stationed on the ocean bottom as part of the Kuroshio Extension System Study. This two-year observational program to study deep ocean currents and fronts ran from 2004 to 2006.

"Our site covered a very wide area of 600 by 600 kilometers (370 miles) with 43 available bottom pressure sensors," says Park. He and his colleagues found that while some of the individual sensors had very high correlations with Grace measurements, others were very low. "These low correlations were small-scale eddies that Grace cannot catch," explains Park. Grace's resolution is about 200 kilometers (125 miles).

However, when they compared the spatially averaged monthly mean ocean bottom pressure measured by the ocean sensors with the latest JPL Grace mascon solution for the center of the array, "we found a high correlation between the Grace measurements and our in-situ measurements," says Park.

"This experiment gave us the opportunity to validate the Grace data." The results of the study appeared last year in Geophysical Research Letters.

Grace's new ability to detect small changes in ocean mass - reflected in ocean bottom pressure - will help scientists answer ongoing questions about sea level and climate change. It will help clarify, for example, just how much of sea level change is due to differences in ocean mass, the result of evaporation, precipitation, melting land ice, or river run-off and how much is due to temperature and salinity.

"Now, for the first time with these new mascon solutions," say Zlotnicki, "Grace will allow us to measure changes in ocean bottom pressure globally for long periods of time. This is a new tool for oceanography."

Monday, August 31, 2009

Deforestation Abates


This is a welcome bit of information that acts as an antidote to the more hysterical reports of past years. In a way it is unsurprising. Central governments need their tax revenue, and illegal cutting is all about operating outside government oversight and taxation. Obviously, if the local government cannot collect taxes on these logs and new fields, then there is slim chance it can hope to regulate the practice.

Thus the economic necessity of central governments is doing what all the laws and police can never quite do.

Slash and burn will continue until the farmers are encouraged to adopt biochar and are given homestead rights on that basis, at which point it will disappear in a hurry.

It was my lot to visit a site in the jungles of Borneo a couple of decades ago. It was situated on a small river with a good flow a few miles inland. I saw a steady stream of logs tied up in small booms of perhaps several logs each, with a logger riding each boom down to the sea. At the river mouth, a tramp ship was collecting and loading these logs. At best, the local constabulary had speedboats thirty miles away and easier fish to fry. I got the distinct impression that no one asked too many questions.

I am sure that today a mill exists, that the tramp is no longer collecting logs, and that whoever comes down that river may even be paying taxes.

INTERVIEW-Global forest destruction seen overestimated
Fri Aug 21, 2009 3:46pm EDT

http://www.reuters.com/article/latestCrisis/idUSN2165866

By Stuart Grudgings

RIO DE JANEIRO, Aug 21 (Reuters) - The amount of carbon emissions caused by world forest destruction is likely far less than the 20 percent figure being widely used before global climate talks in December, said the head of the Brazilian institute that measures Amazon deforestation.

Gilberto Camara, the director of Brazil's respected National Institute for Space Research, said the 20 percent tally was based on poor science but that rich countries had no interest in questioning it because the number put more pressure on developing countries to stem greenhouse gases.

"I'm not in favor of conspiracy theories," Camara told Reuters in a telephone interview on Friday.

"But I should only state that the two people who like these figures are developed nations, who would like to overstress the contribution of developing nations to global carbon, and of course environmentalists."
A lower estimate for carbon emissions from deforestation would have an impact on the Copenhagen talks, where preserving forests is a top item on the agenda.

The summit will negotiate a follow-up to the Kyoto climate change treaty that could introduce forest credit trade to cut developing nation deforestation.

Camara, who stressed that he thought Brazil's deforestation rates remain too high, said recent calculations by his institute using detailed satellite data showed clearing of the world's biggest forest accounted for about 2.5 percent of annual global carbon emissions.

Given that the Amazon accounts for about a quarter of deforestation globally, a figure of about 10 percent for total emissions caused by forest destruction is likely to be more accurate, Camara said.

The 20 percent figure used by the Intergovernmental Panel on Climate Change was based on calculations from sampling of forests by the United Nations Food and Agriculture Organization (FAO), he said.

The FAO method came up with an average annual figure of 31,000 sq km (12,000 sq miles) deforested in the Amazon from 2000-2005. But Brazil's method of using satellite images to measure deforestation "pixel by pixel" was far more accurate and showed a figure of 21,500 sq km for the period, Camara said.
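As a quick sanity check of the figures quoted above, the short sketch below reproduces the two simple calculations: scaling the Amazon's roughly 2.5 percent share of global emissions by its roughly one-quarter share of global deforestation, and comparing the FAO and satellite-based area estimates. The numbers are exactly those in the article; the code only does the arithmetic.

# Reproduce the two simple calculations quoted in the article.
amazon_share_of_emissions = 0.025      # Amazon clearing ~ 2.5% of global carbon emissions
amazon_share_of_deforestation = 0.25   # Amazon ~ one quarter of global deforestation

global_deforestation_share = amazon_share_of_emissions / amazon_share_of_deforestation
print(f"implied emissions share of all deforestation: {global_deforestation_share:.0%}")  # ~10%

fao_estimate_km2 = 31_000        # FAO average annual Amazon deforestation, 2000-2005
satellite_estimate_km2 = 21_500  # Brazil's pixel-by-pixel satellite figure
print(f"FAO overestimate factor: {fao_estimate_km2 / satellite_estimate_km2:.2f}x")        # ~1.44x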

DEFORESTATION HEADING LOWER

For 2005-2009, the FAO estimate was double the correct figure, Camara said.

"The FAO grossly overestimated deforestation in Brazil and there are papers that show that such overestimation is also true for many other countries, including of course Indonesia."

Indonesia is among the world's biggest deforesters.

Camara said he was skeptical of any deal involving Brazil being rewarded for "avoided deforestation" because the average rate of destruction remained far too high.

"Deforestation in 2004 was 27,000 sq km and let's say in 2009 it is 10,000 sq km. It is not fair to say that we avoided 17,000 sq km of deforestation in as much as our current level is still too much, and 90 percent of that is illegal," he said.

"The concept of avoided deforestation is a weak concept. It would not stand up to scrutiny."Deforestation of the Amazon, which makes Brazil one of the biggest global carbon emitters, is on course to fall sharply in the August-to-July annual period in which it is measured.
Satellite data shows that new, large deforested areas are about half the area they were in the previous year, when total deforestation was 12,000 sq km.

"We are hopeful that deforestation will go down. In areas where deforestation had been high in previous years, like Mato Grosso and Rondonia state, it is relatively under control," Camara said.

The government has taken steps to crack down on illegal deforestation over the past year. Falling deforestation may also be due to the fall in commodity prices over the past year, reducing the incentive for farmers and ranchers to clear land. (Editing by John O'Callaghan)

Monday, January 26, 2009

Superflare Superthreat

The only good thing about a super flare is that it is brief. This article is a reminder that such flares really exist. Even so, it will still take a lot of time to recover services, particularly if all the transformers are fried.

This truly raises the question of how well the system is protected. Protection is not difficult, but it certainly costs money. It is surely not impossible to protect the transformers in particular, and those are the components that take time to replace. Breakers surely protect the cables, even though most everything else is likely to be fried.

I doubt that any of our computers are protected. So while protecting the grid is a case of avoiding design negligence, the rest of the system needs regulatory standards.

This report is a loud warning that we have not done what common sense tells us to do. We need to pay attention. Why are our transformers and motors not simply wrapped in foil? Or is that just too cheap and brain-dead easy? Of course most computers are in metal casings, which do most of the job.

However, the mere fact that more than 350 main transformers are even vulnerable tells me that this issue is not on any design engineer's radar.

It is simple to put rules in place to lower exposure, and simple obsolescence will resolve it all over twenty years. The only thing that requires immediate attention is the transformer inventory. There we are talking about Hurricane Katrina-style negligence.

Severe Space Weather

January 21, 2009: Did you know a solar flare can make your toilet stop working?

That's the surprising conclusion of a NASA-funded study by the National Academy of Sciences entitled Severe Space Weather Events—Understanding Societal and Economic Impacts. In the 132-page report, experts detailed what might happen to our modern, high-tech society in the event of a "super solar flare" followed by an extreme geomagnetic storm. They found that almost nothing is immune from space weather—not even the water in your bathroom.

The problem begins with the electric power grid. "Electric power is modern society's cornerstone technology on which virtually all other infrastructures and services depend," the report notes. Yet it is particularly vulnerable to bad space weather. Ground currents induced during geomagnetic storms can actually melt the copper windings of transformers at the heart of many power distribution systems.

Sprawling power lines act like antennas, picking up the currents and spreading the problem over a wide area. The most famous geomagnetic power outage happened during a space storm in March 1989 when six million people in Quebec lost power for 9 hours.

According to the report, power grids may be more vulnerable than ever. The problem is interconnectedness. In recent years, utilities have joined grids together to allow long-distance transmission of low-cost power to areas of sudden demand. On a hot summer day in California, for instance, people in Los Angeles might be running their air conditioners on power routed from Oregon. It makes economic sense—but not necessarily geomagnetic sense. Interconnectedness makes the system susceptible to wide-ranging "cascade failures."

To estimate the scale of such a failure, report co-author John Kappenmann of the Metatech Corporation looked at the great geomagnetic storm of May 1921, which produced ground currents as much as ten times stronger than the 1989 Quebec storm, and modeled its effect on the modern power grid. He found more than 350 transformers at risk of permanent damage and 130 million people without power. The loss of electricity would ripple across the social infrastructure with "water distribution affected within several hours; perishable foods and medications lost in 12-24 hours; loss of heating/air conditioning, sewage disposal, phone service, fuel re-supply and so on."

"The concept of interdependency," the report notes, "is evident in the unavailability of water due to long-term outage of electric power--and the inability to restart an electric generator without water on site."

http://science.nasa.gov/headlines/y2009/images/severespaceweather/collapse.jpg


Above: What if the May 1921 superstorm occurred today? A US map of vulnerable transformers with areas of probable system collapse encircled. Credit: National Academy of Sciences.

The strongest geomagnetic storm on record is the Carrington Event of August-September 1859, named after British astronomer Richard Carrington who witnessed the instigating solar flare with his unaided eye while he was projecting an image of the sun on a white screen. Geomagnetic activity triggered by the explosion electrified telegraph lines, shocking technicians and setting their telegraph papers on fire; Northern Lights spread as far south as Cuba and Hawaii; auroras over the Rocky Mountains were so bright, the glow woke campers who began preparing breakfast because they thought it was morning. Best estimates rank the Carrington Event as 50% or more stronger than the superstorm of May 1921.

"A contemporary repetition of the Carrington Event would cause … extensive social and economic disruptions," the report warns. Power outages would be accompanied by radio blackouts and satellite malfunctions; telecommunications, GPS navigation, banking and finance, and transportation would all be affected. Some problems would correct themselves with the fading of the storm: radio and GPS transmissions could come back online fairly quickly. Other problems would be lasting: a burnt-out multi-ton transformer, for instance, can take weeks or months to repair. The total economic impact in the first year alone could reach $2 trillion, some 20 times greater than the costs of a Hurricane Katrina or, to use a timelier example, a few TARPs.

What's the solution? The report ends with a call for infrastructure designed to better withstand geomagnetic disturbances, improved GPS codes and frequencies, and improvements in space weather forecasting. Reliable forecasting is key. If utility and satellite operators know a storm is coming, they can take measures to reduce damage—e.g., disconnecting wires, shielding vulnerable electronics, powering down critical hardware. A few hours without power is better than a few weeks.

NASA has deployed a fleet of spacecraft to study the sun and its eruptions. The Solar and Heliospheric Observatory (SOHO), the twin STEREO probes, ACE, Wind and others are on duty 24/7. NASA physicists use data from these missions to understand the underlying physics of flares and geomagnetic storms; personnel at NOAA's Space Weather Prediction Center use the findings, in turn, to hone their forecasts.

At the moment, no one knows when the next super solar storm will erupt. It could be 100 years away or just 100 days. It's something to think about the next time you flush.