Plants unpack winter coats when days get shorter

Michael Thomashow, University Distinguished Professor of molecular genetics, studies how plants adapt to freezing and drought. (Credit: Photo by Kurt Stepnitz)

Mechanisms that protect plants from freezing are placed in storage during the summer and wisely unpacked when days get shorter.

In the current issue of the Proceedings of the National Academy of Sciences, Michael Thomashow, University Distinguished Professor of molecular genetics, demonstrates how the CBF (C-repeat binding factor) cold response pathway is inactive during warmer months when days are long, and how it’s triggered by waning sunlight to prepare plants for freezing temperatures.

The CBF cold response pathway was discovered by Thomashow’s team, and it has been shown to be active in crop species as they ready themselves for cold weather.

“We knew that when plants are exposed to cold, nonfreezing temperatures, they can better survive below-freezing temperatures,” said Thomashow, who co-authored the study with Chin-Mei Lee, MSU plant biologist. “What this new research demonstrates, though, is that plants’ defense mechanisms are also triggered by shortening daylight.”

It’s widely known that waning daylight triggers trees’ defenses against freezing, but this has never been demonstrated in crops and other annual plants. The paper not only shows that such plants use shorter days as a cue for the impending winter, but that the mechanism also is turned off during the warm growing season.

“The CBF pathway is actively turned off during the summer to prevent the allocation of precious resources toward unneeded frost protection,” Thomashow said.

Identifying the genes involved in this process gives researchers the potential tools to fine-tune this regulation and increase crop productivity, he added.

Thomashow’s research is supported in part by the U.S. Department of Energy (Division of Chemical Sciences, Geosciences, and Biosciences, Office of Basic Energy Sciences), the National Science Foundation (Plant Genome Project) and MSU AgBioResearch.

 

Journal Reference:

  1. Chin-Mei Lee and Michael F. Thomashow. Photoperiodic regulation of the C-repeat binding factor (CBF) cold acclimation pathway and freezing tolerance in Arabidopsis thaliana. PNAS, 2012; DOI: 10.1073/pnas.1211295109

Twin satellites will help improve space weather forecasts

An artist's rendering shows the twin Radiation Belt Storm Probes (RBSP) satellites in tandem orbit above the Earth. (Credit: NASA)

On Aug. 24, NASA will launch two identical satellites from Cape Canaveral, Fla., to begin its Radiation Belt Storm Probes (RBSP) mission to study the extremes of space weather and help scientists improve space weather forecasts.

Why should you care?

Because, says a University of Iowa space physics researcher, if you've ever used a cell phone, traveled by plane, or stayed up late to catch a glimpse of the northern lights, then you have been affected by space weather without even knowing about it.

Scientists want to better understand how the Van Allen radiation belts — named after UI astrophysicist James A. Van Allen — react to solar changes, thereby contributing to Earth's space weather. Changes in space weather can disable satellites, overload power grids, and disrupt GPS service.

In addition, coronal mass ejections (CMEs) periodically release billions of tons of charged particles from the sun into space. And, with the 11-year solar cycle expected to peak in 2013, there is an increased potential for CME-caused power surges to knock out electric transformers that support lighting, heating, air conditioning, sewage treatment, and many other necessities of daily life.

Space weather storms are made up of gusts of electrically charged particles — atoms that have been stripped of electrons — that constantly flow outward from the sun. When these particles reach Earth, some become trapped in Earth's magnetosphere to form the Van Allen radiation belts, two donut-shaped regions that encircle Earth. The RBSP mission will collect data on particles, magnetic and electric fields, and waves to reveal how the belts change in space and over time.

Craig Kletzing [KLET-zing], F. Wendell Miller Professor of physics and astronomy in the University of Iowa College of Liberal Arts and Sciences, is the principal investigator for the UI team that designed the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS). One of five RBSP instrument pairs, or suites, EMFISIS is a $30 million NASA project to study how space radiation forms and changes during space storms.

The other four instrument suites are directed by teams from the University of New Hampshire, the University of Minnesota, the New Jersey Institute of Technology, and the National Reconnaissance Office. The two RBSP spacecraft — each weighing 1,455 pounds — were constructed for NASA at the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Md.

Says Kletzing: "The Radiation Belt Storm Probes is actually the first NASA mission to be launched in more than two decades that's going back to revisit the radiation belts since they were discovered by the late University of Iowa professor James A. Van Allen over 50 years ago. There are still lots of things we don't understand about how they work, about how the sun delivers energy to the local environment around the Earth, and particularly about how it creates these two bands of very energetic particles that we call the radiation belts."

Like many other NASA projects, the Radiation Belt Storm Probes mission has two main reasons for existing: it will gather practical information and it is a part of humanity's continuing exploration of space.

"The practical reason is: that's a part of space that we utilize. The outer radiation belts are where all our communication satellites exist, the various things that make sure that GPS works, as well as telephone communications," Kletzing says. "They can be affected by these particles, and, in fact, it has happened that those satellites have actually been knocked out by radiation.

"So, understanding these effects and how they happen, and hopefully getting to where we can do some level of prediction, is a very important practical reason.

"Additionally, the various manned missions that NASA has planned to go beyond the Space Station to places like the moon or Mars also require transiting through this region," he says. "So, understanding the right time to go — when the particles are fewest so that you don't impact human health — is a very important thing to understand."

The less practical reason for undertaking the RBSP project is familiar to mountain climbers and other explorers — because it's there.

"We want to know how the heck the darn thing works," he says. "We've learned from science over the years that you can't always predict that one thing you learn here will influence another field and allow whole breakthroughs to occur. So it's really both. The practical, direct reasons, but also if we understand the physics of the radiation belts, that helps us understand physics in other stellar systems and all sorts of other phenomena that are related."

Here's how the RBSP project will work:

  • One rocket will launch two satellites.
  • The two satellites will orbit from about 300 miles above Earth's surface out to as far as 25,000 miles at apogee.
  • The satellites will be given slightly different orbits so that over time, one will run ahead of the other.
  • They will fly nearly identical orbits that cover the entire radiation belt region, lapping each other during the course of the two-year mission.
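As a rough cross-check on these figures, the orbital period implied by the quoted perigee and apogee altitudes can be worked out from Kepler's third law. This is a sketch: the Earth constants are standard assumed values, and the quoted 25,000-mile apogee is taken at face value.

```python
import math

# Kepler's third law applied to the quoted altitudes (300 mi perigee,
# 25,000 mi apogee). Earth's radius and gravitational parameter are
# standard assumed values, not figures from the article.
MU = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH_KM = 6371.0          # mean Earth radius, km
MI_TO_KM = 1.609344

r_perigee = R_EARTH_KM + 300 * MI_TO_KM        # km, from Earth's center
r_apogee = R_EARTH_KM + 25_000 * MI_TO_KM      # km, from Earth's center
a = 0.5 * (r_perigee + r_apogee) * 1000        # semi-major axis, m

period_s = 2 * math.pi * math.sqrt(a**3 / MU)
print(round(period_s / 3600, 1))               # orbital period in hours
```

The computation is only as good as the quoted altitudes; a smaller apogee gives a proportionally shorter period.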

"We talk about one spacecraft lapping the other," says Kletzing. "What makes that exciting is that both spacecraft are exactly the same — all the same sets of measurements are on the two different spacecraft. So, for the first time, we'll have completely identical sets of instruments on both sides that we can compare between the two satellites. And actually say, 'Oh, this is happening here, and that's happening there.' Maybe they work together or maybe they're different things. But we've never had a pair of identical spacecraft in this region before."

The RBSP project is a collaborative effort between research teams at several universities, with the Johns Hopkins Applied Physics Laboratory having constructed the spacecraft. The UI's EMFISIS instruments will measure the various kinds of waves the spacecraft will encounter.

At the UI, Kletzing, together with UI collaborator and co-investigator Bill Kurth, built the Waves and search coil magnetometer sections for the EMFISIS investigation. The UI also worked with the Goddard Space Flight Center, which built a magnetometer as a part of the UI instrument suite. Also, the University of New Hampshire provided the computer that controls all of the EMFISIS measurements. So, the hardware part of the UI project is actually a three-institution collaboration, Kletzing says, and the theory and modeling teams at UCLA and Los Alamos National Laboratory bring the total collaboration to five institutions.

It's not surprising that the UI is helping give scientists a clearer picture of space weather in the Van Allen radiation belts. The study of space weather really began at Iowa in 1958, when UI space physicist James A. Van Allen discovered the radiation belts using data from Explorer 1, the first successful U.S. spacecraft. Van Allen's discovery improved our understanding of Earth and the solar system and created a new field of research called magnetospheric physics.

The RBSP mission is part of NASA's Living With a Star program, an effort to learn more about the sun-Earth interaction, which is managed by Goddard Space Flight Center, Greenbelt, Md. APL built the RBSP spacecraft and will manage the mission for NASA. More information on RBSP is available at rbsp.jhuapl.edu and www.nasa.gov/rbsp.

Flood risk ranking reveals vulnerable cities

Shanghai skyline at dusk. (Credit: © chungking / Fotolia)

A new study of nine coastal cities around the world suggests that Shanghai is the most vulnerable to serious flooding, while European cities top the rankings for resilience.

These findings are based on a new method for calculating the flood vulnerability of cities, developed by a team of researchers from the Netherlands and the University of Leeds. The work is published in the latest edition of the journal Natural Hazards.

The index does not just look at the likelihood of a city's exposure to a major 'once in a hundred years' flood. The researchers have been careful to include social and economic factors in their calculations too.

The index incorporates 19 components, including measures of the level of economic activity in a city, its speed of recovery, and social issues such as the number of flood shelters, the awareness of people about flood risks, and the number of disabled people in the population. Several index components also look at the level of administrative involvement in flood management.
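As a sketch of how such a composite index can work, the snippet below combines normalized indicator scores into a weighted average. The component names, scores and equal weights are invented for illustration; the paper's actual 19 indicators and aggregation scheme are not reproduced here.

```python
# Hypothetical sketch of a composite vulnerability index. The indicator
# names, scores and equal weighting below are illustrative assumptions,
# not the scheme used in the Natural Hazards paper.

def vulnerability_index(indicators, weights=None):
    """Combine normalized indicator scores (each in [0, 1], higher means
    more vulnerable) into a single weighted-average index."""
    if weights is None:
        weights = {name: 1.0 for name in indicators}   # equal weighting
    total_weight = sum(weights[name] for name in indicators)
    return sum(indicators[name] * weights[name] for name in indicators) / total_weight

# Invented scores echoing component types the article mentions
city = {
    "exposure_100yr_flood": 0.9,   # exposure to a 1-in-100-year flood
    "economic_activity": 0.8,      # concentration of economic activity at risk
    "recovery_speed": 0.7,         # slow recovery -> higher vulnerability
    "flood_shelters": 0.8,         # shortage of shelters -> higher vulnerability
}
print(round(vulnerability_index(city), 2))  # 0.8
```

Unequal weights would let the index emphasize, say, social components over economic ones; the paper's point is precisely that both enter the calculation.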

The researchers used their index to analyse the vulnerability to coastal flooding of nine cities built on river deltas: Casablanca (Morocco), Calcutta (India), Dhaka (Bangladesh), Buenos Aires (Argentina), Osaka (Japan), Shanghai (China), Manila (Philippines), Marseille (France) and Rotterdam (the Netherlands).

The results of the analysis reveal that the highly prosperous megacity of Shanghai, in China, is more vulnerable than much poorer cities such as Dhaka in Bangladesh.

"Vulnerability is a complex issue," explains Professor Nigel Wright, who led the team from the University of Leeds. "It is not just about your exposure to flooding, but the effect it actually has on communities and business and how much a major flood disrupts economic activity. Our index looks at how cities are prepared for the worst — for example, do they have flood defences, do they have buildings that are easy to clean up and repair after the flood? It is important to know how quickly a city can recover from a major flood."

Shanghai is particularly vulnerable because it is exposed to powerful storm surges and its land is subsiding as sea levels rise. Moreover, a large population lives in flood-prone areas along the coast, yet the city is poorly prepared, with little resilience to a major flood and insufficient flood shelters for victims.

"A 1-in-100 year flood in Shanghai would lead to widespread damage, with serious consequences for the city, across China and, through wider economic links, for the whole world," Professor Wright comments.

The vulnerability index also revealed that Dhaka, which sits just metres above current sea level, is regularly hit by tropical cyclones and floods, yet has few defences in place and little resilience. Manila in the Philippines and Calcutta in India are also highly vulnerable, largely because of their large populations and degree of exposure to storms.

The European cities of Marseille and Rotterdam are also exposed to flood risks, with violent storms, high river levels and significant low-lying areas. But these cities are the least vulnerable, thanks to good flood management infrastructure and tight building regulations for flood-prone areas, for example. "When a big flood hits you will still get flooding," says Professor Wright, "but these European cities will bounce back quickly."

The researchers also used their vulnerability index to assess how climate change would affect the vulnerability of these cities in the future. With sea levels predicted to rise over the next 100 years, the study found that Shanghai and Dhaka will remain the most vulnerable cities in 2100, although the vulnerability of all the cities will increase (doubling in the case of Manila).

"Our index provides a flexible tool for cities to explore how they are currently exposed to flooding and how this may change in the future. It will help them to prioritise their flood risk and resilience strategies," says Professor Wright.

This research was funded by the Government of the Netherlands.

 

Journal Reference:

  1. S. F. Balica, N. G. Wright, F. van der Meulen. A flood vulnerability index for coastal cities and its use in assessing climate change impacts. Natural Hazards, 2012; DOI: 10.1007/s11069-012-0234-1

Summer weather could mean fall colors pop in Northeast U.S.

The summer's dry weather and recent cool nights could combine for a colorful fall foliage season in the Northeast. (Credit: © Andrzej Tokarski / Fotolia)

The summer's dry weather and recent cool nights could combine for a colorful fall foliage season in the Northeast U.S.

"Right now, without knowing what's going to happen in the middle of October when the fall colors start to peak regionally, it looks like it's going to be a good year for fall colors," said Dr. Donald J. Leopold, a dendrologist and Distinguished Teaching Professor at the SUNY College of Environmental Science and Forestry.

Fall colors depend largely on weather conditions that occur closer to the peak foliage season, Leopold noted, but early indications are that 2012 could feature a bright fall.

A combination of factors affects fall color, including the amount of rainfall a region receives, the number of sunny days and nighttime temperatures, with cooler temperatures boosting colors. Recent lows in Central New York, where ESF is located, have dipped into the 50s.

Although homeowners might see the plants in their landscaping, particularly trees that were planted recently, suffer effects of the regional drought this summer, Leopold said plants in natural settings are less likely to be affected by dry weather.

Have Swedish forests recovered from the storm Gudrun?

In January 2005, the storm Gudrun hit Sweden, causing an estimated 2.4 billion euros of economic damage in Swedish forestry alone. But was there more damage to the forest than was clearly visible? A recently published study by Seidl and Blennow shows that Gudrun caused not only immediate damage, corresponding to 110% of Sweden's average annual harvest from only 16% of the country's forest area, but also pervasive effects in terms of growth reduction.

In recent decades, the frequency and severity of natural disturbances by, e.g., strong winds and insect outbreaks have increased considerably in many forest ecosystems around the world. Future climate change is expected to further intensify disturbance regimes, which makes addressing disturbances in ecosystem management a top priority. As a prerequisite, a broader understanding of disturbance impacts and ecosystem responses is needed. With regard to the effects of strong winds — the most detrimental disturbance agent in central and northern Europe — monitoring and management have focused on structural damage, i.e., tree mortality from uprooting and stem breakage. Effects on the functioning of trees surviving the storm (e.g., their productivity and allocation) have rarely been accounted for to date.

Seidl and Blennow show that growth reduction following the storm was significant and pervasive in a 6.79 million hectare forest landscape. Wind-related growth reduction in Norway spruce forests surviving the storm exceeded 10% in the worst hit regions. At the landscape scale, wind-related growth reduction amounted to 3.0 million m3 in the three years following Gudrun. It thus exceeds the annual long-term average storm damage from uprooting and stem breakage in Sweden and is of the same order of magnitude as the volume damaged by spruce bark beetles after Gudrun.

Seidl and Blennow conclude that the impact of strong winds on forest ecosystems is not limited to the immediately visible area of structural damage, and call for a broader consideration of disturbance effects on ecosystem structure and functioning in the context of forest management and climate change mitigation.


Journal Reference:

  1. Rupert Seidl, Kristina Blennow. Pervasive Growth Reduction in Norway Spruce Forests following Wind Disturbance. PLoS ONE, 2012; 7 (3): e33301 DOI: 10.1371/journal.pone.0033301

Major world interests at stake in Canada's vast Mackenzie River Basin

The Mackenzie is Canada’s longest river — about 1,800 km — and pours a staggering 10.3 million liters (enough to fill four Olympic swimming pools) into the Arctic Ocean every second, along with 100 million tons of sediment per year. That’s slightly more than the St. Lawrence River discharges into the Atlantic, experts estimate. (Credit: Walter & Duncan Gordon Foundation)

The governance of Canada's massive Mackenzie River Basin holds enormous national but also global importance due to the watershed's impact on the Arctic Ocean, international migratory birds and climate stability, say experts convening a special forum on the topic.

"Relevant parties in western Canada have recognized the need for a multi-party transboundary agreement that will govern land and water management in the Mackenzie River watershed. Successful collaboration will effectively determine the management regime for a watershed covering 1.8 million square kilometers or about 20 percent of Canada — an area roughly three times the size of France — and include the country's vast oil sands," says University of California Prof. Henry Vaux, Chair of the Rosenberg Forum, which meets Sept. 5-7 at Vancouver's Simon Fraser University with the support of the Walter and Duncan Gordon Foundation.

The Forum's goals include identifying legal and scientific principles relevant to the processes leading ultimately to a coordinated basin-wide approach to management, as well as prioritizing knowledge gaps.

The Mackenzie is Canada's longest river — about 1,800 km — and pours a staggering 10.3 million liters (enough to fill four Olympic swimming pools) into the Arctic Ocean every second, along with 100 million tons of sediment per year. That's slightly more water than the St. Lawrence River discharges into the Atlantic, according to estimates.
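The quoted discharge figure can be sanity-checked against the standard Olympic-pool volume (nominally 50 m x 25 m x 2 m, i.e. 2.5 million liters; that pool volume is an assumption here, not from the article):

```python
# Quick arithmetic check of the figures quoted above.
discharge_l_per_s = 10.3e6          # liters per second, as quoted
pool_l = 50 * 25 * 2 * 1000         # nominal Olympic pool volume in liters
pools_per_second = discharge_l_per_s / pool_l
print(round(pools_per_second, 1))   # 4.1, roughly "four pools per second"
```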

The Mackenzie Basin includes three major lakes (Great Slave, Great Bear and Athabasca, which together contain almost 4,000 cubic km of water) and many major rivers, including the Peace, Athabasca, Liard, Hay, Peel, South Nahanni and Slave.

Complex challenges confront this immense territory rich in natural assets, which include intact forests, vital habitat for wildlife and for birds that migrate as far as South America, deep stores of trapped carbon, and vast deposits of oil, oil sands, natural gas and minerals.

A 2011 report published by the Gordon Foundation urges the federal government to work with jurisdictions in the basin to implement a world-class water monitoring program and support credible, independent water research.

Says Thomas S. Axworthy, President and CEO of the Toronto-based Gordon Foundation, "The starting point of good water policy is knowledge and the starting point of knowledge is to monitor on a regular basis the quality of water in the Mackenzie Basin — for the health of the North, Canada and the world."

Through bilateral and multilateral discussions, Canada's three westernmost provinces — British Columbia, Alberta and Saskatchewan — its Yukon and Northwest Territories, and the federal government are seeking to set objectives for surface and groundwater quality and quantity, emergency notification requirements, information exchange protocols and dispute resolution processes.

"Anything less than a basin-wide program with strict water quality and quantity standards, backed by binding requirements for prior notification and consultation and dispute resolution, will squander an opportunity to finally give the Mackenzie Basin a governance regime that will protect it for future generations," adds J. Owen Saunders, adjunct law professor at the University of Calgary and former Executive Director of the Canadian Institute of Resources Law.

Says renowned Canadian water scientist James Bruce: "Development activity in British Columbia and Alberta is intensifying adverse impacts of climate change. Agreements must take into account the growing regional impacts of climate change and the need for an adaptive management strategy."

The refrigerator-like cooling effect of ice and annual snow cover in the northern Mackenzie basin plays a vital role in weather and climate patterns in Canada and throughout the northern hemisphere.

Rosenberg Forum panelist Prof. John Pomeroy of the University of Saskatchewan expects climate change to impact northern basin hydrology significantly by causing more precipitation in the form of rain and a shorter snow-cover period, reducing snow's beneficial insulating effect on permafrost. Those new conditions will also increase the prevalence of ice layers, which can increase spring runoff and streamflow but restrict grazing by caribou and muskox.

Expanding shrub cover in formerly open tundra is resulting in warmer air, soils and greater streamflow generation, he notes. The loss of spring snow cover and expansion of shrubs warm the northern air and cause changes in weather patterns and climate throughout the world.

Dr. Pomeroy notes that permafrost thaw in the southern Northwest Territories is causing the large-scale collapse of black spruce forests and increasing stream flow. Coincident with increased stream flow generation in the northern Mackenzie River Basin, climate warming is decreasing stream flow generation in the basin's south tributaries, due to declining mountain snowpack and increases in evaporation.

Trapped within the permafrost, meanwhile, are ancient stores of greenhouse gases, the release of which could transform the region from a sink to a source of greenhouse gases.

When ice layers thaw, slumping of the land results in the discharge of sediments to rivers and allows perched ponds and lakes to drain. Tributary river courses and groundwater flows can shift, disrupting spawning areas. Melting permafrost can also severely damage drainage facilities, roads, buildings, and pipelines.

The Mackenzie Delta — where the river meets the Arctic Ocean — is increasingly subject to storm surges from the Beaufort Sea and salt water intrusion due to three factors: reduced nearshore ice, sea levels rising at accelerated rates, and more frequent severe winter storms.

Ecosystems in this productive area will increasingly be affected, and buildings and infrastructure in low-lying areas will be flooded more frequently.

The Rosenberg Forum will also review studies that reveal the economic significance of the Basin, including one completed by researchers Mark Anielski and Sara Wilson in 2009 that estimated the 2005 market value of economic activities in the Mackenzie watershed at $41.9 billion.

At the time, the growing industrial footprint in the region covered about 25.6 million hectares, twice the combined area of Canada's Maritime provinces of New Brunswick, Nova Scotia and Prince Edward Island.

The researchers estimated the non-market value of the watershed (ecosystem goods and services provided by nature, such as carbon storage, water filtration, water supply and 14 others) at almost $571 billion per year (2005), some 59% of which ($339 billion) was attributed to the storage and annual absorption of carbon by the basin's forests, peatlands, wetlands and tundra.
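The quoted share is easy to verify with a one-line check (figures as reported in the article):

```python
# Sanity check on the carbon-storage share quoted above.
total_nonmarket = 571e9     # ~$571 billion per year (2005)
carbon_value = 339e9        # portion attributed to carbon storage/absorption
share = carbon_value / total_nonmarket
print(f"{share:.0%}")       # 59%
```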

"If the environmental services of the basin were compromised, the loss would be very large," says Alberta-based water policy analyst Bob Sandford, who notes that climate change is occurring in the north at a rate three times that of the rest of Canada.

He and many scientific colleagues are expressing concern about persistent toxic compounds appearing in aquatic ecosystems of the Mackenzie River Basin, most derived from air pollution. Alberta's David Schindler and colleagues have shown that atmospheric emissions from oil sands developments are significant contributors to contamination of waters in the Mackenzie River Basin. These contaminants are transported by wind and deposited on snow, land and tree needles, eventually washing into rivers and lakes.

The Forum will consider recently announced Canadian government measures to monitor and mitigate oil sands-related pollution.

Objectives for the Rosenberg Forum:

To identify and summarize pertinent scientific principles and findings that should be acknowledged in processes leading to a Mackenzie River management agreement, as well as pertinent legal principles that may apply, and to address the following questions:

1. What is the state of scientific knowledge of the Mackenzie River Basin? What are the major scientific questions to be addressed to ensure that the waters and lands of the basin are managed in a way that protects their integrity? To what extent does scientific uncertainty need to be addressed and specifically acknowledged in any transboundary agreement? What does science tell us about the continental and global significance of the basin?

2. To what extent does indigenous knowledge supplement or reinforce typical western science or social science? To what extent does indigenous knowledge need to be acknowledged or incorporated in any agreement? Are there examples of transboundary agreements that rely upon indigenous knowledge?

3. Given prevailing levels of uncertainty, what should be the role of adaptive management in scoping and implementing any transboundary agreement? What are the positive and negative lessons learned from experience with adaptive management? Are there examples of transboundary agreements that rely on adaptive management?

4. Is it possible to revamp existing cooperative governance structures for the Mackenzie Basin so as to build upon rather than infringing upon the jurisdictions of the federal, provincial, territorial and indigenous governments? Are there examples where this has been successfully accomplished in a federal system? Are there examples of where it has been attempted but failed to work effectively?

5. Could an existing layer of government or a regional governmental entity be given regulatory authority related to the purely basin-level aspects of such overarching issues as climate change, cumulative environmental impacts, as well as transboundary indigenous treaties and governance agreements?

Wednesday, September 5th will be devoted to "fact finding." On that day, senior provincial and territorial officials as well as researchers working in the Mackenzie Basin have been invited to make presentations.

Thursday, September 6th and Friday, September 7th will be devoted to deliberations of the panel and the development of conclusions and recommendations.

Arctic sea ice reaches lowest extent ever recorded

The extent of Arctic sea ice reached a record low in the satellite record on Aug. 26 and is expected to continue dropping for the next several weeks, according to a University of Colorado Boulder research team. (Credit: NSIDC, University of Colorado Boulder)

The blanket of sea ice floating on the Arctic Ocean has melted to its lowest extent since satellites began measuring it in 1979, according to the University of Colorado Boulder's National Snow and Ice Data Center (NSIDC).

On Aug. 26, the Arctic sea ice extent fell to 1.58 million square miles (4.10 million square kilometers). That is 27,000 square miles (70,000 square kilometers) below the record low daily extent set Sept. 18, 2007. Since the summer Arctic sea ice minimum normally does not occur until the melt season ends in mid- to late September, the CU-Boulder research team expects the extent to continue to dwindle for the next two or three weeks, said Walt Meier, an NSIDC scientist.

"It's a little surprising to see the 2012 Arctic sea ice extent in August dip below the record low 2007 sea ice extent in September," he said. "It's likely we are going to surpass the record decline by a fair amount this year by the time all is said and done."

On Sept. 18, 2007, the September minimum extent of Arctic sea ice shattered all satellite records, reaching a five-day running average of 1.61 million square miles, or 4.17 million square kilometers. Compared to the long-term minimum average from 1979 to 2000, the 2007 minimum extent was lower by about a million square miles — an area about the same as Alaska and Texas combined, or 10 United Kingdoms.

While a large Arctic storm in early August appears to have helped to break up some of the 2012 sea ice and helped it to melt more quickly, the decline seen in recent years is well outside the range of natural climate variability, said Meier. Most scientists believe the shrinking Arctic sea ice is tied to warming temperatures caused by an increase in human-produced greenhouse gases pumped into Earth's atmosphere.

CU-Boulder researchers say the old, thick multi-year ice that used to dominate the Arctic region has been replaced by young, thin ice that has survived only one or two melt seasons — ice which now makes up about 80 percent of the ice cover. Since 1979, the September Arctic sea ice extent has declined by 12 percent per decade.

The record-breaking Arctic sea ice extent in 2012 moves the 2011 sea ice extent minimum from the second to the third lowest spot on record, behind 2007. Meier and his CU-Boulder colleagues say they believe the Arctic may be ice-free in the summers within the next several decades.

"The years from 2007 to 2012 are the six lowest years in terms of Arctic sea ice extent in the satellite record," said Meier. "In the big picture, 2012 is just another year in the sequence of declining sea ice. We have been seeing a trend toward decreasing minimum Arctic sea ice extents for the past 34 years, and there's no reason to believe this trend will change."

The Arctic sea ice extent as measured by scientists is the total area of all Arctic regions where ice covers at least 15 percent of the ocean surface, said Meier.
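A minimal sketch of the definition given above, assuming a hypothetical gridded ice-concentration field. The grid values and 25-km cell size are invented for illustration; real extent products are computed from satellite passive-microwave concentration fields.

```python
import numpy as np

# "Extent" counts the full area of every grid cell whose ice
# concentration is at least 15%. The 5x5 concentration grid and the
# 25 km x 25 km cell size below are illustrative assumptions.
CONCENTRATION_THRESHOLD = 0.15
cell_area_km2 = 25.0 * 25.0

concentration = np.array([
    [0.00, 0.05, 0.20, 0.80, 0.95],
    [0.00, 0.10, 0.40, 0.90, 1.00],
    [0.00, 0.00, 0.15, 0.60, 0.85],
    [0.00, 0.00, 0.05, 0.30, 0.70],
    [0.00, 0.00, 0.00, 0.10, 0.50],
])

# A whole cell counts once the threshold is met (unlike "area", which
# would weight each cell by its fractional ice concentration).
extent_km2 = np.sum(concentration >= CONCENTRATION_THRESHOLD) * cell_area_km2
print(extent_km2)  # 7500.0
```

The same grid scored as "area" rather than "extent" would give a smaller number, which is why the two statistics are reported separately.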

Scientists say Arctic sea ice is important because it keeps the polar region cold and helps moderate global climate — some have dubbed it "Earth's air conditioner." While the bright surface of Arctic sea ice reflects up to 80 percent of the sunlight back to space, the increasing amounts of open ocean there — which absorb about 90 percent of the sunlight striking the Arctic — have created a positive feedback effect, causing the ocean to heat up and contribute to increased sea ice melt.

Earlier this year, a national research team led by CU embarked on a two-year effort to better understand the impacts of environmental factors associated with the continuing decline of sea ice in the Arctic Ocean. The $3 million, NASA-funded project led by Research Professor James Maslanik of aerospace engineering sciences uses tools ranging from unmanned aircraft and satellites to ocean buoys to understand the characteristics of and changes in Arctic sea ice, including in the Beaufort Sea and Canada Basin, which are experiencing record warming and decreased sea ice extent.

NSIDC is part of CU-Boulder's Cooperative Institute for Research in Environmental Sciences — a joint institute of CU-Boulder and the National Oceanic and Atmospheric Administration headquartered on the CU campus — and is funded primarily by NASA. NSIDC's sea ice data come from the Special Sensor Microwave Imager/Sounder sensor on the Defense Meteorological Satellite Program F17 satellite using methods developed at NASA's Goddard Space Flight Center in Greenbelt, Md.

Advanced tornado/hurricane shelter panels from recycled materials

Panels for a new high-tech shelter created at the University of Alabama at Birmingham have passed the National Storm Shelter Association’s tornado threat test. In the NSSA test, 15-pound two-by-fours fired from a pressure cannon were unable to penetrate the panels, made of recycled materials, in a dozen attempts. The wooden missiles hit the panels at 100 mph, the speed at which projectiles typically exit a tornado funnel spinning at more than 200 mph. Such a storm would rate EF5 on the Enhanced Fujita scale and be capable of leveling well-built homes. (Credit: UAB News)

Panels for a new high-tech shelter created at the University of Alabama at Birmingham have passed the National Storm Shelter Association's tornado threat test.

In the NSSA test, 15-pound two-by-fours fired from a pressure cannon were unable to penetrate the panels, made of recycled materials, in a dozen attempts. The wooden missiles hit the panels at 100 mph, the speed at which projectiles typically exit a tornado funnel spinning at more than 200 mph. Such a storm would rate EF5 on the Enhanced Fujita scale and be capable of leveling well-built homes. Passing the tornado test means that the panels also exceed the NSSA hurricane threat standard, which fires 9-pound two-by-fours at 60 to 75 mph.

The successful test represents a first step toward commercial availability, which the team hopes to achieve by the 2013 tornado season. The final hurdle comes this fall when the assembled structure will undergo testing.

"Our effort to apply modern materials science to storm shelters started in the wake of Hurricane Katrina and grew more urgent after we saw 62 Alabama tornadoes in one day this past April," says Uday Vaidya, Ph.D., professor within the UAB Department of Materials Science & Engineering and project leader. In 2011, tornadoes caused 551 deaths nationally — including 245 in Alabama — and property damage exceeding $28 billion.

"With an average of more than 1,370 tornadoes per year for the past three years in the United States, it's time we changed the way storm shelters are built with the goal of saving more lives," Vaidya says.

In the Aug. 1 tests, the UAB panels met the NSSA standards, which are based on Federal Emergency Management Agency and International Code Council (ICC 500) requirements. Based on these early results, Vaidya and his team have lined up Sioux Manufacturing to fabricate the tabletop-size panels should the final approvals come through.

The team estimated that if merely 30 percent of the roughly 600,000 homes in the Southeast United States were to opt for a storm-shelter retrofit, it would represent a $500 million market. UAB spinoff Innovative Composite Solutions, led by Vaidya and winner of the 2009 Alabama Launchpad Competition, would oversee aspects of panel assembly in Birmingham.

No gaps in the armor

The thermoplastic and fiberglass resins and fibers used in the panels are stronger per unit density than the steel used in many current shelters and weigh 80 percent less, Vaidya says. Some of the same foams and fibers are used in the latest armored military vehicles.

The panels, connected to each other and the floor of an interior room, are designed to keep a family from being crushed or becoming airborne and to protect against flying debris. They also leave the assembly line looking like typical interior walls; they do not need paint and never will corrode.

Made from discarded liner once used to wrap offshore oil-rig pipes, the panels also embrace green engineering techniques. The recycled materials used in the experimental phase alone kept thousands of pounds of waste out of landfills.

The design team is continuing to refine the shelter roof and its armored door, which will be sheathed in the same paneling as the walls. The door also will feature a custom three-deadbolt locking system and piano hinges.

"To see panels pass our most extreme test the first time is very impressive," says Larry Tanner, P.E., manager of the NSSA/Texas Tech Debris Impact Test Facility. "This material is lightweight and sustainable and looks to have a bright future in the storm-shelter industry. If it saves even one life, it will have been worth the effort to design it."

Selvum Pillay, Ph.D., associate professor in the UAB School of Engineering and team member at ICS, says the shelter represents one of many potential applications for a new generation of materials across many fields. "Related efforts under way at UAB seek to re-engineer the pilings that failed during Hurricane Katrina to flood New Orleans, dampen sound for quieter cities and better fortify combat helmets," Pillay says.

NASA sees Hurricane Isaac affecting the Northern Gulf Coast

The MODIS instrument on NASA's Terra satellite captured a visible image of Hurricane Isaac as it approached Louisiana on Aug. 28 at 12:30 p.m. EDT. A large band of showers and thunderstorms stretched from the Carolinas, west over Georgia, Alabama, Mississippi, Florida and into Louisiana, wrapping into Isaac's center of circulation when it was centered about 100 miles south of the mouth of the Mississippi River. (Credit: NASA Goddard MODIS Rapid Response Team)

 NASA and NOAA satellites continue to provide detailed information on Hurricane Isaac as the storm bears down on the U.S. Gulf coast. NASA's TRMM and Terra satellites captured imagery, and NOAA's GOES-13 satellite provided animations of Isaac's march toward the coast today, Aug. 28.

Residents along the northern Gulf coast are bracing for the arrival of Isaac, which was upgraded to a hurricane by the National Hurricane Center as of 1:00 p.m. CDT. At that time, the center of Isaac was located about 55 miles (~85 km) south-southeast of the mouth of the Mississippi River and was moving northwest at 10 mph toward the southeastern coast of Louisiana.

After crossing the southwestern tip of Haiti during the early morning hours of the 25th of August, Isaac paralleled the northern coast of Cuba the following day and moved through the Florida Straits, with the center passing about 40 miles (~65 km) south of Key West, Florida, on the afternoon of the 26th. All the while, Isaac remained a tropical storm despite passing over warm water. As it entered the southeastern Gulf of Mexico on the afternoon of August 26th, Isaac seemed poised to intensify, with plenty of warm Gulf water ahead and relatively low wind shear. However, even as Isaac moved northwest through this favorable environment into the central Gulf of Mexico, it was slow to intensify, becoming a stronger tropical storm but not a hurricane until just before landfall. Several factors seemed to inhibit Isaac's intensification. Being a large storm, Isaac's wind field is spread over a large area, making it less responsive to changes in central pressure. Also, dry air intrusions hindered the development of an inner core. The lack of an inner core was the main reason Isaac failed to really intensify.

TRMM captured an image of Isaac on August 28 at 4:01 UTC (12:01 a.m. EDT) as it was approaching the northern Gulf coast. The TRMM image shows a broad area of moderate (shown in green) to heavy rain (shown in red) wrapping around the southwestern side of the storm, with only moderate to light rain (shown in blue) on the opposite side and no heavy rain near the center. The cloud shield (shown in white) is also well pronounced in the southwestern half of Isaac but inhibited along the northern edge. At the time of this image, Isaac was a strong tropical storm with sustained winds of 60 knots (~70 mph). Because of its large size, Isaac still poses a threat for storm surge, and its expected slower movement over Louisiana brings the risk of flooding.

An animation of NOAA's GOES-13 satellite imagery from Aug. 26-28, 2012, showing Hurricane Isaac's track through the Gulf of Mexico was created by NASA's GOES Project at the NASA Goddard Space Flight Center in Greenbelt, Md. The animation shows Isaac headed for New Orleans, exactly seven years after Hurricane Katrina.

The Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra satellite captured a visible image of Hurricane Isaac as it approached Louisiana on Aug. 28 at 12:30 p.m. EDT. A large band of showers and thunderstorms stretched from the Carolinas, west over Georgia, Alabama, Mississippi, Florida and into Louisiana, wrapping into Isaac's center of circulation when it was centered about 100 miles south of the mouth of the Mississippi River.

On Aug. 28 at 2 p.m. EDT, Hurricane Isaac's maximum sustained winds were near 75 mph (120 kmh), making it a Category 1 hurricane on the Saffir-Simpson scale. It was centered about 55 miles (85 km) south-southeast of the mouth of the Mississippi River, near latitude 28.4 north and longitude 88.7 west, and moving toward the northwest near 10 mph (17 kmh). The National Hurricane Center expects Hurricane Isaac to reach the coastline of southeastern Louisiana as early as this evening.

At 2 p.m. EDT, the National Hurricane Center noted that tropical-storm-force winds were occurring at the mouth of the Mississippi River. That's where a NOAA observing site at Southwest Pass, Louisiana, reported sustained winds of 60 mph (93 kmh) and a gust to 76 mph (122 kmh) at an elevation of 80 feet. For full warnings, watches and locations, visit the National Hurricane Center's website at: www.nhc.noaa.gov. For storm history and NASA satellite images and animations, go to: NASA's Hurricane page (http://www.nasa.gov/hurricane).

Peek-a-blue Moon

The second full Moon of the month – known as a ‘blue’ Moon – just before it disappeared from the MSG-3 satellite’s sight behind the southern hemisphere. The image was captured by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) instrument at 11:20 GMT on 31 August 2012. (Credit: Eumetsat)

Europe's latest weather satellite got a glimpse of the Moon before our celestial neighbour disappeared from view behind Earth on Friday. Since its launch two months ago, MSG-3 has been working well and is on its way to entering service.

The image shows the second full Moon of the month — known as a 'blue' Moon — just before it disappeared from the MSG-3 satellite's sight behind the southern hemisphere.

Brazil's eastern coast along the South Atlantic Ocean is also visible, with clouds forming over the water.

The image was captured by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) instrument at 11:20 GMT.

The imager scans Earth's surface and atmosphere every 15 minutes in 12 different wavelengths to track cloud development and measure temperatures.

Launched on 5 July, the third Meteosat Second Generation satellite is in a six-month commissioning phase by Eumetsat, the European Organisation for Exploitation of Meteorological Satellites.

This includes checking that the imaging service works fully and delivers high-quality products for weather forecasting.

ESA developed the satellite in close cooperation with Eumetsat, and was responsible for initial operations after launch. It was then handed over to Eumetsat on 16 July.

The first satellite in the series, MSG-1 — also known as Meteosat-8 — was launched in 2002. MSG-2 followed three years later. Both have continued the legacy of the operational meteorological satellites that started with Meteosat-1 in 1977.

The MSGs offer more spectral channels and are sensing Earth more frequently and at a higher resolution than their predecessors.