Ozone hole closing for the year, but full recovery is decades away

These clouds – formed high in the Antarctic atmosphere during spring – provide a place where ozone-destroying chemicals can form. Image: sandwich/Flickr, CC BY-NC-ND

By Paul Krummel, CSIRO and Paul Fraser, CSIRO

Imagine an environmental crisis caused by a colourless, odourless gas, in minute concentrations, building up in the atmosphere. There is no expert consensus, but in the face of considerable uncertainty and strong resistance to the science, global regulation of these emissions succeeds.

Subsequently, the science is established and the damage, though already apparent, begins to be mitigated.

Trends in the size of the ozone hole (top), amount of ozone (middle) and ozone deficit (bottom) show the ozone hole may be recovering. The green lines show the amount of chlorine in the stratosphere relative to the other measurements.

No, this is not fantasy. It’s history. We’re talking about the ozone hole.

In two or three weeks the Antarctic’s seasonal ozone hole will close for the year. The ozone hole has formed every spring since the 1970s. This year’s is among the smaller ones over the past 20 years — since ozone-depleting substances began declining.

The United Nations Environment Program and World Meteorological Organisation’s Scientific Assessment of Ozone Depletion: 2014 states that global total column ozone has shown a small increase in recent years.

However, it may take another few years before we can definitively say the Antarctic ozone hole has recovered, and several decades until full recovery to pre-1980 conditions.

Radical discovery

Atmospheric ozone isn’t a single layer at a certain altitude above the Earth’s surface; it’s dispersed — there is even a significant amount of ozone at the Earth’s surface.

Even the stratospheric ozone known as “the ozone layer” is not a single layer of pure ozone, but a region where ozone is more abundant than it is at other altitudes. Satellite sensors and other ozone-measuring devices monitor the total ozone concentration for an entire column of the atmosphere, and whether there is more or less than normal.

Throughout the 1970s, scientists began to observe two separate but related phenomena: the total amount of ozone in the stratosphere — the region 10 to 50 kilometres above the earth’s surface — was declining steadily at about 4% every ten years. And in spring there was a much larger decrease in stratospheric ozone over the polar regions.

By the mid-1980s, they reached the conclusion that the cause was a chemical reaction between ozone and halogen (chlorine and bromine). This halogen came from man-made substances: chlorine/bromine-containing refrigerants, solvents, propellants and foam-blowing agents (chlorofluorocarbons or CFCs, halons and hydrochlorofluorocarbons or HCFCs).

When exposed to UV light and in the presence of polar stratospheric clouds, these molecules break down, releasing radical chemicals that destroy ozone molecules at an alarming rate.
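To see why the rate is so alarming, it helps to write out the chlorine chemistry. This is the standard textbook catalytic cycle, not something spelled out in the original article:

Cl + O₃ → ClO + O₂
ClO + O → Cl + O₂
(net: O₃ + O → 2O₂)

Because the chlorine atom comes out of the cycle unchanged, a single atom can go on to destroy many thousands of ozone molecules before it is finally removed from the stratosphere.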

CFCs, one of the most prominent culprits, were first synthesised in the 1890s, but it wasn’t until the 1950s that they began to be widely used as refrigerants.

The most unfortunate scientist

Thomas Midgley, the chemical engineer who improved their synthesis and demonstrated their potential uses, was probably the most unfortunate scientist ever to rise to an influential position.

In 1921, he discovered that adding tetra-ethyl lead to fuel improved the efficiency of internal combustion engines. Unfortunately, this discovery was commercialised. Lead persists in the atmosphere today. It also accumulates in animals, sometimes to toxic levels, particularly those at the top of food chains.

Subsequently, Midgley set himself to solving the problem of the refrigerants in use in the earlier part of the 20th century. These were uniformly dangerous — either flammable, explosive or toxic. CFCs weren’t, and were soon widely adopted, not only as refrigerants but also later as propellants and blowing agents.

The best thing about CFCs – their low reactivity – is also the worst. Because they’re so unreactive, they’re very long-lived (often in excess of 100 years). This gives them time to get into the stratosphere. One of the components of CFCs is chlorine. Very little chlorine exists naturally in the stratosphere, but CFCs are a very effective way of introducing significant amounts of chlorine into the ozone layer.

Midgley’s efforts to do good had dire unintended consequences: he’s been described as having had more impact on the atmosphere than any other single organism in Earth’s history.

Recognising the threat

By the late 1960s, scientists had detected growth in the level of CFCs in the atmosphere. In 1974, researchers published the first paper predicting that the increase in CFCs would cause significant ozone loss.

The ozone hole hypothesis was strongly disputed by some industry representatives.

Nonetheless, the reality of a possible depleted ozone layer and the threat to human health it implied so alarmed the international community that by 1985 the Vienna Convention for the Protection of the Ozone Layer was agreed on, even before the significant ozone depletion was detected. The convention came into force in 1988 and was ratified over subsequent decades by 197 nations, making it one of the most successful treaties of all time.

The following year (1989), the Montreal Protocol on Substances that Deplete the Ozone Layer (which falls under the Vienna Convention) also came into force. This treaty was designed to enact the spirit of the Vienna Convention – i.e. to protect the ozone layer – and achieved it by phasing out the production and consumption of numerous substances that are responsible for ozone depletion.

Long time to recovery

Repairing the ozone hole is a long-term process. CSIRO has been monitoring the hole over Antarctica since the late 1970s. The ozone hole first appeared in spring over Antarctica and subsequently over the Arctic, as the ozone-destroying chemical processes require very cold conditions and the onset of sunlight (following the polar winter).

In Antarctica, the hole lasts for two to three months before breaking up and mixing with ozone-richer air from mid-latitudes. It’s not constant in size – it waxes and wanes – although it is consistently very large. The record so far is 29.5 million square kilometres, set in 2006.

For comparison, the land mass of Australia (including Tasmania) is 7.7 million square kilometres.

Although some reports claim that the ozone layer over Antarctica is recovering, it’s too early to make a definitive call. Measurements at surface monitoring stations show that the amount of ozone-destroying chemicals at the surface has been dropping since about 1994-1995. The amount is now about 10-15% down on that peak.

The stratosphere lags behind the surface, and the effects of this will take some time to play out. However, satellite measurements show that the decline in ozone amount in the stratosphere has stopped, and perhaps begun recovery.

The size and depth of the ozone hole each year shows quite large variability due to different meteorological conditions, in particular stratospheric temperatures.

How the 2014 hole measures up

Overall, out of the 35 years of satellite data analysed, the 2014 ozone hole is one of the smaller ones since the late 1980s. It ranks as the 18th largest in daily area; 16th largest for daily ozone deficit; and 21st lowest for minimum ozone.

The 2014 ozone hole appeared in the first week of August and grew rapidly in size from mid-August through to the second week of September, reaching 23.5 million square kilometres on September 15.

The 2014 hole is among the smallest since the mid-1990s.

During the third and fourth weeks of September the ozone hole area decreased to 17.5 million square kilometres. Then, in a final flurry, the daily ozone hole area grew sharply again during the last days of September to peak at 23.9 million square kilometres on October 1.

This is the peak daily ozone hole area for 2014, larger than in 2010, 2012 and 2013, about the same as 2009, and smaller than in 2011.

The ozone hole is now in the recovery phase, and had shrunk to about 9 million square kilometres by November 14. It is expected to close for the year within two to three weeks.

You can find the full details on the 2014 ozone hole here.

The Conversation

Paul Krummel receives funding from MIT, NASA, Australian Bureau of Meteorology, Department of the Environment, & Refrigerant Reclaim Australia.

Paul Fraser receives funding from MIT, NASA, Australian Bureau of Meteorology, Department of the Environment, & Refrigerant Reclaim Australia.

This article was originally published on The Conversation.
Read the original article.


Along the Murray-Darling in a ‘bot

Launch of the Museum Robot, Landmarks Gallery, Acton

It’s a big place, the Murray-Darling Basin. Over a million km² – about one-seventh of the whole of Australia. There’s a lot to know about it, and we’re helping students find out more for themselves, using a novel CSIRO innovation.

The National Museum of Australia and the Murray-Darling Basin Authority have teamed up to let students learn about this vast area, taking them on an interactive, customised tour of the Museum’s Murray-Darling Basin exhibits. But the really cool part? The students never have to leave their classrooms.

Using our Telepresence robot technology, museum staff are able to broadcast real-time images, video and audio back to students in their classrooms. Students can learn about how the Basin’s water movement and volume have varied over the past 300 000 years, and the importance of water quality and its role in determining where human settlements develop and whether they survive and prosper.

This is a new departure for the robots. In the past, they’ve mainly been used to give a taste of the museum to people in remote areas who can’t easily travel there. Now they’re letting students get an understanding of the broader Murray-Darling picture.

It works this way. The museum robot (accompanied by education staff) takes the remote visitors on a virtual tour of the museum.

The robot has a high speed broadband connection, so remote visitors can interact with a human educator in the museum. The human educator leads the robot, while the remote visitors use a panoramic camera to look around and explore.

In an ultimate case of ‘look but don’t touch’, students can see and interact with information about each of the objects on display.

The best thing is that it’s a conversation, not a monologue with pictures. The museum educator can engage and challenge the students by posing multiple-choice questions, polling and viewing the students’ responses in real-time.

We’re doing a lot of work on digital immersive learning. Apart from the Telepresence robots, we’re working with science education experts to develop learning environments that mirror real-life places. These 3D models of real places will be created using our award-winning laser mapping technology Zebedee and panoramic video to create the immersive environment. We’ve already taken students through Jenolan Caves from the safety of their own classrooms.

Almost makes you wish you were back at school again …


Latest findings about the faces of the Great Barrier Reef

Family swinging a child while walking through shallow water on the Great Barrier Reef

The Great Barrier Reef is important for many people’s livelihoods, plays a central role in our culture and is highly valued for its influence on personal wellbeing, according to results from the Social and Economic Long Term Monitoring Program (Image: Matt Curnock).

The Great Barrier Reef is many different things to different people: a tropical playground, a natural wonder, and the domain of Nemo, Dory and Bruce. But it’s also home to around a million Australians, a booming tourism industry, major exporting ports and countless fishing, farming and agriculture operations.

Indisputably the Reef has immense ecological importance as a unique site of marine biodiversity and is viewed by many as the technicolour jewel in the crown of Australia’s UNESCO World Heritage-listed sites. The Reef also brings in $5.7 billion annually and generates over 64,000 full time jobs.

These impressive figures however only hint at the full value of the Great Barrier Reef to Australia and Australians.

The future of the Reef is an international issue: just this weekend we saw the US President, Barack Obama, talking about his desire for his grandchildren’s generation to still be able to enjoy the Reef in the way that we’re able to today. Knowing how to plan and manage the best outcomes for it is of the utmost importance.

To derive a clearer picture, we set out to measure the human dimensions of the Great Barrier Reef through the Social and Economic Long Term Monitoring Program (SELTMP).

In the largest study of its kind, we surveyed over 6,000 tourists and coastal residents, as well as over 2,000 Australians outside the region, to understand and measure use and dependency, personal wellbeing and the cultural connectedness of people to the Great Barrier Reef.

It is no surprise we found that we love the Reef. In a snapshot of our results, we found that:

  • a quarter of residents depend on the Reef for their household income
  • many coastal residents feel it is part of their identity and provides a lifestyle they highly value
  • appreciation of its aesthetic values is shared by all visitors and residents
  • threats to the ongoing viability of the Reef are a major concern to all user groups, and
  • most coastal residents support regulations to protect the Reef.

The SELTMP was set up to assist reef managers and other decision-makers within the region to incorporate the human dimension into planning and management. Our work is providing information to help government, industry, researchers and communities understand trends, interconnections, conflicts, dependencies and vulnerabilities of user groups and so develop insights into the complex social and economic conditions of the Great Barrier Reef.

These results represent the current status and condition of the human dimension of the Great Barrier Reef, which we will continue to monitor in the future.

The findings of the SELTMP 2013 baseline study were presented at the World Parks Congress in Sydney today. More information on the research can be found at eatlas.org.au/seltmp.

Media enquiries: contact Keirissa Lawson, 02 4960 6286, Keirissa.Lawson@csiro.au.


Usability – it has its own day, but what is it?

Today is World Usability Day (WUD). WUD celebrates the technologies, products and services that improve our lives by doing what they’re designed to do in a way that engages and assists us. And more importantly, it’s a day for encouraging creators, designers and manufacturers to put usability at the forefront when they’re making products.

We’re pretty proud of some of our usable technology – like our smartphone apps. We’re leveraging a technology that’s well on the way from being popular to being ubiquitous, and creating applications and services that can make a big difference to a wide range of people.

Take people who’ve had heart attacks, for instance. Nowadays, a lot more people survive heart attacks than in the past, but post-heart attack rehab remains a problem. It used to involve travelling to an outpatient clinic or similar centre, and there was a considerable dropout rate from the program. This is a problem, because patients who successfully complete cardiac rehab following a heart attack have much better health outcomes.

They are less likely to have another cardiac event, be readmitted to hospital or die from their condition. So we developed a smartphone home care delivery model – known as the Care Assessment Platform. A clinical trial found that people were almost 30 per cent more likely to take part in their rehab program at home using the app than those who had to travel to a clinic.

What’s more, people using the app were 40 per cent more likely to stick to the program and almost 70 per cent more likely to see it through to completion. That’s REAL usability.

Of course, the best treatment for heart attacks is not having one in the first place. As we all know, weight is a factor in heart disease. And certainly, keeping your weight down is a very, very good thing to do after a heart attack. We’re hoping we can help there too.

We’re currently working with Bupa Health Foundation on a trial of smartphone apps to assist with dieters’ mood and motivation. Face-to-face support is often the best way to succeed on a diet, but this is not always possible, and it can get expensive.

But a weight loss app can give advice at a single swipe, providing regular daily reminders and support.

So you’ve survived a heart attack and done the rehab using an app. And you’ve lost weight. That means you’ve got more chance of living to be old. We’ve been working on apps to help with that, too.

Our Smarter, Safer Homes project is looking at ways to keep older people living safely in their own homes for longer. This not only takes pressure off the aged care home sector, but also improves older people’s health and wellbeing.

Our system involves placing simple sensors, such as motion detectors and energy sensors, around the home. These monitor the person as they go about their day and report the data back to family members or carers.

For example, motion sensors can detect whether a person got up at the usual time, put the kettle on, regularly cooked food for themselves, and even if they left the oven on.

The data is also reported to a tablet device owned by the elderly person, who retains full control over what data gets reported to others and what stays private.
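As a sketch of the kind of rule such a system might run – the event format, sensor names and thresholds here are invented for illustration, not the actual Smarter, Safer Homes data model:

```python
from datetime import time

# Hypothetical sensor events for one morning (invented data).
events = [
    {"sensor": "bedroom_motion", "time": time(7, 10)},
    {"sensor": "kettle_power", "time": time(7, 25)},
]

def up_by(events, sensor, deadline):
    """True if the named sensor fired at or before the deadline."""
    return any(e["sensor"] == sensor and e["time"] <= deadline for e in events)

# Did the person get up, and put the kettle on, by their usual times?
print(up_by(events, "bedroom_motion", time(8, 0)))  # -> True
print(up_by(events, "kettle_power", time(7, 0)))    # -> False
```

A real deployment would of course learn each person’s routine rather than hard-code deadlines, but the shape of the check is the same.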

SoilMapp screen

Not all our work on apps is in human health. There’s one for soil health too. SoilMapp is designed to make soil information more accessible for Australian farmers, consultants, planners, natural resource managers, researchers and people interested in soil. It provides direct access to the best national soil data and information from several sources.

With SoilMapp, users can find information on soil depth, acidity, salinity, soil carbon, soil water holding capacity and other attributes in a matter of minutes, anywhere there’s a wireless or internet connection.

We’ve also counted koalas using an app, and we’re looking at doing many more things with this technology.

Even the first version of the iPhone had more computing power than all of NASA had for the Apollo 11 mission, so there’s plenty of opportunity to make use of the potential of smartphones. That very usable thing in your pocket just keeps on getting more so.


Making molecules crystal clear

A crystal image of a breakdown pathway of atrazine

The International Year of Crystallography is drawing to a close, and we’re not going to let it finish without showing you something about what crystallographers do. Which is not what most people would assume when they hear the word: there are crystals involved, but it’s not exactly the study of crystals as we generally think of them. It’s the study of how matter is organised, using crystals as a tool.

Now, naturally we want to know how matter is arranged. Apart from being very, very interesting to find out about, it also helps in many different fields, from drug delivery to materials science. In fact, it was crystallography that provided – controversially – the key to understanding the structure of DNA.

So assume you want to look at something in the greatest possible detail, seeing its smallest possible components. Obviously, you’d use a microscope. But there’s a limit to the smallness of things you can see that way: the wavelength of the light human eyes see. Visible light has a wavelength of between roughly 400 and 700 nanometres, and can’t resolve atoms, which are separated by only about 0.1 nanometres. That spacing, however, is a perfect match for the wavelength of X-rays.
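A rough back-of-the-envelope check of those scales, using just the figures quoted above:

```python
# Rough scale comparison using the figures quoted above:
# visible light ~400-700 nm, atoms spaced ~0.1 nm, X-rays ~0.1 nm.
visible_nm = (400, 700)      # wavelength range of visible light
atomic_spacing_nm = 0.1      # typical spacing between atoms in a crystal
xray_nm = 0.1                # X-rays have wavelengths around this scale

# A wave can only resolve features comparable to its own wavelength,
# so even the shortest visible wavelength misses atoms by a factor of ~4000.
print(round(min(visible_nm) / atomic_spacing_nm))  # -> 4000
print(xray_nm / atomic_spacing_nm)                 # -> 1.0
```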

We can’t make appropriate X-ray lenses to build X-ray microscopes to study molecules: we have to do it in a roundabout way. We beam X-rays onto crystals, scattering the rays, in just the same way that light reflects when it hits an object. Then we use a computer to reassemble the rays – the diffraction pattern – into an image. The diffraction of a single molecule would be so weak that we couldn’t get any meaningful information from it, so we use crystals, which have many molecules in an ordered array, to amplify the signal so we can see it. Crystals are highly ordered structures, made up of 10¹² or more molecules, and this ordering is what makes the X-ray diffraction patterns – the main tool of crystallography – possible to analyse.

Crystallographers were among the first scientists to use computers, and used them to do the advanced calculations needed to reassemble diffraction patterns into coherent images. That’s why it seemed fitting to name our supercomputer after the founders of crystallography – William Henry Bragg and his son William Lawrence Bragg. Lawrence was the first person to solve a molecular structure using X-ray diffraction.

Today we can not only view molecules in 3D, but also study the way they operate. Improvements in X-ray sources have also led to synchrotron facilities, which can produce far more intense and precise beams.

And speaking of synchrotrons …

One of our crystallographers, Tom Peat, has deposited more than 120 structures in the Protein Data Bank using data collected at the Australian Synchrotron. They were all derived from crystals developed in CSIRO’s Collaborative Crystallisation Centre.

This is one of our favourite structures.

AtzF structure (the Australia crystal)

It’s the structure of AtzF. This enzyme forms part of the breakdown pathway for atrazine, a commonly used herbicide. We’re trying to understand enzymes better and use them for bioremediation – cleaning up environmental detritus such as pesticides and herbicides – and we’ve now solved the structures of four of the six enzymes involved in the atrazine breakdown pathway. We also look at protein engineering, to see if we can make these enzymes even more effective at cleaning up the environment.

Before we get to the crystal image, there are other steps on the way. First, someone has to grow the crystals (clone the protein, express it, purify it and crystallise it). Then it’s off to the Synchrotron to get a data set (many diffraction images in sequence). Here you can see an actual protein crystal.

An actual protein crystal

The picture on the right is the diffraction image.

Diffraction image

The crystallographers measure the intensity of the reflections (the dark dots). They combine that with the geometry and use some complicated maths (a Fourier Transform) to produce an electron density map. They then use that map to build a model.
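As a toy illustration of that last step – a one-dimensional numpy sketch, nothing like real crystallographic software – the experiment records only intensities, and once phases are supplied, an inverse Fourier transform turns the data back into a density map:

```python
import numpy as np

# Toy 1-D "electron density": two Gaussian peaks standing in for atoms.
x = np.linspace(0, 1, 256, endpoint=False)
density = np.exp(-((x - 0.3) ** 2) / 0.001) + np.exp(-((x - 0.7) ** 2) / 0.001)

# The experiment effectively measures the magnitudes of the Fourier
# transform (the spot intensities); the phases must be worked out separately.
F = np.fft.fft(density)
amplitudes, phases = np.abs(F), np.angle(F)

# Given both amplitudes and phases, the inverse transform rebuilds the map.
recovered = np.fft.ifft(amplitudes * np.exp(1j * phases)).real
print(np.allclose(recovered, density))  # -> True
```

In real crystallography recovering the phases is the hard part (the famous “phase problem”); this sketch simply assumes they are known.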

Not all our crystallography work is in the same area. We also work on some pharmaceutical applications. One of our projects, with hugely important implications for human health, is on the design of desperately needed new antibiotics. We’ve been collaborating with Monash University, looking at the pathway that sulpha drugs (such as sulfamethoxazole) – the ones we used to treat bacterial infections prior to the discovery of penicillin – take to treat Golden Staph infections in humans. The aim is to design new antibiotics that target the same pathway. You can read a paper that describes our recent findings in the Journal of Medicinal Chemistry, and here’s a picture of what we’ve been doing.

Golden Staph treatment pathway

We think this deserves its own Year. And we hope it’s clear just how important it is. Crystal clear.


The science of living with fire

Bushfire in tropical Australia

The bushfire season isn’t wasting any time this year. There are already large fires in South Australia and NSW, and the obligatory ‘the state is a tinderbox’ warnings came out months ago. There were bushfires in July in NSW, and the number of local council areas in southern Australia declaring the fire season open in August – it starts in July up north – has more than doubled.

So a program on SBS on bushfires is timely. Inside the Inferno screens at 8.30pm on 5 November and 12 November. It’s mainly about the everyday heroes who volunteer to fight bushfires. But there’s a fair bit about the science of bushfires, and that’s where we come in. We contributed to the program, but much more importantly, we contribute to helping keep people safe from fires.

You might be surprised at how much of our work has a connection to fires. It’s not all we work on, obviously, but many areas of science go into prediction and management of fire.

Let’s start at prediction. Wenju Cai and his team at the Oceans and Atmosphere Flagship have been working on a better understanding of the El Niño Southern Oscillation and its lesser-known counterpart, the Indian Ocean Dipole (IOD). Their groundbreaking work has established that the IOD preconditions south-eastern Australia for major bushfires, and has enabled us to stretch out the prediction range for severe fire activity to between four and six months. What you can predict, you can plan for.

And you need to know exactly what you’re planning for. That’s what the Pyrotron helps us do. It’s a 25 metre long fire-proof wind tunnel, with a working section for conducting experiments and a glass observation area.

It’s used to study – safely, under controlled conditions – how fires ignite in bushfire fuel and how they spread. Obviously, this is necessary work that can’t be done in the field under wildfire conditions. Using the Pyrotron we can study the mechanisms of bushfires’ spread, their thermokinetics – the chemistry of combustion – and fuel consumption, emissions and residues under different burning conditions.

But we won’t ever be able to prevent fires breaking out. We can plan, we can study, but we can’t change the nature of Australia. We can’t stop hot, dry days or lightning strikes. What we can do is find the safest way to live in our combustible climate.

Fire is one of many influences that define our living space. The challenge is to find acceptable ways of living with bushfires while retaining the ability to choose where and how we live. We also need to dispel some of the myths about bushfires that have put people and property in greater danger than was necessary. And we need to understand the risks from bushfires inherent in different types of construction. We need to know what’s safest and strongest, and how to build it.

We’ve surveyed every bushfire involving significant house loss since the 1983 Ash Wednesday fires, and we’ve tested a variety of construction methods to find optimal building types for fire-prone areas. How do we test them? The obvious – only – way. We set fire to them.

It’s not just how resistant they are to collapsing into a pile of ashes that’s important, although obviously that’s a major consideration. We also need to know what kind of house would best enable people to shelter inside while actively defending it from bushfire attack. To put this all together, we use our expertise in:

  • assessment of bushfire risk at the urban interface
  • integrated urban design solutions including whole-of-life energy and water use, biodiversity, landscapes, cultural value systems, lifestyle expectations, and risk from other sources
  • analysis of major bushfire events
  • development of fire spread prediction models and tools
  • evaluation of fire suppressants and applications
  • post-incident analysis of bushfire impact on houses and people
  • fire characteristics at the urban interface
  • community education
  • performance of materials during bushfire exposure
  • characterisation of materials or systems performance in bushfires
  • product development, verification and enhancement for use in bushfire-prone areas (specialist coatings, glazing protection, timber deck design)
  • fire fighter vehicle burn-over protection systems.

It’s a lot, but we don’t stop there. We also work on disaster management tools for fires. We developed the Emergency Response Intelligence Capability (ERIC) in collaboration with the Australian Government Department of Human Services Emergency Management team.

This uses information from a range of sources and includes:

  • region data from the Australian Bureau of Statistics
  • context data including demographics and details of the natural and built environment
  • ‘live’ data feeds describing the emergency event as it progresses and the historical record of previous ‘live’ data feeds
  • an archive of previous situation reports

This information can be focused for a specific region under investigation and collated semi-automatically to generate situation reports. The situation reports include information synthesised from available datasets and augmented by user-provided content. They describe what the event is, where it is located, and the impact on the local community and on the department.
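A minimal sketch of that collation step, with all field names and data invented for illustration (they are not ERIC’s actual data model):

```python
# Hypothetical inputs standing in for the data sources listed above.
region = {"name": "Example Shire", "population": 12400}       # ABS-style region data
live_feed = {"event": "bushfire", "status": "going", "front_km": 14}  # 'live' feed
user_notes = "Evacuation centre open at the showgrounds."     # user-provided content

def situation_report(region, live_feed, user_notes):
    """Merge the datasets and user content into one report."""
    return {
        "what": live_feed["event"],
        "where": region["name"],
        "impact": f'{live_feed["front_km"]} km fire front near '
                  f'{region["population"]} residents',
        "notes": user_notes,
    }

report = situation_report(region, live_feed, user_notes)
print(report["what"], "-", report["where"])  # -> bushfire - Example Shire
```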

These days, social media is one of the most important channels when a disaster is unfolding, so we’re working on that too. Every minute, vast amounts of information are communicated via Twitter. Our challenge is to make relevant information accessible to emergency services.

Without suitable tools this information can’t be used. A huge amount of detail about the 2009 Victorian bushfires was reported in real-time on social network sites. The trouble was that state and federal disaster response agencies couldn’t see it.

We’ve created Emergency Situation Awareness (ESA) software to detect unusual behaviour in the Twittersphere and alert users in the emergency services if a disaster is unfolding online.
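A toy version of that kind of anomaly detection – a simple z-score on keyword counts; ESA’s actual methods are more sophisticated, and the numbers here are invented:

```python
from statistics import mean, stdev

# Tweets per time window mentioning "fire" (invented history), then a spike.
history = [3, 5, 4, 6, 5, 4]
latest = 42

# Flag the latest window if it sits far above the recent baseline.
baseline, spread = mean(history), stdev(history)
z = (latest - baseline) / spread
print(z > 3)  # -> True: an alert-worthy spike
```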

So that’s our contribution to helping keep people safe from bushfires, and, if worst comes to worst, in them. We hope it helps the firies – they deserve all the help they can get.


Shale gas, coal seam gas… what’s the difference?

By Tsuey Cham

A few weeks ago we took a look at coal seam gas (CSG) and the hydraulic fracturing (‘fraccing’) process used in its extraction. You may have also heard of shale gas, another type of natural gas found deep underground.

So what exactly makes them different?

In terms of their gas content they’re really quite similar, with both made up predominantly of methane – the type of gas used in homes for cooking and heating.

However, when it comes to extraction and production CSG and shale gas can be quite different. For example, CSG can be found up to about 1000 metres underground, whereas shale gas is found much deeper, usually 1500 to 4000 metres below the surface.

In Australia, hydraulic fracturing – a technique that increases the rate of gas flow for extraction – is used in CSG production 20-40% of the time, whereas in shale gas production it’s used every time.

Another interesting difference is that the process used to extract CSG produces more water than it uses – so there are large quantities of water produced as a by-product. Conversely, for shale gas, the extraction process uses more water than it produces.

Watch our latest short animation to find out more about shale gas, how it’s extracted and some of the potential environmental challenges involved in its production:

If you missed the animation on CSG extraction, watch it here.

You can also find out more from our fact sheets on CSG, shale gas and hydraulic fracturing in coal seams.

