Ozone hole closing for the year, but full recovery is decades away

These clouds – formed high in the Antarctic atmosphere during spring – provide a place where ozone-destroying chemicals can form. Image: sandwich/Flickr, CC BY-NC-ND

By Paul Krummel, CSIRO and Paul Fraser, CSIRO

Imagine an environmental crisis caused by a colourless, odourless gas, in minute concentrations, building up in the atmosphere. There is no expert consensus, but in the face of considerable uncertainty and strong resistance to the science, global regulation of these emissions succeeds.

Subsequently, the science is established and the damage, though already apparent, begins to be mitigated.

Trends in the size of the ozone hole (top), amount of ozone (middle) and ozone deficit (bottom) show the ozone hole may be recovering. The green lines show the amount of chlorine in the stratosphere relative to the other measurements.

No, this is not fantasy. It’s history. We’re talking about the ozone hole.

In two or three weeks the Antarctic’s seasonal ozone hole will close for the year. The ozone hole has formed every spring since the 1970s. This year’s is among the smaller ones over the past 20 years — since ozone-depleting substances began declining.

The United Nations Environment Program and World Meteorological Organisation’s Scientific Assessment of Ozone Depletion: 2014 states that global total column ozone has shown a small increase in recent years.

However, it may take another few years before we can definitively say the Antarctic ozone hole has recovered, and several decades until full recovery to pre-1980 conditions.

Radical discovery

Atmospheric ozone isn’t a single layer at a certain altitude above the Earth’s surface; it’s dispersed — there is even a significant amount of ozone at the Earth’s surface.

Even the stratospheric ozone known as “the ozone layer” is not a single layer of pure ozone, but a region where ozone is more abundant than it is at other altitudes. Satellite sensors and other ozone-measuring devices monitor the total ozone concentration for an entire column of the atmosphere, and whether there is more or less than normal.

Throughout the 1970s, scientists began to observe two separate but related phenomena: the total amount of ozone in the stratosphere — the region 10 to 50 kilometres above the Earth’s surface — was declining steadily at about 4% every ten years. And in spring there was a much larger decrease in stratospheric ozone over the polar regions.

By the mid-1980s, they reached the conclusion that the cause was a chemical reaction between ozone and halogen (chlorine and bromine). This halogen came from man-made substances: chlorine/bromine-containing refrigerants, solvents, propellants and foam-blowing agents (chlorofluorocarbons or CFCs, halons and hydrochlorofluorocarbons or HCFCs).

When exposed to UV light and in the presence of polar stratospheric clouds, these molecules break down, releasing radicals that destroy ozone molecules at an alarming rate.

CFCs, one of the most prominent culprits, were first synthesised in the 1890s, but it wasn’t until the 1950s that they began to be widely used as refrigerants.

The most unfortunate scientist

Thomas Midgley, the chemical engineer who improved their synthesis and demonstrated their potential uses, was probably the most unfortunate scientist ever to rise to an influential position.

In 1921, he discovered that adding tetra-ethyl lead to fuel improved the efficiency of internal combustion engines. Unfortunately, this discovery was commercialised. Lead persists in the atmosphere today. It also accumulates in animals, sometimes to toxic levels, particularly those at the top of food chains.

Subsequently, Midgley set himself to solving the problem of the refrigerants in use in the earlier part of the 20th century. These were uniformly dangerous — either flammable, explosive or toxic. CFCs weren’t, and were soon widely adopted, not only as refrigerants but also later as propellants and blowing agents.

The best thing about CFCs – their low reactivity – is also the worst. Because they’re so unreactive, they’re very long-lived (often in excess of 100 years). This gives them time to get into the stratosphere. One of the components of CFCs is chlorine. Very little chlorine exists naturally in the stratosphere, but CFCs are a very effective way of introducing significant amounts of chlorine into the ozone layer.

Midgley’s efforts to do good had dire unintended consequences: he’s been described as having had more impact on the atmosphere than any other single organism in Earth’s history.

Recognising the threat

By the late 1960s, scientists had detected growth in the level of CFCs in the atmosphere. In 1974, researchers published the first paper predicting that the increase in CFCs would cause significant ozone loss.

The ozone hole hypothesis was strongly disputed by some industry representatives.

Nonetheless, the possibility of a depleted ozone layer and the threat to human health it implied so alarmed the international community that by 1985 the Vienna Convention for the Protection of the Ozone Layer had been agreed, even before significant ozone depletion was detected. The convention came into force in 1988 and has since been ratified by 197 nations, making it one of the most successful treaties of all time.

The following year (1989), the Montreal Protocol on Substances that Deplete the Ozone Layer (which falls under the Vienna Convention) also came into force. This treaty was designed to enact the spirit of the Vienna Convention – i.e. to protect the ozone layer – and achieved it by phasing out the production and consumption of numerous substances that are responsible for ozone depletion.

Long time to recovery

Repairing the ozone hole is a long-term process. CSIRO has been monitoring the hole over Antarctica since the late 1970s. The ozone hole first appeared in spring over Antarctica and subsequently over the Arctic, as the ozone-destroying chemical processes require very cold conditions and the onset of sunlight (following the polar winter).

In Antarctica, the hole lasts for two to three months before breaking up and mixing with ozone-richer air from mid-latitudes. Its size waxes and wanes from year to year, though it is consistently very large. The record so far is 29.5 million square kilometres, set in 2006.

For comparison, the land mass of Australia (including Tasmania) is 7.7 million square kilometres.
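As a back-of-the-envelope check, the comparison between those two figures can be written as a couple of lines of Python (illustrative arithmetic only, using the numbers quoted above):

```python
# Comparing the record 2006 ozone hole with the land area of Australia,
# using the figures quoted in the text.
record_hole_km2 = 29.5e6  # record daily ozone hole area, set in 2006
australia_km2 = 7.7e6     # land mass of Australia, including Tasmania

ratio = record_hole_km2 / australia_km2
print(f"The record hole covered roughly {ratio:.1f} times the area of Australia")
```

In other words, at its record extent the hole covered nearly four Australias.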

Although some reports claim that the ozone layer over Antarctica is recovering, it’s too early to make a definitive call. Measurements at surface monitoring stations show that the amount of ozone-destroying chemicals at the surface has been dropping since about 1994-1995. The amount is now about 10-15% down on that peak.

The stratosphere lags behind the surface, and the effects of this will take some time to play out. However, satellite measurements show that the decline in ozone amount in the stratosphere has stopped, and perhaps begun recovery.

The size and depth of the ozone hole each year shows quite large variability due to different meteorological conditions, in particular stratospheric temperatures.

How the 2014 hole measures up

Overall, out of the 35 years of satellite data analysed, the 2014 ozone hole is one of the smaller ones since the late 1980s. It ranks as the 18th largest in daily area; 16th largest for daily ozone deficit; and 21st lowest for minimum ozone.

The 2014 ozone hole appeared in the first week of August and grew rapidly in size from mid-August through to the second week of September, reaching 23.5 million square kilometres on September 15.

The 2014 hole is among the smallest since the mid-1990s.

During the third and fourth weeks of September the ozone hole area decreased to 17.5 million square kilometres. Then, in a final flurry, the daily ozone hole area grew sharply again during the last days of September to peak at 23.9 million square kilometres on October 1.

This is the peak daily ozone hole area for 2014, larger than in 2010, 2012 and 2013, about the same as 2009, and smaller than in 2011.

The ozone hole is now in the recovery phase, and had shrunk to about 9 million square kilometres by November 14. It is expected to close for the year within two to three weeks.

You can find the full details on the 2014 ozone hole here.

The Conversation

Paul Krummel receives funding from MIT, NASA, Australian Bureau of Meteorology, Department of the Environment, & Refrigerant Reclaim Australia.

Paul Fraser receives funding from MIT, NASA, Australian Bureau of Meteorology, Department of the Environment, & Refrigerant Reclaim Australia.

This article was originally published on The Conversation.
Read the original article.


Five tips for enjoying the festive season without gaining weight

The average Aussie risks gaining several kilos over the holiday period. That might sound like a small number, but few of us lose it when the festive season is over. We asked Professor Manny Noakes, research director of our Food and Nutrition Flagship and co-author of the famous Total Wellbeing Diet, for five tips on how to survive the silly season without gaining extra baggage.

‘Tis the season of indulgence.

1. Don’t just count kilojoules

Restricting your kilojoule intake is a surefire way to lose weight, but cutting back indiscriminately can lead to an unbalanced, unhealthy diet. Noakes recommends focusing on food groups rather than counting kilojoules. Ensuring you include food from each of the essential food groups each day is a better way to get healthy.

“It is a much easier approach because you get optimal nutrients without having to learn the kilojoules of hundreds of different foods,” Noakes says. The essential food groups include protein foods such as meat, fish, chicken and eggs; dairy foods; low GI grains and cereals; fruits and vegetables, and healthy oils such as spreads and nuts.

2. Limit indulgences

Thirty-five per cent of the average Australian’s diet comes from “discretionary” foods with little nutritional value, such as alcohol, chips, lollies and cakes. That adds up to a whopping 2500-3500 kilojoules a day. “If you want to lose weight, limiting indulgences can have a dramatic impact over a period of time,” says Noakes. Try limiting yourself to one small indulgence per day (see below), or seven in a week, to give yourself a small reward for eating well.

1 small indulgence equals:

  • 100ml Wine
  • 4 squares of chocolate
  • 1 fun size packet of potato chips
  • 1 scoop of ice cream
  • 1 chocolate biscuit

3. Stand every hour

It’s important to increase your everyday activity to help prevent weight gain and a good place to start is limiting the amount of time you sit per day.

Noakes recommends looking for opportunities to get off the couch or office chair and move wherever possible. For example taking the stairs, having short stand-up meetings at work, standing up when you take a phone call, or standing at parties rather than sitting. “Simply making an effort to spend less time sitting down and stand every hour can improve your health,” says Noakes.

4. Manage your appetite

During the festive season it’s hard to say no to holiday nibbles and cocktails or that extra snag at the weekend BBQ. Proactively managing your appetite with a higher protein, low GI diet can help prevent poor choices. “Protein controls appetite and low GI carbohydrates sustain energy, so having a light meal of 100g of lean protein food with a slice of grainy bread one hour before a party can help keep hunger in check,” says Noakes.

5. Sign up for a formal healthy eating plan

There is evidence showing that people who seek support in their weight loss efforts do better than those who go it alone. “We know that people who take part in weight loss programs find it easier to reach their goals,” says Noakes. “The support that people receive and the regular weight checks contribute to some of that success.

“The type of eating plan can also make a difference and a higher protein low GI plan has the best evidence for sustained weight loss success. That’s why we are releasing this new version of the Total Wellbeing Diet available as an online program in a new trial.”

Registrations for the online trial of our Total Wellbeing Diet are open until 10 November 2014. The cost for the 12-week program is $99 which is fully refundable if you complete the trial.

This article was originally published on Body & Soul


Smartphone app a life-saver for heart attack patients

A simple app can improve the rehabilitation of heart attack survivors

By Mohan Karunanithi, CSIRO; Marlien Varnfield, CSIRO, and Simon McBride, CSIRO

Patients recovering from heart attacks are almost 30% more likely to take part in rehab at home using a new smartphone app compared to those who had to travel to an outpatient clinic or centre.

What’s more, those who used the online delivery model – known as the Care Assessment Platform – were 40% more likely to adhere to the rules of the program and almost 70% more likely to see it through to completion.

The clinical trial, conducted by CSIRO and Queensland Health through the Australian E-Health Research Centre, also showed that the online model was just as clinically effective as a traditional rehab program.

Cardiologists were so pleased by the results that the next generation version of the platform is soon to be offered in a number of Queensland hospitals including Ipswich, Metro North and West Moreton Hospital and Health Services.

Why go digital?

Clinical guidelines recommend patients attend and complete a cardiac rehabilitation program following a heart attack. Studies have shown that those who do this have much better long-term health outcomes.

They are less likely to be re-admitted to hospital and will have a better quality of life. Most importantly, they are far less likely to have another cardiac event or die from their condition.

Traditionally, rehab programs take the form of group-based exercise, nutritional counselling, risk factor assessment and educational activities. They are designed to help patients return to an active, satisfying life.

Despite the benefits, uptake is generally poor. Successful recovery relies on patients starting and completing a rehabilitation program, but many find travelling to a health care facility on a weekly basis to be an onerous requirement – particularly for those who work or care for others, or live in regional or remote Australia where these services are not available.

The Care Assessment Platform features health and exercise monitoring tools and weekly mentoring consultations, delivers motivational materials through text messages, and contains multimedia that educates patients about disease symptoms and management.

Most importantly, it gives patients a more flexible option. By integrating rehab treatment with a patient’s daily life, they are more likely to complete the program and make their new healthy lifestyle permanent. This overcomes one of the key barriers to patient participation and recovery.

The cost of heart disease

There have been huge advances in our treatment of heart disease in recent years. Researchers from the University of New South Wales have shown that rapid response teams, first introduced in Australia in the 1990s, have halved cardiac arrests and associated deaths in our hospitals. This saves around 12,000 Australian lives each year.

Despite the success of innovations like this, cardiovascular disease still kills one Australian nearly every 12 minutes. And for many of these patients, it is not their first cardiac event.

With more than A$5.5 billion spent every year on acute and chronic management of heart disease, any digital technology that improves recovery rates offers huge potential to reduce the burden and cost to the community.

What does a fully digital health system look like?

Australia’s health system faces significant challenges including rising costs, an ageing population, a rise in chronic diseases and fewer rural health workers. Total government expenditure on health has trebled in the last 25 years.

Clearly, something needs to change.

Technology can help ease the burden on emergency departments.

We must reduce the reliance on our hospitals by helping patients before they are admitted. Digital tools can do this by moving many services into the home through broadband delivery and models of care based on rich digital information.

In another study, CSIRO is running Australia’s largest clinical telehealth trial. In this trial, we’ve equipped a group of elderly patients with broadband-enabled home monitoring systems.

Patients can use machines to measure vital signs such as blood pressure, blood sugar, heart abnormalities, lung capacity, body weight and temperature.

Data is then immediately available to the patient’s doctor or nurse, allowing them to provide appropriate care interventions much earlier. This helps patients stay out of hospital and improve their quality of life.

Ultimately, those patients who are chronically ill will still need to attend hospital on occasion, so technology also has a role to play in improving the effectiveness of our admission systems.

Emergency departments are critically overcrowded and struggle to respond to day-to-day arrivals in a timely manner. Big data analytics can be used to predict how many patients will arrive at emergency, their medical needs and how many will be admitted or discharged. This can then be used to calculate how many beds will be required to meet patients’ needs.
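As a toy illustration of the kind of calculation involved — not the actual analytics the article describes, and using entirely hypothetical numbers — a simple Poisson model can convert an expected daily admission count into a bed estimate:

```python
import math

def poisson_quantile(mean, q):
    """Smallest k such that P(X <= k) >= q for X ~ Poisson(mean)."""
    term = math.exp(-mean)  # P(X = 0)
    cdf = term
    k = 0
    while cdf < q:
        k += 1
        term *= mean / k    # P(X = k), built up iteratively
        cdf += term
    return k

# Hypothetical figures: 120 arrivals a day on average, about 30% admitted.
mean_admissions = 120 * 0.30
beds_needed = poisson_quantile(mean_admissions, 0.95)
print(f"Plan for {beds_needed} beds to cover 95% of days")
```

Real forecasting systems would account for day-of-week and seasonal patterns, lengths of stay and case mix; this sketch only shows why predicting arrivals translates directly into a bed number.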

New app-ortunities

The next step for our Care Assessment Platform research team is to adapt our mobile technology for rehabilitation for other chronic conditions such as pulmonary disease and diabetes. We’re also working hard to quantify the cost savings the program can deliver.

Australia has a track record in finding cures and new treatments for diseases. In order to sustain this, we also need to find new ways to deliver quality affordable care.

There is enormous potential for big data, analytics and decision support systems to help achieve this, reducing the burden on our health system and improving the wellbeing of all Australians.

The Conversation

Mohan Karunanithi receives funding from Queensland Health and The Prince Charles Foundation.

Simon McBride is affiliated with Health Informatics Society of Australia and the Australian College of Health Informatics.

Marlien Varnfield does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.


Genetic evolution: how the Ebola virus changes and adapts

As viruses replicate, their genome changes. Image: EPA/Ahmed Jallanzo

By Glenn Marsh, CSIRO

The current outbreak of Ebola virus in West Africa is unprecedented in size, with nearly 4,800 confirmed or probable cases and more than 2,400 deaths. People have been infected in Guinea, Liberia, Sierra Leone, Nigeria and Senegal.

The World Health Organization declared this outbreak a “public health emergency of international concern” in August and estimates it will claim a staggering 20,000 lives within the next six months.

A second completely independent and significantly smaller Ebola virus outbreak has been detected in the Democratic Republic of the Congo.

Like all viruses, the Ebola virus has evolved since the outbreak began. So, how does this occur and how does it impact our attempts to contain the disease?

Tracking Ebola

Ebolavirus and the closely related Marburgvirus genera belong to the Filoviridae family. Both of these genera contain viruses that may cause fatal haemorrhagic fevers.

The Ebola virus genus is made up of five virus species: Zaire ebolavirus (responsible for both of the current outbreaks), Sudan ebolavirus, Reston ebolavirus, Bundibugyo ebolavirus and Taï Forest ebolavirus.

In order to better understand the origin and transmission of the current outbreak in West Africa, researchers from the Broad Institute and Harvard University, in collaboration with the Sierra Leone Ministry of Health, sequenced 99 virus genomes from 78 patients.

The study, reported in Science, shows the outbreak resulted from a single introduction of virus into the human population and then ongoing human-to-human transmission. The scientists reported more than 300 unique changes within the virus causing the current West African outbreak, which differentiates this outbreak strain from previous strains.

The current Ebola outbreak has infected nearly 5,000 people. Image: EPA/Ahmed Jallanzo

Within the 99 genomes sequenced from this outbreak, researchers have recorded approximately 50 other changes to the virus as it spreads from person to person. Future work will investigate whether these differences are contributing to the severity of the current outbreak.

These 99 genome sequences have been promptly released to publicly available sequence databases such as Genbank, allowing scientists globally to investigate changes in these viruses. This is critical in assessing whether the current molecular diagnostic tests can detect these strains and whether experimental therapies can effectively treat the circulating strains.

How does Ebola evolve?

This is the first Ebola virus outbreak where scientists have sequenced viruses from a significant number of patients. Despite this, the Broad Institute/Harvard University study findings are not unexpected.

The Ebola virus genome is made up of RNA, and the viral polymerase that copies it does not have an error-correction mechanism. This is where it gets a little complicated, but bear with me.

As the virus replicates, it is expected that the virus genome will change. This natural change of virus genomes over time is why influenza virus vaccines must be updated annually and why HIV mutates to become resistant to antiretroviral drugs.
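A toy simulation makes this concrete. The sketch below copies a simplified genome from host to host, introducing random errors at each pass; the genome length is roughly Ebola-sized, but the per-site error rate and number of transmissions are assumed illustrative values, not measured Ebola figures:

```python
import random

random.seed(1)

GENOME_LENGTH = 19_000   # the Ebola genome is roughly 19 kb
MUTATION_RATE = 1e-4     # assumed errors per site per transmission (illustrative)
TRANSMISSIONS = 20       # person-to-person passes

def copy_with_errors(genome):
    """Copy a genome, replacing each base with probability MUTATION_RATE."""
    bases = "ACGU"
    return [
        random.choice([b for b in bases if b != site])
        if random.random() < MUTATION_RATE else site
        for site in genome
    ]

genome = ["A"] * GENOME_LENGTH  # simplified starting sequence
original = list(genome)
for _ in range(TRANSMISSIONS):
    genome = copy_with_errors(genome)

changed = sum(1 for a, b in zip(original, genome) if a != b)
print(f"Sites changed after {TRANSMISSIONS} transmissions: {changed}")
```

Even with a tiny per-site error rate, changes accumulate steadily across a chain of transmissions — the same logic behind the hundreds of unique changes sequenced in the West African outbreak.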

Changes are also expected when a virus crosses from one species to another. In the case of Ebola virus, bats are considered to be the natural host, referred to as the “reservoir host”. The virus in bats will have evolved over time to be an optimal sequence for bats.

Knowing how the Ebola virus adapts will help health officials contain future outbreaks. Image: EPA/Ahmed Jallanzo

Crossing over into another species, in this case people, puts pressure on the virus to evolve. This evolution can lead to “errors” or changes within the virus which may make the new host sicker.

Ebola viruses are known to rapidly evolve in new hosts, as we’ve seen in the adaptation of lab-based Ebola viruses to guinea pigs and mice. This adaptation occurred by passing a low-pathogenic virus from one animal to the next until the Ebola virus was able to induce a fatal disease. Only a small number of changes were required in both cases for this to occur.

While this kind of viral mutation is well known with other viruses, such as influenza virus, we are only truly appreciating the extent of it with the Ebola viruses.

What do the genetic changes mean?

The Broad Institute/Harvard University study reported that the number of changes in genome sequences from this current outbreak was two-fold higher than in previous outbreaks.

This could be due to the increased number of sequences obtained over a period of several months, and the fact that the virus has undergone many person-to-person passes in this time.

However, it will be important to determine if virus samples from early and late in the outbreak have differing ability to cause disease or transmit. The genetic changes may, for example, influence the level of infectious virus in bodily fluids, which would make the virus easier to spread.

Analysing this data will help us understand why this outbreak has spread so rapidly with devastating consequences and, importantly, how we can better contain and manage future outbreaks.

Glenn Marsh receives funding from Australian National Health and Medical Research Council and Rural Industries Research and Development Corporation.

This article was originally published on The Conversation.
Read the original article.


Historic collections could be lost to ‘digital dinosaurs’

An image of Australian shearers taken on glass plate negative is now preserved in a digital collection. Powerhouse Museum Collection/Flickr

By Michael Brünig, CSIRO

Australia’s museums, galleries and other cultural institutions must adopt more of a digital strategy with their collections if they are to remain relevant to audiences.

Only about a quarter of the collections held by the sector have been digitised so far and a study out today says more needs to be done to protect and preserve the material, and make it available to people online.

Challenges and Opportunities for Australia’s Galleries, Libraries, Archives and Museums is a joint study by CSIRO and the Smart Services CRC.

It notes that Australia’s galleries, libraries, archives and museums (the GLAM sector) represent our accumulated achievements and experiences, inspire creativity and provide a place for us to connect with our heritage.

They are also crucial to our economy, with the GLAM sector estimated to generate revenue of about A$2.5 billion each year. That’s not only a lot of paintings and artefacts, but a lot of jobs as well.

But despite its size and scope, we found that digital innovation in the sector has been inconsistent and isolated. If these cultural institutions don’t increase their use of digital technologies and services, they risk losing their relevance.

So what changes do they need to make in order to thrive in the digital economy?

Opening doors and minds

With Australia’s rapid uptake of online and mobile platforms, people are now choosing to access and share information in very different ways.

It’s safe to say that the only constant in this space is change. Research suggests that expectations for more personalised, better and faster services and more well-designed experiences will continue to increase.

Virtual tours are now possible at the National Museum of Australia.

This is why our cultural institutions need to review the kind of visitor experience they are providing. We found only a few organisations had made fundamental changes to their operations that would allow them to place digital services at their core, rather than as an add-on activity.

This is in contrast to the dramatic changes we’ve seen when it comes to adopting digital technologies in our daily lives.

In order to be successful, digital experiences need to be an integrated and cohesive part of an institution’s offering.

Take what is happening at the National Museum of Australia. It’s now possible to take a tour of the museum via a telepresence-enabled robot.

This means school students – particularly those in rural and regional Australia – can explore exhibits virtually, without even leaving the classroom. Interestingly, we hear that this actually increases their desire to visit the museum in person.

Digital-savvy innovations such as this need to be at the fore of our institutions’ thinking if they want to engage with the community and break down barriers to participation.

Engaging with the public

To be successful in this new era, institutions need to meet people on their own (digital) terms. We can no longer expect visitors to queue at the turnstiles waiting for opening time. Organisations need to bring experiences to the user so that they can access them wherever and however they prefer.

Some of Australia’s cultural institutions are starting to get this.

Another image available freely online as part of the Powerhouse Museum Collection. Powerhouse Museum/Flickr

The NSW State Library has appointed a Wikipedian-In-Residence to contribute expertise and train the public in publishing information online.

The National Library of Australia has attracted a large online user base with its online Trove service attracting almost 70,000 unique users each day.

The Powerhouse Museum has made parts of their photographic collections available on Flickr via Creative Commons licensing. This has caused a surge in the level of use and allowed the public to contribute information, adding value to the collection.

While these examples provide a lot of hope for the sector, the unfortunate reality is that they are few and far between. Most of Australia’s cultural institutions have not kept pace with this change and are missing the opportunity to better connect and actually increase their revenue.

Digitise this!

Australia’s eight national, state and territory art organisations hold archives that, if laid out flat end-to-end, would span 629km. This is on top of a staggering 100,000 million artworks, books and audio-visual items in Australia.

But only a quarter of these items are digitised, with some of Australia’s collections still being managed through “old school” mechanisms such as log books and card indices.

Imagine if there were a fire at one of our great institutions. We would risk losing cultural and heritage material of significance. Parts of our history could be completely lost. Even without such a devastating event, if we don’t make our collections more accessible, in a sense they’ll be lost to many of us anyway.

As a country, not only do we need to get moving when it comes to digitising our collections, we also need to explore new and innovative ways to do this. Traditionally, digitisation has meant scanning flat documents, photographing objects or creating electronic versions of catalogue data.

But what if we could do so much more? Researchers are now focused on the next challenge: digitising objects and spaces in three dimensions.

Researchers from the University of Wollongong with support from the Smart Services CRC are focusing on capturing 3D models and the textures of surfaces using low-cost equipment such as a Kinect camera from an Xbox.

3D map of The Shrine of Remembrance, Melbourne

At CSIRO, we’ve even used our own handheld scanner Zebedee to map culturally and environmentally significant sites such as the Jenolan Caves, Melbourne’s Shrine of Remembrance and even the semi-submerged wreck of the HMQS Gayundah.

We’re also creating high-quality 2D and 3D image libraries based on the National Biological Collections, letting us document biodiversity in the digital era.

Embracing the digital economy

While our study reveals that Australia’s cultural institutions are certainly at risk of becoming “digital dinosaurs”, it also demonstrated how those organisations that are embracing digital are reaping the benefits.

It provides recommendations for the GLAM industry in order for it to maximise its digital potential, including:

  • shifting to open access models and greater collaboration with the public
  • exploring new approaches to copyright management that stimulate creativity and support creators
  • building on aggregation initiatives such as the Atlas of Living Australia
  • standardising preservation of “born digital” material to avoid losing access to digital heritage
  • exploiting the potential of Australia’s Academic and Research Network (AARNet) and the National Broadband Network (NBN) for collection and collaboration.

By adopting these recommendations and building on some innovative examples in the sector, Australia’s GLAM industry will be well placed to embrace digital, rather than be engulfed by it.

This article was originally published on The Conversation.
Read the original article.


99.999% certainty humans are driving global warming: new study

Gambling for high stakes

Would you take a gamble with these odds? Image: Kraevski Vitaly/Shutterstock

By Philip Kokic, CSIRO; Mark Howden, CSIRO, and Steven Crimp, CSIRO

There is less than 1 chance in 100,000 that global average temperature over the past 60 years would have been as high without human-caused greenhouse gas emissions, our new research shows.

Published today in the journal Climate Risk Management, our research is the first to quantify the probability of historical changes in global temperatures and examine their links to greenhouse gas emissions using rigorous statistical techniques.

Our new CSIRO work provides an objective assessment linking global temperature increases to human activity, and points to a near-certain probability exceeding 99.999%.

Our work extends existing approaches undertaken internationally to detect climate change and attribute it to human or natural causes. The 2013 Intergovernmental Panel on Climate Change Fifth Assessment Report provided an expert consensus that:

It is extremely likely [defined as 95-100% certainty] that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic [human-caused] increase in greenhouse gas concentrations and other anthropogenic forcings together.

Decades of extraordinary temperatures

July 2014 was the 353rd consecutive month in which global land and ocean average surface temperature exceeded the 20th-century monthly average. The last time the global average surface temperature fell below that 20th-century monthly average was in February 1985, as reported by the US-based National Climatic Data Center.

This means that anyone born after February 1985 has not lived a single month where the global temperature was below the long-term average for that month.

We developed a statistical model that related global temperature to various well-known drivers of temperature variation, including El Niño, solar radiation, volcanic aerosols and greenhouse gas concentrations. We tested it to make sure it worked on the historical record and then re-ran it with and without the human influence of greenhouse gas emissions.

Our analysis showed that the probability of getting the same run of warmer-than-average months without the human influence was less than 1 chance in 100,000.
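As a back-of-the-envelope illustration of why such a run is so improbable without a warming trend, consider a deliberately naive toy model (this is not our study’s statistical method, which accounts for autocorrelation and known drivers such as El Niño): if each month were an independent 50/50 chance of landing above the long-term average, the odds of 353 above-average months in a row would be vanishingly small.

```python
import math

# Toy model, NOT the study's statistical method: treat each month as an
# independent 50/50 chance of landing above the long-term average.
# Real monthly anomalies are strongly autocorrelated, so this is only
# an order-of-magnitude illustration of the scale involved.
months = 353
p_run = 0.5 ** months          # probability of the full run under the toy model
order = math.log10(p_run)      # roughly -106: about one chance in 10^106
```

Because real monthly temperatures are persistent rather than independent, the genuine probability is far larger than this toy figure, which is exactly why a rigorous statistical model is needed to arrive at a defensible number like 1 in 100,000.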

We do not use physical models of Earth’s climate; instead we use observational data and rigorous statistical analysis, which has the advantage of providing independent validation of the results.

Detecting and measuring human influence

Our research team also explored the chance of relatively short periods of declining global temperature. We found that rather than being an indicator that global warming is not occurring, the observed number of cooling periods in the past 60 years strongly reinforces the case for human influence.

We identified periods of declining temperature by using a moving 10-year window (1950 to 1959, 1951 to 1960, 1952 to 1961, etc.) through the entire 60-year record. We identified 11 such short time periods where global temperatures declined.

Our analysis showed that in the absence of human-caused greenhouse gas emissions, there would have been more than twice as many periods of short-term cooling as are found in the observed data.
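The moving-window procedure above can be sketched in a few lines. This is a minimal illustration, not our study’s actual code: the `anomalies` array and the use of an ordinary least-squares slope as the trend test are assumptions made for the example.

```python
import numpy as np

def count_cooling_windows(anomalies, window=10):
    """Slide a `window`-year frame along an annual temperature record and
    count the positions where the ordinary least-squares trend is negative,
    i.e. short periods of declining temperature."""
    years = np.arange(window)
    return sum(
        np.polyfit(years, anomalies[start:start + window], 1)[0] < 0
        for start in range(len(anomalies) - window + 1)
    )

# A 60-year record yields 60 - 10 + 1 = 51 overlapping windows; on a
# steadily warming series every window trends upward, so none qualify.
steady_warming = 0.02 * np.arange(60)
assert count_cooling_windows(steady_warming) == 0
```

On a noisy warming record, year-to-year variability can still produce a handful of negative-trend windows, like the 11 found in the observations, even though the long-term trend is upward.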

There was less than 1 chance in 100,000 of observing 11 or fewer such events without the effects of human greenhouse gas emissions.

Good risk management is all about identifying the most likely causes of a problem, and then acting to reduce those risks. Some of the projected impacts of climate change can be avoided, reduced or delayed by effective reduction in global net greenhouse gas emissions and by effective adaptation to the changing climate.

Ignoring the problem is no longer an option. With a probability exceeding 99.999% that the warming we are seeing is human-induced, doing nothing is a chance we certainly shouldn’t take.

The Conversation

The authors do not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article. They also have no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.


Australia’s astronomy future in a climate of cutbacks

Parkes radio telescope

What future for the Parkes radio telescope amid the CSIRO cutbacks? Image: Wayne England

By Lewis Ball, CSIRO

The future looks very bright for Australian radio astronomy but it was somewhat clouded earlier this year when CSIRO’s radio astronomy program took a dramatic hit in the Australian federal budget.

CSIRO has cut its funding for radio astronomy by 15%, down A$3.5 million to A$17 million for the 2014-15 financial year. The result will be a reduction of about 30 staff from the plan of just three months ago.

The cuts will impact most heavily on CSIRO’s in-house astronomy research, on the operation of the Parkes radio telescope – instantly recognisable from the movie The Dish – on the less well known but tremendously productive Australia Telescope Compact Array near Narrabri and on the Mopra Telescope near Coonabarabran, all in New South Wales.

The Australia Telescope Compact Array.

The Australia Telescope Compact Array. Image: D. Smyth

About two-thirds of ATNF’s staffing reduction will be effected through not filling planned new roles, most prominent of which was to be a CSIRO “SKA Chief Scientist”. A third of the reduction will be through involuntary redundancies. Eight staff across sites in Sydney, Parkes, Narrabri and Geraldton have already been informed that their roles are expected to cease.

The speed of implementation of such a substantial funding reduction forces swift action. This has unsettled staff and the broader astronomy community, but it hasn’t changed the broad direction of CSIRO’s astronomy program.

World leaders in radio astronomy

For 50 years, Australian scientists and engineers have been world leaders in radio astronomy, both in understanding our universe and in developing some of the most innovative technologies used to gain that understanding.

CSIRO’s Australia Telescope National Facility (ATNF) has been integral to the discovery of the first double pulsar system (a long-sought holy grail of astronomy), the identification of a previously unknown arm of our own galaxy, the Milky Way, and the invention of Wi-Fi, now so embedded in everyday communications.

For the past decade CSIRO has been steadily changing the way it operates its radio astronomy facilities. CSIRO’s highest priority is the pursuit of science enabled by an innovative new technology that provides an unprecedentedly wide field of view.

This uses “Phased Array Feeds” (PAFs) as multi-pixel radio cameras at the focus of dishes. PAFs are being deployed in the Australian SKA Pathfinder (ASKAP), in Western Australia, which will be the fastest radio telescope in the world for surveying the sky.

ASKAP telescopes in WA

High-speed radio astronomy surveys will be possible thanks to the PAF receivers (green chequerboard at top of the quadrupod) on the ASKAP telescopes in Western Australia.

ASKAP is in the early stages of commissioning. It is just now starting to demonstrate the new capabilities obtainable with a PAF-equipped array.

ASKAP is an outstanding telescope in its own right but is also a pathfinder to the huge Square Kilometre Array (SKA). This enormous project will build the world’s biggest astronomy observatory in Australia and southern Africa. It’s also the most expensive at a cost of around A$2.5 billion.

Cutbacks at The Dish

To resource these exciting developments, CSIRO has been reducing costs and staffing at its existing facilities, including the venerable Parkes Dish. This is a painful but necessary process. The most recent funding cuts will result in more pain.

Astronomers will no longer have the option of travelling to the Compact Array to operate the telescope and collect their data. Instead, they can run the telescope from CSIRO’s operations centre in Sydney, from their own university, or from anywhere in the world via an internet connection.

Astronomers who use the Parkes telescope have been doing this for the past year after a very successful program to make the 50-year-old dish remotely operable. That is pretty amazing for a machine built before the advent of modern computers.

Parkes telescope

The Parkes dish gets the remote treatment.
Image: John Sarkissian

For many decades Parkes staff have swapped detector systems or “radio receivers” in and out of the focus cabin, the box at the tip of the tripod that sits about 64 metres off the ground. Each receiver operates at different wavelengths and offers quite different types of science.

It seems likely that CSIRO will offer just two Parkes receivers for at least the next six to 12 months, since it will no longer have the staff needed to swap receivers. Similar reductions in the capability of the Compact Array will also be needed to fit within the budget.

The future

While the current changes are painful, the future is incredibly exciting. The direction of Australia’s astronomy is described in the Decadal Plan for Australian Astronomy for 2006–2015. It identifies participation in the SKA and access to the world’s largest optical telescopes as the two highest priorities for Australian astronomy.

We are making progress on both fronts, despite some significant challenges. The process to develop the plan for the next decade is well in hand under the stewardship of the National Committee for Astronomy.

Phased arrays are also at the heart of the Murchison Widefield Array (MWA), another innovative SKA precursor that has been in operation for a little over a year.

ASKAP and the MWA are located in the Murchison region of Western Australia, chosen because it has a tremendously low level of human activity and so astonishingly little background radio noise.

This radio quietness is the equivalent of the dark skies so important for optical astronomers. Less noise means astronomers are better able to detect and study the incredibly weak radio signals from the most distant parts of the universe.

This freedom from radio interference is a unique resource available only in remote parts of Australia and is essential for ASKAP, MWA and much of the science targeted by the SKA.

PAFs

Prototype of the more sensitive second-generation PAFs to be deployed on ASKAP undergoing tests in Western Australia in August 2014.
Image: A. Hotan

The wide fields of view of ASKAP and the MWA enable unprecedented studies of the entire radio sky. Astronomers will measure the radio emission of millions of galaxies and complete massive surveys that for the first time will connect radio astronomy to the more mature field of optical astronomy.

Mapping the sky with EMU and WALLABY

The two highest priority projects for ASKAP are called the Evolutionary Map of the Universe (EMU) and the Widefield ASKAP L-Band Legacy All-Sky Blind Survey (WALLABY).

Both will survey millions of galaxies and together they will trace the formation and evolution of stars, galaxies and massive black holes to help us explore the large-scale structure of the universe.

The MWA is already producing great science targeted at the detection of intergalactic hydrogen gas during what’s known as the “epoch of reionisation” when the first stars in the universe began to shine.

With the SKA we aim to understand what the mysterious dark matter and dark energy are. We may also see another spin-off like Wi-Fi, which came from CSIRO efforts to detect the evaporating black holes predicted by Stephen Hawking.

Advances in data-mining or processing techniques driven by the astonishing data rates that will be collected by the thousands of SKA antennas deployed across the Australian and African continents might provide the most fertile ground of all, illustrating once again the long-term benefits of investing in cutting-edge science.

Lewis Ball has received funding from the Australian Research Council. CSIRO Astronomy and Space Science receives funding from a variety of government sources, and from NASA/JPL.

This article was originally published on The Conversation.
Read the original article.

