These days, massive volumes of data about us are collected from censuses and surveys, computers and mobile devices, as well as scanning machines and sensors of many kinds. But these data can also reveal personal and sensitive information about us, raising serious privacy concerns.
Data are routinely collected when we shop, use public transport, visit our GP or access government services in person or online. There’s also data from using our smart phones and fitness monitoring devices.
These data are generally collected for a purpose, called the "primary purpose". For example: getting purchased goods delivered, catching a bus from home to work, having a health check, obtaining a Medicare refund, navigating or searching our local area, or logging our fitness regime.
But in addition to being used for such primary purposes, many data are stored and used for other purposes, called “secondary purposes”. This includes research to help inform decision-making and debate within government and the community.
For example, data from Medicare, the Pharmaceutical Benefits Scheme and hospitals can be used to identify potential adverse drug reactions much faster than is currently possible.
What about privacy?
But these data can also reveal highly sensitive information about us, such as our preferences, behaviours, friends and whether or not we have a disease.
Given the rapid change in the volume and nature of data in the digital age, it is timely to ask whether the existing ethics frameworks for the secondary use of such data are still adequate. Do they address the right ethical issues associated with research using the data? In particular, how will an individual’s privacy be protected?
There have been two important responses to these issues. A group of researchers, supported by the University of Melbourne and the Carlton Connect Initiative, explored these issues through workshops, desk research and many consultations.
They produced the Guidelines for the Ethical Use of Digital Data in Human Research. It’s a work in progress, requiring ongoing practice and revision, rather than a definitive set of prescriptions.
A team at CSIRO and the Sax Institute also addressed the deeper ethical issue of protecting privacy in the secondary use of health data. This work will be developed into Guidelines for Confidentiality Protection in Public Health Research Results.
Ethical issues for digital data
The first set of guidelines identifies five key categories of ethical issues as highly relevant to digital data, each requiring additional consideration:
- Consent: making sure that participants can make informed decisions about their participation in the research
- Privacy and confidentiality: privacy is the control that individuals have over who can access their personal information. Confidentiality is the principle that only authorised persons should have access to information
- Ownership and authorship: who has responsibility for the data, and at what point does the individual give up their right to control their personal data?
- Data sharing – assessing the social benefits of research: data matching and re-use of data from one source or research project in another
- Governance and custodianship: oversight and implementation of the management, organisation, access and preservation of digital data.
The voluntary guidelines were developed to help people conducting research and to assist ethics committees to assess research involving digital data.
Without such guidelines, there is a risk that new ethical issues involving digital data will not be adequately considered and managed by researchers and ethics committees.
Privacy risks from the data
Traditionally, the data custodians responsible for granting access to data sets have sought to protect people’s confidentiality by only providing access to approved researchers. They also restricted the detail of the data released, such as replacing age or date of birth by month or year of birth.
More recently, data custodians are increasingly being asked for highly flexible access to more and more details about individual persons from an expanded range of data collections.
Custodians are responding by developing a new flexible range of access modes or mechanisms, including remote analysis systems and virtual data centres.
Under remote analysis, a researcher does not have access to any of the data but submits queries and receives analysis results through a secure webpage.
A virtual data centre is less restrictive than a remote analysis system. It enables researchers to interact directly with data, submit queries and receive results through a secure interface.
But the results of statistical analysis as released by a virtual data centre may still reveal personal information. For example, if a result such as an average is computed on a very small number of people then it is probably very close to the value for each of those people.
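One common safeguard, sketched below under assumed rules (the threshold and function names here are hypothetical, not any real custodian's system), is to suppress any result computed on too few records:

```python
# Hypothetical sketch of a "small cell" suppression rule of the kind
# used in remote analysis systems. The threshold value is an
# illustrative assumption; real custodians set their own rules.

MIN_GROUP_SIZE = 5  # assumed minimum number of contributing records

def safe_average(values):
    """Release a mean only if enough people contribute to it."""
    if len(values) < MIN_GROUP_SIZE:
        return None  # suppressed: result would be too revealing
    return sum(values) / len(values)

# An "average" over one person's records is just that person's value,
# so it is withheld; a larger group is safe to release.
print(safe_average([87.5]))                 # None (suppressed)
print(safe_average([70, 80, 90, 75, 85]))   # 80.0
```

Real systems combine checks like this with many other disclosure controls, but the principle is the same: the smaller the group behind a statistic, the more it reveals about individuals.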
By following such voluntary guidelines, researchers can maintain confidentiality while ensuring that society can benefit from their work.
The rapid technological advances in our society are creating more and more data archives of many different types. It’s vital that we continue to assess the ethical and privacy risks from secondary use of this data if researchers are to reap the potential benefits from access to the information.
Bushfires are highly chaotic natural events, dangerous to people and homes in their path and even more dangerous to those brave enough to fight them.
Australia is all-too-familiar with tragedy caused by bushfire, with days such as Ash Wednesday and Black Saturday ingrained into public and personal memories. The costs in a bad bushfire season can run into billions of dollars, although nothing can truly account for the lives and communities affected by these events.
Bushfires are hard to predict for two reasons. No-one can be sure where or when they will start, although well-educated guesses can be made.
Weather conditions conducive to the outbreak of bushfires are well known and serve to prompt total fire bans to reduce the chance of accidental ignitions. Unfortunately, some of the most frequent causes – lightning strikes and arson – are inherently unpredictable.
Once a bushfire has started it is also difficult to predict precisely where it will go.
While all bushfires do follow well-understood physical laws, fine-scale variations in factors such as the weather, topography and distribution of fuel mean that a bushfire may appear to behave erratically.
Sudden shifts in the wind direction may cause a quiescent flank to burst to life, creating a new wider fire front. A single tree next to a road or river may enable the fire to jump across an otherwise impassable barrier.
Fighting and controlling fires is a major difficulty for emergency services due to this level of uncertainty. Even deciding the best evacuation routes in uncertain fire conditions can be challenging.
Studying bushfire behaviour
This apparent unpredictability has not deterred fire scientists. Since the early part of the last century, they have been carefully studying the behaviour and spread of fires in different conditions.
The results have been collected and tabulated into mathematical formulae to predict how fast a fire will spread. These have been used in Australia for many years for early warning and planning purposes.
But the speed of a fire depends on a wide range of factors. These range from large scale effects, such as the weather or slope of the land, to the small scale, such as whether the fire is burning through leaf litter or grass. The resulting mathematical calculations are complicated, as all of these factors must be included.
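The flavour of these formulae can be illustrated with one published example: the McArthur Forest Fire Danger Index, in the equation form given by Noble, Gill and Bary (1980). It condenses several weather factors into a single danger rating; note it is an index, not the full fire-spread calculation, and the inputs below are illustrative.

```python
import math

# McArthur Forest Fire Danger Index (FFDI), equation form from
# Noble, Gill and Bary (1980). A danger rating, not a complete
# fire-spread model.

def ffdi(drought_factor, humidity_pct, temp_c, wind_kmh):
    return 2.0 * math.exp(
        -0.45
        + 0.987 * math.log(drought_factor)  # fuel dryness (scale 0-10)
        - 0.0345 * humidity_pct             # relative humidity (%)
        + 0.0338 * temp_c                   # air temperature (C)
        + 0.0234 * wind_kmh                 # wind speed (km/h)
    )

# A hot, dry, windy day (illustrative inputs):
index = ffdi(drought_factor=8, humidity_pct=25, temp_c=30, wind_kmh=20)
# index comes out around 18, a "high" rating on the traditional scale
```

Even this single index mixes four measurements; a full spread prediction must also account for terrain, fuel type and fuel load, which is why the complete calculations are so complicated.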
Fire science, like many other science disciplines, has benefited from the recent growth in computer processing and data storage. These advances mean meteorological models can now give weather forecasts at very fine scales.
Improvements in computer algorithms have led to newer, more powerful, models to represent spreading fires. Growth in data storage has allowed the creation of detailed maps of terrain and vegetation.
Spark: a new insight into bushfire spread simulation
Fire spread simulation sits at the intersection of a number of disciplines, including ecology, geography, physics, meteorology, mathematics and computer science. When simulating fires, all of these must work together.
To do this most effectively, a new way to bring all of these parts together was needed. This led to the creation of a new software system called Spark.
Spark is a bushfire prediction framework containing all the parts needed to process fine-scale weather and fuel data, run advanced fire simulations and depict the results. The system will be released today at the Australia New Zealand Disaster Management Conference on the Gold Coast.
The parts that make up Spark can also be connected together in whichever way best suits the user. This also has the advantage that as new models come along, the older parts in the system can simply be replaced.
The system enables scientists from multiple disciplines to collaborate. Currently, fire scientists are working to improve fire behaviour models, computer scientists are building new ways to simulate perimeter propagation and software engineers are developing the system on the latest computational hardware.
Spark has been built with the uncertainty of fire behaviour foremost in mind. For predictions of ongoing fires, multiple different cases can be run for slightly different weather forecasts.
The system contains statistical components that allow the results to be combined into maps showing the likelihood that the fire will reach a given location by a given time.
Other current research involves improving fire predictions by using a range of conditions, some likely and others very unlikely.
These predictions can be combined with real-world measurements of the fire using a statistical method to feed back into the model. This allows the model to respond to changing conditions, including highly unlikely events, providing better predictions of future fire behaviour.
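The ensemble idea in the last few paragraphs can be sketched in a few lines. Everything below is a toy stand-in (a one-dimensional spread rate and a Gaussian wind perturbation), not Spark's actual models:

```python
import random

# Toy ensemble sketch: run many one-dimensional spread cases under
# perturbed wind forecasts and estimate the chance the fire front
# reaches a point of interest in time. All rates and uncertainties
# below are made-up illustrative numbers.

random.seed(1)

def front_distance_km(hours, wind_kmh):
    # toy spread model: rate of spread grows linearly with wind speed
    rate_kmh = 0.1 + 0.02 * wind_kmh
    return rate_kmh * hours

def arrival_probability(distance_km, hours, forecast_wind_kmh, runs=10000):
    hits = 0
    for _ in range(runs):
        wind = random.gauss(forecast_wind_kmh, 5.0)  # forecast uncertainty
        if front_distance_km(hours, wind) >= distance_km:
            hits += 1
    return hits / runs

# Chance the front covers 2 km within 4 hours, given a 20 km/h forecast:
p = arrival_probability(distance_km=2.0, hours=4.0, forecast_wind_kmh=20.0)
```

Repeating this calculation for every location on a map is, in essence, how an ensemble of simulations becomes a fire-arrival likelihood map.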
Bringing the latest fire science to the fireground
The collaborative approach behind Spark means that services and agencies using the system will benefit from the latest advances in fire science.
The system can be fully customised and can be integrated with existing systems. Spark can also be built into any number of applications, such as evacuation planning or fire regime tools.
Spark can also be used for land management and planning, fire mitigation analysis, real-time fire prediction, risk analysis or reconstruction and analysis of fire events.
James Hilton is Research scientist at CSIRO.
Andrew Sullivan is Research Team Leader, Bushfire Behaviour and Risks at CSIRO.
Mahesh Prakash is Principal Research Scientist, Fluid Dynamics at CSIRO.
Ryan Fraser is Research Manager at CSIRO.
Australia’s CSIRO has come up with some pretty amazing inventions over the past 86 years of research, from polymer banknotes to insect repellent and the world-changing Wi-Fi. But we can also lay claim to something a little more esoteric – we actually invented a whole new word.
The word is “petrichor”, and it’s used to describe the distinct scent of rain in the air. Or, to be more precise, it’s the name of an oil that’s released from the earth into the air before rain begins to fall.
This heady smell of oncoming wet weather is something most Australians would be familiar with – in fact, some scientists now suggest that humans inherited an affection for the smell from ancestors who relied on rainy weather for their survival.
Even the word itself has ancient origins. It’s derived from the Greek “petra” (stone) and “ichor” which, in Greek mythology, is the ethereal blood of the gods.
But the story behind its scientific discovery is a lesser known tale. So, how is it that we came to find this heavenly blood in the stone?
"Nature of Argillaceous Odour" might be a mouthful, but this was the title of the paper, published in the journal Nature on March 7, 1964, by CSIRO scientists Isabel (Joy) Bear and Richard Thomas, that first described petrichor.
Thomas had for years been trying to identify the cause of what was a long-known and widespread phenomenon. As the paper opened:
That many natural dry clays and soils evolve a peculiar and characteristic odour when breathed on, or moistened with water, is recognised by all the earlier text books of mineralogy.
The odour was particularly prevalent in arid regions and was widely recognised and associated with the first rains after a period of drought. The paper went on to say:
There is some evidence that drought-stricken cattle respond in a restless manner to this "smell of rain".
The smell had actually been described already by a small perfumery industry operating out of India, which had successfully captured and absorbed the scent in sandalwood oil. They called it “matti ka attar” or “earth perfume”. But its source was still unknown to science.
Joy and Richard, working at what was then our Division of Mineral Chemistry in Melbourne, were determined to identify and describe its origin.
By steam distilling rocks that had been exposed to warm, dry conditions in the open, they discovered a yellowish oil – trapped in rocks and soil but released by moisture – that was responsible for the smell.
The diverse nature of the host materials has led us to propose the name “petrichor” for this apparently unique odour which can be regarded as an “ichor” or “tenuous essence” derived from rock or stone.
The oil itself was thus named petrichor — the blood of the stone.
Bring on the humidity
The smell itself comes about when increased humidity – a precursor to rain – fills the pores of stones (rocks, soil, etc) with tiny amounts of water.
While it’s only a minuscule amount, it is enough to flush the oil from the stone and release petrichor into the air. This is further accelerated when actual rain arrives and makes contact with the earth, spreading the scent into the wind.
According to the Nature Paper:
In general, materials in which silica or various metallic silicates predominated were outstanding in their capacity to yield the odour. It was also noted that the odour could be obtained from freshly ignited materials rich in iron oxide, with or without silica.
It’s a beautiful sequence of events, but one that may be hard to visualise.
Thankfully, in a testament to the ongoing scientific fascination with this finding, a team of scientists at the Massachusetts Institute of Technology have just this year released a super slow motion video of the petrichor process in motion.
Using high-speed cameras, the researchers observed that when a raindrop hits a porous surface, it traps tiny air bubbles at the point of contact. As in a glass of champagne, the bubbles then shoot upward, ultimately bursting from the drop in a fizz of aerosols.
The team was also able to predict the amount of aerosols released, based on the velocity of the raindrop and the permeability of the contact surface, which may explain how certain soil-based diseases spread.
There’s a small body of research and literature on petrichor that’s fascinating in its own right, including Thomas and Bear’s subsequent paper Petrichor and Plant Growth a year after they first named the smell.
So what happened to Joy Bear and Richard Thomas?
Richard had actually retired from CSIRO in 1961, when he was First Chief of the Division of Mineral Chemistry. He died in 1974, aged 73.
Joy, aged 88, a true innovator and pioneer in her field, retired from CSIRO only in January this year, after a career spanning more than 70 years.
The joint discovery of petrichor was just part of a truly remarkable and inspiring career which culminated in 1986, with Joy’s appointment as a Member of the Order of Australia for services to science.
We are thankful to both for the lasting legacy of giving a name to the smell of rain, and to Joy for the role model she has been to so many women in science.
This is part of a series on CSIRO Inventions.
This article was originally published on The Conversation.
Just as we don’t all have the same tastes or preferences for football codes or teams – or even genres of music or flavours of ice cream – so too we don’t all have the same tastes or preferences when it comes to science.
Last year the CSIRO released the results of a major survey into public attitudes towards science and technology, and found four key segments of the population that view science in very different ways:
A: Fan Boys and Fan Girls. This group is about 23% of the population and they are very enthusiastic about science and technology. Science is a big part of their lives and they think everyone should take an interest in it.
B: The Cautiously Keen make up about 28% of the public. They are interested in science and technology, but can be a little wary of it. They tend to believe that the benefits of science must be greater than any harmful effects.
C: The Risk Averse represent about 23% of the population. They are much more concerned about the risks of science and technology, including issues such as equality of access. Most of their values about science are framed in terms of risk.
D: The Concerned and Disengaged make up 20% of the population. They are the least enthusiastic and least interested in science and technology. Many of them don’t much trust it. They believe the pace of science and technology is too fast to keep up with and that science and technology create more problems than they solve.
If you are reading this article you are probably an A – and have self-selected to read the article as something you are interested in. But that is one of the problems: most audiences of science communications activities self-select from the As.
Interesting the disinterested
The research builds upon several earlier surveys, and its findings complement a 2014 survey designed by the Australian National University and conducted by Ipsos Public Affairs for the Inspiring Australia program.
This survey segmented Australians on the basis of how frequently they interacted with information about science and technology. It found that only half of the population could recall listening to, watching or reading something to do with science and technology, or even searching for science and technology information, at least once a fortnight. Also, 14% had much less frequent interactions with science and technology information.
So, while Merlin Crossley is quite right that we are increasingly well served by high-quality science communication activities, rather than simply needing even more, we believe we need a broader spread of activities, designed for different audiences who have different attitudes to science.
With science communication activities growing, the Fan Boys and Fan Girls have never had it so good. There are great science stories almost everywhere you turn, if you’re interested in those stories, of course.
But the CSIRO data showed that as many as 40% of the Australian public were unengaged, disinterested or wary of science – little changed since a similar Victorian government study in 2011.
So the growth in science communication is not necessarily growing its audience. To do that we need to align our science communication messages and channels with those that the disengaged and disinterested value.
Think of the football analogy mentioned above. A diehard AFL fan is not likely to seek out a rugby union match of their own volition. However, if you want to get them interested in rugby union, you might consider holding a demonstration match at an AFL game. Or even better, recruit AFL players to join one of the teams playing in the rugby union demo match.
More than blowing stuff up
There are many ways to get exciting science communication activities out of the existing channels and onto the Footy Shows and Today Shows of the world. Science communicators could show up at music and folk festivals and other community activities. They could get sports stars and TV personalities and musicians talking about science, much as the Inspiring Australia initiative has sought to do.
And they should think beyond BSU (blowing stuff up) approaches where the “wow” factor is high but longer term engagement is often quite low.
One of the other key findings of the CSIRO study was that the Fan Boys and Fan Girls are further away from the average point of community values than any other segment of the population. This means that Fan Boys or Girls probably have the least idea of what might appeal to the other segments. They know what turns them on, but they are probably only guessing what will work for the other segments.
So they need to recruit members of the other, non-science-fan segments to help devise science communication activities that appeal to them. No one is going to understand the Bs, Cs and Ds like they understand themselves (even if they don't much understand the As!).
By Dr Narelle Fegan and Dr Andrew Leis
Keeping our food safe
The recent outbreak of hepatitis A, which is thought to be associated with the consumption of frozen berries, has highlighted food safety concerns and sparked debates around country of origin labelling and testing of imported foods. Ensuring the safety of our food supply can be a complex process that involves maintaining good hygienic practices in the production and handling of foods at all stages between the farm and consumption.
With some foods, we can reduce the risk of foodborne illness through a heating process, which includes practices like cooking, canning and pasteurisation. However, with fresh produce (such as leafy greens and fruits), a heating step is less desirable – we have to rely more on hygienic production to deliver a safe food product. There are quality assurance schemes in place to ensure that fresh produce is grown, harvested, packed and transported to limit contamination by foodborne pathogens.
These schemes rely on people involved in all parts of the production chain following the procedures outlined, to deliver a product that is safe to consume. Such quality assurance schemes operate across the globe and imported products are required to meet the same hygienic standards as food produced in the importing country.
Can testing of food ensure it is safe to eat?
Microbiological testing of foods is only one aspect of quality assurance schemes designed to help keep our food safe. Scientific evidence and history tell us that testing of products for pathogens is not an efficient way of determining if food has been contaminated.
This is particularly true of pathogens that occur very rarely in food (such as hepatitis A) as only a very small amount of the food will be contaminated, and we can’t guarantee we will sample the portion of food where contamination occurs. The difficulties associated with pathogen testing of foods include:
- Contamination is not evenly distributed within the food and only certain portions of the food may contain the pathogen.
- Testing for foodborne viruses destroys the portion of food that is tested.
- Because the food is destroyed during testing, not all of the food can be tested as there would be nothing left for us to eat. Only some of the food can be tested – but sampling plans have been developed to try and maximise the chances of detecting foodborne pathogens.
- Testing methods for foodborne viruses in particular can be difficult to perform, as we have to try and isolate viruses and their genetic material from foods which are often very complex in nature (containing fats, sugars and salts, which can all make it more difficult to detect pathogens).
For these reasons testing of food is only one part of quality assurance schemes, with more attention focusing on hygienic production to limit the opportunities of food becoming contaminated with pathogens.
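The sampling problem above can be made concrete with a back-of-envelope calculation. The contamination rate and sample count below are illustrative assumptions, not figures from any real testing scheme:

```python
# Back-of-envelope sketch: when contamination is rare and patchy,
# random sampling is very likely to miss it. The 0.1% contamination
# rate and 100-sample plan below are illustrative assumptions.

def miss_probability(contaminated_fraction, n_samples):
    """Chance that none of n independent samples hits a contaminated unit."""
    return (1.0 - contaminated_fraction) ** n_samples

# Even after destructively testing 100 packs from a batch in which
# 0.1% of packs are contaminated, we miss the problem about 90% of
# the time:
p_miss = miss_probability(0.001, 100)
```

This is why sampling plans alone cannot guarantee safety for rare pathogens like hepatitis A, and why hygienic production carries most of the load.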
Why would frozen berries be at risk of carrying hepatitis A?
Freezing is a highly effective and convenient way to increase the shelf-life of foods, and unlike heat-based sterilisation techniques, it preserves most of the nutritional value of the food (some components, like vitamins, are quickly destroyed by heat). Freezing not only prolongs shelf life but also allows us to enjoy very seasonal products, such as berries, at any time of the year.
Preservation of viruses and bacteria during freezing is affected by the rate of freezing and the amounts of sugars and other molecules nearby that help to slow the growth of ice crystals. In a household freezer, water freezes quite slowly – consider the time it takes to freeze water in ice cube trays. Slow freezing favours the formation of ice crystals. As the crystals grow in size, they can kill some bacteria and viruses. On the other hand, high local concentrations of sugars and other molecules can protect the microorganisms from damage.
Frozen berries are generally safe to eat, and have only occasionally been involved in foodborne outbreaks. Hepatitis A virus infection as a result of eating contaminated food (not just berries) is also very rare in Australia, where on average only five cases a year are associated with food consumption. This is very small compared with other foodborne pathogens in Australia, such as norovirus, for which an estimated 276,000 cases a year are associated with food, or the bacterium Campylobacter, for which 179,000 cases a year are associated with the consumption of food.
What can we do to ensure the food we eat is safe?
It is not possible to ensure safety by testing a final product. Therefore, systems based on hazard analysis and identification of critical control points have been developed and adopted by governments and food producers through food regulations, industry guidelines and quality assurance schemes. However, human error through poor planning or poor execution can lead to one or more failures along the supply chain. The best thing we can do to ensure the food we eat is safe is to foster a culture of food safety. This means better educating all those involved in the food industry, as well as governments and consumers, so that they understand the safety risks associated with the production, manufacture and consumption of foods. Food safety needs to be seen as an investment, not a cost.
You might have heard the oceans are full of plastic, but how full exactly? Around 8 million metric tonnes go into the oceans each year, according to the first rigorous global estimate published in Science today.
That’s equivalent to 16 shopping bags full of plastic for every metre of coastline (excluding Antarctica). By 2025 we will be putting enough plastic in the ocean (on our most conservative estimates) to cover 5% of the earth’s entire surface in cling film each year.
Around a third of this likely comes from China, and 10% from Indonesia. In fact all but one of the top 20 worst offenders are developing nations, largely due to fast-growing economies but poor waste management systems.
However, people in the United States – coming in at number 20 and producing less than 1% of global waste – produce more than 2.5 kg of plastic waste each day, more than twice the amount produced per person in China.
While the news for us, our marine wildlife, seabirds, and fisheries is not good, the research paves the way to improve global waste management and reduce plastic in the waste stream.
Follow the plastic
An international team of experts analysed 192 countries bordering the Atlantic, Pacific and Indian Oceans, and the Mediterranean and Black Seas. By examining the amount of waste produced per person per year in each country, the percentage of that waste that’s plastic, and the percentage of that plastic waste that is mismanaged, the team worked out the likely worst offenders for marine plastic waste.
In 2010, 270 million tonnes of plastic was produced around the world. This translated to 275 million tonnes of plastic waste, 99.5 million tonnes of which was produced by the two billion people living within 50 km of a coastline. Waste can exceed production in a given year because durable items made in earlier years, such as refrigerators, are also thrown away.
Of that, somewhere between 4.8 and 12.7 million tonnes found its way into the ocean. Given how light plastic is, this translates to an unimaginably large volume of debris.
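The estimation chain the study follows (coastal population × waste per person × plastic fraction × mismanaged fraction) can be sketched as follows. All the figures below are made-up placeholders, not the study's inputs:

```python
# Structural sketch of the study's estimation chain for one coastal
# country. Every number below is a made-up placeholder, not a figure
# from the paper.

def mismanaged_plastic_tonnes(coastal_population, waste_kg_per_person_day,
                              plastic_fraction, mismanaged_fraction):
    """Annual mismanaged plastic waste, in tonnes."""
    kg_per_year = coastal_population * waste_kg_per_person_day * 365
    return kg_per_year * plastic_fraction * mismanaged_fraction / 1000.0

# Hypothetical country: 50 million coastal residents, 1.1 kg of waste
# per person per day, of which 11% is plastic and 80% is mismanaged.
tonnes = mismanaged_plastic_tonnes(50e6, 1.1, 0.11, 0.80)
# roughly 1.8 million tonnes of mismanaged plastic per year
```

Summing a calculation like this over all 192 coastal countries, then estimating what fraction of the mismanaged plastic actually reaches the ocean, yields the study's 4.8 to 12.7 million tonne range.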
While plastic can make its way into oceans from land-locked countries via rivers, these were excluded in the study, meaning the results are likely a conservative estimate.
With our planet still 85 years away from “peak waste” — and with plastic production skyrocketing around the world — the amount of plastic waste getting into the oceans is likely to increase by an order of magnitude within the next decade.
Our recent survey of the Australian coastline found three-quarters of coastal rubbish is plastic, averaging more than six pieces per metre of coastline. Offshore, we found densities from a few thousand pieces of plastic to more than 40,000 pieces per square kilometre in the waters around the continent.
Where is the plastic going?
While we now have a rough figure for the amount of plastic rubbish in the world’s oceans, we still know very little about where it all ends up (it isn’t all in the infamous “Pacific Garbage Patch”).
Between 6,350 and 245,000 metric tons of plastic waste is estimated to float on the ocean’s surface, which raises the all-important question: where does the rest of it end up?
Some of it, like the plastic microbeads found in many personal care products, ends up in the oceans and sediments, where it can be ingested by bottom-dwelling creatures and filter-feeders.
It’s unclear where the rest of the material is. It might be deposited on coastal margins, or maybe it breaks down into fragments so small we can’t detect it, or maybe it is in the guts of marine wildlife.
Wherever it ends up, plastic has enormous potential for destruction. Ghost nets and fishing debris snag and drown turtles, seals, and other marine wildlife. In some cases, these interactions have big impacts.
For instance, we estimate that around 10,000 turtles have been trapped by derelict nets in Australia’s Gulf of Carpentaria region alone.
More than 690 marine species are known to interact with marine litter. Turtles mistake floating plastic for jellyfish, and globally around one-third of all turtles are estimated to have eaten plastic in some form. Likewise seabirds eat everything from plastic toys, nurdles and balloon shreds to foam, fishing floats and glow sticks.
While plastic is prized for its durability and inertness, it also acts as a chemical magnet for environmental pollutants such as metals, fertilisers, and persistent organic pollutants. These are adsorbed onto the plastic. When an animal eats the plastic “meal”, these chemicals make their way into their tissues and — in the case of commercial fish species — can make it onto our dinner plates.
Plastic waste is the scourge of our oceans; killing our wildlife, polluting our beaches, and threatening our food security. But there are solutions – some of which are simple, and some a bit more challenging.
If the top five plastic-polluting countries – China, Indonesia, the Philippines, Vietnam and Sri Lanka – achieved a 50% improvement in their waste management (for example, by investing in waste management infrastructure), the total global amount of mismanaged waste would be reduced by around a quarter.
Higher-income countries have equal responsibility to reduce the amount of waste produced per person through measures such as plastic recycling and reuse, and by shifting some of the responsibility for plastic waste back onto the producers.
The simplest and most effective solution might be to make the plastic worth money. Deposits on beverage containers, for instance, have proven effective at reducing waste lost into the environment: because the containers, plastic and otherwise, are worth money, people don't throw them away, and if they do, others pick them up.
Extending this idea to a deposit on all plastics at the beginning of their lifecycle, as raw materials, would incentivise collection by formal waste managers where infrastructure is available, but also by consumers and entrepreneurs seeking income where it is not.
Before the plastic revolution, much of our waste was collected and burned. But the ubiquity, volume, and permanence of plastic waste demands better solutions.
Australia is on track for up to 1.7C of warming this century if the world curbs its greenhouse emissions, but under a worst-case scenario could see anything from 2.8C to 5.1C of warming by 2090, according to new climate change projections released by the CSIRO and the Bureau of Meteorology.
The projections are the most comprehensive ever released for Australia. They are similar to those published in 2007, but based on stronger evidence, with more regional detail. These projections have been undertaken primarily to inform the natural resources management sector, although the information will be useful for planning and managing the impacts of climate change in other sectors.
The new report draws on climate model data used by the Intergovernmental Panel on Climate Change (the IPCC). The Fifth IPCC Assessment Report (AR5), released in 2013 and 2014, used a range of greenhouse gas and aerosol scenarios to project future climate change.
Over the past 10 years, carbon dioxide emissions have been tracking the highest IPCC emission scenario (known as RCP8.5). If there is limited international action to reduce emissions, then projections based on the highest scenario may be realised.
However, if emissions are significantly reduced over the coming decades, then intermediate emissions (RCP4.5) might be feasible. Following the low emissions scenario (RCP2.6) would be very challenging given the current trajectory of carbon dioxide emissions.
How does Australia compare?
By late in this century (2090), Australia’s average warming is projected to be 0.6 to 1.7C for a low emission scenario, or 2.8 to 5.1C under a high emission scenario.
The warming under the high scenario is similar to the global average warming of 2.6 to 4.8C under the high emission scenario reported by the IPCC AR5. However, inland areas of Australia will warm faster than coastal areas.
The new projections should be viewed in the context of what has already been observed. Australia has become 0.9C warmer since 1910. Rainfall has increased in northern Australia since the 1970s and decreased in south-east and south-west Australia.
More of Australia’s rain has come from heavy falls and there has been more extreme fire weather in southern and eastern Australia since the 1970s. Sea levels have risen by approximately 20 cm since 1900.
In future, Australia’s average temperature will increase and we will experience more heat extremes and fewer cold extremes. Winter and spring rainfall in southern Australia is projected to decline while changes in other regions are uncertain.
For the rest of Australia, natural climate variability will predominate over rainfall trends caused by increasing greenhouse gases until 2030. By 2090, a winter rainfall decrease is expected in eastern Australia, but a winter rainfall increase is expected in Tasmania.
Historical climate data can be used as an analogue for the future. The analogue could be a location that currently has a climate similar to that expected in another region in the future.
For example, for a warming of 1.5-3.0C and a rainfall decrease of 5-15%, Melbourne’s climate becomes similar to that of Clare in South Australia, Sydney becomes more like Brisbane, and Brisbane becomes more like Bundaberg in Queensland.
Extreme rainfall events that lead to flooding are likely to become more intense. The number of tropical cyclones is projected to decrease but they may be more intense and possibly reach further south. Southern and eastern Australia is projected to experience harsher fire weather. The time in drought will increase over southern Australia, with a greater frequency of severe droughts.
A projected increase in evaporation rates will contribute to a reduction in soil moisture across Australia. There will be a decrease in snowfall, an increase in snowmelt, and therefore reduced snow cover.
Sea levels will continue to rise throughout the 21st century and beyond. Oceans around Australia will warm and become more acidic.
What will Australia look like?
Freshwater resources are projected to decline in far south-west and far south-east mainland Australia. Rising sea levels and increasing heavy rainfall are projected to increase erosion and inundation, with consequent damage to many low-lying ecosystems, infrastructure and housing.
Increasing heat waves will increase risks to human health. Rainfall changes and rising temperatures will shift agricultural production zones. Many native species will suffer from reduced habitats and some may face local or even global extinction.
Among the most vulnerable systems are coral reefs, infrastructure and settlements exposed to more frequent and intense flood damage, and coastal infrastructure and low-lying ecosystems facing increasing risks.
While reductions in global greenhouse gas emissions would increase the chance of slowing climate change, adaptation is also required because some warming and associated climate changes are unavoidable.