Proceeds of crime: how polymer banknotes were invented

They’re waterproof and tough – not to mention colourful – but plastic notes were developed for their unforgeability. Lis Ferla/Flickr, CC BY-NC

By Tom Spurling, Swinburne University of Technology and David Solomon, University of Melbourne

Welcome to CSIRO Inventions, a series looking at the discoveries and innovations born of Australia’s national science agency. In this first instalment, we outline the story behind the plastic money we use today – and the criminal world that drove its development.

The Reserve Bank of Australia (RBA) and CSIRO’s 20-year “bank project” resulted in the introduction of the polymer banknote – the first of its kind, and the most secure form of currency in the world.

The project commenced in 1968 and continued until 1988, when the A$10 bicentennial commemorative banknote was released. But the story behind this story – a tale of forgeries, underworld figures and CSIRO scientists – is just as impressive.

Australia’s transition from the pound to the dollar – on Decimal Day, February 14 1966 – was a momentous occasion. The new currency was seen as being a marker of our independence from the mother country, and the changeover from pound to dollar was well-planned and executed.

The first polymer banknotes. One side symbolised European settlement and the other, the original discovery and settlement of Australia 40–60,000 years earlier. RBA/Wikimedia Commons

(A little-known fact: a nationwide competition was held to find a name for our new currency with an “Australian flavour”. Among the more than 1,000 submissions were the “austral”, “boomer”, “kwid” and “ming”, but “dollar” was chosen.)

By April 1966 most of the old imperial banknotes had been removed from circulation, and a new range of state-of-the-art dollar notes and coins was doing the rounds of the nation’s tills, wallets and pockets. With designs by leading Australian artists and cutting-edge security features such as watermarks and metal thread, things couldn’t have been better for the note-issuer, the RBA.

But the new notes were not infallible, and it didn’t take long for counterfeiters to strike.

Enter the forgers

By the end of the year, a team of amateurs from suburban Melbourne, armed with simple office equipment and a desire to make some money, were able to produce a batch of fake notes with no intaglio printing, no watermark and no metal thread that would net them almost A$800,000 worth of forgeries. (That figure’s not to be sneezed at – it would be worth A$9.6 million in 2013.)

The nucleus of this team were two “regular joes” with no real criminal history: Francis Papworth, an artist from Bentleigh, and Jeffrey Mutton, who owned a failing milk bar in Moorabbin near a printing plant where Papworth worked.

As with many great schemes, this one was hatched over a beer – Papworth and Mutton often met at the Boundary Hotel in East Bentleigh. It was January 1966, only a few weeks before the introduction of the dollar, and the two mates were looking for an easy way to reverse their fortunes. Papworth worked at a printing plant … so why not print some money?

The first A$10 notes, featuring architect Francis Greenway on the front and poet Henry Lawson on the back. RBA/Wikimedia Commons

Deciding it was a “goer”, they enlisted a third accomplice, Dale Code, along with Ron Adam (a professional photographer) and Bert Kidd, a notorious career criminal who provided the funding for the scheme. Their original target was the ten shilling note, but on the release of the A$10 note on Decimal Day they decided the new version would be even easier.

What followed was a tale of ingenuity, intrigue and deceit. Using only their basic printing equipment, the forgers produced three batches of fake notes – each more sophisticated than the last – that would stay in circulation for many years. But despite their initial success, the authorities soon picked up on their activities.

Adam, Code and Mutton were tried and found guilty of forgery in 1967 but Papworth, who had been a police informant, was found not guilty. Kidd was arrested in 1969 after Mutton, who was already serving time, gave evidence against him.

Paper dollars looked nice but, thanks to Mutton and his forging mates, were eventually replaced by polymer.

News of the forgery soon became public, and a period of unrest followed. Instructions were issued by the Reserve Bank on how to spot the forgeries, which were then to be handed to authorities. But anybody turning a note in would not receive a genuine note in return, so many continued to be circulated.

A general distrust of A$10 notes permeated Australian society – at one stage, members of the Amalgamated Engineering Union refused to accept them as part of their pay packet.

Call in the scientists


The RBA’s Governor, HC “Nugget” Coombs, turned to science – or, more specifically, to CSIRO.

The challenge was set: could we create the world’s most secure banknote? After some preliminary planning, the “bank project” began. Coombs originally enlisted seven top Australian scientists – five physicists and two chemists – to help the RBA develop a more secure banknote. They met on April 1, 1968, and despite the date, these were no April fools – the two chemists were Jerry Price, who went on to become chairman of CSIRO, and Sefton Hamann, chief of the CSIRO Division of Applied Chemistry.

The group was introduced to the general principles of banknote design and production, and sent off to think about it before reconvening for a second meeting at Thredbo in June 1968.

Two more scientists were invited to Thredbo: Neil Lewis, recently retired from Kodak, and David Solomon, a young, award-winning polymer scientist from CSIRO. It was during these first few years that Dr Solomon first hit on the idea of a plastic banknote after being given a business card printed on plastic by a visitor from Japan.

From polymer granules, notes grow.

By February 1972, CSIRO and the RBA had agreed to commence a project to develop polymer banknotes with a range of optically variable security devices. The CSIRO team soon developed a “proof of concept” and presented it to the RBA.

The concept note:

  • had a see-through panel
  • featured a diffraction grating (an optical component that splits and diffracts light into several beams) embedded in the note
  • and was, of course, made of plastic.

As well as being difficult to forge, these new notes were also more durable than the traditional “rag notes”, more environmentally friendly and less likely to carry dirt and disease.

Dr Spurling conducting a ‘feel test’ with Governor Arthur Phillip. CSIRO, Author provided

These technical improvements were made within the first ten years of the bank project, but behind-the-scenes delays prevented the issue of these revolutionary notes until the bicentennial year 1988. In a defiant gesture to Papworth, Mutton and co, the first note issued was – you guessed it – A$10.

Today, there are more than 30 different denominations totalling some three billion polymer notes in service in 22 countries worldwide.

For more information, The Plastic Banknote: from conception to reality is available to buy from CSIRO Publishing.

The Conversation

Tom Spurling was employed by CSIRO during the development of the polymer banknote.

David Solomon was employed by CSIRO during the development of the polymer banknote.

This article was originally published on The Conversation.
Read the original article.

Ozone hole closing for the year, but full recovery is decades away

These clouds – formed high in the Antarctic atmosphere during spring – provide a place where ozone-destroying chemicals can form. Image: sandwich/Flickr, CC BY-NC-ND

By Paul Krummel, CSIRO and Paul Fraser, CSIRO

Imagine an environmental crisis caused by a colourless, odourless gas building up in the atmosphere in minute concentrations. At first there is no expert consensus, but in the face of considerable uncertainty and strong resistance to the science, global regulation of the emissions succeeds.

Subsequently, the science is established and the damage, though already apparent, begins to be mitigated.

Trends in the size of the ozone hole (top), amount of ozone (middle) and ozone deficit (bottom) show the ozone hole may be recovering. The green lines show the amount of chlorine in the stratosphere relative to the other measurements.

No, this is not fantasy. It’s history. We’re talking about the ozone hole.

In two or three weeks the Antarctic’s seasonal ozone hole will close for the year. The ozone hole has formed every spring since the 1970s. This year’s is among the smaller ones over the past 20 years — since ozone-depleting substances began declining.

The United Nations Environment Program and World Meteorological Organisation’s Scientific Assessment of Ozone Depletion: 2014 states that global total column ozone has shown a small increase in recent years.

However, it may take another few years before we can definitively say the Antarctic ozone hole has recovered, and several decades until full recovery to pre-1980 conditions.

Radical discovery

Atmospheric ozone isn’t a single layer at a certain altitude above the Earth’s surface; it’s dispersed — there is even a significant amount of ozone at the Earth’s surface.

Even the stratospheric ozone known as “the ozone layer” is not a single layer of pure ozone, but a region where ozone is more abundant than it is at other altitudes. Satellite sensors and other ozone-measuring devices monitor the total ozone concentration for an entire column of the atmosphere, and whether there is more or less than normal.

Throughout the 1970s, scientists began to observe two separate but related phenomena: the total amount of ozone in the stratosphere — the region 10 to 50 kilometres above the Earth’s surface — was declining steadily at about 4% every ten years, and in spring there was a much larger decrease in stratospheric ozone over the polar regions.
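
Because the reported trend compounds decade on decade, a seemingly small rate adds up. The sketch below is purely illustrative arithmetic based on the rounded 4%-per-decade figure quoted above:

```python
# Sanity-check the reported trend: a steady ~4% loss of stratospheric
# ozone every ten years compounds over successive decades.
# (Illustrative arithmetic only; the rate is a rounded figure.)

def remaining_ozone(decades, loss_per_decade=0.04):
    """Fraction of stratospheric ozone remaining after `decades` of compounding loss."""
    return (1 - loss_per_decade) ** decades

# After two decades at 4% per decade, roughly 7.8% of the total is gone.
loss_after_20_years = 1 - remaining_ozone(2)
print(f"{loss_after_20_years:.1%}")  # -> 7.8%
```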

By the mid-1980s, they reached the conclusion that the cause was a chemical reaction between ozone and halogen (chlorine and bromine). This halogen came from man-made substances: chlorine/bromine-containing refrigerants, solvents, propellants and foam-blowing agents (chlorofluorocarbons or CFCs, halons and hydrochlorofluorocarbons or HCFCs).

When exposed to UV light and in the presence of polar stratospheric clouds, these molecules break down, releasing radical chemicals that destroy ozone molecules at an alarming rate.

CFCs, one of the most prominent culprits, were first synthesised in the 1890s, but it wasn’t until the 1950s that they began to be widely used as refrigerants.

The most unfortunate scientist

Thomas Midgley, the chemical engineer who improved their synthesis and demonstrated their potential uses, was probably the most unfortunate scientist ever to rise to an influential position.

In 1921, he discovered that adding tetraethyl lead to fuel prevented engine knock, improving the efficiency of internal combustion engines. Unfortunately, this discovery was commercialised. Lead persists in the atmosphere today. It also accumulates in animals, sometimes to toxic levels, particularly those at the top of food chains.

Subsequently, Midgley set himself to solving the problem of the refrigerants in use in the early part of the 20th century. These were uniformly dangerous — either flammable, explosive or toxic. CFCs were none of these and were soon widely adopted, not only as refrigerants but also later as propellants and blowing agents.

The best thing about CFCs – their low reactivity – is also the worst. Because they’re so unreactive, they’re very long-lived (often in excess of 100 years). This gives them time to get into the stratosphere. One of the components of CFCs is chlorine. Very little chlorine exists naturally in the stratosphere, but CFCs are a very effective way of introducing significant amounts of chlorine into the ozone layer.
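
The destruction step described above is, in simplified form, the well-known chlorine catalytic cycle (the polar-spring chemistry also involves ClO dimers and bromine, which are omitted here). Because the chlorine atom is regenerated at each pass, a single atom can destroy many thousands of ozone molecules before it is removed:

  Cl  + O3  ->  ClO + O2
  ClO + O   ->  Cl  + O2
  ------------------------
  net:  O3 + O  ->  2 O2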

Midgley’s efforts to do good had dire unintended consequences: he’s been described as having had more impact on the atmosphere than any other single organism in Earth’s history.

Recognising the threat

By the late 1960s, scientists had detected growth in the level of CFCs in the atmosphere, and in 1974 researchers published the first paper predicting that the increase in CFCs would cause significant ozone loss.

The ozone hole hypothesis was strongly disputed by some industry representatives.

Nonetheless, the possibility of a depleted ozone layer and the threat to human health it implied so alarmed the international community that by 1985 the Vienna Convention for the Protection of the Ozone Layer had been agreed, even before significant ozone depletion was detected. The convention came into force in 1988 and was ratified over subsequent decades by 197 nations, making it one of the most successful treaties of all time.

The following year (1989), the Montreal Protocol on Substances that Deplete the Ozone Layer (which falls under the Vienna Convention) also came into force. This treaty was designed to enact the spirit of the Vienna Convention – i.e. to protect the ozone layer – and achieved it by phasing out the production and consumption of numerous substances that are responsible for ozone depletion.

Long time to recovery

Repairing the ozone hole is a long-term process. CSIRO has been monitoring the hole over Antarctica since the late 1970s. The ozone hole first appeared in spring over Antarctica and subsequently over the Arctic, as the ozone-destroying chemical processes require very cold conditions and the onset of sunlight (following the polar winter).

In Antarctica, the hole lasts for two to three months before breaking up and mixing with ozone-richer air from mid-latitudes. Its size waxes and wanes from year to year, though it is consistently very large; the record so far is 29.5 million square kilometres, set in 2006.

For comparison, the land mass of Australia (including Tasmania) is 7.7 million square kilometres.
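
That comparison can be made concrete with the two figures quoted above (and nothing else):

```python
# Put the record 2006 ozone hole in perspective against Australia's
# land area. Both figures are taken from the text, in million km^2.
RECORD_HOLE_AREA = 29.5
AUSTRALIA_AREA = 7.7

ratio = RECORD_HOLE_AREA / AUSTRALIA_AREA
print(f"The record hole covered about {ratio:.1f} times Australia's area")
# -> about 3.8 times
```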

Although some reports claim that the ozone layer over Antarctica is recovering, it’s too early to make a definitive call. Measurements at surface monitoring stations show that the amount of ozone-destroying chemicals at the surface has been dropping since about 1994-1995. The amount is now about 10-15% down on that peak.

The stratosphere lags behind the surface, so the effects of this decline will take some time to play out. However, satellite measurements show that the fall in stratospheric ozone has stopped, and recovery may have begun.

The size and depth of the ozone hole each year shows quite large variability due to different meteorological conditions, in particular stratospheric temperatures.

How the 2014 hole measures up

Overall, out of the 35 years of satellite data analysed, the 2014 ozone hole is one of the smaller ones since the late 1980s. It ranks as the 18th largest in daily area; 16th largest for daily ozone deficit; and 21st lowest for minimum ozone.

The 2014 ozone hole appeared in the first week of August and grew rapidly in size from mid-August through to the second week of September, reaching 23.5 million square kilometres on September 15.

The 2014 hole is among the smallest since the mid-1990s.

During the third and fourth weeks of September the ozone hole area decreased to 17.5 million square kilometres. Then, in a final flurry, the daily ozone hole area grew sharply again during the last days of September to peak at 23.9 million square kilometres on October 1.

This is the peak daily ozone hole area for 2014, larger than in 2010, 2012 and 2013, about the same as 2009, and smaller than in 2011.

The ozone hole is now in its seasonal decline, having shrunk to about 9 million square kilometres by November 14, and is expected to close for the year within two to three weeks.

You can find the full details on the 2014 ozone hole here.

The Conversation

Paul Krummel receives funding from MIT, NASA, Australian Bureau of Meteorology, Department of the Environment, & Refrigerant Reclaim Australia.

Paul Fraser receives funding from MIT, NASA, Australian Bureau of Meteorology, Department of the Environment, & Refrigerant Reclaim Australia.

This article was originally published on The Conversation.
Read the original article.

Five tips for enjoying the festive season without gaining weight

The average Aussie risks gaining several kilos over the holiday period. That might sound like a small number, but few of us lose it when the festive season is over. We asked Professor Manny Noakes, research director of our Food and Nutrition Flagship and co-author of the famous Total Wellbeing Diet, for five tips on how to survive the silly season without gaining extra baggage.

‘Tis the season of indulgence.

1. Don’t just count kilojoules

Restricting your kilojoule intake is a surefire way to lose weight, but cutting back indiscriminately can lead to an unbalanced, unhealthy diet. Noakes recommends focusing on food groups rather than counting kilojoules: ensuring you include food from each of the essential food groups every day is a better way to stay healthy.

“It is a much easier approach because you get optimal nutrients without having to learn the kilojoules of hundreds of different foods,” Noakes says. The essential food groups include protein foods such as meat, fish, chicken and eggs; dairy foods; low GI grains and cereals; fruits and vegetables, and healthy oils such as spreads and nuts.

2. Limit indulgences

Thirty-five per cent of the average Australian’s diet comes from “discretionary” foods with little nutritional value, such as alcohol, chips, lollies and cakes. That adds up to a whopping 2500-3500 kilojoules a day. “If you want to lose weight, limiting indulgences can have a dramatic impact over a period of time,” says Noakes. Try limiting yourself to one small indulgence per day (see below), or seven in a week, to give yourself a small reward for eating well.


1 small indulgence equals:

  • 100ml of wine
  • 4 squares of chocolate
  • 1 fun-size packet of potato chips
  • 1 scoop of ice cream
  • 1 chocolate biscuit
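
The arithmetic behind the discretionary-food figures is easy to check. The sketch below assumes the standard 8,700 kJ Australian adult reference intake (the value used on food labels; it is an assumption here, not a figure from the article):

```python
# Cross-check the article's claim that "discretionary" foods make up
# 35% of the average diet, or 2500-3500 kJ a day.
# ASSUMPTION: 8,700 kJ/day adult reference intake (standard label value).
REFERENCE_INTAKE_KJ = 8700
DISCRETIONARY_SHARE = 0.35

discretionary_kj = REFERENCE_INTAKE_KJ * DISCRETIONARY_SHARE
print(f"{discretionary_kj:.0f} kJ/day")  # -> 3045 kJ/day, inside the quoted 2500-3500 range
```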

3. Stand every hour

It’s important to increase your everyday activity to help prevent weight gain, and a good place to start is limiting the amount of time you spend sitting each day.

Noakes recommends looking for opportunities to get off the couch or office chair and move wherever possible – for example, taking the stairs, holding short stand-up meetings at work, standing up when you take a phone call, or standing at parties rather than sitting. “Simply making an effort to spend less time sitting down and stand every hour can improve your health,” says Noakes.

4. Manage your appetite

During the festive season it’s hard to say no to holiday nibbles and cocktails or that extra snag at the weekend BBQ. Proactively managing your appetite with a higher protein, low GI diet can help prevent poor choices. “Protein controls appetite and low GI carbohydrates sustain energy, so having a light meal of 100g of lean protein food with a slice of grainy bread one hour before a party can help keep hunger in check,” says Noakes.

5. Sign up for a formal healthy eating plan

There is evidence showing that people who seek support in their weight loss efforts do better than those who go it alone. “We know that people who take part in weight loss programs find it easier to reach their goals,” says Noakes. “The support that people receive and the regular weight checks contribute to some of that success.

“The type of eating plan can also make a difference and a higher protein low GI plan has the best evidence for sustained weight loss success. That’s why we are releasing this new version of the Total Wellbeing Diet available as an online program in a new trial.”

Registrations for the online trial of our Total Wellbeing Diet are open until 10 November 2014. The cost for the 12-week program is $99 which is fully refundable if you complete the trial.

This article was originally published on Body & Soul.

Smartphone app a life-saver for heart attack patients


A simple app can improve the rehabilitation of heart attack survivors

By Mohan Karunanithi, CSIRO; Marlien Varnfield, CSIRO, and Simon McBride, CSIRO

Patients recovering from heart attacks are almost 30% more likely to take part in rehab at home using a new smartphone app compared to those who had to travel to an outpatient clinic or centre.

What’s more, those who used the online delivery model – known as the Care Assessment Platform – were 40% more likely to adhere to the rules of the program and almost 70% more likely to see it through to completion.

The clinical trial, conducted by CSIRO and Queensland Health through the Australian E-Health Research Centre, also showed that the online model was just as clinically effective as a traditional rehab program.

Cardiologists were so pleased by the results that the next generation version of the platform is soon to be offered in a number of Queensland hospitals including Ipswich, Metro North and West Moreton Hospital and Health Services.

Why go digital?

Clinical guidelines recommend patients attend and complete a cardiac rehabilitation program following a heart attack. Studies have shown that those who do this have much better long-term health outcomes.

They are less likely to be re-admitted to hospital and will have a better quality of life. Most importantly, they are far less likely to have another cardiac event or die from their condition.

Traditionally, rehab programs take the form of group-based exercise, nutritional counselling, risk factor assessment and educational activities. They are designed to help patients return to an active, satisfying life.

Despite the benefits, uptake is generally poor. Successful recovery relies on patients starting and completing a rehabilitation program, but many find travelling to a health care facility on a weekly basis to be an onerous requirement – particularly for those who work or care for others, or live in regional or remote Australia where these services are not available.

The Care Assessment Platform features health and exercise monitoring tools and weekly mentoring consultations, delivers motivational materials through text messages, and contains multimedia that educates patients about disease symptoms and management.

Most importantly, it gives patients a more flexible option. By integrating rehab treatment with a patient’s daily life, they are more likely to complete the program and make their new healthy lifestyle permanent. This overcomes one of the key barriers to patient participation and recovery.

The cost of heart disease

There have been huge advances in our treatment of heart disease in recent years. Researchers from the University of New South Wales have shown that rapid response teams, first introduced in Australia in the 1990s, have halved cardiac arrests and associated deaths in our hospitals. This saves around 12,000 Australian lives each year.

Despite the success of innovations like this, cardiovascular disease still kills one Australian nearly every 12 minutes. And for many of these patients, it is not their first cardiac event.

With more than A$5.5 billion spent every year on acute and chronic management of heart disease, any digital technology that improves recovery rates offers huge potential to reduce the burden and cost to the community.

What does a fully digital health system look like?

Australia’s health system faces significant challenges including rising costs, an ageing population, a rise in chronic diseases and fewer rural health workers. Total government expenditure on health has trebled in the last 25 years.

Clearly, something needs to change.

Technology can help ease the burden on emergency departments.

We must reduce the reliance on our hospitals by helping patients before they are admitted. Digital tools can do this by moving many services into the home through broadband delivery and models of care based on rich digital information.

In another study, CSIRO is running Australia’s largest clinical telehealth trial. In this trial, we’ve equipped a group of elderly patients with broadband-enabled home monitoring systems.

Patients can use machines to measure vital signs such as blood pressure, blood sugar, heart abnormalities, lung capacity, body weight and temperature.

Data is then immediately available to the patient’s doctor or nurse, allowing them to provide appropriate care interventions much earlier. This helps patients stay out of hospital and improve their quality of life.

Ultimately, those patients who are chronically ill will still need to attend hospital on occasion, so technology also has a role to play in improving the effectiveness of our admission systems.

Emergency departments are critically overcrowded and struggle to respond to day-to-day arrivals in a timely manner. Big data analytics can be used to predict how many patients will arrive at emergency, their medical needs, and how many will be admitted or discharged. This can then be used to calculate how many beds will be required to meet patients’ needs.
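
The prediction idea can be illustrated with a deliberately simple sketch. This is not the analytics hospitals actually run (those models are far richer); the weekday-averaging rule and every number below are invented purely for illustration:

```python
# Toy sketch of emergency-department demand forecasting: predict a
# day's arrivals as the mean of the same weekday over recent weeks.
# Real systems use far richer models; this only shows the shape of
# the problem. All numbers are made up.

def forecast_arrivals(history, weekday, weeks=4):
    """Mean arrivals for `weekday` (0=Mon .. 6=Sun) over the last `weeks` weeks.

    `history` is a list of daily arrival counts, oldest first,
    assumed to start on a Monday.
    """
    same_day = [n for i, n in enumerate(history) if i % 7 == weekday]
    recent = same_day[-weeks:]
    return sum(recent) / len(recent)

# Four weeks of invented daily counts, Monday-first:
history = [210, 190, 185, 200, 230, 240, 220] * 4
print(forecast_arrivals(history, weekday=4))  # Friday -> 230.0
```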

New app-ortunities

The next step for our Care Assessment Platform research team is to adapt our mobile technology for rehabilitation for other chronic conditions such as pulmonary disease and diabetes. We’re also working hard to quantify the cost savings the program can deliver.

Australia has a track record in finding cures and new treatments for diseases. In order to sustain this, we also need to find new ways to deliver quality affordable care.

There is enormous potential for big data, analytics and decision support systems to help achieve this, reducing the burden on our health system and improving the wellbeing of all Australians.

The Conversation

Mohan Karunanithi receives funding from Queensland Health and The Prince Charles Foundation.

Simon McBride is affiliated with Health Informatics Society of Australia and the Australian College of Health Informatics.

Marlien Varnfield does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.

Genetic evolution: how the Ebola virus changes and adapts

As viruses replicate, their genome changes. Image: EPA/Ahmed Jallanzo

By Glenn Marsh, CSIRO

The current outbreak of Ebola virus in West Africa is unprecedented in size, with nearly 4,800 confirmed or probable cases and more than 2,400 deaths. People have been infected in Guinea, Liberia, Sierra Leone, Nigeria and Senegal.

The World Health Organization declared this outbreak a “public health emergency of international concern” in August and estimates it will claim a staggering 20,000 lives within the next six months.

A second completely independent and significantly smaller Ebola virus outbreak has been detected in the Democratic Republic of the Congo.

Like all viruses, the Ebola virus has evolved since the outbreak began. So, how does this occur and how does it impact our attempts to contain the disease?

Tracking Ebola

Ebolavirus and the closely related Marburgvirus genera belong to the Filoviridae family. Both of these genera contain viruses that may cause fatal haemorrhagic fevers.

The Ebolavirus genus is made up of five species: Zaire ebolavirus (responsible for both of the current outbreaks), Sudan ebolavirus, Reston ebolavirus, Bundibugyo ebolavirus and Taï Forest ebolavirus.

In order to better understand the origin and transmission of the current outbreak in West Africa, researchers from the Broad Institute and Harvard University, in collaboration with the Sierra Leone Ministry of Health, sequenced 99 virus genomes from 78 patients.

The study, reported in Science, shows the outbreak resulted from a single introduction of virus into the human population and then ongoing human-to-human transmission. The scientists reported more than 300 unique changes within the virus causing the current West African outbreak, which differentiates this outbreak strain from previous strains.

The current Ebola outbreak has infected nearly 5,000 people. Image: EPA/Ahmed Jallanzo

Within the 99 genomes sequenced from this outbreak, researchers have recorded approximately 50 other changes to the virus as it spreads from person to person. Future work will investigate whether these differences are contributing to the severity of the current outbreak.

These 99 genome sequences have been promptly released to publicly available sequence databases such as Genbank, allowing scientists globally to investigate changes in these viruses. This is critical in assessing whether the current molecular diagnostic tests can detect these strains and whether experimental therapies can effectively treat the circulating strains.

How does Ebola evolve?

This is the first Ebola virus outbreak where scientists have sequenced viruses from a significant number of patients. Despite this, the Broad Institute/Harvard University study findings are not unexpected.

The Ebola virus genome is made of RNA, and the viral polymerase that copies it has no error-correction mechanism. This is where it gets a little complicated, but bear with me.

As the virus replicates, it is expected that the virus genome will change. This natural change of virus genomes over time is why influenza virus vaccines must be updated annually and why HIV mutates to become resistant to antiretroviral drugs.

Changes are also expected when a virus crosses from one species to another. In the case of Ebola virus, bats are considered to be the natural host, referred to as the “reservoir host”. The virus in bats will have evolved over time to be an optimal sequence for bats.

Knowing how the Ebola virus adapts will help health officials contain future outbreaks. Image: EPA/Ahmed Jallanzo

Crossing over into another species, in this case people, puts pressure on the virus to evolve. This evolution can lead to “errors” or changes within the virus which may make the new host sicker.

Ebola viruses are known to rapidly evolve in new hosts, as we’ve seen in the adaptation of lab-based Ebola viruses to guinea pigs and mice. This adaptation occurred by passing a low-pathogenic virus from one animal to the next until the Ebola virus was able to induce a fatal disease. Only a small number of changes were required in both cases for this to occur.

While this kind of viral mutation is well known with other viruses, such as influenza virus, we are only truly appreciating the extent of it with the Ebola viruses.

What do the genetic changes mean?

The Broad Institute/Harvard University study reported that the number of changes in genome sequences from this current outbreak was two-fold higher than in previous outbreaks.

This could be due to the increased number of sequences obtained over several months, and the fact that the virus has undergone many person-to-person passages in that time.

However, it will be important to determine if virus samples from early and late in the outbreak have differing ability to cause disease or transmit. The genetic changes may, for example, influence the level of infectious virus in bodily fluids, which would make the virus easier to spread.

Analysing this data will help us understand why this outbreak has spread so rapidly with devastating consequences and, importantly, how we can better contain and manage future outbreaks.

Glenn Marsh receives funding from Australian National Health and Medical Research Council and Rural Industries Research and Development Corporation.

This article was originally published on The Conversation.
Read the original article.

Historic collections could be lost to ‘digital dinosaurs’

An image of Australian shearers taken on glass plate negative is now preserved in a digital collection. Powerhouse Museum Collection/Flickr

By Michael Brünig, CSIRO

Australian’s museums, galleries and other cultural institutions must adopt more of a digital strategy with their collections if they are to remain relevant with audiences.

Only about a quarter of the collections held by the sector have been digitised so far and a study out today says more needs to be done to protect and preserve the material, and make it available to people online.

Challenges and Opportunities for Australia’s Galleries, Libraries, Archives and Museums is a joint study by CSIRO and the Smart Services CRC.

It notes that Australia’s galleries, libraries, archives and museums (the GLAM sector) represent our accumulated achievements and experiences, inspire creativity and provide a place for us to connect with our heritage.

They are also crucial to our economy, with the GLAM sector estimated to generate revenue of about A$2.5 billion each year. That's not only a lot of paintings and artefacts, but a lot of jobs as well.

But despite its size and scope, we found that digital innovation in the sector has been inconsistent and isolated. If these cultural institutions don’t increase their use of digital technologies and services, they risk losing their relevance.

So what changes do they need to make in order to thrive in the digital economy?

Opening doors and minds

With Australia’s rapid uptake of online and mobile platforms, people are now choosing to access and share information in very different ways.

It’s safe to say that the only constant in this space is change. Research suggests that expectations for more personalised, faster and better-designed services and experiences will continue to rise.

Virtual tours are now possible at the National Museum of Australia.

This is why our cultural institutions need to review the kind of visitor experience they are providing. We found only a few organisations had made fundamental changes to their operations that would allow them to place digital services at their core, rather than as an add-on activity.

This is in contrast to the dramatic changes we’ve seen when it comes to adopting digital technologies in our daily lives.

In order to be successful, digital experiences need to be an integrated and cohesive part of an institution’s offering.

Take what is happening at the National Museum of Australia. It’s now possible to take a tour of the museum via a telepresence-enabled robot.

This means school students – particularly those in rural and regional Australia – can explore exhibits virtually, without even leaving the classroom. Interestingly, we hear that this actually increases their desire to visit the museum in person.

Digital-savvy innovations such as this need to be at the fore of our institutions’ thinking if they want to engage with the community and break down barriers to participation.

Engaging with the public

To be successful in this new era, institutions need to meet people on their own (digital) terms. We can no longer expect visitors to queue at the turnstiles waiting for opening time. Organisations need to bring experiences to the user so that they can access them wherever and however they prefer.

Some of Australia’s cultural institutions are starting to get this.

Another image available freely online as part of the Powerhouse Museum Collection. Powerhouse Museum/Flickr

The NSW State Library has appointed a Wikipedian-In-Residence to contribute expertise and train the public in publishing information online.

The National Library of Australia has attracted a large user base with its online Trove service, which draws almost 70,000 unique users each day.

The Powerhouse Museum has made parts of its photographic collections available on Flickr under Creative Commons licensing. This has caused a surge in use and allowed the public to contribute information, adding value to the collection.

While these examples provide a lot of hope for the sector, the unfortunate reality is that they are few and far between. Most of Australia’s cultural institutions have not kept pace with this change and are missing the opportunity to better connect and actually increase their revenue.

Digitise this!

Australia’s eight national, state and territory art organisations hold archives that, if laid out flat end-to-end, would span 629km. This is on top of a staggering 100 million artworks, books and audio-visual items in Australia.

But only a quarter of these items are digitised, with some of Australia’s collections still being managed through “old school” mechanisms such as log books and card indices.

Imagine if there were a fire at one of our great institutions. We would risk losing cultural and heritage material of significance; parts of our history could be completely lost. Even without such a devastating event, if we don’t make our collections more accessible, in a sense they’ll be lost to many of us anyway.

As a country, not only do we need to get moving when it comes to digitising our collections, we also need to explore new and innovative ways to do this. Traditionally, digitisation has meant scanning flat documents, photographing objects or creating electronic versions of catalogue data.

But what if we could do so much more? Researchers are now focused on the next challenge: digitising objects and spaces in three dimensions.

Researchers from the University of Wollongong with support from the Smart Services CRC are focusing on capturing 3D models and the textures of surfaces using low-cost equipment such as a Kinect camera from an Xbox.

3D map of The Shrine of Remembrance, Melbourne

At CSIRO, we’ve even used our own handheld scanner Zebedee to map culturally and environmentally significant sites such as the Jenolan Caves, Melbourne’s Shrine of Remembrance and even the semi-submerged wreck of the HMQS Gayundah.

We’re also creating high-quality 2D and 3D image libraries based on the National Biological Collections, letting us document biodiversity in the digital era.

Embracing the digital economy

While our study reveals that Australia’s cultural institutions are certainly at risk of becoming “digital dinosaurs”, it also demonstrated how those organisations that are embracing digital are reaping the benefits.

It provides recommendations for the GLAM industry in order for it to maximise its digital potential, including:

  • shifting to open access models and greater collaboration with the public
  • exploring new approaches to copyright management that stimulate creativity and support creators
  • building on aggregation initiatives such as the Atlas of Living Australia
  • standardising preservation of “born digital” material to avoid losing access to digital heritage
  • exploiting the potential of Australia’s Academic and Research Network (AARNet) and the National Broadband Network (NBN) for collection and collaboration.

By adopting these recommendations and building on some innovative examples in the sector, Australia’s GLAM industry will be well placed to embrace digital, rather than be engulfed by it.

This article was originally published on The Conversation.
Read the original article.

99.999% certainty humans are driving global warming: new study

Gambling for high stakes

Would you take a gamble with these odds? Kraevski Vitaly/Shutterstock

By Philip Kokic, CSIRO; Mark Howden, CSIRO, and Steven Crimp, CSIRO

There is less than 1 chance in 100,000 that global average temperature over the past 60 years would have been as high without human-caused greenhouse gas emissions, our new research shows.

Published in the journal Climate Risk Management today, our research is the first to quantify the probability of historical changes in global temperatures and to examine the links to greenhouse gas emissions using rigorous statistical techniques.

Our new CSIRO work provides an objective assessment linking global temperature increases to human activity, pointing to a near-certain probability exceeding 99.999%.

Our work extends existing approaches undertaken internationally to detect climate change and attribute it to human or natural causes. The 2013 Intergovernmental Panel on Climate Change Fifth Assessment Report provided an expert consensus that:

It is extremely likely [defined as 95-100% certainty] that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic [human-caused] increase in greenhouse gas concentrations and other anthropogenic forcings together.

Decades of extraordinary temperatures

July 2014 was the 353rd consecutive month in which global land and ocean average surface temperature exceeded the 20th-century monthly average. The last time the global average surface temperature fell below that monthly average was in February 1985, as reported by the US-based National Climatic Data Center.

This means that anyone born after February 1985 has not lived a single month where the global temperature was below the long-term average for that month.
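That streak length checks out arithmetically: counting inclusively from March 1985, the first month of the run, to July 2014 gives exactly 353 months. A quick check:

```python
# Count the months in the warm streak: March 1985 through July 2014, inclusive.
def months_between(start_year, start_month, end_year, end_month):
    """Number of calendar months from start to end, counting both endpoints."""
    return (end_year - start_year) * 12 + (end_month - start_month) + 1

streak = months_between(1985, 3, 2014, 7)
print(streak)  # 353
```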

We developed a statistical model that related global temperature to various well-known drivers of temperature variation, including El Niño, solar radiation, volcanic aerosols and greenhouse gas concentrations. We tested it to make sure it worked on the historical record and then re-ran it with and without the human influence of greenhouse gas emissions.
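As a rough illustration of that "with and without" comparison (not the study's actual model or data), one can fit a linear model of temperature on driver series and then zero out the greenhouse-gas term to see how much of the fitted warming it accounts for. All series below are synthetic stand-ins:

```python
import numpy as np

# Illustrative sketch only: fit annual temperature anomalies against
# hypothetical driver series, then recompute the fit with the
# greenhouse-gas contribution removed.
rng = np.random.default_rng(0)
n_years = 60
enso = rng.normal(0, 1, n_years)      # stand-in El Nino index
solar = rng.normal(0, 0.1, n_years)   # stand-in solar variation
ghg = np.linspace(0, 1, n_years)      # steadily rising greenhouse-gas forcing
temp = 0.1 * enso + 0.2 * solar + 0.8 * ghg + rng.normal(0, 0.1, n_years)

X = np.column_stack([np.ones(n_years), enso, solar, ghg])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)

with_ghg = X @ coef
without_ghg = X[:, :3] @ coef[:3]     # same fit, GHG contribution removed
print(coef[3])                        # recovered GHG coefficient (near the true 0.8)
print(with_ghg[-1] - without_ghg[-1]) # final-year warming attributed to GHG
```

The real analysis used observed drivers and rigorous significance testing; this sketch only shows the shape of the attribution logic.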

Our analysis showed that the probability of getting the same run of warmer-than-average months without the human influence was less than 1 chance in 100,000.

We do not use physical models of Earth’s climate, but observational data and rigorous statistical analysis, which has the advantage that it provides independent validation of the results.

Detecting and measuring human influence

Our research team also explored the chance of relatively short periods of declining global temperature. We found that rather than being an indicator that global warming is not occurring, the observed number of cooling periods in the past 60 years strongly reinforces the case for human influence.

We identified periods of declining temperature by using a moving 10-year window (1950 to 1959, 1951 to 1960, 1952 to 1961, etc.) through the entire 60-year record. We identified 11 such short time periods where global temperatures declined.
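The windowing step above can be sketched as follows, using a made-up 60-year anomaly series rather than the observed record:

```python
import numpy as np

# Illustrative sketch: count 10-year sliding windows whose linear trend
# slopes downward, in a synthetic warming-plus-noise anomaly series.
rng = np.random.default_rng(1)
years = np.arange(1950, 2010)
anomalies = 0.012 * (years - 1950) + rng.normal(0, 0.1, years.size)

def cooling_windows(values, window=10):
    """Count sliding windows whose least-squares trend is negative."""
    count = 0
    for start in range(values.size - window + 1):
        chunk = values[start:start + window]
        slope = np.polyfit(np.arange(window), chunk, 1)[0]
        if slope < 0:
            count += 1
    return count

print(cooling_windows(anomalies))  # short cooling spells despite the warming trend
```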

Our analysis showed that in the absence of human-caused greenhouse gas emissions, there would have been more than twice as many periods of short-term cooling as are found in the observed data.

There was less than 1 chance in 100,000 of observing 11 or fewer such events without the effects of human greenhouse gas emissions.

Good risk management is all about identifying the most likely causes of a problem, and then acting to reduce those risks. Some of the projected impacts of climate change can be avoided, reduced or delayed by effective reduction in global net greenhouse gas emissions and by effective adaptation to the changing climate.

Ignoring the problem is no longer an option. With a probability exceeding 99.999% that the warming we are seeing is human-induced, choosing to do nothing is a chance we certainly shouldn’t take.

The Conversation

The authors do not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article. They also have no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.

