Kids raid caves in virtual classroom

Remember when going on a school excursion meant a trip to the bowling alley? Or, for a really special occasion, perhaps a visit to the local fun park?

Well, things have certainly changed since I was at school.

Today, we’re launching what could be Australia’s biggest (and arguably coolest) school excursion ever. In classrooms around the country, students will set out to explore the spectacular Jenolan Caves, located in the scenic Blue Mountains.

How will this be possible? They’ll be embarking on their journey in virtual reality. 


To create this digital experience, we teamed up with 3P Learning to combine their latest educational resource, IntoScience, with HD panoramic video and 3D models of the Jenolan Caves scanned using our (ahem, award-winning) laser mapping technology, Zebedee.

Using their own avatars, students from years 6 to 9 will be able to delve into the natural wonder of the caves, all without leaving the classroom. The Jenolan Caves are Australia’s largest cave system and, with elaborate underground structures, offer a rich scientific environment full of learning opportunities.

“It’s exciting to see our cave models now brought to life as a virtual world that students can explore and perform their own scientific investigations in,” said Michael Bruenig from our Digital Productivity and Services Flagship.

Zebedee is the first technology capable of mapping caves with lasers while continuously moving, meaning the 3D models it creates are incredibly detailed and can be produced in only the time it takes to walk (or climb or crawl) through a cave.

The technology is already well-known for mapping other iconic landmarks such as the Melbourne Shrine of Remembrance, Queensland’s Fort Lytton, and a little structure you might have heard of known as the Leaning Tower of Pisa. Oh yes, and there was a Boeing 727 too.

So, as much as I enjoyed my school field trips (complete with packed lunch and Walkman), I can’t help but feel a teensy bit jealous of today’s students. First stop is the Jenolan Caves, but what’s next? The possibilities are endless. Check it out in this video:

This online 3D educational initiative, funded by the Australian Government, will be officially launched today by the Minister for Communications, Malcolm Turnbull.


Keeping us connected – wirelessly

By Nola Wilkinson

Dr Iain Collings decided early in his career that wireless communications was hot. Fascinated by the prospect of transferring information without wires or optic cables, he saw its huge potential to change our daily lives – and wanted to be a part of it.

The whole world is into cool electronic devices. Our growing appetite for smartphones, iPads, GoPros and Fitbits has produced a huge new market – and wireless data transfer is essential for these devices.

What’s more, the more we use them, the faster our demand for higher rates of data transmission grows. Iain has focused his work on the use of multiple antennae to vastly improve the rate of transmission of data wirelessly. As he points out, this is fundamental to meeting the ever-growing global consumer demand. Watch this video to find out more.

Five uses for WiFi even we didn’t see coming

Remember the internet of the 90s? When browsing online meant being stuck at your desk with your whiz-bang 56k modem. It was an era without smartphones, without tablets – some might say, without freedom.

Luckily the clever folks in our labs came up with the underlying technology behind a little something called WiFi, using the same mathematics that astronomers initially applied to piece together the waves from black holes (for more on the WiFi story click on our handy infographic on the right).

While WiFi has given us the freedom to work wirelessly in our homes, offices and out-and-about, it has also inspired a few other – err, interesting – innovations. Here are a few that even we didn’t see coming:

  1. No more queuing for beer at the footy – thanks to a digital upgrade at Adelaide Oval, sports fans won’t even have to get out of their seats to order a drink. Or hot chips.
  2. Keeping Rover happy – this WiFi enabled system is a fully autonomous robotic dog sitter complete with video conferencing capabilities, remote tug-o-war, ball fetch mechanism, and treat dispenser.
  3. The humble bathroom scale has taken a leap forward – why waste energy (and calories) having to get up to manually record your weight when your wireless bathroom scale can do it for you?
  4. Did someone say bionic butler? For a couple of hundred thou’, this guy will get you a drink and even flip your pancakes.
  5. Yep, it’s a WiFi rabbit. We’re sure he’s useful in some way. We just can’t figure out what it is yet.

Find out more about how we invented and patented wireless LAN technology on our website.

The spice is right: how curry and the cloud may improve Alzheimer’s testing

By Carrie Bengston, James Davidson and Olivier Salvado

Mmm . . . lovely! A hot Indian curry is simmering away on the stove on a wintry night. The smell of spices fills the kitchen. One of the spices is turmeric, from the ginger family. Its vibrant yellow colour comes from the compound curcumin, which is finding a use in clinical tests for Alzheimer’s disease (AD).

Who knew? Soon everyone will! We’re presenting our research this week at a major conference in Copenhagen, AAIC2014.

Indian curry in a dish

Curry night! The golden yellow spice turmeric contains curcumin – a key ingredient of a new eye test for Alzheimer’s

A clinical trial of the spice-infused eye test is being led by our own Dr Shaun Frost and team, with WA’s Edith Cowan University, US company NeuroVision Imaging, and the McCusker Alzheimer’s Research Foundation in Perth. Several hundred volunteers have taken part. They include healthy people, mildly cognitively impaired people and patients with AD. It’s all part of the Australian Imaging Biomarkers and Lifestyle study of Aging (AIBL).

The trial asks volunteers to come along to two visits for retinal fluorescence imaging, ie an eye scan. This is quick and painless. Patients sit in front of a specialised camera and a photo is taken of the retina at the back of their eye.

Patient having eye scanned by researcher

Dr Shaun Frost takes a photo of the back of a patient’s eyes, not for Instagram but for a clinical trial

Between visits, volunteers eat some curcumin, which binds to beta-amyloid plaques – the sticky proteins that indicate Alzheimer’s – and fluoresces. The plaques (if there are any) show up in the eye scans as bright spots which can be counted and measured. The data is then used to calculate a special number for each patient, a retinal amyloid index (RAI), and compared between healthy, mildly cognitively impaired and AD patients.

Bright spots showing Alzheimer's plaques in retinal scan

Amyloid plaques, a sign of Alzheimer’s, show up in retinal scan as fluorescent spots as curcumin binds to them
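The post doesn’t give the actual RAI formula, so the sketch below is purely illustrative: it treats a retinal scan as a grid of brightness values, groups above-threshold pixels into ‘bright spots’, and reports fluorescent area as a fraction of the scan. The threshold and the index definition are assumptions for demonstration only.

```python
# Illustrative sketch only: the real retinal amyloid index (RAI) formula
# isn't described in the post. This toy version counts fluorescent
# "bright spots" above a brightness threshold in a grayscale scan.

def bright_spots(image, threshold=200):
    """Return (count, total_area) of above-threshold pixels, treating
    4-connected pixel groups as one spot (simple flood fill)."""
    rows, cols = len(image), len(image[0])
    seen = set()
    count, area = 0, 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and (r, c) not in seen:
                count += 1
                stack = [(r, c)]
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
    return count, area

def retinal_amyloid_index(image, threshold=200):
    """Toy RAI: fluorescent area as a fraction of the whole scan."""
    _, area = bright_spots(image, threshold)
    return area / (len(image) * len(image[0]))

# A tiny 4x4 "scan" with two separate bright spots (values 0-255).
scan = [
    [ 10,  10, 220,  10],
    [ 10,  10, 230,  10],
    [ 10,  10,  10,  10],
    [250,  10,  10,  10],
]
count, area = bright_spots(scan)   # 2 spots covering 3 pixels
rai = retinal_amyloid_index(scan)  # 3/16 of the scan is fluorescent
```

In the real trial the imaging, spot detection and index calculation are of course far more sophisticated; the point is only that counting and measuring bright spots yields a single comparable number per patient.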

Encouragingly, as we announced this week, early results show the amount of plaque in the retina closely mirrors the amount in the brain. If confirmed, retinal imaging may be the beginnings of an easy, non-invasive test for early detection of AD. Combined with results of cognitive tests and other markers it could help doctors diagnose AD more confidently.

Eye scans like this also find plaques when they’re smaller than the ones in brain scans, potentially finding signs of AD earlier – maybe up to 20 years before cognitive symptoms appear. If diagnosed, AD patients could start treatment sooner and have regular eye scans to see which treatments work best for them.

Brain imaging on the cloud

From curry to the cloud. More research presented this week is about more accurately interpreting brain images sometimes used to diagnose AD.

To get a brain scan, a patient lies on a bed in a large machine like a Magnetic Resonance Imaging (MRI) or Positron Emission tomography (PET) scanner. These machines record a series of images through the brain, which are then visually checked by a radiologist who compiles a report for the patient’s doctor.

This visual inspection can be subjective, tedious and time consuming. But recent advances in scientific computing and machine learning allow systems to accurately measure features of the 3D scan, such as brain size or concentration of a tracer molecule, that support a diagnosis.

Using these techniques, a new trend is emerging for improving radiologists’ productivity. Scanners and specialised medical software can report quantitative values and compare them to the values expected for normal, healthy patients – just like blood test results from a pathology lab do.

Our researchers, led by health imaging specialist Associate Prof Olivier Salvado, have just released a new cloud computing application, MILXCloud, that automatically delivers standardised radiology reports.

brain surface showing the concentration of a radioactive dye imaged by PET scan

Our new software, MILXCloud, automates brain scan analysis and reporting on the Cloud

Users will be able to upload a PET scan and within 15 minutes be emailed a one page quantitative report showing a diagram of the brain with colour coded values compared with what’s normal. This data will help support diagnosis by the radiologist and enhance delivery of eHealth services.
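MILXCloud’s internals aren’t described here, but the ‘compare to values expected for normal, healthy patients’ idea can be sketched as a per-region z-score against a healthy reference, much as a pathology lab flags a blood result. The region names, reference values and cutoff below are all invented for illustration.

```python
# Hypothetical sketch (not MILXCloud's actual method): score each brain
# region's measured value against a healthy reference mean and spread,
# and flag regions that sit more than z_cutoff standard deviations away.

HEALTHY_REFERENCE = {          # made-up (mean, std dev) per region
    "frontal":   (1.10, 0.10),
    "parietal":  (1.05, 0.08),
    "occipital": (0.95, 0.09),
}

def flag_regions(patient_values, z_cutoff=2.0):
    """Return {region: (z_score, flagged)} for each measured region."""
    report = {}
    for region, value in patient_values.items():
        mean, sd = HEALTHY_REFERENCE[region]
        z = (value - mean) / sd
        report[region] = (round(z, 2), abs(z) > z_cutoff)
    return report

patient = {"frontal": 1.45, "parietal": 1.06, "occipital": 0.94}
report = flag_regions(patient)
# frontal: z = (1.45 - 1.10) / 0.10 = 3.5 -> flagged for the radiologist
```

The one-page report described above would then colour-code these values on a brain diagram, leaving the diagnosis itself to the radiologist.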

Whether it’s curry or the Cloud, the future of Alzheimer’s detection sure looks bright.

Media: Andreas Kahl  |  0407 751 330

Not 3D heat scanning, waving

With the first World Cup match just over (Brazil beat Croatia 3-1) and a month of games to go, no doubt we’ll see soccer crowds in stadiums across Brazil erupting into a Mexican Wave.

Crowd at sports game doing Mexican wave

Getting into the spirit of the match.

Back home, our robotics researchers are waving their arms around too, but for a different reason. They’re using new handheld 3D heat mapping technology.

We’ve all heard of energy audits. Maybe you’ve even had one done to find out where your home is losing heat and how you can better conserve energy. Accurate measurements of temperatures and their precise locations are key to that. It’s now easier, cheaper and more reliable thanks to HeatWave, a new 3D handheld thermal imaging system we’ve been working on. We’re pleased that last night it was the winner of the R&D category at the Queensland iAwards, the IT industry awards.

Thermal imaging systems have been around for decades. But they’ve been a bit cumbersome and often only display results in 2D (which for some applications is fine). Not just that, they’ve needed experts to interpret the results. This often involves mentally challenging tasks like comparing and interpreting different kinds of images from different angles. That’s called ‘cognitive load’ in case you need to know.

Operator holding HeatWave scanner near engine to generate a heat map

HeatWave being used to scan the heat in a Bobcat engine

Our system needs less brain and more brawn. Basically if you can wave your arm, you can use HeatWave. It weighs around half a kilo and captures multiple perspectives of the same object, then merges them. It’s built on the SLAM (simultaneous localisation and mapping) technology that makes our famous Zebedee scanner work. But it adds information beyond just the 3D shape – it overlays accurate temperature measurements into a single view using some software smarts we’ve developed. Here’s how it maps a hot engine.

Colourful 3D heat map of a Bobcat engine.

So hot right now. Rainbow colours in the 3D heat map show the temperature of specific engine parts.
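This isn’t CSIRO’s algorithm, but the idea of merging ‘multiple perspectives of the same object’ into a single heat map can be sketched simply: once SLAM has given each temperature reading a 3D position, readings that land in the same small voxel of space are averaged. The voxel size and the sample data below are made up.

```python
# Toy sketch of multi-view temperature fusion (not HeatWave's real method):
# snap each positioned reading to a voxel grid, then average per voxel.

def fuse_temperatures(observations, voxel=0.05):
    """observations: iterable of ((x, y, z), temp_celsius) from many
    viewpoints. Returns {voxel_index: mean_temp} on a grid of `voxel` metres."""
    sums = {}
    for (x, y, z), temp in observations:
        key = (round(x / voxel), round(y / voxel), round(z / voxel))
        total, n = sums.get(key, (0.0, 0))
        sums[key] = (total + temp, n + 1)
    return {key: total / n for key, (total, n) in sums.items()}

obs = [
    ((0.00, 0.00, 0.00), 80.0),   # same engine part seen from two angles
    ((0.01, 0.00, 0.00), 84.0),
    ((1.00, 0.00, 0.00), 25.0),   # a cooler part elsewhere
]
fused = fuse_temperatures(obs)
# the two nearby readings merge into one 82.0 degree voxel
```

The real system has to handle sensor calibration, occlusion and drift; averaging per voxel is just the simplest way to see why multiple perspectives end up as a single consistent view.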

We’re hoping the $4B thermal imaging industry will benefit from HeatWave’s ease of use, precision, portability and reliability. Already we’ve done pilot trials for detecting faults in engines, auditing heat loss in buildings, monitoring the condition of livestock and even people!

In the spirit of practising what we preach, we’ve tried it out on ourselves. One of our researchers had his sore back scanned using HeatWave. The results clearly show where the hot spots (ie painful spots) are in his right shoulder. Hopefully, with a bit of physio, he’ll be waving his arms around pain free sometime soon.

Maybe even doing a Mexican Wave in front of the telly . . .

*  *  *

Media: Carrie Bengston  |  0417 266 190

Not so chirpy: How ‘We Feel’ revealed by tweets

That little blue bird in the Twitter logo is always so damn chirpy! But not all our tweets come from a happy state of mind.

With over 9000 tweets posted every second, it’s clear that Twitter has become a popular place to express our feelings in 140 characters or less – from fear, joy and sadness, to anger, surprise and of course, hunger.

But what if we could use this information to help those who aren’t feeling so chirpy? We’ve teamed up with the Black Dog Institute and Amazon to do just that. Our new research project, called ‘We Feel’, will monitor and analyse public tweets to see when (collectively) people are feeling down and so help mental health support organisations be better prepared.

For this project, we’re drawing on our expertise in natural language processing, big data mining and social media analysis to present data in a way that’s usable by mental health experts.

‘We Feel’ started out by creating an emotional word map. We asked a bunch of people to look at a set of 600 emotion words used in tweets and link them to particular feelings. The emotions are based on US Psychology Professor W. Gerrod Parrott’s standard list of basic emotions including love, joy, surprise, anger, sadness and fear. They were divided further into specific secondary emotions which were then mapped to words used in tweets. For example, the word ‘hurt’ might suggest a feeling of sadness; or ‘nice’ in a tweet might reveal contentment.
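A minimal sketch of that word map in action: the handful of word-to-emotion entries below are invented stand-ins for the real 600-word map, and the tallying is far simpler than the project’s actual natural language processing.

```python
# Illustrative sketch only: a tiny word-to-emotion map in the spirit of
# Parrott's basic emotions, applied to a stream of tweets.

EMOTION_MAP = {
    "hurt": "sadness",
    "nice": "joy",        # e.g. contentment, a secondary emotion under joy
    "amazing": "surprise",
    "scared": "fear",
    "furious": "anger",
}

def tally_emotions(tweets):
    """Count one hit per mapped emotion word found in each tweet."""
    counts = {}
    for tweet in tweets:
        for word in tweet.lower().split():
            emotion = EMOTION_MAP.get(word.strip(".,!?#"))
            if emotion:
                counts[emotion] = counts.get(emotion, 0) + 1
    return counts

tweets = [
    "That budget speech left me scared!",
    "What an amazing Eurovision final",
    "Feeling hurt and scared today",
]
print(tally_emotions(tweets))  # {'fear': 2, 'surprise': 1, 'sadness': 1}
```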

This word map combined with data analysis software and techniques can be used to process the stream of thousands of tweets sent into the public domain every second and create a graph of a particular emotion. Here’s ‘fear’ as the Federal Budget was announced last week.

Graph showing how the 'We Feel' software records growing 'fear' as revealed in Tweets on Federal Budget night.

As Budget night unfolds, ‘We Feel’ shows Tweets revealing ‘fear’ growing in number as a thickening band in purple and pink from left to right. Click to view full size.

Contrast that fear with the joy of Eurovision the previous weekend. The winner, Austria’s Conchita Wurst, was certainly ecstatic. And millions of Eurovision fans on couches around the world were tweeting almost simultaneously, giving the Eurovision hashtag a work out. ‘We Feel’ revealed spikes of words like ‘amazing’, ‘surprising’, ‘funny’ and ‘weird’ on Twitter in Europe at the time.

Crunching even a small proportion of Twitter data like this as it happens is a huge task. That’s where Amazon came in with their cloud service Kinesis. It enabled our software to process the data efficiently as it rushed down the pipe. The user interface for ‘We Feel’ is freely available to other developers and researchers so they can use the data too.

As well as providing real time population data, this tool will also help uncover where people are most at risk of poor mental health and how the mood and emotions of an area or region can change over time. We Feel will also help to understand how strongly our emotions are related to social, economic and environmental factors such as the weather, time of day, day of the week, news of a major disaster or economic downturn. This is all very important to health services who could use it to both monitor community mental health and predict where/when more services are needed.

An infographic: "This harvesting of tweets gives an instant snapshot of how the world is travelling emotionally."

‘We Feel’ harvests emotional tweets from around the world.

Using social media data to support those with mental health issues – now that’s something we should all feel chirpy about!

Read more about this at: Twitter tool shows a real-time view of our emotions

Media contact: Sarah Klistorner 02 9372 4662, 0423 350 515

Future cyber threats to cause more headaches than Heartbleed

By Hannah Scott

For the average Australian, it’s probably safe to say cyber security isn’t something that’s thought about often. Maybe it should be.

Think about the things you take for granted, like having electricity to power your home and having access to crucial health services.  Imagine if this was all compromised because of a cyber attack that could shut down crucial infrastructure.  In seconds, your local energy company could be unable to deliver power to your home and your confidential health records, bank account details and passwords could be compromised.

As the electricity grid moves towards a smart grid we face the risk of a cyber attack that causes major power outages across the country.

Heartbleed is a recently discovered internet security flaw that allowed attackers to gain access to encrypted passwords, credit card details, and other data on trusted websites including Facebook, Gmail, Instagram, and Pinterest.

The flaw was in OpenSSL, a widely used piece of encryption software, and allowed cyber attackers to gather personal information over what was considered to be a secure connection or site.  It’s considered to be one of the first instances of a widespread attack against something that we all take for granted as being secure. James Deverell, from our Futures team, said we’re at risk of incidents such as Heartbleed happening again.

“Not only are we becoming increasingly vulnerable because of the wide range of services that we are moving onto technology platforms, but we are also seeing the nature of these types of cyber attacks changing.”

“For example, cyber attacks are becoming increasingly sophisticated as more tools become available, and it’s very likely that we will see these types of vulnerabilities continue into the future.”

As technology use rises and we become more dependent on technology for services, we’ll continue to see these kinds of vulnerabilities increase.  We’ve released a new report on the future of cyber security, highlighting a number of different vulnerabilities and threats of future attacks.

In the future of cyber attacks, we could expect to see anything from individuals and “Script Kiddies” trying to outdo each other in a game of seeing how much damage they can cause, to more serious acts of “Hacktivism” and corporate and government espionage.  Each of these threats is becoming increasingly sophisticated and we need to be prepared as a nation to respond to them.

Future cyber threats could cause more headaches than Heartbleed. Image: xkcd

Core Australian industries such as energy, mining, healthcare and IT services all have vulnerabilities that can potentially be exploited through cyber attacks, with the cost estimated to be as high as $2 billion annually.

“Our work across digital productivity, computational informatics and big data analytics will be necessary to address the risks and vulnerability of attacks. We want to keep sectors like Health, Government and Energy safe from cyber attacks in the future,” said Deverell.


Enabling Australia’s Digital Future: Cyber security trends and implications is available to download on our website.

Media: Samantha Lucia, Communication Manager, Information Sciences, 0467 768 960
Sarah Klistorner, Communications Manager, Digital Productivity and Services Flagship, 0423 350 515

Are you a sucker or a chewer?

By James Davidson and Pamela Tyers

How do you eat your Easter chocolate? Do you suck it or chew it? Does your tongue smear the inside of your mouth as the chocolate melts, or does it get chomped by your back teeth then sent down your throat?

It’s true, some of us suck and some of us chew. Whichever process we use to break down food in our mouth, it affects the taste sensation.

Flavour is released through the movement of taste components – including salt, sugar and fat – and the time they take to hit our taste buds. If we know how to place those tasty bits into foods so that they deliver maximum flavour before the food is swallowed, we can use less of the unhealthy ingredients, because our inefficient chewing means we don’t taste much of them anyway.

For example, bread would taste unappetising if too much salt was removed from it, but science can help us understand how to remove some of the less healthy components from foods while retaining their familiar, delicious taste.

The life and times of a creme egg. How do you eat yours? Image: Flickr/Mark Seton

Enter our new 3D dynamic virtual mouth – the world’s first – which is helping our researchers understand how foods break down in the mouth, as well as how the food components are transported around the mouth, and how we perceive flavours. Using a nifty technique called smooth particle hydrodynamics, we can model the chewing process on specific foods and gather valuable data about how components such as salt, sugar and fat are distributed and interact with our mouths at the microscopic level.

We’re using it to make food products with less salt, sugar and fat and incorporate more wholegrains, fibre and nutrients without affecting the taste.

It’s part of research that will help us understand how we can modify and develop particular food products with more efficient release of the flavour, aroma and taste of our everyday foods.

And it’s good news for all of us. Eighty percent of our daily diet is processed foods – think breakfast cereals, sliced meats, pasta, sauces, bread and more. So, creating healthier processed foods will help tackle widespread issues such as obesity and chronic lifestyle diseases.

In fact, our scientific and nutritional advice to government and industry has so far helped remove 2,200 tons of salt from the Australian food supply, and reduced our population’s salt consumption by 4 per cent.

Oh…and we’ve also used the virtual mouth to model just how we break down our Easter chocolate.

As the teeth crush the egg, the chocolate fractures and releases the caramel. The chocolate coating collapses further and the tongue moves to reposition the food between the teeth for the next chewing cycle. The caramel then pours out of the chocolate into the mouth cavity.

With this virtual mouth, variations to thickness of chocolate, chocolate texture, caramel viscosity, and sugar, salt and fat concentrations and locations can all be modified simply and quickly to test the effects on how the flavours are released.

Now that’s something to chew on. Happy Easter!

Media contact: James Davidson, 03 9545 2185

Media Release: Chocolate bytes with virtual mouth.

Big screen premiere for life’s micro marvels

It’s often hard to understand what’s happening inside us, because the processes and phenomena that influence our bodies and impact our health are invisible.

Not being able to understand why we’re sick or why our body is acting the way it does can add to the stress and strain of illness.

But now, a new generation of movie makers are drawing back the curtain, revealing the hidden secrets of our marvellous biology and setting new standards for communicating biological science to the world.

Animators Kate Patterson, Chris Hammang and Maja Divjak on the red carpet.

Three spectacular new biomedical animations were premiered today during a red carpet event at Federation Square in Melbourne.

The molecular movies bring to life some very complex processes, researched by health researchers and detailed in scientific journals most of us never see. They showcase the work of VIZBIplus – Visualising the Future of Biomedicine, a project that is helping to make the invisible visible, so that unseen bodily processes are easier to understand, which will help us make better choices about our health and lifestyle.

With BAFTA and Emmy award winning biomedical animator Drew Berry as mentor, three talented scientific animators – Kate Patterson (Garvan Institute of Medical Research), Chris Hammang (CSIRO) and Maja Divjak (Walter and Eliza Hall Institute of Medical Research) – have created biomedically accurate animations, showing what actually happens in our bodies at the micro scale.

The animators used the same or similar technology as Dreamworks and Pixar Animation Studios, as well as video game creators, to paint mesmerising magnifications of our interior molecular landscapes. While fantastic, the animations are not fantasies. They are well-researched 3D representations of cutting-edge biomedical research.

Kate Patterson’s animation shows that cancer is not a single disease. She highlights the role of the tumour suppressor protein p53, known as ‘the guardian of the cell’, in the formation of many cancer types.

Chris Hammang’s animation describes how starch gets broken down in the gut. It is based on our very own health research about resistant starch, a type of dietary fibre found in foods like beans and legumes that protects against colorectal cancer – one of Australia’s biggest killers. Chris shows us the ‘why’ behind advice to change our dietary habits.

An animated gif for resistant starch taken from the movie, 'The Hungry Microbiome'

‘The Hungry Microbiome’ animates how resistant starch works in our bodies.

Maja Divjak’s animation highlights how diseases associated with inflammation, such as type 2 diabetes, are ‘lifestyle’ diseases that represent some of the greatest health threats of the 21st century.

With our current ‘YouTube generation’ opting to watch rather than read, biomedical animations will play a key role in revealing the mysteries of science. These videos will allow researchers to communicate the exciting and complex advances in medicine that can’t be seen by the naked eye.

Watch all the videos here and be among the first to see these amazing visualisations!


Shrine of the times

Picture this. It’s a beautiful autumn day in Melbourne. You’re about to embark on a walking tour to discover some of the city’s finest architecture. My name is Carrie and I’ll be your tour guide.

We begin at one of the city’s more stately buildings – the Shrine of Remembrance. This grand temple-like structure was built back in 1926 and is located right next to the Botanic Gardens. It’s a focus for the city’s ANZAC Day ceremonies each year, and especially so during this ANZAC Centenary commemorating the start of WW1.

3D map of The Shrine of Remembrance, Melbourne

This month our scientists at CSIRO brought high tech to history by mapping the Shrine using a 3D laser scanner, preserving it digitally with a tool called Zebedee.

As you can see, it’s very timely. Major renovations at the Shrine are underway to get ready for commemorations of the Gallipoli landing’s 100th anniversary in 2015. It’s part of the $45M ‘Galleries of Remembrance’ project.

The Shrine joins a select group of heritage sites mapped in 3D by the Zebedee scanner, along with Brisbane’s Fort Lytton, and even the Leaning Tower of Pisa.

Now I personally get quite excited about architectural drawings, but these 3D maps add detailed information for building managers and heritage experts by measuring the actual built spaces. Zebedee technology offers a new way for recording some of our priceless treasures.

Let me show you one of the interior images of the Shrine. These amazing ‘point clouds’ are created by a handheld laser scanner bouncing on a spring as the user walks through corridors, up stairs and round about. Making the map takes about as long as it takes to walk through the building. You can watch it online afterwards.

Aerial section view from 3D map of the Shrine’s undercroft showing columns

Despite their almost X-ray look, Zebedee can’t see through walls as the laser bounces off solid surfaces. But when you put all the data in one place you get a sliceable, zoomable, turnable map with architectural details like stairs, columns, voids, ceilings all measured to the nearest centimetre. But . . . no roof! That’s because our scientists are developing a flying laser scanner that scans rooftops from the air. Secret attics may be secret no longer.

That concludes our tour for today. If you’d like to take home your very own Zebedee souvenir, head to our website.

Home is where the heart is…monitored

Frances using the telehealth device.

Our new home monitoring tool is helping elderly Aussies like Frances better manage chronic disease.

By Ali Green

Almost one in four older Australians are affected by chronic health conditions, and close to 1.2 million currently suffer from more than one. Given our ageing population, this number is set to increase significantly by 2030, adding more pressure to our health system.

Life for a chronic disease sufferer with complex conditions like diabetes, heart or lung disease typically means two to three hospital stays per year, on top of multiple visits to the GP for regular health checks.

In Australia’s largest clinical telehealth trial, we’ve equipped a group of elderly patients with broadband-enabled home monitoring systems to help them manage and monitor their conditions from home.

Patients can use the machines to measure their blood pressure, blood sugar, ECG (to detect heart abnormalities through electrical signals), lung capacity, body weight and temperature in a process that generally takes around 10-20 minutes.

The monitoring system’s large screen helps guide patients through the different procedures, and the data is sent off to a secure website where it becomes immediately available to a care team including the patient’s nurse and doctor. Daily stats are checked regularly by a specialist nurse who can assist the patient via telephone if there are any changes in their regular patterns.
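As a toy illustration of the kind of rule that might sit behind those daily checks (the measurement names and ‘usual ranges’ below are invented, not taken from the trial):

```python
# Hypothetical sketch: flag any daily reading that drifts outside a
# patient's usual range, so a nurse knows to follow up by phone.

USUAL_RANGE = {
    "systolic_bp": (100, 140),   # mmHg
    "blood_sugar": (4.0, 8.0),   # mmol/L
    "weight_kg":   (68.0, 72.0),
}

def check_readings(readings):
    """Return the names of measurements outside the usual range."""
    alerts = []
    for name, value in readings.items():
        low, high = USUAL_RANGE[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts

today = {"systolic_bp": 96, "blood_sugar": 5.2, "weight_kg": 71.0}
check_readings(today)  # ['systolic_bp'] -> the nurse gives the patient a call
```

In practice the specialist nurse reviews trends rather than single thresholds, but the principle is the same: the system surfaces the unusual readings so a human can act on them.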

Across Australia, 150 patients are testing out the machines as part of the CSIRO-led trial. Here are a few of their stories.

Janice and Bill

Victorian retiree Janice suffers from an irregular heartbeat, diabetes and low blood pressure – conditions that require twice weekly visits to her doctor and multiple hospital stays to be controlled. She also has diabetes related retinopathy which has caused her to lose most of her vision, making medical visits difficult for both herself and husband Bill.

Since using the telehealth monitoring system, Janice’s GP and hospital visits have reduced significantly, and she can better manage her symptoms to prevent hypoglycaemic episodes. Bill also has a clearer idea of how Janice is doing from day to day.

“If Janice’s blood pressure reading is particularly low, I can prevent any dizzy spells by getting her to sit down and giving her a glass of water. If her measurements are stable, I can pop out to do some shopping or walk the dog knowing that she should be fine on her own for a little while,” says Bill.

Bill helps Janice record her blood oxygen levels using the telehealth home monitoring system.


Jack has ischaemic heart disease and chronic obstructive pulmonary disease (COPD). This affects his airways causing breathlessness, a chronic cough and mucus build-up.

During a routine check of Jack’s telehealth monitoring data, his nurse Lay noticed that his ECG results were slightly unusual. This prompted the nurse to call Jack, who complained of shortness of breath. An appointment was made with Jack’s doctor for a full ECG, which turned out to be fine.

As a result of this episode, Jack’s nurse arranged to visit him at home to discuss a medication regime and teach him to use his medicated spray. This meant Jack could self-manage his shortness of breath and prevent unnecessary doctor visits.

Jack’s nurse Lay helps him measure his lung capacity using the telehealth system.


Frances

75-year-old Frances (pictured top) has a respiratory condition called bronchiectasis. This can easily develop into a chest infection without early warning and lead to a stay in hospital.

Every day, Frances conducts a ten-minute check-up with the telehealth monitoring system, in between washing up the breakfast dishes and getting ready to go out. A nurse at the other end of the internet connection checks Frances’ measurements, looking for any signs of early deterioration.

“I was surprised by the idea of self-monitoring at first, but now that I’m used to it, I think it’s a terrific idea. It has really helped me to better understand my health,” says Frances.

As Australia’s population ages and more demand is placed on our health system, telehealth can help reduce patient hospitalisation, and the related costs, by allowing patients to better manage their chronic diseases from home.

The Home Monitoring of Chronic Disease for Aged Care project is an initiative funded by the Australian Government.


CSIRO is participating in the One in Four Lives: The Future of Telehealth in Australia event, at Parliament House this morning from 7:45am AEDT.

Media: Sarah Klistorner M: +61 477 716 031

Emergency services benefit from a high-speed world without wires

What’s left of homes after bushfires swept through Warrandyte, in Victoria, in January. Image: AAP Image/Joe Castro


By Ian Oppermann, Director, Digital Productivity and Services 

When disaster strikes – such as January’s bushfires in Victoria or the recent cold spell that froze much of North America – it’s vital for emergency services to get the latest information.

They need to access real-time data from any emergency sites and command centres so they can analyse it, make timely decisions and broadcast public-service updates.

CSIRO thinks it has a solution in its high-speed, high-bandwidth wireless technology known as Ngara, originally developed to help deliver broadband speeds to rural Australia.

The organisation has announced a licensing deal with Australian company RF Technology to commercialise Ngara so it can be used to allow massive amounts of information to be passed between control centres and emergency services in the field.

There is already interest from agencies in the United States and it’s hoped that Australian agencies will soon follow.

Squeezing more data through

The technology will package four to five times the usual amount of data into the same spectrum. This will allow emergency services to send and receive real-time data, track assets, and view interactive maps and live high-definition video from their vehicles. It’s a step in what has been a long journey toward an ambitious vision.

For years, the vision of the communications research community was “connecting anyone, anywhere, anytime” – a bold goal encompassing many technical challenges. Achieving that depended heavily on radio technology because only radio supports connectivity and mobility.

Over the years we designed ever more complex mobile radio systems – more exotic radio waveforms, more antenna elements, clever frequency reuse, separation of users by power or by spreading sequence and shrinking the “cell” sizes users operate in.

A research surge in the late 1990s and 2000s led to a wealth of technology developed in the constant drive to squeeze more out of radio spectrum, and to make connections faster and more reliable for mobile users.

This radio access technology became 3G, LTE, LTE-A and now 4G. Europe is working on a 5G technology. We’ve also seen huge advances in wireless local area networks (WLAN) and a strong trend to offload cellular network data to WLAN to help cope with the traffic flowing through the networks.

Demand for more keeps growing

Despite this, users’ data rate demands remain higher than what mobile technology can offer. Industry commentators in the world of fixed communication networks predict staggering growth in data demand, and time tells us that even those predictions are constantly underestimated.

Many connections required.


We’ve even stretched our ability to name the volume of data flowing through networks: after terabytes come petabytes (10^15 bytes), exabytes (10^18), zettabytes (10^21) and yottabytes (10^24) to describe galloping data volumes.

Beyond that, we run out of prefixes, but we’ve not run out of ideas to generate more data or new devices to connect to a network.
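As a quick sanity check on those prefixes, here is a short Python sketch (not from the article) showing how each decimal step is a factor of a thousand:

```python
# Decimal (SI) prefixes for data volumes, as powers of 10.
PREFIXES = {
    "terabyte": 10**12,
    "petabyte": 10**15,
    "exabyte": 10**18,
    "zettabyte": 10**21,
    "yottabyte": 10**24,
}

def bytes_in(count, unit):
    """Convert a count of a given unit into raw bytes."""
    return count * PREFIXES[unit]

# One zettabyte is a thousand exabytes, a billion petabytes:
assert bytes_in(1, "zettabyte") == bytes_in(1000, "exabyte")
print(f"{bytes_in(1, 'yottabyte'):.3e} bytes in a yottabyte")
```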

A few more serious problems arise from all of this traffic flowing through the world’s networks. The first is the “spectrum crunch”. We have sliced the available radio spectrum in frequency, time, space and power. We need to pull something big out of the hat to squeeze more out of the spectrum available in heavy traffic environments such as cities.

The second is the “backhaul bottleneck”. All the data produced in the radio access part of the network (where mobile users connect) needs to flow to other parts of the network (for example to fixed or mobile users in other cities).

Network operators maintain dedicated high capacity links to carry this “backhaul” traffic, typically by optical fibre or point-to-point microwave links. This works well when the backhaul connects two cities, but less well when connecting the “last mile” in a built-up urban environment.

When the total data volume to be moved, measured in bits per second per square metre, rises into the range that demands backhaul-scale capacity, and that demand is itself mobile, some clever dynamic backhaul technology is needed.

As more of us carry yet more devices, and continue to enjoy high quality video-intensive services, we will keep pushing up our data rate demands on mobile networks. In theory, there is no known upper limit on the amount of data an individual can generate or consume. In practice, it depends on available bandwidth, the cost of data and the ability of devices to serve it up to us.

We have seen amazing progress in mobile data rates over the past decade. This trend will need to continue if we’re to keep pace with demand.

A new solution

To address the burgeoning data demand, and building on a strong history in wireless research, CSIRO has developed two major pieces of new technology – Ngara point-to-point (backhaul) and Ngara point-to-multi-point (access) technology. (Ngara is an Aboriginal word from the language of the Dharug people and means to “listen, hear, think”.)

How Ngara works.


The latter (access) technology addresses several big challenges faced by LTE networks through its “narrow cast” beam-forming transmissions and smart algorithms, which can form a large number of “fat pipes” in the air, reducing wasted radio signal energy while increasing data rates and range.

It also enables wireless signals to avoid obstacles like trees, minimises the need for large chunks of expensive spectrum and allows agencies to dynamically change data rates where and when needed during an emergency.

In Australia we are looking at a field trial of Ngara in remote and regional communities to deliver critical broadband services such as health and education.

This article was originally published on The Conversation. Read the original article.

Fighting crime with lasers

It’s the type of technology you’d expect on Batman’s utility belt – but you won’t find it in a DC Comics book about the world’s greatest detective. Instead, this bad boy is being used by our real-life heroes to fight crime.

It’s called Zebedee – a handheld laser scanner that generates 3D maps of all sorts of environments, from caves to factory floors, in the time it takes to walk through them.

The portable device works by emitting laser beams from a scanner mounted on a spring; as the operator moves, the scanner sways, continuously sweeping the environment and converting its 2D measurements into a 3D field of view. In fact, it can collect over 40,000 range measurements in just one second. It could even create a 3D map of the Batcave in around 20 minutes.
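Those two figures give a feel for the density of the resulting models. A back-of-the-envelope Python sketch (ours, not CSIRO’s) using the measurement rate quoted above:

```python
# Back-of-the-envelope point count for a handheld laser scan, using the
# figure quoted above: 40,000 range measurements per second.
MEASUREMENTS_PER_SECOND = 40_000

def points_collected(minutes):
    """Total range measurements captured over a walk of `minutes` minutes."""
    return MEASUREMENTS_PER_SECOND * minutes * 60

# A 20-minute "Batcave" scan would yield on the order of 48 million points:
print(f"{points_collected(20):,} measurements")  # 48,000,000 measurements
```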

A 3D scan map of a Boeing 727 airplane

Not the Batplane, but a 3D scan of a Boeing 727 at Aviation Australia in Brisbane.

Since 2010, Zebedee has been in action across the globe for a range of cool purposes, from preserving world heritage sites to exploring the complex interior of the Leaning Tower of Pisa.

But it’s never been used like this before.

For the first time, our real world crime fighters at the Queensland Police Service are using Zebedee to help piece together crime scene puzzles.

Crime scenes can be difficult to investigate. They’re often places like dense bushland, steep slopes or dangerous caves, which can make thorough sweeps of the scene both tough and time consuming. Using Zebedee, also known as ZEB1, police can now easily access these hard-to-reach places and map confined spaces where it may be difficult to set up bulky camera equipment and tripods. It also means less disturbance of the crime scene.

Using data collected by the scanner, police investigators can quickly recreate the scene on their computer in 3D, and view it from any angle they want. They can then locate and tag evidence to particular locations with pinpoint accuracy.

3 images showing different angles of a 3D scan map of the HMQS Gayundah shipwreck.

Not the Batboat, but a 3D scan of the half-submerged HMQS Gayundah wreck. Zebedee helps make all angles accessible.

This Brisbane-born technology is now available commercially and this local application is a striking example of how 3D mapping can allow us to access locations and view angles previously out of our reach.

It will help our local detectives and crime fighters in the Police Service generate and pursue lines of enquiry for incidents like murder cases and car crashes. You could say that Zebedee puts the CSI in CSIRO.

We’re working on even more ways to adapt Zebedee for a range of other jobs that require 3D mapping, from security and emergency services to forestry and mining, even classroom learning.

Film makers may soon be able to use Zebedee technology to easily digitise actual locations and structures when creating animated worlds. Maybe the next Batcave we see at the movies will be more realistic than ever before, created by 3D mapping an actual cave.

The potential applications are endless.

Sight saving science for the Torres Strait

By Ali Green and Sarah Klistorner

An estimated one million Australians have diabetes, and this number is expected to double by 2025. About 60 per cent of these people will develop eye complications, such as diabetic retinopathy.

Diabetic retinopathy is one of the leading causes of irreversible blindness in Australian adults. The disease often has no early-stage symptoms and is four times more likely to affect Indigenous Australians.

Just imagine if this disease was preventable.

A landscape image of Thursday Island coastline

On Thursday Island, Torres Strait, an ophthalmic nurse is screening eyes at the local hospital as part of the Tele-eye Care trial.

During the past few months, our researchers have been working with Queensland Health and the Indigenous and Remote Eye Health Service (IRIS) on the Torres Strait Islands to set up a remote eye screening service – giving hundreds of people access to specialist eye care.

For people living in remote areas, travelling a five-hour round trip for specialist medical care can be disruptive to their family and community. Transporting patients can also be expensive.

A satellite broadband dish on a roof

Patients’ retinal images and health data can be sent from a remote community health clinic to the desk of a city-based ophthalmologist.

Our Remote-I system is saving patients from the long and sometimes unnecessary journey by using local clinicians to conduct routine 15-minute retinal screenings, often as part of scheduled health clinic visits. Our technology sends high-resolution retinal images taken in the screenings to ophthalmologists in Brisbane via satellite broadband.

Previously, ophthalmologists would only be able to fit in a limited number of eye screenings and surgeries when they visited remote communities. Once fully implemented, a city-based ophthalmologist will be able to screen up to 60 retinal images per week with the help of Remote-I.

Preliminary results from a review of data collected at one location showed that only three out of 82 patients screened to that date had a sight-threatening condition and required an immediate referral. Previously, those other 79 patients not requiring referrals may have held up the queue while the specialist was visiting the remote community. With Remote-I, those who need immediate treatment or attention can already be first in line.

With only 900 practising ophthalmologists in Australia, and high demand for eye health services in remote locations, finding new ways to deliver health services to remote communities is vital to providing the best care when and where it’s needed.

A Torres Strait man undergoes an IRIS eye scan.

IRIS Coordinator patient screening in the Torres Strait.

By June 2014 the Tele-Eye Care trial will have screened 900 patients in remote WA and QLD. In addition to streamlining health care processes, the trial is collecting a lot of data.

And this is where the science gets interesting.

With patients’ consent, collected images will be used by the Tele-Eye Care project to study blood vessel patterns in retinas. Algorithms will then be designed to automatically detect particular eye diseases, which will aid diagnosis in routine screenings.

Even though tele-ophthalmology has been around for many years, this is the first time anyone has looked at image processing techniques to automatically detect eye defects in routine screening environments via satellite broadband.

A retinal image

Retinal images, each around 1.5MB in size, and electronic patient files are sent to an ophthalmologist in Brisbane.
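Put together with the screening volumes mentioned earlier (up to 60 images per week per ophthalmologist), the data load on the satellite link is modest. A rough Python sketch; the link speed is our illustrative assumption, not a figure from the trial:

```python
# Rough weekly data volume for remote eye screening, using the figures
# quoted in the article: up to 60 retinal images per week at ~1.5 MB each.
# The satellite link speed below is an illustrative assumption only.
IMAGES_PER_WEEK = 60
MB_PER_IMAGE = 1.5
LINK_MBPS = 6  # assumed satellite link budget, in megabits per second

weekly_mb = IMAGES_PER_WEEK * MB_PER_IMAGE    # 90.0 MB per week
transfer_seconds = weekly_mb * 8 / LINK_MBPS  # megabits / (megabits per second)
print(f"{weekly_mb} MB/week, ~{transfer_seconds:.0f} s to transmit")
```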

We’re working hard to deliver better health outcomes for Indigenous Australians. Being able to provide diagnoses on the spot will make a huge impact on delivering faster, more cost effective eye care services to the outback and prevent blindness.

This initiative is funded by the Australian Government.

Media contact: Ali Green +61 3 9545 8098

Answering the burning questions

People in fire gear supervise flames burning across grassland

We’re keeping a close eye on grassfires this summer.

This week’s heatwave across southern Australia has reminded us of the serious dangers posed by grassfires. They might not sound that threatening, but these fires can travel at speeds of up to 25 kilometres per hour, and damage hundreds of hectares of land within a matter of hours.

So while many of us were enjoying our summer holidays, our team of fire scientists were hard at work with researchers and volunteers from Victoria’s Country Fire Authority (CFA) to help learn more about grassfire behaviour in Australia.

In a series of carefully designed, planned and monitored experiments, the research team lit controlled fires in grass fields near Ballarat, an hour west of Melbourne.

The aim was to safely gather new and thorough data about grassfire behaviour in different conditions. Experimental plots containing grasses at different stages of their life cycle were burned, while experts observed and various instruments measured things like the time it took for the fire to burn across the 40 x 40 metre plot.
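The headline number from that plot timing is the rate of spread. A hypothetical sketch of the calculation in Python (the 12-second crossing time below is made up for illustration, not a trial result):

```python
# Rate of spread from the time a fire front takes to cross a plot.
# Plot length matches the 40 x 40 metre plots described in the article.
PLOT_LENGTH_M = 40.0

def rate_of_spread_kmh(seconds_to_cross):
    """Rate of spread in km/h, given seconds to cross the plot."""
    metres_per_second = PLOT_LENGTH_M / seconds_to_cross
    return metres_per_second * 3.6  # m/s to km/h

# If a fire front crossed the plot in 12 seconds:
print(f"{rate_of_spread_kmh(12):.1f} km/h")  # 12.0 km/h
```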

Aerial view of flames and smoke stretching across grasslands.

The view of our grassfire experiment captured by an unmanned aerial vehicle.

Australian researchers have been looking into forest fires and bushfires for decades, but this is the first time in nearly 30 years that we’ve conducted research into grassfires.

Back in 1986 we ran similar experiments in the Northern Territory, which led to the development of our Grassland Fire Spread Meter. This tool is used by rural fire authorities across the country to predict the rate that a grassfire is likely to spread once it starts.

What remains unknown is at what stage of the grass’s life cycle it becomes a fire hazard, especially for the grass types found across southern Australia.

Today, we have technology like unmanned aerial vehicles (UAVs) to help gain a new, bird’s eye perspective of the fire’s progress, allowing us to analyse the burns with a whole new level of detail.

A scientist in fire gear holds up a UAV with smoke in the background

Send in the drones: our quadcopter UAV examines the experimental grassland burns near Ballarat.

Controlled by our scientists, the robot quadcopters flew above the experimental burns, filming the fire as it spread through the grass. The vision, along with other data captured by thermal sensors, will be used to develop computer models that can replicate and predict grassfire behaviour.

The results will help fire authorities like the CFA better respond to grassfires, as well as improving how they determine Fire Danger Ratings and when to declare Fire Danger Periods in particular regions.

The timing of the experimental burns was critical. The crew waited until the weeks of late December and early January for safe temperature and wind conditions to ensure any fires they lit would be easy to control and contain. They also needed the grass curing levels to be at the right amount. Thankfully, this was all wrapped up before we, and the grasslands, were hot and bothered by the heatwave.

Check out this video for the how and why of the experimental burns:

