Remember the internet of the 90s? When browsing online meant being stuck at your desk with your whiz-bang 56k modem. It was an era without smartphones, without tablets – some might say, without freedom.
Luckily, the clever folks in our labs came up with a little something called WiFi, using the same mathematics that astronomers initially applied to piece together the waves from black holes (for more on the WiFi story, click on our handy infographic on the right).
While WiFi has given us the freedom to work wirelessly in our homes, offices and out-and-about, it has also inspired a few other – err, interesting – innovations. Here are a few that even we didn’t see coming:
- No more queuing for beer at the footy – thanks to a digital upgrade at Adelaide Oval, sports fans won’t even have to get out of their seats to order a drink. Or hot chips.
- Keeping Rover happy – this WiFi enabled system is a fully autonomous robotic dog sitter complete with video conferencing capabilities, remote tug-o-war, ball fetch mechanism, and treat dispenser.
- The humble bathroom scale has taken a leap forward – why waste energy (and calories) having to get up to manually record your weight when your wireless bathroom scale can do it for you?
- Did someone say bionic butler? For a couple of hundred thou’, this guy will get you a drink and even flip your pancakes.
- Yep, it’s a WiFi rabbit. We’re sure he’s useful in some way. We just can’t figure out what it is yet.
Find out more about how we invented and patented wireless LAN technology on our website.
By Carrie Bengston, James Davidson and Olivier Salvado
Mmm . . . lovely! A hot Indian curry is simmering away on the stove on a wintry night. The smell of spices fills the kitchen. One of the spices is turmeric, from the ginger family. Its vibrant yellow colour comes from the compound curcumin, which is finding a use in clinical tests for Alzheimer’s disease (AD).
Who knew? Soon everyone will! We’re presenting our research this week at a major conference in Copenhagen, AAIC2014.
A clinical trial of the spice-infused eye test is being led by our own Dr Shaun Frost and team, with WA’s Edith Cowan University, US company NeuroVision Imaging, and the McCusker Alzheimer’s Research Foundation in Perth. Several hundred volunteers have taken part. They include healthy people, mildly cognitively impaired people and patients with AD. It’s all part of the Australian Imaging Biomarkers and Lifestyle study of Aging (AIBL).
The trial asks volunteers to come along to two visits for retinal fluorescence imaging, i.e. an eye scan. This is quick and painless. Patients sit in front of a specialised camera and a photo is taken of the retina at the back of their eye.
Between visits, volunteers eat some curcumin, which binds to beta-amyloid plaques, the sticky proteins that indicate Alzheimer’s, and fluoresces. The plaques (if there are any) show up in the eye scans as bright spots which can be counted and measured. The data is then used to calculate a special number for each patient, a retinal amyloid index (RAI), and compared between healthy, mildly cognitively impaired and AD patients.
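The count-and-measure step can be pictured with a small sketch. The published RAI is derived from calibrated clinical images, so the threshold and the index itself here are purely illustrative: we simply count connected bright spots in a scan and measure how much of the image they cover.

```python
import numpy as np

def retinal_amyloid_index(scan, threshold):
    """Toy retinal amyloid index: count fluorescent bright spots and
    measure the fraction of the image they cover. Illustrative only --
    the clinical RAI uses calibrated imaging, not this simple recipe."""
    mask = scan > threshold
    seen = np.zeros_like(mask, dtype=bool)
    n_spots = 0
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                n_spots += 1              # found a new connected bright spot
                stack = [(r, c)]
                seen[r, c] = True
                while stack:              # flood-fill its neighbours
                    i, j = stack.pop()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < rows and 0 <= nj < cols \
                                and mask[ni, nj] and not seen[ni, nj]:
                            seen[ni, nj] = True
                            stack.append((ni, nj))
    coverage = mask.sum() / mask.size     # fraction of image that is plaque
    return n_spots, coverage
```

A real pipeline would also filter noise and normalise for image brightness before comparing patients.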
Encouragingly, as we announced this week, early results show the amount of plaque in the retina closely mirrors the amount in the brain. If confirmed, retinal imaging may be the beginnings of an easy, non-invasive test for early detection of AD. Combined with results of cognitive tests and other markers it could help doctors diagnose AD more confidently.
Eye scans like this also find plaques when they’re smaller than the ones in brain scans, potentially finding signs of AD earlier – maybe up to 20 years before cognitive symptoms appear. If diagnosed, AD patients could start treatment sooner and have regular eye scans to see which treatments work best for them.
Brain imaging on the cloud
From curry to the cloud. More research presented this week is about more accurately interpreting brain images sometimes used to diagnose AD.
To get a brain scan, a patient lies on a bed in a large machine like a Magnetic Resonance Imaging (MRI) or Positron Emission Tomography (PET) scanner. These machines record a series of images through the brain, which are then visually checked by a radiologist who compiles a report for the patient’s doctor.
This visual inspection can be subjective, tedious and time consuming. But recent advances in scientific computing and machine learning allow systems to accurately measure features of the 3D scan, such as brain size or concentration of a tracer molecule, that support a diagnosis.
Using these techniques, a new trend is emerging for improving radiologists’ productivity. Scanners and specialised medical software can report quantitative values and compare them to the values expected for normal, healthy patients – just like blood test results from a pathology lab do.
Our researchers, led by health imaging specialist Associate Prof Olivier Salvado, have just released a new cloud computing application, MILXCloud, that automatically delivers standardised radiology reports.
Users will be able to upload a PET scan and within 15 minutes be emailed a one page quantitative report showing a diagram of the brain with colour coded values compared with what’s normal. This data will help support diagnosis by the radiologist and enhance delivery of eHealth services.
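The comparison to normal values works much like a pathology report: each measured value is scored against the range seen in healthy people. The region names, normative numbers and 2-sigma cutoff below are assumptions for illustration, not MILXCloud’s actual output format.

```python
def quantitative_report(measurements, norms):
    """Score regional measurements against healthy reference values.

    `measurements` maps region -> measured value (e.g. tracer uptake);
    `norms` maps region -> (healthy mean, healthy std deviation).
    A |z-score| above 2 flags the region for the radiologist's attention.
    """
    report = {}
    for region, value in measurements.items():
        mean, std = norms[region]
        z = (value - mean) / std          # standard deviations from normal
        report[region] = {
            "value": value,
            "z_score": round(z, 2),
            "abnormal": abs(z) > 2.0,     # illustrative cutoff
        }
    return report
```

A one-page report would then colour-code each region by its z-score, so the radiologist can see at a glance what sits outside the healthy range.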
Whether it’s curry or the Cloud, the future of Alzheimer’s detection sure looks bright.
Media: Andreas Kahl | 0407 751 330 | firstname.lastname@example.org
With the first World Cup match just over (Brazil beat Croatia 3-1) and a month of games to go, no doubt we’ll see soccer crowds in stadiums across Brazil erupting into a Mexican Wave.
Back home, our robotics researchers are waving their arms around too, but for a different reason. They’re using new handheld 3D heat mapping technology.
We’ve all heard of energy audits. Maybe you’ve even had one done to find out where your home is losing heat and how you can better conserve energy. Accurate measurements of temperatures and their precise locations is key to that. It’s now easier, cheaper and more reliable thanks to HeatWave, a new 3D handheld thermal imaging system we’ve been working on. We’re pleased that last night it was the winner of the R&D category at the Queensland iAwards, the IT industry awards.
Thermal imaging systems have been around for decades. But they’ve been a bit cumbersome and often only display results in 2D (which for some applications is fine). Not just that, they’ve needed experts to interpret the results. This often involves mentally challenging tasks like comparing and interpreting different kinds of images from different angles. That’s called ‘cognitive load’ in case you need to know.
Our system needs less brain and more brawn. Basically if you can wave your arm, you can use HeatWave. It weighs around half a kilo and captures multiple perspectives of the same object, then merges them. It’s built on the SLAM (simultaneous localisation and mapping) technology that makes our famous Zebedee scanner work. But it adds information beyond just the 3D shape – it overlays accurate temperature measurements into a single view using some software smarts we’ve developed. Here’s how it maps a hot engine.
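The merging step can be sketched simply: once SLAM has placed every thermal reading at a 3D position, readings from overlapping views of the same surface patch are fused into one temperature estimate. The voxel-averaging approach and voxel size below are assumptions for illustration; HeatWave’s actual fusion software is more sophisticated.

```python
from collections import defaultdict

def fuse_thermal(points, temps, voxel=0.05):
    """Fuse temperature readings from multiple views into a 3D map.

    Each reading is a (x, y, z) position (from SLAM) plus a temperature.
    Readings landing in the same voxel (5 cm cube by default -- an
    assumed resolution) are averaged into a single estimate."""
    acc = defaultdict(list)
    for (x, y, z), t in zip(points, temps):
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        acc[key].append(t)
    # average all samples that observed the same patch of surface
    return {k: sum(v) / len(v) for k, v in acc.items()}
```

Averaging overlapping views is also what makes the result more reliable than any single 2D thermal image: spurious readings from one angle get washed out by the others.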
We’re hoping the $4B thermal imaging industry will benefit from HeatWave’s ease of use, precision, portability and reliability. Already we’ve done pilot trials for detecting faults in engines, auditing heat loss in buildings, monitoring the condition of livestock and even people!
In the spirit of practicing what we preach, we’ve tried it out on ourselves. One of our researchers had his sore back scanned using HeatWave. The results clearly show where the hot spots (i.e. painful spots) are in his right shoulder. Hopefully, with a bit of physio, he’ll be waving his arms around pain free sometime soon.
Maybe even doing a Mexican Wave in front of the telly . . .
* * *
Media: Carrie Bengston | 0417 266 190 | email@example.com
That little blue bird in the Twitter logo is always so damn chirpy! But not all our tweets come from a happy state of mind.
With over 9000 tweets posted every second, it’s clear that Twitter has become a popular place to express our feelings in 140 characters or less – from fear, joy and sadness, to anger, surprise and of course, hunger.
I’m so hungry I could Instagram a horse.
— Jason Lastname (@JasonLastname) May 1, 2013
But what if we could use this information to help those who aren’t feeling so chirpy? We’ve teamed up with the Black Dog Institute and Amazon to do just that. Our new research project, called ‘We Feel’, will monitor and analyse public tweets to see when (collectively) people are feeling down and so help mental health support organisations be better prepared.
For this project, we’re drawing on our expertise in natural language processing, big data mining and social media analysis to present data in a way that’s usable by mental health experts.
‘We Feel’ started out by creating an emotional word map. We asked a bunch of people to look at a set of 600 emotion words used in tweets and link them to particular feelings. The emotions are based on US Psychology Professor W. Gerrod Parrott’s standard list of basic emotions including love, joy, surprise, anger, sadness and fear. They were divided further into specific secondary emotions which were then mapped to words used in tweets. For example, the word ‘hurt’ might suggest a feeling of sadness; or ‘nice’ in a tweet might reveal contentment.
This word map combined with data analysis software and techniques can be used to process the stream of thousands of tweets sent into the public domain every second and create a graph of a particular emotion. Here’s ‘fear’ as the Federal Budget was announced last week.
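The core of that processing can be sketched in a few lines: look each tweet word up in the emotion word map and tally the hits. The handful of word-to-emotion pairs below is a tiny invented slice for illustration; the real ‘We Feel’ map covers around 600 words graded by volunteers.

```python
from collections import Counter

# Tiny illustrative slice of the word map -- the real mapping covers
# ~600 words linked to Parrott's primary and secondary emotions.
EMOTION_WORDS = {
    "hurt": "sadness", "nice": "joy", "scared": "fear",
    "amazing": "surprise", "furious": "anger", "adore": "love",
}

def emotion_counts(tweets):
    """Tally primary emotions expressed across a batch of tweets."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            emotion = EMOTION_WORDS.get(word.strip(".,!?#@"))
            if emotion:
                counts[emotion] += 1
    return counts
```

Run over a rolling window of the live stream, counts like these become the time series behind a graph of, say, ‘fear’ on Budget night.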
Contrast that fear with the joy of Eurovision the previous weekend. The winner, Austria’s Conchita Wurst, was certainly ecstatic. And millions of Eurovision fans on couches around the world were tweeting almost simultaneously, giving the Eurovision hashtag a work out. ‘We Feel’ revealed spikes of words like ‘amazing’, ‘surprising’, ‘funny’ and ‘weird’ on Twitter in Europe at the time.
Crunching even a small proportion of Twitter data like this as-it-happens is a huge task. That’s where Amazon came in with their cloud service Kinesis. It enabled our software to process the data efficiently as it rushed down the pipe. The user interface for ‘We Feel’ is freely available to other developers and researchers so they can use the data too.
As well as providing real time population data, this tool will also help uncover where people are most at risk of poor mental health and how the mood and emotions of an area or region can change over time. We Feel will also help us understand how strongly our emotions are related to social, economic and environmental factors such as the weather, time of day, day of the week, news of a major disaster or economic downturn. This is all very important to health services, which could use it both to monitor community mental health and to predict where and when more services are needed.
Using social media data to support those with mental health issues – now that’s something we should all feel chirpy about!
Read more about this at: Twitter tool shows a real-time view of our emotions
Media contact: Sarah Klistorner 02 9372 4662, 0423 350 515, firstname.lastname@example.org
By Hannah Scott
For average Australians, it’s probably safe to say cyber security isn’t something they think about often. Maybe it should be.
Think about the things you take for granted, like having electricity to power your home and having access to crucial health services. Imagine if this was all compromised because of a cyber attack that could shut down crucial infrastructure. In seconds, your local energy company could be unable to deliver power to your home and your confidential health records, bank account details and passwords could be compromised.
Heartbleed is a recently discovered internet security flaw that allowed attackers to gain access to encrypted passwords, credit card details, and other data on trusted websites including Facebook, Gmail, Instagram, and Pinterest.
The flaw was found in a certain type of secure access software, and it allowed cyber attackers to gather personal information over what was considered to be a secure connection or site. It’s considered one of the first instances of a widespread attack against something that we all take for granted as being secure. James Deverell, from our Futures team, said we’re at risk of incidents such as Heartbleed happening again.
“Not only are we becoming increasingly vulnerable because of the wide range of services that we are moving onto technology platforms, but we are also seeing the nature of these types of cyber attacks changing.”
“For example, cyber attacks are becoming increasingly sophisticated as more tools become available, and it’s very likely that we will continue to see these types of vulnerabilities into the future.”
As technology use rises and we become more dependent on technology for services, we’ll continue to see these kinds of vulnerabilities increase. We’ve released a new report on the future of cyber security, highlighting a number of different vulnerabilities and threats of future attacks.
In the future of cyber attacks, we could expect to see anything from individuals and “script kiddies” who are trying to outdo each other in a game of seeing how much damage they can cause, to more serious acts of “hacktivism” and corporate and government espionage. Each of these threats is becoming increasingly sophisticated and we need to be prepared as a nation to respond to them.
Core Australian industries such as energy, mining, healthcare and IT services all have vulnerabilities that can potentially be exploited through cyber attacks, at a cost estimated to be as high as $2 billion annually.
“Our work across digital productivity, computational informatics and big data analytics will be necessary to address the risks and vulnerability of attacks. We want to keep sectors like Health, Government and Energy safe from cyber attacks in the future,” said Deverell.
Enabling Australia’s Digital Future: Cyber security trends and implications is available to download on our website.
Media: Samantha Lucia, Communication Manager, Information Sciences: email@example.com, 0467 768 960
Sarah Klistorner, Communications Manager, Digital Productivity and Services Flagship: firstname.lastname@example.org, 0423 350 515
By James Davidson and Pamela Tyers
How do you eat your Easter chocolate? Do you suck it or chew it? Does your tongue smear the inside of your mouth as the chocolate melts, or does it get chomped by your back teeth then sent down your throat?
It’s true, some of us suck and some of us chew. Whichever process we use to break down food in our mouth, it affects the taste sensation.
Flavour is released through the movement and time taken for taste components to hit our taste buds. Those taste components include salt, sugar and fat. If we know how to place those tasty bits into foods so that they achieve maximum delicious flavour before we digest the food, we then know how to use less of the unhealthy ingredients because our inefficient chewing means that we don’t taste much of them anyway.
For example, bread would taste unappetising if too much salt was removed out of it, but science can help us understand how to remove some of the less healthy components out of foods while retaining their familiar, delicious taste.
Enter our new 3D dynamic virtual mouth – the world’s first – which is helping our researchers understand how foods break down in the mouth, as well as how the food components are transported around the mouth, and how we perceive flavours. Using a nifty technique called smooth particle hydrodynamics, we can model the chewing process on specific foods and gather valuable data about how components such as salt, sugar and fat are distributed and interact with our mouths at the microscopic level.
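At the heart of smoothed particle hydrodynamics is a density summation: each particle’s local density is estimated by summing the masses of its neighbours, weighted by a smoothing kernel. The 1D Gaussian-kernel version below is a minimal sketch of that core idea; the virtual mouth itself runs full 3D SPH with food-specific material models that aren’t shown here.

```python
import numpy as np

def sph_density(positions, masses, h):
    """1D SPH density estimate using a Gaussian smoothing kernel.

    Minimal sketch of the SPH density summation rho_i = sum_j m_j W(r_ij);
    the real virtual mouth uses 3D kernels and material models for food."""
    positions = np.asarray(positions, dtype=float)
    masses = np.asarray(masses, dtype=float)
    norm = 1.0 / (h * np.sqrt(np.pi))       # 1D Gaussian kernel normalisation
    r = positions[:, None] - positions[None, :]   # pairwise separations
    W = norm * np.exp(-(r / h) ** 2)              # kernel weight per pair
    return W @ masses                             # density at each particle
```

Because every quantity lives on freely moving particles rather than a fixed grid, SPH copes naturally with food fracturing, smearing and flowing around the mouth, which is exactly why it suits chewing simulations.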
We’re using it to make food products with less salt, sugar and fat and incorporate more wholegrains, fibre and nutrients without affecting the taste.
It’s part of research that will help us understand how we can modify and develop particular food products with more efficient release of the flavour, aroma and taste of our everyday foods.
And it’s good news for all of us. Eighty percent of our daily diet is processed foods – think breakfast cereals, sliced meats, pasta, sauces, bread and more. So, creating healthier processed foods will help tackle widespread issues such as obesity and chronic lifestyle diseases.
In fact, our scientific and nutritional advice to government and industry has so far helped remove 2,200 tons of salt from the Australian food supply, and reduced our population’s salt consumption by 4 per cent.
Oh…and we’ve also used the virtual mouth to model just how we break down our Easter chocolate.
As the teeth crush the egg, the chocolate fractures and releases the caramel. The chocolate coating collapses further and the tongue moves to reposition the food between the teeth for the next chewing cycle. The caramel then pours out of the chocolate into the mouth cavity.
With this virtual mouth, variations to thickness of chocolate, chocolate texture, caramel viscosity, and sugar, salt and fat concentrations and locations can all be modified simply and quickly to test the effects on how the flavours are released.
Now that’s something to chew on. Happy Easter!
Media contact: James Davidson, 03 9545 2185, email@example.com
Media Release: Chocolate bytes with virtual mouth.
It’s often hard to understand what’s happening inside us, because the processes and phenomena that influence our bodies and impact our health are invisible.
Not being able to understand why we’re sick or why our body is acting the way it does can add to the stress and strain of illness.
But now, a new generation of movie makers are drawing back the curtain, revealing the hidden secrets of our marvellous biology and setting new standards for communicating biological science to the world.
Three spectacular new biomedical animations were premiered today during a red carpet event at Federation Square in Melbourne.
The molecular movies bring to life some very complex processes, researched by biomedical scientists and detailed in scientific journals most of us never see. They showcase the work of VIZBIplus – Visualising the Future of Biomedicine, a project that is helping to make the invisible visible, so that unseen bodily processes are easier to understand, which will help us make better choices about our health and lifestyle.
With BAFTA and Emmy award winning biomedical animator Drew Berry as mentor, three talented scientific animators – Kate Patterson (Garvan Institute of Medical Research), Chris Hammang (CSIRO) and Maja Divjak (Walter and Eliza Hall Institute of Medical Research) – have created biomedically accurate animations, showing what actually happens in our bodies at the micro scale.
The animators used the same or similar technology as Dreamworks and Pixar Animation Studios, as well as video game creators, to paint mesmerising magnifications of our interior molecular landscapes. While fantastic, the animations are not fantasies. They are well-researched 3D representations of cutting-edge biomedical research.
Kate Patterson’s animation shows that cancer is not a single disease. She highlights the role of the tumour suppressor protein p53, known as ‘the guardian of the cell’, in the formation of many cancer types.
Chris Hammang’s animation describes how starch gets broken down in the gut. It is based on our very own health research about resistant starch, a type of dietary fibre found in foods like beans and legumes that protects against colorectal cancer – one of Australia’s biggest killers. Chris shows us the ‘why’ behind advice to change our dietary habits.
Maja Divjak’s animation highlights how diseases associated with inflammation, such as type 2 diabetes, are ‘lifestyle’ diseases that represent some of the greatest health threats of the 21st century.
With our current ‘YouTube generation’ opting to watch rather than read, biomedical animations will play a key role in revealing the mysteries of science. These videos will allow researchers to communicate the exciting and complex advances in medicine that can’t be seen by the naked eye.
Watch all the videos here and be among the first to see these amazing visualisations!
Picture this. It’s a beautiful autumn day in Melbourne. You’re about to embark on a walking tour to discover some of the city’s finest architecture. My name is Carrie and I’ll be your tour guide.
We begin at one of the city’s more stately buildings – the Shrine of Remembrance. This grand temple-like structure was built back in 1926 and is located right next to the Botanic Gardens. It’s a focus for the city’s ANZAC Day ceremonies each year and in this ANZAC Centenary commemorating the start of WW1.
This month our scientists at CSIRO brought high tech to history by mapping the Shrine using a 3D laser scanner, preserving it digitally with a tool called Zebedee.
As you can see, it’s very timely. Major renovations at the Shrine are underway to get ready for commemorations of the Gallipoli landing’s 100th anniversary in 2015. It’s part of the $45M ‘Galleries of Remembrance’ project.
The Shrine joins a select group of heritage sites mapped in 3D by the Zebedee scanner, along with Brisbane’s Fort Lytton, and even the Leaning Tower of Pisa.
Now I personally get quite excited about architectural drawings, but these 3D maps add detailed information for building managers and heritage experts by measuring the actual built spaces. Zebedee technology offers a new way for recording some of our priceless treasures.
Let me show you one of the interior images of the Shrine. These amazing ‘point clouds’ are created by a handheld laser scanner bouncing on a spring as the user walks through corridors, up stairs and round about. As long as it takes to walk through the building is about how long it takes to make the map. You can watch it online afterwards.
Despite the images’ almost X-ray look, Zebedee can’t see through walls, as the laser bounces off solid surfaces. But when you put all the data in one place you get a sliceable, zoomable, turnable map with architectural details like stairs, columns, voids and ceilings all measured to the nearest centimetre. But . . . no roof! That’s because our scientists are developing a flying laser scanner that scans rooftops from the air. Secret attics may be secret no longer.
That concludes our tour for today. If you’d like to take home your very own Zebedee souvenir, head to our website.
By Ali Green
Almost one in four older Australians are affected by chronic health conditions, and close to 1.2 million currently suffer from more than one. Given our ageing population, this number is set to increase significantly by 2030, adding more pressure to our health system.
Life for a chronic disease sufferer with complex conditions like diabetes, heart or lung disease typically means two to three hospital stays per year, on top of multiple visits to the GP for regular health checks.
In Australia’s largest clinical telehealth trial, we’ve equipped a group of elderly patients with broadband-enabled home monitoring systems to help them manage and monitor their conditions from home.
Patients can use the machines to measure their blood pressure, blood sugar, ECG (to detect heart abnormalities through electrical signals), lung capacity, body weight and temperature in a process that generally takes around 10-20 minutes.
The monitoring system’s large screen helps guide patients through the different procedures, and the data is sent off to a secure website where it becomes immediately available to a care team including the patient’s nurse and doctor. Daily stats are checked regularly by a specialist nurse who can assist the patient via telephone if there are any changes in their regular patterns.
150 patients across Australia are testing out the machines as part of the CSIRO-led trial. Here are a few of their stories.
Janice and Bill
Victorian retiree Janice suffers from an irregular heartbeat, diabetes and low blood pressure – conditions that require twice weekly visits to her doctor and multiple hospital stays to be controlled. She also has diabetes related retinopathy which has caused her to lose most of her vision, making medical visits difficult for both herself and husband Bill.
Since using the telehealth monitoring system, Janice’s GP and hospital visits have reduced significantly, and she can better manage her symptoms to prevent hypoglycaemic episodes. Bill also has a clearer idea of how Janice is doing from day to day.
“If Janice’s blood pressure reading is particularly low, I can prevent any dizzy spells by getting her to sit down and giving her a glass of water. If her measurements are stable, I can pop out to do some shopping or walk the dog knowing that she should be fine on her own for a little while,” says Bill.
Jack
Jack has ischaemic heart disease and chronic obstructive pulmonary disease (COPD). This affects his airways causing breathlessness, a chronic cough and mucus build-up.
During a routine check of Jack’s telehealth monitoring data, his nurse Lay noticed that his ECG results were slightly unusual. This prompted the nurse to call Jack, who complained of shortness of breath. An appointment was made with Jack’s doctor for a full ECG, which turned out to be fine.
As a result of this episode, Jack’s nurse arranged to visit him at home to discuss a medication regime and teach him to use his medicated spray. This meant Jack could self-manage his shortness of breath and prevent unnecessary doctor visits.
Frances
75-year-old Frances (pictured top) has a respiratory condition called bronchiectasis. This can easily develop into a chest infection without early warning and lead to a stay in hospital.
Every day, Frances conducts a ten minute check up with the telehealth monitoring system in between washing up the breakfast dishes and getting ready to go out. A nurse at the other end of the internet connection checks Frances’ measurements, looking for any signs of early deterioration.
“I was surprised by the idea of self-monitoring at first, but now that I’m used to it, I think it’s a terrific idea. It has really helped me to better understand my health,” says Frances.
As Australia’s population ages and more demand is placed on our health system, telehealth can help reduce patient hospitalisation, and the related costs, by allowing patients to better manage their chronic diseases from home.
The Home Monitoring of Chronic Disease for Aged Care project is an initiative funded by the Australian Government.
CSIRO is participating in One in Four Lives: The Future of Telehealth in Australia event, at Parliament House this morning from 7:45am AEDT.
Media: Sarah Klistorner M: +61 477 716 031
By Ian Oppermann, Director, Digital Productivity and Services
When disaster strikes – such as January’s bushfire in Victoria or the recent cold spell that froze much of north America – it’s vital for emergency services to get the latest information.
They need to access real-time data from any emergency sites and command centres so they can analyse it, make timely decisions and broadcast public-service updates.
CSIRO thinks it has a solution in its high speed and high bandwidth wireless technology known as Ngara, originally developed to help deliver broadband speeds to rural Australia.
The organisation has announced a licensing deal with Australian company RF Technology to commercialise Ngara so it can be used to allow massive amounts of information to be passed between control centres and emergency services in the field.
There is already interest from agencies in the United States and it’s hoped that Australian agencies will soon follow.
Squeezing more data through
The technology will package four to five times the usual amount of data into the same spectrum. This will allow emergency services to send and receive real time data, track assets and view interactive maps and live high definition video from their vehicles. It’s a step in what has been a long journey toward an ambitious vision.
For years, the vision of the communications research community was “connecting anyone, anywhere, anytime” – a bold goal encompassing many technical challenges. Achieving that depended heavily on radio technology because only radio supports connectivity and mobility.
Over the years we designed ever more complex mobile radio systems – more exotic radio waveforms, more antenna elements, clever frequency reuse, separation of users by power or by spreading sequence and shrinking the “cell” sizes users operate in.
A research surge in the late 1990s and 2000s led to a wealth of technology developed in the constant drive to squeeze more out of radio spectrum, and to make connections faster and more reliable for mobile users.
This radio access technology became 3G, LTE, LTE-A and now 4G. Europe is working on a 5G technology. We’ve also seen huge advances in wireless local area networks (WLAN) and a strong trend to offload cellular network data to WLAN to help cope with the traffic flowing through the networks.
Demand for more keeps growing
Despite this, the data rate demands from users are higher than what mobile technology can offer. Industry commentators who live in the world of fixed communication networks predict staggering growth in data demand which, time tells us, is constantly underestimated.
We’ve even stretched our ability to name the volume of data flowing through networks: following terabytes we have exabytes (10^18 bytes), zettabytes (10^21) and yottabytes (10^24) to describe galloping data volumes.
A few more serious problems arise from all of this traffic flowing through the world’s networks. The first is the “spectrum crunch”. We have sliced the available radio spectrum in frequency, time, space and power. We need to pull something big out of the hat to squeeze more out of the spectrum available in heavy traffic environments such as cities.
The second is the “backhaul bottleneck”. All the data produced in the radio access part of the network (where mobile users connect) needs to flow to other parts of the network (for example to fixed or mobile users in other cities).
Network operators maintain dedicated high capacity links to carry this “backhaul” traffic, typically by optical fibre or point-to-point microwave links. This works well when the backhaul connects two cities, but less well when connecting the “last mile” in a built-up urban environment.
When the total volume of data to be moved, measured in bits per second per square metre, rises into the range requiring backhaul capacities, and the users are mobile, then some clever dynamic backhaul technology is needed.
As more of us carry yet more devices, and continue to enjoy high quality video-intensive services, we will keep pushing up our data rate demands on mobile networks. In theory, there is no known upper limit on the amount of data an individual can generate or consume. In practice, it depends on available bandwidth, the cost of data and the ability of devices to serve it up to us.
We have seen amazing progress in mobile data rates over the past decade. This trend will need to continue if we’re to keep pace with demand.
A new solution
To address the burgeoning data demand, and building on a strong history in wireless research, CSIRO has developed two major pieces of new technology – Ngara point-to-point (backhaul) and Ngara point-to-multi-point (access) technology. (Ngara is an Aboriginal word from the language of the Dharug people and means to “listen, hear, think”.)
The latter Ngara technology solves several big challenges over LTE networks through its “narrow cast” beam forming transmissions and smart algorithms which can form a large number of “fat pipes” in the air, reducing energy wastage of the radio signal, and increasing data rates and range.
It also enables wireless signals to avoid obstacles like trees, minimises the need for large chunks of expensive spectrum and allows agencies to dynamically change data rates where and when needed during an emergency.
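Ngara’s actual algorithms aren’t spelled out here, but the basic idea behind “narrow cast” beam forming can be sketched with a textbook phased array (everything below is illustrative, not CSIRO’s implementation): applying a phase shift to the signal at each antenna makes the radio energy add up in one chosen direction and largely cancel everywhere else.

```python
import numpy as np

# Textbook uniform linear array: 16 elements at half-wavelength spacing,
# steered towards one user at +30 degrees instead of broadcasting everywhere.
n = 16
target = np.deg2rad(30)              # direction we want the "fat pipe" to point

k = np.pi * np.arange(n)             # per-element phase progression (d = lambda/2)
weights = np.exp(-1j * k * np.sin(target))   # classic steering weights

# array factor: relative signal strength radiated in each direction
angles = np.deg2rad(np.linspace(-90, 90, 181))
af = np.abs(np.exp(1j * np.outer(np.sin(angles), k)) @ weights) / n

peak = np.rad2deg(angles[np.argmax(af)])
print(f"beam peaks at {peak:.0f} degrees")   # → beam peaks at 30 degrees
```

Away from the peak the phase terms point in different directions and mostly cancel, which is why a narrow beam wastes far less energy than an omnidirectional broadcast.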
In Australia we are looking at a field trial of Ngara in remote and regional communities to deliver critical broadband services such as health and education.
It’s the type of technology you’d expect on Batman’s utility belt – but you won’t find it in a DC Comics book about the world’s greatest detective. Instead, this bad boy is being used by our real-life heroes to fight crime.
It’s called Zebedee – a handheld laser scanner that generates 3D maps of all sorts of environments, from caves to factory floors, in the time it takes to walk through them.
The portable device works by sweeping a laser scanner mounted on a simple spring: as the operator walks, the scanner rocks back and forth, continuously sweeping its 2D scan plane through the environment to build up a 3D field of view. In fact, it can collect over 40,000 range measurements in just one second. It could even create a 3D map of the Batcave in around 20 minutes.
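The geometry behind that trick can be sketched in a few lines. This is a deliberately simplified, hypothetical version – the real Zebedee software solves a full simultaneous localisation and mapping problem, estimating the scanner’s complete pose over time – but it shows how tilting a 2D scan plane lifts flat measurements into 3D:

```python
import math

def scan_to_points(ranges, angle_min, angle_step, tilt):
    """Project one 2D laser scan into 3D given the scanner's tilt (radians).

    Simplified illustration only: the scan plane is rotated about the y-axis
    by `tilt`, so points that would otherwise all sit at z = 0 gain height.
    """
    points = []
    for i, r in enumerate(ranges):
        bearing = angle_min + i * angle_step      # direction within the scan plane
        x, y = r * math.cos(bearing), r * math.sin(bearing)
        points.append((x * math.cos(tilt), y, x * math.sin(tilt)))
    return points

# the same scan captured at two different tilts lands at different heights
flat = scan_to_points([2.0, 2.0], 0.0, 0.1, tilt=0.0)
nodding = scan_to_points([2.0, 2.0], 0.0, 0.1, tilt=0.3)
print(flat[0][2], nodding[0][2])   # z = 0.0 vs z ≈ 0.59
```

Because the spring keeps the tilt changing as the operator walks, successive scans cover different heights, and stitching them together yields a full 3D point cloud.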
But it’s never been used like this before.
For the first time, our real world crime fighters at the Queensland Police Service are using Zebedee to help piece together crime scene puzzles.
Crime scenes can be difficult to investigate. They’re often places like dense bushland, steep slopes or dangerous caves, which can make thorough sweeps of the scene both tough and time consuming. Using Zebedee, also known as ZEB1, police can now easily access these hard to reach places and map confined spaces where it may be difficult to set up bulky camera equipment and tripods. It also means less disturbance of the crime scene.
Using data collected by the scanner, police investigators can quickly recreate the scene on their computer in 3D, and view it from any angle they want. They can then locate and tag evidence to particular locations with pinpoint accuracy.
This Brisbane-born technology is now available commercially and this local application is a striking example of how 3D mapping can allow us to access locations and view angles previously out of our reach.
It will help our local detectives and crime fighters in the Police Service generate and pursue lines of enquiry for incidents like murder cases and car crashes. You could say that Zebedee puts the CSI in CSIRO.
We’re working on even more ways to adapt Zebedee for a range of other jobs that require 3D mapping, from security and emergency services to forestry and mining, even classroom learning.
Film makers may soon be able to use Zebedee technology to easily digitise actual locations and structures when creating animated worlds. Maybe the next Batcave we see at the movies will be more realistic than ever before, created by 3D mapping an actual cave.
The potential applications are endless.
By Ali Green and Sarah Klistorner
An estimated one million Australians have diabetes and this number is expected to double by 2025. About 60 per cent of these people will develop diabetes-related eye disease, such as diabetic retinopathy.
Diabetic retinopathy is one of the leading causes of irreversible blindness in Australian adults. The disease often has no early-stage symptoms and is four times more likely to affect Indigenous Australians.
Just imagine if this disease was preventable.
During the past few months, our researchers have been working with Queensland Health and the Indigenous and Remote Eye Health Service (IRIS) on the Torres Strait Islands to set up a remote eye screening service – giving hundreds of people access to specialist eye care.
For people living in remote areas, travelling a five-hour round trip for specialist medical care can be disruptive to their family and community. Transporting patients can also be expensive.
Our Remote-I system is saving patients from the long and sometimes unnecessary journey by utilising local clinicians to conduct routine 15 minute retinal screenings, often as part of scheduled health clinic visits. Our technology sends hi-res retinal images taken in the screenings to ophthalmologists in Brisbane via satellite broadband.
Previously, ophthalmologists could only fit in a limited number of eye screenings and surgeries when they visited remote communities. Once Remote-I is fully implemented, a city-based ophthalmologist will be able to screen up to 60 retinal images per week.
Preliminary results from a review of data collected at one location showed that only three out of 82 patients screened to that date had a sight-threatening condition and required an immediate referral. Previously, those other 79 patients not requiring referrals may have held up the queue while the specialist was visiting the remote community. With Remote-I, those who need immediate treatment or attention can already be first in line.
With only 900 practicing ophthalmologists in Australia, and a high demand for eye health services in remote locations, finding new ways to deliver health services to remote communities is vital to providing the best care when and where it’s needed.
By June 2014 the Tele-Eye Care trial will have screened 900 patients in remote WA and QLD. In addition to streamlining health care processes, the trial is collecting a lot of data.
And this is where the science gets interesting.
With patients’ consent, collected images will be used by the Tele-Eye Care project to study blood vessel patterns in retinas. Algorithms will then be designed to automatically detect particular eye diseases, which will aid diagnosis in routine screenings.
Even though tele-ophthalmology has been around for many years, this is the first time anyone has looked at image processing techniques to automatically detect eye defects in routine screening environments via satellite broadband.
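To give a feel for what such image-processing algorithms involve, here is a toy matched-filter sketch. This is a classic textbook approach to vessel detection, not the Tele-Eye Care project’s actual method, and the image is made up: blood vessels appear as dark, elongated ridges, so correlating the image with a small line-shaped template makes them stand out.

```python
import numpy as np

# Synthetic 9x9 "retinal patch": bright background with one dark vertical
# stripe standing in for a blood vessel.
img = np.ones((9, 9))
img[:, 4] = 0.2

# Zero-mean line template: dark centre column flanked by background,
# so flat image regions produce zero response.
kernel = np.full((3, 3), -1.0)
kernel[:, 1] = 2.0

# Slide the template over the inverted image (so *dark* lines score high).
h, w = img.shape
response = np.zeros((h - 2, w - 2))
for i in range(h - 2):
    for j in range(w - 2):
        response[i, j] = np.sum((1 - img[i:i+3, j:j+3]) * kernel)

# Keep only strong responses; the stripe at image column 4 shows up
# at response column 3 (offset by the template's 1-pixel border).
vessel_mask = response > 0.5 * response.max()
print(vessel_mask.any(axis=0).astype(int).tolist())  # → [0, 0, 0, 1, 0, 0, 0]
```

Real systems apply banks of such templates at many orientations and scales, but the principle – score each pixel by how vessel-like its neighbourhood looks – is the same.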
We’re working hard to deliver better health outcomes for Indigenous Australians. Being able to provide diagnoses on the spot will go a long way towards delivering faster, more cost-effective eye care services to the outback and preventing blindness.
This initiative is funded by the Australian Government.
Media contact: Ali Green +61 3 9545 8098
This week’s heatwave across southern Australia has reminded us of the serious dangers posed by grassfires. They might not sound that threatening, but these fires can travel at speeds of up to 25 kilometres per hour, and damage hundreds of hectares of land within a matter of hours.
So while many of us were enjoying our summer holidays, our team of fire scientists were hard at work with researchers and volunteers from Victoria’s Country Fire Authority (CFA) to help learn more about grassfire behaviour in Australia.
In a series of carefully designed, planned and monitored experiments, the research team lit controlled fires in grass fields near Ballarat, an hour west of Melbourne.
The aim was to safely gather new and thorough data about grassfire behaviour in different conditions. Experimental plots containing grasses at different stages of their life cycle were burned, while experts observed and various instruments measured things like the time it took for the fire to burn across the 40 x 40 metre plot.
Australian researchers have been looking into forest fires and bushfires for decades, but this is the first time in nearly 30 years that we’ve conducted research into grassfires.
Back in 1986 we ran similar experiments in the Northern Territory, which led to the development of our Grassland Fire Spread Meter. This tool is used by rural fire authorities across the country to predict the rate that a grassfire is likely to spread once it starts.
What remains unknown is at what stage of the grass’s life cycle it becomes a fire hazard, especially for the grass types found across southern Australia.
Today, we have technology like unmanned aerial vehicles (UAVs) to help gain a new, bird’s eye perspective of the fire’s progress, allowing us to analyse the burns with a whole new level of detail.
Controlled by our scientists, the robot quadcopters flew above the experimental burns, filming the fire as it spread through the grass. The vision, along with other data captured by thermal sensors, will be used to develop computer models that can replicate and predict grassfire behaviour.
The results will help fire authorities like the CFA better respond to grassfires, as well as improving how they determine Fire Danger Ratings and when to declare Fire Danger Periods in particular regions.
The timing of the experimental burns was critical. The crew waited until late December and early January for safe temperature and wind conditions, to ensure any fires they lit would be easy to control and contain. They also needed the grass curing levels to be just right. Thankfully, this was all wrapped up before we, and the grasslands, were hot and bothered by the heatwave.
Check out this video for the how and why of the experimental burns:
By Carrie Bengston
We all feel quite virtuous when we do our bit for the environment – whether it’s taking our own bags to the shops, sorting our recycling or leaving the car at home. But have you ever thought about doing the same for your computer? We have – and we’ve got a pretty certificate to prove it.
Forgive us for bragging, but our supercomputer in Canberra, ‘Bragg’, has recently been named the world’s 10th most energy-efficient supercomputer in the Green500 list, which ranks the energy use of supercomputers according to performance-per-watt.
Bragg was named after the Adelaide father-and-son physicists William Henry and William Lawrence Bragg, Australia’s first Nobel Prize winners. It handles our massive research data sets, does our complex computer modelling and simulates dynamic processes. This helps us make better decisions about things like water security, bushfire preparedness, materials analysis, and coastal water quality.
And it does all of this using less energy per MegaFLOP (a unit of computing work) than all but nine of the world’s fastest computers. But how?
Well, Bragg is a GPU cluster, which means it gets its speed from Graphics Processing Units (GPUs) initially designed for fancy graphics in computer games. A few years back, computer geeks realised GPUs could do more than render shoot ‘em up games or Minecraft: they could also perform many calculations in parallel, at a fraction of the price of traditional processors.
Our Bragg cluster has had three GPU upgrades during its four year life. It’s now over ten times faster and twice as energy efficient. This has kept it ‘up to speed’ to handle the workloads our scientists throw at it.
But it wasn’t that easy to get a top spot on the list. First we had to measure the number of double precision floating point operations per second (or FLOPS) by running a performance test known as the LINPACK benchmark.
Then we submitted this number (if you’re wondering, it was 167.5 TFLOPS, or 167.5×10¹² FLOPS) to the TOP500 list – where we ranked at number 260. This qualified us to submit to the Green500 list. To do this, we ran the LINPACK benchmark while measuring the power consumption on the Bragg cluster in the Canberra Data Centre.
So our 167.5 TFLOPS using 71.01 kW of power gives us 2,358.69 MFLOPS/Watt – that’s about 2.36 billion calculations per second for every watt of power. Confused?
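For anyone who wants to check the arithmetic, here it is in a few lines of Python (any small difference from the quoted 2,358.69 figure is just rounding in the published numbers):

```python
# Green500 efficiency = benchmark performance / power drawn during the run
flops = 167.5e12          # LINPACK result: 167.5 TFLOPS
power_watts = 71.01e3     # measured power draw: 71.01 kW

mflops_per_watt = flops / power_watts / 1e6
print(f"{mflops_per_watt:,.0f} MFLOPS/Watt")              # → 2,359 MFLOPS/Watt

# and the 100 watt light bulb comparison:
print(f"{mflops_per_watt * 1e6 * 100 / 1e9:.0f} billion "
      "calculations per second on 100 W")                 # → 236 billion ...
```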
To put it another way, if you have an old 100 watt light at home, Bragg can perform 236 billion calculations per second using that amount of power. Check it out in action:
Bragg isn’t our only cool supercomputer. The new iVEC Pawsey Centre in WA houses Australia’s most powerful supercomputer, used for incredibly data-intensive projects like the world’s largest radio telescope, the future Square Kilometre Array. It uses a recycled groundwater cooling system which will save around 38.5 million litres of water per year compared to traditional cooling methods.
December 1 – 6 is the 20th International Congress on Modelling and Simulation, MODSIM in Adelaide.
By James Davidson
Cylons, Skynet, HAL 9000, Agent Smith, Haley Joel Osment. With characters like these, it’s no wonder people are concerned about the intelligent machines of tomorrow. But is there really any reason to fear? The truth is artificial intelligence (AI) is proving to be quite helpful…at least so far.
Clever, self-learning computer systems are helping us tackle some of the world’s biggest problems – like how to predict bushfire hotspots. Unlike traditional methods, where our best guesses are subjective, intelligent computers can use machine learning to replicate events through advanced pattern recognition.
This month, our researchers revealed an AI system that could help us plan for future fires. It’s based on artificial neural networks (ANNs) which have actually been around since the 80s. These models allow computers to learn from data and provide a pretty accurate estimate of future events, eliminating many assumptions.
Today, ANNs are being used for deep cognitive imaging, an advanced form of pattern recognition. Based on this idea, our team of machine learning experts have built a deep cognitive learning system using ANNs to predict fire incidents across Australia. This could be the first step in providing information for emergency services planners to decide where to focus future firefighting resources.
So how does it work?
To put it simply, a computer is shown an image that represents a set of data. It’s then shown another image that has resulted from the first. The computer doesn’t know how or why the two are related, but it learns to estimate the outcome based on the first input. Essentially, the computer learns the cause-and-effect relationship so it can also predict the effect side of the equation in different scenarios.
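As a toy illustration of that training loop – with made-up data and a deliberately tiny network, not the team’s actual deep cognitive learning system – suppose the hidden “rule” is that fire occurs when temperature plus dryness crosses a threshold. The network is never told this rule; it only sees input–output pairs and adjusts its weights until its guesses match:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 2))                     # inputs: temperature, dryness (0-1)
y = ((X[:, 0] + X[:, 1]) > 1.2).astype(float).reshape(-1, 1)  # hidden rule

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one hidden layer of 8 units, trained by full-batch gradient descent
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr = 2.0

for _ in range(4000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    p = sigmoid(h @ W2 + b2)
    grad_p = (p - y) / len(X)                # cross-entropy gradient at output
    grad_h = grad_p @ W2.T * h * (1 - h)     # backpropagate to hidden layer
    W2 -= lr * h.T @ grad_p; b2 -= lr * grad_p.sum(0)
    W1 -= lr * X.T @ grad_h; b1 -= lr * grad_h.sum(0)

accuracy = ((p > 0.5) == (y > 0.5)).mean()
print(f"learned the hidden rule with {accuracy:.0%} training accuracy")
```

The real system works the same way in spirit, but with maps rather than pairs of numbers, far deeper networks, and real climate and satellite data.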
Our team trained their computer to learn the relationship between Australian climate maps and fire hotspots. To do this, they presented it with maps of Australia’s past climate using data from the Bureau of Meteorology. Next, they showed it maps of fire hotspots compiled from satellite imagery data collected by NASA.
The computer wasn’t told how the two maps were related, other than the fact that the first map resulted in the second. But, almost magically, it was able to use the ANN to learn how to reproduce the fire maps.
Then, they got even trickier. They showed the computer a scenario based on Australia’s climate between 2001 and 2010. It was able to replicate the real world occurrence of fire hotspots with 90 per cent accuracy at the 5 x 5 kilometre scale. Not bad!
It’s early days for this AI, but unlike the scary smart machines of film fiction, this work poses no threat to human life. Instead, it could go a long way towards saving lives by improving our understanding of how different climate scenarios impact fire regimes across Australia.
We’re also working on a suite of other smart tools for disaster management and recovery. Learn more in our media release.
November 27-28 is the Building a System of Systems for Disaster Management event in Victoria. We’re looking at how Australia’s key agencies can improve the way they access vital information during emergencies.