By Carrie Bengston
We all feel quite virtuous when we do our bit for the environment – whether it’s taking our own bags to the shops, sorting our recycling or leaving the car at home. But have you ever thought about doing the same for your computer? We have – and we’ve got a pretty certificate to prove it.
Forgive us for bragging, but our supercomputer in Canberra, ‘Bragg’, has recently been named the world’s 10th most energy-efficient supercomputer in the Green500 list, which ranks the energy use of supercomputers according to performance-per-watt.
Bragg was named after Adelaide father-and-son physicists William Henry Bragg and William Lawrence Bragg, Australia’s first Nobel Prize winners. It handles our massive research data sets, does our complex computer modelling and simulates dynamic processes. This helps us make better decisions about things like water security, bushfire preparedness, materials analysis, and coastal water quality.
And it delivers more calculations per watt of power than all but 9 of the world’s fastest computers. But how?
Well, Bragg is a GPU cluster, which means it gets its speed from Graphics Processing Units (GPUs) originally designed for fancy graphics in computer games. A few years back, computer geeks realised GPUs could do more than handle images for shoot ‘em up games or Minecraft. They could also perform thousands of calculations in parallel, at a fraction of the price of traditional processors.
Our Bragg cluster has had three GPU upgrades during its four-year life. It’s now over ten times faster and twice as energy efficient. This has kept it ‘up to speed’ to handle the workloads our scientists throw at it.
But it wasn’t that easy to get a top spot on the list. First we had to measure the number of double precision floating point operations per second (or FLOPS) by running a performance test known as the LINPACK benchmark.
Then we submitted this number (if you’re wondering, it was 167.5 TFLOPS, or 167.5 × 10¹² FLOPS) to the TOP500 list – where we ranked at number 260. This qualified us to submit to the Green500 list. To do this, we ran the LINPACK benchmark again while measuring the power consumption of the Bragg cluster in the Canberra Data Centre.
So our 167.5 TFLOPS using 71.01 kW of power gives us 2,358.69 MFLOPS/Watt – that is, about 2.36 billion calculations per second for every watt of power. Confused?
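If you’d like to check the arithmetic yourself, here it is in a few lines of Python, using the figures quoted above (the tiny difference from the published 2,358.69 comes down to rounding in the inputs):

```python
# Reproducing the Green500 efficiency figure for Bragg (numbers from the article).
rmax_tflops = 167.5            # LINPACK performance, in teraflops
power_kw = 71.01               # measured power draw, in kilowatts

flops = rmax_tflops * 1e12     # floating-point operations per second
watts = power_kw * 1e3         # power in watts

mflops_per_watt = flops / watts / 1e6
print(f"{mflops_per_watt:.2f} MFLOPS/W")   # ≈ 2358.82 MFLOPS/W
```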
To put it another way, if you have an old 100 watt light at home, Bragg can perform 236 billion calculations per second using that amount of power. Check it out in action:
Bragg isn’t our only cool supercomputer. The new iVEC Pawsey Centre in WA is Australia’s largest supercomputer, used for incredible data-intensive projects like the world’s largest radio telescope, the future Square Kilometre Array. It uses a recycled groundwater cooling system which will save around 38.5 million litres of water per year compared to traditional cooling methods.
December 1 – 6 is the 20th International Congress on Modelling and Simulation, MODSIM in Adelaide.
By James Davidson
Cylons, Skynet, HAL 9000, Agent Smith, Haley Joel Osment. With characters like these, it’s no wonder people are concerned about the intelligent machines of tomorrow. But is there really any reason to fear? The truth is artificial intelligence (AI) is proving to be quite helpful…at least so far.
Clever, self-learning computer systems are helping us solve some of the world’s biggest problems – like how to predict bushfire hotspots. Unlike traditional methods, where our best guesses are subjective, intelligent computers can use machine learning to replicate events based on advanced pattern recognition.
This month, our researchers revealed an AI system that could help us plan for future fires. It’s based on artificial neural networks (ANNs) which have actually been around since the 80s. These models allow computers to learn from data and provide a pretty accurate estimate of future events, eliminating many assumptions.
Today, ANNs are being used for deep cognitive imaging, an advanced form of pattern recognition. Based on this idea, our team of machine learning experts have built a deep cognitive learning system using ANNs to predict fire incidents across Australia. This could be the first step in providing information for emergency services planners to decide where to focus future firefighting resources.
So how does it work?
To put it simply, a computer is shown an image that represents a set of data. It’s then shown another image that has resulted from the first. The computer doesn’t know how or why the two are related, but it learns to estimate the outcome based on the first input. Essentially, the computer learns the cause-and-effect relationship so it can also predict the effect side of the equation in different scenarios.
Our team trained their computer to learn the relationship between Australian climate maps and fire hotspots. To do this, they presented it with maps of Australia’s past climate using data from the Bureau of Meteorology. Next, they showed it maps of fire hotspots compiled from satellite imagery data collected by NASA.
The computer wasn’t told how the two maps were related, other than the fact that the first map resulted in the second. But, almost magically, it was able to use the ANN to learn how to reproduce the fire maps.
Then, they got even trickier. They showed the computer a scenario based on Australia’s climate between 2001 and 2010. It was able to replicate the real world occurrence of fire hotspots with 90 per cent accuracy at the 5 x 5 kilometre scale. Not bad!
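To make the idea concrete, here’s a toy sketch in Python – emphatically not CSIRO’s actual model – of a small neural network learning a hidden relationship between paired “climate” and “hotspot” grids. The data, the network size and the hidden rule are all invented purely for illustration:

```python
import numpy as np

# Toy sketch only: a small neural network learns the hidden relationship
# between paired input and output "maps". All data here is synthetic.
rng = np.random.default_rng(0)

n_cells = 25                      # a flattened 5 x 5 grid
X = rng.random((200, n_cells))    # synthetic "climate" maps
Y = (X > 0.7).astype(float)       # hidden rule: hotspots where the climate value is high

W1 = rng.normal(0, 0.5, (n_cells, 32))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (32, n_cells))   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(3000):             # full-batch gradient descent
    H = sigmoid(X @ W1)           # hidden activations
    P = sigmoid(H @ W2)           # predicted "fire map"
    dZ2 = (P - Y) / len(X)        # cross-entropy gradient at the output logits
    dH = (dZ2 @ W2.T) * H * (1 - H)
    W2 -= H.T @ dZ2
    W1 -= X.T @ dH

# The network was never told the rule, only shown input/output pairs.
accuracy = ((P > 0.5) == Y).mean()
print(f"cell-level accuracy: {accuracy:.0%}")
```

The real system works on far richer data at a continental scale, but the principle is the same: the network recovers the cause-and-effect relationship from examples alone.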
It’s early days for this AI, but unlike the scary smart machines of film fiction, this work poses no threat to human life. Instead, it could go a long way towards saving lives by improving our understanding of how different climate scenarios impact fire regimes across Australia.
We’re also working on a suite of other smart tools for disaster management and recovery. Learn more in our media release.
November 27-28 is the Building a System of Systems for Disaster Management event in Victoria. We’re looking at how Australia’s key agencies can improve the way they access vital information during emergencies.
Remember the ol’ days of dial-up internet? When you got disconnected every time the phone rang and used up all your drive space to download one little file? Man, life was hard.
Luckily in the 90s our peeps came up with a little something called WiFi – and hallelujah all of our first world problems were solved.
Using the same mathematics that astronomers initially applied to piece together the waves from black holes, the potential of WiFi became ‘patently’ clear to its inventors. Today, its myriad applications have fundamentally changed how we think of and use technology in our daily lives. In fact, by the end of this year more than 5 billion devices will be connected using our patented WiFi technology. It is one of our most successful inventions to date and is internationally recognised as a great Aussie science success story.
This infographic explains how WiFi technology was created and how it actually works (click for full size):
While WiFi was developed as part of our previous ICT Centre and Radiophysics Research Division, our main wireless networks laboratory is now a part of our new Computational Informatics Research Division and has approximately 50 researchers located at our Marsfield site in Sydney.
These days, we are working with industry partners around the world on new challenges such as using wireless tracking tools to help improve the performance of athletes and ensuring the safety of miners, firefighters and emergency service personnel. We’re also helping farmers monitor soil fertility, crop growth and animal health by integrating wireless networks with centralized cloud computing.
Learn more about how we patented Wireless LAN technology.
Media: Dan Chamberlain. P: +61 2 9372 4491 M: 0477 708 849 Email: email@example.com
By Carrie Bengston
For us, Movember isn’t just about blokes growing facial hair and raising funds for men’s health – it’s a chance to collect data and muck around with technology.
Computer fluid dynamicist Fletcher Woolard is more used to animating geophysical flows like tsunamis and landslides. But this month, he thought he’d try something a little different – animating mo growth.
Putting his technical skills to work, he photographed how his mo was growing – day by day, millimetre by millimetre, follicle by follicle – and turned it into a cool time lapse video.
In just four seconds, you can see Fletcher’s facial hair growing at around 400,000 times the normal speed.
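A quick back-of-envelope check shows how much real-world growth those four seconds of video represent:

```python
# Four seconds of video at roughly 400,000x speed-up covers...
video_seconds = 4
speed_up = 400_000
real_days = video_seconds * speed_up / 86_400   # 86,400 seconds in a day
print(f"{real_days:.1f} days of growth")        # ≈ 18.5 days
```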
Unfortunately his efforts were still a long way off Ram Singh Chauhan, who has spent over thirty years crafting an impressive 4.29 metre long moustache. But hey, it’s not bad for less than a month’s growth.
Perhaps not surprisingly, this isn’t our first attempt at capturing hair growth data.
Back in 2008, our image analysis team developed software to count and measure hair regrowth. It was designed to test the effectiveness of hair removal products more accurately – which is traditionally a manual (and pretty boring) process.
The software took digital images from a specially designed scanner pressed on to the skin, and used smart algorithms to automatically detect the hairs. Despite initial interest from several hair replacement studios, it sadly never made it to product stage.
But all is not lost. Luckily in today’s world of mobile wireless technology, there’s an app for that – the mo tracker.
For more information or to get involved in Movember, head to Movember Australia.
By Michael Brünig, Deputy Chief, Computational Informatics
There isn’t a radio-control handset in sight as a nimble robot briskly weaves itself in and out of the confined tunnels of an underground mine.
Powered by ultra-intelligent sensors, the robot intuitively moves and reacts to the changing conditions of the terrain, entering areas unfit for human testing. As it does so, the robot transmits a detailed 3D map of the entire location to the other side of the world.
While this might read like a scenario from a George Orwell novel, it is actually a reasonable step into the not-so-distant future of the next generation of robots.
A recent report released by the McKinsey Global Institute predicts that new technologies such as advanced robotics, mobile internet and 3D printing could have an economic impact of between US$14 trillion and US$33 trillion globally per year by 2025.
Technology advisory firm Gartner also recently released a report predicting the “smart machine era” to be the most disruptive in the history of IT. This trend includes the proliferation of contextually aware, intelligent personal assistants, smart advisers, advanced global industrial systems and the public availability of early examples of autonomous vehicles.
If the global technology industry and governments are to reap the productivity and economical benefits from this new wave of robotics they need to act now to identify simple yet innovative ways to disrupt their current workflows.
The automotive industry is already embracing this movement by discovering a market for driver assistance systems that includes parking assistance, autonomous driving in “stop and go” traffic and emergency braking.
In August 2013, Mercedes-Benz demonstrated how their “self-driving S Class” model could drive the 100-kilometre route from Mannheim to Pforzheim in Germany. (Exactly 125 years earlier, Bertha Benz drove that route in the first ever automobile, which was invented by her husband Karl Benz.)
The car they used for the experiment looked entirely like a production car and used most of the standard sensors on board, relying on vision and radar to complete the task. Similar to other autonomous cars, it also used a crucial extra piece of information to make the task feasible – it had access to a detailed 3D digital map to accurately localise itself in the environment.
When implemented at scale, these autonomous vehicles have the potential to significantly benefit governments by reducing the number of accidents caused by human error, and by easing traffic congestion, since automated cars will no longer need to maintain the large following gaps that tailgating laws currently enforce.
In these examples, the task (localisation, navigation, obstacle avoidance) is either constrained enough to be solvable or can be solved with the provision of extra information. However, there is a third category, where humans and autonomous systems augment each other to solve tasks.
This can be highly effective, but it requires a human remote operator or, depending on real-time constraints, a human on stand-by.
The question arises: how can we build a robot that can navigate complex and dynamic environments without 3D maps as prior information, while keeping the cost and complexity of the device to a minimum?
Using as few sensors as possible, a robot needs to be able to get a consistent picture of its environment and its surroundings to enable it to respond to changing and unknown conditions.
This is the same question that faced us at the dawn of robotics research, and it was addressed in the 1980s and 1990s with techniques for dealing with spatial uncertainty. Since then, however, the decreasing cost of sensors, the increasing computing power of embedded systems and the ready availability of 3D maps have reduced the pressure to answer this key research question.
In an attempt to refocus on this central question, we – researchers at the Autonomous Systems Laboratory at CSIRO – tried to stretch the limits of what’s possible with a single sensor: in this case, a laser scanner.
In 2007, we took a vehicle equipped with laser scanners facing to the left and to the right and asked if it was possible to create a 2D map of the surroundings and to localise the vehicle to that same map without using GPS, inertial systems or digital maps.
The result was the development of our now commercialised Zebedee technology – a handheld 3D mapping system incorporating a laser scanner that sways on a spring to capture millions of detailed measurements of a site as fast as an operator can walk through it.
While the system does add a simple inertial measurement unit which helps to track the position of the sensor in space and supports the alignment of sensor readings, the overall configuration still maximises information flow from a very simple and low cost setup.
It achieves this by moving the smarts away from the sensor and into the software to compute a continuous trajectory of the sensor, specifying its position and orientation at any time and taking its actual acquisition speed into account to precisely compute a 3D point cloud.
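To illustrate the general idea (this is a simplified sketch, not the actual Zebedee software), the snippet below interpolates the sensor’s pose at each measurement’s timestamp and uses it to place that measurement into a world-frame point cloud. For brevity it tracks heading (yaw) only, where the real system estimates full 3D orientation:

```python
import numpy as np

def yaw_matrix(theta):
    """Rotation about the z-axis for a sensor heading (yaw only, for brevity)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def to_point_cloud(times, local_points, traj_t, traj_pos, traj_yaw):
    """Transform sensor-frame points into the world frame.

    times        : timestamp of each range measurement
    local_points : (N, 3) points in the sensor frame
    traj_t/pos/yaw : sampled continuous trajectory (time, world position, heading)
    """
    cloud = []
    for t, p in zip(times, local_points):
        # Linearly interpolate the pose at the instant the point was measured
        pos = np.array([np.interp(t, traj_t, traj_pos[:, i]) for i in range(3)])
        yaw = np.interp(t, traj_t, traj_yaw)
        cloud.append(yaw_matrix(yaw) @ p + pos)
    return np.array(cloud)

# Example: a sensor moving 1 m along x between t=0 and t=1, without rotating
traj_t = np.array([0.0, 1.0])
traj_pos = np.array([[0, 0, 0], [1, 0, 0]], dtype=float)
traj_yaw = np.array([0.0, 0.0])
pts = to_point_cloud([0.0, 0.5, 1.0],
                     np.zeros((3, 3)),   # the sensor origin itself
                     traj_t, traj_pos, traj_yaw)
print(pts)   # points land at x = 0.0, 0.5 and 1.0
```

Because the trajectory is continuous in time, every individual measurement gets its own pose – which is what lets a swaying, handheld scanner produce a coherent 3D point cloud.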
The crucial step of bringing the technology back to the robot still has to be completed. Imagine what is possible when you remove the barrier of using an autonomous vehicle to enter unknown environments (or actively collaborating with humans) by equipping robots with such mobile 3D mapping technologies. They can be significantly smaller and cheaper while still being robust in terms of localisation and mapping accuracy.
From laboratory to factory floor
A specific area of interest for this robust mapping and localisation is the manufacturing sector, where non-static environments are becoming more and more common – the aviation industry, for example. Cost and complexity for each device has to be kept to a minimum to meet these industry needs.
With a trend towards more agile manufacturing setups, the technology enables lightweight robots that are able to navigate safely and quickly through unstructured and dynamic environments like conventional manufacturing workplaces. These fully autonomous robots have the potential to increase productivity in the production line by reducing bottlenecks and performing unstructured tasks safely and quickly.
The pressure of increasing global competition means that if manufacturers do not find ways to adopt these technologies soon, they risk losing their business, as competitors will soon be able to produce and distribute goods more efficiently and at lower cost.
It is worth pushing the boundaries of what information can be extracted from very simple systems. New systems which implement this paradigm will be able to gain the benefits of unconstrained autonomous robots but this requires a change in the way we look at the production and manufacturing processes.
This article is an extension of a keynote presented at the robotics industry business development event RoboBusiness in Santa Clara, CA on October 25 2013.
My nine year old son has lived his whole life in a house that doesn’t have a cable connected to a telephone – not to mention to a laptop or mouse. This is largely because wireless technology has given us the freedom to live life wirelessly using devices like laptops, TVs and smartphones. And its popularity just keeps on growing. It’s estimated that there will be 5.8 million WiFi hotspots across the globe by 2015 and 800 million WiFi–enabled households by 2016.
Soon, wireless devices will be everywhere in our daily lives, measuring and optimising things we never thought possible – and we won’t even know it. Think high definition 3D video that streams seamlessly to tiny wireless devices without having to worry about signal strength, coverage or network congestion. Whether you want to park your car without driving, feed your dog while you’re on holidays or program your fridge to automatically add milk to your shopping list when you’re running low – all of this is possible with wireless technology.
Today, we’re taking wireless technology out of our labs and working with a range of partners to solve important problems. Our wireless ad-hoc system for positioning (WASP) technology is helping improve the performance of athletes and ensuring the safety of miners, firefighters and emergency service personnel. We’re also helping farmers monitor soil fertility, crop growth and animal health by integrating wireless networks with centralised cloud computing.
In twenty years’ time who knows how far we could get. Maybe all cars on the road will be tracked, integrated and controlled over wireless links. Just think – no more traffic jams or crashes. I bet Rhonda and Ketut will be happy with that.
With the increasing popularity and growth of wireless technology for business, residential and mobile users, there’s a big demand for new research and development in the future. And we’re chuffed to be at the forefront of these exciting changes.
Check out our infographic showing how we’re using Wi-Fi technologies today (click image for full-size):
Learn more about our work in wireless networks.
Iain Collings presented a keynote on the future of wireless research at The Australasian Telecommunications Networks and Applications Conference on Wednesday 20th November.
Media: Dan Chamberlain. P: +61 2 9372 4491 M: 0477 708 849 Email: firstname.lastname@example.org
By Flo Conway-Derley
Today marks the official operational launch of the iVEC Pawsey Centre — Australia’s newest supercomputer facility in Perth, Western Australia.
Supercomputing resources at iVEC’s Pawsey Centre will be available for data-intensive projects across the scientific spectrum, including radio astronomy, geosciences, biotechnology and nanotechnology.
In particular, a significant portion of the supercomputing power will be dedicated to processing radio-astronomy data from major facilities, such as our Australian SKA Pathfinder telescope.
ASKAP will need the processing power of the Pawsey Centre to crunch some serious data. When fully operational, about 250 terabytes per day will stream from the telescope, data which the supercomputer will need to process in more-or-less real time.
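A quick back-of-envelope calculation shows what that daily volume means as a sustained data rate:

```python
# What sustained throughput does 250 terabytes per day imply?
tb_per_day = 250
bytes_per_day = tb_per_day * 1e12
seconds_per_day = 24 * 60 * 60
rate_gb_s = bytes_per_day / seconds_per_day / 1e9
print(f"{rate_gb_s:.1f} GB/s sustained")   # ≈ 2.9 GB/s
```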
In 2009, our astronomers used the Australia Telescope Compact Array (ATCA) in Narrabri to create a picture of the galaxy Centaurus A. This galaxy is large — around 200 times the size of the full moon — and it took more than 1200 hours of data collection and 10,000 hours of computing to make the image.
With ASKAP and the Pawsey Centre, the image would take around ten minutes.
Some other interesting facts about the Pawsey Centre:
- It was named after Dr Joseph Pawsey, who is widely acknowledged as the father of Australian radio astronomy.
- It houses a supercomputer able to exceed one quadrillion operations every second, or one “petaflop”.
- It has 40 petabytes of storage capacity — that’s 40,000,000,000,000,000 bytes, or around 8.5 million DVDs.
- Once finished, the Centre will house 20 tonnes of computer equipment and 400 km of fibre optic cable within the 1000 sqm building.
- Half of the Pawsey Centre’s floorspace has been earmarked for the computing needs of the future international Square Kilometre Array (SKA) telescope project.
- A groundwater cooling system, developed by our CSIRO Geothermal Project, is used to cool the supercomputer, rather than water towers. This system brings water up from groundwater bores 140m deep and cycles 90 litres a second to cool the machine.
- The amount of water saved by using a groundwater cooling system to cool the Pawsey Centre supercomputer is equivalent to the amount of drinking water consumed in South Perth.
Note: CSIRO, as centre agent for iVEC, has led the development of the Pawsey Centre. It owns and maintains the building, which is constructed on CSIRO-owned land adjacent to the Australian Resources Research Centre facility. For more information about iVEC, visit http://www.ivec.org
After 3,372 days of stress and studying, it’s time for year twelves to ditch the books and celebrate.
Over 28,000 school leavers will take to the Gold Coast this week for the annual Schoolies Festival. And while so-called ‘toolies’ might be unwelcome guests at this party, there’s a different kind of tool that’s more than welcome.
It’s called the Patient Admission Prediction Tool (PAPT) - and it could make a world of difference to those who party a little too hard.
PAPT allows staff in emergency departments to see what their patient load will be like in the next hour, the rest of the day, the next week, or during busy periods like Schoolies week.
The technology tells staff how many patients will be coming through the door, what they’ll be suffering from and how serious their condition will be.
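As a toy illustration of the general approach (not the actual PAPT model, which is far more sophisticated), a forecaster could start from simple historical averages of admissions by hour of day:

```python
from collections import defaultdict

# Toy sketch: predict the coming hour's patient load as the historical
# average admission count for that hour of day. All figures invented.
def hourly_forecast(history):
    """history: list of (hour_of_day, admission_count) records."""
    totals, counts = defaultdict(int), defaultdict(int)
    for hour, n in history:
        totals[hour] += n
        counts[hour] += 1
    return {h: totals[h] / counts[h] for h in totals}

past = [(22, 8), (22, 12), (23, 15), (23, 17), (9, 3)]
forecast = hourly_forecast(past)
print(forecast[22], forecast[23])   # 10.0 16.0
```

A real system layers on much more than averages – day of week, local events like Schoolies, and the expected mix and severity of presentations.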
Gold Coast Health are using PAPT to plan the staff, medical supplies and beds needed to care for Schoolies. It will also help them manage waiting times for our other patients who are still arriving with other serious injuries.
Perhaps not surprisingly, PAPT has predicted that most hospital admissions during Schoolies will be from alcohol intoxication.
We can also expect to see cuts and bruises to the head and feet, sprains to hands, wrists and ankles, asthma attacks, reactions to severe stress, drug poisoning and a few broken thumbs, toes and noses.
So while we have the tools to treat these injuries more efficiently, schoolies are still encouraged to be responsible and look after their mates.
Learn more about our work in health services.
Media: Sarah Klistorner, P: +61 2 9372 4662, M: +61 477 716 031, Sarah.Klistorner@csiro.au
By Ian Oppermann, Director, Digital Productivity and Services National Research Flagship
Australia has low unemployment, a high standard of living, fabulous beaches and great weather. We don’t have tight, large scale coupling between our creative types and our industry types. At least not in the ICT sector.
Many multinationals have their own creative centres, their own “dreamers”, and are typically very good at bringing the dreamers together with people and structures who implement their dreams (the “doers”), and finding the financial resources (the “dollars”) to make it all happen.
There are also some excellent examples of home-grown innovation getting this right, but only a few who have made it big – it is the exception rather than the rule.
We traditionally think of ourselves as a country of miners and farmers, of rugged individuals. Our folklore is jumbucks and billabongs rather than computer nerds to the rescue. Many still believe we live in a “dig it up and ship it” or a “shear it, shoot it, ship it” economy. In practice, we live in a services economy, much of which is digital. More than 70 per cent of our GDP is services based and this is growing.
Continuing improvements in communications networks and technology will profoundly impact many aspects of the economy through “tele” (tele-work, tele-health, tele-education). Innovation in digitally enabled services will dramatically change which services are delivered, improve quality of life, as well as drive productivity improvements.
The downside is that access to a globally connected world brings challenges in terms of the potential for a digital divide (digital “haves” and “have nots”), creation of unprecedented services competition (other countries selling to us), challenges for privacy and identity security, and increasing reliance on increasingly vulnerable systems. What we cannot do is sit back and let it just happen.
We have no shortage of ideas and clever people. Australia is considered a world leader in radio communications and in a number of high-impact science areas, but we also have an ageing population, many industry sectors with declining growth, a health sector that is now our biggest employer, and exposure to natural disasters including floods, droughts and bushfires. We can develop solutions to help ourselves, and services to export to the rest of the world. But the challenge always is achieving scale. The ICT finalists are inspiring examples of dreamers bringing their ideas forward. If you are a member of the doers, have dollars or can bring the data, strike up a conversation with our dreamers.
We all know the Melbourne Cup as the race that stops the nation; bringing together punters, public holiday enthusiasts and those of us who like to don a fancy hat.
One day shy of the race, we’re bringing an iconic Thoroughbred back together.
For many years Phar Lap, Australia’s most famous horse, has been in pieces. Phar Lap’s heart—currently preserved in a jar of formaldehyde—is at the National Museum of Australia in Canberra, his hide is at the Melbourne Museum, and his skeleton is at the Museum of New Zealand.
Today, students from three schools around the country will see all the pieces of Phar Lap at once, without leaving the classroom.
This is where the robot comes in. Our Museum ‘bot, who lives at the National Museum of Australia, is driving the reunion of Phar Lap’s pieces across the Tasman with the help of an expert guide from the National Museum of Australia.
Logging in through their classroom smart board or computers, students will control their own view of Phar Lap’s heart using the 360 degree panoramic camera on the robot’s head. They can ask the museum guide questions and can click on items in the exhibit to bring up images and more information. Phar Lap’s other pieces, his hide and skeleton, will be seen on the same screen as fast broadband hooks the students up with museums in Canberra, Melbourne and New Zealand simultaneously.
“While the classroom sweep can be a bit of fun on Melbourne Cup day, we are giving students a much richer cultural and educational experience that they’ll hopefully remember for a long time,” said Robert Bunzli, manager of the museum robot program at the National Museum.
“The students absolutely love hearing about animals and the part they have played in Australian history. Horses are a particular favourite of course, and most of them have heard of Phar Lap but don’t know anything about him.”
And from what the students have told us, they agree. “The tour was good and we enjoyed seeing the massive heart. It was better than flying to Canberra,” a student said. And our favourite: “OMG whoever invented the robot is a genius”.
Our robotics expert, Dr Jonathan Roberts, says the museum robot has shown the combination of immersive learning technology and fast broadband can deliver educational experiences to students no matter where they live.
“We are now looking to extend the application of our mobile telepresence system into other areas including remote training, retail, mining and manufacturing industries. At the moment we are investigating how this system could be used to remotely deliver health services such as providing specialist services to regional and remote communities, conducting medical training or facilitating remote ward rounds.”
* * *
Media: Sarah Klistorner, Communications Advisor. +61 2 9372 4662, Sarah.Klistorner@csiro.au
By Lidija Bosnjak
It’s been nearly 40 years since Star Wars Episode IV hit our screens – but sadly humans haven’t yet evolved to develop Jedi powers.
That’s not to say we aren’t trying.
You might remember the 2002 movie Minority Report. This futuristic flick made what’s known as ‘3D gesture control’ famous in the modern world. This technology is now a popular feature of many computer games that are controlled using body movements.
But it’s not all fun and games. The technology can also be applied in the medical field. It can help researchers design enhanced therapeutics and better understand their function by allowing them to visually observe and manipulate 3D structures of molecules like proteins.
Unfortunately this technology can be quite challenging for researchers to use. It often requires them to learn complex keyboard and mouse combinations so they can rotate the structures, select parts of large molecules and redefine the centre of rotation.
So we’ve teamed up with the Garvan Institute of Medical Research to develop the Molecular Control Toolkit – a new system that makes it more intuitive to control 3D molecular structures.
The Toolkit allows users to manipulate 3D objects in their hand by connecting multiple gesture and voice recognition devices (currently the LeapMotion or Kinect). It’s designed to be adaptable to any molecular graphics system and even supports voice commands when available. Check it out in action.
It’s the first toolkit for molecular graphics that supports multiple devices with a set of controls that are rich enough to be used in the daily work of many life scientists. And it’s pushing the boundaries of biological data visualisation and interaction.
The online retail sphere is rapidly growing, with more people using their smart phones and tablets to make everyday purchases.
But we’re taking online shopping to a completely new level by teaming up with online eyewear retailer Glasses.com.
Using our Smart Vision technology, Glasses.com has developed a new iPad app that allows customers to virtually try on specs from the comfort of their own home.
The Smart Vision software allows the app to produce a true-to-life 180° view of the user’s face showing how each pair fits in 3D. It also lets them compare side-by-side images of each style. Check it out:
Because the technology places a 3D model of the glasses over a 3D rendering of the user’s face, shoppers can even reposition the glasses, tapping the screen to ‘slide’ frames up and down the bridge of their nose just as they would in real life.
Until now, technology for virtually trying on glasses has been pretty simplistic, looking more like online gaming graphics than a photo-realistic experience.
Recognising the growing need for a more accurate try-on service, Glasses.com approached our Digital Productivity Flagship to create a 3D face scanning method that could utilise the camera on a smart phone or tablet.
“We didn’t want to just replicate the offline shopping experience – we wanted to improve it. We’ve created a tool that’s so true-to-life it’s not only fun, but actually useful in making a purchasing decision,” said Jonathan Coon, CEO of Glasses.com’s parent company 1-800 CONTACTS.
Unfortunately Aussie shoppers might have to wait a while before the app hits the Australian iTunes store. But if you happen to have a US iTunes account, try it out for yourself and let us know what you think.
More info on our website.
Media: Daniel Chamberlain, Daniel.Chamberlain@csiro.au T: +61 2 9372 4491 M: +61 477 708 849
For the past two days I’ve been stuck in bed sick with a horrendous cough. By the end of day two I’m over it. I contemplate the 5 minute drive down the road to the doctor’s surgery, but the thought of sitting in a room full of other sick people with god-knows-what bugs floating around kind of creeps me out.
If only, just like last week’s restaurant reservation booked online or that Skype call with my friend overseas, I could schedule an online tele-consultation with my doctor without leaving the comfort of my own bed. I admit it might seem like a Gen Y problem, but why isn’t a visit to the doctor easier in this day and age? Surely the idea of a video-conference with my GP to assess my signs and symptoms is better than trusting Google and coming up with my own self-diagnosis, right? It might just be a common cold or perhaps something more serious, but within a few minutes at least I’d know whether I really needed to take that trip to a germ-infested waiting room for some antibiotics.
But what if I lived in a remote community in Western Australia, where my 5 minute drive to the doctor’s could actually be a 4 or even 6 hour drive? Now that’s a long way to go for a consultation! I’d have to take a whole day off work just to visit a doctor, and in the end they might tell me I’m fine after all and I’d have wasted a whole day driving. And what if my flu was something more serious, such as an eye disease like glaucoma or diabetic retinopathy?
As we face the pressure of rising costs and demands on our health system due to an ageing population and an increase in chronic disease, we need to think differently about how we deliver health services, says Dr Sarah Dods, Leader of Health Services for CSIRO’s Digital Productivity and Services Flagship.
“We are currently spending 20 cents in every tax dollar on health and that is forecast to increase to 40 cents in every dollar by 2043. At that stage health will consume our entire state government budgets if we don’t change the way that we do things.”
Telehealth services offer us an opportunity to do old things in new ways and new things in ways we never thought of. They can help us to responsibly improve productivity, improve access to health services so that escalating health issues can be addressed earlier, and offer better quality of care to patients.
Telehealth services, made possible by the arrival of fast broadband across Australia, can deliver many health services, especially into remote communities: reducing the need for travel, providing timely access to services and specialists, improving the ability to identify developing conditions, and providing a means to educate, train and support remote healthcare workers.
They can also reduce the burden on our health system by helping hospital ‘frequent flyers’, such as chronic disease sufferers and the elderly, who accounted for over 70% of Australia’s $103.6bn health expenditure during 2007–08, to manage their conditions from home.
Video-based teleconsultations are now available to patients in some locations, but this is the tip of the iceberg in terms of what broadband can deliver.
At CSIRO we are helping to make the next generation of telehealth services a reality. As part of the NBN Enabled Telehealth Pilots Program, administered by the Department of Health and Ageing (DoHA) and funded by the Department of Broadband, Communications and the Digital Economy, CSIRO has been awarded two large grants to conduct the clinical trials needed to inform the wide-scale roll out and integration of telehealth services for Australia. The trials will run for 12 months and involve over 1100 patients across the two studies, in rural health clinics, hospitals, local health care districts, nursing homes and patients’ own homes across Australia.
“We are very excited to be at the helm of these projects, bringing together the best minds across CSIRO from health services research, computer science, mathematics, statistics and social science, to work with our partners on Australia’s largest telehealth study,” said Dr Dods.
“Our team is also looking forward to working on our NBN-enabled Indigenous Tele-Eye Care project, testing our Remote-I system over a satellite broadband service. We will show how this telehealth service can be rolled out to multiple rural and remote areas and help address the difficulties these Indigenous communities face in accessing specialist eye care services.”
Find out more about our NBN telehealth research trials on our website.
Today our latest robots come to life. The stars of our Museum Robot project, B1 and B2, will use telepresence technology to roam the galleries of the National Museum of Australia. Using high speed broadband, the robots will allow remote visitors to control their own view of museum exhibits while interacting with a museum educator.
“The Museum Robot is a fantastic initiative and a perfect example of some of the applications made possible by the NBN. This kind of rich and interactive experience, nationally accessible, depends on the type of synchronous communication made possible by high speed broadband,” Senator Conroy said.
The robot has a motorised base with wheels, a touch-screen display, and a ‘head’ that is a 360 degree panoramic camera. It also houses several on-board computers and Wi-Fi antennas. The robot accompanies an educator around the Museum, applying its navigational and sensing capabilities to plan its route and avoid obstacles and pedestrians.
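The robots’ actual navigation software hasn’t been published, but the core idea of planning a route that avoids obstacles can be illustrated with a breadth-first search over a grid map of the gallery floor. This is a hypothetical sketch, not the Museum Robot’s code; `plan_route` and the sample floor plan are illustrative:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest route on a gallery floor plan via breadth-first search.

    grid: 2D list where 0 = free floor and 1 = obstacle (display case, wall).
    start, goal: (row, col) cells. Returns a list of cells, or None if
    the goal can't be reached.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}          # also marks cells as visited
    while frontier:
        cell = frontier.popleft()
        if cell == goal:               # walk parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and step not in came_from):
                came_from[step] = cell
                frontier.append(step)
    return None                        # goal unreachable

# A tiny floor plan: the 1s form a display case the robot must skirt around.
floor = [[0, 0, 0],
         [1, 1, 0],
         [0, 0, 0]]
route = plan_route(floor, (0, 0), (2, 0))
print(route)  # → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

A real robot would replan continuously as its sensors pick up pedestrians, but the static version above shows the planning step in isolation.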
The trial is being conducted at the National Museum’s Landmarks Gallery, which features national treasures such as Phar Lap’s heart and the Holden Prototype No 1, the original Holden motor car. During the trial, the robot will be accessible via schools and libraries with an NBN connection.
Have you ever sat in a hospital emergency department? Imagine a busy waiting area filled with people waiting to be treated, sick children and the elderly, broken bones and patients wheeled in on ambulance trolleys needing urgent surgery. How do we make sure that everyone can be seen and treated within four hours and that there is a hospital bed ready and waiting for those that need it?
For the majority of Australian patients visiting a public hospital, the average waiting time in an emergency department is around 8 hours and 28 minutes, but in some cases can be as high as 15 hours or more.
As our population ages, chronic disease becomes more prevalent and the cost of healthcare approaches 40 cents in every Australian tax dollar over the next 30 years, we are faced with the urgent challenge of improving the efficiency and delivery of our healthcare services.
Overcrowding in hospitals is one of the biggest challenges facing our healthcare system and has been labelled an ‘international crisis’. It can have a significant impact on the quality of patient care and experience.
To address this issue, the National Emergency Access Targets (NEAT for short), introduced by the Federal Government in 2011, require public hospitals to ensure that, by 2015, 90% of all patients arriving at emergency departments are seen and either admitted, discharged or transferred within four hours.
This week is Australian Healthcare Week and Dr Sarah Dods, Research Leader for Health Services in our Digital Productivity and Services Flagship, is launching a whitepaper called ‘Evidence based strategies for meeting hospital emergency targets’ at the Healthcare Efficiency Through Technology conference in Sydney.
Dr Dods will be discussing how our patient flow modelling tools can help hospitals understand what they can do to cut emergency waiting times to meet these new emergency targets and improve hospital efficiency.
This could include analysis of bed configuration, patient flow and the effects of overcrowding, as well as predicting which patients might be readmitted and when there may be an influx of patients due to a strong flu season, for example. Together, these analyses provide a whole-of-system view of hospital operations that helps hospitals meet emergency targets.
One example already assisting hospitals is our Patient Admission Prediction Tool (PAPT), which we have been using to reduce waiting times and identify bottlenecks in Queensland hospitals by predicting how many patients will turn up in emergency departments, and when.
Contrary to the belief that emergency patient volumes are unpredictable, the number of admissions per day can be predicted with remarkable accuracy. PAPT uses historical data to provide an accurate prediction of not only the expected patient load but their medical urgency and specialty, and how many will be admitted and discharged.
The system allows hospital management to accurately forecast demand for inpatient and emergency department beds well in advance, enabling hospitals to manage beds and schedule elective surgeries for quieter times. The software also lets on-the-ground staff see how many patients are likely to arrive in the next hour, the rest of the day, into next week, or even over holidays with varying dates such as Easter.
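PAPT’s statistical models are far more sophisticated than this, but the basic idea of forecasting from historical records can be sketched as averaging past arrivals for the same time slot. Everything here is hypothetical and illustrative, not PAPT itself:

```python
from collections import defaultdict

def forecast_arrivals(history, weekday, hour):
    """Average past arrivals for a given weekday/hour slot.

    history: list of (weekday, hour, arrivals) records from past logs.
    Returns 0.0 when the slot has never been observed.
    """
    slots = defaultdict(list)
    for wd, hr, n in history:
        slots[(wd, hr)].append(n)
    counts = slots.get((weekday, hour))
    return sum(counts) / len(counts) if counts else 0.0

# Three past Monday-9am records and one Saturday-night record (made up).
history = [("Mon", 9, 12), ("Mon", 9, 14), ("Mon", 9, 13), ("Sat", 23, 25)]
print(forecast_arrivals(history, "Mon", 9))  # → 13.0
```

A production forecaster would also model urgency, specialty and moving holidays such as Easter, as the post describes; the point of the sketch is simply that regular historical patterns make daily admissions predictable.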
Check out PAPT in action in this story on Channel Ten news.
Transcript available here
To find out more, visit the health services webpage or read the new whitepaper Evidence driven strategies for meeting hospital performance targets.