By James Davidson
Cylons, Skynet, HAL 9000, Agent Smith, Haley Joel Osment. With characters like these, it’s no wonder people are concerned about the intelligent machines of tomorrow. But is there really any reason to fear? The truth is artificial intelligence (AI) is proving to be quite helpful…at least so far.
Clever, self-learning computer systems are helping us tackle some of the world’s biggest problems – like how to predict bushfire hotspots. Unlike traditional methods, where our best guesses are subjective, intelligent computers can use machine learning to model events through advanced pattern recognition.
This month, our researchers revealed an AI system that could help us plan for future fires. It’s based on artificial neural networks (ANNs) which have actually been around since the 80s. These models allow computers to learn from data and provide a pretty accurate estimate of future events, eliminating many assumptions.
Today, ANNs are being used for deep cognitive imaging, an advanced form of pattern recognition. Based on this idea, our team of machine learning experts has built a deep cognitive learning system that uses ANNs to predict fire incidents across Australia. This could be a first step towards giving emergency services planners the information they need to decide where to focus future firefighting resources.
So how does it work?
To put it simply, a computer is shown an image that represents a set of data. It’s then shown another image that has resulted from the first. The computer doesn’t know how or why the two are related, but it learns to estimate the outcome based on the first input. Essentially, the computer learns the cause-and-effect relationship so it can also predict the effect side of the equation in different scenarios.
Our team trained their computer to learn the relationship between Australian climate maps and fire hotspots. To do this, they presented it with maps of Australia’s past climate using data from the Bureau of Meteorology. Next, they showed it maps of fire hotspots compiled from satellite imagery data collected by NASA.
The computer wasn’t told how the two maps were related, other than the fact that the first map resulted in the second. But, almost magically, it was able to use the ANN to learn how to reproduce the fire maps.
Then, they got even trickier. They showed the computer a scenario based on Australia’s climate between 2001 and 2010. It was able to replicate the real-world occurrence of fire hotspots with 90 per cent accuracy at the 5 x 5 kilometre scale. Not bad!
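For the curious, here’s a toy sketch of that learning idea in Python. It’s purely illustrative – a tiny single-layer network trained on made-up numbers, nothing like the team’s deep cognitive system – but the principle is the same: the model is shown pairs of “climate” inputs and “fire” outcomes, is never told why they’re related, and then predicts fire in a scenario it has never seen.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for the real data: each grid cell has two climate features
# (say, temperature and dryness) and a label saying whether it burned.
cells = 500
climate = rng.normal(size=(cells, 2))
true_w = np.array([2.0, 1.5])  # hypothetical hidden relationship
fire = (climate @ true_w + rng.normal(scale=0.5, size=cells) > 0).astype(float)

# A single-layer network (logistic regression) trained by gradient descent:
# it only ever sees (climate, fire) pairs, never the rule connecting them.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(climate @ w + b)))  # predicted fire probability
    w -= 0.5 * (climate.T @ (p - fire)) / cells
    b -= 0.5 * np.mean(p - fire)

# A "new scenario": fresh climate maps the model has never seen.
new_climate = rng.normal(size=(200, 2))
new_fire = (new_climate @ true_w > 0).astype(float)
pred = (1.0 / (1.0 + np.exp(-(new_climate @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == new_fire)
print(f"hold-out accuracy: {accuracy:.2f}")
```

The real system works on maps rather than single numbers per cell, but the training loop follows this same shape: nudge the weights until predicted outcomes match observed ones.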
It’s early days for this AI, but unlike the scary smart machines of film fiction, this work poses no threat to human life. Instead, it could go a long way towards saving lives by improving our understanding of how different climate scenarios impact fire regimes across Australia.
We’re also working on a suite of other smart tools for disaster management and recovery. Learn more in our media release.
November 27-28 is the Building a System of Systems for Disaster Management event in Victoria. We’re looking at how Australia’s key agencies can improve the way they access vital information during emergencies.
Six out of ten Australians don’t eat enough fibre, and even more don’t get the right combination of fibres.
Eating dietary fibre – food components (mostly derived from plants) that resist human digestive enzymes – is associated with improved digestive health. High fibre intakes have also been linked to reduced risk of several serious chronic diseases, including bowel cancer.
In Australia, we have a fibre paradox: even though our average fibre consumption has increased over the last 20 years and is much higher than in the United States and the United Kingdom, our bowel cancer rates haven’t dropped.
This is probably because we’re eating a lot of insoluble fibre (also known as roughage) rather than a combination of fibres that includes fermentable fibres, which are important for gut health.
The different types of fibre
Eating a combination of different fibres addresses different health needs. The NHMRC recommends adults eat between 25 and 30 grams of dietary fibre each day.
For convenience, dietary fibre can be broadly divided into types:
- Insoluble fibres or roughage promote regular bowel movements. Sources of insoluble fibre include wheat bran and high-fibre cereals, brown rice, and wholemeal breads.
- Soluble fibres slow digestion, lower plasma cholesterol levels, and even out glucose uptake to the blood. Sources of soluble fibre include oats, barley, fruits, and vegetables.
- Resistant starches contribute to health by feeding good bacteria in the large bowel, which improves its function and reduces risk of disease. Sources of resistant starch include legumes (lentils and beans), cold cooked potatoes or pasta, firm bananas, and whole grains.
Resistant starches are perhaps the least well known of the different types of fibre, but they may be the most important for human health.
International studies find a stronger association between starch consumption and reduced bowel cancer risk than between total dietary fibre intake and reduced risk.
Resistant starch provides a likely mechanism for this association because it promotes gut health through the short-chain fatty acids produced by good bacteria. The short-chain fatty acid butyrate is the preferred energy source for cells that line the large bowel.
If we don’t eat enough resistant starch, these good bacteria in our large bowel get hungry and feed on other things including protein, releasing potentially damaging products such as phenols (digestion products of aromatic amino acids) instead of beneficial short-chain fatty acids.
Eating more resistant starch protects the bowel from the damage associated with having a hungry microbiome. It can also prevent DNA damage to colon cells; such damage is a prerequisite for bowel cancer.
Consuming at least 20 grams a day of resistant starch is thought to promote optimal bowel health. This is almost four times more than a typical western diet provides; it’s the equivalent of eating three cups of cooked lentils.
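Those figures make the gap easy to quantify. Here’s a quick back-of-the-envelope calculation in Python, using only the numbers above (a 20 gram target, roughly a quarter of that in a typical western diet, and three cups of lentils to reach the target):

```python
# Figures from the article: ~20 g/day of resistant starch for optimal bowel
# health, about four times what a typical western diet provides, and roughly
# the amount in three cups of cooked lentils.
target_g = 20.0
typical_western_g = target_g / 4       # ~5 g/day in a typical western diet
lentils_g_per_cup = target_g / 3       # ~6.7 g per cup of cooked lentils

shortfall = target_g - typical_western_g
cups_needed = shortfall / lentils_g_per_cup
print(f"shortfall: {shortfall:.0f} g, about {cups_needed:.2f} cups of lentils")
```

In other words, someone on a typical western diet is around 15 grams short – a bit over two extra cups of cooked lentils a day, or an equivalent mix of legumes, whole grains and cooled starches.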
In the Australian diet, resistant starch comes mostly from legumes (beans), whole grains, and sometimes from cooked and cooled starches in dishes such as potato salad.
This is in stark contrast with other societies, such as India where legumes are a significant part of the diet, or South Africa where maize porridge is a staple often eaten cold.
Cooling starches allows the long chains of sugars that make them up to cross-link, which makes them resistant to digestion in the small intestine. This, in turn, makes them available to good bacteria in the large bowel.
A healthy digestive system is critical for good health, and fibre promotes digestive health. While most of us feel uncomfortable talking about our bowel movements, having an understanding of what is optimal in this department can help you adjust the amount of fibre in your diet.
There’s a wide array of bowel habits in the normal population, but many health experts agree that using tools such as the Bristol stool chart can help people understand what bowel movements are best. As usual with medical advice, if you’re concerned you should start a conversation with your doctor.
A high-fibre diet should give you a score of four or five on the Bristol stool chart, and less than four could indicate that you need more fibre in your diet. If you increase your fibre intake, you will also need to drink more fluids because fibre absorbs water.
But gut health is not as simple as just ensuring regular bowel motions. Australians are, on average, eating sufficient insoluble fibre, but not enough resistant starch, which promotes gut health by feeding good bacteria in the large bowel.
Resistant starches are fermentable carbohydrates, so you might wonder if eating more of them will increase flatulence. Farting is normal and the average number of emissions per day is twelve for men and seven for women, although that varies for both sexes from two to 30 emissions.
Nutritional trials have shown high-fibre intakes of up to 40 grams daily, including fermentable carbohydrates, don’t lead to significant differences in bloating, gas or discomfort, as measured by the Gastrointestinal Quality of Life Index.
Nonetheless, it’s sensible to increase your fibre intake over weeks and drink adequate water. You might change to a high-fibre breakfast cereal one week, change to a wholegrain bread the next, and gradually introduce more legumes over several weeks.
A slow increase will allow you and your good bacteria to adjust to the high-fibre diet, so that you aren’t surprised by changes in your bowel habits. The composition of bacteria in your large bowel will adjust to suit a high-fibre diet, and over weeks these changes will help you process more fibre.
Getting enough fibre is important, but getting the right combination of fibres is imperative for good digestive health.
Most people know that eating insoluble fibre improves regular bowel movements, but the benefits of soluble fibre in slowing glucose release and resistant starch in promoting beneficial bacteria are less well known. Including a variety of fibres in your diet will ensure you get the health benefits of all of them.
Remember the ol’ days of dial-up internet? When you got disconnected every time the phone rang and used up all your drive space to download one little file? Man, life was hard.
Luckily in the 90s our peeps came up with a little something called WiFi – and hallelujah all of our first world problems were solved.
Our inventors used the same mathematics that astronomers had initially applied to piece together faint radio waves from black holes, and the potential of WiFi became ‘patently’ clear. Today, its myriad applications have fundamentally changed how we think of and use technology in our daily lives. In fact, by the end of this year more than 5 billion devices will be connected using our patented WiFi technology. The discovery is one of our most successful inventions to date and is internationally recognised as a great Aussie science success story.
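The core obstacle indoors is multipath: radio waves bounce off walls and furniture and arrive as a smeared jumble of echoes. Below is a rough Python sketch of the frequency-domain idea behind modern WiFi – illustrative only, with made-up echo values, not the patented implementation – where a fast Fourier transform spreads data across many subcarriers so that an echo-blurred channel reduces to one simple correction per subcarrier:

```python
import numpy as np

rng = np.random.default_rng(0)

# A multipath channel: the receiver hears the direct signal plus weaker,
# delayed echoes (hypothetical amplitudes for illustration).
channel = np.array([1.0, 0.0, 0.45, 0.2])

n = 64  # subcarriers per symbol
data = rng.choice([-1.0, 1.0], size=n)  # simple BPSK symbols

# OFDM-style transmit: place symbols on subcarriers via an inverse FFT,
# then prepend a cyclic prefix longer than the channel's echo spread.
tx = np.fft.ifft(data)
cp = len(channel) - 1
tx_cp = np.concatenate([tx[-cp:], tx])

# The channel convolves the signal with its echoes.
rx_cp = np.convolve(tx_cp, channel)[: n + cp]

# Receive: drop the cyclic prefix and FFT back to the frequency domain.
rx = np.fft.fft(rx_cp[cp:])

# Thanks to the cyclic prefix, each subcarrier sees a single complex gain,
# so equalisation is just one division per subcarrier.
equalised = rx / np.fft.fft(channel, n)
recovered = np.sign(equalised.real)

print(np.array_equal(recovered, data))
```

The payoff is that a messy tangle of overlapping echoes, which would be hard to untangle in the time domain, becomes 64 independent, trivially-corrected channels in the frequency domain.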
This infographic explains how WiFi technology was created and how it actually works (click for full size):
While WiFi was developed as part of our previous ICT Centre and Radiophysics Research Division, our main wireless networks laboratory is now a part of our new Computational Informatics Research Division and has approximately 50 researchers located at our Marsfield site in Sydney.
These days, we are working with industry partners around the world on new challenges such as using wireless tracking tools to help improve the performance of athletes and ensuring the safety of miners, firefighters and emergency service personnel. We’re also helping farmers monitor soil fertility, crop growth and animal health by integrating wireless networks with centralised cloud computing.
Learn more about how we patented Wireless LAN technology.
Media: Dan Chamberlain. P: +61 2 9372 4491 M: 0477 708 849 Email: email@example.com
By Leon Rotstayn, Senior Principal Research Scientist, Marine and Atmospheric Research
Climate scientists have established a convincing case for the link between increasing concentrations of greenhouse gases and observed warming of the Earth since the 19th century. The Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) stated, “Human influence on the climate system is clear.
“This is evident from the increasing greenhouse gas concentrations in the atmosphere, positive radiative forcing, observed warming, and understanding of the climate system.” It also concludes that “it is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century.”
Aside from carbon dioxide, another human influence on climate comes from aerosols, which exert a cooling effect. Aerosols (that is, atmospheric particles, not propellants used in spray cans) have masked some of the warming that is caused by increasing greenhouse gases.
Without the masking effect of aerosols, global temperatures would have increased more than they have since the 19th century.
I recently led a study that examined the effects of declining aerosols in 21st century climate projections. We found that global warming is likely to accelerate in the next few decades, if the cooling influence of human-generated aerosols declines as predicted.
What are aerosols?
Aerosols are atmospheric particles, which have an overall cooling influence on climate by reflecting sunlight back into space. They also have an indirect effect by making clouds brighter; this further increases the reflection of sunlight back into space. Sources of human-generated aerosols include the use of fossil fuels and burning of vegetation.
Although human sources of aerosols are broadly similar to those of carbon dioxide, there is an important difference.
Emissions of carbon dioxide are intrinsically linked to the energy content of the fuel, so increasing energy use leads to increasing emissions of carbon dioxide. But aerosols are produced as a by-product of the combustion process, and in many cases there are technologies that can reduce emissions of aerosols (or gases that subsequently form aerosols in the atmosphere).
Because aerosols have harmful effects on human health and the environment, such technologies have been deployed in the industrialised world for some time. As long ago as the mid-1970s, emissions of sulfur dioxide from coal-fired power stations started to decline in Europe and North America, due to controls that were introduced to combat acid rain, which was destroying forests. These controls also had the effect of reducing sulfate, an aerosol that exerts cooling effects on climate.
More recently, authorities in China recognised the problems caused by aerosol pollution, and began to introduce emission controls, similar to those first seen in the industrialised world in the 1970s. Observations in China show that aerosol pollution peaked in 2006, and has started to decrease since then, despite continuing rapid economic growth. However, high levels of aerosol pollution are still causing serious concerns about health effects in China, suggesting that there is a strong need to further reduce emissions.
In many other developing countries, aerosol emissions are still increasing.
Over the last several years, climate modellers from many research centres carried out new climate projections, which provided input to the IPCC Fifth Assessment Report. These climate projections are driven by a range of scenarios (or “pathways”), which have different assumptions about changing levels of greenhouse gases.
A common feature of all the pathways is that aerosol emissions decline sharply during the 21st century. The projected decline is based on the assumption that once wealth per capita reaches a certain level in each country, there will be an increased focus on cleaner, healthier air.
In other words, it is assumed that during the 21st century the developing world will follow a path similar to the industrialised world, where aerosol emissions have declined in recent decades.
What happens to climate if aerosols decline?
Whereas increasing aerosols have masked global warming in the past, projected declines in aerosol emissions would unmask the warming effects of increasing greenhouse gases.
We are currently going through a transition. Until recently, aerosols have been acting like a “handbrake” on global warming. Over the next few decades, the decline of aerosols is expected to accelerate global warming, adding to the effects of increasing greenhouse gases.
Results from CSIRO climate modelling suggest that the extra warming effect from a decline in aerosols could be about 1 degree by the end of the century. But the size of this effect is very uncertain, so we compared the results from the CSIRO model with those from a range of international models.
We found that models with a stronger aerosol cooling effect in the 20th century tend to simulate greater warming in the 21st century. In other words, climate models with a stronger aerosol masking effect also have a stronger unmasking effect as aerosols decline.
Understanding aerosol effects is one of the biggest challenges for climate scientists. Aerosol processes are highly complex, and the magnitude of aerosol cooling effects in today’s climate is uncertain.
Every aerosol plume contains a mind-boggling soup of different chemical species; some of these (most notably black carbon) actually exert warming effects on climate, partly offsetting the cooling effects of other species such as sulfate. It is also unclear whether aerosols will really decline as rapidly as assumed in the projections.
Aerosols present an intriguing policy challenge. Concerns about toxic effects of aerosols on health and the environment provide strong reasons to reduce their emissions. But a uniform reduction in aerosol emissions is expected to accelerate global warming.
Based on this research, scientists have suggested that selectively reducing black carbon emissions is a possible option for mitigating global warming that will also have important health benefits.
Leon Rotstayn receives funding from the Australian Government Department of the Environment through the Australian Climate Change Science Programme.
More climate news on our Climate Response blog.
What do ants, Darwin and Texas have in common? Why, it’s Fulbright Scholar Israel Del Toro.
Born and raised in Texas and currently studying at the University of Massachusetts, Israel was given the opportunity to work with us in Darwin because of his expertise in ant ecology.
He created statistical and geographical models to predict how our ant communities might react to regional climate change. This information will help us conserve habitats and species across different ecosystems.
Sadly, even our little ants aren’t immune to the warming climate. Around 25 per cent of species in Israel’s study showed major declines in their range and could possibly face extinction as their habitats change over the next 65 years.
Our Darwin lab (informally known as the centre for ants) was the perfect location for Israel to carry out his research. Here we hold the world’s most extensive collection of Australian ants with over 5,000 different species – now that’s something to brag about.
“Working with ants is what got me hooked on ecology research. But ant diversity in the US is quite small compared to the wealth of species found in Australia. So for me, coming here to expand on my research interests was a logical next step in my career.”
Israel has just returned to America to finish his PhD. He plans on defending his dissertation early next year and wants to start a postdoc soon afterwards.
“This year has really opened up new doors for me. Doing research and remote fieldwork in the Top End has been amazing. There’s nothing quite like accessing field sites in helicopters in places like Kakadu National Park.”
For more information on careers at CSIRO, follow us on LinkedIn.
By Carrie Bengston
For us, Movember isn’t just about blokes growing facial hair and raising funds for men’s health – it’s a chance to collect data and muck around with technology.
Computer fluid dynamicist Fletcher Woolard is more used to animating geophysical flows like tsunamis and landslides. But this month, he thought he’d try something a little different – animating mo growth.
Using his skills in computer simulation, he photographed his growing mo day by day, millimetre by millimetre, follicle by follicle – and turned the images into a cool time lapse video.
In just four seconds, you can see Fletcher’s facial hair growing at around 400,000 times the normal speed.
Unfortunately his efforts were still a long way off Ram Singh Chauhan, who has spent over thirty years crafting an impressive 4.29 metre long moustache. But hey, it’s not bad for less than a month’s growth.
Perhaps not surprisingly, this isn’t our first attempt at capturing hair growth data.
Back in 2008, our image analysis team developed software to count and measure hair regrowth. It was designed to test the effectiveness of hair removal products more accurately – which is traditionally a manual (and pretty boring) process.
The software took digital images from a specifically designed scanner pressed on to the skin, and used smart algorithms to automatically look for the hairs. Despite initial interest from several hair replacement studios, it sadly never made it to product stage.
But all is not lost. Luckily in today’s world of mobile wireless technology, there’s an app for that – the mo tracker.
For more information or to get involved in Movember, head to Movember Australia.
By Michael Brünig, Deputy Chief, Computational Informatics
There isn’t a radio-control handset in sight as a nimble robot briskly weaves itself in and out of the confined tunnels of an underground mine.
Powered by ultra-intelligent sensors, the robot intuitively moves and reacts to the changing conditions of the terrain, entering areas unfit for human testing. As it does so, the robot transmits a detailed 3D map of the entire location to the other side of the world.
While this might read like a scenario from a George Orwell novel, it is actually a reasonable step into the not-so-distant future of the next generation of robots.
A recent report released by the McKinsey Global Institute predicts that new technologies such as advanced robotics, the mobile internet and 3D printing could contribute between US$14 trillion and US$33 trillion to the global economy per year by 2025.
Technology advisory firm Gartner also recently released a report predicting the “smart machine era” to be the most disruptive in the history of IT. This trend includes the proliferation of contextually aware, intelligent personal assistants, smart advisers, advanced global industrial systems and the public availability of early examples of autonomous vehicles.
If the global technology industry and governments are to reap the productivity and economic benefits of this new wave of robotics, they need to act now to identify simple yet innovative ways to disrupt their current workflows.
The automotive industry is already embracing this movement by discovering a market for driver assistance systems that includes parking assistance, autonomous driving in “stop and go” traffic and emergency braking.
In August 2013, Mercedes-Benz demonstrated how their “self-driving S Class” model could drive the 100-kilometre route from Mannheim to Pforzheim in Germany. (Exactly 125 years earlier, Bertha Benz drove that route in the first ever automobile, which was invented by her husband Karl Benz.)
The car they used for the experiment looked entirely like a production car and used most of the standard sensors on board, relying on vision and radar to complete the task. Similar to other autonomous cars, it also used a crucial extra piece of information to make the task feasible – it had access to a detailed 3D digital map to accurately localise itself in the environment.
When implemented at scale, these autonomous vehicles could significantly benefit governments by reducing the number of accidents caused by human error, and by easing traffic congestion, since autonomous cars remove the need for tailgating laws that require vehicles to maintain large gaps between each other.
In these examples, the task (localisation, navigation, obstacle avoidance) is either constrained enough to be solvable or can be solved with the provision of extra information. However, there is a third category, where humans and autonomous systems augment each other to solve tasks.
This can be highly effective, but it requires a human remote operator or, depending on real-time constraints, a human on stand-by.
The question arises: how can we build a robot that can navigate complex and dynamic environments without 3D maps as prior information, while keeping the cost and complexity of the device to a minimum?
Using as few sensors as possible, a robot needs to be able to get a consistent picture of its environment and its surroundings to enable it to respond to changing and unknown conditions.
This is the same question researchers faced at the dawn of robotics research, and it was addressed in the 1980s and 1990s through work on spatial uncertainty. However, the decreasing cost of sensors, the increasing computing power of embedded systems and the ready availability of 3D maps have reduced the urgency of answering this key research question.
In an attempt to refocus on this central question, we – researchers at the Autonomous Systems Laboratory at CSIRO – tried to stretch the limits of what’s possible with a single sensor: in this case, a laser scanner.
In 2007, we took a vehicle equipped with laser scanners facing to the left and to the right and asked if it was possible to create a 2D map of the surroundings and to localise the vehicle to that same map without using GPS, inertial systems or digital maps.
The result was the development of our now-commercialised Zebedee technology – a handheld 3D mapping system incorporating a laser scanner that sways on a spring, capturing millions of detailed measurements of a site as fast as an operator can walk through it.
While the system does add a simple inertial measurement unit, which helps to track the position of the sensor in space and supports the alignment of sensor readings, the overall configuration still maximises information flow from a very simple, low-cost setup.
It achieves this by moving the smarts away from the sensor and into the software to compute a continuous trajectory of the sensor, specifying its position and orientation at any time and taking its actual acquisition speed into account to precisely compute a 3D point cloud.
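As a rough illustration of that principle – a simplified 2D toy with made-up numbers, not CSIRO’s actual algorithm – the sketch below keeps a continuous trajectory of the sensor, interpolates its pose at each laser beam’s exact timestamp, and projects the range measurement into the world frame to build up a point cloud:

```python
import numpy as np

# Hypothetical trajectory keyframes: time, x, y, heading (radians).
# The real system estimates a continuous 3D trajectory; here we just
# linearly interpolate a 2D one between a few known poses.
keyframes = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, np.pi / 2],
    [2.0, 1.0, 1.0, np.pi / 2],
])

def pose_at(t):
    """Interpolate position and heading along the trajectory at time t."""
    x = np.interp(t, keyframes[:, 0], keyframes[:, 1])
    y = np.interp(t, keyframes[:, 0], keyframes[:, 2])
    theta = np.interp(t, keyframes[:, 0], keyframes[:, 3])
    return x, y, theta

def beam_to_world(t, rng_m, bearing):
    """Project one laser beam (range, bearing in the sensor frame) into the map."""
    x, y, theta = pose_at(t)
    return (x + rng_m * np.cos(theta + bearing),
            y + rng_m * np.sin(theta + bearing))

# Beams fired while the sensor is moving: each gets its own interpolated pose,
# so the map stays consistent even though no two beams share a viewpoint.
beams = [(0.0, 2.0, 0.0), (0.5, 2.0, 0.0), (2.0, 2.0, 0.0)]  # (t, range, bearing)
cloud = np.array([beam_to_world(*b) for b in beams])
print(cloud.round(2))
```

The point is the division of labour: the sensor itself stays dumb and cheap, and all the smarts live in software that accounts for exactly where the sensor was at each instant of acquisition.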
The crucial step of bringing the technology back to the robot still has to be completed. Imagine what is possible when you remove the barrier of using an autonomous vehicle to enter unknown environments (or actively collaborating with humans) by equipping robots with such mobile 3D mapping technologies. They can be significantly smaller and cheaper while still being robust in terms of localisation and mapping accuracy.
From laboratory to factory floor
A specific area of interest for this robust mapping and localisation is the manufacturing sector – the aviation industry, for example – where non-static environments are becoming more and more common. Cost and complexity for each device has to be kept to a minimum to meet these industry needs.
With a trend towards more agile manufacturing setups, the technology enables lightweight robots that are able to navigate safely and quickly through unstructured and dynamic environments like conventional manufacturing workplaces. These fully autonomous robots have the potential to increase productivity in the production line by reducing bottlenecks and performing unstructured tasks safely and quickly.
The pressure of growing global competition means that if manufacturers do not find ways to adopt these technologies soon, they risk losing their business, as competitors will soon be able to produce and distribute goods more efficiently and at lower cost.
It is worth pushing the boundaries of what information can be extracted from very simple systems. New systems which implement this paradigm will be able to gain the benefits of unconstrained autonomous robots but this requires a change in the way we look at the production and manufacturing processes.
This article is an extension of a keynote presented at the robotics industry business development event RoboBusiness in Santa Clara, CA on October 25 2013.