Why Aussie farmers should start sowing earlier than Anzac Day


Anzac Day is a time for honouring our soldiers, eating Anzac bikkies and enjoying a couple of cold bevvies while watching the footy. For Australian wheat farmers, Anzac Day also marks the start of the sowing season, with late April through to May long accepted as the optimal time for sowing wheat in Australia.

But now our research is questioning this common logic. In fact, a team of our agricultural scientists is now recommending sowing earlier: any time from early April onwards.

They’ve been trialling early sowing around southern Australia, and the results are staggering. Including early sown wheat in cropping programs increased average yields by 13 to 47 per cent across all regions.

Traditionally, Anzac Day is seen as the first day of the sowing season, but our researchers are recommending farmers get an earlier start for bigger yields.

Why would sowing earlier lead to higher yields?

Rainfall is critical for establishing the mid- to fast-maturing wheat varieties currently popular with Australian farmers, and autumn’s historically reliable sowing rains allow cereal crops to flower at the best possible time. That timing is vital to yield and, in turn, to a farmer’s profit.

But a changing climate, declining autumn rains and more extreme spring weather means that conventional sowing times are no longer ideal. Farmers waiting until Anzac Day to sow may miss the best opportunity to get the highest yield.

So we’ve been looking at ways to do things differently.

While our grain researchers were pondering the challenge of maximising farm water efficiency (research that won them a Eureka Prize in 2014), they began considering crop-sowing strategies that would use the increasing summer and early autumn rainfall to establish wheat crops earlier. After all, the idea is to get as much of the wheat crop as possible to flower during the optimal period.

And it’s not enough for farmers to simply start sowing their crops earlier, because early sowing of the currently popular fast-maturing wheat varieties presents a different problem: fast-maturing wheat matures, well… fast. Sown in early autumn, it will flower right about the time night air starts dropping below a frosty two degrees Celsius, making frost damage during the flowering stage likely.

Winter wheats have an in-built cold requirement that stops them from developing too early – perfect for earlier sowing

Frost damage reduces grain quality and yield, so the team of researchers needed a way around this risk. The answer is the rarely used, slow-maturing ‘winter wheat’, whose cold requirement keeps it from flowering too early.

Farming is an art, and knowing the optimal flowering window is key to getting the best yield. So it isn’t enough for farmers simply to start sowing in early April; for the best results we recommend combining crops – beginning with winter wheat sowing in early April, then staggering the regularly used mid- to fast-maturing varieties at ten-day intervals.
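To make that staggering concrete, here’s a minimal sketch of such a sowing program in Python (the start date, variety names and 2015 season are illustrative assumptions, not the researchers’ actual schedule):

```python
from datetime import date, timedelta

def sowing_schedule(start, varieties, interval_days=10):
    """Stagger one sowing per variety, a fixed number of days apart."""
    return [(start + timedelta(days=i * interval_days), variety)
            for i, variety in enumerate(varieties)]

# Winter wheat goes in first, in early April; the regularly used
# mid- and fast-maturing varieties follow at ten-day intervals.
for sow_date, variety in sowing_schedule(
        date(2015, 4, 6), ["winter wheat", "mid-maturing", "fast-maturing"]):
    print(sow_date.isoformat(), variety)
```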

This is great news for farmers: not only do they produce a bigger yield of wheat and improve their farm’s sustainability and profitability, but with the pressure off to sow their entire allocation around April 25, they may even find time to take part in the dawn service and watch the Anzac Day clash!

You can read more about Dr Hunt’s research here: Optimising grain yield and grazing potential of crops across Australia’s high rainfall zone: a simulation analysis

This research is funded by the Grains Research and Development Corporation (GRDC)


Imported berries and Hepatitis A: a complex issue with no simple solutions

Food safety begins at the source. Harvesting canola near Binalong, NSW.

By Dr Narelle Fegan and Dr Andrew Leis 

Keeping our food safe

The recent outbreak of hepatitis A, which is thought to be associated with the consumption of frozen berries, has highlighted food safety concerns and sparked debates around country of origin labelling and testing of imported foods. Ensuring the safety of our food supply can be a complex process that involves maintaining good hygienic practices in the production and handling of foods at all stages between the farm and consumption.

With some foods, we can reduce the risk of foodborne illness through a heating process, which includes practices like cooking, canning and pasteurisation. However, with fresh produce (such as leafy greens and fruits), a heating step is less desirable – we have to rely more on hygienic production to deliver a safe food product. There are quality assurance schemes in place to ensure that fresh produce is grown, harvested, packed and transported to limit contamination by foodborne pathogens.

These schemes rely on people involved in all parts of the production chain following the procedures outlined, to deliver a product that is safe to consume. Such quality assurance schemes operate across the globe and imported products are required to meet the same hygienic standards as food produced in the importing country.

Can testing of food ensure it is safe to eat?

Microbiological testing of foods is only one aspect of quality assurance schemes designed to help keep our food safe. Scientific evidence and history tell us that testing products for pathogens is not an efficient way of determining whether food has been contaminated.

This is particularly true of pathogens that occur very rarely in food (such as hepatitis A), as only a very small amount of the food will be contaminated, and we can’t guarantee we will sample the portion of food where contamination occurs. The difficulties associated with pathogen testing of foods include:

  • Contamination is not evenly distributed within the food and only certain portions of the food may contain the pathogen.
  • Testing for foodborne viruses destroys the portion of food that is tested.
  • Because the food is destroyed during testing, not all of the food can be tested – there would be nothing left for us to eat. Only some of the food can be tested, but sampling plans have been developed to try to maximise the chances of detecting foodborne pathogens (see the sketch after this list).
  • Testing methods for foodborne viruses in particular can be difficult to perform, as we have to try and isolate viruses and their genetic material from foods which are often very complex in nature (containing fats, sugars and salts, which can all make it more difficult to detect pathogens).
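To see why rare contamination so easily slips past testing, consider a back-of-the-envelope sampling calculation (our own illustration in Python, assuming contamination strikes packages independently – not a figure from any specific sampling standard): if a fraction p of packages is contaminated and n are tested, the chance of catching at least one is 1 − (1 − p)^n.

```python
def detection_probability(prevalence, n_samples):
    """Chance that at least one of n tested packages is contaminated,
    assuming each package is contaminated independently with the
    given prevalence."""
    return 1 - (1 - prevalence) ** n_samples

# If 1 package in 1,000 is contaminated, testing 100 packages
# finds a problem less than 10 per cent of the time.
print(round(detection_probability(0.001, 100), 3))  # 0.095
```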

For these reasons, testing of food is only one part of quality assurance schemes, with more attention focused on hygienic production to limit the opportunities for food to become contaminated with pathogens.

Image credit: Pixabay.com

Why would frozen berries be at risk of carrying hepatitis A?

Freezing is a highly effective and convenient way to increase the shelf-life of foods, and unlike heat-based sterilisation techniques, it preserves most of the nutritional value of the food (some components, like vitamins, are quickly destroyed by heat). Freezing not only prolongs shelf life but also allows us to enjoy very seasonal products, such as berries, at any time of the year.

Preservation of viruses and bacteria during freezing is affected by the rate of freezing and the amounts of sugars and other molecules nearby that help to slow the growth of ice crystals. In a household freezer, water freezes quite slowly – consider the time it takes to freeze water in ice cube trays. Slow freezing favours the formation of ice crystals. As the crystals grow in size, they can kill some bacteria and viruses. On the other hand, high local concentrations of sugars and other molecules can protect the microorganisms from damage.

Frozen berries are generally safe to eat, and have only occasionally been involved in foodborne outbreaks. Hepatitis A virus infection as a result of eating contaminated food (not just berries) is also very rare, particularly in Australia, where on average only five cases a year are associated with food consumption. This is very small compared with other foodborne pathogens in Australia, such as norovirus, with an estimated 276,000 cases a year associated with food, or the bacterium Campylobacter, with 179,000 cases associated with the consumption of food.

CSIRO Health Bites infographic: Be Food Safe

What can we do to ensure the food we eat is safe?

It is not possible to ensure safety by testing a final product. Therefore, systems based on hazard analysis and identification of critical control points have been developed and adopted by governments and food producers through food regulations, industry guidelines and quality assurance schemes. However, human error through poor planning or poor execution can lead to one or more failures along the supply chain.

The best thing we can do to ensure the food we eat is safe is to foster a culture of food safety. This means better educating all those involved in the food industry, as well as governments and consumers, so that they understand the safety risks associated with the production, manufacture and consumption of foods. Food safety needs to be seen as an investment, not a cost.

For more information, visit our website.


How we’re using science to turn wastewater into wine

It’s a ‘grape’ day for turning wastewater into wine.

Bill Gates caused a stir recently by drinking a glass of water that had, only five minutes earlier, been human waste.

No, Billionaire Bill hadn’t lost a dare. He was actually showcasing his faith in the latest wastewater processing technology – technology that could, if utilised properly, go a long way towards solving the global issue of access to clean drinking water.

Though, it’s not just drinking water that’s in the picture. Imagine that, instead of sipping from a glass of water, Bill was quaffing a Barossa Valley red produced from a vineyard irrigated with treated wastewater. It’s an entirely possible scenario (although we’re not sure how often Bill visits Tanunda).

For many, reconditioned wastewater is taboo for consumption, but as Bill so prominently demonstrated, wastewater processing technology is a viable way of both hydrating our planet AND reducing waste.

When affluence meets effluence. Photo credit: Screenshot via thegatesnotes

Which is why we’ve been working with some of Australia’s leading wineries to prove that wastewater can play an important role in wine production.

In a recently released report – Sustainable recycled winery water irrigation – we demonstrate how wineries could reuse their wastewater to safely irrigate their crops. Not only would the reuse of wastewater result in cost savings and better environmental practices, but it could even improve the quality and yield of the crops themselves.

The (Adelaide) hills are alive with the sound of wine growing. Photo credit: CSIRO

Our lead scientist on the report, Dr Anu Kumar, and her team developed the guidelines after rigorous field, laboratory and glasshouse trials with participating wineries in the Barossa Valley, Riverina and McLaren Vale regions.

Anu and her team looked at the options for the reuse of wastewater on the vineyards – irrigation, evaporation and disposal – and found that, on the whole, irrigation was the most sustainable.

The study found that wastewater containing less than 60 mg per litre of sodium, 1,250 mg per litre of potassium, and 625–1,084 mg per litre of sodium plus potassium (in combination) was safe for application on grapevines. Of particular interest, the nutrients and organic matter in winery wastewater can even enhance soil productivity, increasing crop growth and yield.
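As a quick illustration of how such thresholds might be screened (a minimal sketch in Python using only the figures quoted above; real assessments cover many more parameters, and wineries should work from the published guidelines and their regulator’s conditions):

```python
def within_quoted_limits(na_mg_per_l, k_mg_per_l, combined_limit_mg_per_l):
    """Screen a wastewater sample against the thresholds quoted above.

    The combined sodium-plus-potassium limit is passed in because the
    guideline quotes a range rather than a single figure; real
    assessments also cover salinity, pH and regulatory conditions.
    """
    return (na_mg_per_l < 60                      # sodium threshold (mg/L)
            and k_mg_per_l < 1250                 # potassium threshold (mg/L)
            and na_mg_per_l + k_mg_per_l < combined_limit_mg_per_l)

# Hypothetical samples, screened against each end of the quoted combined range:
print(within_quoted_limits(45, 500, combined_limit_mg_per_l=625))   # True
print(within_quoted_limits(45, 900, combined_limit_mg_per_l=1084))  # True
```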

In fact, some of the participating wineries were so satisfied with the results that they have begun implementing our guidelines themselves.

But Anu and her team have been upfront in explaining this isn’t a one size fits all solution. For instance, wastewater can also increase soil salinity, which is bad news for healthy soil.

“It really isn’t a one-approach method,” said Anu. “Individual wineries need to discuss how they use wastewater with experts, to ensure that the guidelines – as well as the strict regulatory conditions – are being adhered to.”

When it comes to wastewater, there is much to consider.

Dr Kumar and her research team will continue to work with their partners at the University of Adelaide and the Australian Grape and Wine Authority (AGWA) to share these findings with other wineries around Australia.

In a country like Australia that is so susceptible to drought conditions and water shortages, it’s important that we find more efficient and sustainable ways to use what can be such a scarce resource.

Now, to get Bill down to the Barossa for that glass of red…

In conjunction with Dr Kumar and her team, the Australian Grape and Wine Authority has published a useful resource kit which includes more information about winery wastewater management and recycling.


Staying MouseAlert, not MouseAlarmed, this Christmas

Sure, he might look cute, but this fella and his friends can cause a whole lot of trouble when they get together.

By Leon Braun 

“’Twas the night before Christmas, when all thro’ the house
Not a creature was stirring, not even a mouse …”

CSIRO scientists are keeping their eyes peeled for more than just Santa Claus this Christmas. With unusually high numbers of mouse sightings in Victoria this spring, CSIRO ecologist Peter Brown and colleagues at various Australian and New Zealand research agencies are monitoring mouse populations to see whether 2015 will bring a sigh of relief or send people scurrying for cover under a deluge of tiny, furry bodies.

While taken individually, mice can be rather cute (think Mickey, Mighty and Danger), en masse they can be absolutely devastating. In 1993, Australia’s worst ever mouse plague caused an estimated $96 million worth of damage, destroyed thousands of hectares of crops, blighted piggeries and ravaged poultry farms. The whiskered marauders chewed their way through rubber and electrical insulation, damaged farm vehicles, ruined cars and buildings. Another plague in 2010/11 was almost as bad, affecting 3 million ha of crops in NSW’s central west and the Riverina, as well as parts of Victoria and South Australia.

Eww.

This photo is named ‘Mice Mound’. There’s not much more we can add to that.

Along with economic hardship and disease, plagues bring severe psychological distress for people living through them.

“The sheer stress of dealing with mice in your kitchen every night takes its toll,” Peter says. “They’re everywhere: chewing, defecating, breeding.”

The good news is that with sufficient warning it is possible to prepare for mouse plagues, and to minimise the damage they cause, through early baiting and removing food supplies and cover. Over the years, our scientists have become increasingly accurate at predicting mouse plagues (they got it right in 1994 and 2001-2003) and have developed an ever more sophisticated range of tools to assist them. The latest weapon in their arsenal is “MouseAlert”, a citizen science website where keen-eyed rodent reporters can notify CSIRO about mouse sightings. The website is optimised for mobile phones, and Peter and his team hope to have an app out soon.

A recent mouse monitoring map. Marvelous!

“Numbers are everything when you’re trying to predict a plague,” Peter says. “Traditionally we’ve used traps and chew cards [thin pieces of cardboard soaked in vegetable oil], but they have disadvantages, not least the fact that we’re not physically able to put them everywhere. MouseAlert allows us to capture data over a much wider area and potentially spot a plague well before it becomes a problem.”

Just as important as sightings, Peter says, are reports of where mice haven’t been.

“The jump from zero sightings to one or two can be an important indicator that mouse numbers are increasing,” he says. “By participating in citizen science, the public can help us identify these trigger points.”
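A minimal sketch of that kind of trigger-point check (our own illustration in Python, with hypothetical data – not MouseAlert’s actual algorithm): flag any site whose reported sightings jump from zero to one or more between reporting periods.

```python
def trigger_points(counts_by_site):
    """Flag sites where sightings jump from zero in one period to one or
    more in the next - an early sign that mouse numbers may be building."""
    flagged = []
    for site, counts in counts_by_site.items():
        for prev, curr in zip(counts, counts[1:]):
            if prev == 0 and curr > 0:
                flagged.append(site)
                break
    return flagged

# Hypothetical monthly sighting counts per site
reports = {
    "Horsham": [0, 0, 2, 5],
    "Mildura": [0, 0, 0, 0],
    "Swan Hill": [1, 3, 4, 6],
}
print(trigger_points(reports))  # ['Horsham']
```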

So how are things looking this year? A little ominous, actually. Unusually high numbers of mice were seen in western Victoria in September. Depending on how much rain we get, they could build up to plague proportions by March or April next year. That’s why Peter wants mouse watchers to keep their eyes peeled:

“If it looks like there’s going to be a plague, we want to be able to give farmers plenty of time before sowing to prepare – or else put their minds at ease if it looks like there isn’t.”

So if you do see a mouse this Christmas Eve – stirring or not – get over to MouseAlert and report it. The pantry you’re saving could be your own!


Explainer: what is raw milk and why is it harmful?

There is no evidence that the health benefits of milk are compromised by pasteurisation. jacqueline/Flickr, CC BY-NC

By Edward Fox, CSIRO and Narelle Fegan, CSIRO

Milk is a highly nutritious food, and an important source of amino acids and minerals such as phosphorus and calcium, which contribute to bone health.

Historically, milk was prone to contamination by bacteria from cows that could cause severe illness in humans. This remains the case with raw (unpasteurised) milk. The tragic death of a Victorian toddler this week is a stark reminder of these risks.

Pasteurisation involves heating the product to 72°C for 15 seconds. The method was originally employed to destroy bacteria in wine and beer that caused these products to spoil. It was quickly realised that this process could also be applied to milk to destroy harmful bacteria, and make milk safer for human consumption.

Pasteurisation was first introduced in Australia in the late 1950s and remains a legal requirement for milk produced for human consumption in Australia.

Nowadays, some of the important bacteria that pasteurisation targeted, such as those that cause tuberculosis, are no longer as problematic. So why do we continue to pasteurise milk?

The animals we use for milking can sometimes carry other pathogenic organisms that are capable of causing disease in humans. These organisms can be found on hides or shed in faeces.

Even healthy animals may be a source of organisms that are harmful to people. Such pathogens may be present in the farm environment, including soil, water, on pasture and in animal feeds. These pathogens can enter the milk during milking and if such milk is consumed, it can cause disease.

The most common pathogens found in association with dairy farms and milking animals include bacteria such as Escherichia coli (E. coli), Campylobacter and Salmonella, but other pathogens, such as the parasite Cryptosporidium, which causes gastroenteritis, may also be present.

As soon as milk is secreted from the udder, it is at risk of contamination by many different bacteria.
cheeseslave/Flickr, CC BY

Campylobacter and Salmonella can cause severe diarrhoea and certain types of E. coli, particularly those known as Shiga toxin-producing E. coli (STEC), can cause very severe disease which impairs kidney function and may result in death.

Milk is highly nutritious to bacteria as well, and they can proliferate quickly if their growth is not inhibited. Stopping the growth of bacteria in milk requires either heating to kill the bacteria, or chilling, which will not kill the bacteria but will slow their growth.

E. coli, for instance, can go from ten cells to 100 million cells in just over six hours at 30°C. Only ten cells may be required to make someone ill. If such an organism is likely to be present, it’s important that any potential growth is stopped.
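The arithmetic behind that figure is plain exponential growth: going from ten cells to 100 million is about 23 doublings, so a generation time of roughly 16 minutes (an assumed illustrative rate, which E. coli can approach in warm, nutrient-rich milk) gets there in just over six hours:

```python
import math

initial_cells, final_cells = 10, 100_000_000
doublings = math.log2(final_cells / initial_cells)   # ~23.3 doublings
doubling_time_min = 16                               # assumed generation time at ~30 °C
hours = doublings * doubling_time_min / 60
print(f"{doublings:.1f} doublings in about {hours:.1f} hours")  # ~6.2 hours
```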

These harmful bacteria have caused outbreaks and disease associated with the consumption of raw milk in many countries. Data from the United States indicate that over a 13-year period to 2011, there were 2,384 illnesses, 284 hospitalisations and two deaths associated with the consumption of raw milk.

In Australia, raw milk contaminated by bacteria such as Campylobacter and Salmonella caused at least nine outbreaks of disease between 1997 and 2008, leading to 117 cases of illness.

So why do people choose to drink raw milk?

Advocates of raw milk often claim improved health benefits and nutritional value, or say they want a product that has not undergone further processing and so retains the bacteria naturally present in milk.

But there is no evidence that the health benefits of milk are compromised by pasteurisation.

The defining difference between pasteurised and raw milk is the bacteria that are present. As soon as milk is secreted from the udder, it is at risk of contamination by many different bacteria as it makes its journey to our table. This includes harmful bacteria. These bacteria can lead to severe illness in humans, particularly children and the elderly.

For these reasons, raw milk continues to have a far higher risk of causing illness. Pasteurisation remains an important step in ensuring we can continue to enjoy safer, nutritious milk.

Further reading: Bath milk crisis must prompt better cosmetic safety regulation

This article was originally published on The Conversation.
Read the original article.


World Soil Day: a chance to worship the ground we walk on

By Leon Braun

It’s downtrodden, underfoot and often underappreciated, yet so crucial to our existence that one of our scientists describes it as “the complex natural medium that supports all life on Earth”. It holds our crops, stores and purifies our water, and provides habitat for amazing creatures like the giant Gippsland earthworm, which can reach up to 3 m in length. But most of us only think about it when we’re trying to get it out of footy socks on laundry day.

It’s soil – and today (and all next year) it gets a bit of long-overdue recognition. December 5 is World Soil Day, and the United Nations has declared 2015 to be International Year of Soils. That’s a good thing, because globally, soils are under threat: from erosion, poor land management and urbanisation. At the same time, we need soils more than ever to produce the food we need for a growing population, to help manage climate change and to ensure ecosystem health.

Cracked soil at Chowilla, South Australia.

Luckily for Australia’s soils, they have CSIRO looking out for them. We started researching soils in 1929, published the first soil map of Australia in 1944, and have been working hard ever since to improve our understanding and management of soils. We’re looking at ways to make agricultural soils more productive and to ensure they’re used sustainably, so future generations can continue to reap their bounty. And we’re working internationally too, so it’s not just Australia that benefits.

Our latest achievement (with allies from around the country) is the Soil and Landscape Grid of Australia, a digital map of Australia’s soils with two billion ‘pixels’ of about 90 by 90 metres, down to a depth of two metres below the surface. It contains information such as water holding capacity, nutrients and clay, and has applications for everyone from farmers deciding where to plant their crops to conservationists looking for habitats for endangered native species. You can read more about it here.
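As a rough sanity check on that pixel count (our own back-of-the-envelope arithmetic, assuming the raster spans a bounding box of roughly 4,000 km east-west by 3,700 km north-south over the continent):

```python
cell_area = 90 * 90                        # one pixel covers 8,100 square metres
extent = 4_000_000 * 3_700_000             # assumed bounding box, in square metres
print(f"{extent / cell_area:.2e} cells")   # ~1.83e+09, i.e. about two billion pixels
```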

We’re also home to the Australian National Soil Archive, which has just moved into a new home in Canberra. The archive contains about 70,000 samples from almost 10,000 sites across Australia, the oldest dating back to 1924. Each sample represents a time capsule of the Australian landscape at the time it was collected, so we can measure things like caesium dispersal from the British nuclear tests at Maralinga and the impact of phosphate-based fertilisers on agricultural land. The archive is a vital national asset for soil researchers and industry, and has even been used by the Australian Federal Police to examine the potential of new forensic methods. Finally, data from the archive powers our first official app, SoilMapp, which puts information about Australian soils at your fingertips. This is incredibly useful, whether you’re growing canola on a farm in Western Australia or planning a major roads project in Victoria.

So as you go through your day today – eating your lunch, wiping your shoes – just remember: it takes 2,000 years to form 10 centimetres of fertile soil suitable for growing our food, but just moments for that soil to blow away or be covered in a layer of asphalt. Something to think about next time you sit down to a meal – or do your laundry.


Go with the grain: technology to help farmers protect crops

 

Tractors may have revolutionised farming but to protect biosecurity, farmers could do with some extra help. Ben McLeod/Flickr, CC BY-NC-SA

By Paul De Barro, CSIRO and Grant Smith, Plant Biosecurity Cooperative Research Centre

New technology to tackle biosecurity challenges down the track is one of the five megatrends identified in today’s CSIRO report Australia’s Biosecurity Future: preparing for future biological challenges.

As manpower in the agriculture and biosecurity sectors declines, we must look to technological innovation to protect crops. Monitoring and surveillance, genetics, communication and data analysis have been identified in today’s report as future work priorities, along with developing smaller, smarter, user-friendly devices.

But this is easier said than done. There are a number of potential barriers that need to be addressed to make sure that appropriate technologies are used to maximum effect. It might sound obvious, but making sure farmers can – and want to – use new technology is a crucial step.

Declining workforce

With an ageing population and fewer young people entering agriculture, we are seeing the loss of the wealth of knowledge and experience held by long-time farmers.

Many farmers have a deep understanding of the day-to-day activities that can protect properties and reduce the spread of pests and diseases across the country, and this on-farm biosecurity knowledge may be lost.

Farming a wide brown land means there’s a lot of ground to cover … and monitoring devices can make a farmer’s job much easier.
Ed Dunens/Flickr, CC BY

We are also seeing a decline in specialists in areas crucial to biosecurity management, such as taxonomy, plant pathology and entomology. This decline is evident across the biosecurity landscape, reducing our overall pest and disease response capability.

With fewer people training in taxonomy, we’ve estimated that 50% of Australia’s diagnostics capability will be lost by 2028.

Without adequate surveillance in place, pests can cripple emerging industries. In recent seasons, we have seen two new diseases devastate local farmers in the Northern Territory:

  1. banana freckle, a recent invader that authorities are working to eradicate
  2. cucumber green mottle mosaic virus (CGMMV), which infected melon crops near Katherine this year – a lack of CGMMV knowledge delayed identification of the disease and the start of treatment.

Banana freckle.
Scot Nelson/Flickr, CC BY-SA

 

Surveillance is critical to the delivery of effective biosecurity, both for early detection of a disease and for effective response. Yet delivering effective surveillance is a growing challenge, and one that grows greater in the more remote parts of Australia.

Constraints on surveillance include declining investment across jurisdictions, declining expertise, limited availability of personnel, expense, and occupational health and safety requirements.

Technology innovation

In response to these challenges there is a strong drive to draw on technological innovation to deliver biosecurity previously provided by people.

Research is already underway with new applications of technology for surveillance and detection, sensitive diagnostics, as well as preventative pre-border technologies.

Access to low-cost sensors and development of automated systems are opening up opportunities for rapid identification and response to pests and diseases. Sensors smaller than a pea can, for example, help monitor the health of oysters in real time.

sr320/Flickr, CC BY-SA

Pestpoint, a mobile device application being developed by staff of the Plant Biosecurity Cooperative Research Centre (PBCRC), provides access to an online community of people working in the agricultural sector who need to identify plant pests in order to make decisions about how to manage them.

By using genetic techniques, scientists with the PBCRC are developing rapid tests using molecular sequences for identifying pests and diseases. The next phase is to transfer these tools to biosecurity practitioners, including diagnosticians and port inspectors.

Sounds great … but there are barriers

The adoption of a new technology hinges on how easily it can be incorporated into the existing biosecurity system, which means the technology needs to be integrated into a human system on two fronts:

  1. the connection to institutional arrangements governing biosecurity regulation, response and compliance
  2. the social acceptability of deploying smart technologies and information systems.

The Queensland Biosecurity Strategy: 2009–14 highlighted that biosecurity risks are inherently social, and that a better understanding of human behaviours, values and attitudes has the ability to improve engagement.

Similarly, the 2007 New Zealand Biosecurity Science Strategy indicated that the application of social research could increase biosecurity compliance and reporting, and support post-border invasion response programs.

Farmers and indigenous communities in remote and regional Australia are currently working together on a project to understand how each group decides to manage plant pests and diseases, and to increase their capacity to engage in biosecurity surveillance activities.

In the face of declining resources and investment, science and technology offer opportunities to create greater efficiencies in biosecurity while at the same time driving competitive advantage in primary industries.

The Conversation

Paul De Barro receives funding from the Bill and Melinda Gates Foundation and the Cotton Research and Development Corporation.

Grant Smith is a co-PI on the PBCRC bacterial diagnostics project described in this article. He is a member of various organisations including Australasian Plant Pathology Society (APPS), the Royal Society of New Zealand (MRSNZ) and the Project Management Institute (PMI).

This article was originally published on The Conversation.
Read the original article.

