Smartphone app a life-saver for heart attack patients

A simple app can improve the rehabilitation of heart attack survivors

By Mohan Karunanithi, CSIRO; Marlien Varnfield, CSIRO, and Simon McBride, CSIRO

Patients recovering from heart attacks are almost 30% more likely to take part in rehab when it is delivered at home through a new smartphone app than when they have to travel to an outpatient clinic or centre.

What’s more, those who used the online delivery model – known as the Care Assessment Platform – were 40% more likely to adhere to the rules of the program and almost 70% more likely to see it through to completion.

The clinical trial, conducted by CSIRO and Queensland Health through the Australian E-Health Research Centre, also showed that the online model was just as clinically effective as a traditional rehab program.

Cardiologists were so pleased by the results that the next generation version of the platform is soon to be offered in a number of Queensland hospitals including Ipswich, Metro North and West Moreton Hospital and Health Services.

Why go digital?

Clinical guidelines recommend patients attend and complete a cardiac rehabilitation program following a heart attack. Studies have shown that those who do this have much better long-term health outcomes.

They are less likely to be re-admitted to hospital and will have a better quality of life. Most importantly, they are far less likely to have another cardiac event or die from their condition.

Traditionally, rehab programs take the form of group-based exercise, nutritional counselling, risk factor assessment and educational activities. They are designed to help patients return to an active, satisfying life.

Despite the benefits, uptake is generally poor. Successful recovery relies on patients starting and completing a rehabilitation program, but many find travelling to a health care facility on a weekly basis to be an onerous requirement – particularly for those who work or care for others, or live in regional or remote Australia where these services are not available.

The Care Assessment Platform features health and exercise monitoring tools and weekly mentoring consultations, delivers motivational materials through text messages, and contains multimedia that educates patients about disease symptoms and management.

Most importantly, it gives patients a more flexible option. By integrating rehab treatment with a patient’s daily life, they are more likely to complete the program and make their new healthy lifestyle permanent. This overcomes one of the key barriers to patient participation and recovery.

The cost of heart disease

There have been huge advances in our treatment of heart disease in recent years. Researchers from the University of New South Wales have shown that rapid response teams, first introduced in Australia in the 1990s, have halved cardiac arrests and associated deaths in our hospitals. This saves around 12,000 Australian lives each year.

Despite the success of innovations like this, cardiovascular disease still kills one Australian nearly every 12 minutes. And for many of these patients, it is not their first cardiac event.

With more than A$5.5 billion spent every year on acute and chronic management of heart disease, any digital technology that improves recovery rates offers huge potential to reduce the burden and cost to the community.

What does a fully digital health system look like?

Australia’s health system faces significant challenges including rising costs, an ageing population, a rise in chronic diseases and fewer rural health workers. Total government expenditure on health has trebled in the last 25 years.

Clearly, something needs to change.

Technology can help ease the burden on emergency departments.

We must reduce the reliance on our hospitals by helping patients before they are admitted. Digital tools can do this by moving many services into the home through broadband delivery and models of care based on rich digital information.

In another study, CSIRO is running Australia’s largest clinical telehealth trial. In this trial, we’ve equipped a group of elderly patients with broadband-enabled home monitoring systems.

Patients can use home devices to measure vital signs and other health indicators such as blood pressure, blood sugar, heart rhythm, lung capacity, body weight and temperature.

Data is then immediately available to the patient’s doctor or nurse, allowing them to provide appropriate care interventions much earlier. This helps patients stay out of hospital and improve their quality of life.

Ultimately, those patients who are chronically ill will still need to attend hospital on occasion, so technology also has a role to play in improving the effectiveness of our admission systems.

Emergency departments are critically overcrowded and struggle to respond to day-to-day arrivals in a timely manner. Big data analytics can be used to predict how many patients will arrive at emergency, their medical needs and how many will be admitted or discharged. This can then be used to calculate how many beds will be required to meet patients’ needs.
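
As a rough illustration of the kind of calculation involved, the Python sketch below turns historical arrival counts into an estimate of beds occupied by new admissions. The arrival figures, admission rate and length of stay are invented for demonstration; this is not the actual analytics used by hospitals or CSIRO.

    # Illustrative sketch only: forecasting bed demand from emergency arrivals.
    # All figures below are assumptions, not real hospital data.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical history of daily emergency department arrivals (4 weeks)
    arrivals = rng.poisson(lam=180, size=28)

    # Simple forecast: average arrivals for each day of the week
    day_of_week_mean = arrivals.reshape(4, 7).mean(axis=0)

    admission_rate = 0.30   # assumed fraction of presentations admitted to a ward
    avg_stay_days = 3.0     # assumed average length of stay

    # Beds occupied by new admissions (Little's law: occupancy = rate x duration)
    beds_required = day_of_week_mean * admission_rate * avg_stay_days

    for day, beds in zip("Mon Tue Wed Thu Fri Sat Sun".split(), beds_required):
        print(f"{day}: ~{beds:.0f} beds occupied by recent admissions")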

New app-ortunities

The next step for our Care Assessment Platform research team is to adapt our mobile technology for rehabilitation for other chronic conditions such as pulmonary disease and diabetes. We’re also working hard to quantify the cost savings the program can deliver.

Australia has a track record in finding cures and new treatments for diseases. In order to sustain this, we also need to find new ways to deliver quality affordable care.

There is enormous potential for big data, analytics and decision support systems to help achieve this, reducing the burden on our health system and improving the wellbeing of all Australians.

Mohan Karunanithi receives funding from Queensland Health and The Prince Charles Foundation.

Simon McBride is affiliated with Health Informatics Society of Australia and the Australian College of Health Informatics.

Marlien Varnfield does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.


Genetic evolution: how the Ebola virus changes and adapts

As viruses replicate, their genome changes. Image: EPA/Ahmed Jallanzo

By Glenn Marsh, CSIRO

The current outbreak of Ebola virus in West Africa is unprecedented in size, with nearly 4,800 confirmed or probable cases and more than 2,400 deaths. People have been infected in Guinea, Liberia, Sierra Leone, Nigeria and Senegal.

The World Health Organization declared this outbreak a “public health emergency of international concern” in August and estimates it will claim a staggering 20,000 lives within the next six months.

A second completely independent and significantly smaller Ebola virus outbreak has been detected in the Democratic Republic of the Congo.

Like all viruses, the Ebola virus has evolved since the outbreak began. So, how does this occur and how does it impact our attempts to contain the disease?

Tracking Ebola

Ebolavirus and the closely related Marburgvirus genera belong to the Filoviridae family. Both of these genera contain viruses that may cause fatal haemorrhagic fevers.

The Ebolavirus genus is made up of five virus species: Zaire ebolavirus (responsible for both of the current outbreaks), Sudan ebolavirus, Reston ebolavirus, Bundibugyo ebolavirus and Taï Forest ebolavirus.

In order to better understand the origin and transmission of the current outbreak in West Africa, researchers from the Broad Institute and Harvard University, in collaboration with the Sierra Leone Ministry of Health, sequenced 99 virus genomes from 78 patients.

The study, reported in Science, shows the outbreak resulted from a single introduction of virus into the human population and then ongoing human-to-human transmission. The scientists reported more than 300 unique changes within the virus causing the current West African outbreak, which differentiates this outbreak strain from previous strains.

The current Ebola outbreak has infected nearly 5,000 people. Image: EPA/Ahmed Jallanzo

Within the 99 genomes sequenced from this outbreak, researchers have recorded approximately 50 other changes to the virus as it spreads from person to person. Future work will investigate whether these differences are contributing to the severity of the current outbreak.

These 99 genome sequences have been promptly released to publicly available sequence databases such as Genbank, allowing scientists globally to investigate changes in these viruses. This is critical in assessing whether the current molecular diagnostic tests can detect these strains and whether experimental therapies can effectively treat the circulating strains.

How does Ebola evolve?

This is the first Ebola virus outbreak where scientists have sequenced viruses from a significant number of patients. Despite this, the Broad Institute/Harvard University study findings are not unexpected.

The Ebola virus genome is made of RNA, and the viral polymerase that copies it lacks an error-correction mechanism. This is where it gets a little complicated, but bear with me.

As the virus replicates, it is expected that the virus genome will change. This natural change of virus genomes over time is why influenza virus vaccines must be updated annually and why HIV mutates to become resistant to antiretroviral drugs.

Changes are also expected when a virus crosses from one species to another. In the case of Ebola virus, bats are considered to be the natural host, referred to as the “reservoir host”. The virus in bats will have evolved over time to be an optimal sequence for bats.

Knowing how the Ebola virus adapts will help health officials contain future outbreaks. Image: EPA/Ahmed Jallanzo

Crossing over into another species, in this case people, puts pressure on the virus to evolve. This evolution can lead to “errors” or changes within the virus which may make the new host sicker.

Ebola viruses are known to rapidly evolve in new hosts, as we’ve seen in the adaptation of lab-based Ebola viruses to guinea pigs and mice. This adaptation occurred by passing a low-pathogenic virus from one animal to the next until the Ebola virus was able to induce a fatal disease. Only a small number of changes were required in both cases for this to occur.

While this kind of viral mutation is well known with other viruses, such as influenza virus, we are only truly appreciating the extent of it with the Ebola viruses.

What do the genetic changes mean?

The Broad Institute/Harvard University study reported that the number of changes in genome sequences from this current outbreak was two-fold higher than in previous outbreaks.

This could be due to the increased number of sequences obtained over a period of several months, and the fact that the virus has undergone many person-to-person passes in this time.

However, it will be important to determine if virus samples from early and late in the outbreak have differing ability to cause disease or transmit. The genetic changes may, for example, influence the level of infectious virus in bodily fluids, which would make the virus easier to spread.

Analysing this data will help us understand why this outbreak has spread so rapidly with devastating consequences and, importantly, how we can better contain and manage future outbreaks.

Glenn Marsh receives funding from Australian National Health and Medical Research Council and Rural Industries Research and Development Corporation.

This article was originally published on The Conversation.
Read the original article.


Historic collections could be lost to ‘digital dinosaurs’

An image of Australian shearers taken on glass plate negative is now preserved in a digital collection. Powerhouse Museum Collection/Flickr

By Michael Brünig, CSIRO

Australia’s museums, galleries and other cultural institutions must adopt more of a digital strategy with their collections if they are to remain relevant to audiences.

Only about a quarter of the collections held by the sector have been digitised so far and a study out today says more needs to be done to protect and preserve the material, and make it available to people online.

Challenges and Opportunities for Australia’s Galleries, Libraries, Archives and Museums is a joint study by CSIRO and the Smart Services CRC.

It notes that Australia’s galleries, libraries, archives and museums (the GLAM sector) represent our accumulated achievements and experiences, inspire creativity and provide a place for us to connect with our heritage.

They are also crucial to our economy, with the GLAM sector estimated to have a revenue of about A$2.5 billion each year. That’s not only a lot of paintings and artefacts, but a lot of jobs as well.

But despite its size and scope, we found that digital innovation in the sector has been inconsistent and isolated. If these cultural institutions don’t increase their use of digital technologies and services, they risk losing their relevance.

So what changes do they need to make in order to thrive in the digital economy?

Opening doors and minds

With Australia’s rapid uptake of online and mobile platforms, people are now choosing to access and share information in very different ways.

It’s safe to say that the only constant in this space is change. Research suggests that expectations for more personalised, better, faster and better-designed services and experiences will continue to increase.

Virtual tours are now possible at the National Museum of Australia.

This is why our cultural institutions need to review the kind of visitor experience they are providing. We found only a few organisations had made fundamental changes to their operations that would allow them to place digital services at their core, rather than as an add-on activity.

This is in contrast to the dramatic changes we’ve seen when it comes to adopting digital technologies in our daily lives.

In order to be successful, digital experiences need to be an integrated and cohesive part of an institution’s offering.

Take what is happening at the National Museum of Australia. It’s now possible to take a tour of the museum via a telepresence-enabled robot.

This means school students – particularly those in rural and regional Australia – can explore exhibits virtually, without even leaving the classroom. Interestingly, we hear that this actually increases their desire to visit the museum in person.

Digital-savvy innovations such as this need to be at the fore of our institutions’ thinking if they want to engage with the community and break down barriers to participation.

Engaging with the public

To be successful in this new era, institutions need to meet people on their own (digital) terms. We can no longer expect visitors to queue at the turnstiles waiting for opening time. Organisations need to bring experiences to the user so that they can access them wherever and however they prefer.

Some of Australia’s cultural institutions are starting to get this.

Another image available freely online as part of the Powerhouse Museum Collection. Powerhouse Museum/Flickr

The NSW State Library has appointed a Wikipedian-In-Residence to contribute expertise and train the public in publishing information online.

The National Library of Australia has attracted a large online user base with its online Trove service attracting almost 70,000 unique users each day.

The Powerhouse Museum has made parts of its photographic collections available on Flickr via Creative Commons licensing. This has caused a surge in the level of use and allowed the public to contribute information, adding value to the collection.

While these examples provide a lot of hope for the sector, the unfortunate reality is that they are few and far between. Most of Australia’s cultural institutions have not kept pace with this change and are missing the opportunity to better connect and actually increase their revenue.

Digitise this!

Australia’s eight national, state and territory art organisations hold archives that, if laid out flat end-to-end, would span 629km. This is on top of a staggering 100,000 million artworks, books and audio-visual items in Australia.

But only a quarter of these items are digitised, with some of Australia’s collections still being managed through “old school” mechanisms such as log books and card indices.

Imagine if there were a fire at one of our great institutions. We would risk losing cultural and heritage material of significance. Parts of our history could be completely lost. Even without such a devastating event, if we don’t make our collections more accessible, in a sense they’ll be lost to many of us anyway.

As a country, not only do we need to get moving when it comes to digitising our collections, we also need to explore new and innovative ways to do this. Traditionally, digitisation has meant scanning flat documents, photographing objects or creating electronic versions of catalogue data.

But what if we could do so much more? Researchers are now focused on the next challenge: digitising objects and spaces in three dimensions.

Researchers from the University of Wollongong with support from the Smart Services CRC are focusing on capturing 3D models and the textures of surfaces using low-cost equipment such as a Kinect camera from an Xbox.

3D map of The Shrine of Remembrance, Melbourne

At CSIRO, we’ve even used our own handheld scanner Zebedee to map culturally and environmentally significant sites such as the Jenolan Caves, Melbourne’s Shrine of Remembrance and even the semi-submerged wreck of the HMQS Gayundah.

We’re also creating high-quality 2D and 3D image libraries based on the National Biological Collections, letting us document biodiversity in the digital era.

Embracing the digital economy

While our study reveals that Australia’s cultural institutions are certainly at risk of becoming “digital dinosaurs”, it also demonstrated how those organisations that are embracing digital are reaping the benefits.

It provides recommendations for the GLAM industry in order for it to maximise its digital potential, including:

  • shifting to open access models and greater collaboration with the public
  • exploring new approaches to copyright management that stimulate creativity and support creators
  • building on aggregation initiatives such as the Atlas of Living Australia
  • standardising preservation of “born digital” material to avoid losing access to digital heritage
  • exploiting the potential of Australia’s Academic and Research Network (AARNet) and the National Broadband Network (NBN) for collection and collaboration.

By adopting these recommendations and building on some innovative examples in the sector, Australia’s GLAM industry will be well placed to embrace digital, rather than be engulfed by it.

This article was originally published on The Conversation.
Read the original article.


99.999% certainty humans are driving global warming: new study

Would you take a gamble with these odds? Image: Kraevski Vitaly/Shutterstock

By Philip Kokic, CSIRO; Mark Howden, CSIRO, and Steven Crimp, CSIRO

There is less than 1 chance in 100,000 that global average temperature over the past 60 years would have been as high without human-caused greenhouse gas emissions, our new research shows.

Published in the journal Climate Risk Management today, our research is the first to quantify the probability of historical changes in global temperatures and to examine their links to greenhouse gas emissions using rigorous statistical techniques.

Our new CSIRO work provides an objective assessment linking global temperature increases to human activity, with a probability exceeding 99.999%.

Our work extends existing approaches undertaken internationally to detect climate change and attribute it to human or natural causes. The 2013 Intergovernmental Panel on Climate Change Fifth Assessment Report provided an expert consensus that:

It is extremely likely [defined as 95-100% certainty] that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic [human-caused] increase in greenhouse gas concentrations and other anthropogenic forcings together.

Decades of extraordinary temperatures

July 2014 was the 353rd consecutive month in which global land and ocean average surface temperature exceeded the 20th-century monthly average. The last time the global average surface temperature fell below that 20th-century monthly average was in February 1985, as reported by the US-based National Climatic Data Center.

This means that anyone born after February 1985 has not lived a single month where the global temperature was below the long-term average for that month.
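
The count is easy to verify: counting March 1985 (the first month after that below-average February) through July 2014 inclusive gives 353 months, as the short Python check below shows.

    # Quick check of the article's arithmetic on consecutive warmer-than-average months
    from datetime import date

    start = date(1985, 3, 1)   # first month after the last below-average month
    end = date(2014, 7, 1)     # July 2014
    months = (end.year - start.year) * 12 + (end.month - start.month) + 1
    print(months)  # 353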

We developed a statistical model that related global temperature to various well-known drivers of temperature variation, including El Niño, solar radiation, volcanic aerosols and greenhouse gas concentrations. We tested it to make sure it worked on the historical record and then re-ran it with and without the human influence of greenhouse gas emissions.

Our analysis showed that the probability of getting the same run of warmer-than-average months without the human influence was less than 1 chance in 100,000.
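
For readers who want a feel for the approach, the Python sketch below fits a toy regression of temperature on a handful of drivers and then removes the greenhouse-gas term from the fitted model. The data, drivers and coefficients are all synthetic, invented purely for illustration; the published analysis is considerably more sophisticated than this.

    # Illustrative sketch only, on synthetic data: fit temperature against known
    # drivers, then predict with and without the greenhouse-gas (GHG) term.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 720  # 60 years of monthly data

    enso = rng.normal(size=n)                     # El Nino index (assumed)
    solar = rng.normal(size=n)                    # solar variation (assumed)
    volcanic = -0.1 * np.abs(rng.normal(size=n))  # aerosol cooling (assumed)
    ghg = np.linspace(0.0, 1.0, n)                # rising GHG forcing (assumed)

    # Synthetic "observed" temperature anomaly
    temp = 0.1 * enso + 0.05 * solar + volcanic + 0.8 * ghg + rng.normal(0, 0.1, n)

    # Ordinary least-squares fit of the full model
    X = np.column_stack([np.ones(n), enso, solar, volcanic, ghg])
    beta, *_ = np.linalg.lstsq(X, temp, rcond=None)

    with_ghg = X @ beta                # prediction including human influence
    without_ghg = X[:, :4] @ beta[:4]  # same coefficients, GHG term removed

    print("Last decade, with GHG term:   ", with_ghg[-120:].mean())
    print("Last decade, without GHG term:", without_ghg[-120:].mean())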

We did not use physical models of Earth’s climate, but observational data and rigorous statistical analysis, which has the advantage of providing independent validation of the results.

Detecting and measuring human influence

Our research team also explored the chance of relatively short periods of declining global temperature. We found that rather than being an indicator that global warming is not occurring, the observed number of cooling periods in the past 60 years strongly reinforces the case for human influence.

We identified periods of declining temperature by using a moving 10-year window (1950 to 1959, 1951 to 1960, 1952 to 1961, etc.) through the entire 60-year record. We identified 11 such short time periods where global temperatures declined.
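
A minimal sketch of that moving-window count, run on a synthetic annual series rather than the real temperature record, looks like this in Python:

    # Illustrative sketch only: count 10-year windows with a declining trend.
    import numpy as np

    rng = np.random.default_rng(2)
    years = np.arange(1950, 2010)
    # Synthetic anomalies: a warming trend plus year-to-year noise (assumed values)
    temps = 0.012 * (years - 1950) + rng.normal(0, 0.1, len(years))

    cooling_windows = 0
    for start in range(len(years) - 9):              # 1950-1959, 1951-1960, ...
        slope = np.polyfit(np.arange(10), temps[start:start + 10], 1)[0]
        if slope < 0:                                # temperature fell over the decade
            cooling_windows += 1

    print(f"10-year windows with declining temperature: {cooling_windows}")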

Our analysis showed that in the absence of human-caused greenhouse gas emissions, there would have been more than twice as many periods of short-term cooling than are found in the observed data.

There was less than 1 chance in 100,000 of observing 11 or fewer such events without the effects of human greenhouse gas emissions.

Good risk management is all about identifying the most likely causes of a problem, and then acting to reduce those risks. Some of the projected impacts of climate change can be avoided, reduced or delayed by effective reduction in global net greenhouse gas emissions and by effective adaptation to the changing climate.

Ignoring the problem is no longer an option. Whether we are weighing action to respond to climate change against doing nothing, with a probability exceeding 99.999% that the warming we are seeing is human-induced, we certainly shouldn’t take the chance of doing nothing.

The authors do not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article. They also have no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.


Australia’s astronomy future in a climate of cutbacks

What future for the Parkes radio telescope amid the CSIRO cutbacks? Image: Wayne England

By Lewis Ball, CSIRO

The future looks very bright for Australian radio astronomy but it was somewhat clouded earlier this year when CSIRO’s radio astronomy program took a dramatic hit in the Australian federal budget.

CSIRO has cut its funding for radio astronomy by 15%, down A$3.5 million to A$17 million for the 2014-15 financial year. The result will be a reduction of about 30 staff from the plan of just three months ago.

The cuts will impact most heavily on CSIRO’s in-house astronomy research, on the operation of the Parkes radio telescope – instantly recognisable from the movie The Dish – on the less well known but tremendously productive Australia Telescope Compact Array near Narrabri and on the Mopra Telescope near Coonabarabran, all in New South Wales.

The Australia Telescope Compact Array. Image: D. Smyth

About two-thirds of the staffing reduction at CSIRO’s Australia Telescope National Facility (ATNF) will be effected through not filling planned new roles, the most prominent of which was to be a CSIRO “SKA Chief Scientist”. A third of the reduction will be through involuntary redundancies. Eight staff across sites in Sydney, Parkes, Narrabri and Geraldton have already been informed that their roles are expected to cease.

The speed of implementation of such a substantial funding reduction forces swift action. This has unsettled staff and the broader astronomy community, but it hasn’t changed the broad direction of CSIRO’s astronomy program.

World leaders in radio astronomy

Australian scientists and engineers have been world leaders in radio astronomy for 50 years, both in understanding our universe and in developing some of the most innovative technologies used to gain that understanding.

CSIRO’s Australia Telescope National Facility (ATNF) has been integral to the discovery of the first double pulsar system (a long-sought holy grail of astronomy), the identification of a previously unknown arm of our own galaxy, the Milky Way, and the invention of Wi-Fi now so embedded in everyday communications.

For the past decade CSIRO has been steadily changing the way it operates its radio astronomy facilities. CSIRO’s highest priority is the pursuit of science enabled by the development of an innovative new technology that provides an unprecedentedly wide field of view.

This uses “Phased Array Feeds” (PAFs) as multi-pixel radio cameras at the focus of dishes. PAFs are being deployed in the Australian SKA Pathfinder (ASKAP), in Western Australia, which will be the fastest radio telescope in the world for surveying the sky.

High-speed radio astronomy surveys will be possible thanks to the PAF receivers (green chequerboard at top of the quadrupod) on the ASKAP telescopes in Western Australia.

ASKAP is in the early stages of commissioning. It is just now starting to demonstrate the new capabilities obtainable with a PAF-equipped array.

ASKAP is an outstanding telescope in its own right but is also a pathfinder to the huge Square Kilometre Array (SKA). This enormous project will build the world’s biggest astronomy observatory in Australia and southern Africa. It’s also the most expensive at a cost of around A$2.5 billion.

Cutbacks at The Dish

To resource these exciting developments, CSIRO has been reducing costs and staffing at its existing facilities, including the venerable Parkes Dish. This is a painful but necessary process. The most recent funding cuts will result in more pain.

Astronomers will no longer have the option of travelling to the Compact Array to operate the telescope to collect their data. They can run the telescope from CSIRO’s operations centre in Sydney, or from their own university, or from anywhere in the world via an internet connection.

Astronomers who use the Parkes telescope have been doing this for the past year after a very successful program to make the 50-year-old dish remotely operable. That is pretty amazing for a machine built before the advent of modern computers.

The Parkes dish gets the remote treatment.
Image: John Sarkissian

For many decades Parkes staff have swapped detector systems or “radio receivers” in and out of the focus cabin, the box at the tip of the tripod that sits about 64 metres off the ground. Each receiver operates at different wavelengths and offers quite different types of science.

It seems likely that CSIRO will offer just two Parkes receivers for at least the next six to 12 months, since it will no longer have the staff needed to swap receivers. Similar reductions in the capability of the Compact Array will also be needed to fit within the budget.

The future

While the current changes are painful, the future is incredibly exciting. The direction of Australia’s astronomy is described in the Decadal Plan for Australian Astronomy for 2006–2015. It identifies participation in the SKA and access to the world’s largest optical telescopes as the two highest priorities for Australian astronomy.

We are making progress on both fronts, despite some significant challenges. The process to develop the plan for the next decade is well in hand under the stewardship of the National Committee for Astronomy.

Phased arrays are also at the heart of the Murchison Widefield Array (MWA), another innovative SKA precursor that has been in operation for a little over a year.

ASKAP and the MWA are located in the Murchison region of Western Australia, chosen because it has a tremendously low level of human activity and so astonishingly little background radio noise.

This radio quietness is the equivalent of the dark skies so important for optical astronomers. Less noise means astronomers are better able to detect and study the incredibly weak radio signals from the most distant parts of the universe.

This freedom from radio interference is a unique resource available only in remote parts of Australia and is essential for ASKAP, MWA and much of the science targeted by the SKA.

Prototype of the more sensitive second-generation PAFs to be deployed on ASKAP undergoing tests in Western Australia in August 2014.
Image: A. Hotan

The wide fields of view of ASKAP and the MWA enable unprecedented studies of the entire radio sky. Astronomers will measure the radio emission of millions of galaxies and complete massive surveys that for the first time will connect radio astronomy to the more mature field of optical astronomy.

Mapping the sky with EMU and WALLABY

The two highest priority projects for ASKAP are called the Evolutionary Map of the Universe (EMU) and the Widefield ASKAP L-Band Legacy All-Sky Blind Survey (WALLABY).

Both will survey millions of galaxies and together they will trace the formation and evolution of stars, galaxies and massive black holes to help us explore the large-scale structure of the universe.

The MWA is already producing great science targeted at the detection of intergalactic hydrogen gas during what’s known as the “epoch of reionisation” when the first stars in the universe began to shine.

With the SKA we aim to understand what the mysterious dark matter and dark energy are. The project may also deliver another spin-off like the Wi-Fi technology, which came from CSIRO efforts to detect the evaporating black holes predicted by Stephen Hawking.

Advances in data-mining or processing techniques driven by the astonishing data rates that will be collected by the thousands of SKA antennas deployed across the Australian and African continents might provide the most fertile ground of all, illustrating once again the long-term benefits of investing in cutting-edge science.

Lewis Ball has received funding from the Australian Research Council. CSIRO Astronomy and Space Science receives funding from a variety of government sources, and from NASA/JPL.

This article was originally published on The Conversation.
Read the original article.


Fast-tracking access to experimental Ebola drugs

Most agree that if an individual is likely to die and an experimental therapy has a reasonable chance to prevent death, then it should be given. Image: EPA/Ahmed Jallanzo

By Glenn Marsh, CSIRO

The current outbreak of Zaire Ebola virus in Western Africa is the largest ever recorded. More than 1800 people have been infected and nearly 1000 people have died. But while drug therapies are close to being available, they may not be ready in time for the current outbreak, even if safety trials are fast-tracked.

Several therapeutic treatments being developed by other organisations are in experimental phases of testing and show great promise in treating Ebola virus infections in animal models. These include antibodies (one of the body’s natural defence mechanisms to fight infections), RNAi molecules (that target the genetic material of the virus) and several more traditional pharmaceutical drugs.

Safety trials

Before being administered to people, each of these new potential therapies would require human clinical trials, starting with a phase I safety trial. In phase I, the products under investigation are administered to healthy volunteers to evaluate how safe the treatments are, including determining a safe dose range and potential side effects. These trials generally involve 20 to 80 individuals.

Phase II trials, used to determine efficacy, are complicated to carry out for rare viral diseases such as Ebola. Traditionally, phase II trials involve two groups: the first group receives the treatment or vaccine while the other receives a placebo, or mock treatment. Evaluating the efficacy of these compounds will only be possible with direct testing during an outbreak.

But during an outbreak, and for ethical reasons, it may not be possible to administer a placebo to one group of people while treating others with a potentially life-saving therapy.

Experimental therapies

ZMapp is a mix of three antibodies, all directed to the Ebola virus glycoprotein (GP), which block attachment and entry to cells, the first step of the virus infection cycle.

ZMapp is produced in a Nicotiana plant, related to tobacco, and is being developed as a treatment for Ebola virus infection by Mapp Biopharmaceutical Inc. along with many other partners. These antibodies are “humanised” monoclonal antibodies, stopping the human immune system from recognising them as foreign.

ZMapp was the experimental therapy administered to the two American medical aid workers infected with the Ebola virus. The medics were working in Liberia, trying to contain this outbreak.

This experimental cocktail of antibodies builds on the success of similar antibody therapies, which have protected macaques from a lethal dose of Ebola virus when administered 24 hours after infection. Media reports indicate the two health-care workers have shown signs of improvement.

ZMapp is not the only experimental therapy to show promise in animal studies in preventing lethal disease from Ebola virus. Tekmira Pharmaceuticals Corporation is currently conducting a phase I clinical trial of TKM-Ebola, an RNAi therapeutic targeted towards Zaire Ebola virus. This therapy was demonstrated to give 100% protection in macaques from an otherwise lethal challenge of Zaire Ebola virus.

In March 2014, the US regulator, the Food and Drug Administration (FDA), fast-tracked TKM-Ebola as a drug for an unmet medical need. However, the phase I clinical trial was recently suspended due to safety concerns, with individuals receiving higher doses developing an inflammatory, flu-like response to treatment.

In the past week, weighing the risk posed by the disease against that of the treatment, the FDA has eased the restrictions on TKM-Ebola. This may allow TKM-Ebola to be used in infected patients.

Liberian children lay flowers in memory of all Liberians who have died of the Ebola virus. Image: EPA/Ahmed Jallanzo

Other drugs shown in the laboratory to be effective against Ebola viruses include BCX4430 and favipiravir, both of which are nucleoside analogues that block viral RNA replication.

Developed by BioCryst Pharmaceuticals Inc., BCX4430 is a broad-spectrum antiviral that inhibits many different viruses. BCX4430 has been demonstrated to protect rodents from Ebola virus and macaques from Marburg virus, a closely related virus.

Favipiravir, which is in late-stage clinical development for the treatment of influenza, has reduced the severity of disease and risk of death in a mouse model of disease.

Fast-tracking access

So, should these therapies be used now to treat infected people, bypassing clinical safety trials?

Giving treatments that are unlicensed and untested in humans raises ethical issues. Likewise, not administering a potentially life-saving therapy is also problematic. These decisions would have been carefully considered prior to the treatment of the two American aid workers with the unlicensed ZMapp antibody.

In recognition of the difficult ethical issues that arise in this debate, the World Health Organization is meeting to discuss the current outbreak and relevant issues. Much of the debate centres on a popular belief: if an individual is likely to die and an experimental therapy has a reasonable chance to prevent death, then it should be given.

But there are other issues to consider: what if the experimental therapeutic makes the disease worse? And who decides who to treat when only small numbers of doses are available?

Additionally, for many of these experimental therapies, only a small number of doses are currently available to be used for treatment. Many of these therapies would require weeks, if not months, to produce sufficient doses for large scale use.

Although there is currently no end in sight for this outbreak, research and clinical trials of these new therapies for Ebola virus need to continue. That way, when the next Ebola virus outbreak occurs, there will be licensed options available and the debate about whether to use unlicensed drugs will no longer be necessary.

Glenn Marsh receives funding from NHMRC.

This article was originally published on The Conversation.
Read the original article.


Coal seam gas emissions lower than US: first Australian study

The methane-detecting four-wheel-drive, measuring emissions around Queensland and NSW coal seam gas wells. Tests were also done upwind of each site to avoid cows or other methane sources skewing the results. CSIRO, CC BY-SA

By Damian Barrett and Stuart Day

One of the most common questions Australians ask about coal seam gas is whether the gas wells leak – and if so, how much?

In the first Australian study of its kind, new CSIRO research now gives an indication of how much those “fugitive emissions” might be, and how we can start to reduce them.

Commissioned by the federal Department of the Environment and now published on its website, the pilot study measured emissions around 43 coal seam gas production wells – six in New South Wales and 37 in Queensland – out of the more than 5000 wells currently operating around Australia. The results reveal that:

  • nearly all of the 43 wells tested showed some fugitive emissions;
  • the emissions rates were very low (in most cases less than 3 grams of methane per minute – equivalent to methane emissions from around 30 cows);
  • in many cases, those emissions could be reduced or even stopped entirely; and
  • the average measured levels from the Australian wells were 20 times lower than reported in a study of fugitive emissions from US unconventional gas sites, published last year in the leading international journal Proceedings of the National Academy of Sciences.

In Australia, fugitive emissions from coal mining, oil and gas production account for about 8% of Australia’s greenhouse gas emissions.

From the latest report on Australia’s greenhouse gas emissions, published by the federal Department of the Environment in April this year. Quarterly Update of Australia’s National Greenhouse Gas Inventory: December 2013.

Although those fugitive emissions are estimated and reported under the National Greenhouse and Energy Reporting Act, there has often been a high degree of uncertainty associated with these estimates in Australia – particularly from coal seam gas production.

That’s why this new research is important, as it offers a first indication of fugitive emissions from coal seam gas under Australian conditions.

The report found that a particular type of pressure regulator installed at many wells was a common source of methane leakage.

The report’s results

Our new report, Field Measurements of Fugitive Emissions from Equipment and Well Casings in Australian Coal Seam Gas Production Facilities, shows that of the 43 wells studied, three had no detectable leaks.

Of the rest, 37 wells emitted less than 3 grams of methane per minute, and 19 of those showed very low emission of less than 0.5 grams of methane per minute.

However, at a few wells (6 of the 43) much higher emissions rates were detected, with one well registering emissions 15 times higher than the study average. That was found to be mainly due to methane discharging from a vent on a water line.

On closer scrutiny, some of the leaks were due to faulty seals on equipment and pumps, which could be easily fixed, while other emissions were associated with exhaust from the gas-fuelled engines used to power water pumps, which are not regarded as “fugitive” emissions.

We tested for emissions using a four-wheel-drive fitted with a methane analyser. The car made several passes downwind from the well to measure total emissions emanating from the site.

To ensure that other potential methane sources, such as cattle, were not inadvertently included, similar measurements were made upwind of each test site. We also took a series of measurements at each well to locate sources and measure emission rates.
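
In its simplest form, the upwind readings provide a background level that is subtracted from the downwind readings. The Python sketch below shows only that background-correction step, with invented concentrations; the actual study converted the measured excess into an emission rate using calibrated analysers and plume-dispersion calculations.

    # Illustrative sketch only: subtract an upwind (background) methane level
    # from downwind traverse readings to isolate the site's contribution.
    # Concentrations are invented for demonstration.
    import numpy as np

    upwind_ppm = np.array([1.82, 1.81, 1.83, 1.82])          # background passes
    downwind_ppm = np.array([1.95, 2.10, 2.40, 2.05, 1.90])  # passes through the plume

    background = upwind_ppm.mean()
    excess = downwind_ppm - background    # methane attributable to the site

    print(f"Background:  {background:.2f} ppm")
    print(f"Peak excess: {excess.max():.2f} ppm above background")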

The methane-detecting 4WD and its equipment

Why worry about fugitive emissions?

Fugitive emissions occur when methane escapes from production facilities, wells, pipes, compressors and other equipment associated with coal mining or natural gas extraction. Other human-induced methane emissions come from grazing of domestic stock, agricultural production and landfills.

In nature, methane is released from geological sources and biological processes occurring in wetlands, swamps, rivers and dams. About 15% of human emissions of methane are derived from fossil fuels.

While burning gas for energy has lower greenhouse gas emissions compared to other fossil fuels like coal, methane has a global warming impact at least 25 times that of carbon dioxide (when measured over a 100 year period).

Even small losses of methane during gas production, processing and distribution have the potential to reduce the relative greenhouse benefit of natural gas as a fuel for electricity production.
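
To put that in perspective, the short calculation below converts the study’s typical upper leak rate of 3 grams of methane per minute into tonnes of CO2-equivalent per year, using the 100-year warming factor of 25 quoted above. It is illustrative arithmetic based only on figures already in this article.

    # Illustrative arithmetic using figures quoted in this article
    leak_rate_g_per_min = 3.0    # typical upper leak rate from the study
    gwp_methane = 25             # 100-year global warming potential vs CO2

    minutes_per_year = 60 * 24 * 365
    methane_tonnes = leak_rate_g_per_min * minutes_per_year / 1e6
    co2e_tonnes = methane_tonnes * gwp_methane

    print(f"Methane leaked: {methane_tonnes:.2f} tonnes/year")   # about 1.6 t
    print(f"CO2-equivalent: {co2e_tonnes:.1f} tonnes/year")      # about 39 t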

Fugitive emissions can be costly for the coal seam gas industry because escaping gas represents a loss of a valuable commodity.

What’s next for CSG emissions research?

These new findings from 43 wells are a good start, but they are clearly only the beginning, given that they represent fewer than 1% of Australia’s coal seam gas wells. More measurements are required to representatively sample the remaining 99% of wells before we can make definitive statements about fugitive methane emissions in Australia.

CSIRO scientists, through the Gas Industry Social & Environmental Research Alliance (GISERA), are undertaking further research into methane emissions in Australia including understanding the natural or background emissions of methane that come from seeps in the ground in Queensland’s Surat Basin.

This research aims to identify background sources of methane and determine the best detection and measurement methods.

Results from measuring naturally occurring methane seepage, as well as the results of this new report and others, will add to the bigger picture of assessing the coal seam gas industry’s whole-of-life-cycle greenhouse gas emission footprint. Most importantly, we hope they will provide more answers to Australians’ questions about coal seam gas.

This article was originally published on The Conversation.
Read the original article.

