Friday 26 April 2013

To intervene, or not to intervene, that is the question.


Although collecting quality data is an important first step in the fight against invasive species, known data alone is not always sufficient to make tough management decisions.

Recent research1 has shown that taking account of the information gap between the known and the unknown could be an important tool in deciding between conventional and new methods for dealing with invasive species. Specifically, the info-gap approach can make it easier to assess the potential of techniques that have not yet been tested in the field.



The principle behind the technique lies in using the info-gap model to map the disparity between known and unknown, whilst defining acceptable levels of outcome, which in turn allows the most robust course of action to be identified.

The research1 looked in particular at the apple moth (native to Australia) and its impacts in California. Here the known is the financial impact of the moth if no intervention is taken (i.e. the status quo), while the unknown is the outcome of intervention (i.e. eradication). The study found that if decision makers wanted an acceptable level of economic loss below $1.4 billion, then the preference should be for eradication.
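The underlying info-gap logic can be sketched with toy numbers. In this minimal illustration, each strategy has a nominal loss estimate and an uncertainty weight, and a strategy's robustness is the largest uncertainty horizon at which its worst-case loss still meets the decision makers' loss ceiling. All figures below are invented for illustration; they are not taken from the study.

```python
# Toy info-gap robustness comparison (illustrative numbers only).
# Worst-case loss at uncertainty horizon alpha: L0 * (1 + w * alpha).
# Robustness = largest alpha at which worst-case loss <= critical_loss.

def robustness(nominal_loss, uncertainty_weight, critical_loss):
    """Largest uncertainty horizon alpha with worst-case loss <= critical_loss."""
    if nominal_loss > critical_loss:
        return 0.0  # fails the target even with zero uncertainty
    return (critical_loss / nominal_loss - 1) / uncertainty_weight

# Hypothetical figures (billions of dollars):
strategies = {
    "status quo":  {"nominal": 1.2, "weight": 0.8},  # impact well understood
    "eradication": {"nominal": 0.5, "weight": 1.5},  # cheaper, far less certain
}

Lc = 1.4  # acceptable loss ceiling
for name, s in strategies.items():
    print(name, round(robustness(s["nominal"], s["weight"], Lc), 2))
```

With these made-up figures eradication tolerates more uncertainty before breaching the $1.4 billion ceiling, mirroring the direction of the study's conclusion.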

However, in order to be certain about the known, this type of approach relies on data being collected and stored within a quality framework. If this is not the case, the question becomes: are existing data processes widening the info-gap beyond where it needs to be? If data can reasonably be known, then collecting and storing it in a quality way must surely be a vital first step in the decision-making process.


Sources
http://ec.europa.eu/environment/integration/research/newsalert/pdf/325na2.pdf

Photo
pmarkham Flickr photostream (http://creativecommons.org/licenses/by-sa/2.0/)

Friday 19 April 2013

Global Data Platform Helps Drive Research


One aspect key to the continued growth in use of data from the GBIF platform is trust in the data made available. Ultimately that responsibility lies with the original data collectors and authors.

The number of published studies utilising GBIF data has increased from 52 in 2008 to over 230 in 2012. The platform works through a series of national nodes, which submit the data they collect nationally to GBIF. In this way GBIF acts as a kind of global records centre, with data rights remaining with the original organisation.




One recent (October 2012) national node addition is Brazil. Through the Biodiversity Information System on Brazil (www.sibbr.org), an assessment of the country's national capability and infrastructure is being undertaken. The assessment is reported to encompass over 200 organisations. Given Brazil’s relative importance to global biodiversity, such a pragmatic approach is perhaps reassuring.

In order for such a system of global data to facilitate quality research, each author needs to deliver quality data in the first place. Given the need for a vast range of organisations (from research institutions through to citizen scientists) to collect data to fill gaps in our current knowledge, this may seem daunting. Considerations include accuracy of collection, the possibility of data loss during processing, taxonomic data labelling, raw data labelling, metadata collection, reuse and longevity. Specialist software can help organisations achieve quality in these areas, but only if the software is tailored to the users' needs.
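Some of the considerations above can be automated as simple record-level checks. The sketch below illustrates the idea; the field names and rules are invented for illustration and are not GBIF's actual schema.

```python
# Minimal record-level quality checks for an occurrence record.
# Field names and validation rules are hypothetical.

REQUIRED_FIELDS = {"scientific_name", "date", "latitude", "longitude", "recorder"}

def quality_issues(record):
    """Return a list of quality problems found in one occurrence record."""
    issues = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is not None and not -90 <= lat <= 90:
        issues.append("latitude out of range")
    if lon is not None and not -180 <= lon <= 180:
        issues.append("longitude out of range")
    return issues

record = {"scientific_name": "Arenicola marina", "date": "2013-04-19",
          "latitude": 95.0, "longitude": 151.2}
print(quality_issues(record))
```

Checks like these flag problems at the point of entry, long before a record reaches a global aggregator where the context needed to fix it may have been lost.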

We are constantly developing our software to meet our simple mission: to make quality data efficient. To do this we need feedback from you, the biodiversity community.

So which parts of your data processes do you find frustrating? Is there anything you think could be automated to make your life easier?


Sources

Photo
NASA Goddard Photo and Video - Flickr collection (http://www.flickr.com/photos/gsfc/6012329930/)

Friday 12 April 2013

Are We Ready for a Deluge of Deep Sea Data?


Is the marine biology community ready for what seems like an inevitable deluge of data on deep sea species?

A recent partnership between James Cameron’s DEEPSEA CHALLENGE and Woods Hole Oceanographic Institution1 is set to increase the scientific community’s ability to gather data on deep sea species. Using advanced technology developed by Cameron’s team, marine biologists around the world are surely about to generate a wealth of data.



The release of an App Store app by the World Register of Deep Sea Species2 is perhaps a further indication of what is to come. The 20,000-plus species currently described may be just a drop in the ocean. Our own experience is that over half of the species recorded in deep sea projects tend to be unknown, a phenomenon commonly reported from surveys3 around the globe.

Considering it can take years before the taxonomic community accepts a species description, are current data processes up to the task? Using specialist software to store species records and link them to identification aids, such as photos and videos, facilitates easy and stable storage of this precious data. Please visit www.thomsonecologysoftware.com for more information.
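The record-to-media link described above can be sketched as a simple relational structure. The table and field names below are invented for illustration; they do not reflect any particular product's schema.

```python
# Toy sketch: occurrence records linked to identification media,
# so a provisional species keeps its photos/videos until formally described.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE record (id INTEGER PRIMARY KEY, provisional_name TEXT, depth_m REAL);
CREATE TABLE media  (id INTEGER PRIMARY KEY,
                     record_id INTEGER REFERENCES record(id),
                     kind TEXT, path TEXT);
""")
db.execute("INSERT INTO record VALUES (1, 'Polychaeta sp. A', 4200)")
db.execute("INSERT INTO media VALUES (1, 1, 'photo', 'img/sp_a_dorsal.jpg')")
db.execute("INSERT INTO media VALUES (2, 1, 'video', 'vid/sp_a_live.mp4')")

# Pull each provisional species together with its identification aids.
rows = db.execute("""SELECT r.provisional_name, m.kind, m.path
                     FROM record r JOIN media m ON m.record_id = r.id""").fetchall()
print(rows)
```

Keeping media attached to the record, rather than loose on a drive, means the evidence for a provisional identification survives staff turnover and the years a formal description can take.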


Links

Photo - NOAA, Ocean Explorer (Flickr Account)

Friday 5 April 2013

Efficient Marine Research Aids Marine Protection


Efficient marine research, utilising specialist software, has helped to increase knowledge of the UK’s marine environment. Two articles published by CEFAS (http://www.cefas.defra.gov.uk/) have increased understanding of productivity levels and environmental status, both of which can aid marine planning and inform compliance with EU legislation.



The first presents estimates of ‘Macrofaunal production along the UK continental shelf’1. The study found that ‘...on average annelids contributed an overwhelming majority of the total production, with different regions varying in the relative contributions from other phyla such as molluscs, crustaceans and echinoderms.’

The second, ‘SPI-ing on the seafloor: characterising benthic systems with traditional and in situ observations’2, utilised Sediment Profile Imagery (SPI) to obtain a clearer in situ understanding of the relationship between species and sediment, thereby avoiding the homogenisation associated with traditional sampling methods. The study focused on two areas of the North Sea, the Dogger Bank and the Oyster Grounds, as they are representative of larger areas and are among the most intensively exploited marine areas in the world.

In both cases the underlying research benefited from adopting the specialist software UNICORN. This facilitated efficient data handling and ‘…aided standardization of the outputted abundance and biomass data..’1,2 at ‘…appropriate taxonomic levels for numerical analyses..’1.

A number of organisations already use UNICORN, including government agencies. More information can be found on our website (www.thomsonecologysoftware.com/unicorn).

References
2 http://dx.doi.org/10.1007/s10533-012-9811-3

Photo - NASA Goddard Photo and Video (Flickr Account)