Friday 20 September 2013

Should we speed up taxonomy?

The extreme pressures facing biodiversity on our planet are well documented, and our chance to discover new species is likely diminishing on a daily basis. Should we speed up taxonomy to increase description rates?



It has recently been reported1 that the average time between discovery and formal scientific description of a new species is 21 years. The same report also mentions an example where the timeline was significantly longer, at over 100 years. I think most would agree that even 21 years seems like a long time. One thing is certain: we cannot afford to lose any of the quality offered by traditional taxonomy. The question is, can and should we speed up the process?

Seemingly the technology may already exist to speed up the process, and potentially even to increase quality at the same time. A recent article2 demonstrates the use of scanning electron microscopy (SEM), a technique used extensively in other scientific disciplines. Specifically, it is the detailed SEM image, which can be rotated and examined through 360° in one plane, that is so promising, providing scientists with exact images of the original specimen and an unlimited choice of viewing angles. Furthermore, the technology required to view SEM images is widely accessible1.

Is it too much to imagine a scientist finding a new species, collecting data (including DNA) and capturing SEM images in the field, then instantly sharing those images with project partners and/or the public to help identify and describe it? Could such imaging techniques be the first step towards speeding up taxonomy? Would computer-aided recognition of images help identification?


Sources



Photo
cuatrok77 Flickr account - CC by 2.0

Friday 23 August 2013

Have we solved the world’s biggest garbage problem?

Although it is widely known that our biggest garbage dumps are in fact in the oceans, only recently has serious consideration been given to solving the problem. We take a quick look at the why and the how.



For a long time we have been aware that marine litter is harmful to biodiversity and to us, primarily because plastic does not biodegrade and finds its way into the food chain. Further, the tiny particles of plastic which do enter the food chain can soak up toxic chemicals, compounding the risk1.

It seems policy makers have woken up to these very real problems, and at Rio in 2012 a commitment to a “significant reduction” in marine litter by the year 2025 was made2. In Europe, marine litter has been recognised as a main threat to achieving ‘good environmental status’ by 20202. So the stage seems set for real momentum to tackle this big issue, but how might we do that?

Well, a young Dutch student thinks he may have the solution. Boyan Slat, a 19-year-old aerospace engineering student, has come up with the Ocean Cleanup Array3. The idea involves anchoring ocean sifters to the seabed and letting ocean currents do the rest, with the aim of filtering out 7.25 million tons of plastic over a five-year period4. However, some potential issues have already been raised, including the hazard to marine biodiversity.

At the end of May we blogged about the Protei project5, another potential solution to the ocean garbage problem. The idea here is to combine a shape-shifting sailing robot with the power of open-source technology, essentially meaning that anyone can pick up the knowledge from the project and apply it to any suitable problem, including ocean garbage.

What do you think of the Ocean Cleanup Array? Do you think sailing robots might provide the answer? Are there any other potential solutions out there?

Sources

Photo
NOAA's National Ocean Service Flickr Account - CC by 2.0 

Friday 16 August 2013

Are we already at the dawn of continuous biodiversity data?

In my May 31st blog, I wrote that the stage seemed to be set for continuous biodiversity data. The first rays of light from this new dawn may already be here.



A recent article1 highlights how technology is already being used to further real-time monitoring of biodiversity. The article describes the Automated Remote Biodiversity Monitoring Network (ARBIMON) and its role in providing decision makers with real-time biodiversity data. Two things are particularly fascinating about this methodology: 1) data can be collected in real time across wide areas; and 2) data can be collected in real time across a wide breadth of biodiversity.

The ARBIMON project is a fantastic start, using mostly off-the-shelf products. Hopefully the best is yet to come, with others picking up these techniques, refining them and finding ways to make the process even cheaper and easier. Just look at the history of most technologies, which start as novel techniques only to be refined and have their costs driven down. After all, the incentive to have real-time data to inform the management of biodiversity has never been greater, both in terms of the need for the data and the reduction in funds available to collect and process it.

Other real-time data projects, for example sense-t2, are building on academic research and focusing on commercial challenges, with applications in aquaculture, agriculture, forestry and water. Even indirectly, increasing the efficiency of existing operations in these industries may help to alleviate future pressures on biodiversity. The hope, however, is that commercial refinement will drive technology costs down further, increasing the potential for its use in biodiversity monitoring across the globe.

Are you aware of projects which might help the real time biodiversity data cause? If so, what are they and what are they doing? If not, what would you like to see happen?


Sources


Photo
lowjumpingfrog Flickr Account - Creative Commons Attribution 2.0 Generic

Friday 9 August 2013

The power to positively influence development?

The power of the citizen scientist has never been greater. But how great is this power, and can it even extend to influencing positive development practices?



A recent Pensoft article1 highlighted the scientific impacts that citizens and technology are achieving, specifically the impact that geo-referenced photographs, uploaded to online data stores, are having in the scientific world. Not least among these is the confirmation of the existence of an endangered species, fifty years after its first description. But how can increased biodiversity data help facilitate positive impacts for development?

Well, the majority of developments begin with extensive project planning. Usually this involves choosing between several options, which requires large amounts of data. Decisions can be based on a range of factors, including biodiversity impact. The best developers know that by minimising their impact during the planning stage, they can prevent the destruction of important biodiversity and save time and money on expensive remediation and mitigation further down the line.

So how can you help? In order to make the best environmental decisions, everyone involved in developments needs access to the latest biodiversity data. Citizens taking geo-referenced photographs and sharing them through online data stores and record centres are helping to facilitate this like never before.
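For the technically curious, here is a minimal sketch in Python of how the coordinates inside a geo-referenced photograph can be recovered. It assumes the Pillow library and a JPEG that actually carries GPS EXIF tags, and the file name is purely illustrative.

    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def extract_gps(path):
        """Return (latitude, longitude) in decimal degrees from a geo-tagged JPEG, or None."""
        exif = Image.open(path)._getexif() or {}
        gps_raw = next((v for k, v in exif.items() if TAGS.get(k) == "GPSInfo"), None)
        if not gps_raw:
            return None
        gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

        def to_degrees(dms, ref):
            # Degrees, minutes and seconds may arrive as rational (numerator, denominator) pairs
            d, m, s = [x[0] / x[1] if isinstance(x, tuple) else float(x) for x in dms]
            decimal = d + m / 60.0 + s / 3600.0
            return -decimal if ref in ("S", "W") else decimal

        return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
                to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

    print(extract_gps("observation.jpg"))  # hypothetical file name

A pair of decimal coordinates like this, attached to a dated photograph, is already most of a usable biodiversity record.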

Source


Photo
Alison Christine Flickr Account - Creative Commons Attribution 2.0 Generic  

Friday 2 August 2013

Is conservation truly engaging?

Simply put, conservation efforts depend on winning the hearts and minds of a community, whether local or global. Are the tools available being used effectively?



Is a photograph still worth a thousand words? It is often said that a visual image delivers a much clearer message to the majority of people than words alone. Certainly the power of the photograph has long been a mainstay of the charity world for invoking emotion, and you could argue there is no better medium for raising instant awareness of, and empathy for, a cause. The International League of Conservation Photographers1 is an example of an organisation that clearly believes this. By linking with scientists and NGOs, its photographers aim to further environmental and cultural conservation. But are photographs being utilised effectively, and can conservationists learn anything from politics?

During the 2012 American presidential election, social media played a vital role in securing Obama his second term2. Not surprising when you consider one study3 showing that young people are twice as likely to vote if they are politically active online, and another showing that 39% of all American adults participate in online political activism. OK, so you might ask what politics has to do with conservation. When you consider that both politics and conservation share the same ultimate goal, to prompt action in individuals and communities, then perhaps it is worth taking note.

Is the photograph still vital to conservation? Is social media being used as effectively as it should be? Is conservation truly engaging?

Our annual photography competition is currently open to amateur photographers, so feel free to check it out: http://www.thomsonecology.com/

Sources


Photo
MapBox Flickr Account - Creative Commons Attribution 2.0 Generic

Tuesday 23 July 2013

When creation is simply not enough

As the social and policy pendulums swing, our environment is being subjected to an increasing number of performance targets. But are we equipped to truly help our environments meet these targets?



The desire to reverse impacts on our habitats and ecosystems has resulted in legislation across many parts of the world. For example, in the European Union (EU) there is the Water Framework Directive, requiring ‘Good Ecological Status’ of rivers by 2015. Targets of this nature inevitably lead to: 1) establishing current status baselines; 2) remediation and restoration works; and 3) evaluation of efforts.

In the area of rivers, the EU IMPACT project1 is hoping to inform effective habitat restoration. From this project comes a free and open-source fish dispersal model (FIDIMO2), allowing the likelihood of fish (re-)colonising restored or remediated habitats to be calculated. This kind of tool and information should help deliver more effective river remediation and restoration efforts, at least for fish.
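To give a flavour of the general idea behind dispersal models of this kind, here is a toy sketch, not FIDIMO's actual implementation, with purely illustrative parameter values: the probability of a fish turning up at a given distance from its starting point is modelled as a mixture of a 'stationary' and a 'mobile' component.

    import math

    def dispersal_probability(distance_m, sigma_stationary=50.0, sigma_mobile=2000.0, p_mobile=0.33):
        """Probability density of a fish being found at a given distance from its release point,
        modelled as a mixture of a 'stationary' and a 'mobile' normal component."""
        def normal_pdf(x, sigma):
            return math.exp(-x**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
        return (1 - p_mobile) * normal_pdf(distance_m, sigma_stationary) + \
               p_mobile * normal_pdf(distance_m, sigma_mobile)

    # Relative likelihood of (re-)colonisation at increasing distances from a restored reach
    for d in (0, 100, 500, 1000, 5000):
        print(f"{d:>5} m: {dispersal_probability(d):.6f}")

In practice such kernels are applied along the river network rather than as straight-line distances, which is where tools like FIDIMO add real value.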

But, do we have enough knowledge to deliver effective remediation and restoration work in the first place? If not, what other tools and models are available to help? What more can and should be done?


Sources


Photo
Peter aka anemoneprojectors Flickr Account - Creative Commons

Tuesday 16 July 2013

Agreement to Help Secure Valuable Global Resource

The World Register of Marine Species (WoRMS, www.marinespecies.org) is used by over 80,000 marine biologists from across the world every month and represents the only expert-validated list of global marine life. A recent agreement will help secure funding for this valuable global resource.



The agreement means that royalties will be paid to the organisation behind WoRMS, the Society for the Management of Electronic Biodiversity Data (www.smebd.eu), from every sale of Thomson Ecology’s software tools. Mark Costello, WoRMS Steering Committee member and former Chair, who signed the agreement on behalf of WoRMS, explains the benefits:
“We believe that making the WoRMS list available, and regularly updated, through software tools in the marine field will have a positive impact on the quality of scientific reporting. WoRMS content is already freely available to anybody from the website, but Thomson Ecology provide added value by incorporating it within their desktop software for users. The income generated for WoRMS by Thomson Ecology will help us to improve the quality, scope and sustainability of the database, in service of the scientific community.”

Tom Gardiner, senior product manager for Thomson Ecology, explains the benefits to users:
“The WoRMS list has become an industry standard since its creation, and our users have increasingly fed this back to us. Incorporating the list into our software means users around the world can now easily increase the value of their data asset. Our software tools have always helped increase efficiency, and now users can benefit from reports and outputs being aligned to WoRMS.”
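As a rough illustration of what aligning a species list to WoRMS can involve, the sketch below matches a recorded name against the WoRMS web service. The endpoint and parameter names reflect my reading of the public REST interface rather than anything Thomson Ecology's tools actually do, so treat them as assumptions and check the current WoRMS documentation.

    import json
    import urllib.parse
    import urllib.request

    def worms_match(name):
        """Look up a species name against the WoRMS web service and return candidate records."""
        encoded = urllib.parse.quote(name)
        url = f"http://www.marinespecies.org/rest/AphiaRecordsByName/{encoded}?like=false&marine_only=true"
        with urllib.request.urlopen(url) as response:
            return json.load(response)

    # Print the identifier, accepted spelling and status for each candidate match
    for record in worms_match("Carcinus maenas"):
        print(record.get("AphiaID"), record.get("scientificname"), record.get("status"))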

Thomson Ecology will now begin work to integrate the WoRMS list into its existing software tools, TREx and Unicorn. To celebrate the agreement, Thomson Ecology is offering vouchers to purchase the new version of TREx for £50 (+VAT) when it is released in a couple of months (see promotion details). This offer is open until the end of August 2013.

To request any voucher(s) you can email your requirements:
software@thomsonecologysoftware.com


Sources

WoRMS News Article
Thomson Ecology TREx Promotion

Photo
NOAA's National Ocean Service Flickr Account (Creative Commons)

Friday 5 July 2013

The key element of effective marine planning and management...

One thing seems certain: there is room for improvement in our current planning and management of the marine environment. How can we, as scientists, help achieve this?



At a recent coastal and marine management conference1 in London, a general theme became clear: our current marine planning and management regimes can and should be improved. This came right across the board, from government and industry to NGOs. What also became clear to me, as a trained scientist, was that science itself had been part of the problem. How can these issues be overcome? What is the key?

Communication. I believe it is that simple. Specifically, I am referring to communication which is clear, simple and above all suitable for the audience. Take the failed 1998 NASA Mars Climate Orbiter2 mission. According to the Investigation Board, the failure came down to a lack of clarity over whether metric or imperial units were being used. Cue a gigantic amount of money and time being lost. The problem is, with the marine environment we do not necessarily have this time or money to waste.
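To see how quickly such a mix-up compounds, here is a back-of-the-envelope illustration; the impulse figure is invented for the example, and only the pound-force to newton conversion is real.

    LBF_TO_NEWTON = 4.44822              # 1 pound-force expressed in newtons

    impulse_lbf_s = 10.0                 # figure produced by one team, in pound-force seconds
    impulse_read_as_n_s = impulse_lbf_s  # the same number, silently read as newton seconds

    true_impulse_n_s = impulse_lbf_s * LBF_TO_NEWTON
    error_factor = true_impulse_n_s / impulse_read_as_n_s
    print(f"Understated by a factor of {error_factor:.2f}")   # ~4.45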

Areas that seem to require effective communication are likely to include: 1) scientists helping to define the policy questions to answer; 2) presenting information to policy makers; 3) achieving a truly interdisciplinary approach; and 4) engagement with stakeholders. All these areas require clear and simple communication to facilitate realistic and deliverable outcomes, which will ultimately help achieve effective marine planning and management. As Einstein once said, “Everything should be made as simple as possible, but no simpler.”

Further, effective communication can deliver the kind of truly interdisciplinary projects needed in this world of big data and rapid climate change. All may not be lost from the past, however. A wealth of data already exists from past projects; all we need to do now is make it relevant and up to date, and add value by linking disciplines and their data together.

Do you agree that communication is key? Are there any other key issues en route to achieving effective marine planning and management?


Sources


Photos
USFWS Endangered Species Flickr - Creative Commons 2.0 Generic

Friday 28 June 2013

Is biodiversity truly part of sustainability planning?

A new tool to put sustainability firmly in the urban planning process seems to have omitted a crucial element of sustainability: biodiversity. Is this a result of us not fully understanding our impacts?



As part of the EU BRIDGE1 project, the Decision Support System2 tool aims to help urban planners integrate sustainability into strategic urban planning. The system works by modelling the flows of energy and materials (e.g. water, energy and pollutants) between the urban environment and its surroundings, providing planners with valuable insight into the potential impacts of planning options.

Tools which help put sustainability at the heart of urban planning are to be welcomed; what is not clear is whether this particular tool takes account of biodiversity per se. Although, in introducing the tool, the team establish a clear comparison between a ‘city’ and a ‘natural ecosystem’, has the interaction of biodiversity been accounted for? At one end this might be the movement and interaction of pets, and at the other, potential impacts on the physical and genetic movement of biodiversity.

Including biodiversity may not require any significant further investment, rather a coupling of tools and clearly combined objectives. For example, greening the urban environment can have many benefits, such as urban cooling, pollution control and biodiversity infrastructure. Without biodiversity being part of this picture, we may be missing opportunities.

Further, with technology, and therefore data collection, becoming ever cheaper and more ‘citizen friendly’, and with the advent of the ‘internet of things’, monitoring our own impacts and those of planning decisions is becoming easier. A recent example of this is the BBC's documentary on Life of Cats3, a project which used tiny cameras and tracking devices to watch and analyse how people’s pet cats interacted with the urban and surrounding environments. Such use of technology will surely help close the information gap, and perhaps facilitate biodiversity truly being a part of urban sustainability.

Do we truly understand our impacts on biodiversity as a result of the urban environment? If not, is this preventing modelling of our impacts on biodiversity? How can we use advancements in technology to help?


Sources


Photo
d.boyd Flickr Account - Creative Commons

Friday 21 June 2013

Severn Barrage inquiry finds too much environmental uncertainty

The recently revised Severn Barrage proposals raise questions over their potential environmental impact. Will power generation in the Severn ever be economically and environmentally acceptable?



An inquiry report1 by the UK Energy and Climate Change Committee found that the environmental impacts of the revised Severn Barrage tidal power scheme were currently too unclear. The report1 states that “further research, data and modelling will be needed before environmental impacts can be determined with any certainty”, raising particular concern over the need for compensatory habitat on “an unprecedented scale” and “casting doubt” on the project being compliant.

Lessons from across the globe point towards the need for a cautious, evidence-based approach. During the inquiry2, examples from La Rance (France) and the Bay of Fundy (Canada) were both referenced. It is the Canadian example which seems to bear the closest similarity to the conditions of the Severn Estuary, with the ecological impacts cited from Canada including fish mortality and habitat degradation and loss. However, both the technology used for power generation and the knowledge available have since advanced.

Can lessons be learnt, and if so, will it be enough to make the Severn proposals compliant and acceptable? One of the original Severn Barrage reports3 quoted an intertidal habitat replacement cost of £65,000 per hectare. The latest proposal, from Hafren Power4, estimates that less than 5,000 hectares of habitat will be directly lost. Looking only at this direct habitat loss, and on a like-for-like (1:1) basis, the replacement cost could reach £325 million.
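That £325 million figure is simple arithmetic on the two numbers quoted above, as the quick check below shows; both inputs are estimates from the cited reports, and the 1:1 ratio is the simplifying assumption.

    cost_per_hectare_gbp = 65_000    # intertidal habitat replacement cost from report 3
    hectares_lost = 5_000            # upper estimate of direct habitat loss, Hafren Power proposal
    replacement_ratio = 1            # like-for-like (1:1) compensation

    total_cost_gbp = cost_per_hectare_gbp * hectares_lost * replacement_ratio
    print(f"£{total_cost_gbp:,}")    # £325,000,000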

However, if existing habitat could be acceptably modified, perhaps the issue of habitat loss might not be so great. The latest business case4 from Hafren Power seems to suggest some novel approaches to try to address this issue, such as proposals to raise any potentially impacted habitats and better control of water movement in the estuary.

Whether the project ever becomes environmentally viable remains to be seen, but certainly robust data sets and evidence must be used to ensure environmental impacts are accounted for and mitigated. Can the economic and environmental considerations ever be balanced?


Sources


3 -http://www.wyeuskfoundation.org/Severnbarrage/downloads/ABP%20mer%20Sever


Photo
Dave Hamster Flickr Creative Commons

Friday 14 June 2013

Are we ready for an increase in marine mammal acoustic surveys?

Monitoring of marine mammals is not new, but it is new in the context of the offshore renewables industry. Are the technology and tools ready for the likely increase in this type of data?



In Europe, underwater noise is covered under Descriptor 11 of the Marine Strategy Framework Directive (MSFD)1, which aims to bring underwater noise down to acceptable levels by 2020. As a result, Germany has set strict limits on the maximum noise levels permitted during offshore wind farm development activities. It is anticipated that other European countries which have not yet set such limits, including the UK, may do so.

Given this new industry context, and the importance of the issue to policy makers, an increase in surveys and data is to be expected. But do we have well-developed, easy-to-use applications and tools to help a wide audience conduct such surveys and analyse the resulting data?

A similar area which has seen such developments is the survey and identification of bat species. One project, iBats2, has developed a free application to help identify the species behind the calls recorded. Although the accuracy of such applications can vary, the iBats project has attempted to capture community feedback. In the marine world, a competition3 run on the Kaggle platform has helped improve the accuracy of Cornell’s whale detection model from 72% to 98%.
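For readers wondering what sits behind tools like these, here is a minimal sketch of an acoustic classification pipeline: summarise each clip as spectrogram features and train a classifier on labelled examples. The file names and labels are hypothetical, and real detectors (including the Kaggle entries) are considerably more sophisticated.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram
    from sklearn.ensemble import RandomForestClassifier

    def clip_features(path, n_bands=64):
        """Summarise a WAV clip as mean log-energy per frequency band."""
        rate, samples = wavfile.read(path)
        if samples.ndim > 1:                      # mix stereo down to mono
            samples = samples.mean(axis=1)
        freqs, times, sxx = spectrogram(samples, fs=rate, nperseg=2 * n_bands)
        return np.log1p(sxx).mean(axis=1)[:n_bands]

    # Hypothetical training data: file paths plus 1 = call present, 0 = background noise
    train_files = ["call_01.wav", "call_02.wav", "noise_01.wav", "noise_02.wav"]
    train_labels = [1, 1, 0, 0]

    X = np.array([clip_features(f) for f in train_files])
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, train_labels)

    print(model.predict([clip_features("new_recording.wav")]))  # 1 if a call is detected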

Is there work going on which will help provide tools to analyse marine acoustic data? Do they already exist?

Sources
3 - http://marinexplore.org/blog/the-kaggle-challenge-improves-cornells-whale-detection-model-to-98/

Photo
Tolomea Flickr Account - Creative Commons 2.0 Licence

Friday 7 June 2013

Online Collaboration to Aid Sponge Taxonomy

A new online collaboration aims to speed up and strengthen the taxonomic naming of sponges. This is just one example of how the internet can help increase global scientific output.

Sponges play a vital role in our marine ecosystems, principally by acting as water cleaners. Their taxonomy has traditionally been challenging, partly because of the difficulty of bringing together all the data required to accurately check and name species.



The SpongeMaps tool is an online collaboration which aims to quicken the delivery of data and provide a focal point for stronger formal naming and release of taxonomic data. Specifically, the tool brings a variety of data, including morphology, images, geo-referenced records, chemical structures and molecular barcodes, into one place for experts and public alike.

Projects of this kind, which aim to speed up taxonomic processes, are important to our ability to accurately describe and document biodiversity in the long term, primarily because taxonomy makes it possible to identify species quickly on an ongoing basis, but also because a name acts as an anchor for other relevant information about that species.

Behind the SpongeMaps tool, the taxonomic lexicon is underpinned by the World Porifera Database, which in turn forms part of the World Register of Marine Species (WoRMS). Authoritative and comprehensive registers of species names like these are vital in establishing a common standard for using taxonomic data across the globe.

As part of our commitment to, and belief in, global taxonomic standards, our team are delighted to announce an agreement with WoRMS to incorporate the global list of marine species into our data tools (see announcement).


Sources
Global Names Project

Photo
Phillipe Guilliame Flickr Account - Creative Commons Licence

Friday 31 May 2013

Are we approaching the dawn of continuous biodiversity data?

With technology becoming ever cheaper and more powerful, a movement towards open hardware, and increasing application in other disciplines, are we heading for a continuous stream of biodiversity data?

The stage does seem to be set to make use of technological advances. Three main elements may have contributed: 1) there is a well-established movement towards ecosystem- and community-level assessments at the policy level; 2) there is an ever-increasing need, and gap, for real-time data to inform policy, planning and management decisions; and 3) governments and monitoring authorities are looking for ever more ways to be resource-efficient.



This type of approach is beginning to be used for other environmental data needs. One such project, currently looking for Kickstarter funding to develop further, is the Smart Citizen1 project. The aim is that, through cheap hardware, citizens will deploy environmental monitoring devices across the globe, with the collected data sent to online platforms for viewing, analysis and sharing.
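The core pattern is simple: a cheap device takes a reading and posts it to a web platform. The sketch below shows that pattern using only Python's standard library; the endpoint URL, token and field names are hypothetical, not the Smart Citizen API.

    import json
    import time
    import urllib.request

    PLATFORM_URL = "https://example.org/api/v1/readings"   # hypothetical endpoint
    DEVICE_TOKEN = "replace-with-your-device-token"        # hypothetical credential

    def post_reading(temperature_c, humidity_pct, noise_db):
        """Send one timestamped sensor reading to the platform and return the HTTP status."""
        payload = json.dumps({
            "recorded_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "temperature_c": temperature_c,
            "humidity_pct": humidity_pct,
            "noise_db": noise_db,
        }).encode("utf-8")
        request = urllib.request.Request(
            PLATFORM_URL,
            data=payload,
            headers={"Content-Type": "application/json",
                     "Authorization": f"Bearer {DEVICE_TOKEN}"},
        )
        with urllib.request.urlopen(request) as response:
            return response.status

    # e.g. called every few minutes from a loop running on the device
    # post_reading(temperature_c=18.4, humidity_pct=71.0, noise_db=42.5)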

Another project, which also received funding through the Kickstarter platform, is the Protei2 sailing robot. Originally designed in response to the Gulf of Mexico oil spill, the robot could, its team hopes, be applied in fisheries, nature reserve, coral reef and algal bloom monitoring.

Interestingly, both the Protei and Smart Citizen projects have adopted an open-hardware philosophy, essentially meaning that anyone and everyone has access to the plans and detailed specifications for the project hardware. This allows others to view, and innovate on, existing advances and knowledge.

There have been developments in the area of biodiversity citizen science; however, these have relied on making the human collection of data more accessible to all citizens. Now and in the future we have a chance to develop continuous data streams, helping us fill knowledge gaps and tackle some of the biggest biodiversity challenges we all face.

Our team are currently working with partners on some of the many elements needed to make this concept a reality. However, there is still lots to do, and we are always interested in talking to researchers and organisations who would like to collaborate and help drive innovation.


 Links
1 (Smart Citizen) - http://www.kickstarter.com/projects/acrobotic/the-smart-citizen-kit-crowdsourced-environmental-m?ref=category


Photo
Under Creative Commons Licence - http://www.flickr.com/photos/gabriellalevine

Friday 24 May 2013

Will the advent of open online data publishing lead to advances in science?


Having platforms available for publishing quality scientific data, whatever the discipline, is clearly a good first step. But in order to truly advance science, do we need a wealth of multi-disciplinary data which is easily cross-linked and reused?




Three particular open data sources come to mind, although I am sure there are more: DRYAD, figshare and the Global Biodiversity Information Facility (GBIF). Each of the three is different, leading to the potential problems of data silos and of scientists overlapping and splitting between platforms. However, wherever you choose to publish your data, the principles of quality data remain the same.

What would be useful, if not already available, is a ‘super platform’ which brings together access to all these online datasets, wherever they are held. That way data could easily be gathered across disciplines, and from a range of scientists within each discipline.
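Programmatic access is what would make such a super platform feasible, and some of the building blocks already exist. As a sketch, the snippet below pulls occurrence records from GBIF's public web service; the endpoint and field names reflect my understanding of the v1 API, so check the current GBIF documentation before relying on them.

    import json
    import urllib.parse
    import urllib.request

    def gbif_occurrences(scientific_name, limit=5):
        """Query GBIF's occurrence search for a species and return the parsed JSON response."""
        params = urllib.parse.urlencode({"scientificName": scientific_name, "limit": limit})
        url = f"https://api.gbif.org/v1/occurrence/search?{params}"
        with urllib.request.urlopen(url) as response:
            return json.load(response)

    result = gbif_occurrences("Phocoena phocoena")    # harbour porpoise
    print(result.get("count"))                        # total matching records
    for record in result.get("results", []):
        print(record.get("country"), record.get("eventDate"))

A super platform would essentially wrap many such services behind one consistent query interface.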

Do you currently publish data online? If so, where and why? Are there any tools that would help with finding and reusing data?


Links
http://figshare.com/
http://datadryad.org/
http://data.gbif.org/welcome.htm;jsessionid=F348B6E06936D790E24DDE96AB3EBFE6

Photo
NASA Goddard Photo and Video, Creative Commons 2.0 - http://www.flickr.com/photos/gsfc/5319841331/

Friday 17 May 2013

Can we beat the extinction rate to name all species?


Do we have the capability to name all the species on the planet before they go extinct?

Recent research1 argues that we can. Essentially we might have convinced ourselves unnecessarily that the barriers are too high. Take the commonly held belief that the number of taxonomists is decreasing. The research1 points to the increase in taxonomic data being published in repositories (e.g. WoRMS) as a sign that taxonomic output is in fact increasing.



Are we in the most productive taxonomic era ever? This argument seems to be supported by the recent announcement that the Pensoft journal ZooKeys has reached its 300th issue, adding over 100 issues in the last year alone.

The research1 also points out that the extinction rate is poorly quantified and that the number of species is exaggerated, resulting in underestimation of the time available and overestimation of the number of species left to describe.

Potential issues surrounding accurate prediction of the number of species are highlighted in research2 looking at species which have only ever been described once, or ‘Oncers’. The research2 reports that such species can be indicative of: 1) endemism; 2) constraint to narrow niches; 3) poor practice during re-identification; and 4) original descriptions remaining unknown. The research2 found that the number of species within the genus Gymnodinium was likely to be 234, and not 268 as previously thought.

One thing is certain: technology and infrastructure, built through the hard work of particular groups and researchers, are playing a huge role in the effort to describe all species before they go extinct.


Sources

Photo
NASA Goddard Photo (Creative Commons Licence) - http://www.flickr.com/photos/gsfc/8569701912/in/photostream

Friday 10 May 2013

The role of DNA in addressing our biggest ecological challenges


Addressing some of the biggest ecological challenges we all face requires knowing what species we have in the first place. Although not a new concept, can DNA techniques play an important role in future, or are they just a novelty?




An international research project1 has used DNA techniques to describe five new species of lichen-forming fungi from what was phenotypically a single species. The results of such studies (i.e. unique DNA barcodes) provide a way to distinguish species outside of traditional taxonomy.

In order to be truly useful in closing the species information gap, infrastructure must be in place, both internally within an organisation (e.g. equipment and software) and externally (e.g. DNA barcode repositories). Although data repositories such as GenBank and MarBOL are well established, are the other pieces available in widely accessible forms?
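A toy sketch of the idea behind barcode-based identification is shown below: compare a query sequence against reference barcodes and report the closest match. The sequences are invented and far shorter than a real barcode, and real pipelines use proper alignment tools (e.g. BLAST) against curated repositories.

    # Hypothetical reference barcodes, one per species
    references = {
        "Species A": "ACGTACGTTGCAACGT",
        "Species B": "ACGTTCGATGCAACGA",
        "Species C": "TTGTACGAAGCAACGT",
    }

    def percent_identity(seq1, seq2):
        """Share of matching positions between two pre-aligned, equal-length sequences."""
        matches = sum(a == b for a, b in zip(seq1, seq2))
        return 100.0 * matches / max(len(seq1), len(seq2))

    def closest_match(query):
        scores = {name: percent_identity(query, ref) for name, ref in references.items()}
        best = max(scores, key=scores.get)
        return best, scores[best]

    print(closest_match("ACGTACGTTGCAACGA"))   # ('Species A', 93.75)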

What are your thoughts?
  

Source


Photo
Pellaea under Creative Commons Licence (http://www.flickr.com/photos/7147684@N03/1236353263/)

Friday 3 May 2013

Marine Biodiversity Offsetting: Is our marine data up to the challenge?



A report1 commissioned by The Crown Estate in the UK has scoped the potential for adopting biodiversity offsetting within the UK marine environment. But even somewhere as well studied as the UK, is the available data up to the challenge?

The scoping report1 identifies a number of key issues and challenges which would need to be addressed to successfully establish biodiversity offsetting in the UK marine environment. One challenge is the limited availability of accurate and recent biodiversity information: specifically, the relationships between biodiversity and the physical environment (i.e. biotope mapping) and, at a finer scale, biodiversity community and ecosystem relationships. Although some mapping and data exist, at the broader scale they are a mixture of known and extrapolated information.



Coincidentally, a research article2 has taken a ‘Decadal view of biodiversity informatics: challenges and priorities’. The study examined the continued development of the field in facilitating decision making (e.g. on policy, environmental change, land use and ecosystem services), and came up with twelve recommendations for the next decade, essentially a move towards a systems approach to our understanding of biodiversity.

If biodiversity offsetting in the marine environment is to be successfully implemented (i.e. through the creation of long-term viable offsets), then surely biodiversity informatics must play a vital role, specifically by helping to capture the complexity of life and its relationships at a system, rather than individual, level.

However, we would not be in a position to contemplate biodiversity offsetting at this stage without the progress made by the informatics community during the past decade. Huge strides have been made in creating the taxonomic, metadata and semantic frameworks and infrastructure with which to facilitate the sharing of accurate, quality data. Although, as the research article2 suggests, there is a need not only to tackle new data, but also to increase the use of existing technologies and to exploit them in novel ways.

In the short term it appears, at least from the data point of view, that there are still some challenges to address before biodiversity offsetting can be successfully applied in the UK marine environment. It is only by undertaking such scoping1 and broad-scale assessments2 that we can begin to join the pieces of the jigsaw together and ensure efficiency in achieving the best outcomes. We just need to make sure that someone is actually putting the jigsaw together.

Sources
2 http://www.biomedcentral.com/1472-6785/13/16

Photo

NOAA's National Ocean Service Flickr Photos (http://www.flickr.com/photos/usoceangov/5794209837/in/set-72157626914645232) - Attribution 2.0 Generic Creative Commons Licence

Friday 26 April 2013

To intervene, or not to intervene, that is the question.


Although an important first step in the fight against invasive species, quality data on what is known is not always sufficient on its own for making tough management decisions.

Recent research1 has shown that taking account of the information gap between the known and the unknown could be an important tool in deciding between conventional and new methods for dealing with invasive species. Specifically, the info-gap approach can make it easier to assess techniques that have not been field tested.



The principle behind the technique lies in using the info-gap model to map the disparity between the known and the unknown while defining acceptable levels of outcome, essentially allowing the best course of action to be determined.

The research1 looked in particular at the apple moth (native to Australia) and its impacts in California, with the known being the financial impact of the moth if no intervention is taken (i.e. the status quo) and the unknown being the outcome of intervention (i.e. eradication). The study found that if decision makers want the acceptable level of economic loss to stay below $1.4 billion, then the preference should be for eradication.
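To make the info-gap logic concrete, here is an illustrative sketch rather than the study's actual model: for each strategy we ask how large the uncertainty horizon can grow before the worst-case loss breaches the $1.4 billion threshold, and prefer the strategy that tolerates more uncertainty. All the numbers except the threshold are made up.

    ACCEPTABLE_LOSS_BN = 1.4          # decision makers' critical threshold, $ billion

    strategies = {
        # strategy: (best-estimate loss in $bn, assumed worst-case growth per unit of uncertainty)
        "status quo (no intervention)": (0.9, 1.0),
        "eradication":                  (0.3, 0.5),
    }

    def robustness(best_estimate, growth_rate, threshold=ACCEPTABLE_LOSS_BN):
        """Largest uncertainty horizon alpha for which the worst-case loss
        (best_estimate + alpha * growth_rate) still stays below the threshold."""
        return max(0.0, (threshold - best_estimate) / growth_rate)

    for name, (estimate, growth) in strategies.items():
        print(f"{name}: robust up to alpha = {robustness(estimate, growth):.2f}")

    # With these illustrative numbers eradication tolerates far more surprise
    # (alpha = 2.20 vs 0.50) before breaching the $1.4bn limit, so it would be preferred.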

However, in order to be certain about the known, this type of approach relies on data being collected and stored within a quality framework. If this is not the case, the question becomes: are existing data processes widening the info-gap beyond where it needs to be? If data can reasonably be known, then collecting and storing it well must surely be a vital first step in the decision-making process.


Sources
http://ec.europa.eu/environment/integration/research/newsalert/pdf/325na2.pdf

Photo
pmarkham Flickr photostream (http://creativecommons.org/licenses/by-sa/2.0/)

Friday 19 April 2013

Global Data Platform Helps Drive Research


Key to the increasing and continued use of data from the GBIF platform is trust in the data made available. Ultimately, that responsibility lies with the original data collectors and authors.

The number of published studies utilising GBIF data has increased from 52 in 2008 to over 230 in 2012. The platform works through a series of national nodes, which submit the data they collect nationally to GBIF. In that way GBIF acts as a kind of global records centre, with data rights remaining with the original organisation.




One recent (October 2012) national node addition is Brazil. Through the Biodiversity Information System on Brazil (www.sibbr.org), an assessment of their national capability and infrastructure is being undertaken. The assessment is reported to encompass over 200 organisations. Given Brazil’s relative importance to global biodiversity, such a pragmatic approach is perhaps reassuring.

In order for such a system of global data to facilitate quality research, each author needs to deliver quality data in the first place. Given the need for a vast range of organisations (from research institutes through to citizens) to collect data to fill gaps in our current knowledge, this may seem daunting. Considerations should include accuracy of collection, the possibility of data loss during processing, taxonomic data labelling, raw data labelling, metadata collection, reuse and longevity. Specialist software can help organisations achieve quality in these areas, but only if the software is tailored to the users' needs.
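As an example of the kind of basic check such software might automate, the sketch below validates a single occurrence record; the field names loosely follow Darwin Core terms, and the rules are illustrative rather than any platform's actual requirements.

    REQUIRED_FIELDS = ["scientificName", "eventDate", "decimalLatitude",
                       "decimalLongitude", "recordedBy"]

    def validate_record(record):
        """Return a list of quality problems found in one occurrence record (empty list = OK)."""
        problems = [f"missing field: {field}" for field in REQUIRED_FIELDS
                    if not record.get(field)]
        lat, lon = record.get("decimalLatitude"), record.get("decimalLongitude")
        if lat is not None and not -90 <= lat <= 90:
            problems.append("latitude out of range")
        if lon is not None and not -180 <= lon <= 180:
            problems.append("longitude out of range")
        return problems

    record = {
        "scientificName": "Phocoena phocoena",
        "eventDate": "2013-04-12",
        "decimalLatitude": 51.5,
        "decimalLongitude": -3.1,
    }
    print(validate_record(record))   # ['missing field: recordedBy']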

We are constantly developing our software to meet our simple mission: to make quality data efficient. In order to do this we need feedback from you, the biodiversity community.

So which parts of your data processes do you find frustrating? Is there anything you think could be automated to make your life easier?


Sources

Photo
NASA Goddard Photo and Video - Flickr collection (http://www.flickr.com/photos/gsfc/6012329930/)