Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES) Review

Recently we reviewed the draft conceptual framework to guide the delivery of IPBES, the Intergovernmental Platform on Biodiversity and Ecosystem Services (http://www.ipbes.net/). We recognise the challenges associated with developing this framework: while biodiversity and ecosystem services are all-encompassing, they are poorly defined in theme, space, and time, and are inherently linked to society’s institutions and economy. We also acknowledge the importance of a conceptual framework for ensuring uptake and involvement of all key stakeholders of the IPBES. We applaud the expert working group who met in Bonn earlier this year for developing the draft conceptual framework and attempting to capture the complexity inherent in the mandate for the IPBES, and we recognise the challenge of developing a conceptual framework that adds value to its predecessors and speaks to the four core functions of the IPBES.

Our review focused on three themes:

1. Treatment of biodiversity, including definitions and its relationship with ecosystem services

2. Treatment of spatial and temporal scales

3. Knowledge generation and decision making, including emphasis on how decisions are made and the importance of scenarios.

You can see the full content of our review here. We grouped our comments in relation to these themes and attempted to clearly outline suggested actions to redress them; in some cases the three themes were interconnected. In an attempt to clarify our suggestions, we (well, mainly Liz!) developed a revised schematic of the conceptual framework based on our comments (see below). We are looking forward to contributing to other intersessional activities of the IPBES – it was fun to gather our thoughts on how we conceptualise ecosystem services and biodiversity, and on the important role that imagining potential futures has in bridging the science-policy interface.

 


Ecosystem Services Meets Systematic Conservation Planning

by Liz Law

This week saw the inaugural joint meeting of the Ecosystem Services Discussion Group and the Marxan Party to discuss software and tools for planning and prioritization of Ecosystem Services.

The Ecosystem Services framework has developed in recent years, encapsulating land stewardship to foster the many benefits that we derive from our ecosystems. These benefits are multifaceted, ranging from agricultural production and climate change mitigation to watershed regulation and inspiration in diverse cultural settings. However, as with biodiversity, planning for ecosystem services requires balancing the management requirements of a diverse range of sometimes opposing land uses, resulting in potentially complex, multi-criteria problems.

Ecosystem services, meet Systematic Conservation Planning.

Systematic Conservation Planning has grown from the need to solve multi-objective allocation problems in a repeatable, transparent way. Typically focused on multiple species or ecosystems, Systematic Conservation Planning has increasingly accounted for real world complexities such as direct and opportunity costs, equity of impact, physical and thematic connectivity between planning units, and contribution of multiple land use types. Continue reading Ecosystem Services Meets Systematic Conservation Planning

Getting Started in R

By Liz Law

So you want to learn R, but just don’t know where to start…

Help!

The R website is your friend: http://cran.r-project.org/

The index on the left has several points of interest:

  • Manuals & Contributed – these pages have many “introduction to R” guides and quick-reference cards, in English and Belarusian (among other languages…)
  • Task Views – this links to several pages describing a whole bunch of “packages” that are used in different fields. For example, Environmetrics (ecology), Econometrics (economics), Spatial, Optimization (mathematical programming), Multivariate statistics, Graphics – basically all the things you could imagine.
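If a Task View covers your field, the ctv package can install everything it lists in one go – a quick sketch (note this pulls down a lot of packages, so it can take a while; the "Spatial" view is just an example):

```r
# Install the CRAN Task View tools, then grab a whole view at once
install.packages("ctv")
library(ctv)
install.views("Spatial")   # installs every package in the Spatial task view
```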

I use RStudio http://www.rstudio.com/ide/ – I find it a really nice way of viewing code, history, workspace, plots, help, etc. on the one page. There are a number of other GUI interfaces too; they are listed on the R website here: http://www.sciviews.org/_rgui/

I found “Quick-R” http://www.statmethods.net/index.html a useful reference page to start with as well; it explains things clearly, with code examples. Another useful way to start, particularly if you are also rusty on basic stats, is to work through Crawley’s “Statistics: An Introduction Using R” (note: the full text is available online through the UQ library).

All packages have a PDF manual, generally readily found by searching for “r cran” and the package name. However, it is always worth checking whether the package also has other documentation (often called a vignette) or a website that works through examples of the process. These can be more useful, particularly when… Continue reading Getting Started in R
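As a concrete starting point, the console commands below pull up a package’s help index and vignettes (the vegan package is used purely as an illustration – substitute any package you like):

```r
install.packages("vegan")     # fetch and install the package from CRAN
library(vegan)                # load it into your session
help(package = "vegan")       # index of all the package's functions
vignette(package = "vegan")   # list any vignettes the package ships with
browseVignettes("vegan")      # open the vignette list in your browser
```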

Tips and tricks for ArcGIS, Excel and R

Liz Law reports on the Wilson Conservation Ecology Lab meeting for Aug 5th 2011

Why? Because it’s a Bustard! (Photo of Australian Bustard near Morven, QLD, by Liz Law)

How many times have we spent a good few hours (even days) trying to do something in ArcGIS/Excel, only to find out, after we finished, that we could have done it in a fraction of the time another way?

For me, this has occurred quite a lot. So, I decided to run our weekly lab meeting on tips, tools and functions that can save us time. Here is a quick summary of what we talked about:

  • ModelBuilder and other geoprocessing tools
  • Making data with interpolation
  • Editing shapefile layers
  • Getting help with ESRI stuff
  • VLOOKUP
  • SDMTools

ModelBuilder and other tools: It took me ages to find out that you can add custom toolboxes, into which you can drag copies of your favourite tools, and also create your own using ModelBuilder.

I think ModelBuilder is a really useful tool within the ArcGIS toolbox. Essentially it is a space where you can build and visualise geoprocessing models (i.e. a series of tools, functions, scripts, etc.). I find it really useful for recording, communicating, and repeating an analysis process. I also find it a little more intuitive for batch processing, or for running several processes using the output of one as the input of the next (without collecting masses of intermediate files). More information on ModelBuilder is available on the ESRI website, and they also provide a free online training seminar.

Luis Verde noted that large models can get a bit buggy and can give a generic error message. When this happens, he recommends using the “make feature layer” to make a temporary copy of your inputs prior to each tool. For some reason this works.

There are also lots of geoprocessing models that already exist for a number of different tasks. Some of these can be found and downloaded at the ESRI “geoprocessing model and script tool gallery”. Other packages of tools operate as plug-ins, for example ET GeoWizards and Hawth’s Tools (aka GME). Ayesha Tulloch says some of these are great: for example, the ET GeoWizards “Explode multipart polygon” tool is way less buggy than the one provided by ArcGIS (and actually does the job right the first time). However, Ayesha also cautions that while the old Hawth’s Tools was pretty awesome, the newer version (GME) is not. It doesn’t even work with any ArcGIS prior to 10.0, which is funny, because a lot of the functions we used to use it for are apparently available in 10.0 anyway…

Making data with interpolation: Jane MacDonald has not been working much with ArcGIS, but her co-workers have, and she is worried about an emerging trend of using interpolation to make data layers from point observations. While it is really easy to get results using this technique, the basic principle of models applies: junk in = junk out. You really need to question whether you are getting accurate maps, for example by validating against withheld point data and/or comparing with GLM outputs.

Editing shapefile layers: Karen Mustin has been going through the joys of editing shapefiles. There are about a billion ways to edit things in ArcGIS. I recommend checking out the ESRI online training seminars, of which there are about four on “editing tips and tricks”. In particular, if you don’t know what “snapping” and “sticky move tolerance” are, or how to modify them, I highly recommend you seek advice BEFORE editing your layers.

Getting help with ESRI stuff: If you have spent ages trying to understand ArcGIS, gone through all the normal help and forums, and you still have unanswered questions, you can always give ESRI a call. However, many large institutions have people that are designated ESRI gateways, says Jude Keyes. Our UQ people are Jürgen Overheu (GPEM), Gai Trewinnard-McNeill (ITS), and Steven Clark (ITS).

VLOOKUP: Excel is probably one of the most commonly used spreadsheet programs, but probably one of the most poorly utilised as well. Many people turn to proper database software (and for good reason) if they have large databases that they have to run many query variations on, but you can also create a database in Excel, or do pretty basic query-like tasks. For example, Angela Guerrero suggests that if you have a lookup table whose values you want to append to a list, you can use functions like VLOOKUP. If you don’t know what this is, look it up!
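A typical exact-match call looks like `=VLOOKUP(A2, $D$2:$E$100, 2, FALSE)` (the cell ranges here are purely illustrative). For the R users among us, the same append-values-from-a-lookup-table task is a one-liner with base R’s merge() – a toy sketch with made-up data:

```r
# A list of sightings, and a lookup table of attributes to append
sightings <- data.frame(species = c("emu", "bustard", "emu"))
lookup    <- data.frame(species = c("bustard", "emu"),
                        family  = c("Otididae", "Dromaiidae"))
# all.x = TRUE keeps every sighting, even those missing from the lookup
merge(sightings, lookup, by = "species", all.x = TRUE)
```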

SDMTools: Saving the best till last, Luke Shoo blew us out of the water with his SDMTools. This R package, developed by Luke and his colleagues, provides a set of tools for post-processing the outcomes of species distribution models: comparing models, tracking changes in distributions over time, visualising outcomes, selecting thresholds, calculating measures of accuracy and landscape fragmentation statistics, and more. Absolutely amazing, extremely useful, and supremely beautiful.
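As a taste of what the package does, here is a minimal sketch of its fragmentation statistics on a toy habitat matrix (assuming SDMTools is installed; ConnCompLabel, ClassStat, and PatchStat are among its patch-analysis functions):

```r
library(SDMTools)
# Toy binary habitat map: 1 = habitat, 0 = non-habitat matrix
m <- matrix(c(1, 1, 0, 0,
              1, 0, 0, 1,
              0, 0, 1, 1), nrow = 3, byrow = TRUE)
patches <- ConnCompLabel(m)   # label the connected habitat patches
PatchStat(patches)            # per-patch metrics (area, perimeter, shape)
ClassStat(m)                  # class-level fragmentation statistics
```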

Thanks everyone for a fantastic and very useful meeting!

Complex… not complicated!

Liz Law reports on a recent early career researcher “Networks and Agents” workshop run by the CSIRO Centre for Complex Systems Science

What is complex systems science (CSS)? In a nutshell, it is the study of systems that are considered fundamentally complex: those in which there are many parts involved, and in which the relationships between parts are often just as important as the parts themselves in determining system properties. Complex systems are set apart from those that are “merely” complicated by their tendency to display emergent properties across scales and self-organisation over time, involving feedbacks and non-linearity. Common examples include collective behaviour, particularly in the contexts of natural resource management and institutional decision making (socio-ecological systems), as well as natural phenomena such as ecosystems and climate.

Modelling and simulation of these systems is employed to understand their dynamics, optimise strategies, and predict responses. CSS has a natural base in network theory, but also integrates agent based modelling, systems theory, game theory, and artificial intelligence, among other things. One of the things that really interested me about CSS was that it seems to ignore traditional disciplinary boundaries – complex systems are everywhere – and therefore there is a fantastic opportunity for the transfer of concepts, tools and methodologies across disciplines.

Some highlights of the workshop included:

Michael Breakspear (Queensland Institute of Medical Research) enlightened us on how CSS is being used to understand how our brains work: how neuroscience is combining information on the brain’s anatomical networks (the physical “wiring”) with functional dynamics (relationships between activity in different parts of the brain) to shed light upon “effective” networks (causal effects). It turns out some of the most connected parts of the brain are used most not when we do complex tasks, but rather when we daydream…

Richard Fuller (CSIRO Ecosystem Sciences/UQ) discussed the presentation of science, and in particular how presenting CSS research can be difficult. Media outlets may be great for research that has a clear, succinct, and non-controversial message, but can be minefields for more challenging messages.

Markus Brede (CSIRO Marine and Atmospheric Research) discussed how cooperation can evolve (in game theory) with allowances for learning and the building of trust (both in direct repeated interactions and through observing the behaviour of others), voluntary participation, altruism (kin selection), and structured populations. In some networks cooperation could be attributed to key well-connected “leaders”, to the opportunity to avoid interactions with players who were not fair, and also to a bias in whom we choose to learn from (aspiration bias). Integrating some of these rules could create areas of cooperation, no matter how strong the game (how tempting it was to defect).

Kirsty Kitto (QUT) gave us a preview of her recent work using quantum formalism to provide a geometric framework for modelling how agents make decisions when these are fundamentally contextualised. This framework can also formalise communication between agents (including when and where communication might lead to a change in agent position). I’m not going to pretend I understood it all, but it looked really cool, and Kitto is definitely one to look out for in the near future as she and her group develop this framework further to describe the aggregate behaviour of multiple agents, and eventually use it for modelling in CSS.

Ryan McAllister (CSIRO Ecosystem Sciences) discussed some of his recent work using experimental economics to explore economic behaviour in the face of variability – and how trust can be a key driver of success. Look out for more from him regarding how social institutions may be enhanced to deal with the uncertainty and variability predicted under climate change!

It was great to see the various methodological approaches of CSS, and really interesting to see how different researchers used them: while Michael used them to understand the system, and Markus to explore key drivers in the system, Ryan was more focussed on using the models in applied situations to develop testable hypotheses. Then in stepped the delightful Pascal Perez (University of Wollongong), who uses CSS tools (participatory agent-based modelling) not only to understand and explore natural-resource-based systems, but also with a large focus on understanding how humans interact with natural systems and on providing a platform for managers and stakeholders to communicate effectively.

With so many different disciplines, methods, and objectives, complex systems science is certainly a little confusing at first. But the enthusiasm for taking complicated things and reducing them to the “merely” complex really cuts across these traditional boundaries. So thanks again to CSIRO CSS for this really interesting workshop – and for those of you interested, keep an eye out: it’s run every year.

The “success” of the COP16 climate negotiations in Cancun

Liz Law

After the media circus around COP15 in Copenhagen, COP16 barely managed a blip on the radar.

What was it like? Interesting. Absurd. Atrocious. Intriguing. These words would all fit the bill.

I wasn’t sure what to expect at first, and still am not quite sure what to make of it.

The setting

There was strong indication that there was not to be a repeat of the Copenhagen circus. For one, there was to be no snow, only the long white sand and turquoise blue of Caribbean beaches. From the opening, Executive Secretary Christiana Figueres made it clear that they had every intention of achieving an agreement in Cancun, if only to create the impression that the UNFCCC process can achieve something. Compromise was strongly pushed. She also made strong statements regarding the transparency of the process, a response to the backlash against the closed doors of previous meetings.

The spatial allocation of venues was clearly an exercise in avoiding any “incident” – such as negotiators and diplomats meeting demonstrators and protestors. The main negotiations took place at an exclusive resort, miles from anywhere, isolated from the public by multiple security checkpoints and surrounded by military on both land and sea. At over US$500 a night, and with limited capacity, you can imagine how this restricted who could stay on location. Media offices were so far from the main rooms that a minibus was needed to shuttle journalists there, and the closed doors and restricted capacities of some rooms seemed to be intentional strategic barriers for many attendees.

The side events were held at a different venue, about 30 mins away from both the city and the main negotiation venue. Again strictly controlled by a military presence, there was virtually no opportunity for non-conference attendees to make any sort of presence there. They really should be called “sideshows”, they are an eclectic series of random events from industry promotions, science reports, advocacy rants, and community sing-alongs. Almost all showered you with a forest load of reading material to compliment the non-climate friendly air conditioning. Networking was clearly the primary aim for most people’s game, but it was also useful to gain a wider perspective on what is certainly a multifaceted mess. Continue reading The “success” of the COP16 climate negotiations in Cancun