Thursday, December 10, 2009
AM at the 70th Midwest F&W meeting
I just got back from the 70th Midwest Fish and Wildlife conference, which is a regional meeting shared among 10 states in the Midwest. I was one of the presenters at a symposium on Adaptive Management organized by the Nebraska Cooperative Fish and Wildlife Research Unit. One of the best things about the symposium was an opportunity to start doing some "introspection" on AM. What works? What doesn't? What are we doing here anyway?
The show was kicked off by Ken Williams, Director of the Cooperative Research Unit program at the USGS. Ken is a powerful and charismatic speaker who always goes at 100 miles an hour through mind-blowing concepts. His message was simple but direct - if anyone tells you there's only one way to do AM, they're blowing smoke. OK, I paraphrase, but that's the gist. The essence of AM, however defined, is "Do, Learn, Do again".
The next speaker was Jamie McFadden, a graduate student from UNL, who is conducting a review of recent published work describing AM in an effort to identify characteristics of successful AM projects. Of course, her first challenge was to figure out how to define success! Her scheme focused on how close projects came to implementation, defined as actually making a management decision - Doing.
Armond Gharmestani of the EPA made some useful points about the relationship between law and AM - in particular pointing out the requirement of current administrative law to do all the planning "up front", which is extremely problematic for AM. In my opinion this is probably the single biggest institutional hurdle to overcome for folks wanting to do AM.
Clint Moore of the USGS presented two examples of collaborative efforts between the USGS and USFWS to implement AM - the AM consultancies and the AM for Refuges cooperative effort. The first is small scale - 2-3 days to lay out a plan that a single refuge manager can implement. The example was of managing burn treatments to maximize species richness on a refuge in Minnesota. The second type of project is intended for projects that span multiple regions and many refuges with a shared management problem, Reed Canary Grass in the example Clint gave. Clint's talk highlighted one of the key problems with implementing projects at that scale - a lack of people with both the time and technical expertise to maintain the AM project once the direct technical support is withdrawn.
Dave Galat from the Missouri Coop Unit gave a talk on the implementation of AM on the Upper Mississippi River. This is an interesting example for me because the UMR AM effort is a USACE habitat restoration project similar to what I am involved in on the Missouri River.
Tuesday, December 8, 2009
Adaptive Medication
Thus I was intrigued by Pascale Hammond's discussion of how a clinical health care provider works and why this is science. Frankly I couldn't agree more - but the AMed spin raises an interesting point about what happens with all the data generated by those diagnostic tests of differential diagnoses. Unless I give my GP explicit permission, that information is not usable to improve medical understanding. Millions, nay, billions or trillions of dollars are spent every year testing hypotheses, and none of it is used to advance the underlying science. I understand that there are privacy issues, but ... but ... this is my potential future longevity that is suffering! Maybe there would be a way for individual patients to "volunteer" to have their diagnoses and test results entered into a global double-blind database in much the same way that gene sequences are.
Saturday, December 5, 2009
The black box of risk
The general lesson to take from such experiences is that it is rarely the models that are at fault; it is instead the use of those models in ways that are inappropriate and can lead to flawed decisions, sometimes with very large consequences. Too often the models are treated like black boxes and their use is overlooked as being the domain of technical experts. To make better use of risk models in business decisions, we need to open up the black box and better understand the role of models in decision making. Because of the potential for conflicts of interest, it is important to have independent eyes looking at the models and their use, a role that too often goes overlooked.
Although Roger is talking about hedging investment funds, I think the same could be said for models of risk to endangered species. Building the models is easy - even incorporating risk, error and uncertainty is not difficult. But understanding the outputs of those models demands a certain level of knowledge. And worst of all, we have not worked hard to understand or develop decision making in applied ecology, let alone understand the use of models in decision making. The politicization of climate change science should stand as a warning to us - as the stakes rise the challenges of honestly providing scientific advice and maintaining credibility become much greater.
What do we do about it?
- Ecological managers at all levels must recognize that they are making decisions.
- Making decisions implies some effort, however implicit, to forecast what will happen in the future after the decision is made.
- We have excellent theoretical tools for forecasting the future of populations and ecosystems (not communities, unfortunately).
- Ecological managers should be trained in using those tools while making decisions.
Tuesday, December 1, 2009
Staying supple
One of the things that's not so good is the lack of flexibility created by those same detailed procedures. Two things in particular changed after the agreements were signed: 1) Phragmites australis spread throughout the central Platte, and 2) Ironoquia plattensis, a new caddisfly species, was discovered. Surprise! The future is not like the past. This is exactly the situation that AM should be able to deal with - new information that changes how management actions will play out. Phragmites is problematic because it dramatically changes how the vegetation community will respond to short duration high flow events - it is very resistant to scouring and drowning. It seems as though the reality of Phragmites is settling in, but it took a tremendously long time, and caused much angst. So the key problem is how to design a program that is sufficiently well tied down that people will sign on, but remains flexible enough to deal with new circumstances far outside the scope of the original concept.
Maybe too much to ask, especially when the stakes are high.
Postmortem AM
That said, it is always nice when data collected during a management action is well analyzed. There's a nice example of that from the Outer Hebrides, where Thomas Bodey and colleagues used stable isotope analysis on whiskers collected from culled American mink. The carcasses are in hand - why not extract information from them if it will help make the eradication campaign more efficient? The nice thing about an eradication campaign in an archipelago is the clear iterated nature of the decision making - the distribution of effort among habitats on new islands provides a nice opportunity to apply lessons learned from previous campaigns.
And in other news, check out Methods.Blog, where Rob Freckleton is collecting samples of new methods in ecology and evolution from across a wide range of journals. He is also the editor of a new journal from Wiley-Blackwell devoted to methods in ecology and evolution. Looks interesting!
Saturday, November 28, 2009
Obama does AM!
Obama discussed his professorial leadership style in a recent interview with U.S. News & World Report. He said he is not afraid of doubt and is comfortable with uncertainty: "Because these are tough questions, you are always dealing to some degree with probabilities. You're never 100 percent certain that the course of action you're choosing is going to work. What you can have confidence in is that the probability of it working is higher than the other options available to you. But that still leaves some uncertainty, which I think can be stressful, and that's part of the reason why it's so important to be willing to constantly reevaluate decisions based on new information."
Nice. But clearly he hasn't read Frank Knight's book "Risk, Uncertainty and Profit", because he describes uncertainty in terms of probability - what Knight would have called risk. Environmental managers in the public service, take heart: your boss is comfortable with uncertainty. Grapple with it, please.
Saturday, November 14, 2009
Credibility eliminates uncertainty
It is known that perceived usefulness (which presumes credibility) in the eyes of policy makers often depends on whether the forecast (or other expert input) corresponds to the potential user’s preconceived beliefs, ...
This from the 2005 report of the National Academy of Sciences on decision making for the environment.
I knew it! Expressing prior beliefs as prior distributions is important. Now, what to do about undermining your own credibility by demonstrating the effects of users' beliefs on your forecasts ...
Tuesday, November 10, 2009
Predictions, forecasts, projections and scenarios
I find their definition of a projection curious: "a statement about what would happen, based on the extrapolation of past and current trends (e.g. population projections)." The way they use the term, a projection is based on observed data extrapolated out into the future - assuming no change in what is presently going on. This is consistent with how Hal Caswell describes matrix projection models of populations, although those are quite different from simple extrapolation based on trend data. What makes this definition curious is that it is different from how MacCracken defines a projection: "a projection is a probabilistic statement that it is possible that something will happen in the future if certain conditions develop." Specifically, what happens if conditions ARE NOT the same as they are now, and in particular, if conditions are affected by management options between now and then?
Hmm, so I guess I'm making projections sensu MacCracken.
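To make the distinction concrete, here is a minimal sketch (in Python, with invented vital rates - nothing here is estimated from real data) of a projection sensu Caswell: run a stage-structured matrix model forward under the assumption that current conditions persist.

```python
import numpy as np

# Hypothetical Leslie-type matrix for a 3-stage population; the vital rates
# are invented for illustration, not estimates for any real species.
A = np.array([
    [0.0, 1.2, 2.0],   # fecundities of stages 2 and 3
    [0.3, 0.0, 0.0],   # survival from stage 1 to stage 2
    [0.0, 0.5, 0.8],   # survival into and within stage 3
])

n = np.array([100.0, 20.0, 5.0])  # current stage abundances

# Project forward 10 years assuming the vital rates do not change --
# the "what would happen if present trends continue" of a projection.
for t in range(10):
    n = A @ n
    print(t + 1, n.round(1))

# The dominant eigenvalue gives the asymptotic growth rate implied by A.
lam = max(abs(np.linalg.eigvals(A)))
print("asymptotic lambda:", round(lam, 3))
```

A projection in MacCracken's sense would rerun something like this with the vital rates altered to reflect the conditions, or management options, being entertained.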
Monday, November 9, 2009
Forecasts are not predictions!
Food for thought. Still don't know whether to use forecast or prediction.
Thursday, November 5, 2009
Communities of practice
I think this notion of a CoP for AM/SDM in conservation/environmental management is crucial if we're going to get better at what we do. I participate in a couple of other communities - the R-Help list and www.plantedtank.net. Both are publicly accessible, unlike the SDMCOP mailing list, although in order to add material you have to register - which means at least providing a valid email address. This provides for a mixture of public and private participation, as well as many different levels of participation - both aspects that are thought to promote successful communities of practice.
I think one of the things a CoP website/forum could do is provide a place to vent about things that don't work - although maybe that's better not done in public ... nonetheless it would be great to be able to write about successes, failures, and works in progress. There is a tremendous amount of experience out there that is growing rapidly, but it is hard to get it exposed in traditional academic outlets.
Wednesday, October 7, 2009
The three bombs
Monday, September 14, 2009
AM for parks and uncertainty vs. risk
Thursday, September 10, 2009
Adaptation science needs adaptive management
Holger Meinke, S Mark Howden, Paul C Struik, Rohan Nelson, Daniel Rodriguez, and Scott C Chapman. 2009. Adaptation science for agriculture and natural resource management - urgency and theoretical basis. Current Opinion in Environmental Sustainability 1(1):69-76. ISSN 1877-3435, DOI: 10.1016/j.cosust.2009.07.007.
Roughly right
‘It is better to be roughly right than precisely wrong’ - John Maynard Keynes (1883–1946)
I believe this is true for ecological systems, but where it runs into problems is when we have to interact with engineers, for whom the idea of "roughly right" seems to be anathema. Macroeconomists don't interact with engineers, I guess.
Engineers can't predict the detailed dynamics of turbulent flow in a pipe any better than I can predict the dynamics of a small population - probably worse, in fact. They can predict the onset of turbulence very well, and the gross properties of the flow. I can do that too - average growth and the variance of the population, and even include the effects of observation error and estimation error in the parameters. What is puzzling me is why ecologists fail to use these powerful theoretical concepts in their interactions with engineers - we have the rigor. Let's use it.
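For what it's worth, the sort of calculation I have in mind on the population side is easy to write down. A minimal sketch (Python; the parameter values are assumed, purely for illustration) of predicting the mean and variance of log population size under environmental stochasticity, with observation error layered on top:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed values, purely for illustration.
mu, sigma2_e = 0.02, 0.04     # mean and variance of the annual log growth rate
sigma2_obs = 0.01             # variance of (log-scale) observation error
n0, T, reps = 50.0, 20, 10000

# Analytical moments of log N_T under stochastic exponential growth:
# log N_T = log N_0 + the sum of T iid normal increments.
mean_logN = np.log(n0) + mu * T
var_logN = sigma2_e * T

# Simulation, including observation error on the final count.
logN = np.log(n0) + rng.normal(mu, np.sqrt(sigma2_e), size=(reps, T)).sum(axis=1)
obs = logN + rng.normal(0, np.sqrt(sigma2_obs), size=reps)

print("analytical mean, var of log N_T:", round(mean_logN, 2), round(var_logN, 2))
print("simulated  mean, var of log N_T:", logN.mean().round(2), logN.var().round(2))
print("observed   var (true + obs error):", obs.var().round(2))
```

That is the "gross properties of the flow" equivalent: not the trajectory of any one population, but its average behavior and spread, with the measurement noise accounted for.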
Wednesday, September 9, 2009
Zen for AM
Dr. Scott Field
Tuesday, September 8, 2009
Is anything new under the sun?
It is interesting to me to see Carl's scepticism about prediction and forecasting from complex models, given the emphasis the North American school places on such models. His assertion that the models are there to explore and generate hypotheses, which are then tested using management experiments, is very interesting. I think the biggest issue that I can see is that management experiments carry a risk of failure, or at least represent a tradeoff between the best that is expected (given current knowledge) and what the experiment will achieve. Quantifying that risk could be critical to selling the idea of a management experiment.
It seems as though the North American school emphasis on developing a shared understanding, or at least alternative hypotheses about how the system works, is appropriate if stakeholders actually hold different views about how the system works. However, in the case that interests me the most, there actually isn't any disagreement about how the system works. Fundamentally the stakeholders (federal agencies, in this case) have different tradeoff curves between two key objectives. Rather than engage with that debate, the focus is on measuring whether the alternative preferred by one agency (the most risk-averse one) is "adequate". That's fine, but the real debate will resurface as soon as the analysis demonstrates that it isn't.
Hmmmm, interesting times ...
Thursday, September 3, 2009
The value of a closed door
hmmm, time for a stochastic dynamic program to allocate my effort between habitats!
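If I ever actually wrote that program, the skeleton would look something like the toy below (Python; the two "habitats", their payoffs, and the diminishing-returns rule are all invented for illustration, and rewards are taken as expectations so the backward induction is over expected values):

```python
import numpy as np

# Toy dynamic program for allocating effort between two habitats over T periods.
T = 10
p = {"A": 0.6, "B": 0.4}          # per-visit expected payoff (assumed)
decay = {"A": 0.8, "B": 0.95}     # diminishing returns per prior visit (assumed)

def reward(h, visits):
    # Expected payoff of one more visit to habitat h after `visits` prior visits.
    return p[h] * decay[h] ** visits

# V[t][a] = expected future reward from period t onward, given `a` prior visits to A.
V = np.zeros((T + 1, T + 1))
policy = [["" for _ in range(T + 1)] for _ in range(T)]

for t in range(T - 1, -1, -1):           # backward induction
    for a in range(t + 1):               # visits to B so far = t - a
        go_A = reward("A", a) + V[t + 1][a + 1]
        go_B = reward("B", t - a) + V[t + 1][a]
        V[t][a] = max(go_A, go_B)
        policy[t][a] = "A" if go_A >= go_B else "B"

print("expected total reward:", round(V[0][0], 3))
print("best first-period choice:", policy[0][0])
```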
Saturday, August 22, 2009
INTECOL X - Brisbane
I've been attending the 10th INTECOL Congress in Brisbane Australia this week. Thanks to the close proximity of the conference to AEDA at UQ, there were some great sessions on adaptive management, modeling, and later this week on expert elicitation of priors. A few quotes:
If you don't know what a differential equation is, then you're not a scientist – Prof Hugh Possingham
All models are beautiful, and some are even useful – Dr. Brendan Wintle
And then there was the definition of AM given by Richard Kingsford and attributed to Norm Myers.
Adaptive Management is like teen sex. Lots of people say they are doing it, only a few actually are, and they are doing it badly.
My own talk ended up in a session on Landscape Ecology – mostly reflections on why some structured decision making workshops work, and some don't. I probably should go back to school and get some proper training before doing any reflecting on social processes, but who's got time to do that? My colleague Rodrigo Bustamante from CSIRO Marine and Atmospheric Science made probably the best point – that the participants have to have "bought into" the workshop idea, and at least the general idea of using the workshop to analyze a decision before coming along. This is where pre-workshop conference calls are essential to discuss what the workshop is meant to achieve.
Tuesday, August 11, 2009
Useless arithmetic?
Orrin Pilkey and Linda Pilkey-Jarvis have recently made a bit of a splash arguing that mathematical models are useless for environmental management – I haven't read their 2007 book yet, but it received some popular press. Primarily they argue that optimism about models helping environmental managers make better decisions is misplaced. Recently I came across an article in Public Administration Review where they summarize their arguments for policy makers in 10 easy lessons. This quote sums up their paper well:
Quantitative models can condense large amounts of difficult data into simple representations, but they cannot give an accurate answer, predict correct scenario consequences, or accommodate all possible confounding variables, especially human behavior. As such, models offer no guarantee to policy makers that the right actions will be set into policy.
They divide the world of models into quantitative models, exemplified by their favorite whipping horse, the USACE Coastal Engineering Research Center equation for shore erosion, and qualitative models, which
" … eschew precision in favor of more general expectations grounded in experience. They are useful for answering questions such as whether the sea level will rise or fall or whether the fish available for harvest will be large or small (perspective). Such models are not intended to provide accurate answers, but they do provide generalized values: order of magnitude or directional predictions."
What I find somewhat humorous is their assertion that instead of quantitative models we should use qualitative models based on empirical data. All of their examples of model failure are great ones, but in every instance they suggest using trend data and extrapolation as a substitute. This is simply using a simpler statistical model in place of a complex process based one. If trend data is available, then by all means it ought to be used. But what if data (read "experiences") aren't available? Is the alternative to make decisions based on no analysis at all? How is that defensible? And what if the world is changing – like the climate – is experience always going to be a good guide?
While I agree with most of their 10 lessons, I must take issue with one of them. Lesson 4, "calibration of models doesn't work either", asserts that checking a model's ability to predict by comparing model outputs with past events is flawed. While it is true that you can make many models predict a wide range of outcomes just by tweaking the parameters, this isn't something that should be done lightly - and isn't, by modelers with any degree of honesty. There are many ways to adjust the parameters of simulation models against past data using maximum likelihood methods, or, for more complex models, approaches such as the pattern-oriented modeling advocated by Volker Grimm and colleagues. As they suggest, this is no guarantee that the model will continue to predict into the future - but if the model structure is in fact based on an accurate scientific understanding of the processes involved, then it had better come close. If it doesn't, the principle of uniformity on which science relies must come into question. The scientific understanding could be flawed as well, but this could always be true whether you use mathematical models or not. It is also the reason why there shouldn't be only one model (Lesson 8: the only show in town may not be a good one). They also find the prediction of new events (validation) to be questionable, but again, this can be done well, and when independent data are available it is the best way to confirm that the models are useful. Personally I find the failure of a model to predict an event, either past or future, to be an ideal opportunity for learning. Obviously something about reality is different from what we expected, so what is it?
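To show what I mean by calibration and validation done honestly, here is a minimal sketch (Python; the logistic model, parameter values, and "observations" are all invented for illustration): fit parameters to past data by least squares (maximum likelihood under Gaussian errors), then judge the fitted model against held-out data it never saw.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "past" and "future" observations from a made-up logistic-growth
# process; in real use these would be field data.
def logistic(t, r, K, n0=10.0):
    return K / (1 + (K / n0 - 1) * np.exp(-r * t))

t_past, t_future = np.arange(0, 15), np.arange(15, 25)
obs_past = logistic(t_past, 0.4, 200) + rng.normal(0, 8, t_past.size)
obs_future = logistic(t_future, 0.4, 200) + rng.normal(0, 8, t_future.size)

# "Calibration": grid-search least squares (maximum likelihood under Gaussian
# errors) over the two parameters, using only the past data.
grid_r, grid_K = np.linspace(0.1, 1.0, 46), np.linspace(100, 300, 41)
best = min(
    ((r, K) for r in grid_r for K in grid_K),
    key=lambda pars: np.sum((obs_past - logistic(t_past, *pars)) ** 2),
)

# "Validation": how well do the calibrated parameters predict the held-out data?
rmse = np.sqrt(np.mean((obs_future - logistic(t_future, *best)) ** 2))
print("calibrated (r, K):", np.round(best, 2), " validation RMSE:", round(rmse, 1))
```

The point is that the parameters are never touched after calibration; the validation data get one honest chance to embarrass the model.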
What is particularly intriguing is that they view Adaptive Management as an alternative to quantitative modeling!
I think in the end what they are taking issue with is not quantitative models per se, but rather a monolithic approach to environmental management and policy making. Lesson 7, "models may be used as 'fig leaves' for politicians, refuges for scoundrels, and ways for consultants to find the truth according to their clients' needs", I think gets at this point directly. It isn't models that are the problem, but rather how they get used. So, the solution to avoiding the problems they cite is to use models openly, as the IPCC does. Multiple models, ideally, or at least a range of parameters that express alternative views of reality. Avoid black boxes – follow the "open source" approach to computer programs used to make predictions. And, analyze the data! I think I've said that before.
Pilkey-Jarvis, L., and O. Pilkey. 2008. Useless arithmetic: ten points to ponder when using mathematical models in environmental decision making. Public Administration Review, May, 470-479.
Tuesday, July 7, 2009
Zen
"Seeking the Buddha is much like riding an ox in search of the ox itself."
This from a book on Zen Buddhism. Just replace "Buddha" with Adaptive Management. Works for me.
Wednesday, June 24, 2009
The Principle Challenge
For land management, the principle challenge relates to prediction. Managers need to know how today's management decisions will impact tomorrow's ecosystem processes.
Ra! Ra! Sis Boom Bah! Yes! And guess what, that means using models. All the fancy remote sensing in the world is no good unless you can use that data to meet the principle challenge. The trouble is, managers are often reluctant to recognize that models can be helpful. In my recent experience, if models are "known" to have flaws (see quote by George Box), or produce a range of predictions because of statistical error in parameter estimates or inherent variation (demographic stochasticity), then they are labeled useless. Better to use gut instinct to make decisions.
I do believe models are useful even when they are not (and they never will be) perfect.
Jeffery T Morisette et al. (2009) Tracking the rhythm of the seasons in the face of global change:
phenological research in the 21st century. Frontiers in Ecology and the Environment 7:253-260
Thursday, June 18, 2009
Zen Buddhism and Adaptive Management
I've been entertaining a visitor this week, Dr. Scott Field, currently a lecturer at the Naval Postgraduate School in Monterey. Five years ago we worked together on a project analysing fox monitoring data from Eyre Peninsula in South Australia. This week Scott's been conducting the 2nd Quinquennial fox monitoring analysis - and he has results! Nice ones. Moral of the story - analyze your data. It helps. Really.
Scott's recent work has focused on conflict resolution in International Relations, and he had a few extremely interesting comments about Adaptive Management. One thing that causes problems is people rejecting analyses and predictions that could help them with decision making. The difficulty arises when they are not able to understand the underlying methods used to derive those analyses; then they have no way to discuss them. This leads people to react emotionally, rather than critically. Thus, it is critically important to conduct analyses in groups and provide lots of opportunities for feedback and interaction. This still isn't going to solve the problem when you are dealing with sophisticated analyses of complex data. But recognizing that issue will help me to not respond in kind.
The broader notion that we've come up with is that Buddhism has a lot to offer the practice of adaptive management - hey, stay with me for a bit. One of Buddhism's "Four Noble Truths" is that attachment leads to suffering. Attachment can be to things, but also to ideas. So the clear connection to AM is that when someone is attached to an idea about how the world works, it is hard for them to expose that idea to data and analysis - the risk is that they might have to give up their idea, leading to suffering. Thus finding a good solution to an environmental problem involves giving up attachments to ideas. The trouble is there is no way to force someone else to give up their attachments.
There are other components of Buddhism that have lessons for AM, but I'm just learning about them so won't write any more just now.
Monday, June 15, 2009
Blending math and biology - nothing new there!
I can imagine no more beneficial change in scientific education than that which would allow [biology and mathematics] to appreciate something of the imaginative grandeur of the realms of thought explored by the other.
Wow. So he was already recognizing this gap in 1930! We've got some catching up to do.
Thursday, June 11, 2009
Wrapping around vs. setting up
As a result, the teams that I am working with are trying to take an existing batch of actions, including experiments at various scales, different monitoring programs, and restoration actions, and wrap an AM framework around them. This is exciting on one hand, because stuff is really getting done - including experimental habitat restoration complete with monitoring. On the other hand, it is incredibly difficult, because now there are a lot of people with vested interests in projects and programs who are reluctant to embrace change. This is understandable, because a changed program management structure may not include "your" program, or at a minimum may require you to interact with people that you didn't interact with before. Thus a huge part of getting AM off the ground on the Missouri River involves managing people's expectations - and just plain communicating with them. Often.
So far, I'd have to say that I have a much greater appreciation for the governance piece than I did before!
Tuesday, June 9, 2009
Objectives! and PIGS
And as always, it comes back to objectives - and here's a quote to drive the point home:
If you don't know where you are going, any road will get you there.
- Lewis Carroll
Monday, June 8, 2009
Open data, free analysis?
For the analyst it creates a different kind of headache, because now you're going to have to defend the myriad choices you made en route to getting an answer. Because in many cases there are no right answers, only better and worse ones. I should have to defend those choices, but it increases the time required to finish an analysis.
I came across a post about open data publication in the world of genomics - interesting stuff. A world where academic labs of 20 people are "small". Hard to imagine when I'm wrestling with the transition from a lab of 1 (me) to a lab of 4 (me, a postdoc, and two grad students). But the critical point made was about the differences in incentives leading to differences in data publication - submission of raw or analyzed sequences to public databases. Genome centers of 1000 people get funded based on genomes produced - hence they "publish" their data to databases quickly. Academic labs get funding based on paper outputs, so they lag in submitting data to public databases until they have the papers in press - getting scooped sucks. So we could fix that problem by changing the funding model for small labs as well as large, but, and it's a big but for me too - how do you get funding to do analysis?
I was frankly delighted to see that someone else also thinks analysis doesn't come for free. In my world, I regularly meet people who have data and think it's relevant to a management problem. But they don't have the expertise to turn that data into something relevant to their problem. Unfortunately it is hard for me to help - I'm only one person, and already completely swamped. The classic natural resources model of "put a student on it" doesn't work well for analysis, because it takes YEARS to develop the necessary skills. Frankly it took me decades. Grant milestones can't wait for that. Ideally, there is some method for a student to start developing the skills, absent pressure to deliver; then when they are ready they can practice by working on a real project. So - who pays for that early development? One solution is teaching assistantships - get them helping undergrads. Well - great, if your department works that way (mine doesn't).
So, what to do in an Adaptive Management world where data is guarded, analysts are scarce, and problems are immediate? The current solution is to make decisions without analysing the data. Publishing data - online, available in raw form - would mean that many additional hands and minds could come to the task of working out what it means, and what the best way to get there is. The USGS publishes stream discharge data in real time. That's not realistic for Tern and Plover fledging success data, but it could be done annually. A central database with relevant data would make many things easier for my Missouri River work. Models should also be a part of that database! An open-source model for Adaptive Management.
Food for thought, but no answers today.
Thursday, June 4, 2009
Making good decisions
Climate change will create a novel and dynamic decision environment. The parameters of the new climate regime cannot be envisioned from past experience. Moreover, climatic changes will be superimposed on social and economic changes that are altering the climate vulnerability of different regions and sectors of society, as well as their ability to cope. Decision makers will need new kinds of information and new ways of thinking and learning to function effectively in a changing climate.
The first two chapters are a goldmine of references on the social science of decision making. What's reassuring is the strong overlap with Department of Interior approaches to adaptive management that I've been pushing. The key learning piece for me is the emphasis here on the process and governance of decision support - something that I personally have started to recognize I'm incompetent at. Lots of good stuff.
The 3rd chapter is on learning - and Adaptive Management is one of the approaches they discuss. I found the discussion illuminating - it references the North American School of adaptive management, but not the more recent work on the Australian school, or the move by the Department of Interior to use AM. It does discuss several possible reasons why "active" adaptive management, the use of management experiments to reduce uncertainty, is difficult. They characterize decision making objectives as "unchanging" in AM, which I disagree with - at least in Australian School AM approaches we accept the ability and need to update objectives periodically.
Sunday, May 31, 2009
Managing ecosystems is hard!
I messed around with adding a few plants to compete with the algae for nutrients, snails and shrimp to eat the algae, but it was a lost cause. Nothing I did put a dent in the algae - although I was still trying "ecosystem management" - no chemical herbicides. Eventually I allowed myself to try physical removal, but even that was only ever a temporary fix. The algae always grew back. I did discover quite a diverse microfauna of flatworms, Hydra (these I had added in culture), and some copepods. No idea where the copepods and flatworms came from.
So - physician, heal thyself! What were my objectives? I decided that I wanted to create a stable ecosystem with at least two trophic levels and no green slime. Allochthonous productivity (i.e. fish flakes) would be allowed. I started with a completely new substrate, reduced lighting, reverse osmosis filtered water (no phosphorus!), and a larger pool of aquatic plants (primary production). Same snails (herbivores), shrimps (detritivores) and guppies (ammonia factories? Eye candy?). So far, things are going well. My female guppy has produced lots of baby guppies, water quality is excellent, and the snails keep the glass spotless. The hornwort, duckweed, and Amazon sword plants are doing great; the other plants are clinging to life or have succumbed to ... probably snail consumption. I provide a trace element fertilizer and feed the guppies once a day with a variety of dried foods or crushed peas. The snails get a piece of lettuce to chew on once a week.
So - what have I learned about ecosystem management?
- The abiotic environment matters. Duh - that should have been a no-brainer; after all, the definition of an ecosystem is a community of species plus the abiotic environment. But I think terrestrial ecology doesn't give one the appreciation of the power of the abiotic environment that an aquatic environment does.
- Biological control isn't perfect. Even though my snails keep the glass and substrate clean, the green slime persists in odd corners, especially up in the fine leaves of the hornwort.
- Top-down controls need support. Although I haven't tried this yet, I think if I were to stop feeding the snails "on the side" they would starve to death pretty fast. There is NO algae in the tank. Keeping them fed means that I have more herbivores than the primary production in the tank can support, which makes the top-down control of algae more effective.
- Models are critical, especially when your sample size is one. Now, I don't have a simulation model of my aquarium, but I do have a pretty good conceptual model of what's going on in there. My model isn't perfect - for example my understanding of water chemistry is pretty rudimentary. But it provides a framework for thinking about what's going on, and for categorizing surprises - like the fact that the water is suddenly all hazy this morning. What's going on there?
And if anyone wants a lot of baby guppies ...
Friday, May 15, 2009
A joke of a science
There were four population ecologists shivering and starving, trapped in the boreal winter – a conservation biologist, a fisheries scientist, a theoretical ecologist and a pest manager. A moose appeared on the horizon and came thundering towards them – 1000 kg of warm edible flesh. Each scientist drew on his or her expertise and dealt with the moose using all their respective discipline’s wisdom:
- The conservation biologist couldn’t decide on an objective. He died wondering whether the moose’s existence was more important than his own.
- The fisheries scientist used the wrong model. Based on her prior knowledge of elk, she predicted that more moose would be coming, so she starved in anticipation of a herd that never appeared.
- The theoretical ecologist drew out his laptop and quickly wrote a program to calculate the optimal distance at which to shoot the moose. His calculations proved that the optimal distance was an imaginary number, and he would have been successful had the moose entered imaginary space.
- The pest manager knew immediately that the moose had to be killed – the only question was with what – pesticide or natural biological control? She opted for the environmentally friendly biological control and released a wolf, which turned around and ate her.
Shea, K., et al. 1998. Management of populations in conservation, harvesting and control. Trends in Ecology and Evolution 13:371-376.
Wednesday, May 13, 2009
The nature of interdisciplinary work in Nature
...those who bring disciplines together in the pursuit of scientific and technical answers find that the best people to have on board are those with deep specialized knowledge, and that such individuals can usually find a place in well-led collaborations.
This resonates with my own experience - in pursuit of "multidisciplinarity" we cannot forget the value of people with deep knowledge. Martin Kemp, in an essay later in the same issue, put the need this way:
What is needed is an education that inculcates a broad mutual understanding of the nature of the various fields of research, so that we might recognize where their special competence and limitations lie.
Such an education is not the same as having students take introductory courses in all disciplines, which is a common outcome when groups get together to train students across disciplines. I believe that a "broad mutual understanding" can only arise by working together with people from other disciplines on common problems - this needs a different course, one that is distinctly interdisciplinary. We still need people to do multiple things - biologists that take some math, and policy analysts that take some biology - but there are few people able to do math, ecology AND policy science. And yet to solve real-world problems in river restoration we need to be able to bring policy analysis, math, hydrogeology, biogeochemistry, and ecology to bear at a minimum.
Saturday, May 9, 2009
Costs of flood control
Unbounded uncertainty. Levees and other engineering structures work as long as the system doesn't exceed expectations - expectations that are based on the "period of record".
Friday, May 8, 2009
Not all ecosystem services are sexy
Prosser, P., C. Nattrass, and C. Prosser. 2008. Rate of removal of bird carcasses in arable farmland by predators and scavengers. Ecotoxicology and Environmental Safety 71:601-608.
Roads - the ultimate evil
Another example of how we can't always predict the indirect effects on ecosystems arising from human activities.
E. Suárez, et al 2009. Oil industry, wild meat trade and roads: indirect effects of oil extraction activities in a protected area in north-eastern Ecuador. Animal Conservation (online).
DOI: 10.1111/j.1469-1795.2009.00262.x
Wednesday, May 6, 2009
Answering the dinner lady
But that's not what I wanted to write about - we also had dinner at a local restaurant (Front Street Steakhouse, Ogallala - they even had a pretty good vegie burger, but get the extra mushrooms and onions) - 20-odd biologists, hydrologists and engineers descended on this fine local establishment for an excellent meal and good conversation. As the last few of us were paying our bills, the woman behind the till asked us, "What are you doing with our River?" I promptly passed the buck to Chad Smith - after all, I'm merely a consultant! Chad gave a pretty good short answer - something about planning for environmental restoration - but that wasn't quite enough for her. She had two things on her mind - getting jobs in Ogallala ("Why aren't you doing it NOW") and getting the river back from the "Hunters from Cabela's" into the hands of the farmers. At the time I wasn't quite sure what to make of that last comment about the hunters, but now I think it was her label for people who want the river to do something other than provide irrigation water.
This is the essential tradeoff - water left to go down the river is water that farmers can't use for irrigation, and hence a reduced economic output for the region. Worse, the environmental restoration will not provide the jobs that the dinner lady wanted. The program has lots of money, but much of it is going to people like me - high-priced consultants to help design and monitor the restoration. Much also goes to purchasing land, paying taxes and so forth, but relatively little is going to get directly into the economy of Ogallala. I don't think the full story would make the dinner lady very happy at all.
This idea that water not used is wasted isn't limited to Ogallala. I wasn't able to attend the "Future of water for food" conference held this week in my office building, but one of my colleagues who did attend quoted a speaker there - "water that reaches the sea is water wasted". That's a pretty radical point of view! But as an ecologist, how do I rebut that view? And, will the dinner lady ever accept my answer?
Monday, May 4, 2009
Detecting Ecological Thresholds
Even picking up thresholds in data is hard - I've used maximum likelihood estimation of split-line models to do this, but you have to decide in advance how many different "thresholds" to put into the models. Derek Sonderegger and colleagues present a different way of identifying thresholds in this week's Frontiers in Ecology and the Environment - using a method called SiZer that examines changes in the derivatives of smooth curves fit to the data with different bandwidths. This is way cool - makes nice pictures too!
Sonderegger et al. 2009. Using SiZer to detect thresholds in ecological data. Frontiers in Ecology and the Environment 7(4):190-195. doi:10.1890/070179
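For anyone curious what the split-line approach looks like in practice, here is a minimal sketch (Python, on fake data with a threshold planted at x = 6) that profiles a single breakpoint over a grid and fits two connected line segments by least squares - equivalent to maximum likelihood under Gaussian errors. SiZer itself involves more machinery (smoothers across a range of bandwidths), so this is only the simple version I described.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fake data with a single threshold at x = 6 (for illustration only).
x = np.linspace(0, 10, 80)
y = np.where(x < 6, 2 + 0.1 * x, 2.6 + 1.5 * (x - 6)) + rng.normal(0, 0.3, x.size)

def rss_at(bp):
    # Continuous split-line: intercept, slope below bp, extra slope above bp.
    X = np.column_stack([np.ones_like(x), x, np.clip(x - bp, 0, None)])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    return rss[0] if rss.size else np.sum((y - X @ beta) ** 2)

# Profile the breakpoint over a grid; minimum RSS = maximum likelihood
# under Gaussian errors.
grid = np.linspace(1, 9, 161)
best_bp = min(grid, key=rss_at)
print("estimated threshold:", round(best_bp, 2))
```

With more than one threshold you add more hinge terms, which is exactly where having to decide the number of breakpoints in advance starts to hurt.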
Friday, May 1, 2009
Supporting women in AM?
I've often wondered whether I do enough to encourage women in my profession - as a male in a department DOMINATED by males I perhaps should worry about it more. Whenever I've served on search committees I've always gone in with the notion of pushing women candidates as hard as possible - but this is where my efforts go awry, because there often are no women candidates, or the women that do apply are so far down the list of candidates by any objective measure that it's impossible to get them on the short list no matter how one tries to skew the numbers. I mean really, really far down. So where are all the women applicants?
I've heard that many capable women scientists bail between grad school and academia, but in our case even that is hard to believe. I worry that our department is stuck in a kind of alternate stable state - qualified women don't apply *because* we are dominated by men. Many departments at UNL have used opportunity hires very successfully to dig themselves out of this state, but we're not so good at that - at least not since I've been here. The only tactic I've been able to think of is to actively seek out women that might be interested in a position and ask them to apply - but that kind of cold call requires a certain chutzpah that is in short supply for me. Any other ideas welcome.
Something to think about as I go about training students to become leaders in AM.
Wednesday, April 29, 2009
More web 2.0 messing around
Thinking of cool networks - check out this map of science built using data on "clickstreams" - which links do people click on in what sequence as they search publishers websites and web portals such as ISI web of science. Pretty cool stuff - at least it looks cool. Nice to see that ecology fits in between social sciences and biology.
Tuesday, April 28, 2009
A how-to manual for transformative science
"The term "transformative research" is being used to describe a range of endeavors which promise extraordinary outcomes, such as: revolutionizing entire disciplines; creating entirely new fields; or disrupting accepted theories and perspectives — in other words, those endeavors which have the potential to change the way we address challenges in science, engineering, and innovation. Supporting more transformative research is of critical importance in the fast-paced, science and technology-intensive world of the 21st Century. "
but ... that begs the question - how do you go about revolutionizing an entire discipline? Disrupt an accepted theory? OK, I can see how to propose that - attack foundational assumptions. Create a new field? Seems a tall order for one 3-year grant.
Monday, April 27, 2009
Shifting Baselines and why we train grad students
For a long time - since I started writing grants to fund myself as a PhD student and later as a postdoc - I've believed that the primary reason Universities push graduate education so hard is economic. Graduate students are cheap, highly motivated labor, both for teaching and research. Mark Taylor wrote a nice Op-Ed piece in the New York Times last week where he made exactly that point - from the perspective of a professor of religious studies. He also made some pretty radical suggestions for how to reform the University system. Now I don't necessarily agree with every point he made there, but the one about graduate training emphasizing "cloning" is highly relevant. One of the big struggles facing the implementation of AM is a lack of people that have the right kind of background. Training as a research ecologist does not prepare you to help managers do Adaptive Management. Training as a wildlife biologist does not prepare you to use AM in making decisions - in fact, we avoid teaching ecologists about decision making altogether. We (the denizens of the 4th floor of Hardin Hall) have put together a graduate specialization in AM, but this only solves half the problem.
The other half of the problem - well, maybe the other 90% of the problem - is that graduate education for wildlife biologists in North America is largely funded by RESEARCH grants. So - how do you write a grant to fund someone who is going to learn how to do AM? Sure, they can work on a project to develop an AM plan, but it is going to take a lot longer to develop than, say, counting birds on a bunch of conservation easements before and after woody veg removal. The latter is relatively easy for a state agency to fund ("relatively" is an important word here!), whereas the AM plan is much harder, because it looks expensive and there is a long lag time before a student can start to really get the work done. And where's the research component? If you want to do research on developing the tools of AM - quantitative methods, stakeholder interactions, etc. - great, but then you really have a lot of learning to do before you can be effective. Who pays for you to develop the skills needed to even begin doing that research?
Sorry, no answers on that front. I'm open to suggestions for how to pay for graduate education that is relevant to professionals in the field.
Resilience and AM
My colleague and collaborator Jim Peterson describes the fundamental intellectual debate about AM as falling into two schools of thought: the North American and Australian Schools. The "North American school" developed directly from Buzz Holling's work, and evolved into its current form through its application primarily to the Florida Everglades - the Resilience Alliance is one great source of information on this perspective and the many places it has been deployed. The Collaborative Adaptive Management Network is another.
The "Australian School" developed in parallel, inspired by the ideas of Buzz Holling and collaborators, but with a much greater emphasis on using quantative tools from Decision Theory. This approach is currently the focus of vigourous development by Prof. Hugh Possingham and the research facility for Applied Environmental Decision Analysis as well as many scientists at the USGS. The USGS folks have also one of the best examples of an "Australian School" AM in the North American Waterfowl Harvest Management plan.
So - what, if anything, is the difference between the two schools of thought? I believe that the differences are rather small, but those small differences may have significant consequences. First, the North American school requires (insists? believes?) that Adaptive Management uses management actions as experiments, and that building resilience is the goal. This is certainly Holling's intent in his original writing, and thus these two components of experimentation and resilience echo strongly in projects like the Florida Everglades and Glen Canyon. The Platte River Recovery Implementation Project is also focused on experimentation and resilience building. To put one's finger right on the difference - the Australian school does not require these two things, although they are certainly allowed. Otherwise I think they're pretty much the same.
I plan to write a bunch more about these differences and their consequences, but I wanted to get my views up now for a variety of reasons.
Sunday, April 26, 2009
It's Sunday
Monday, April 20, 2009
Calling the corner pocket
Sunday, April 19, 2009
Population Viability Management
The only thing I was a bit disappointed by was the way Bakker and Doak skirt past the issue of tradeoffs - although they mention that cost-viability tradeoffs occur, in their example they have chosen not to evaluate them. Any actions that exceed the current available budget are not evaluated. In that sense, the tradeoff IS made, and at a rather extreme level. I agree with their assertion that conducting a full analysis of the ecological risks makes the economic assessment more meaningful - and thus all the more surprising that such ecological assessments are not conducted more often.
They conclude with a reference that I'll have to pursue - about how iterative cycles of explanation and improvement ultimately led to uptake of the recommendations of the model by managers - what they call the "handshake approach".
Bakker, Victoria J. and Daniel F. Doak. 2009. Population viability management: ecological standards to guide adaptive management for rare species. Frontiers in Ecology and the Environment 7:158-165. doi:10.1890/070220
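Just to illustrate the kind of tradeoff I wish they had evaluated, here is a hedged sketch (Python; the actions, costs, and demographic numbers are all made up, and quasi-extinction is judged only on the final year for simplicity) that lays cost and viability side by side, including an action beyond the current budget:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical management actions: each buys an increase in the mean log
# growth rate at some cost (all numbers invented for illustration).
actions = {"none": (0.00, 0), "weed control": (0.02, 50),
           "weeds + fencing": (0.04, 120), "full restoration": (0.07, 300)}
budget = 150

def extinction_prob(boost, reps=5000, T=50, n0=40, threshold=5):
    # Quasi-extinction risk under stochastic log-linear growth,
    # judged on the final year only, for simplicity.
    growth = rng.normal(-0.03 + boost, 0.15, size=(reps, T)).sum(axis=1)
    return np.mean(np.log(n0) + growth < np.log(threshold))

for name, (boost, cost) in actions.items():
    flag = "" if cost <= budget else "  (over current budget)"
    print(f"{name:16s} cost={cost:3d}  P(quasi-extinction)={extinction_prob(boost):.3f}{flag}")
```

Dropping the over-budget row before doing the analysis, as in their example, is itself a tradeoff decision - just one made implicitly.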
Tuesday, April 14, 2009
Funding Transformative Science
Maybe there's another way. What if agencies simply gave all scientists (and I include social scientists here) some baseline funding? Sounds expensive, but think of how much effort it costs society to review all those grants. Check out this review of several relevant articles and discussion at the link; it's thought-provoking.
Corn! and NEPA
One of the conclusions of that article is that decreasing diversity decreases ecosystem function. I came across another article on the diversity-function debate in last week's Nature. The authors set up > 1000 microbial communities, each with the same suite of 18 taxa of denitrifying bacteria. The key was that the communities varied in "evenness" - the extent to which the community is dominated by one or a few species. They then challenged each micro-community with a dose of nitrite, and measured how much of the nitrite was removed 20 hours later. Communities with greater evenness performed better - and more importantly, maintained that improved performance to a greater extent when the microcosm was simultaneously challenged with increased salinity.
So, is this resilience sensu Holling and Gunderson? I think not, for a couple of reasons. First, although the functional rate decreases with decreasing evenness, all the microcosms still function to some extent; there is no "flip" to an alternative non-functioning state. The appearance of a non-linear threshold would be the ultimate cool result, but they found only smooth changes. Second, they did not examine the structure of the microbial community AFTER the perturbation. Resilience posits that resilient systems will maintain their structure during and after a perturbation - structure in this case would refer to the relative abundance of the 18 microbial taxa. It is possible that even the control communities that were not challenged would have shifted in relative abundance just because of normal competition between the microbes.
Regardless, it is a really cool experiment!
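For anyone who wants to put a number on "evenness", one standard choice (I'm not claiming it is exactly the metric Wittebolle et al. used) is Pielou's evenness - Shannon diversity scaled by its maximum. A minimal sketch:

```python
import numpy as np

def pielou_evenness(abundances):
    # Shannon diversity divided by its maximum (ln of richness):
    # 1.0 when all taxa are equally abundant, falling toward 0 as one taxon dominates.
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    H = -(p * np.log(p)).sum()
    return H / np.log(p.size)

print(pielou_evenness([1] * 18))          # perfectly even 18-taxon community -> 1.0
print(pielou_evenness([83] + [1] * 17))   # strongly dominated community -> much lower
```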
Full citations:
Douglas A. Landis, Mary M. Gardiner, Wopke van der Werf and Scott M. Swinton. 2008. Increasing corn for biofuel production reduces biocontrol services in agricultural landscapes. PNAS 105(51) 20552-20557 doi: 10.1073/pnas.0804951106
Lieven Wittebolle, Massimo Marzorati, Lieven Clement, Annalisa Balloi, Daniele Daffonchio, Kim Heylen, Paul De Vos, Willy Verstraete & Nico Boon. 2009. Initial community evenness favours functionality under selective stress. Nature 458:623-627. doi:10.1038/nature07840
Sunday, April 12, 2009
Learning about AM
One neat intro is a series of webcasts put together by trainers at the Bureau of Land Management. There are 3 separate 2-hour webcasts, which have a presentation and panel format. They were originally webcast live, so the expert panel also fields questions from viewers. I particularly found the segment on the legal implications of Adaptive Management very interesting - and a little frightening!
Saturday, April 11, 2009
Coming back to life ...
The centerpiece, AM-wise, was the Missouri River Natural Resources Conference in Billings, Montana. My colleague Sarah Michaels, from UNL's Political Science department, was one of the plenary speakers, and gave us (scientists & engineers, that is) a real prodding to wake up and recognize our differences so we can get on with the business of saving the planet. Or at least the Missouri River, in the first instance! The other plenary speakers were on fire too - Jim Martin, formerly of Oregon's state agency and now director of conservation at the Berkley Conservation Institute, gave a great presentation pointing out how we (scientists primarily) need to look up now and then - and in particular think about how increasing demand for water will affect conservation efforts. The rest of the conference was in the usual vein - plenty of presentations about how fish species A is growing in life stage B in habitat C, or bird X is or is not eating Y. So much knowledge, in so many people, and so hard to track how it's all relevant. Independently I've run into a lot of stuff about Knowledge Management which suddenly seemed horribly relevant (pun intended) - more on that in a future blog post, although I've touched on it before.
So, many apologies for the long lag in thinking, but I've been much too busy DOING it to think about it! Not what I signed up for when I set out to become an academic ...
Sunday, March 8, 2009
On the cutting edge!
Saturday, March 7, 2009
Adaptive Co-management?
Which brings me to Adaptive Co-management, beautifully outlined in a review article by Derek Armitage and several colleagues (Frontiers in Ecology and the Environment 2009 7:95-102). I hadn't heard of Adaptive Co-management, but the title obviously struck me as directly relevant! The basic idea is that coupled social-ecological systems are complex systems, in the sense of complexity theory rather than simply having a lot of parts, and thus traditional, centralized, command and control systems have difficulty managing them. In contrast, Co-management is a novel approach to governance that emphasises collaboration among multiple groups with diverse opinions, and emergent understanding instead of prescriptive knowledge. Adaptive co-management, if I understand correctly, is the merger of this social governance approach with Adaptive Management's emphasis on using management and models to reduce ecological uncertainties. I particularly liked their list of 10 conditions required for successful co-management. I believe that this kind of effort to generalize learning about AM in particular situations (narwhal management for Armitage et al.) will bring great dividends when trying to apply these tools to new situations. Several of those conditions resonate directly with the conditions listed in the Department of Interior Technical Guide on AM, such as the need for long-term commitment to the process, and a well-defined resource system.
What Adaptive Co-management adds to AM is a sense of how the social governance components of the system should evolve. And while thinking about these new components, it occurred to me that DoI AM largely assumes that these social components already exist, and that management will take place within existing social structures. In fact, insofar as laws are expressions of social architecture, the DoI approach is explicit in articulating this - an adaptive management plan must comply with existing laws and regulations. One of the conditions outlined by Armitage et al. also points to this - that the "National and regional policy environment is explicitly supportive of collaborative management efforts". Efforts like this to incorporate the larger social-ecological system into AM will be useful, in my opinion.
The other possible link that struck me was the idea of functional diversity - in ecosystems, increased functional diversity often leads to greater function - couldn't the same idea hold for social networks? It is not immediately clear to me what the social equivalent of relative abundance, body size, or specific leaf area is, but I'm not a social scientist. Surely there are ways to quantify the diversity of functional traits in a social network, and perhaps increasing that diversity increases the performance of those networks. Anyway, it's an exciting time to be involved in AM!
Thursday, March 5, 2009
Diversity and function
Not quite serendipitously I came across a great article on the relationship between biodiversity and ecosystem functioning today - I was following up on a comment by my colleague Ann Bleed that ecologists fail to connect with people, especially when arguing for something vague like "ecosystem function". Patricia Balvanera and colleagues (2006; Ecology Letters 9:1146-1156) conducted a quantitative meta-analysis of studies that perturbed the diversity of a group and measured one or more ecosystem functions. Across a huge range of functions, higher diversity leads to higher function. More recently, Michael Scherer-Lorenzen (2008; Functional Ecology 22:547-555) reported on some elegant experiments demonstrating that functional diversity, rather than species diversity per se, affects decomposition rates. What I liked the most about that work was the use of a quantitative metric for functional diversity based on measurable, ecologically important traits such as specific leaf area and C:N ratios. If all species had the same values of these traits, or if there was a monoculture, then functional diversity is zero. And it worked! What makes this particularly exciting is that it provides a means to connect relative abundances of species, and measurable traits of those species, to variation in ecosystem function. We can use population models to predict changes in relative abundance, and this approach means that we can also show that the mixture of species present can matter.
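One metric with exactly those properties (zero for a monoculture, and zero when all species share the same trait values) is Rao's quadratic entropy; I'm not certain it is the metric Scherer-Lorenzen used, but it captures the idea. A minimal sketch with hypothetical trait values:

```python
import numpy as np

def rao_q(rel_abund, traits):
    # Rao's quadratic entropy: abundance-weighted mean trait distance between
    # pairs of species. Zero for a monoculture or when all traits are identical.
    p = np.asarray(rel_abund, dtype=float)
    p = p / p.sum()
    t = np.asarray(traits, dtype=float)
    d = np.abs(t[:, None, :] - t[None, :, :]).sum(axis=2)   # pairwise trait distance
    return float(p @ d @ p)

# Hypothetical species with two standardized traits (e.g. SLA and C:N ratio).
traits = [[0.2, 1.0], [0.8, 0.4], [0.5, 0.9]]
print(rao_q([1/3, 1/3, 1/3], traits))            # mixed community: positive
print(rao_q([1, 0, 0], traits))                  # monoculture: 0.0
print(rao_q([1/3, 1/3, 1/3], [[0.5, 0.5]] * 3))  # identical traits: 0.0
```

Because it only needs relative abundances and trait values, it plugs straight into the population-model-to-ecosystem-function chain described above.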
I find it remarkable how completely random ideas can set up constructive harmonies of thought that lead to new places. As I was wondering about functional diversity in ecosystems, and how to use it as an objective in AM, I received an email from my Dad about David Snowden. Some of the material on that website looked interesting and relevant to my teaching, as well as my work helping groups develop decision making in ecology. I find his ideas about how groups accrue knowledge and turn it into decisions very intriguing. In one article he describes the idea of scanning information to recognize weak signals, and then acting on those signals. Where the resonance happened was when he described how diversity - yeah, diversity of individuals in a group - leads to improved scanning of information. Which brought me back to MRRIC, a huge group of diverse individuals. Although that diversity may make life difficult for the scientists that must communicate with them, that same diversity is the group's strength in searching for weak signals, and acting on them.
Wednesday, February 25, 2009
Anchoring! and other psychological trivia
For instance, it turns out that human decisions can be profoundly influenced by subtle non-linguistic signals such as tone of voice and the posture of the speaker, among others. There was a great summary of some of the "secret signals" in the Jan 29 edition of Nature (pg 528) through an interview with one of the scientists working on documenting these effects, Alex Pentland at MIT. Pentland is actually a computer scientist who has built small wearable devices to record people's "secret signals" as they go about their daily business. Using these techniques they have discovered ways for call center operators to improve their sales techniques to near 100% success. Now that's frightening. Worse, they also cite a study of faculty members that shows you can predict 70% of the variation in student evaluation scores from a 30 second slice of soundless video (Ambady and Rosenthal, J. Pers. Soc. Psychol. 64, 431-444)! Believe me, I grabbed that article right away - my student evaluations need all the help they can get.
So what does that have to do with adaptive management? AM involves people, from stakeholders that worry about the effects of a decision on their livelihoods, to scientists with a stake in getting their project deemed important enough to receive funding. This means that getting people together to agree on objectives, and later, choosing actions, means influencing how they think. At the beginning, I'm often trying to sell, literally sell, the process to a skeptical group of customers. I need Alex Pentland's secret signals big time. Up until now, I'm not sure that we've done a good job of recognising how important these social communication skills are for the people that help others come to good decisions - at least in the Department of Interior related AM community. In many other instances, we arrange for facilitators to be hired to provide the social skills, creating a divide between the technical experts and the social experts. I'm not sure yet what the best solution is, but it bears some careful thinking about.
Along the same lines, one of the phenomena that groups interested in AM encounter is 'anchoring' - the tendency to interpret new data in light of personal knowledge. This can lead people to interpret new data completely differently, even when they are both looking at exactly the same graph. If this sounds like a recipe for conflict, let me assure you, it is! As it turns out, whether the group members are happy or sad may influence how strongly folks are anchoring - a recent experimental test found that anchoring disappeared for happy non-experts compared to less happy non-experts (http://journal.sjdm.org/71130/jdm71130.pdf). Intriguingly, experts were relatively unaffected by mood - they were affected by anchoring regardless of how they felt. Experts in this case were people who had a lot of background knowledge of the decision being made - law graduates choosing a sentence for a crime. While that doesn't give me much hope - mostly my workshops involve teams of experts - it might help the next time I want to work with a larger group of non-expert stakeholders.
One last thought - figuring out how to use science effectively in decision making is hard enough - but what if you don't even have the scientists to turn to? Larkin Powell had some sobering thoughts on this from Namibia, as well as a good anecdote that makes me glad our restrooms have deadbolts and not keys ...
Thursday, February 19, 2009
Stimulating Ecosystem Services?
Destroying ecosystems for short-term economic benefit is like killing the cow for its meat, when one might keep from starving by drinking its milk for years to come. Now is not the time to slaughter the cow.
In addition, the editorial calls for an increase in ecosystem monitoring, research, analysis, and (bravo!) simulation. I couldn't agree more, but as ecologists we are ill-equipped to meet the immediate analytical demand that would be required to "fold ecosystem services into the budget".
Thursday, February 12, 2009
Out of the blocks!
My goal here is to start blogging about papers, books, TV programs, and whatever else strikes me as related to AM and the practice of AM. I'd like this to be like a running review. Hey, it's an experiment. Maybe it won't be useful, but at least it will give me something to do, and a place to put thoughts when they occur to me.