Scientific advising

I don’t know why, but I didn’t find the Foody article that useful. Of course, it was labelled as a progress report, but it had little depth and little to add.

The Hunter article also glosses over some details that would have been appreciated. The idea of ‘meta-uncertainty’ was very interesting, though, and I think it is one of the main reasons we need experts and education in geography and uncertainty. The development of methods to determine error is very important for future projects. However, the communication of error is the other key problem, especially when submitting results and conclusions to political bodies. This is the very reason that governments appoint advisors from the scientific community to help interpret scientific findings. I don’t, however, think that developing a good way of communicating and displaying uncertainty is going to do away with scientific advisors. There must be more to knowing about uncertainty than just looking at a table or graph like the examples below. I think that part of truly being able to understand error in results and conclusions is knowing about the processes and methods used to reach them. How can we measure the potential loss of accuracy, or other problems, introduced by a workflow? I don’t think that is going to be fully possible with just a graph or table. Decision makers still need to know, or have communicated to them, the processes that were followed.
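To make that workflow question a little more concrete, here is a minimal sketch (my own illustration, not something taken from the Foody or Hunter articles) of one common way to measure how input error carries through a processing chain: Monte Carlo simulation. Everything in it is hypothetical, including the toy elevation profile, the assumed 1.5 m input error, and the two-step “workflow” of slope followed by mean slope.

```python
# A minimal, hypothetical sketch of Monte Carlo error propagation:
# perturb the input according to its assumed measurement error, push each
# realisation through the same workflow, and summarise the spread of outputs.
import numpy as np

rng = np.random.default_rng(42)

CELL_SIZE = 30.0   # assumed cell size in metres (hypothetical)
ELEV_STD = 1.5     # assumed standard error of the elevation data, metres
N_RUNS = 1000      # number of Monte Carlo realisations

def workflow(elevation):
    """Stand-in for a longer GIS workflow: slope along a 1-D elevation
    profile, then the mean slope over the profile (both steps hypothetical)."""
    slope = np.abs(np.diff(elevation)) / CELL_SIZE
    return slope.mean()

# A toy "measured" elevation profile (hypothetical data).
measured = np.array([100.0, 103.0, 107.5, 111.0, 112.0, 115.5])

# Propagate: add noise consistent with the stated input error, re-run the workflow.
results = np.array([
    workflow(measured + rng.normal(0.0, ELEV_STD, size=measured.shape))
    for _ in range(N_RUNS)
])

print(f"mean slope estimate: {results.mean():.4f}")
print(f"std. dev. due to input error alone: {results.std():.4f}")
print(f"95% interval: {np.percentile(results, 2.5):.4f} to {np.percentile(results, 97.5):.4f}")
```

The spread of the simulated outputs puts a number on the accuracy lost along the way, but, as argued above, the number alone says nothing about the steps that produced it, which is exactly what a decision maker would still need explained.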
The kinds of systems mentioned in the papers seem to be geared towards researchers. That doesn’t necessarily mean they will translate into ‘plain language’ for decision makers (the visualisations being developed at http://slvg.soe.ucsc.edu/unvis.html certainly seem complicated to understand, especially with multidimensional data), so perhaps another set of systems and visualisations needs to be developed. Maybe we will need one set of visualisations and metrics for understanding, and another set for communicating results.
Finally, would making decisions necessarily hinge upon “this conclusion is more accurate, so we’ll choose this one”? Would a decision maker, presented with a finding deemed to have the fewest errors, always choose to support that conclusion?
Either way, better ways of understanding and communicating error and uncertainty will surely benefit the scientific community by making conclusions a little more transparent and understandable to the public.
-?

One Response to “Scientific advising”

  1. Peter says:

    Hi, can you leave your name on the post, so I know who to grade?