The Science of Science Advice - Producing the Advice (5)
(September 22nd, 2015) Many government decisions have a scientific element. But who decides what kind of scientific advice will be used, and how is it presented to politicians and the citizens they represent? Jeremy Garwood looks at the rise of the ‘Science of Science Advice’.
Having framed the question for the science advisers and taken steps to ensure the independence and integrity of their advice, the next phase of the advisory process is actually producing the appropriate scientific advice to answer it.
How do the chosen independent science advisers approach the questions that are posed to them? Translating research results into advice that is useful for policy-makers is a challenge. On many issues there will be diverse scientific views, and these need to be presented in an organised manner if they are to be understood and acted upon. In addition, scientific evidence often entails a considerable degree of uncertainty that must be interpreted and placed in context, which can make it difficult for scientific advisers to communicate clear advice to policy-makers. Both the diversity of scientific views and the attendant uncertainties therefore need to be handled properly when generating, communicating, and using scientific advice.
Accommodating diverse opinions
“Individual scientists have diverse views on many issues, depending on their fields, methods, approaches, and subjective judgments. Uncertainties accompanying scientific knowledge may also cause diversity in scientists’ views.” This range and diversity of opinion needs to be properly handled if scientific advice is to be incorporated into policy making in a sound and effective manner. In this respect, it is important “to acknowledge that scientific judgment itself is made within a value-rich context, and that data-collection depends to a large extent on the way a question is framed.” As was noted in ‘The Science of Science Advice’ part 3, different frames may lead to different scientific results and opinions.
Ensuring “scientific legitimacy” on complex global issues requires particular efforts as the international and interdisciplinary nature of the problems increases the likelihood of divergent opinions. The advisory structures (groups, councils, panels, etc.) that address these complex issues need to set up internal procedures that can allow the necessary scientific debate, while at the same time providing “integrated assessments and recommendations that can be of use to policy-makers.”
However, scientists often acquire specialised knowledge in depth rather than breadth, and this can reinforce a narrow perception of problems - what the OECD report terms “silos” - to the detriment of sound integrated advice (discussed in ‘Disciplinary dilemma: working across research silos is harder than it looks’, The Guardian, 11/06/14). Taking a broad approach to the advisory issues could facilitate discussion aimed at building consensual agreement. “However, experience tells us that such agreement will sometimes be hard to achieve.” Furthermore, attempts to force a full consensus for the sake of clarity could undermine the rigour of the final advice. The OECD therefore suggests that if “legitimate differences in views cannot be resolved, they should be identified and communicated to policy-makers.”
Dealing with uncertainty
“The fundamental difference between risk and uncertainty is not always well-understood.” Since scientific advice inevitably incorporates varying degrees of uncertainty, there may be doubts about the scientific evidence. Considerable statistical uncertainty may also mean that key conclusions can only be expressed in terms of probabilities. There have been many cases where scientific advice was contested because the evidence on which it was based was not conclusive enough to provide “a clear answer.” The OECD therefore recommends that “as a general rule” scientific advisers should explicitly assess uncertainties, and communicate and explain them to policy-makers.
When producing the scientific evidence underlying their advice, scientists should take appropriate steps to ensure the reproducibility of their analysis and the quality of their advice. A standard quality-assurance measure is to have others read and comment on the report - in particular, peer review by experts not involved in the immediate advisory process can provide an important “quality control mechanism.” When tackling complex, multi-factorial issues, experts from other domains may be of great help, provided that “they formulate their opinions based on scientifically valid work.”
Once the chosen science advisers have discussed, debated, and disputed among themselves to settle the scope and content of their advice, the next phase of the advisory process concerns communicating it to best effect, as discussed in the next instalment.