The Science of Science Advice - Communicating and Using the Advice (6)

(September 25th, 2015) Many government decisions have a scientific element. But who decides what kind of scientific advice will be used? How is it presented to politicians and to the citizens they represent? Jeremy Garwood looks at the rise of the ‘Science of Science Advice’.





Having thought at length about the question, science advisers need to communicate their answers/advice to policy-makers, usually in the form of a written report. However, there is always a danger that the report’s language and content will not be sufficiently adapted to maximise shared understanding (and minimise misunderstandings). The OECD stresses that this is “a major factor” in determining the impact of the advice. “Scientists responsible for communicating advice need to spend time to speak with policy-makers to explain their work and ideally reach mutual understanding of its policy implications.”

However, as discussed later, there are “crisis” situations in which there does not seem to be enough time to consider and communicate all the scientific arguments in detail. Furthermore, the “timing” of the communication of science advice can create problems. “Premature, inaccurate or biased reporting can undermine the whole advisory process.” Most established advisory bodies have informal rules that ask experts not to communicate before the advice has been made public, but often there is no real regulation or control in place. Premature communication of partial results and findings may lead to loss of credibility, to misunderstanding and to increased public pressure. “An early leak can lead to further questions that distract from the main task.”

In some urgent situations – especially the assessment of risks associated with natural disasters or disease epidemics – early communication of initial results and recommendations is necessary. Here, transparency and openness are essential, even if this can further complicate the subsequent debate. “It should be clear when the advice is based on preliminary results and incomplete data.” This helps to ensure that a later change in the interpretation of the data, or the use of additional data, does not lead to public mistrust in the science advisory process.

Discussing drafts

The scientific advice will often be provided to both decision-makers and the public. If the advisory body is fully independent of the decision-making body, the final advice can be made available simultaneously to the public and to the policy side. But sometimes the decision-making ‘customer’ is first consulted over the “draft advice”, and comments are taken into account before the final document is made public. For example, for the advisory reports of the Intergovernmental Panel on Climate Change (IPCC), the Panel prepares a first draft including all findings of the different expert groups. After this draft has been reviewed, authors prepare a second draft of the report and a first draft of its ‘Summary for Policy-Makers’. These are subject to simultaneous review by both governments and experts. Authors then prepare the final drafts of the report and the Summary for Policy-Makers. These are distributed to governments, who provide written comments on the revised draft before meeting in plenary session to approve the summary and accept the report.
 
Not too technical

“A common mistake for the communication of scientific advice is for it to be written in long, very technical reports.” Instead, the OECD recommends “short readable reports.” An advisory report for policy-makers, and any open publications, should be written in a scientifically accurate manner and, at the same time, be understandable to those expected to consider the advice. One way to achieve this is to write several versions of the report, for example, a scientific report and a summary for policy-makers. This dual-report process has been adopted by a number of scientific advisory bodies at both international and national levels. Often, there will be a general introduction – an abstract or preface – presenting the main details of the question that has been addressed, who has been consulted, who has written the report, and the key conclusions. This is followed by an “executive summary” of several pages that presents the issues in a short, readable form accessible to a wide audience. Finally, the detailed part of the report gives in-depth background and analysis, including technical reviews of the available scientific evidence, areas of uncertainty, divergent interpretations, alternative ways of approaching the problem, and how the recommendations have been formulated (see, for example, the 289-page report “State of the science of endocrine disrupting chemicals” prepared by a group of experts for the United Nations Environment Programme (UNEP) and the WHO).

Receiving advice transparently

“The authority of the advisory body and its credibility may be substantially challenged by the way the policy side officially reacts to its advice.” For the OECD, the importance of openness in the scientific advisory process “cannot be emphasised enough”, and this relates directly to the communication and use of scientific advice. “Governments must assure timely access for the public to information related to policy decisions based on scientific advice.” In some countries this is required by law, although consideration is also given to the treatment of “sensitive information” regarding diplomatic, national security, privacy and other issues.

Transparency is necessary to allow public accountability and to demonstrate the independence of scientific advisers. However, it also needs to be shown how scientific advice has been considered when subsequently drawing up government policy, especially when the resulting policy decisions conflict with the solicited advice. Policy decisions may well be based on a number of criteria other than the scientific information provided by advisory bodies. But it is important that the science advice is not used “selectively” to justify pre-determined positions or as an excuse to avoid political responsibility. If, at the end of the advisory process, the scientific advice (with its inherent uncertainties) gets blamed by government, public or private interests, this can undermine the credibility of the advisory system as well as “ruining the development of evidence-based policies.”

Assessing the impact

This is a fifth phase of the advisory process that is often overlooked. Perhaps we should take time to look not only at how good the advice was on a given question, but also at whether it had a real and decisive ‘impact’, e.g. did it solve the problem, lead to new research, or provide a basis for addressing the risk of similar recurrent questions/problems?

Some scientific advisory bodies provide recommendations that have to be enforced by decision-makers (usually in the regulatory domain), but most provide advice that is non-binding. Few advisory structures carry out formal impact evaluation or are required to monitor or comment on the implementation of their recommendations. “They usually consider that their role ends when the advice is provided, do not comment on policy decisions and only intervene and communicate when the advice is misinterpreted.” Nevertheless, assessing the actual impact of such advice is important, given the time and resources devoted to generating it. Impact assessment can also help improve the advisory processes themselves. The OECD notes some exceptions; for example, the German ‘Expertenkommission Forschung und Innovation’ (EFI – Commission of Experts for Research and Innovation) conducts follow-up surveys and communicates on the uptake of its science policy advice, even though the EFI itself has no direct role in its implementation.

International advisory mechanisms may also find their usefulness and credibility diminished if their advice is perceived as having little impact. These bodies do not usually provide binding advice (one notable exception is advice from EU regulatory agencies, e.g. the European Medicines Agency and the European Food Safety Authority, which have a statutory role in regulatory procedures) and their potential policy impacts can be very diverse, depending on the nature of their mandate. Integrated science assessment structures, such as the Intergovernmental Panel on Climate Change (IPCC) or the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), are mainly focused on providing evidence for international policy agreements and standards, while UN organisations, such as the World Health Organization (WHO), can produce guidelines and standards that are often of particular interest to developing countries. On the other hand, international risk assessments may aim mainly to raise awareness and help build response capacities in exposed countries.

Trying to provide information to a very wide range of recipients can lead to a lack of precision and relevance for some international bodies. For instance, while the IPCC is specifically targeted at informing the negotiations for the United Nations Framework Convention on Climate Change (UNFCCC), another UN advisory structure – the UN Global Environment Outlook (GEO) – aims at supporting decision-making at all levels and does not have a specific policy forum that it targets on a regular basis. As a result, it is a major challenge for GEO to define its impact strategy and find the most suitable channels to inform policy discussions.

In general, it can be difficult for national governments to change policies solely on the basis of recommendations provided by an international advisory committee. To be really effective, international science advisory mechanisms often need to be complemented by national processes.

In a future editorial, Jeremy Garwood will look in more detail at how science advice is provided and where it can all go wrong.

Jeremy Garwood

Photo: Fotolia/Gajus



