Cross-cutting assessment methodology developments at EFSA
Interview with Tony Hardy, Chair of EFSA’s Scientific Committee
Professor Tony Hardy is an ecologist, environmental chemist and ecotoxicologist with extensive research and risk assessment experience. His particular expertise is the environmental impact of pesticides, GMOs and contaminants on wildlife. As Chair of the Scientific Committee, Professor Hardy is closely involved in overseeing EFSA’s efforts – both carried out by the Scientific Committee and by EFSA’s scientific staff – aimed at developing harmonised risk assessment methodologies for use in EFSA’s work.
Why has EFSA produced an EFSA Journal Editorial about its current work on risk assessment methodologies?
We want the wider scientific community to understand our work better. So we are creating the conditions for others to repeat our assessments if they so wish and in some cases to contribute to the assessment with additional information and considerations.
This kind of open scientific assessment rests on three essentials: quality data, rigorous methodology and transparency for effective communication.
This editorial aims to demonstrate how EFSA’s work can support clear, unambiguous communication of assessment outcomes to decision-makers, the wider scientific community and stakeholders.
More specifically, we felt it was important to highlight a series of activities related to our assessment methodologies that EFSA’s scientific staff and we, the members of EFSA’s Scientific Committee, are currently developing, as these cut right across the diverse spectrum of scientific areas that EFSA covers.
What are these methodological developments?
There are four separate but closely connected activities:
A methodological umbrella project called “Promoting Methods for Evidence Use in Scientific Assessments”. This will define the processes and guiding principles for using evidence in scientific assessments. This is led internally by scientific staff in EFSA.
The development of guidance and a toolbox of suitable methods to describe and account for uncertainty (both qualitatively and quantitatively) during the different stages of an assessment.
Guidance on how to combine the different strands of evidence in a consistent and transparent way in scientific assessments using the weight of such evidence, i.e. its relative importance for the specific assessment at hand.
Guidance on scientific criteria for deciding on the biological relevance of observed adverse or positive health effects for the target species under consideration.
The last three activities are spearheaded by EFSA’s Scientific Committee, which takes the lead on horizontal or cross-cutting issues.
In practical terms, how do you see this work helping EFSA’s experts to carry out their work?
These activities will provide more complete practical guidance to support EFSA’s independent experts and internal scientific staff across the wide spectrum of scientific assessments, while recognising the need for transparency, fitness for purpose and timeliness of advice to decision-makers.
For example, common standards and criteria for assessing, reporting and communicating uncertainties, applicable across all scientific areas, will improve transparency and help to better contextualise the conclusions of EFSA’s assessments.
Likewise, the guidance on “biological relevance” aims to further clarify and create a common understanding of how expert judgement is used to decide on the usefulness of data available for an assessment. For example, is an observed effect adverse in the organism (e.g. test animal) being studied or just an adaptive response? Can the available data (e.g. test results) be extrapolated to humans or another specific population being assessed? These and other specific developments within the scope of these activities are critical for EFSA’s scientific assessments.
Doesn’t EFSA already have clear guidance on how to deal with evidence and related uncertainties?
Yes, of course, when we do scientific assessments at EFSA we apply internationally recognised assessment methodologies that ensure these issues are rigorously addressed. We have also developed a large body of cross-cutting and sector-specific guidance documents to address EFSA’s specific assessment needs, particularly for the evaluation of regulated products. But science is an innovative, iterative process that continuously builds on hundreds of years of accumulated knowledge. So we are always looking for ways to improve how we do things and fill gaps where new challenges or needs arise.
Are stakeholders including other risk assessors involved in these activities?
Absolutely: EFSA does not work in isolation. The Authority receives most of the mandates for its work from risk managers in the European Commission, so addressing their needs, and those of the European Parliament and the Member States, is critical since they depend on EFSA’s scientific advice to inform their risk management decisions. EFSA also depends greatly on Member States for much of its data and expertise and regularly consults with its national and international partners as well as the broader scientific and stakeholder communities on important issues. For example, we shared this editorial with organisations carrying out similar work to EFSA and will hold an important workshop with them later this year to generate additional input into these activities.
When and how will this work begin to take effect?
These interdependent activities are already underway and will be delivered in stages by early 2017. In the first half of 2015, you can expect the first report on Promoting Methods for Evidence Use, which will set out the common principles. By the middle of the year, we also plan to hold a public consultation on the draft Guidance on uncertainty analysis in risk assessment.
As a scientist, what do you get out of it for your work?
To me as an expert, the outcomes of these activities provide state-of-the-art (or state-of-the-science) guidance that helps us apply up-to-date, robust scientific methods more consistently and transparently. The challenge with developing such guidance documents is to engage the wider scientific community and to align the documents with international scientific opinion.