Collection, appraisal and analysis of evidence – data, literature and expert judgement – are key steps in scientific assessments.
The following principles guide our approach to evidence use: impartiality, excellence (methodological quality), transparency and openness, and responsiveness (fitness-for-purpose).
PROmoting METHods for Evidence Use in Scientific assessments (PROMETHEUS) is an EFSA initiative designed to foster these principles. It involves a four-step approach that can be tailored to the different circumstances and requirements of each scientific assessment:
- Upfront planning of the assessment strategy, defining the relevant data and the approach for collecting, appraising and integrating them
- Conducting the scientific assessment in line with the plan, and independently of prior knowledge of the results of the available studies
- Verifying the process to ensure alignment with the plan and the guiding principles
- Documenting and reporting of all steps, including deviations from the original plan.
EFSA scientific outputs that have used or are under development using this approach are: isoflavones, Listex, dioxins, insect resistance monitoring of Lepidoptera-resistant Bt maizes for PMEM, bluetongue, glyphosate and animal health, dietary reference values for sodium, preparation for the re-evaluation of bisphenol A, and pest assessment of Eotetranychus lewisi.
While primarily aimed at EFSA’s Panels and scientists, the principles and the process can also be applied by scientific organisations carrying out work on EFSA’s behalf.
Sources of evidence
- the EFSA Data Warehouse contains a series of databases (e.g. the Comprehensive Food Consumption Database, chemical occurrence data, chemical hazards tested in laboratory studies)
- analytical data mainly collected by Member State data providers but also submitted by other stakeholders (universities, research centres, industry) sometimes in response to a call for data
- data from experimental studies that are submitted to EFSA by food operators as part of a market authorisation process
- data from other external data holders (e.g. the EU statistical office, Eurostat, or the World Health Organization).
We work closely with our network of Member State data providers to standardise how data are collected and submitted to us, improving data quality, usability and accessibility.
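Standardising submitted data typically means checking each record against an agreed format before it enters a database. The sketch below is purely illustrative, using hypothetical field names and units rather than EFSA's actual data model, to show what such a validation step looks like in practice.

```python
# Illustrative sketch only: a minimal validator for analytical data records.
# Field names and allowed units are hypothetical, not EFSA's actual standard.

REQUIRED_FIELDS = {"sample_id", "country", "substance", "result_value", "unit"}
ALLOWED_UNITS = {"mg/kg", "ug/kg", "mg/L"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors (empty if the record passes)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    if record.get("unit") not in ALLOWED_UNITS:
        errors.append("unknown unit: %r" % record.get("unit"))
    value = record.get("result_value")
    if not isinstance(value, (int, float)) or value < 0:
        errors.append("result_value must be a non-negative number")
    return errors

record = {"sample_id": "S1", "country": "IT", "substance": "cadmium",
          "result_value": 0.03, "unit": "mg/kg"}
print(validate_record(record))  # → []
```

Running every submission through a shared validator of this kind is one way a network of data providers can converge on a single format.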
Systematic literature reviews map all the available literature on a given research topic. They require a structured process: identifying and screening relevant research through an extensive literature search, extracting and critically appraising the evidence, analysing the data from the included studies, and finally reporting on the process.
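The identification and screening steps above have a simple mechanical core: removing duplicate records from the search results, then shortlisting records against inclusion criteria. The sketch below uses hypothetical records and a crude title-keyword filter; real reviews rely on dedicated tools, full eligibility criteria and dual independent screening.

```python
# Illustrative sketch only: deduplication and a first-pass title screen
# for literature search results. Records and criteria are hypothetical.

def deduplicate(records):
    """Remove duplicate records, keyed on DOI when present, else on title."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or rec["title"].lower().strip()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

def screen(records, include_terms):
    """Keep records whose title mentions at least one inclusion term."""
    return [r for r in records
            if any(term in r["title"].lower() for term in include_terms)]

records = [
    {"doi": "10.1/a", "title": "Dietary exposure to cadmium in Europe"},
    {"doi": "10.1/a", "title": "Dietary exposure to cadmium in Europe"},  # duplicate
    {"doi": None, "title": "A history of food law"},
]
shortlist = screen(deduplicate(records), ["cadmium", "exposure"])
print(len(shortlist))  # → 1
```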
EFSA Guidance on systematic review methodology promotes eight key steps in the systematic review process.
Expert knowledge elicitation
Expert judgement is needed when empirical evidence alone cannot support an assessment:
- Empirical evidence may be limited, absent, conflicting, of doubtful relevance, open to different interpretations or inaccessible via publicly available information sources (e.g. bibliographic databases, scientific journals or websites). In such cases, assessors can elicit reliable information from knowledgeable experts using systematic and standardised methods.
- Unaided expert judgement of the quantities required for risk modelling – and particularly the uncertainty associated with such judgements – can be biased, reducing its value. Various systematic and standardised methods exist to elicit knowledge from experts in as unbiased a manner as possible.
Protocols and principles – the EFSA Guidance on expert knowledge elicitation (EKE) provides detailed protocols for obtaining expert judgement in the areas covered by EFSA’s food safety remit. It also sets out guiding principles for overcoming the major challenges to expert knowledge elicitation – framing the question, selecting the experts, eliciting uncertainty, aggregating the results of multiple experts, and documenting the process.
The EFSA Guidance also establishes a phased process for EKE, with assigned responsibilities at each step: defining the assessment problem, preparing for elicitation and the elicitation itself, culminating in complete documentation.
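One of the challenges named above, aggregating the results of multiple experts, can be illustrated with an equal-weight linear opinion pool, i.e. a weighted average of the experts' probability judgements. This is just one common aggregation rule, shown here with made-up numbers; the EFSA EKE guidance describes several elicitation and aggregation protocols.

```python
# Illustrative sketch only: an equal-weight linear opinion pool, one common
# rule for combining several experts' probability judgements of the same event.

def linear_opinion_pool(judgements, weights=None):
    """Weighted average of the experts' probabilities; equal weights by default."""
    n = len(judgements)
    weights = weights or [1.0 / n] * n
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * p for w, p in zip(weights, judgements))

# Hypothetical example: three experts judge the probability that a
# contaminant exceeds a regulatory limit.
experts = [0.10, 0.25, 0.40]
print(round(linear_opinion_pool(experts), 3))  # → 0.25
```

Unequal weights (e.g. from calibration questions) can be passed explicitly; the default treats all experts as equally credible.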
Training and development – we provide training to our scientists and external experts to ensure a consistent approach to EKE. We also support the further development of expert knowledge elicitation methodology at EFSA and in the wider scientific community.