The EU experience in the first phase of COVID-19: implications for measuring preparedness


This report analyzes the experiences of five EU countries – Croatia, Finland, Germany, Italy and Spain – from the start of the pandemic until COVID-19 vaccines became available at the end of 2020.

The report focuses on testing and surveillance, health sector coordination, and emergency risk communication, identifying the specific challenges encountered in these areas as well as successful responses to them.

Implications for preparedness measurement are identified to inform epidemic preparedness efforts in EU Member States in the future.


In light of the challenges faced during the COVID-19 crisis, European Union (EU) legislation has been reviewed to strengthen the EU’s collective preparedness to respond to communicable disease threats in the future. Decision 1082/2013/EU on serious cross-border threats to health has been revised into a regulation, to be adopted in autumn 2022. ECDC’s mandate has also been revised and will enter into force once the regulation on serious cross-border threats to health is adopted and published in the Official Journal of the EU. Measuring and evaluating the performance of public health emergency preparedness (PHEP) systems is a key part of the process of strengthening preparedness.

This technical report presents an analysis focused on three issues (testing and surveillance, health sector coordination, and emergency risk communication) during the first phase of the COVID-19 pandemic. The analysis identifies specific challenges that were encountered in this phase, as well as successful responses to them. Implications for preparedness measurement are also identified to inform epidemic preparedness efforts in EU Member States in the future.

This analysis draws on the experiences of five countries (Croatia, Finland, Germany, Italy and Spain) during the first phase of the pandemic, i.e. before the launch of vaccination programs in December 2020. It is based on:

  1. pandemic preparedness and response plans, standard operating procedures and other documents related to COVID-19 response measures provided by countries;
  2. interviews with country representatives; and
  3. other documents identified through rapid literature reviews.

The analysis identifies the following general problems with existing preparedness measurement systems:

  • The COVID-19 pandemic forced EU Member States to develop new strategies, approaches and policies, placing strain on their PHEP systems and structures. These also had to be reviewed and revised as the pandemic evolved. The extent of revision and innovation required is not considered in existing preparedness measurement tools.
  • Existing measurement tools for preparedness are generally not compatible with a country’s internal reporting structure for public health, healthcare and other entities that influence emergency response.
  • Existing measurement tools for preparedness generally do not reflect the coordination required between different sections of the health system, particularly at the hospital and community levels.
  • Existing measurement tools for preparedness generally do not allow for the adequate flexibility and resilience needed to meet the challenges of scaling up a country’s pandemic response.

Section 3.1 builds on these general themes with specific examples of issues that are missing or insufficiently covered in existing preparedness measurement systems, in particular the Health Emergencies Preparedness Self-Assessment Tool (HEPSA) from ECDC and the Joint External Evaluation Tool (JEE) from WHO, and, for some parts, the Global Health Security Index (GHSI). The following conclusions are drawn:

  • An indicator referring to the ability to carry out large-scale testing, which was essential in the initial phase of the pandemic, should be included in preparedness measurement tools.
  • An indicator of surveillance system flexibility should be introduced in preparedness measurement tools.
  • Existing preparedness measurement tools related to testing and surveillance cover the main tasks, but do not address the ability of systems to scale up testing capacity, the importance and complexity of subnational surveillance and epidemiological investigation, or the challenges of adapting existing surveillance systems and developing new ones during a pandemic. These gaps should be addressed with specific indicators.
  • Although three capacities in ECDC’s PHEP logic model (management of medical countermeasures, supplies and equipment; medical surge; and hospital infection control practices) were found to be essential, they are not represented by corresponding indicators in existing preparedness measurement tools.
  • The “preventive services” capability of ECDC’s PHEP logic model should be expanded into a new, broader capability on “population-based healthcare coordination”, defined as “the ability to activate and strengthen coordination within a given geographic territory during a high-impact infectious disease outbreak among public health, ambulatory care (including primary care services), mental health and social support agencies, the public and private sectors, and hospital care, using integrated pathways between different levels of care (outpatient and inpatient)”.
  • Experience gained during the COVID-19 response has shown that the risk communication capacities identified in ECDC’s PHEP logic model are valid and relevant, but they are not fully represented in existing preparedness measurement tools. In addition, countries found it difficult to manage the infodemic, which meant that the logic model had to be expanded to include a fifth capability, “infodemic management”, i.e. dealing with an overabundance of information (some accurate and some not).

In summary, given the different preparedness measurement systems that exist, the analysis in this report suggests that the type of measurement approach and format used in the joint external evaluation process, for example, could be useful in evaluating EU preparedness efforts. This would involve first developing a set of measurement tools and indicators to address the areas identified in the analysis as less developed, and then creating a scoring system or scale for each area. As with the JEE process, the assessment would begin with preliminary analysis and scoring by national experts. This would be followed by a meeting at which peers from other countries would review the internal analysis documentation and meet with national experts to reach consensus on the scoring. The assessment process could include an analysis of existing systems, performance during the COVID-19 pandemic, and the “stress tests” mentioned in the proposed EU legislation on health emergency preparedness and response.
