‘Lessons learned in designing effective assessments’


TRANSCRIPT

Page 1: ‘Lessons learned in designing effective assessments’


European Environment Agency (EEA), Anita Künitzer

http://www.eea.eu.int

‘Lessons learned in designing effective assessments’

Page 2: ‘Lessons learned in designing effective assessments’


The EEA's Mission

... is to deliver timely, targeted, relevant and reliable information to policy-makers and the public for the development and implementation of sound environmental policies in the European Union and other EEA member countries.

Domingo Jiménez-Beltrán, Executive Director, EEA

Page 3: ‘Lessons learned in designing effective assessments’

EEA member countries

(Map legend: EU 15 member states plus Iceland, Liechtenstein and Norway; EEA candidate countries; Stability Pact countries; TACIS)

Page 4: ‘Lessons learned in designing effective assessments’


Production process of assessment

• Identification of users (politicians, scientists, school children, ...)

• Policy issues to be addressed: what should the assessment achieve?

• Process of the assessment

• Launch: at which policy event

Page 5: ‘Lessons learned in designing effective assessments’

Designing effective assessments: the role of participation, science and governance, and focus

Workshop co-organised by the European Environment Agency and the Global Environmental Assessment Project, 2001

Page 6: ‘Lessons learned in designing effective assessments’


Integrated Environmental Assessment

An interdisciplinary process of structuring knowledge elements from various scientific disciplines in such a manner that all relevant aspects of a complex societal problem are considered in their mutual coherence for the benefit of (sustainable) decision-making.

Page 7: ‘Lessons learned in designing effective assessments’

The Policy Cycle

• policy preparation
• policy formulation
• policy execution
• policy evaluation

Page 8: ‘Lessons learned in designing effective assessments’


Framework for scientific assessment process to inform policy makers

1. Example: IPCC (Intergovernmental Panel on Climate Change)

• Involved only expert scientists in defined disciplines, no political stakeholders

• Production of lengthy reports

2. Example: CLRTAP (Convention on Long-Range Transboundary Air Pollution)

• Less clear science-policy distinction
• Few formal reports

Page 9: ‘Lessons learned in designing effective assessments’


Effective assessments

• What is effective?
– Cost-effectiveness
– Improvements in the natural environment
– Fulfilling political objectives

• Attributes for effective assessments:
– Credibility
– Salience
– Legitimacy

Page 10: ‘Lessons learned in designing effective assessments’


Credibility

• Lack of credibility:
– Assessment based on shoddy methods
– Assessment ignores important empirical evidence
– Assessment draws inappropriate conclusions from data

• Gain credibility:
– Through the process by which the information is created (example: data obtained by good laboratory practice)
– By the credentials or other characteristics of the producers of the assessment (example: assessment done by well-known, highly regarded scientists)

Page 11: ‘Lessons learned in designing effective assessments’


Salience or relevance

• Lack of salience:
– The produced report is never referred to and never heard from again
– The assessment addresses questions whose answers do not interest the user

• Gain salience:
– The assessment is able to address the particular concerns of a user
– The user is aware of the assessment
– The user considers the assessment relevant to current policy

Page 12: ‘Lessons learned in designing effective assessments’


Legitimacy

A measure of the political acceptability or perceived fairness of an assessment to a user.

• Lack of legitimacy:
– In ‘global’ assessments, inputs from less powerful countries are not included or their interests are ignored

• Gain legitimacy:
– Users’ and participants’ interests, concerns, views and perspectives have been taken into account
– The assessment process has been a fair one

Page 13: ‘Lessons learned in designing effective assessments’


Assessment design (1)

1. Historical context of the assessment
• Characteristics of the issue area
• Position of the issue on the political agenda

2. Characteristics of the intended user
• Interest in the issue and/or assessment
• User capacity to understand the results
• User openness to different sources of advice

Page 14: ‘Lessons learned in designing effective assessments’


Assessment design (2)

3. Assessment characteristics

• Participation: who is involved in the assessment process?

• Science and governance: how are assessments conducted with respect to the interactions between scientific experts and policy makers?

• Focus: how broadly (multidisciplinary) or narrowly (technically) focussed should the assessment be? How consensus-based should it be?

Page 15: ‘Lessons learned in designing effective assessments’


Conceptual framework for considering effective assessments

Ultimate determinants:
• Historical context: issue characteristics, linkage, attention cycle
• User characteristics: concern, capacity, openness
• Assessment characteristics: science/governance, participation, focus

Proximate pathways:
• Salience
• Credibility
• Legitimacy

Assessment effectiveness results from the ultimate determinants acting through these proximate pathways.
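
To make the structure concrete, the following is a minimal Python sketch of this framework under stated assumptions: the class and field names, the 0-to-1 scores and the min() aggregation are illustrative inventions, not an EEA formula; only the grouping of determinants and pathways comes from the slide.

from dataclasses import dataclass, field

# Minimal sketch: three groups of ultimate determinants feed three proximate
# pathways (salience, credibility, legitimacy), which together shape the
# effectiveness of an assessment. Names and scoring are illustrative.

@dataclass
class UltimateDeterminants:
    historical_context: dict = field(default_factory=dict)          # issue characteristics, linkage, attention cycle
    user_characteristics: dict = field(default_factory=dict)        # concern, capacity, openness
    assessment_characteristics: dict = field(default_factory=dict)  # science/governance, participation, focus

@dataclass
class ProximatePathways:
    salience: float     # 0-1: relevance as perceived by the user
    credibility: float  # 0-1: scientific soundness as perceived by the user
    legitimacy: float   # 0-1: perceived fairness of the process

def effectiveness(p: ProximatePathways) -> float:
    # Illustrative aggregation: an assessment is only as effective as its
    # weakest attribute.
    return min(p.salience, p.credibility, p.legitimacy)

print(effectiveness(ProximatePathways(salience=0.9, credibility=0.8, legitimacy=0.4)))  # 0.4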

Page 16: ‘Lessons learned in designing effective assessments’


Participation: critical issues

• The capacity of partners, clients and/or users to participate in the assessment (travel costs, administrative capacity, time for the assessment itself).

• Are scientists participating in their individual capacity (good for scientific credibility) or are they accountable to governments?

• Encourage participation of the stakeholders for whom the assessment is designed (NGOs, the policy-making community, country representatives) to make them interested in the final report.

• The process of participation might be more important than the content: inclusion as an author or attendance at a meeting increases the legitimacy of an assessment.

• Broad review, by several international organisations, of an assessment done by a few scientists can increase its legitimacy.

Page 17: ‘Lessons learned in designing effective assessments’


Science and governance: critical issues

• Assessments on issue areas that are scientifically controversial should be undertaken by institutions accountable to the scientific community, to minimise credibility concerns.

• While scientists value the credibility of assessments, politicians value salience. Assessments in more mature scientific areas might therefore be better undertaken by other organisations more focussed on policy needs.

• Including policy recommendations in a scientific assessment can be dangerous. Here, assessments by ’boundary organisations’ that are accountable to both science and policy may be the solution.

Page 18: ‘Lessons learned in designing effective assessments’


Focus: critical issues

• Successful assessments avoid addressing controversial issues.

• Broadly focussed assessments include more of the relevant factors, reach a wider audience and might be more relevant to decision-makers.

• Most assessments to date have been too simple, excluding too many factors and causal chains.

• Assessments should be kept comprehensive despite all the interactions; periodic separate thematic assessments could be produced instead of one big comprehensive assessment.

Page 19: ‘Lessons learned in designing effective assessments’

Use MDIAR to analyse the information provision process.

MDIAR stands for:
M: Monitoring
D: Data
I: Information
A: Assessment
R: Reporting
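
As a rough illustration of the chain, here is a minimal Python sketch of MDIAR as a pipeline of stages; every stage function and its placeholder body is an assumption introduced for the example, and only the ordering Monitoring, Data, Information, Assessment, Reporting comes from the slide.

# Minimal sketch of the MDIAR chain as a pipeline of placeholder stages.

def monitoring() -> list:
    # collect raw observations
    return [42.0, 40.5, 39.8]

def data(observations: list) -> list:
    # quality-check and store the observations
    return sorted(observations)

def information(dataset: list) -> dict:
    # aggregate the data into an indicator (here: a simple mean)
    return {"indicator": sum(dataset) / len(dataset)}

def assessment(indicators: dict) -> str:
    # interpret the indicator against the policy question
    return f"Indicator value {indicators['indicator']:.1f}: interpretation goes here."

def reporting(findings: str) -> None:
    # communicate the assessment to its users
    print(findings)

# Running the chain end to end:
reporting(assessment(information(data(monitoring()))))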

Page 20: ‘Lessons learned in designing effective assessments’

The DPSIR framework

• Drivers (e.g. transport and industry)
• Pressures (e.g. polluting emissions)
• State (e.g. air, water and soil quality)
• Impact (e.g. ill health, biodiversity loss, economic damage)
• Responses (e.g. clean production, public transport, regulations, taxes, information)
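
As an illustration only, a small Python sketch that stores the DPSIR elements with the examples above in a plain dictionary (an assumed layout, not an EEA data model) and prints the causal reading from drivers to impact.

# DPSIR elements with the examples from the slide, held in a plain dictionary.
DPSIR = {
    "Drivers":   ["transport", "industry"],
    "Pressures": ["polluting emissions"],
    "State":     ["air quality", "water quality", "soil quality"],
    "Impact":    ["ill health", "biodiversity loss", "economic damage"],
    "Responses": ["clean production", "public transport", "regulations", "taxes", "information"],
}

# Causal reading: Drivers -> Pressures -> State -> Impact, with Responses
# feeding back on all of the other elements.
chain = ["Drivers", "Pressures", "State", "Impact"]
for cause, effect in zip(chain, chain[1:]):
    print(f"{cause} ({', '.join(DPSIR[cause])}) -> {effect} ({', '.join(DPSIR[effect])})")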

Page 21: ‘Lessons learned in designing effective assessments’

A performance indicator

(Chart: emissions of ozone precursors, EU15, plotted against the target)
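
To show how a performance indicator of this kind is read against its target, here is a small Python sketch with hypothetical index values; the distance_to_target function and the numbers are invented for illustration and do not reproduce the EU15 ozone-precursor series.

def distance_to_target(current: float, base_year: float, target: float) -> float:
    """Fraction of the required reduction achieved so far (1.0 means the target is met)."""
    required = base_year - target
    achieved = base_year - current
    return achieved / required if required else 1.0

# Hypothetical index values: base year = 100, target = 70, latest value = 85.
progress = distance_to_target(current=85.0, base_year=100.0, target=70.0)
print(f"{progress:.0%} of the required reduction achieved")  # 50%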

Page 22: ‘Lessons learned in designing effective assessments’

The link between indicators and the policy process – distinguishing the differences and improving relevance

1. Indicators linked to quantitative targets
2. Indicators linked to stated objectives
3. Indicators linked to policy intentions or public expectations

Page 23: ‘Lessons learned in designing effective assessments’


What are scenarios?

Scenarios are archetypal descriptions of alternative images of the future, created from mental maps or models that reflect different perspectives on past, present and future developments.

Page 24: ‘Lessons learned in designing effective assessments’


Measuring is Not Knowing: The Marine Environment and the Precautionary Principle

‘The enormous number of papers in the marine environment means that huge amounts of data are available, but …we have reached a sort of plateau in …the understanding of what the information is telling us …. We… seem not to be able to do very much about it or with it. This is what led to the precautionary principle, after all – we do not know whether, in our studied ecosystem, a loss of diversity would matter, and it might’.

Marine Pollution Bulletin, Vol 34, No. 9, pp. 680-681, 1997

Page 25: ‘Lessons learned in designing effective assessments’


Precautionary principle in assessments

• Levels of proof: assessments for public policy-making need lower levels of proof than normal good science

• Multidisciplinary approaches: improve the quality of an assessment by considering aspects of the problem from different perspectives

• Early warnings: successful prevention of environmental impacts and the associated costs requires early warnings

Page 26: ‘Lessons learned in designing effective assessments’

Levels of proof - some illustrations

Beyond all reasonable doubt

reasonable certainty

balance of probabilities/evidence

strong possibility

scientific suspicion of risk

negligible/insignificant
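
One way to make ‘lower levels of proof’ operational is to treat this scale as an ordered enumeration, as in the Python sketch below; the enum and the comparison are illustrative assumptions, and only the ordering of the levels comes from the list above.

from enum import IntEnum

# The levels-of-proof scale as an ordered enumeration, weakest evidence first.
class LevelOfProof(IntEnum):
    NEGLIGIBLE_OR_INSIGNIFICANT = 1
    SCIENTIFIC_SUSPICION_OF_RISK = 2
    STRONG_POSSIBILITY = 3
    BALANCE_OF_PROBABILITIES = 4
    REASONABLE_CERTAINTY = 5
    BEYOND_ALL_REASONABLE_DOUBT = 6

# A precautionary assessment for policy may act on a lower level of proof
# than is demanded of settled science:
policy_threshold = LevelOfProof.SCIENTIFIC_SUSPICION_OF_RISK
science_threshold = LevelOfProof.BEYOND_ALL_REASONABLE_DOUBT
print(policy_threshold < science_threshold)  # True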

Page 27: ‘Lessons learned in designing effective assessments’

Organisation through Interest Groups

On each of the 33 servers across Europe

Page 28: ‘Lessons learned in designing effective assessments’


CIRCLE Library Service

Page 29: ‘Lessons learned in designing effective assessments’

Data Flows in EIONET

(Diagram elements: National Layer; European Layer; EEA Warehouse; Information Retrieval System; data access and visualisation; E2RC, the Reports and Reference Centre Concept)

Page 30: ‘Lessons learned in designing effective assessments’
