
MINUTES OF MEETING
Project: The Study on the Performance Measurement Framework
Date: 18/03/2014
Location: DG TAXUD, Brussels
Meeting no.: 5
Taken by: Ida Maegaard Nielsen
Subject: Progress meeting for the Steering Group
Attendees
Monitoring Group: Nancy Peeters (R3), Ioana Condurat (R3), Bernie Komduur (NL), Petya Ivanova (BG), Maria Prieto Pleite (ES), Ieva Peskova (LV), Gabriella Nagy (HU), Duygu Yucesoy (TK), Otto Christiane (DE), Galabina Mitkova (BG), Muriel Saint Supery (ES), Frederik Persson (SE), Gowtam Jinnuri (FR)
The Evaluation Partnership / Ramboll: Vanessa Ludden (VL) (TEP), Bradford Rohmer (TEP), Ida Nielsen (Ramboll)
1. Introduction
Nancy Peeters (Unit R3) briefly outlined the context within which the study on the
Performance Measurement Framework (PMF) for Customs 2020 and Fiscalis 2020 takes
place. She stressed that the study is part of a broader trend towards increased monitoring
of spending programmes both within DG TAXUD and the European Commission in general.
Additionally, Nancy Peeters reminded participants of the role of the project group and
emphasised that the purpose of the meeting was to get the monitoring group members’
input and feedback on the data collection tools.
After a tour de table, Vanessa Ludden restated the study’s aims and scope. She
explained that the aim of the study was to contribute to and complete the draft performance
measurement framework developed by unit R3 to enable the measurement of the Customs
and Fiscalis 2020 programmes’ implementation, processes and results using a
comprehensive, detailed and feasible monitoring system.
Vanessa Ludden also described the scope of the study as covering the functioning of the programmes and their outputs, results and long-term impact, without assessing in depth the underlying tax/customs policy. A few key considerations in relation to the study were also stressed, for example the need to develop one single framework for both programmes that is proportionate and consistent and that takes data and resource constraints into account.
An overview of progress to date and next steps was also provided by Vanessa Ludden,
highlighting the key methods and tools employed as part of the study.
2. The intervention logics
Vanessa Ludden presented the study’s intervention logics (as reviewed by the
monitoring group at the stage of the inception report), which underpin the study.
In relation to the Customs 2020 IL, one participant asked why the general objective had not been broken down into specific objectives and why these were not represented in the IL. The participant noted that, as the IL stands, it will be difficult to identify the contribution of results to the impacts.
The contractors explained that the IL presented is intended to support the development of the PMF by showing what the indicators should focus on. Furthermore, the ILs are simplified and do not describe in detail the intended links between programme activities and programme objectives. For an IL displaying the programmes' objectives, the contractor referred to the one presented in DG TAXUD's Management Plan. It was stressed that the impact indicators are intended to provide input for the two planned evaluations, rather than to contribute to the monitoring of programme performance on an annual basis.
Two suggestions were made in relation to the IL for Fiscalis 2020: firstly, that the general objective in the IL should correspond to the overall objective; and secondly, that the wording of the result "The application of and implementation of Union law in the field of taxation is supported" should be changed from 'application' to 'correct application'. The contractors agreed to take these comments into consideration and to recheck the wording against the Regulations.
3. The indicators
Vanessa Ludden presented the overall approach taken to develop the indicators and explained that they are central to the measurement of programme progress. The indicators will be further defined, including through the development of definitions, baselines and targets, which will be set, wherever possible, according to the data gathered on the previous programme. Vanessa Ludden highlighted that a majority of the indicators are common to both programmes, while some are programme-specific (e.g. benchmarking for C2020 or MLC indicators for F2020). During the presentation of the indicators, a number of questions were raised.
The first question concerned how the different levels of indicators relate to each other, specifically how output indicators relate to result indicators. The contractor emphasised that the levels should be seen as separate from each other since they measure different aspects of the programme: output indicators measure what the programme delivers (e.g. guidelines, trainings), while result indicators measure the results of these deliverables.
The second question related to the degree of importance that would be placed on output indicators such as 'the number of face-to-face meetings'. The contractors and the monitoring group agreed that output indicators will not provide information on the quality or importance of collaboration (which can be assessed as part of an evaluation rather than a monitoring framework such as this one), but that they do allow the programmes' deliverables to be quantified.
The third question concerned the use of the word ‘effect’ in the indicator matrix. The
contractor and the Monitoring Group agreed that this wording should be reviewed by the
contractor.
The fourth question concerned the lack of an indicator on the quality of the guidelines
produced. It was agreed that an assessment of the quality of the guidelines is outside the
scope of the PMF.
In extension of the above point, the fifth question referred to whether the PMF will examine the usefulness of guidelines. Here the contractors explained that the indicators relating to the guidelines will include: 1) the percentage of participants who made use of a guideline; 2) the percentage of participants who disseminated the guidelines; and 3) the percentage of participants who declared that a guideline led to a change. It was agreed that a fourth indicator, on whether participants feel that an activity resulted in the production of a guideline that was useful and relevant, could be considered.
The sixth question related to how the data on the indicator 'the percentage of officials that found the eModule to be relevant and useful' will be gathered. The contractors explained that the necessary data would be gathered through the Commission's existing eModule survey, which is sent out by the training unit after the eModule is completed.
Finally, the last point raised concerned the absence of input indicators on the programmes' costs. Here the contractors highlighted that these indicators were included in the original indicator list, but were later removed: the assessment of cost-efficiency falls outside the scope of the study, which focuses on outputs, results and impacts, and outside the scope of a monitoring framework, as it is a question better dealt with by an evaluation.
4. Data collection
Ida Maegaard Nielsen presented some of the main data collection methods that the study team has been tasked to revise and that will be used in relation to the PMF. (The study team has not been tasked to review the data collection tools relating to the common and IT training activities, but will feed relevant questions to the units in DG TAXUD responsible for these, to ensure that the required data is being gathered, for example on the use and relevance of these activities.) As indicated in the indicator list, the data collection methods are:
• Proposal form in ART
• Action Follow-up Form (AFF)
• Event Evaluation Form (EEF)
• Event Follow-up Form (EFF)
• Programme Poll
• Progress report
The presentation placed emphasis on the importance of ensuring that the data collection for the PMF is proportionate to the benefits derived from the framework, as well as in accordance with the resources available. Moreover, the data collection tools contribute to the validity of the indicators by ensuring that the data gathered accurately reflect the definition of the corresponding indicators. The presentation provided examples of how the data for each collection method would be gathered, by whom and when, and which indicators each method relates to.
The Monitoring Project Group then broke out into three smaller groups to discuss the data collection methods in detail, based on the suggested design and content of both the existing forms (such as the proposal form and the Programme Poll) and the new forms/reporting templates (such as the AFF, EEF, EFF and Programme Progress Report). The following subsections provide an overview of the main comments and suggestions made by the Monitoring Project Group members during the discussions. In addition to these, a number of minor changes (e.g. to wording, question order or question references) were proposed which are not included here, but which will be taken into account by the contractor.
4.1 Proposal form in ART

• It was stressed that it is important that the Action Manager clearly describes the action's expected results, and the means of measuring these, in the proposal form, as this will later enable her/him to make an accurate assessment in the AFF of the extent to which the expected results were achieved. It was therefore suggested that it be explored which aspects of the ART proposal form could be pre-determined (for example by using drop-down lists), and that programme management should review the action's expected results carefully when assessing proposals. The guidance for filling in the proposal form should clearly state that the expected results should be defined in a SMART manner.

4.2 Action Follow-up Form (AFF)

• Working visits: the relevance of certain questions for the working visits was questioned, for example the questions on the action ex-post (Q2). It was also suggested that a question on the host be added for working visits, as the way the working visit is organised by the host can lead to expectations being met or not.
• Working visits: it was also questioned whether both the AFF and the EEF needed to be completed by working visit participants, as in both cases it would be the views of participants (rather than those of project leaders in the case of the AFF) that would be represented.
• MLCs: it was felt that it would represent quite a burden for such a form to be filled out on an annual basis by MLC coordinators, and it was questioned whether the MLC reports could not be used instead.
• An automatic feed from the proposal form to the AFF would be helpful as it would save time, e.g. for the objectives, expected results and comments section of the proposal form.
• Guidance pop-ups should be included in the AFF, as in the proposal form, and the guidance should make clear who is responsible for filling out the form.
• In relation to the rating of the degree of achievement of expected results, it is important to consider the differences in scope between various programme activities and to ensure that it is possible to disaggregate results by activity type. This would ensure that the data reflect differences in activity scope and ambitions (e.g. between a one-day seminar and a longer-running PG) and address concerns that working visits would potentially make up a large proportion of completed forms.

4.3 Event Evaluation Form (EEF)

• It was suggested that a question be added on whether the activity type (e.g. workshop, seminar) was suitable.
• In relation to the EEF and EFF, some members of the project group suggested making the completion of these forms obligatory by linking it to the reimbursement of expenses. If DG TAXUD decided to do this, it was stressed that this would need to be made clear in the guidelines. DG TAXUD replied that this will not be an option at the beginning, but that, depending on the evolution of the response rate, it might be considered at a later stage.
• In relation to the EEF and EFF, it was proposed that for the longer-running actions (e.g. PGs) there may be a need to gather participant feedback on a regular basis rather than rely solely on AFF forms completed by project leaders.
• Confusion was expressed about the difference between relevance and usefulness, as well as between 'meeting expectations' and relevance.

4.4 Event Follow-up Form (EFF)

• The dissemination of this form was judged to be important (since it takes place six months after an activity has ended), and it should be ensured that the participants of activities are registered on PICS and available to receive emails.
• The members of the project group agreed that asking officials to complete the form six months after having attended an event was an appropriate timeline.

4.5 Programme Poll

• It was indicated that the main challenge to the programme poll in the past was that it was distributed via an external link, which could not be accessed within certain national administrations. These administrations restrict access to certain websites or do not grant access to the internet at all (for example Spain, Hungary and Bulgaria). This concern is valid for the other data collection forms as well (i.e. the EFF and EEF).
• Concerns were raised about survey fatigue and the fact that officials were unlikely to want to respond to the same survey on an annual basis. Staff turnover is generally not very high among programme participants in the NAs, so it would be the same people targeted year on year.
• It was questioned whether it was relevant to ask questions on age and gender.
• It was stressed that it is important to ensure that the areas of work and the tools listed are relevant and coherent; some suggestions were provided to the study team that could be used to revise the poll.

4.6 Progress report

• It was stressed that it would be more informative for Member States if certain indicators were reported at a disaggregated level, because Member States could then assess their own strengths and weaknesses and this would perhaps help improve their performance.
• It was suggested that it would be useful to have a concluding section on "Next Steps" at the end of the report.
• As management would be the main target audience of this report, and as it would need to be presented to them in the local language (i.e. translated by the NCs), it was stressed that the report should be kept short.
• Feedback should be sought from MS after one or two such reports have been produced, to allow officials to comment on the relevance and content of the report and to make suggestions for improvement.

5. Conclusions
The key points listed above, emanating from the break-out groups, were discussed in more detail in a plenary session. Clarifications were provided, where relevant, by Ioana Condurat and the contractors. The key points included:

• It was stressed that DG TAXUD would consider whether it made sense, and was realistic in terms of resources, to break down the results by MS. This would only make sense in relation to results that MS have a possibility to influence, such as the degree of awareness of the programme.
• The need to reflect further on the applicability of some of the forms, for the working visits and MLCs in particular, was acknowledged.
• The best means of disseminating the Programme Poll as well as the other forms will be considered (e.g. as an electronic attachment or as a pdf file where the data could be more easily compiled), in particular due to the internet access problems highlighted.
• It will be considered whether (and how) to ask participants of longer-running actions for their views on a regular basis (as the current setup only collects feedback from project leaders via the AFF).
• It will be considered whether the data collected via the AFF on the achievement of results can be disaggregated and considered in terms of activity type, context and scope.
The members of the project group asked whether the project group should approve the PMF. It was clarified that the project group provides expertise to the Commission to help it finalise the PMF. The PMF is under the responsibility of the Commission, which involved the members of the project group and consulted them on the main elements of the PMF. The Commission will present the PMF for information to the Coordinators during the Network meetings in May 2014 and to the Committee delegates in December.
Ioana Condurat then asked that any questions in relation to the study be put to her via
PICS, thanked the Monitoring Project Group members for their participation, and closed the
meeting.