HERA D3.2.1 HERA Peer Review Report WP3 July 15 2006 (month 18)

HERA website: www.heranet.info

Deliverable number: HERA D3.2.1
Title: HERA Peer Review Report
Work Package: WP3
Actual submission date (contractual date): July 15 2006 (month 18) – final version
Organisation name(s) of lead contractor for this deliverable: Irish Research Council for the Humanities and Social Sciences (IRCHSS)
Author(s): Dipti Pandya, Julie Curley
With the help of: All HERA partners
Nature: Report
Status: Final version
Dissemination level: Public
Abstract:
Contract no: ERAC-CT-2005-016179

Peer Review Report
Final version, July 15 2006
IRCHSS
Work Package 3
Task 3.2
Contents
Part 1
Section 1
Introduction .........................................................................5
Overview of HERA ...........................................................................................5
Overall objectives of HERA ...............................................................................5
HERA Partners .............................................................................................5/6
Section 2
Task 3.2 Peer Review ...........................................................6
Objectives......................................................................................................6
Description of work .........................................................................................6
Overview Task 3.2 ..........................................................................................7
Section 3
Methodology ........................................................................7
Development of task ....................................................................................7/8
Data collection process ....................................................................................8
Peer Review workshop .....................................................................................8
Compilation of report.......................................................................................9
Design of report..............................................................................................9
Section 4
Compare and contrast Peer Review processes .....................9
Table design...................................................................................................9
Overview of all key processes table ................................................................. 10
Section 5
Commonalities/Differences of Peer Review processes .......15
Section 6
Outcomes of Peer Review workshop ..................................19
Section 7
Peer Review Model .............................................................27
Section 8
Peer Review process step by step ......................................31
Section 9
Timeframe of Peer Review process ....................................32
Section 10
Conclusion..........................................................................33
Section 11
Template of evaluation form ..............................................34
Part 2
Profile of Peer Review processes in each partner Council
1. Netherlands Organisation for Scientific Research (NWO) – Coordinator…36
2. Academy of Finland (AKA)…………………………………………………………………………40
3. Academy of Sciences Czech Republic (ASCR)………………………………………….44
4. Arts and Humanities Research Council (AHRC)………………………………………..47
5. Austrian Science Fund (FWF)…………………………………………………………………….51
6. Danish Research Agency (DRA)…………………………………………………………………55
7. Estonian Science Foundation (EstSF)………………………………………………………..59
8. European Science Foundation (ESF)………………………………………………………….62
9. Fund for Scientific Research – Flanders (FWO)…………………………………………68
10. Icelandic Centre for Research (RANNÍS)…………………………………………………..71
11. Irish Research Council for the Humanities and Social Sciences (IRCHSS)74
12. Slovenian Ministry of Higher Education Science and Technology(MHEST)78
13. Research Council of Norway (RCN)……………………………………………………………82
14. Swedish Research Council (VR)………………………………………………………………….85
Sponsoring partners
15. Swiss National Science Foundation (SNSF)……………………………………………….88
16. National Fund for Scientific Research, Belgium (FNRS)…………………………….92
Appendix
Peer Review workshop schedule……………………………………………………95
Peer Review workshop participant list……………………………………………97
Glossary of terms…………………………………………………………………………103
Questionnaire……………………………………………………………………………..104
Section 1 Introduction
Overview of HERA
HERA, financed under the EU Sixth Framework Programme's ERA-NET scheme, was established
from the ERA-NET ERCH (European Network for Research Councils in the Humanities),
formulated by the Danish, Dutch and Irish Research Councils. HERA, in conjunction with the
ESF and participating humanities research councils across Europe, will endeavour to firmly
establish the humanities in the European Research Area and the Sixth Framework
Programme. Over a period of four years, partners are dedicated to the establishment of
best practice in funding mechanisms, research priorities and humanities infrastructure, and
to the development of a transnational research funding programme.
Overall objectives of HERA
The main objective of HERA is to ensure that the European Research Area can fully benefit
from the key contributions arising from humanities research. Because of the varied and yet
essential nature of the field, a Europe-wide structuring initiative is particularly important
for attaining this objective. It will be accomplished through a number of supporting general
objectives:
• to stimulate transnational research cooperation within the humanities
• to enable the humanities to play an appropriate and dynamic role in the ERA and within EU Framework Programmes
• to overcome fragmentation of research in the humanities
• to advance new and innovative collaborative research agendas
• to improve cooperation between a large number of research funding agencies in Europe
• to attract more funding to research in the humanities by raising the profile of the humanities
HERA Partners

Partner, acronym and country:
Netherlands Organization for Scientific Research (NWO), Netherlands
Irish Research Council for the Humanities and Social Sciences (IRCHSS), Ireland
Danish Research Agency (DRA), Denmark
European Science Foundation (ESF), International
Arts and Humanities Research Board (AHRB), UK
Academy of Finland (AKA), Finland
Estonian Science Foundation (EstSF), Estonia
Research Council of Norway (RCN), Norway
Academy of Sciences Czech Republic (ASCR), Czech Republic
Rannís – the Icelandic Centre for Research (Rannís), Iceland
Swedish Research Council (VR), Sweden
Ministry of Higher Education, Science and Technology (MHEST), Slovenia
Austrian Science Fund (FWF), Austria
Fund for Scientific Research – Flanders (FWO), Belgium
Swiss National Science Foundation (SNSF), Switzerland
Section 2 Work Package 3 Exchange of information: Surveys in best practice, Task 3.2 Peer Review (as per HERA Description of Work)
Objectives
• Collection of detailed information on peer review procedures
• Influence current national peer review procedures
• Determination of relevant peer review procedures for the joint funding programme envisaged in WP9
Description of work
Task 3.2 Peer review
The ERCH SSA survey of humanities research councils has demonstrated a high degree of
operational and policy divergence among national research councils with regard to
mechanisms and modalities of peer review conducted either on a national or international
basis.
A workshop on peer review held in the context of the ERCH Conference in
September 2004 highlighted the common agreement that peer review is critical to the
transparent and objective allocation of research funding for both individual awards and
programme awards. However, the workshop also highlighted a divergence of approaches
to peer review in terms of selection of peer review experts, use of databases of peer
reviewers, instruction and training of peer reviewers, the challenge of peer reviewer
fatigue, international versus domestic peer review, the articulation of an ethical dimension
to peer review, provision of expert feedback to reviewers, and the expectations of
applicants with regard to peer review.
In addition to collating detailed data on such
operational issues, it is also proposed to formulate an overview of broader policy
approaches to peer review. The data collected will underpin a workshop on peer review
and will inform a resulting report on best practice in the field. The workshop will be
moderated by an external facilitator. This report will influence current procedures among
research councils and will determine relevant procedures for the research programme
envisaged in WP9. Similar exercises carried out by other ERA-NETs, e.g. NORFACE, will be
taken into consideration.
Tasks:
3.2.1 Collection of data on current procedures
3.2.2 Organisation of a workshop for programme managers, directors and relevant
personnel
3.2.3 Production of a report on best practice in peer review
Overview Task 3.2 Peer Review
The Irish Research Council for the Humanities and Social Sciences is the task leader for Work Package 3. This task will serve as a
forum for the exchange of operational expertise between research councils for the
humanities, leading to a systematic exchange of information on research funding
instrument procedures and processes. The subsequent report recommends best
practice in peer review in preparation for joint calls for transnational research programmes.
This task involved a survey (circulation of a questionnaire) of all HERA partner
organisations to obtain data on practices and procedures both at an operational and
strategic level and the organisation of a workshop to discuss and debate the key issues of
peer review. The resulting report will consist of two elements:
Part One consists of an analysis of all operational procedures employed, discussion and
outcomes of the peer review workshop and recommendations for best practice in peer
review in relation to the management of the transnational research programme scheduled
for 2008.
Part Two is a profile (paper-based) of individual partners’ broad strategic principles and
organisational peer review processes.
Section 3 Methodology
Development of task
The stages of the task entailed gathering preliminary data on HERA partners, investigation
and examination of comparative surveys undertaken by other relevant ERA-NETs [ERCH
(SSA) and NORFACE (CA)]. A draft questionnaire-based survey was then designed and
developed. Following consultation with HERA partners, the questionnaire was finalised,
circulated and completed by all partners. The data relating to peer review was extracted
from the questionnaires and then collated and compiled to provide an overview of peer
review processes in each partner organisation. A workshop was organised with presentations
from two peer review experts, discussion of the advantages and disadvantages of peer review
in partner organisations, and discussion of the key elements of peer review. The final report
has been produced on the basis of these findings.
Data Collection
Preliminary activity
Task 3.2 Peer Review commenced with the investigation and study of previous surveys
conducted in the context of the Humanities and the Social Sciences as part of the ERA-NET
programme, namely NORFACE and ERCH. An investigation of each partner Research
Council and the ESF was undertaken to establish a basic understanding of the organisational
structure, the nature of the research funding instruments operated by the Councils and the
dedicated budgets involved.
Survey
Once the key points of the peer review processes employed by Research Councils in the
operation of research funding instruments had been established, a survey-based questionnaire
was drafted. This questionnaire was presented at the HERA Kick-Off meeting (June 2005);
following consultation and feedback from HERA partners it was altered accordingly, and the
final version was distributed by the end of June 2005. The questionnaire also included
questions relating to Task 3.1 Application Procedures, as it was deemed more time-efficient
to incorporate both tasks into a single questionnaire.
Peer Review Workshop
In conjunction with the survey-based questionnaire, a peer review workshop was held to
debate and discuss the key components of peer review. Preparations for the workshop
began five months in advance and covered logistical details, the design and structure of the
workshop, the selection of expert speakers and the input of partners (six partners presented
the advantages and disadvantages of peer review in their respective organisations). Principal
topics were selected for discussion during assigned breakout sessions (Balance between
National and International Peer Review; Evaluation Procedures and Criteria; Identification
and Selection of Experts; Feedback Processes), and a rapporteur was appointed to each group
to lead and record the discussions and present the findings at the end of the workshop.
Relevant participants were invited to attend the event.
Location and Date
The workshop was held on Friday 18 November 2005, at Dublin Castle, Dublin.
Participants
A total of thirty-nine participants attended the workshop, including representatives from all
HERA partner Research Councils and the ESF. Key figures from the research funding and
policy community in Ireland were also in attendance.
Final compilation of report
On receiving completed questionnaires from HERA partners, the individual data was
extracted from the questionnaires and formed the basis of the profiles of individual peer
review processes implemented by each partner. This data was compared and contrasted
via a constructed table that highlighted the key considerations. The major commonalities
and differences were then obtained and documented. A ‘model’ (best practice) of peer
review has been recommended incorporating all knowledge and information available via
the survey and discussions at the workshop. Practical considerations are also included, in
terms of a step-by-step plan, timeframe and an evaluation form template to be potentially
utilised during the transnational research funding programme.
Design of report
The report is based purely on the survey-based questionnaire of HERA partners and on the
discussions and outcomes of the peer review workshop; it is therefore not an exercise in
comparative analysis and does not examine the academic ramifications of the peer review
processes implemented by partners. The report is intended to be entirely factual in nature
and to produce a recommendation for a best-practice 'Peer Review model' to be implemented
in the transnational research funding programme, thereby providing HERA partners with a
framework by which to plan and execute their objective.
Section 4 Compare and contrast Peer Review processes
The following table identifies the key aspects of peer review processes and is intended to
provide a general overall view of the methods employed by all partners. The information
included has been considered in general terms and is derived from the larger scale funding
instruments operated by partners.
Glossary of terms
FOI – Freedom of Information Act
NR – Not Recommended
Overview of Peer Review processes per HERA partner
HERA Partner
Netherlands
Organisation
for Scientific
Research
(NWO)
Academy of
Finland (AKA)
Academy of
Sciences Czech
Republic
(ASCR)
Methods of Peer Review
Independent Reviewers
Assessment Panel
- All instruments
- National or
International
- Selected by NWO
- No fee
- Evaluation form
- Graded
- Majority of instruments
- Predominantly domestic
members, international if no
domestic assessor available or
conflict of interest
- Grade and rank
- Expenses reimbursed
- 8 members
- Often interviews conducted with
applicant
Average
Number of
Reviewers
Selection
of
Reviewers
Reviewer
Training
Grading System
Feedback Process
Appeals
Process
3
Council
members
No training
Applications
Graded and
Ranked
Both independent
reviewers and
assessment panel
members evaluations
available
On procedural
grounds only
Organisation
personnel
Right to reply
available
Subject to FOI
-
Majority of instruments
2 per application
€55 fee
Evaluation form
Graded
- Primarily panels of both domestic
and international assessors
- Grade only
- Flat fee €280 + €18 per
application + travel costs
- Evaluation form
- 3/4 members
2
-
All instruments
Majority international
3/7 per application
€300 fee per application
Evaluation form
Graded
- All instruments
- Both domestic and international
members
- Grade and rank
- 5 to 10 members
- €300 fee per application
3-7
Council
members
No training
Organisation
personnel
Council
members
Applicants
recommend
Training for
one
instrument
only
Graded 1(poor) to
5 (outstanding)
Ranked only by
Council members
during funding
decision stage
Only independent
reviewers evaluations
available
Graded A++ to NR
Ranked
Both independent
reviewers and
assessment panel
members evaluations
available
Subject to FOI
No appeal
process
Subject to FOI
No appeal
process
*Peer Review College established by AHRC
Arts and
Humanities
Research
Council (AHRC)
-
All instruments
2 per application
No fee
Evaluation form
Graded
-
All instruments
Majority domestic members only
8 – 15 members
Flat fee €1,500 + travel costs
Evaluation form
Graded and ranked
3
Selected
from Peer
Review
College
members
Training
provided
Organisation
personnel
Graded A+, A, N
(not priority), RS
(resubmit), U
(unsuccessful)
Ranked by
responsive mode
panels only
Independent
reviewers evaluations
available
Appeals process
is in place
Right to reply
available
Subject to FOI
No training
Graded 100 (very
good) to 10 (poor)
Independent
reviewers evaluations
available
No appeals
process
Majority of instruments
2-10 per application
No fee
Evaluation form
Graded
- All instruments
- Depending on the instrument,
panels are composed of either all
domestic or all international
members
- Domestic panel: 26 members,
flat fee €700 + travel costs,
ranked
- International panel: 6-10
members, expenses reimbursed,
graded
Overall 2
(depending
on
instrument
4-10)
Council
members
Danish
Research
Agency (DRA)
- Not common procedure,
only if conflict of interest,
no expertise available,
funding exceeds certain
level
- All instruments are reviewed by
Council members only
- 15 members
- Flat fee + travel costs
2
Council
members
No formal
training
Applications are
not graded or
ranked
Subject to FOI
Letter stating why did
not receive funding
On
administrative
grounds or
requesting
elaboration of
feedback
Estonian
Science
Foundation
(EstSF)
-
-
2
Council
members
No training
Graded 1 (poor) to
5 (outstanding)
Both independent
reviewers and
assessment panel
members evaluations
available
No appeals
process
Austrian
Science Fund
(FWF)
-
All instruments
2 per application
Fee €21 per application
Evaluation form
Graded
All instruments
Domestic members only
10 members
Fee + travel costs
Ranked
Applicants
recommend
Organisation
personnel
No FOI
No FOI
European
Science
Foundation
(ESF)
- Majority of instruments
- No fee
- Graded
-
All instruments
International members only
9-15 members
Graded and ranked
2-3
Organisation
personnel
Training
provided
Graded 1 (poor) to
5 (excellent)
Ranked
Participating
funding
agencies
Both independent
reviewers and
assessment panel
members evaluations
available
No appeals
process
Right of rebuttal
available
Fund for
Scientific
Research –
Flanders (FWO)
-
All instruments
National or international
2 per application
Evaluation form
Graded
-
All instruments
International members only
14 members
Graded and ranked
2
Icelandic
Centre for
Research
(RANNÍS)
-
All instruments
National only
2 per application
€50 per application
Evaluation form
Graded
-
All instruments
Domestic members only
7 members
Evaluation form
€40 per application + hourly fee
Ranked
2
Applicants
recommend
No training
Graded
Ranked
Only grades available
No appeals
process
Subject to FOI
Assessment
panel
nominates
independent
reviewers
No training
Graded (I, II, III,
IV, V)
Ranked
Both independent
reviewers and
assessment panel
members evaluations
available
No appeals
process
Subject to FOI
Irish Research
Council for the
Humanities and
Social Sciences
(IRCHSS)
- Only Post-Doctoral and
individual senior funding
instruments
- Fee €65 per application
- Evaluation form
- Not graded
- All instruments
- International members only
- 6-25 members (depending on
instrument)
- Flat fee €1,000 + travel costs
- Graded and ranked
2-3
Council
members
Organisation
personnel
No training
Graded (A++, A+,
A, B, NR)
Ranked
Both independent
reviewers and
assessment panel
members evaluations
available
Subject to FOI
No formal
appeals process
- All instruments
- National and
international
- 3 per application
- No fee
- Evaluation form
- Graded
-
Research
Council of
Norway (RCN)
- All instruments
- National and
international
- 3 per application
- Fee €108 per
application
- Evaluation form
- Graded
- Panels are not utilised
- Majority of instruments
- Utilised only if panel
deem necessary
- All instruments
- Domestic members
(International only for ‘Centres of
Excellence’)
- 7-8 members
- Fee €60 per application + travel
costs
- Evaluation form
- Graded or ranked
2
- Responsive mode instruments:
Council member panel, 5-21
members, annual fee €7,000 +
travel, graded
- Thematic mode instruments:
domestic and international
members, 5-21 members, fee,
graded
5
Swedish
Research
Council (VR)
All instruments
Council members only
14 members
Flat fee + travel costs
Evaluation form
Not graded or ranked
3
Slovenian
Ministry of
Higher
Education,
Science and
Technology
(MHEST)
Organisation
personnel
No training
Graded 0-100%
Ranked
Applicants
recommend
Both independent
reviewers and
assessment panel
members evaluations
available
Appeals process
available if
unsuccessful
applicant replies
within 8 days
Subject to FOI
2-3
Database of
reviewers
No training
Organisation
personnel
Graded 7 (highest
rating) to 1
(lowest)
Ranked
Independent
reviewers evaluations
available
A limited appeals
process is
available
Subject to FOI
Applicants
recommend
Council
members
No training
Organisation
personnel
Graded 7 (highest
rating) to 1
(lowest)
Ranked
Independent
reviewers and
assessment panel
members evaluations
available
No appeals
process
Subject to FOI
Sponsoring
partner
Swiss National
Science
Foundation
(SNSF)
- All instruments
- National and
international
- 2-6 per application
- No fee
- Evaluation form
- Not graded
Council
members
Organisation
personnel
No training
Graded (A, AB, B,
BC, C, CD, D)
Ranked
Independent
reviewers and
assessment panel
members evaluations
available
No FOI
Appeals process
in place
Sponsoring
partner
Fund for
Scientific
Research,
Belgium
(FNRS)
Independent reviewers
are not utilised
- All instruments
- Domestic and international
members
- 10 members
- Domestic €50, International
€400 + travel costs
- Graded
10
Council
members
No training
Graded (A++, A+,
A, B, NR)
Evaluations are not
available
Applications are
not ranked
No FOI
No appeals
process
Section 5 Key commonalities and differences of the Peer Review Process in partner organisations
Peer Review Objectives
The vast majority of HERA partners noted that the evaluation of applications, the making
of funding recommendations, the funding of research excellence and a transparent,
impartial peer review process were of the highest and equal importance. International
benchmarking was deemed the next highest priority. The ESF noted that collaboration,
European added value, interdisciplinarity and the efficient utilisation of funding were also
considered. The AKA stated that its peer review process did not make funding
recommendations.
Peer Review Methods
The vast majority of partners employ both independent reviewers and assessment panels
in their peer review processes. Exceptions include the RCN, which does not use assessment
panels. Neither the DRA nor the VR uses independent reviewers as a general procedure:
the DRA employs them only if there is a conflict of interest between assessment panel
members and the applicant, if no expertise is available on the panel, or if the funding
exceeds a certain level (approximately €1 million), while the VR uses independent reviewers
only when the assessment panel deems their additional assistance necessary. The FNRS
also does not include independent reviewers in its peer review process.
Independent Reviewers
Partners that utilise independent reviewers (postal reviewers) generally insist that
evaluations include a grade; however, the IRCHSS and SNSF are exceptions and request
that no grade be supplied. Six partners do not pay a fee to independent reviewers and an
equal number pay a nominal fee. An evaluation form is forwarded to and completed by all
reviewers, and approximately two to three different reviewers evaluate each application.
Partners' decisions on whether to use domestic or international reviewers vary greatly, with
a slight majority employing either, depending on the research proposal and the expertise
available nationally.
Assessment Panels
As noted, all partners' peer review processes include an assessment panel except for the
RCN's. The composition of panels varies greatly across partners: six partners employ
assessment panels composed of both domestic and international members, depending on
the expertise required; panels of domestic members only are utilised by a further four
partners; and international-member-only panels are favoured by three partners. The
remaining partners' panels consist of Council members only. Despite these variations, the
majority of panels grade and rank applications, members are paid a flat fee for their
participation with travel expenses reimbursed, and the approximate average number of
members per panel is ten.
It is worth noting that some partners have unique peer review aspects. For example, the
NWO forwards evaluations made by independent reviewers anonymously to applicants, and
the applicant is granted the opportunity to submit comments on the evaluation to the
assessment panel; in some cases the applicants are also interviewed by the panel before it
makes its final decisions. The DRA peer review system exclusively retains Council members
as its assessment panel; therefore only Council members are involved in the process of
evaluation. The IRCHSS operates assessment panels composed exclusively of international
assessors because the Irish research community is relatively small and closely interlinked,
making the requirement of transparency greater.
The AHRC has established a Peer Review College to meet its peer review requirements.
The College evaluates and reviews applications submitted to all research funding
instruments and has a membership of more than 560 academics who participate in
evaluating applications. The College was established to improve the quality of peer review
by enabling members to assess a number of applications per year, as opposed to just one
or two over an extended period; this allows their expertise to develop and in turn provides
better comparative assessments of the strengths and weaknesses of applications.
Number of Reviewers
The number of reviewers varies greatly; however, on average approximately two to three
reviewers evaluate each submitted research proposal in detail.
Process of selecting Reviewers
In the majority of partner organisations, reviewers are selected primarily by Council
members, followed by organisation personnel working through their networks; the internet
is often considered a useful tool for locating assessors. Some partners actively involve the
applicants by requesting that they recommend appropriate reviewers for their submitted
proposal.
Criteria for selection of reviewers
In general, across all partners, academic excellence and relevant disciplinary (subject area)
competence were considered the most important factors when selecting reviewers.
However, previous peer review experience, encouraging young researchers and creating
gender balance were also significant aspects to be taken into account. The ESF noted that
the absence of conflicts of interest and the availability of the reviewer were also considered
when selecting reviewers.
Reviewer recognition
Several partners list the members of their assessment panels on their websites, and
reviewers are free to include details of their participation in their curricula vitae and
academic publications. The NWO and FWF send reviewers formal thank-you letters, while
FWO reviewers are awarded a medal of honour at the end of their service.
Reviewer Training
The majority of partners do not offer any training to reviewers. However, the ESF
EUROCORES Programme and the AHRC Peer Review College do offer substantial training, in
particular to reviewers new to the peer review process. ESF EUROCORES Review Panel
members receive detailed instructions before and during meetings (a half-day session), and
advice on procedural issues is made available throughout the review process. The AHRC
organises induction days, consisting of mock assessment exercises with experienced Peer
Review College members.
Conflict of Interest
All partners have a 'conflict of interest' policy in place. Although the policies vary, they
generally encompass similar aspects: for example, reviewers are requested not to
participate if they have a financial interest in a proposal, if they have a personal or
professional relationship with the applicant, and often if the applicant is from their own
university or institution. Many partners require that reviewers sign a formal declaration
concerning impartiality. The IRCHSS and FWF operate a voluntary policy, whereby
reviewers inform the organisation if a conflict of interest arises. The AKA's 'conflict of
interest' policy is based on Finnish law. The ESF 'conflict of interest' policy, for example,
states that "an interest may be defined as where a person may benefit either
professionally or personally by the success, or failure, of a proposal".
Grading System
All partners grade and rank applications; however, the grading systems employed differ
greatly. The ESF employs a scale of 1 to 5; the IRCHSS and FNRS use A++ to NR (Not
Recommended); and the RCN and VR use a 1 to 7 range. A percentage scale of 0% to
100% is utilised by the FWF.
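The differences between these scales illustrate one practical harmonisation question for a joint transnational programme: grades would need to be compared across partners before a joint ranking could be made. Purely as an illustration, and not as anything recommended in this report, the sketch below maps the scales named above onto a common 0 to 1 range; the mapping choices and function names are hypothetical assumptions.

```python
# Illustrative only: a hypothetical normalisation of the grading scales named
# above (ESF 1-5, RCN/VR 1-7, FWF 0-100%, IRCHSS/FNRS A++ to NR) onto a
# common 0.0-1.0 range. The mappings are assumptions, not HERA policy.

LETTER_SCALE = ["NR", "B", "A", "A+", "A++"]  # IRCHSS/FNRS scale, lowest to highest

def normalise(partner: str, grade) -> float:
    """Return a grade on a common scale from 0.0 (lowest) to 1.0 (highest)."""
    if partner == "ESF":                    # 1 (poor) to 5 (excellent)
        return (float(grade) - 1) / 4
    if partner in ("RCN", "VR"):            # 1 (lowest) to 7 (highest)
        return (float(grade) - 1) / 6
    if partner == "FWF":                    # 0% to 100%
        return float(grade) / 100
    if partner in ("IRCHSS", "FNRS"):       # A++ (best) down to NR
        return LETTER_SCALE.index(grade) / (len(LETTER_SCALE) - 1)
    raise ValueError(f"no mapping defined for {partner}")

print(normalise("ESF", 4))        # 0.75
print(normalise("FWF", 85))       # 0.85
print(normalise("IRCHSS", "A+"))  # 0.75
```

In practice any such mapping would itself have to be agreed by the partners, which is part of the rationale for the common model recommended in Section 7.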
Application evaluation criteria
Research Proposal
The majority of partners consider that the most important aspects of evaluation criteria
are as follows (not in order): broad aims and objectives of the research, proposal
description, proposed schedule of development of proposal, location of the research
proposal within the current state of research and the relative significance of the
contribution that the research proposal will make to the research field. These criteria are
closely followed by the importance of methodology, theoretical framework and the
suitability of the institution proposed. Bibliography in the research area and plans for
publication and dissemination of the research results are then taken into consideration.
Exceptions include the NWO, which adheres to a different set of criteria, including
assessment of the quality of the researcher, the innovative character of the proposed
research, assessment of the quality of the research proposal and a final overall
assessment. The ASCR also cites research infrastructure as an important consideration.
The ESF also evaluates applications under the following criteria: scientific quality,
feasibility, level of multidisciplinarity, originality, budget estimation, collaboration,
European added value and the absence of overlap with existing projects. Its EURYI and
EUROCORES programmes also have specific criteria (see the ESF profile for details).
Principal applicant
In general, most partners evaluate principal applicants under the following criteria and in
this approximate order: academic record and achievements of applicant, international
collaboration, previous awards/funding and the potential for mobility of researchers.
Feedback to applicants
Independent reviewer and assessment panel evaluations are available to applicants in the
majority of instances. The identity of independent reviewers is kept anonymous, whereas
the members of assessment panels are usually published on the organisation's website and
thereby made available to the public. The majority of partners are subject to a Freedom of
Information Act. The AHRC and NWO 'Right to Reply' and the ESF 'Right of Rebuttal'
procedures allow applicants to comment in writing on independent reviewers' evaluations of
their research proposal; these comments are then forwarded to the assessment panel for
inclusion in the final evaluation process.
Feedback to reviewers
Formal feedback is not generally given to reviewers; however, some are informed of which
applicants were successful and received funding.
Appeals Process
The majority of partners do not have an appeals process in place; where one exists, it
predominantly addresses procedural issues rather than offering an opportunity to contest
the outcome of an evaluation.
Cost of peer review process
Data was not available for the majority of partners; however, the AKA stated that the cost
of evaluating all funding instruments in 2004 was €140,500. The RCN noted that it cost on
average €85,000 to run the peer review process for one large-scale funding instrument.
MHEST spends €20,850 per annum and the EstSF spent €2,045 in 2005. The peer review
process costs the IRCHSS approximately €40,000 per funding instrument, and RANNÍS
calculates that €28,000 is expended per year (€350 per application evaluated). The SNSF
stated that 1% of its total annual budget (€250,000) was dedicated to the peer review
process.
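As a rough illustration of what these figures imply, the short calculation below derives two quantities that the partners did not report directly (the approximate number of applications RANNÍS evaluates per year, and the total annual budget implied by the SNSF's 1% share); both are inferences for illustration only.

```python
# Back-of-the-envelope checks on the cost figures quoted above. The derived
# numbers are inferences from the stated figures, not values reported by the
# partners themselves.

rannis_annual_cost = 28_000        # EUR per year
rannis_cost_per_application = 350  # EUR per application evaluated
implied_applications = rannis_annual_cost / rannis_cost_per_application
print(f"RANNIS: roughly {implied_applications:.0f} applications per year")   # ~80

snsf_peer_review_cost = 250_000    # EUR, stated as 1% of the total annual budget
implied_total_budget = snsf_peer_review_cost / 0.01
print(f"SNSF: implied total annual budget of EUR {implied_total_budget:,.0f}")  # 25,000,000
```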
Section 6 Peer Review Workshop outcomes
Workshop Speakers
Two scholars were invited to speak at the workshop on the basis of their academic peer
review expertise.
Professor Chris Caswill
Professor Caswill is a Visiting Fellow at the James Martin Institute at Oxford University,
Visiting Professor at Exeter University, and Senior Research Fellow at University College
London. He is also a Senior Research Associate at the Interdisciplinary Centre for
Comparative Research in the Social Sciences in Vienna, an Adviser to the Research Council
of Norway, and Senior Policy Adviser to the EU-funded NORFACE ERA-NET project. Until the
end of 2003 he was Director of Research at the ESRC. His research interests are in science
policy, European research policy, the application of principal-agent theory and interactive
social science.
Professor Caswill's presentation was entitled 'Peer Review in Context' and discussed peer
review from a sociological and political perspective. He presented the inherent challenges
facing peer review systems, for example biases, group dynamics and the element of
chance, and recommended that Research Councils address and aim to reduce these issues.
Peer review in relation to HERA was discussed, in particular the importance of the selection
process and the necessity of establishing methods of evaluation to monitor and review the
process.
Dr Sven Hemlin
Dr Hemlin is a Senior Lecturer at the Sahlgrenska Academy, Göteborg University, Visiting
Research Fellow at SPRU, University of Sussex, and Visiting Professor at the Department
for Management, Politics and Philosophy, Copenhagen Business School. Dr Hemlin is an
expert in cognitive and social psychology based science studies, research ethics, research
policy, R&D management and research evaluation studies.
Dr Hemlin's presentation was entitled 'How does Peer Review work?' and addressed peer
agreement versus peer disagreement, concluding that agreement was best, although
difficult to achieve in the humanities and social sciences. He noted that peer review should
encompass objective as opposed to personal views, that equal consideration should be
granted to both mainstream and marginal/innovative research, and that evaluation should
focus on the actual research proposal as opposed to the status of the research facility.
Ethical issues were raised, including the existence of sexism and nepotism within systems.
He argued that peer review was often too 'soft' and that research should be graded
according to a citation index. He advocated flexibility, frequent change, explicit evaluation
criteria and the necessity of feedback to applicants.
'Tour de Table'
HERA Work Package leaders (AHRC, NWO, DRA, ESF, AKA FIN, and IRCHSS) provided a
short presentation on the advantages and disadvantages of the peer review system in
operation in their respective organisation.
Arts and Humanities Research Council (AHRC)
Faye Auty outlined the unique 'Peer Review College' operated by the AHRC. Its advantages
are that it has been widely accepted by the academic community and that it is transparent,
owing to the 'right to reply' procedure it offers, which allows applicants to comment on
evaluations prior to the final decision-making. Disadvantages included a lack of flexibility
due to the limited number of College members, which places stress on organisation
personnel to select new members and burdens existing members with evaluating a high
number of applications.
Netherlands Organisation for Scientific Research (NWO)
Annemarie Bos noted that the anonymity of reviewers and the 'right to reply' procedure
were to be recommended: reviewers tended to be more open-minded and objective and
were more inclined to participate, while applicants benefited as they could correct
interpretations, reply to objections or controversies and further display their skills. It was
advised that a written appeal process be avoided as it did not contribute to the
assessment, limited judicial review of the Council, was expensive and was open to
inappropriate use.
Danish Research Agency (DRA)
Grete Kladakis considered the strength of the peer review system, from the perspective of
the organisation, to be its consistency, due to the use of Council members, which ensures a
high level of continuity and stability. From the applicant's perspective, efficiency is key, as
the call for applications takes place at the same time each year and the decision-making
process is completed within three months. Weaknesses, however, include reviewer fatigue,
as the workload of Council members (who are also full-time academics) is substantial, and,
from the applicant's point of view, the potential for bias is considered greater, as no
external reviewers are utilised and the academic community is relatively small.
European Science Foundation (ESF)
Dr Ruediger Klein focused on the EUROCORES programme operated by the ESF. He noted
that the advantages of the Review Panel included the physical meeting, which enabled full
discussion of applications, allowed external evaluations to be overruled by consensus and
enabled detailed feedback to be made available to applicants. However, a potential
disadvantage existed whereby members could act as national representatives as opposed
to objective assessors. The EUROCORES online assessment facilitated easy access for all
parties and efficient processing, storage and quality control, although difficulties arose with
misunderstandings of terms, inappropriate reviewers proposed by funding agencies and
reviewers being unable to check compliance with national requirements. Further problems
included reviewer fatigue
(approximately 30% of evaluations were of sufficient quality), the lack of a financial
incentive, reviewers' inability to compare applications and the evaluation of only one aspect
of interdisciplinary proposals. The EUROCORES Programme has a two-stage application
process, whereby applicants send an initial proposal outline and, if successful, a further
detailed proposal. This is beneficial as it allows proposals not fitting the call to be
eliminated, saving the applicant time, and allows for a full evaluation from multiple tiers,
ensuring quality control. On the negative side, this two-stage process adds a further two
months to the process and considerable cost and time for the organisation. Transparency
is key to the EUROCORES review process, and all information of interest to relevant parties
is made available. Applicants have access to a 'right of rebuttal' procedure whereby they
may comment on external evaluations and have these comments included in the review
panel discussions. The declaration of interest enforced by the ESF ensures that conflicts of
interest do not occur, or are at least minimised.
Academy of Finland (AKA FIN)
Dr Kustaa Multamäki concluded that the first advantage of the AKA peer review process
was its reliability, whereby the evaluation stage (in which applications are graded only) and
the funding decision stage (in which applications are ranked only) are distinctly separate.
Secondly, he noted that the process was transparent and adaptable: as there was no
formal pool of reviewers, the best could be selected in each case, and, as the same
evaluation criteria applied to all disciplines, interdisciplinary panels were easy to form.
Disadvantages included the high cost and time-consuming nature of the process, owing to
the selection of experts and the arranging of panel meetings, and the frequent lack of
uniformity across evaluations by different experts.
Irish Research Council for the Humanities and Social Sciences (IRCHSS)
Dr Marc Caball noted that the advantages of the IRCHSS peer review process included the
utilisation of international assessment panels composed of leading international experts in
their fields, ensuring quality assessments. As the Irish university sector is relatively small,
this also protects the integrity of the Research Council's decision-making process and
allows recommendations to be made on a transparent basis without local academic
intervention. Disadvantages of this process include pressure to source reports from
international reviewers within a short time frame to inform the decisions of international
assessment panels, and variation in the quality of evaluations. Assessment panels were
also composed mainly of established scholars (largely middle-aged and male), and it was
questioned whether this impacted on emergent or interdisciplinary areas of research.
Breakout Sessions
The second part of the workshop centred on four separate breakout sessions, each session
having a specific peer review thematic strand for discussion. Each one-hour session was
directed by a rapporteur who introduced the topic and acted as facilitator for reflection and
debate. Following the sessions, the rapporteurs reported back to the plenary group on the
main issues and topics emanating from the workshops.
Breakout session 1. 'Balance between National and International Peer Review'
was facilitated by Dr Eiríkur Smári Sigurðarson, Icelandic Centre for Research.
The group discussed various aspects of national and international peer review. Some
participants had considerable experience with international peer review – e.g. the
Academy of Finland (AKA), the Irish Research Council for the Humanities and Social
Sciences (IRCHSS), the Austrian Science Fund (FWF) and ESF. There was general
agreement that opting for an international peer review was primarily a political decision. In
Austria the decision was made in order to increase the research quality in Austria. However
there are substantial costs involved in this option. Regionally based research has less
international interest than research without regional focus. This may be a particular
problem in the humanities. On the positive side it was noted that in general it was easier
to avoid problems with conflict of interest in an international peer review – though there
are some research areas that are so specialised that it is almost impossible to avoid these
problems even when the search for peer reviewers is on an international scale.
A distinction was made between review panels and mail reviewers. There was a worry that
working with international review panels was more difficult than using national panels,
while this was not a problem with international mail reviewers. It is difficult and expensive
to arrange meetings with international review panels. Against this, it was pointed out that
the AKA, IRCHSS, FWF and ESF have been using international panels for some time without
many problems. International reviewers do not have to come from afar – in the case of
Ireland, for example, they can mainly be drawn from the UK. It was stressed in the group
that international panels were not as such better than national panels; they can be good or
bad, just as national panels can be good or bad. In discussing mail review, most had had a
similar experience of international mail review being more often hagiographic than national
mail review – it is in general more positive.
Language was also on the agenda. This may be a problem peculiar to the humanities, as
the research is often on language itself or on texts in the national languages. But the use
of the national languages does not exclude international peer review, not in all cases at
least (some languages, like Icelandic, are not used by large communities abroad). English,
French, German, Spanish, Portuguese are not restricted to single countries. Danish,
Norwegian and Swedish researchers read and understand each other’s languages. In
addition there are often significant communities that have settled abroad.
Another potential problem raised was the prospect of young researchers in an international
system. International reviewers tend to be more established than national reviewers and
they may favour more established researchers. Recruiting reviewers can be difficult and
the participants have varying degrees of success. This is possibly related to who identifies
and contacts potential reviewers. It seems to be more effective to use national council or
panel members than research administrators. The issue of payment was also raised and it
was recommended that a single European rate for peer review be adopted.
While there was a common agreement that international peer review was good, in
particular if the objective is to internationalise research, it was also stressed within the
group that the difference between national and international peer review should not be
perceived in terms of quality. The one is not, as such, better than the other.
Breakout Session 2. 'Evaluation Procedures and Criteria':
Carl Dolan, Arts and Humanities Research Council, UK acted as rapporteur for the
discussion of this topic.
The main points of discussion were as follows:
Design of forms
Better design of forms – both application forms and reviewer assessment forms – is
necessary not only to ensure more efficient processing and to avoid duplicating
information, but also because complex, confusing and poorly-designed assessment forms
are a cause of peer review fatigue. There was some support for a two-stage application
procedure to avoid overburdening reviewers.
There are a number of important guiding principles that should be borne in mind when
designing forms and processes:
• Forms should be as concise and to-the-point as possible
• The effort required to complete a form should be commensurate with the amount of money involved
• A balance should be struck between trying to engage the applicant in such a way as to gather useful and relevant information and wasting the applicant's valuable time
• Application processes and forms should be standardised across the organisation, and across organisations, as much as possible to avoid confusing the academic community.
Training
The importance of word of mouth and informal networks should not be underestimated in
encouraging the community to apply and as a way of providing training in the culture of
applying for research funding.
Electronic application
AKA, Swedish Research Council, EstSF, and AHRC all reported logistical difficulties in the
transition to electronic application processes ranging from the periodic failure/collapse of
the system to the inflexibility or rigidity of some electronic systems when there is no
longer regular human input (e.g. a limited number of options being allowed when giving
feedback to applicants). The dangers of having one system for all sciences were also
highlighted.
Getting robust assessments
The difficulty of getting robust assessments was discussed at length. The main difficulties
experienced were:
• Banal, unhelpful comments (often only a couple of words in length)
• Comments not matching the assigned grade, with the consequence that panels need to second-guess the reviewers' intentions.
It was agreed that more effort should be made to identify why these problems occur, but some potential solutions were suggested:
• Send unsatisfactory comments back to the reviewer
• Address the issue of mismatch of comments and grade with a more sophisticated, fine-grained grading scale
• More detailed guidance for reviewers – must be very clear and specific about what should be commented on
• Face-to-face meetings of review panels often help to dispel misunderstandings and calibrate grades. Referees who see only one or a few applications find the comparative element very difficult.
The difficulty in identifying and securing the help of international reviewers for small
countries was also identified as a major problem.
Joint peer-review
The group also discussed the difficulties of devising joint peer-review mechanisms with
other research councils. Both the AKA and the SRC had some experience of international
and cross-Council versions of this, but the solution involved commissioning a combined
review panel or jointly nominating reviewers to disburse funding which had been ring-fenced
for the purpose of cross-national research programmes. There was little experience of
procedures for the joint review of multi-country proposals in a responsive or bottom-up
mode in such a way that the comparative, competitive element of the assessment is
maintained. Such a method of funding international collaboration would have implications
for the way budgets are handled and procedures/deadlines harmonised. Conditional funding
and reserve lists were briefly discussed as possible mechanisms for handling this issue.
Interdisciplinarity
The question of how to prevent interdisciplinary applications from being 'talked down' to
the lowest grade was also discussed, although the evidence for this is largely anecdotal and
Research Councils reported that the success rates for applications explicitly designated
'interdisciplinary' are roughly the same. The problem may be one of perception rather than
real and sustained bias.
Breakout Session 3. 'Identification and Selection of Experts'
Dr Marc Caball, Irish Research Council for the Humanities and Social Sciences.
The importance of differentiating between mail reviewers and panel reviewers was noted:
mail reviewers required more expertise in specific research areas, whereas for panel
reviewers experience in a broader range of disciplines and additional skills in project
management and policy engagement were beneficial. The group agreed that it would be
difficult to design specific criteria for the selection of reviewers; however, it was considered
good practice to allow researchers to nominate appropriate reviewers for their own research
proposal. The question of who should be responsible for selecting reviewers was debated,
and both Research Council members and administrative staff were considered suitable for
varying reasons (expertise/networks and experience). It was agreed that reviewer fatigue
was an issue and could be addressed by linking participation to the professional
development of academics. The gender and age profile of reviewers was regarded as a key
concern. It was considered that a database of reviewers would be very difficult and
time-consuming to construct and that the best method was the creation of networks among
Research Councils across Europe to exchange information.
Breakout Session 4. 'Feedback Processes'
Professor Elizabeth Meehan, Irish Research Council for the Humanities and Social Sciences
The group added to the issues in the pre-prepared briefing notes some related issues that
had emerged in the plenary sessions.
1. Feedback is Essential
The group agreed that feedback is fundamentally necessary.
It contributes to the
development of knowledge as, without it, researchers would not learn how to improve. At
the same time, there are issues about how best to provide it.
2. Anonymity of Reviewers’ Comments; Some Contradictory Views
The group noted that a number of contexts mean that anonymity is necessary; especially
in small countries, small research communities; and to give reviewers confidence that they
can be frank (bearing in mind that comments should observe guidance requiring criticism
to be temperate, constructive and objective).
It was felt that this does not necessarily mean that there is a lack of openness. There are
examples of the practice of making public lists of the names of reviewers and an indication
of the number and subject range of applications reviewed over a period of time, without
being explicit about which reviewers had reviewed which proposals (e.g. the whole council
in Denmark and panels in Science Foundation Ireland (SFI)).
On the other hand, it was also noted that the greater transparency of a lack of anonymity
could be a good thing. If criticism were expressed to the standards noted above, this could
lead to continued interaction amongst researchers in the same field, which could be good
for the applicant's intellectual and career development. (In the final plenary, Chris Caswill
argued that the abolition of anonymity was probably likely in the future.)
3. Right to Reply; Right of Rebuttal
The group was very interested to have heard earlier about the Right to Reply practiced by
the AHRC and NWO and the ESF’s Right of Rebuttal. The group was told that applicants to
the AHRC were given guidance about how to make best use of the opportunity; that is, not
to respond emotionally but to explain rationally that some aspect or another had not been
fully understood by the reviewer(s). The group also noted that the Right to Reply does not
involve correspondence between the applicant and reviewers but that the Reply constitutes
an additional piece of evidence for consideration by the final decision-making board,
alongside the reviewers’ comments. The group heard something of instances in which this
had had an impact on the discussion of the final funding decision. The group agreed that
this was a good practice worthy of emulation, noting also that its successful use by the
AHRC and also by NWO owed much to a supportive IT infrastructure which minimized the
administrative burden.
4. Structure of Feedback
One member of the group referred to different cultures in the different countries from
which panels of international reviewers would be drawn, suggesting that funding bodies
might have to be content with a summary paragraph for forwarding to an applicant.
However, it was generally agreed that feedback should be systematically linked to the
areas and criteria to which applicants had been asked to direct their proposals.
5. Handling the Feedback of Contradictory Reviews
It was agreed that it is essential to follow practices similar to those of, for example, the
Health Research Board, SFI and the AHRC, whereby feedback includes an overarching
paragraph explaining what factors were taken into account by the final funding
decision-makers in weighing the contradictory reviews and deciding whether to accept or
reject the application.
6. Handling the Coexistence of High Grades/Praise and a Decision not to Fund
It was noted that financial constraints made it inevitable that good research would not
always win funding. In Estonia, this had on one occasion led to the available funding being shared out among the top applications. It was also noted that this would not be
possible in the UK where there is a requirement on universities to secure the full costs of
research. The group noted that it is good practice to publish information, in general and to
applicants to particular calls for funding, about the ratio of all applications to successful
ones. At the same time, it was noted that this was ‘cold comfort’ for the unsuccessful but
excellent applicants. This did not bode well for the future of publicly funded research as
researchers would weigh up whether personal rationality might point to writing an article
for a journal instead of a research grant application.
Workshop conclusions
The workshop concluded with discursive reflections by Professor Caswill and Dr Hemlin.
Dr Sven Hemlin
Dr Hemlin observed that, due to commercial pressure, the concept of ‘innovation’ dominated the humanities and therefore limited the discussion of conflicts of interest and research ethics. He suggested that evaluation procedures be made available electronically and that the quality of assessments and the rating scale be addressed. In relation to the identification and selection of experts, he noted that individuals directly influenced the decisions taken, and that members of assessment panels were therefore frequently changed.
Professor Chris Caswill
Professor Caswill recommended that the peer review debate continue and he outlined the
following points:
• Framework Programme 7 will incorporate a Humanities theme, and consideration should be given to how this will be reviewed at EU level
• The establishment of the European Research Council should be considered in terms of the procedures it adopts
• Investment in Information and Communication Technology (ICT) is recommended to share information and aid communication
• Transparency is key; endorsement of the rebuttal procedure is recommended, along with a statement of the peer review process published on Research Council websites
• HERA is an excellent opportunity to ‘forward look’ and create best practices in the funding of humanities research
Section 7 Recommended Peer Review Model
(based on survey results, commonalities and workshop)
Peer Review Methods
Independent Reviewers only
A one-stage peer review process would be utilised, relying solely on Independent Reviewers to evaluate and grade the applications submitted to the transnational research funding programme. It is recommended that at least two independent reviewers assess each application. The evaluations are then compiled and submitted to the HERA Network Board for final funding decisions to be made.
OR
Assessment Panel only
A one-stage peer review process using an assessment panel only, whereby assessment panel members meet on a designated date to discuss the proposals in person. It is recommended that the assessment panel consist of international members selected from participating HERA countries and perhaps from outside Europe. The assessors would receive the applications well in advance of the meeting and would be expected to complete their evaluation and award a grade before the physical meeting. At least two assessors would be requested to assess each proposal in detail (selection dependent on the assessor’s expertise) and other members would be asked to comment at the meeting. The assessment panel would aim to reach agreement on the grade and rank of each application. The recommendations would then be compiled and presented to the HERA Network Board for final funding decisions to be made.
OR
Independent Reviewers and Assessment Panel
A two-stage peer review process, whereby international Independent Reviewers would be selected for their specialised expertise directly related to the applicant’s research proposal. Evaluations and grades would be submitted and then forwarded to an International Assessment Panel composed of academics with experience of a broad range of subject areas and research policy, who would further discuss and debate the proposals, assign a final grade and rank the applications. These recommendations would then be compiled and presented to the HERA Network Board for final funding decisions to be made.
Consideration: should reviewers/assessment panel members be paid a fee for their
services?
Number of Reviewers
2-3 per application
Process of selecting Reviewers
Participating HERA partners in the transnational research funding programme nominate reviewers from their own research networks and submit them to the organising partner for consideration.
Criteria for selection of reviewers
Academic excellence and relevant disciplinary (subject area) competence were considered the most important factors when selecting reviewers. Previous peer review experience, encouraging young researchers and creating gender balance should also be taken into account, as should the absence of conflicts of interest and the availability of the reviewer.
Reviewer recognition
List members of assessment panels on the HERA website; it is recommended that reviewers include details of their participation in their Curriculum Vitae and academic publications. Send reviewers formal thank-you letters.
Reviewer Training
Some training should be offered, even if this consists only of guidelines noting the reviewers’ expected role and the evaluation criteria, which are forwarded to reviewers.
Conflict of Interest
The ESF ‘conflict of interest’ policy, which states that “an interest may be defined as where a person may benefit either professionally or personally by the success, or failure, of a proposal”, could be taken as an example, given the ESF’s current transnational programmes in operation.
Grading System
A simple and straightforward numerical grading system is the best option; it was recommended that the scale be made very explicit to ensure a more exacting grade. The ESF currently utilises a simple scale of: 1 = poor, 2 = average, 3 = good, 4 = very good and 5 = excellent.
Application evaluation criteria
Research Proposal
First tier aspects (not in order):
• Broad aims and objectives of the research
• Proposal description
• Proposed schedule of development of proposal
• Location of the research proposal within the current state of research
• Significance of the contribution that the research proposal will make to the research field
Second tier aspects (not in order):
• Methodology
• Theoretical framework
• Suitability of the institution proposed
Third tier aspects (not in order):
• Bibliography in the research area
• Plans for publication and dissemination of the research results
Additional considerations:
• Feasibility
• Level of multidisciplinarity
• Originality
• Budget estimation
• Collaboration
• European added-value
• Absence of overlap with existing projects
Principal applicant criteria (in order)
• Academic record and achievements of applicant
• International collaboration
• Previous awards/funding
• Potential for mobility of researchers
Feedback to applicants
All evaluations should be made available to applicants, with the identity of independent reviewers kept anonymous and the membership of assessment panels made public.
Consider a ‘Right to Reply’/‘Right of Rebuttal’ procedure to allow applicants the opportunity to comment (in written format) on the independent reviewers’ evaluations of their research proposal; these comments would then be forwarded to the assessment panel for inclusion in the final evaluation process.
Feedback to reviewers
A formal thank-you letter should be sent to all reviewers, including details of the applicants that were successful and received funding. All successful awardees should be posted on the HERA website, and reviewers included in the circulation of the HERA newsletter and any additional publications.
Appeals Process
The majority of partners do not have an appeals process in place.
Section 8 Step by step account of proposed peer review in practice
1. Theme selected
2. Funding method agreed
3. Financial contribution of each partner agreed and allocated
4. Legal barriers addressed
5. Partners sign an agreement/contract to commit to research programme
6. Application procedures finalised and agreed
7. Peer review process finalised and agreed
8. Partner Council selected to co-ordinate and organise peer review process
9. Partners nominate appropriate independent assessors/members of review panel
10. Call launched on HERA website
11. Date and location of assessment panel selected and confirmed
12. Independent assessors/members of review panel selected, contacted and
confirmed
13. Applications received by post by specific deadline
14. Processing of applications
15. Applications sent to relevant independent reviewers for evaluation
16. Independent reviewers complete evaluation form and return
17. Independent reviewers evaluations sent to members of assessment panel
18. Assessment panel meeting held
19. Recommendations of assessment compiled – grading, ranking confirmed
20. Recommendations of assessment panel presented to HERA Network Board
21. HERA Network Board make final funding decisions
22. Successful awards made
Section 9 Timetable for the proposed peer review process
Preceding agreement:
• Theme decided
• Financial contribution by all partner Councils agreed
• All aspects of the application procedure and peer review process agreed, drafted and signed by all partners
Month 1
Documentation finalised and agreed
Nominations of potential assessors/review panel members
Month 2
Call launched
Month 3
Consideration of potential assessors/review panel members
Month 4
Finalise and contact assessors/review panel members
Month 5
Submission of research proposals by applicants
Month 6
Processing of applications received
Month 7
Peer review process commences
Applications sent to independent reviewers
Month 8
Independent reviewers return completed evaluation forms
Independent reviewer evaluations sent to assessment panel members
Assessment panel meeting held
Month 9
Recommendations of assessment panel compiled and forwarded to HERA
Network Board
Month 10
HERA Network Board meets to discuss recommendations and final funding decisions are made
Section 10 Conclusion
The report is intended to provide a practical template of a possible peer review model to be utilised in the forthcoming HERA transnational research funding programme, a framework to be forwarded to the HERA partners responsible for administration of the programme.
As the report documents, there is much variation in the peer review processes employed by partner Councils across Europe, and therefore many different options that the forthcoming transnational research funding programme could implement. This report has endeavoured to outline different peer review templates to be further debated, discussed and considered in terms of the report of WP9 Barriers to Joint Funding, in order to reach a final conclusion as to the peer review method that benefits the HERA group most.
As this report presents the accumulated results of a survey-based questionnaire and of workshop presentations and discussions, rather than a comparative or academic analysis, it is suggested that a more in-depth paper on peer review be commissioned to complement this report, addressing the broader policy aspects of peer review.
Section 11 Template for evaluation form
HERA
Transnational Research Funding Programme
Evaluation Form
Details of Reviewer
Title
Prof/Dr/Mr/Mrs/Ms/Miss/(Other)
Initials
Surname
E-mail
Telephone
Post held
Institution
- Department
- Address
Grade
Theme, title, project leader of Research Project evaluated
Comments on Research Proposal
1. Broad aims and objectives of the research
2. Proposal description
3. Proposed schedule of development of proposal
4. Location of the research proposal within the current state of research
5. Bibliography in the research area
6. Relative significance of the contribution that the research proposal
will make to the research field
7. Methodology
8. Theoretical framework
9. Suitability of institutions proposed
10. Plans for publication and dissemination of the research results
Comments on Budget
HERA Coordinating Partner
Research Council for the Humanities of the Netherlands Organisation for
Scientific Research (NWO)
Nederlandse Organisatie voor wetenschappelijk Onderzoek (NWO), Geesteswetenschappen
(GW)
Organisation Overview
Description
The NWO is defined as a Research Council, and has a division specific to the funding of
Humanities research.
Strategy
Entitled ‘Themes plus Talent’, NWO envisages its mission as the provision of answers to
the questions that scientific researchers expect society to face both now and in the future.
A new strategy paper will be published early in 2006.
Funding Instruments
Veni is an individual grant for researchers who have recently completed their PhD and wish
to further develop their area of research (Post Doctoral).
Vidi offers grants to researchers to establish their own research group with one or more
assistants.
Vici is a grant for senior researchers to build their own research group.
The NWO also offers funding through the following instruments: Medium sized research projects, Sabbatical leave grants, Research Schools, Scientific meetings, Medium sized infrastructure and a scheme entitled ‘Dutch Flemish Cooperation’.
Thematic research programmes including: ‘Future of the religious past’, ‘Transformations
in art and culture’, ‘Cultural change and the fundamentals of the humanities’, ‘Interactive
multimodal information extraction (IMIX)’, ‘Preserving and developing the Archaeological
archive (BBO)’, ‘Malta harvest’, ‘Ethical and social aspects of research and innovation’ and
‘Societal component of genomics research and language acquisition and multilingualism’.
Research funding
The NWO Humanities division has an annual research funding budget of approximately €25 million, of which 25% is allocated to top-down (thematic) research and 75% to bottom-up research.
Peer Review Process
Peer Review Objectives
The NWO considers the following peer review objectives to be of equal importance:
evaluation of applications, funding recommendations, research excellence, transparent
process and an impartial process.
Peer Review Process Overview
A two-phase peer review process is utilised by the NWO in approximately 50% of the responsive mode research funding instruments: applicants submit an initial summary of their research proposal, this is evaluated by the assessment committee of the specific programme/instrument, and the best applications are invited to submit a detailed version of their research proposal. The assessment committee comprises experts in the field who are appointed by the Council responsible for the formal funding decision. In the second phase of the peer review process, evaluations written by independent reviewers are sent anonymously to applicants, and the applicant is allowed the opportunity to comment on the evaluation (“Right to Reply”).
Methods
The NWO utilises independent domestic and international reviewers and panels consisting
of both domestic and international assessors. This applies for both responsive and
thematic funding instruments.
Independent Reviewers
Each application is reviewed by at least two (on average 3) external independent expert
reviewers. These are either national or international scholars chosen by the NWO for their
expertise in the subject of the proposal. Independent reviewers are not paid a fee for their
services. A common evaluation form is completed by all reviewers and applications are
graded.
Assessment Committee
All proposals submitted within a particular funding instrument or research programme are
evaluated by (international) experts and subsequently ranked by an assessment
committee of experts from the Humanities. International experts are selected especially if
a domestic expert cannot be located or if a conflict of interest with a national expert
arises. The assessment committee assesses and ranks each application on the basis of the
evaluations formulated by the independent reviewers, the comments from the applicants
and the actual research proposal. In the majority of responsive funding instruments the
committee also conducts interviews with the applicant. The committee makes a ranking list
and submits it to the Council Board.
The committee is comprised of approximately 8 members, depending on the number of
applications received and the expertise required. Slightly more males than females are
represented on the committee (60% male to 40% female). Travel expenses for members are
reimbursed. There is no common evaluation form, but a set of assessment criteria is
supplied and all applications are graded and ranked.
Number of reviewers
On average evaluations are reviewed in detail by 3 external (independent) reviewers.
Process for selecting reviewers
Council members nominate/select appropriate reviewers and reviewers are also sourced
individually by the Research Council staff via contacts, research networks and the internet.
Criteria for selection of reviewers
1 = relevant disciplinary competence
2 = academic excellence
3 = previous peer review experience
4 = to create gender balance
5 = encourage young academics.
Reviewer recognition
A formal thank you letter from the NWO is forwarded to participating reviewers.
Reviewer Training
Reviewers receive no formal training from the NWO.
Conflict of interest
When the NWO is engaged in selecting reviewers, it does not select reviewers from the same university as the applicant, or individuals considered to have a direct and active relationship to the applicant via networks, research groups, etc.
In relation to conflict of interest for (international) reviewers, assessment committee members and Council members, the NWO forwards extensive instructions, which must be agreed to and a statement signed.
Grading system
Both a grading and ranking system are utilised by the NWO.
Application evaluation criteria
Research proposal
Proposals in the major talent-oriented funding schemes are rated on the following criteria:
1. Assessment of the quality of the researcher
2. Innovative character of the proposed research
3. Assessment of the quality of the research proposal
4. Final overall assessment
Principal applicant
1 = Academic record and achievements
2 = International collaboration
3 = Previous awards/funding
4 = Mobility of researchers
Feedback
Independent reviewers’ written evaluations and grades are available to applicants; however, their identity is kept anonymous. Assessment committee members’ written evaluations are also available to applicants, and their identity is made known to candidates. Applicants are not informed about their ranking position.
Interviews conducted by the assessment committee are recorded and filed for administration purposes and are only released in exceptional cases whereby an official complaint has been made.
The NWO evaluation process is subject to a national ‘Freedom of Information’ Act but in a
limited way, i.e. the names of the expert reviewers remain confidential.
Feedback to reviewers
The final funding decisions are communicated to reviewers.
Appeals process
An appeal process is available for applicants based on procedural grounds only.
Cost of peer review process
This data is not obtainable.
HERA Partner
Academy of Finland (AKA)
Suomen Akatemia (SA)
Organisation Overview
Description
The Academy of Finland has a division dedicated to the funding of humanities and social sciences, entitled the ‘Research Council for Culture and Society’.
Strategy
The mission of the AKA is to promote high-quality scientific research by means of long-term funding based on scientific quality, reliable evaluation, science policy expertise and
extensive international cooperation.
Funding Instruments
The AKA provides funding for Research appropriations (also referred to as Research grants, which are designed to promote diversity and innovation in research), Academy Professors, Academy Fellows, Postdoctoral researchers and Senior Scientists, and operates a number of Research Programmes (thematic research programmes). The AKA also funds Centres of Excellence.
Research funding
The AKA Research Council for Culture and Society has a budget of approximately
€29.6 million per annum. 80% of this budget is allocated to bottom-up research with the
remaining 20% to targeted (top-down) research.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, transparent and impartial process
2 = Fund research excellence and international benchmarking
The peer review process does not make actual funding recommendations.
Peer Review Process Overview
Both one- and two-phase peer review processes are utilised by the AKA. The two-phase
process is operated in the case of thematic research funding instruments, whereby initial
applications are reviewed by a Programme Steering Group. (The steering group is
composed of members of the respective research council, other funding organisations and
representatives of relevant interest groups. The steering group is appointed by the
President of the Academy of Finland.)
Methods
The AKA method of peer review for responsive mode research funding instruments
involves both independent domestic and international reviewers and also panels of
domestic and international assessors.
It should be noted that the evaluation process and funding decisions are distinctly separate
from each other and that review panels are not used in all disciplines if, for example, few
applications are received.
Independent Reviewers
The majority of AKA research funding instruments are evaluated by independent
reviewers, on average 2 per application. They receive a sum of €55 per application
evaluated, use a standard evaluation form and grade the applications. (Some short-term
funding instruments (e.g., grants for researcher training abroad) are usually reviewed by
two Research Council members).
Domestic and International Panels
Panels are primarily composed of a mix of international and domestic members; however, panels consisting of only international or only domestic members may be utilised to facilitate the employment of the best experts available.
Panels are composed of approximately 3-4 members (approximately 70% male and 30% female), and this is consistent regardless of the research funding instrument. Panel members are paid a flat fee of €280 (the Chair of the panel receives €370) plus travel costs and a further €18 per application evaluated. A standard evaluation form is completed by all panel members and applications are graded only.
Number of reviewers
On average each application is assessed in detail by 2 reviewers prior to the assessment
panel meeting, although all assessment panel members are responsible for the final
statements.
Process for selecting reviewers
Council members suggest and nominate appropriate reviewers or reviewers are sourced
individually by Research Council staff, for example via relevant contacts, networks or the
internet.
Criteria for selection of reviewers
1 = Academic excellence, relevant disciplinary (subject area) competence
3 = Previous peer review experience, create gender balance
5 = Encourage young academics
Reviewer recognition
Reviewers do not receive formal recognition for their participation in the peer review
process.
Reviewer Training
Reviewers do not receive training from the AKA.
Conflict of interest
The AKA has a ‘conflict of interest’ policy in place based on Finnish law, and reviewers are requested to adhere to those regulations. The regulations state that reviewers are required to declare any personal interests and must disqualify themselves if they:
- would benefit from the approval or rejection of the proposal
- are closely related to the applicant
- have been a superior, subordinate or instructor of the applicant during the past three years
- are in close collaboration with the applicant
- are currently applying for the same post as the applicant
- are currently applying for funding from the Academy under the same funding instrument
Grading system
The AKA adheres to a grading system of: 1 (poor), 2 (satisfactory), 3 (good), 4 (excellent)
and 5 (outstanding). Applications are ranked during the funding decision process
conducted by Council members.
Application evaluation criteria
Research proposal
1 = Proposal description, location of the research proposal within the current state of
research and the relative significance of the contribution that the research proposal will
make to the research field
2 = Broad aims and objectives of the research, methodology, theoretical framework and
the proposed schedule of development
3 = Bibliography in the research area, plans for publication and dissemination of the
research results
4 = Suitability of institution proposed
Principal applicant
1 = Academic record and achievements
2 = International collaboration
3 = Mobility of researchers
4 = Previous awards/funding
Feedback
The identity of independent reviewers and assessment panel members and their written
evaluations and grades are available to applicants of funding instruments.
Discussions of the applications by assessment panel members are not recorded at the
meeting.
The evaluation process is subject to a Freedom of Information policy, in that individual evaluations are available to the respective applicants only.
Feedback to reviewers
The final funding decisions are normally communicated to reviewers.
Appeals process
An appeal process is not available for applicants.
Cost of peer review process
The total overall cost of the ‘Research Council for Culture and Society’ peer review process
for 2004 (evaluation of all research funding instruments, including Centres of Excellence)
was €140,500.
HERA Partner
Academy of Sciences of the Czech Republic (ASCR)
Akademie věd České republiky (AV ČR)
Organisation Overview
Description
The Academy of Sciences of the Czech Republic (ASCR) formulates its own scientific policy,
advises the state, administers national and international research programmes, and
promotes cooperation with both applied research and industry to foster technology transfer
and exploitation of scientific knowledge. The Humanities and Social Sciences Division is
responsible for funding humanities research.
Strategy
The main objective of the Academy is to conduct both fundamental and strategic applied
research to create scientific knowledge that contributes to strengthening the nation's
position in key areas of science and to find solutions to contemporary issues in society. It
also promotes basic research in the humanities and social sciences, performs related
activities in the higher education sector, including dissemination and application of
research and the preservation of the national heritage.
Funding Instruments
The ASCR operates several research funding instruments, including: Institutional Research
Plans (thematic funding instrument), Program for the support of targeted research
(responsive mode), Program "Information Society" and the Grant Agency of ASCR
(responsive mode).
Research funding
The total budget of the ASCR allocated to the funding of humanities and social sciences
was €15.5 million in 2003.
Peer Review Process
Peer Review objectives
1 = make funding recommendations, Fund research excellence (at national and European
level), transparent and impartial process
2 = Evaluate applications (mapping of research potential), transparent and impartial
process and to fund research with cultural relevance (Bohemian studies, cultural heritage
studies)
3 = international benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments.
Methods
The ASCR utilises different peer review processes depending on the funding instrument.
Institutional Research Plans
The evaluation procedure has three main steps:
1. Assessment by independent reviewers
2. Presentations made by applicants of their proposed research plans to independent reviewers and some Evaluation Committee members
3. Evaluation Committee (assessment panel) meet to discuss and grade applications
In other research funding instruments, applications are evaluated and graded by
independent reviewers (postal reviewers) and final decisions are made by an assessment
panel.
Independent Reviewers
Independent reviewers are utilised by all research funding instruments. Applications are
reviewed by 3-7 reviewers depending on the funding instrument. The majority of
independent reviewers are international and receive financial compensation of €300 per
application reviewed. Standard evaluation forms are completed and grades are assigned
for some funding instruments.
Combined Domestic and International Panels
Assessment panels consisting of both domestic and international members are operated
for all funding instruments, with the number of members per panel ranging from 5 to 10.
Financial remuneration differs depending on the funding instrument; for example, no
compensation is paid for the ‘Grant agency of ASCR’ but the ‘Institutional Research Plans’
assessment panel receives a fee of €300 per application reviewed and travel costs are
reimbursed. Panel members complete standard evaluation forms and applications are
graded and ranked.
International Panels
Exclusive international panels are not utilised by the ASCR.
Number of reviewers
The average number of reviewers per application is 3-5; however, this increases for the instrument ‘Institutional Research Plans’, which is evaluated by 5-7 reviewers.
Process for selecting reviewers
Council members select reviewers; reviewers are also sourced from a database of
domestic and international reviewers. In addition applicants are requested to recommend
appropriate reviewers when submitting their application.
Criteria for selection of reviewers
1 = Previous peer review experience, relevant disciplinary (subject area) competence
2 = Academic excellence, encourage young academics, international co-operation
3 = Create gender balance
Reviewer recognition
Reviewers receive recognition by noting their participation in their Curriculum Vitae and
academic publications.
Reviewer Training
Reviewers only receive training for the funding instrument ‘Institutional Research Plans’.
Conflict of interest
A ‘conflict of interest’ policy exists, whereby reviewers must sign a formal declaration.
Grading system
Applications are graded and ranked. The grading system utilised is: A++, A+, A, B, NR
(Not Recommended).
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, Proposed schedule of development of
proposal, Bibliography in the research area, Relative significance of the contribution that
the research proposal will make to the research field, Methodology, Financial support
2 = Plans for publication and dissemination of the research results, Proposal description,
Theoretical framework, Suitability of institution proposed, Research infrastructure
Principal applicant
1 = Academic record and achievements, International collaboration
2 = Mobility of researchers, Research infrastructure, Financial support
3 = Previous awards/funding
Feedback
Independent reviewers’ written evaluations and grades are available to applicants of funding instruments, and the identity of the reviewer remains anonymous to the applicant. Assessment panel members’ written evaluations and the grade assigned are released to the applicants, and their identity also remains anonymous. The ASCR is subject to a ‘Freedom of Information’ Act.
Feedback to reviewers
Feedback on the peer review process is provided to reviewers.
Appeals process
An appeals process is not in operation.
Cost of peer review process
This data was not obtainable.
HERA Partner
Arts and Humanities Research Council (AHRC)
Organisation Overview
Description
The Arts and Humanities Research Board (AHRB) was established in October 1998, by the
three higher education funding councils for England, Scotland and Wales. Following a
Government review of research funding in the arts and humanities it was agreed that a
UK-wide Arts and Humanities Research Council should be created and located alongside
the other UK Research Councils. On 1 April 2005, the Arts and Humanities Research
Council was launched. The AHRC operates a wide range of programmes supporting the
highest quality research and postgraduate training in the arts and humanities.
Strategy
The AHRC aims to:
• Support and promote high-quality and innovative research in the arts and humanities.
• Support, through programmes in the arts and humanities, the development of skilled people for academic, professional and other employment.
• Promote awareness of the importance of arts and humanities research and its role in understanding ourselves, our society, our past and our future, and the world in which we live.
• Ensure that the knowledge and understanding generated by arts and humanities research is widely disseminated for the economic, social and cultural benefit of the UK and beyond.
• Contribute to the shaping of national policy in relation to the arts and humanities.
Funding Instruments
The AHRC operates a number of funding instruments, including:
1. Research Grants - projects which enable individual researchers to collaborate.
2. Research Leave - provides a period of matching leave (for a term or semester) for the completion of significant research projects.
3. Resource Enhancement - projects (maximum funding of £500,000, duration of 3 years) that improve access to research materials and resources.
4. Small Grants in the Creative and Performing Arts - research projects (maximum duration of one year) in the creative and performing arts only.
5. Fellowships in the Creative and Performing Arts - supports artists as research fellows within a higher education environment.
6. Research Networks and Workshops – develop interdisciplinary research ideas, by establishing new research networks or by workshops/seminars.
7. Strategic Initiatives - address issues of intellectual and wider cultural, social or economic urgency that the Council considers are best supported by concentrated investments.
Funding
The AHRC has an annual budget of more than £75 million.
Peer Review Process
Peer Review objectives
The highest rating (1) was assigned to the following objectives: evaluate applications, fund research excellence, and a transparent and impartial process. A rating of 2 was assigned to international benchmarking (of particular relevance for funding awarded to Research Centres).
Peer Review Process Overview
A one-phase peer review process is utilised by all responsive funding instruments. A two-phase process is employed by some thematic funding instruments, whereby a preliminary summary of the research proposal is submitted, evaluated by an assessment board, and those selected are requested to submit a further detailed application.
Methods
The AHRC established a Peer Review College to meet its peer review requirements. The College evaluates and reviews applications submitted to all research funding instruments, including responsive and thematic modes. The College has a membership of more than 560 academics who participate in evaluating applications (however, occasionally reviewers are selected who are not members of the College).
The College was established to improve the quality of peer review by enabling members to assess a number of applications per year, as opposed to just one or two over an extended period; this allows their expertise to develop and in turn provides better comparative assessments of the strengths and weaknesses of applications.
Independent Reviewers
All AHRC research funding instruments are evaluated by independent reviewers, on
average 2 per application. They do not receive monetary compensation. They use a
standard evaluation form and grade the applications.
Domestic and International Panels
The majority of funding instruments employ a panel of domestic assessors only; however, for thematic funding instruments panels include at least one international member. The Small Grants in the Creative and Performing Arts, Research Networks and Workshop schemes do not employ a full panel; only panel chairs are involved in decision making in these instances.
Panels are composed of approximately 8 members for responsive mode instruments, while strategic panels are composed of 8-15 members. Panel members are paid a flat fee of €1,800 plus travel costs. Standard evaluation forms are utilised by all panel members and applications are graded and (usually) ranked.
International Panels
The AHRC does not employ assessment panels composed exclusively of international
assessors.
Number of reviewers
On average each application is assessed in detail by 3 reviewers (including the nominated
reviewer).
Process for selecting reviewers
Reviewers are selected on the basis of their expertise from the database of college
members or can be sourced individually by Research Council staff, for example, via the
internet.
Criteria for selection of reviewers
The AHRC rated academic excellence as the most important factor in selecting reviewers,
followed by previous peer review experience and competence in the relevant discipline.
Creating gender balance and encouraging young academics were also considered of
significance.
Reviewer recognition
AHRC Peer Review College members are listed on the AHRC website; some members will
choose to indicate their membership on their Curriculum Vitae and academic publications.
Reviewer Training
Training for reviewers is provided by the AHRC in the form of induction days organised for
members. The induction consists of carrying out a mock assessment exercise with
experienced Peer Review College members, comparing each other’s grades and the
rationale behind the awarding of a particular grade.
Conflict of interest
The AHRC has a ‘conflict of interest’ policy that applies to independent reviewers, domestic
and international assessment panel members.
Grading system
The AHRC adheres to a grading system of: A+, A, N (not a priority), RS (resubmit) and U
(unsuccessful). Responsive mode panels rank applications to facilitate the approval of
funding recommendations by the AHRC’s Research Committee or Council. Strategic mode
panels have (limited) delegated authority to make funding decisions.
Application evaluation criteria
Research proposal
The highest priorities when evaluating applications are the quality and thematic relevance
of the research proposal, the scheduling of the research tasks and the feasibility. Relative
significance of the contribution that the research proposal will make to the research field
and the demonstration of a detailed knowledge of the field also occupy the highest rating.
Methodology, theoretical framework, suitability of institution proposed and plans for
publication and dissemination of the research results are also considered significant.
Principal applicant
Academic record and achievements take precedence in the evaluation of the principal
applicant, followed by their record of previous awards/funding. International collaboration
is only considered if it is an indication of the status of the applicant or if the research
proposal depends on international cooperation.
Feedback
Independent reviewers’ written evaluations are available to applicants of funding instruments; however, the applicant’s actual grade is not released. The independent reviewer remains anonymous throughout the process.
Applicants are not permitted access to an assessment panel member’s written evaluation; however, the grade awarded to their application is released.
Discussions of the applications by assessment panel members are occasionally recorded and released to applicants (in particular, when applications are graded RS (resubmit), assessment panels are requested to agree on feedback for the applicant).
The evaluation process is subject to the AHRC ‘Freedom of Information’ policy, which is
based on UK legislation entitled the Freedom of Information Act 2000.
Feedback to reviewers
The final funding decisions are communicated to reviewers.
Appeals process
An appeal process is available for applicants to follow if they are dissatisfied with the
outcome of their evaluation.
Cost of peer review process
This data was not obtainable.
HERA Partner
Austrian Science Fund (FWF)
Fonds zur Förderung der wissenschaftlichen Forschung
Organisation Overview
Description
The Austrian Science Fund (FWF) is Austria's main organisation for the promotion of basic
research. It is equally committed to all branches of science and the humanities and its
activities are solely guided by the standards of the international scientific community. A
specific department exists to cater for the humanities and social sciences.
Strategy
The FWF is dedicated to the promotion of basic research. It is equally committed to all branches of science and in all its activities is guided solely by the standards of the international scientific community. It is also committed to education and training through research, knowledge transfer and the formation of a research culture. Relevance of the research funded by the FWF for society, culture and the economy is welcome, but is not the main goal of the FWF’s mission.
Funding Instruments
The FWF operates a number of funding categories, including:
1. Support for Stand-alone projects and publications
2. Priority Research programmes, including Special Research Programs (local centers of excellence), National Research Networks (promoting nation-wide networking) and Doctoral Programs (training of talented PhD students).
3. International mobility – Erwin Schrödinger Fellowships (research period abroad) and Lise Meitner Program (research period for international researchers in Austria)
4. Promotion of women - Hertha Firnberg and Charlotte Bühler Programmes (support future female professors)
5. Awards and prizes, including the START Program (young researchers), the Wittgenstein Award (senior researchers) and EURYI Awards.
Some funding instruments (Hertha Firnberg, START Program, Wittgenstein Award) are
operated by the FWF on behalf of the Federal Ministry for Education, Science and Culture
(BMBWK) and the Federal Ministry for Transport, Innovation and Technology (BMVIT).
Research funding
The FWF has a total annual budget of approximately €108 million; of this, approximately 12.7% is awarded to humanities research (€14 million). All funding instruments operated
by the FWF are bottom-up in nature.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, make funding recommendations, fund research excellence,
impartial process and International benchmarking
3 = Transparent process, as only authorized sections of the written evaluation are
forwarded anonymously to the applicant
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments, except for the
Priority Research programmes, which operate a two-phase process, whereby the initial
draft proposals submitted are evaluated by independent reviewers and full proposals are
reviewed by international assessment panels.
Methods
Priority Research programmes and Awards and Prizes (START Program and the
Wittgenstein Prize) utilise both Independent international reviewers and international
assessment panels.
Independent Reviewers
All FWF research funding instruments are evaluated by independent reviewers, except for
the Impulse project. An application may have between 2 and 10 reviewers depending on
the instrument. Reviewers do not receive monetary compensation; a standard evaluation
form is completed and applications are graded.
Domestic/Research Council member Panels
All funding instruments use this method and panels consist of 26 members (26 substitutes
are also appointed); the number of panel members is constant regardless of the funding
instrument. The gender balance is approximately 80% male to 20% female.
All panel members receive a flat fee of €700 and travel expenses are covered. There is no
standard evaluation form; applications are not graded but are ranked
depending on the international reviewers’ evaluations and the budget available.
International Panels
Priority Research Programs utilise International Panels composed of 6 – 10 members, as
these are long term funding instruments. The Awards and Prizes employ International
Panels (composed of 14 members) as they are highly competitive research instruments.
The gender balance is approximately 70% male to 30% female. A fee is not paid to
members of panels, but their travel expenses are covered. Priority Research Networks
panels are required to complete a standard evaluation form and applications are graded.
Only the Awards and Prizes assessment panels rank applications.
Number of reviewers
The number of reviewers differs significantly across different funding instruments; however
the average is 2 reviewers per application, and is increased depending on the amount of
funding requested. The main exceptions are the Priority Research Networks which are
reviewed by 6-10 reviewers and the Awards and Prizes which are evaluated by
approximately 4-6 reviewers.
Process for selecting reviewers
Reviewers are selected by Council members, and applicants are also requested to recommend appropriate reviewers (applicants may also name individuals they consider inappropriate as reviewers due to conflicts of interest).
Criteria for selection of reviewers
1 = Academic excellence, Relevant disciplinary (subject area) competence
3 = Previous peer review experience, Encourage young academics, Create gender balance
Reviewer recognition
Reviewers receive a formal letter of confirmation from the FWF.
Reviewer Training
Reviewers do not receive training from the FWF.
Conflict of interest
The FWF has a ‘conflict of interest’ policy, entitled the ‘Declaration of interest’ whereby
reviewers inform the FWF if they believe a conflict of interest exists, for example if a
reviewer could potentially profit professionally, financially or personally from approval or
rejection of an application, or if the reviewer has published with the applicant or their co-workers.
Grading system
The FWF adheres to a grading system of: ‘excellent’ (90-100 points), ‘very good’ (75-85
points), ‘average’ (55-70 points), ‘below average’ (35-50 points) and ‘poor’ (10-30 points).
Applications are not ranked, but rejected and borderline cases are comparatively
discussed among all disciplines depending on the budget available.
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, Proposal description, Proposed schedule of
development of proposal, Location of the research proposal within the current state of
research, Relative significance of the contribution that the research proposal will make to
the research field, Methodology, Theoretical framework, Suitability of institution proposed,
Plans for publication and dissemination of the research results
3 = Bibliography in the research area
Principal applicant
1 = Academic record and achievements
2 = International collaboration
3 = Previous awards/funding, Mobility of researchers
Feedback
Independent reviewers’ written evaluations are available to applicants of funding instruments; however, the applicant’s actual grade is not released. The independent reviewer remains anonymous throughout the process.
Only applicants to Priority Research Networks are not permitted access to an assessment panel member’s written evaluation; however, the grade awarded to their application and the identity of the assessment panel members are released.
Discussions of the applications by assessment panel members are only recorded and
released to applicants of the Priority Research Networks. There is no ‘Freedom of
Information’ policy which the FWF must adhere to.
Feedback to reviewers
The final funding decisions are communicated to reviewers on request.
Appeals process
No process is in place at present.
Cost of peer review process
This data was not obtainable.
HERA Partner
Danish Research Agency (DRA)
Forskningsstyrelsen (FS)
Organisation Overview
Description
The DRA serves as an administrative agency assisting the Danish research councils and
programme committees, among them the ‘Danish Research Council for the Humanities’.
Strategy
The Danish Research Council for the Humanities funds researcher initiated projects
exclusively.
Funding Instruments
The Danish Research Council for the Humanities operates a number of research funding
instruments including Research projects (including Post Doctoral and collective senior
researcher projects), grants to PhD students, Research Centres, Networks, Conferences
and international exchanges. An instrument is in place to facilitate the application of natural science techniques to humanities research, as is a scheme entitled START which assists in the preparation of large-scale EU proposals. Funding is also offered for the dissemination of research, including the translation and publication of research material (books, doctoral theses and journals).
The Research Council for the Humanities does not fund thematic research programmes; however, a number of themes selected by the Board of the research councils are funded annually. These selected themes are funded through a number of the funding instruments and according to the general procedure for all funding. In 2005, the selected theme within the humanities was “Cultural heritage & creative industries”.
Research funding
The Danish Research Council for the Humanities has an annual budget of approximately
€15 million (113 million DKR). This budget is allocated exclusively to bottom-up research.
Peer Review Process
Peer Review Objectives
The key objectives of the DRA peer review process are to fund research excellence and to
operate an impartial and transparent process.
Peer Review Process Overview
A one-phase peer review process is exclusively utilised by the Danish Research Council for
the Humanities, whereby full applications are submitted by the designated closing date.
Method
Generally, Research Council members act both as independent reviewers and as the assessment panel.
Each Council member is assigned the applications that correspond to their specific area of academic expertise; this Council member then makes an oral review of the application (no written evaluation is completed) and presents it to an initial sub-group of 3-4 council members who provisionally prepare the evaluations. Four sub-groups are formed by discipline: 1) Aesthetics (art, literature, musicology, theatre science and media science), 2) Languages, 3) History (history, archaeology, anthropology, ethnography) and 4) Philosophy, psychology, educational science, religion and theology. The provisionally prepared evaluations are then presented to the entire council, each member of which will have read and assessed all the applications. The applications are then discussed and final funding decisions made.
The advantages of reviews by council members are identified by the DRA as: 1) a short
and effective review and decision-making procedure, 2) inexpensive administration costs
and 3) consensual inter-subjectivity and transparency.
Research Council member Panels
The panel consists of 15 participants (the entire research council); the gender balance is
relatively even, with 8 male and 7 female panel members.
Research Council members are paid a flat fee and travel expenses for participating in the peer review process; however, the flat fee constitutes the total sum paid to all Council members for their annual work at the Council, including meetings, evaluations, counselling, etc. This fee is currently approximately €6,710 per annum.
Panel members do not complete a standard evaluation form, nor do they grade or rank
applications.
Independent Reviewers
According to standard procedure, the council members act as reviewers. However, independent national experts are engaged by the DRA for certain applications, that is: 1) if a conflict of interest with a council member arises in relation to a specific application, or 2) if there is no expertise in the council for reviewing a given project.
Independent international experts are employed by the DRA for certain applications: 1) if the funding applied for in a single application reaches €1.33 million, or 2) if a conflict of interest with a council member arises in relation to a specific application and no national expert is identified.
External independent reviewers (not Research Council members) receive financial
compensation for evaluations; they receive €135 as an initial fee and then €135 per
application evaluated (e.g. €270 for one application, €405 for two applications, €540 for
three, etc.). External independent reviewers do not receive a standard evaluation form,
but they do receive guidelines for evaluating applications.
International Panels
International panels are not utilised by the Danish Research Council for the Humanities.
Number of reviewers
On average 2 external reviewers evaluate each application in detail.
Procedure for selecting external reviewers
The role of Research Council members includes the designated task of evaluating
applications. Council members nominate/select appropriate reviewers due to their
expertise in a particular research area.
Criteria for selection of reviewers
1 = Relevant disciplinary (subject area) competence, 2 = Academic excellence, 3 =
Previous peer review experience, 5 = Encourage young academics and create gender
balance.
Reviewer recognition
Reviewers do not receive formal recognition of their participation in the peer review
process.
Reviewer Training
Reviewers do not receive formal training from the DRA; however, they do receive a
detailed letter including guidelines, criteria and the call for proposals.
Conflict of interest
The DRA has a legally binding ‘conflict of interest’ policy that relates to both independent
reviewers, council members/assessment panel members, stating that “reviewers may not
be in an economic, professional or personal relationship with the applicant.”
Grading system
It is the policy of the Danish Research Council for the Humanities not to grade or rank
applications during the review procedure.
Application evaluation criteria
Research proposal
1 = Proposal description, relative significance of the contribution that the research
proposal will make to the research field
2 = Methodology and theoretical framework
3 = Broad aims and objectives of the research, proposed schedule of development of
proposal, location of the research proposal within the current state of research and
suitability of institution proposed.
4 = Bibliography in the research area
5 = Plans for publication and dissemination of the research results
Principal applicant
1 = Academic record and achievements, international collaboration
3 = Previous awards/funding and mobility of researchers
Feedback
Only the external independent reviewers' written evaluations are available to the applicant.
All Research Council members' evaluations are oral; however, the applicant receives a letter
stating the reasons for not being granted funding. Applicants are entitled by law to request
the identity of their reviewers.
Some elements of the discussion of the applications by assessment panel members are
recorded at the meeting, in particular the council’s grounds for rejecting an application.
The evaluation process is subject to national ‘Freedom of Information’ legislation and, upon
request, applicants may gain access to all papers concerning their application (except for
papers designated as internal working documents).
The DRA is obliged by law to send each unsuccessful applicant a letter explaining why the
application was not funded; for example, the letter may outline specific shortcomings of the
project or insufficient academic qualifications of the applicant, or it may state that, due to
the council's limited budget, the council could not fund all projects deemed to merit support.
Feedback to reviewers
The final funding decisions are not directly communicated to external reviewers; however,
the council’s funding decisions are made public and listed on the DRA website.
Appeals process
An appeals process is available to unsuccessful applicants. Applicants may make a
complaint to the Ministry for Science, Technology and Innovation regarding administrative
procedures. Applicants may also request an elaboration of the feedback provided to them.
Cost of peer review process
This data was not obtainable, as it is highly variable due to the varying number of external
reviewers and the general flat fee council members receive for all their council work.
HERA Partner
Estonian Science Foundation (EstSF)
Eesti Teadusfond (ETF)
Organisation Overview
Description
The Estonian Science Foundation (EstSF) is an expert research-funding organisation which
aims to support the best research initiatives in all fields of basic and applied research. The
Expert Commission for the Humanities is responsible for all humanities research.
Strategy
The objectives of the EstSF include fostering the development of basic and applied
research, supporting the most promising researchers and research groups, facilitating
international cooperation and encouraging young researchers and their mobility.
Funding Instruments
The EstSF operates two main research funding instruments, entitled ‘Research Grant’ and
‘My First Grant’. The EstSF does not currently operate any thematic research funding
instruments.
Research funding
The overall annual budget of the EstSF is €6 million, with €535,000 devoted specifically to
humanities research.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, make funding recommendations, transparent and impartial
process
2 = Fund research excellence (national and/or field priorities may be considered),
international benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments.
Methods
Both independent domestic and international reviewers and domestic/Research Council
member panels are employed by the EstSF.
Independent Reviewers
All funding instruments employ both independent domestic and international reviewers
with approximately 2 reviewers per application evaluated. Each reviewer receives a fee of
€21 per application. A standard evaluation form is completed by reviewers and all
applications are graded.
Domestic/Research Council member Panels
Domestic/Research Council member panels are operated for all funding instruments, with
about 10 members per panel. The gender balance of the Humanities Expert Commission is
equal (50% male, 50% female). Assessment panel members receive financial remuneration,
including travel expenses and a fee that varies with the number of applications evaluated
per panel member. Panel members do not complete a standard evaluation form, nor do they
grade applications; however, applications are ranked.
International Panels
International panels are occasionally utilised by the EstSF to evaluate grant applications
submitted by Expert Commission members or their relatives.
Number of reviewers
The average number of reviewers per application is 2.
Process for selecting reviewers
Reviewers are selected by Council members and they are also sourced individually by
Research Council staff via contacts, research networks and the internet. Research Council
staff is responsible for selecting at least one international reviewer per application.
Criteria for selection of reviewers
1 = Academic excellence, no conflict of interest
2 = Relevant disciplinary (subject area) competence
3 = Previous peer review experience
5 = Encourage young academics, create gender balance
Reviewer recognition
Reviewers do not receive formal recognition of their participation from the EstSF unless a
reviewer asks for a letter of recognition.
Reviewer Training
Reviewers do not receive training from the EstSF.
Conflict of interest
A ‘conflict of interest’ policy exists, stating that applications from Council members (or
their relatives) are only permitted to be reviewed by international experts.
Grading system
Applications are graded and ranked. The grading system operates on a 1-5 point scale,
where 1 = “poor” and 5 = “outstanding”. Reviewers are requested to assign a grade to
each of the following criteria: scientific quality, competence of the team, relevance of the
research topic, feasibility of the research plan and doctoral training. In addition, reviewers
are requested to provide an overall evaluation rating of 1-5, which is not calculated as the
mathematical average of the individual criterion ratings.
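As a purely hypothetical illustration of this rule (the numbers below are invented, not taken
from EstSF reviews), the criterion grades and the overall rating may diverge:
\[ \text{criterion grades } (4, 4, 3, 5, 4) \ \Rightarrow \ \text{mean} = \tfrac{4+4+3+5+4}{5} = 4.0, \qquad \text{overall rating chosen independently, e.g. } 3 \text{ or } 5 \]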
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, Proposal description, Bibliography in the
research area, Relative significance of the contribution that the research proposal will
make to the research field, Methodology, Theoretical framework
2 = Plans for publication and dissemination of the research results
3 = Proposed schedule of development of proposal
5 = Suitability of institution proposed
Principal applicant
1 = Academic record and achievements
3 = International collaboration, Previous awards/funding
5 = Mobility of researchers
Feedback
Independent reviewers' written evaluations and grades are available to applicants of
funding instruments, while the identity of the reviewer remains anonymous to the applicant.
Assessment panel members' written evaluations and grades are also released to applicants;
however, the composition of the panels is not anonymous. The EstSF is not subject to a
‘Freedom of Information’ policy.
Feedback to reviewers
No feedback on the peer review process is provided to reviewers.
Appeals process
No appeals process is currently in operation.
Cost of peer review process
In 2005, €169,125 was granted to new humanities projects and €2,045 was spent on the
peer review process.
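Taking these two figures at face value (an illustrative ratio, not an indicator reported by the
EstSF), the peer review expenditure amounts to roughly 1.2% of the new humanities funding
granted:
\[ \frac{2\,045}{169\,125} \approx 0.012 \approx 1.2\% \]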
HERA Partner
European Science Foundation (ESF)
Organisation Overview
Description
Established in 1974, the European Science Foundation (ESF) is an independent association
of 78
member organisations responsible for the support of scientific research in 30
European countries. The ESF operates five different divisions, each covering a different
scientific field, including a division for the Humanities.
Strategy
The ESF is committed to promoting high quality scientific research in all disciplines at
European level. It facilitates pan-European cooperation and collaboration of scientific
research by encouraging networking and open communication between researchers and
funding agencies. The ESF aims to improve European research cooperation, to advise on
research and science policy, to promote the mobility of researchers, utilisation of research
facilities and to plan/manage collaborative research activities.
Funding Instruments
The ESF operates a number of funding instruments, including:
Exploratory Workshops - European researchers develop links to explore future
collaborative options (funded from the ESF general budget; no “juste retour”).
Scientific Networks – researchers meet and develop plans and opportunities for carrying
out research on a European scale.
Scientific “à la carte” Programmes – European research teams meet and focus on
identified themes and operate for 3-5 years. These programmes gather Europe-wide
research teams already engaged in funded research projects at national level, with a view to
developing a European platform. They are funded by ESF member organisations on an “à la
carte” basis.
EUROCORES Programmes (ESF Collaborative Research Programmes) – mobilise new
research funding, thereby facilitating the creation of collaborative research projects
composed of teams from across different European countries. EUROCORES is a responsive mode
instrument (supranational selection of EUROCORES theme proposals). Research funding is
owned by ESF Member Organisations and other participating funding agencies; programme
networking is coordinated through ESF (EC funding).
Forward Looks – exercises to identify and debate emerging and future research needs
and developments in Europe; they initially centre on a series of workshops with discussion
groups and panels and are often concluded with a final conference.
Research Conferences – conference series (with competitive calls for themes), co-sponsored
by the ESF and local or institutional sponsors, aiming to facilitate high-level
discussions of innovative specific research issues.
EURYI (European Young Investigators Awards) – enables young researchers to create
research teams in Europe.
Funding
The ESF has a core annual budget of approximately €7 million, accrued from the
contributions of its member organisations. In addition, member organisations contribute
funding to the specific programmes (e.g. Scientific “à la carte” programmes, EUROCORES)
in which they choose to participate.
The Exploratory Workshops and Networks are funded from the ESF budget and the
European Research Conferences series are co-sponsored by the ESF, the European
Commission and local or institutional sponsors.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, fund research excellence (including novelty & originality),
transparent and impartial process
2 = Make funding recommendations, collaboration/European added value, interdisciplinarity
and efficient use of funding
3 = International benchmarking
Peer Review Process Overview
A one-phase process is in use for all ESF instruments, with the exception of the EUROCORES
Scheme, which employs a two-stage process: applicants first submit an Outline Proposal
(a brief summary of the research proposal), which is assessed by the EUROCORES
Programme Review Panel against the criteria specified in the Call. The best applications are
invited to submit a detailed Full Proposal. Members of the Review Panel are suggested by
participating funding agencies and appointed by the ESF. The Review Panel is independent;
members act on the basis of their scientific expertise, not as national representatives. The
two-stage process aims to reduce the submission of work-intensive Full Proposals that do
not fit the Call.
Methods
Responsive research funding instruments employ independent international reviewers.
Both independent international “external referees” and a Review Panel of international
reviewers are employed for the EUROCORES Scheme.
These methods are favoured in an effort to avoid conflict of interest and to promote
transparency as best practice.
Independent Reviewers
Applications to Programmes, Exploratory Workshops (first phase), Forward Looks, EURYI and
the EUROCORES Scheme are evaluated by independent reviewers.
The number of reviewers differs from instrument to instrument; this aims to strike a
balance between the degree of scientific novelty requested for an instrument and its
financial weight.
Applications to EUROCORES Programmes are assessed by a minimum of three independent
external referees, appraised by two rapporteurs from among Review Panel members, and
discussed by the full Review Panel.
Independent online referees do not receive financial compensation for evaluating
applications; the EUROCORES Review Panel convenes at the expense of the ESF. Online
evaluation forms are completed by all reviewers and all applications are graded.
Domestic Panels/International Panels
The ESF exclusively uses international assessments and panels. The composition of panels
and bodies of external referees is based mainly on the scientific expertise needed (unless
specific national regulations of ESF Member Organisations or funding agencies require
additional criteria).
Domestic and international assessors on the same panel
Both European Young Investigators Awards (EURYI) and EUROCORES funding instruments
are assessed by Panels composed of international members. The panel size differs
depending on the funding instrument (e.g. EUROCORES Review Panel: 9-15 members;
EURYI: up to 6 panel members). The gender balance is approximately two-thirds male to
one-third female.
Only EURYI assessment panel members receive financial compensation for evaluating
applications: travel expenses are covered, plus a flat fee of €3,000 for the Chair of the
panel and €1,500 for additional panel members.
Panel members complete a standard evaluation form for the EURYI scheme.
EUROCORES programmes have a complete online application and assessment system.
External referees use standard online assessment forms.
Applications to both schemes are graded and ranked by Review Panels.
Number of reviewers
On average, each application to most funding instruments is assessed in detail by 2-3
reviewers.
As noted above, applications to EUROCORES Programmes are assessed by a minimum of
three independent external referees, appraised by two rapporteurs from among Review
Panel members, and discussed by the full Review Panel.
Process for selecting reviewers
Reviewers are sourced individually by ESF personnel from contacts, research networks,
and the internet and from a database of international reviewers compiled by the ESF.
Often participating research councils and review panel members suggest potential
reviewers.
For a EUROCORES programme, participating funding agencies suggest Review Panel
members and “external referees”. The ESF completes the coverage of scientific expertise
needed for the Panel and the referee body, and appoints the Panel.
Criteria for selection of reviewers
1 = Academic excellence; relevant disciplinary (subject area) competence; no conflict of
interest; availability
3 = Previous peer review experience, encourage young academics (30% maximum for
EUROCORES)
5 = gender balance
Reviewer recognition
Review Panel membership for EUROCORES Programmes is public; names appear on the
EUROCORES Programme website hosted by ESF.
Some ESF reviewers choose to note their participation in their CV (Curriculum Vitae).
Reviewer Training
For most funding instruments, guidelines are made available to referees.
EUROCORES Review Panel members receive detailed instructions before and during
meetings (half-day session); during the review process, advice on procedural issues is
available from the EUROCORES Programme Coordinator. Panel members are asked to
comply with rules governing the declaration of interest (“conflict of interest”). Panel
members are briefed on national funding guidelines when necessary, and on rules needed
to arrive at an assessment of budget estimates.
Conflict of interest
The ESF has a ‘conflict of interest’ policy, which states that “an interest may be defined as
where a person may benefit either professionally or personally by the success, or failure,
of a proposal”. Detailed interpretations are made available and explained to Panel
members and online referees.
Grading system
For most instruments, applications are graded by the following system: 1 = poor, 2 =
average, 3 = good, 4 = very good and 5 = excellent.
The significantly more detailed online assessment forms for EUROCORES Full Proposals use
a four-rank grading system.
For all instruments, applications are ranked.
For a EUROCORES Programme, the ranking takes place in the second stage of the peer
review process, by the Review Panel, and on the basis of online assessment by external
referees.
Application evaluation criteria
Research proposal
1 = Relative significance of the contribution that the research proposal will make to the
research field; also specific to the ESF: scientific quality, feasibility, level of
multidisciplinarity, originality, budget estimation, collaboration, European added-value and
absence of overlap with existing projects.
2 = Broad aims and objectives of the research, proposal description, proposed schedule of
development of proposal, location of the research proposal within the current state of
research, bibliography in the research area, methodology, theoretical framework and
suitability of institution proposed
4 = Plans for publication and dissemination of the research results (Scientific quality)
EURYI has specific criteria including (not in order of preference): ground-breaking
character of research proposed, potential to improve the competitiveness of European
research in a global context, positioning in the international context of the field of
research, appropriateness of the chosen methods, capability and commitment to host the
applicant and the proposed research.
EUROCORES Programmes have as standard criteria: Scientific quality; level of (trans-)
disciplinary integration within the collaborative research project (CRP); qualification of the
applicants (suitability for this project; general international standing); level of collaboration
envisaged between the components of the CRP; feasibility (incl. suitability of the methods
selected); overlap with existing projects (or projects applied for); suitability of budget
items (“value for money”). Also additional specific criteria are added for each Programme.
Principal applicant
1 = International collaboration (level of scientific international relationships of the group),
previous awards/funding (quality of the applicant), qualification of the Chair persons,
steering committee members and team leaders in relation to the proposal
2 = Academic record and achievements
5 = Mobility of researchers
EURYI has specific criteria including (not in order of preference): quality of publications,
potential to become a world-class leader in the respective field of research, abilities as an
independent researcher, scientific background and track record, potential for research
team leadership and project management, extent and quality of international research
collaboration and an internationally recognised level of excellence.
For specific EUROCORES criteria, see above.
Feedback
Independent reviewers remain anonymous; however their written evaluations and grades
are available to applicants of funding instruments.
For most instruments (not for Exploratory Workshops), written summaries of assessments
are provided to applicants. Discussions of the applications by assessment panel members
are recorded and if requested are released to applicants.
For EUROCORES Programmes, the identity of Review Panel members is public. Written
evaluations and the final ranking are released to applicants. Applicants also have access to
the anonymised online assessments by external referees, to which they can write a rebuttal,
which will be considered at the Review Panel meeting that proceeds to the final ranking of
proposals.
The evaluation process and the processing of referee data are subject to the rules set by
the French CNIL (“Commission nationale de l’informatique et des libertés”).
Feedback to reviewers
A thank you letter and a list of the final funded projects are communicated to reviewers.
Appeals process
No appeal process is in place at present.
Cost of peer review process
This data was not obtainable.
HERA Partner
Research Foundation Flanders (FWO)
Fonds voor Wetenschappelijk Onderzoek – Vlaanderen
Organisation Overview
Description
FWO is Flanders' instrument for supporting and stimulating fundamental research and
advancing scientific quality.
Strategy
The FWO promotes new knowledge and supports human capital, which enables goal-oriented, applied, technological and strategic research.
Funding Instruments
The FWO operates two main research funding instruments entitled Mandates and Projects.
Funding
The total budget allocated to funding humanities research was €18 million in 2003.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, Fund research excellence, Impartial process
3 = Transparent process
5 = Make funding recommendations, International benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all research funding instruments.
Methods
International and domestic reviewers and international assessment panels are employed
by the FWO.
Independent Reviewers
All FWO research funding instruments are evaluated by 2 independent reviewers.
Reviewers do not receive monetary compensation; they complete a standard evaluation
form and applications are graded.
International Panels
All research funding instruments use panels composed exclusively of international
assessors. Each panel has approximately 14 members. A standard evaluation form is
utilised by all panel members and applications are graded and ranked.
Number of reviewers
On average each application is assessed in detail by 2 reviewers.
Process for selecting reviewers
Applicants are requested to recommend appropriate reviewers to evaluate their research
proposals.
Criteria for selection of reviewers
1 = Academic excellence, Relevant disciplinary (subject area) competence
5 = Previous peer review experience, Encourage young academics, Create gender balance
Reviewer recognition
The FWO lists reviewers on its website and in its yearbook, and reviewers are also recognised
with a medal of honour at the end of their service. Reviewers may also indicate their
participation on their Curriculum Vitae.
Reviewer Training
The FWO does not offer training to its reviewers.
Conflict of interest
The FWO has a ‘conflict of interest’ policy stating that referees may not be a member
of the Board of Referees or the Board of Trustees, or have been a co-author of the applicant
during the last 3 years. (Board of Referees members may also not evaluate a member of
their own team.)
Grading system
The FWO grades and ranks the applications. The grading system is as follows: Mandates
A+, Rank 1, 2, 3, rejected and not proposed.
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, Proposal description, Proposed schedule of
development of proposal, Methodology
2 = Location of the research proposal within the current state of research, Relative
significance of the contribution that the research proposal will make to the research field,
Theoretical framework
3 = Suitability of institution proposed
4 = Bibliography in the research area
5 = Plans for publication and dissemination of the research results
Principal applicant
1 = Academic record and achievements
2 = International collaboration, Mobility of researchers
5 = Previous awards/funding
Feedback
Independent reviewers' written evaluations or grades are not available to applicants of
funding instruments; however, the independent reviewer does not remain anonymous.
Applicants are not permitted access to an assessment panel member's written evaluation;
however, the grade awarded to their application and the identity of the assessment panel
members are released.
The evaluation process is subject to a ‘Freedom of Information’ policy.
Feedback to reviewers
Feedback is not available to reviewers.
Appeals process
An appeal process is not in operation.
Cost of peer review process
This data was not obtainable.
HERA Partner
The Icelandic Centre for Research (RANNIS)
Rannsóknamiðstöð Íslands
Organisation Overview
Description
RANNÍS reports to the Ministry of Education, Science and Culture and serves the Icelandic
science community across all fields of science and humanities.
Strategy
The mission of RANNÍS is to provide professional assistance to the preparation and
implementation of science and technology policy in Iceland.
Funding Instruments
RANNÍS administers a number of funds, including the Research Fund, and thematic
programmes. The Research Fund operates three separate research funding instruments:
Project grant, Post-Doctoral grant and Grant of Excellence. The Research Fund funds
responsive mode research exclusively; it does not fund thematic or targeted research
programmes.
Research funding
The total annual budget of The Research Fund is approximately €7 million, of which 20-25% (€1.5 million) is allocated for research in the humanities and social sciences.
Peer Review Process
Peer Review objectives
1 = Fund research excellence, transparent and impartial process
2 = International benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments.
Methods
RANNÍS employs domestic reviewers and domestic assessment panels for all instruments,
with the exception of the Grant of Excellence, which utilises international reviewers and a
domestic assessment panel. RANNÍS does not use panels of international assessors.
Independent Reviewers
All research funding instruments are evaluated by two independent domestic reviewers
(mail reviewers). They receive monetary compensation of €50 per application reviewed; a
standard evaluation form is completed and applications are graded.
Review/assessment Panels
All funding instruments use this method. Panels consist of 7 members. The number of
panel members is constant regardless of the funding instrument and the gender balance is
approximately 60% male to 40% female.
All panel members receive a flat fee of €40 per application reviewed and an agreed hourly
rate is paid for their time. A standard evaluation form is utilised by assessors and
applications are ranked, but not graded.
International Panels
International Panels are not employed by RANNÍS.
Number of reviewers
The number of mail reviewers per application is 2.
Process for selecting reviewers
The review panel nominates independent reviewers.
Criteria for selection of reviewers
1 = Academic excellence, Relevant disciplinary (subject area) competence
2 = Previous peer review experience
3 = Create gender balance
4 = Encourage young academics
Reviewer recognition
Reviewers occasionally note their participation on their Curriculum Vitae.
Reviewer Training
Reviewers do not receive training from RANNÍS.
Conflict of interest
RANNÍS has a ‘conflict of interest’ policy which encompasses both national law provisions,
such as family connections, and organisational criteria, including professional connections
and academic competition.
Grading system
RANNÍS adheres to a grading system of: I, II, III, IV and V. Applications are graded and
ranked according to this system.
Application evaluation criteria
Research proposal
1 = Location of the research proposal within the current state of research, Relative
significance of the contribution that the research proposal will make to the research field,
Methodology, Suitability of institution proposed
2 = Broad aims and objectives of the research, Proposal description, Theoretical
framework
3 = Bibliography in the research area, Plans for publication and dissemination of the
research results
4 = Proposed schedule of development of proposal
Principal applicant
1 = Academic record and achievements
2 = Previous awards/funding
3 = International collaboration
4 = Mobility of researchers
Feedback
The independent reviewer's written evaluation, grade and identity are available to applicants
of funding instruments on request.
Assessment panel’s written evaluations and grades awarded are released to applicants.
Discussions of the applications by assessment panel members are not recorded. A
‘Freedom of Information’ policy is in place, whereby all documents relevant to the decision
to fund or not fund are accessible to the applicant involved.
Feedback to reviewers
Feedback is not provided to reviewers.
Appeals process
No process is in place at present.
Cost of peer review process
It is estimated that the total cost of the peer review process for one round of applications,
including all the instruments, is €28,000, with each application costing approximately €350
to evaluate. This figure includes only the fees for review panel members and independent
reviewers.
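As a back-of-the-envelope consistency check (assuming the per-application figure is an
average over the round, which the text does not state explicitly), these two figures imply
roughly 80 applications per round:
\[ \frac{28\,000}{350} = 80 \]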
HERA Partner
Irish Research Council for the Humanities and Social Sciences (IRCHSS)
An Chomhairle um Thaighde sna Dána agus sna hEolaíochtaí Sóisialta
Organisation Overview
Description
The IRCHSS funds post-graduate research in the Humanities, Social Sciences, Business
and Law.
Strategy
The IRCHSS is dedicated to the funding of leading-edge research in the humanities, social
sciences, business and law with the objective of creating new expertise beneficial to
Ireland’s development as a dynamic knowledge society.
Funding Instruments
The IRCHSS operates five different research funding instruments:
1. Post-Graduate Scholarship scheme – funds masters or doctoral degree by research for a maximum of three years
2. Post-Doctoral Fellowship scheme – awards funding for a maximum of two years
3. Research Fellowship scheme – awards research leave for senior academics
4. Senior Research Fellowship scheme – awards research leave for senior academics
5. Thematic Research Project scheme – funds projects for a maximum of three years in the following thematic areas:
Theme 1: Research infrastructures in the humanities and social sciences
Theme 2: Identity, culture and society in Europe
Theme 3: Innovation and society
Theme 4: Public policy and social change
Research funding
The IRCHSS receives funding of €8 million per year, of which approximately €7.5 million is
directly utilised for the funding of research. Of this, €1 million (14%) is awarded to thematic
research funding and the remaining €6.5 million (86%) to responsive research funding.
Peer Review Process
Peer Review objectives
The objectives of the IRCHSS include: evaluate applications, make funding
recommendations, fund research excellence, a transparent and impartial process, and
international benchmarking.
Peer Review Process Overview
A one-phase peer review process is utilised by the IRCHSS for all research funding
instruments.
Methods
The IRCHSS employs a number of methods of peer review depending on the funding
instrument. The Post-Graduate Scholarship scheme exclusively utilises an international
assessment panel. The Senior schemes (Post-Doctoral Fellowship, Research Fellowship and
Senior Research Fellowship schemes) employ independent international reviewers followed
by an international assessment panel. The Thematic Research Project scheme is evaluated
exclusively by an international assessment panel.
Independent Reviewers
The senior research funding instruments employ independent (postal) reviewers. Each
application is evaluated by one independent reviewer who receives a fee of €65 per
application reviewed. A standard evaluation form is completed and applications are not
graded.
Domestic/Research Council member Assessment Panels
The IRCHSS does not operate Domestic/Research Council member Assessment Panels.
International Assessment Panels
International Assessment Panels are exclusively employed by the IRCHSS for all research
funding instruments. Panels are composed of approximately 6 members (Thematic Research
Project scheme), 12-15 members (Senior schemes) and a maximum of 25 members
(Post-Graduate scheme); panel size depends on the funding instrument, the number of
applications received and the range of disciplines represented. The gender balance is
approximately 70% male to 30% female. Panel members are paid a flat fee of €1,000 plus
travel and accommodation expenses. Assessment panel members complete evaluation
forms for the Post-Graduate and Thematic Research Project schemes, but not for the Senior
schemes. All applications are graded and ranked by the panel.
Number of reviewers
On average, each application is assessed in detail by 2-3 reviewers.
Process for selecting reviewers
Council members nominate and select appropriate reviewers and reviewers are also
sourced individually by Research Council staff, for example via relevant contacts, networks
or the internet.
Criteria for selection of reviewers
1 = Academic excellence, relevant disciplinary (subject area) competence
2 = Create gender balance, encourage young academics
3 = Previous peer review experience
Reviewer recognition
Reviewers do not receive formal recognition of their participation in the peer review
process.
Reviewer Training
Reviewers do not receive training from the IRCHSS.
Conflict of interest
The IRCHSS has a conflict of interest policy in operation, in which International
Assessment Panel members verbally agree not to contribute to the discussion of an
application where there is a personal, professional or other conflict of interest; if such a
conflict arises, the panel member agrees to withdraw voluntarily from the discussion.
Grading system
The IRCHSS adheres to a grading system of: A++, A+, A, B, NR (Not Recommended).
Applications are graded and ranked during the international assessment panel meetings.
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, proposal description, location of the
research proposal within the current state of research and the relative significance of the
contribution that the research proposal will make to the research field
2 = Methodology, theoretical framework, proposed schedule of development
3 = Bibliography in the research area, plans for publication and dissemination of the
research results
4 = Suitability of institution proposed
Principal applicant
1 = Academic record and achievements
2 = Previous awards/funding
3 = International collaboration
4 = Mobility of researchers
Feedback
Evaluation forms completed by independent reviewers are released to applicants on
request; however the independent reviewer’s identity remains anonymous.
The identity of the assessment panel members and the grades assigned to applications are
released to applicants.
The evaluation process is subject to a national ‘Freedom of Information’ Act; therefore
discussions of the panel are recorded and released to applicants on request.
Feedback to reviewers
The final funding decisions are not formally communicated to reviewers.
Appeals process
A formal appeal process is currently not available to unsuccessful applicants.
Cost of peer review process
The total overall cost of an average research funding instrument peer review process is
approximately €40,000.
HERA Partner
Ministry for Higher Education, Science and Technology (MHEST)
Slovenia
Ministrstvo za visoko šolstvo znanost in tehnologijo (MVZT)
Organisation Overview
Description
The Ministry of Higher Education, Science and Technology operates in the areas of higher
education, research, technology, metrology and the promotion of the information society.
The Scientific Research Council for the Humanities (SRCH) operates within the Slovenian
Research Agency (under a new law; before 2004 it operated under the Ministry) and is
responsible for all humanities research.
It is noted that the Slovenian Research Agency (ARRS) was established by the MHEST in
October 2004. The Agency carries out its legal duties in the public interest by providing
permanent, professional and independent decision-making on the selection of programmes
and projects financed from the state budget and other financial sources.
Strategy
The main objectives of the MHEST include:
1. Defining the strategic goals of research and development in Slovenia
2. Standardising the higher education system with guidelines from the Bologna agreement
3. Promoting a favourable macroeconomic environment for innovation
4. Establishing networks to link educational, cultural, research and development spheres
Funding Instruments
The MHEST, together with the Slovenian Research Agency, operates the following research
funding instruments: ‘Research programmes’ (a thematic funding instrument), ‘Research
projects’ (a responsive funding instrument) and instruments that fund aim-oriented research
and junior researchers.
Research funding
The estimated overall annual budget of the Ministry of Higher Education, Science and
Technology and the Slovenian Research Agency is €10 million.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, Fund research excellence, transparent and impartial process
2 = make funding recommendations
3 = international benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments.
Methods
The MHEST together with the Slovenian Research Agency employs both independent
domestic and international reviewers for all research funding instruments. An assessment
panel of the SRCH is used regularly. This board is responsible for the recommendations
and final approval of the thematic and responsive research projects.
Independent Reviewers
All funding instruments employ both independent domestic and international reviewers,
with approximately 3 reviewers per application evaluated. Reviewers do not receive
financial compensation for evaluating applications. A standard evaluation form is
completed by reviewers and all applications are graded.
Research Council member panel
An assessment panel consisting of Research Council members is regularly employed. The
panel consists of 14 members, with a gender balance of 65% male to 35% female.
Assessment panel members receive financial compensation and travel expenses. Panel
members complete a standard evaluation form for projects (responsive), but not for
programmes (thematic). Panel members do not grade or rank applications.
International Panels
Exclusive international panels are not utilised by the MHEST and the Slovenian Research
Agency.
Number of reviewers
The average number of reviewers per application is 3.
Process for selecting reviewers
Reviewers are sourced individually by Research Council staff via contacts, research
networks and the internet. Applicants are also requested to recommend appropriate
reviewers.
Criteria for selection of reviewers
1 = Academic excellence, relevant disciplinary (subject area) competence
2 = Previous peer review experience
3 = Create gender balance
4 = Encourage young academics
Reviewer recognition
Reviewers do not receive formal recognition of their participation from the MHEST or
Slovenian Research Agency.
Reviewer Training
Reviewers do not receive training from the MHEST or Slovenian Research Agency.
Conflict of interest
A ‘conflict of interest’ policy exists, stating that reviewers are not permitted to evaluate
applications from their own research group.
Grading system
Applications are graded and ranked. The grading system is operated on a scale of 0%
(poor) to 100% (excellent).
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, Bibliography in the research area, Relative
significance of the contribution that the research proposal will make to the research field,
Methodology, Plans for publication and dissemination of the research results
2 = Proposal description, Theoretical framework
3 = Proposed schedule of development of proposal
5 = Location of the research proposal within the current state of research
Principal applicant
1 = Academic record and achievements
2 = International collaboration, Mobility of researchers
3 = Previous awards/funding
Feedback
Independent reviewers' written evaluations and grades are available to applicants of
funding instruments, while the identity of the reviewer remains anonymous to the applicant.
Assessment panel members' written evaluations are released to the applicants, and the
members' identities remain anonymous. Assessment panel meeting discussions are
recorded, but this information is not released to applicants. The MHEST and Slovenian
Research Agency are subject to a ‘Freedom of Information’ policy, as defined by the
“Regulations of the organisation and competences of expert bodies”.
Feedback to reviewers
Feedback on the peer review process is provided to reviewers via the published annual
report.
Appeals process
An appeals process is open to unsuccessful applicants, provided they respond within 8 days
of being notified of the outcome of the peer review process.
Cost of peer review process
It is estimated that operating the entire humanities peer review process costs €20,850 per
annum.
HERA Partner
The Research Council of Norway (RCN)
Norges forskningsråd
Organisation Overview
Description
The Research Council of Norway is a national strategic body and funding agency for
research and innovation activities. The Research Council funds all fields of research and
innovation. The Department for the Humanities within the Division for Science is
responsible for the funding of humanities research.
Strategy
The Research Council of Norway’s strategy states that “Research expands frontiers”, and
includes goals such as enhancing quality in research, increasing research for innovation,
expanding the dialogue between research and society, increasing the internationalisation of
Norwegian research and fostering talent.
Research Funding Instruments
• Independent basic research projects (researcher initiated) – humanities
• Cultural research
• Language Technology
• Saami research
• Gender research
Research funding
The overall budget of RCN is €590 million and the budget specifically dedicated to the
humanities is €18 million.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, Fund research excellence
2 = Impartial process
3 = Transparent process, International benchmarking
4 = Make funding recommendations
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments.
Methods
Both independent domestic and international reviewers are employed by the RCN. (The
RCN/Department for the Humanities does not use assessment panels).
Independent Reviewers
Both independent domestic and international reviewers are employed by the RCN for both
responsive and thematic research funding instruments. Approximately 2-3 reviewers
evaluate every application and each receives a fee of €108. A standard evaluation form is
completed by reviewers and all applications are graded.
Domestic/Research Council member Panels
Domestic/Research Council member panels are not utilised by the RCN/Department for the
Humanities.
International Panels
International panels are not utilised by the RCN/Department for the Humanities.
Number of reviewers
The average number of reviewers per application is 2-3.
Process for selecting reviewers
Reviewers are selected from a database of domestic and international reviewers and, where
necessary, are also sourced individually by Research Council staff via contacts, research
networks and the internet. In addition applicants are requested to recommend appropriate
reviewers for their research proposal.
Criteria for selection of reviewers
1 = Academic excellence, impartiality
2 = Relevant disciplinary (subject area) competence
3 = Create gender balance
4 = Encourage young academics
Reviewer recognition
Reviewers do not receive formal recognition of their participation from the RCN.
Reviewer Training
Reviewers do not receive training from the RCN.
Conflict of interest
The RCN has a ‘conflict of interest’ policy in place, whereby reviewers are requested to sign
a declaration concerning impartiality.
Grading system
Applications are graded from 1-7, where 7 = the highest rating and 1 = the lowest rating.
Only those applications that have passed the threshold and are considered for funding are
ranked by the committee/programme board. This ranking is for internal use only and is not
provided as feedback to applicants.
Application evaluation criteria
In the RCN's appraisal form, reviewers are requested to evaluate applications on
approximately 12 points. The evaluation criteria vary for each funding instrument. The
criteria are not graded in terms of importance; however, the most important part of the
overall assessment is a statement of the most important strengths and weaknesses of the
project.
Feedback
Independent reviewers' written evaluations and grades are available to applicants of
funding instruments, while the identity of the reviewer remains anonymous to the applicant.
The RCN is subject to a ‘Freedom of Information’ policy.
Feedback to reviewers
No feedback on the peer review process is provided to reviewers.
Appeals process
A limited appeal process is in place, in that applicants are only permitted to appeal against
RCN administrative procedures, for example if an unsuccessful applicant argues that a
procedural error or misuse of authority occurred during the processing of applications.
Complaints may be made on the following grounds: 1) processing errors – a violation of the
provisions of the Public Administration Act or the Research Council's own regulations
regarding procedures for processing applications for research funding; and 2) misuse of
authority – a violation of general requirements regarding objectivity, or unreasonable
differential treatment.
Cost of peer review process
The cost per peer review round is estimated at approximately €85,000; this was calculated
from the average cost of evaluating 250 applications for the funding instrument
"Independent basic research projects".
HERA Partner
The Swedish Research Council
Vetenskapsrådet (VR)
Organisation Overview
Description
The Swedish Research Council has a division dedicated to the funding of humanities and
social sciences referred to as the ‘Scientific Council for the Humanities and Social
Sciences’.
Strategy
The Scientific Council for Humanities and Social Sciences assesses and prioritises basic
research in the humanities, social sciences, law and theology, and aims to create
conditions for better utilisation of research skills, leading to higher standards of quality and
innovation.
Funding Instruments
The Scientific Council for Humanities and Social Sciences operates research funding
instruments to fund research projects, domestic and international Postdoctoral fellowships,
Junior Research Faculty Positions, and also offers research publication grants.
Research funding
As part of its research funding agenda, the Scientific Council for the Humanities and Social
Sciences supports both government-initiated research and its own thematic research
programmes. A budget of €31 million is allocated annually to the Scientific Council; of the
research funded, 15% is thematic and 85% is responsive.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, Make funding recommendations, Transparent process
2 = Fund research excellence, Impartial process
3 = International benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all funding instruments.
Methods
Panels for all funding instruments are exclusively composed of domestic assessors, as the
majority of applications are submitted in the native language and domestic assessors have
a wide knowledge of the local scientific community.
In the majority of funding instruments, independent reviewers are also employed to
complement the panel of domestic assessors. The assessment panel members call for an
independent reviewer (domestic or international) if it is deemed that expertise is not
sufficient within the panel itself.
Independent Reviewers
Independent reviewers are employed to complement assessment panels of domestic
assessors.
Domestic Panels
Panels consist of 7-8 members; a fee of €60 is paid per application reviewed and travel
expenses are covered. A standard evaluation form is completed by assessors and
applications are graded and ranked.
International Panels
International panels are employed by the VR in certain calls that are common to the entire
organisation and span several fields of research, for example Centres of Excellence.
Number of reviewers
The average number of reviewers per application is 2.
Process for selecting reviewers
Reviewers are selected by Council Board members and are also sourced individually by
Research Council staff via contacts, research networks and the internet.
Criteria for selection of reviewers
1 = Academic excellence, Relevant disciplinary (subject area) competence
3 = Create gender balance
4 = Previous peer review experience
5 = Encourage young academics
Reviewer recognition
Reviewers state their participation in assessment panels on their Curriculum Vitae.
Reviewer Training
Reviewers do not receive training from the VR, but they do receive written instructions on
the peer review procedure.
Conflict of interest
The VR has a ‘conflict of interest’ policy in place, stating that, “Reviewers should not
review, in detail, projects in which they have a financial interest, family or friends, their
students or collaborators, or (usually) applicants from their own university institutions”.
Grading system
Applications are ranked and graded. Assessors grade applications from 1 to 7, where 1 =
insufficient quality and 7 = internationally competitive quality.
Application evaluation criteria
Research proposal
No data provided.
Principal applicant
No data provided.
Feedback
Independent reviewers' written evaluations and grades are available to applicants of
funding instruments on request.
Assessment panel members are not identified to applicants, however the collective
statements (discussions) issued by the assessment panel are available to applicants. A
‘Freedom of Information’ policy is in operation.
Feedback to reviewers
No feedback on the peer review process is provided to reviewers.
Appeals process
The VR does not have an appeals process in place; however complaints can be made
through legal channels.
Cost of peer review process
This data was not obtainable.
HERA Sponsoring Partner
Swiss National Science Foundation (SNSF)
Schweizerischer Nationalfonds zur Förderung der wissenschaftlichen Forschung (SNF)
Fonds national suisse de la recherche scientifique (FNS)
Organisation Overview
Description
The SNSF is Switzerland’s foremost institution in the promotion of scientific research. It
supports research in all disciplines. The main task of the SNSF is to evaluate the quality of
research proposals submitted by scientists and to provide funding on the basis of priorities
and available financial resources.
Strategy
The SNSF promotes independent free research to foster young scientific talent. It funds
interdisciplinary and problem-oriented research programmes which attempt to provide
scientifically sound solutions to problems of social significance and to develop scientific
centres of excellence.
Funding Instruments
The SNSF operates three main research funding instruments: Project Funding, Targeted
Research (NRP and NCCR) and the funding of individual scientists. Both responsive and
thematic research funding instruments are in operation.
Funding
The total budget allocated to funding humanities research is €25 million.
Peer Review Process
Peer Review objectives
1 = Evaluate applications, Make funding recommendations, Fund research excellence,
Impartial process
2 = Transparent process, International benchmarking
Peer Review Process Overview
A one-phase peer review process is utilised by all responsive mode research funding
instruments and a two-phase process is conducted via expert groups for targeted research
instruments.
Methods
International and domestic reviewers are utilised by the SNSF for responsive mode funding,
allowing a large pool of reviewers to be drawn upon. (The definitive rating and ranking
is carried out by the Research Council). Thematic research funding instruments are more
complex and employ both independent reviewers (domestic and international) and panels
of domestic and international assessors.
Independent Reviewers (either Research Council member, domestic or international
reviewer)
Both responsive and thematic research funding instruments are evaluated by independent
reviewers. There are approximately 2-6 reviewers per application. Reviewers do not
receive monetary compensation; they complete a standard evaluation form and
applications are not graded.
Research Council member assessment panel
Responsive mode research funding instruments utilise a panel of Research Council
members; this panel is composed of between 5 and 21 members (80% male to 20%
female) depending on the funding instrument. Research Council members are paid an
annual fee of €7,000 and their travel expenses are covered. Council members do not
complete a standard evaluation form and applications are graded.
Domestic and International assessment panel
Thematic research funding instruments use panels composed of domestic and international
assessors. Each panel has between 5 and 21 members and they receive financial
compensation. A standard evaluation form is not utilised by all panel members and
applications are graded.
International assessment panel
International assessment panels are only employed if a funding instrument awards a
certain level of funding; in this case, the thematic programme NCCR utilises this method.
On average, 10 members constitute the panel and they receive financial compensation for
their participation. Standard evaluation forms are not used and applications are graded
but not ranked.
Number of reviewers
On average each application is assessed in detail by 5 reviewers.
Process for selecting reviewers
Council members nominate/select appropriate reviewers and reviewers are also sourced
individually by Research Council staff via contacts, research networks and the internet.
Criteria for selection of reviewers
1 = Academic excellence, Previous peer review experience
2 = Relevant disciplinary (subject area) competence
3 = Encourage young academics, Create gender balance
Reviewer recognition
Reviewers do not receive formal recognition of their participation in the peer review
process.
Reviewer Training
The SNSF does not offer training to its reviewers.
Conflict of interest
The SNSF has a ‘conflict of interest’ policy that includes regulations to avoid a potential
bias.
Grading system
The SNSF grades and ranks applications. The grading system is as follows: A, AB, B, BC,
C, CD, D (CD and D are not recommended; BC and C imply a medium-range priority that
could potentially be approved if funds were available).
Application evaluation criteria
Research proposal
1 = Proposal description, Location of the research proposal within the current state of
research, Bibliography in the research area, Relative significance of the contribution that
the research proposal will make to the research field, Methodology, Theoretical framework
2 = Proposed schedule of development of proposal, Suitability of institution proposed
3 = Plans for publication and dissemination of the research results
Principal applicant
2 = Academic record and achievements
3 = International collaboration, Previous awards/funding
Feedback
Independent reviewers’ written evaluations are available to applicants of funding
instruments; however, the independent reviewer remains anonymous and the applicant’s
grade is not released.
Applicants are not permitted access to the grades assigned by assessment panel members.
Assessment panel members are not anonymous; their discussions are recorded at the
meeting and the notes are made available to applicants on request. The SNSF is not
subject to a ‘Freedom of Information’ policy.
Feedback to reviewers
Feedback is not available to reviewers.
Appeals process
An appeal process is in place and is operated by the Independent Federal Appeal
Commission.
Cost of peer review process
Approximately 1% of the total budget is allocated to the administration costs of the peer
review process (€250,000).
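For orientation only (an inference from the two figures quoted above, not a figure reported
by the SNSF), the implied total budget can be back-calculated as:
    total budget ≈ €250,000 / 0.01 ≈ €25 million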
HERA Sponsoring Partner
National Fund for Scientific Research (NFSR)
Fonds National de la Recherche Scientifique (FNRS)
Organisation Overview
Funding Instruments
The FNRS operates a number of funding instruments, entitled: Positions (research
positions within the FNRS), Impulse mandate, FRFC programmes and Post-Doctoral
Researcher. The FNRS does not operate thematic research funding instruments.
Peer Review Process
Peer Review objectives
1 = Evaluate applications (this is the first step of the selection process, the application
assessment is one of the most important responsibilities of the FNRS and is performed by
the Scientific Commission)
1 = Make funding recommendations (this is the second stage of the selection process also
performed by the Scientific Commission)
1 = Fund research excellence (this is the final step of the selection procedure, the funding
of research excellence is the most important aspect of the FNRS and is achievable thanks
to the high quality review ensured in the first two stages of the selection process)
1 = Impartial process
3 = Transparent process
4 = International benchmarking
Peer Review Process
Overview
A one-phase peer review process is utilised by all research funding instruments.
Methods
The FNRS employs assessment panels composed of both international and national
members. The international members of the panel evaluate the research proposals
themselves, and the national panel members then evaluate the relevance of the proposals.
Independent Reviewers
The FNRS does not employ independent reviewers.
Assessment panels of both Domestic and International members
All research funding instruments are evaluated by assessment panels consisting of both
domestic and international members. Panels are composed of 10 members: 5 from the
French-speaking community, 3 from the Flemish community and 2 from outside the
country. The gender ratio is approximately 85% male to 15% female.
Panel members are financially compensated, with French-speaking and Flemish assessors
each receiving €50 for their participation and international assessors receiving a fee of
€400 plus reimbursement of travel/accommodation expenses. A standard evaluation form is not
utilised by panel members and applications are graded.
International Panels
Panels exclusively composed of International members are not utilised by the FNRS.
Number of reviewers
On average each application is assessed by 10 reviewers.
Process for selecting reviewers
Council members are selected due to their expertise. New panel members are nominated
by the current members of the Scientific Commission and appointed by the Board of
Trustees.
Criteria for selection of reviewers
1 = Academic excellence, Relevant disciplinary (subject area) competence
2 = Previous peer review experience, Create gender balance
3 = Encourage young academics
Reviewer recognition
Reviewers are listed on the FNRS website and they may also indicate their participation on
their Curriculum Vitae.
Reviewer Training
Reviewers do not receive training from the FNRS.
Conflict of interest
The FNRS does not have a formal ‘conflict of interest’ policy in place.
Grading system
The FNRS grades applications as follows: A++, A+, A, B, NR (Not Recommended);
however it does not rank applications.
Application evaluation criteria
Research proposal
1 = Broad aims and objectives of the research, Location of the research proposal within
the current state of research, Relative significance of the contribution that the research
proposal will make to the research field, Methodology, Theoretical framework, Feasibility of
the project
2 = Proposal description, Proposed schedule of development of proposal, Bibliography in
the research area, Suitability of institution proposed, Plans for publication and
dissemination of the research results
Principal applicant
1 = Academic record and achievements
2 = Previous awards/funding, International collaboration, Mobility of researchers
Feedback
Assessment panel members’ written evaluations and grades are not available to applicants;
however, their identities are released. Discussions of the applications by assessment panel
members are recorded but not released to applicants. The evaluation is not subject to a
‘Freedom of Information’ policy.
Feedback to reviewers
The final funding decisions are not communicated directly to reviewers.
Appeals process
An appeal process is not in operation.
Cost of peer review process
This data was not obtainable.
HERA Humanities in the European Research Area
WP3 Task 3.2 Peer Review workshop
Hosted by Irish Research Council for the Humanities and Social Sciences
(IRCHSS) in Dublin Castle
Schedule
Thursday 17 November 2005
19.00          Welcome reception and dinner for participants
Friday 18 November 2005
9.30 – 17.00   Workshop
9.30 – 9.40    Opening welcome by Dr Maurice J. Bric, Chair, IRCHSS
9.40 – 10.10   Plenary address by Dr Sven Hemlin
10.10 – 10.30  Discussion
10.30 – 11.00  ‘Tour de table’ – short presentations by each HERA Work Package Leader
11.00 – 11.15  Coffee break
11.15 – 12.15  Short presentations continued
12.15 – 12.45  Plenary address by Professor Chris Caswill
12.45 – 13.00  Discussion
13.00 – 14.00  Lunch
14.00 – 15.00  Parallel Breakout Sessions
               Breakout Session 1: 'Balance between National and International Peer Review'
               Breakout Session 2: 'Evaluation Procedures and Criteria'
               Breakout Session 3: 'Identification and Selection of Experts'
               Breakout Session 4: 'Feedback Processes'
15.00 – 15.30  Tea/coffee break
15.30 – 16.30  Feedback from Breakout Sessions
16.30 – 17.00  Conclusions from Speakers 1 and 2
17.00          Tour of Dublin Castle
Speakers
Professor Chris Caswill is Visiting Fellow at the James Martin Institute at Oxford
University, Visiting Professor at Exeter University, and Senior Research Fellow at University
College, London. He is Senior Research Associate at the Interdisciplinary Centre for
Comparative Research in the Social Sciences in Vienna, Adviser to the Research Council of
Norway, and Senior Policy Adviser to the EU-funded NORFACE ERA-NET project. Until the
end of 2003, he was Director of Research at the ESRC. His research interests are in
science policy, European research policy, the application of principal-agent theory and
interactive social science.
Publications:
• Social science policy: challenges, interactions, principals and agents, Science and
Public Policy, 25 (5), 1998 (with E. Shove)
• Introducing interactive social science, Science and Public Policy, Special Issue, 27
(3), 2000
• Principals, agents and contracts, Science and Public Policy, Special Issue, 30 (5),
2003
• Old games, old players – new rules, new results – the ERA as European science
policy, in Changing Governance of Research and Technology Policy: the European
Research Area (eds J. Edler, S. Kuhlmann and M. Behrens, Edward Elgar, 2003)
Dr. Sven Hemlin is a Senior Lecturer at the Department of Psychology, Göteborg
University, Visiting Research Fellow at SPRU, University of Sussex, Visiting Professor at the
Department for Management, Politics and Philosophy, Copenhagen Business School, and
Acting Director for the Centre for Research Ethics, The Sahlgrenska Academy, Göteborg
University. Dr. Hemlin’s research interests include: cognitive and social psychology based
science studies, research ethics, research policy, R&D management and research
evaluation studies.
Publications:
• Social studies of the humanities. A case study of research conditions and
performance in Ancient History and Classical Archaeology and English. Research
Evaluation, 10, 53-61 (1995)
• Research production in the arts and humanities. A questionnaire study of factors
influencing research performance. Scientometrics, 37, 417-432 (with Gustafson, M.)
(1996)
• (Dis)agreement in peer review. In P. Juslin & H. Montgomery (Eds.), Judgment
and Decision Making: Neo-Brunswikian and process-tracing approaches (pp. 275-301).
Mahwah, New Jersey: Lawrence Erlbaum Associates, Inc. (1999)
• Creative knowledge environments. Edward Elgar Publishing, UK (Eds., with
Allwood, C.M. and Martin, B.) (2004)
• The shift in academic quality control. Science, Technology & Human Values (with
Rasmussen, S.B.) (forthcoming)
HERA Peer Review Workshop
18 November 2005
List of Participants
Ms Faye Auty
email: [email protected]
Arts and Humanities Research Council
Whitefriars
Lewins Mead
Bristol BS1 2AE
United Kingdom
Mr Tom Boland
email: [email protected]
Chief Executive Officer
Higher Education Authority
Marine House
Clanwilliam Terrace
Dublin 2
Ireland
Dr Annemarie Bos
email: [email protected]
Netherlands Organisation for Scientific Research
Council for the Humanities
P.O. Box 93425
2509 AK The Hague
The Netherlands
Dr Maurice J Bric, MRIA
email: [email protected]
Irish Research Council for the Humanities and Social Sciences
First Floor, Brooklawn House
Shelbourne Road
Ballsbridge, Dublin 4
Ireland
Dr Louise Byrne
email: [email protected]
Office of the Chief Science Adviser to the Government
Wilton Park House
Wilton Place
Dublin 2
Ireland
Dr Marc Caball
email: [email protected]
Irish Research Council for the Humanities and Social Sciences
First Floor, Brooklawn House
Shelbourne Road
Ballsbridge, Dublin 4
Ireland
Dr Anne Cody
email: [email protected]
Health Research Board
73 Lower Baggot Street
Dublin 2
Ireland
Mr Tim Conlon
email: [email protected]
Irish Research Council for the Humanities and Social Sciences
First Floor, Brooklawn House
Shelbourne Road
Ballsbridge, Dublin 4
Ireland
Dr Jane Conroy
email: [email protected]
French Department
National University of Ireland, Galway
Ireland
Ms Patricia Cranley
email: [email protected]
Health Research Board
73 Lower Baggot Street
Dublin 2
Ireland
Ms Julie Curley
email: [email protected]
Irish Research Council for the Humanities and Social Sciences
First Floor, Brooklawn House
Shelbourne Road
Ballsbridge, Dublin 4
Ireland
Dr Brendan Curran
email: [email protected]
Health Research Board
73 Lower Baggot Street
Dublin 2
Ireland
Mr Carl Dolan
email: [email protected]
International Affairs Manager (HERA)
Arts and Humanities Research Council
Whitefriars
Lewins Mead
Bristol BS1 2AE
United Kingdom
Dr phil Margrét Eggertsdóttir
email: [email protected]
Stofnun Árna Magnússonar
Árnagarði við Suðurgötu
IS-101 Reykjavík
Iceland
Professor Elisabet Engdahl
email: [email protected]
Department of Swedish
Göteborg University
Box 200
S 405 30 Göteborg
Sweden
Dr Adolf Filáček
email: [email protected]
Academy of Sciences Czech Republic
Division of Humanities and Social Sciences
Národni 3
CZ-117 20 Prague 1
Czech Republic
Professor Audrone Glosiene
email: [email protected]
Institute of Library and Information Science
Vilnius University
Universiteto 3, LT-01513 Vilnius
Lithuania
Dr Torunn Haavardsholm
email: [email protected]
Director
The Research Council of Norway
P.O. Box 2700 - St. Hanshaugen
N-0131 Oslo
Norway
Ms Gillian Hastings
email: [email protected]
Health Research Board
73 Lower Baggot Street
Dublin 2
Ireland
Professor Arne Jarrick
email: [email protected]
Deputy Secretary General
Swedish Research Council
SE-103 78 Stockholm
Sweden
Dr Lena Johansson de Château
email: [email protected]
Research Officer
Swedish Research Council
Humanities and Social Sciences
Regeringsgatan 56
S-103 78 Stockholm
Sweden
Dr Grete Kladakis
email: [email protected]
Director
Danish Research Agency
Research Council for the Humanities
Artillerivej 88
DK-2300 Copenhagen
Denmark
Dr Ruediger Klein
email: [email protected]
Humanities Unit
European Science Foundation (ESF)
1, quai Lezay-Marnesia
F-67000 Strasbourg
France
Dr Davor Kozmus
email: [email protected]
Ministry for Higher Education, Science and Technology
Trg OF 13
1000 Ljubljana
Slovenia
Professor Kristin Kuutma
email: [email protected]
University of Tartu
Linda 5-6
10422 Tallinn
Estonia
Mag Monica Maruska
email: [email protected]
Austrian Science Fund (FWF)
Weyringergasse 35
A-1040 Vienna
Austria
Professor Elizabeth Meehan
email: [email protected]
Council Board member
Irish Research Council for the Humanities and Social Sciences
Brooklawn House
Shelbourne Road
Ballsbridge, Dublin 4
Ireland
Dr Kustaa Multamäki
email: [email protected]
Academy of Finland
POB 99
FI-00501 Helsinki
Finland
Professor Arto Mustajoki
email: [email protected]
Department of Slavonic, Baltic Languages and Literatures
P.O. Box 24
00014 University of Helsinki
Finland
Ms Ülle Must
email: [email protected]
Archimedes Foundation
Väike-Turu 8
51013 Tartu
Estonia
Dr H.J.W. (Jan) Nap
email: [email protected]
Netherlands Organisation for Scientific Research
Council for the Humanities
P.O. Box 93425
2509 AK The Hague
The Netherlands
Dr Rudolf Novak
email: [email protected]
Austrian Science Fund (FWF)
Weyringergasse 35
A-1040 Vienna
Austria
Dr Cathal Ó Domhnaill
email: [email protected]
Science Foundation Ireland
Wilton Park House
Dublin 2
Ireland
Professor Jaroslav Pánek
email: [email protected]
Academy of Sciences Czech Republic
Národni 3
CZ-117 20 Prague 1
Czech Republic
Ms Solbjørg Rauset
email: [email protected]
Senior Advisor
The Research Council of Norway
P.O. Box 2700 - St. Hanshaugen
N-0131 Oslo
Norway
Professor Eda Sagarra
email: [email protected]
Department of Germanic Studies
Trinity College Dublin
Dublin 2
Dr Eiríkur Smári Sigurðarson
email: [email protected]
Icelandic Centre for Research
Laugavegi 13
IS-101 Reykjavik
Iceland
Dr Monique van Donzel
email: [email protected]
European Science Foundation
1 Quai Lezay-Marnésia
F-67080 Strasbourg
France
Dr Giedrius Viliunas
email: [email protected]
Department of Lithuanian Literature
Vilnius University
Universiteto 3
LT-01513 Vilnius
Lithuania
Glossary of Terms
Research Funding Instrument
Refers to the actual funding mechanisms operated by the funding organisation. Also
referred to as research funding method, programme, scheme, grant, award, and
scholarship/fellowship.
Organisation
Refers to the actual body that operates and administers the research funding (for
example: research council, research agency)
Thematic Research
Research funding instruments, whereby research proposals submitted must fall within
specific pre-determined themes. Also referred to as top-down, targeted or strategic
research.
Responsive Research
Research funding instruments, whereby research proposals submitted are not restricted to
specific themes; applicants are unlimited in their choice of research area. Also referred to
as bottom-up research.
Reviewer
Individual expert employed to evaluate an application (research proposal) to a research
funding instrument. Also referred to as an expert, reader or assessor.
Independent Reviewers
Experts selected to evaluate individually an application (independent
reviewers/panel). Also referred to as postal reviewers or external reader.
of
other
Assessment Panel
Actual meeting held with several assessors present, whereby applications to research
funding instruments are discussed and recommendations for funding are made. Also
referred to as assessment board or expert board/panel. Assessment panel members are
referred to as assessors.
Referee
Person who provides a verbal or written reference in support of an individual applicant or
application for research funding.
Non-nationals
Persons who are not citizens of the nation-state in which they have submitted a proposal
for funding.
Work Package: 3
Task: 3.1/3.2
Questionnaire: Survey of partner Research Councils – Application procedures and Peer
Review processes
Task Leader: Irish Research Council for the Humanities and Social Sciences (IRCHSS)
Objective: This questionnaire will form the basis of the report on best practice in
application procedures and enable the provision of an overview of the procedures and
processes of each partner Council. The peer review section of the questionnaire will
advance Task 3.2, collating the information necessary for the peer review workshop and
subsequent report.
Please note that all questions relate to the funding of HUMANITIES ONLY
Please note that this questionnaire is intended to provide an overview of
procedures and practices in partner Research Councils.
It is therefore acceptable to provide general data and not detailed information in
respect of questions. Please insert additional rows/space as necessary.
Contents
Section 1  Organisation information .................... 4
  Q1  Name ............................................. 4
  Q2  Contact details .................................. 4
Section 2  Definition of terms ......................... 4
  Q3  Describe terms ................................... 4/5
Section 3  Call for applications ....................... 5
  Q4  Call for applications timeframe .................. 5
  Q5  Announcement of calls ............................ 5
Section 4  Application Process ......................... 6
  Q6  Application process for responsive research ...... 6
  Q7  Application process for thematic research ........ 6
  Q8  Timeframe for application process ................ 7
  Q9  Application procedure ............................ 8
  Q10 Language ......................................... 8
  Q11 Eligibility ...................................... 8/9
  Q12 References ....................................... 9
  Q13 Success rate ..................................... 9
  Q14 Applications received ............................ 10/11
Section 5  Peer Review ................................. 12
  Q15 Peer review objectives ........................... 12
  Q16 Overview of process .............................. 12
  Q17 Responsive research instruments .................. 12/13
  Q18 Thematic research instruments .................... 13
  Q19 Independent reviewers ............................ 13/14
  Q20 Domestic/Council member panels ................... 14/15
  Q21 International panels ............................. 15/16
  Q22 International panels criteria .................... 16
  Q23 Reviewers per application ........................ 16
  Q24 Reviewer selection process ....................... 16
  Q25 Reviewer selection criteria ...................... 17
  Q26 Reviewer recognition ............................. 17
  Q27 Reviewer training ................................ 17
  Q28 Conflict of interest ............................. 17/18
  Q29 Rating and ranking ............................... 18
  Q30 Evaluation criteria .............................. 18/19
  Q31 Feedback to applicants ........................... 19/20
  Q32 Feedback to reviewers ............................ 20
  Q33 Appeals process .................................. 20
  Q34 Cost of process .................................. 20
Section 6  Finalisation of funding awards .............. 20
  Q35 Final procedure .................................. 20/21
  Q36 Contract ......................................... 21
  Q37 Payment of funds ................................. 21
  Q38 Public informed .................................. 21
SECTION 1: Organisation Information
Q1 Name of organisation
English language (name and acronym):
Native language (name and acronym):
Q2 Contact details
Address:
Telephone:
Fax:
E-mail:
Website: http://www.
Contact personnel:
  First name:
  Surname:
  Title:
  Position:
SECTION 2: Definition of terms
Q3 Please supply definitions of the following terms
Post-graduate student ____________________________________________
_________________________________________________________________
PhD student _____________________________________________________
_________________________________________________________________
Post-Doctoral researcher __________________________________________
_________________________________________________________________
Principal Investigator ____________________________________________
_________________________________________________________________
Project Leader ___________________________________________________
_________________________________________________________________
Programme ______________________________________________________
_________________________________________________________________
Thematic Research funding ____________________________________
__________________________________________________________
Please include additional terms and definitions if necessary
__________________________________________________________
SECTION 3: Call for applications
Q4 Call for applications timeframe
Please complete the following table, by supplying the following details:
- Title of each funding instrument
- Tick whether the call is continual or fixed
- No. (number) of calls per year (if applicable)
- Date the call is launched (month, year)
- Closing date (deadline for submission of applications) (month, year)
- Time-period (number of weeks/months) to submit application
Instrument title | Continual call | Fixed call | No. calls per year | Date call launched | Closing date | Time period
Q5 How does the organisation announce the call?
Please tick the appropriate box
… Organisation website
… Mail-shot (a dedicated email sent to a relevant database of interested parties)
… National press
… Regional press
… Email to key individuals
… Site visits to relevant institutions
… Other (please state) _____________
SECTION 4: Application Process
Q6 Overall application process employed for responsive research funding
instruments
Please tick the appropriate box
… One-phase process
Applications are submitted in full and reviewed
… Two-phase process
Applicants submit a brief summary of their research proposal; this is assessed,
and the best applications are invited to submit a detailed outline of their research
proposal
Q7 Overall application process employed for thematic research funding
instruments
Please tick the appropriate box
… One-phase process
Applications are submitted in full and reviewed
… Two-phase process
Applicants submit a brief summary of their research proposal; this is assessed,
and the best applications are invited to submit a detailed outline of their research
proposal
Q8 Timeframe for total application process
Please indicate the average timeframe (in months) per research funding instrument
If no independent reviewers or assessment panels are utilised, please insert n/a (not applicable) in the relevant box
Programme title | Call launched to Closing date | Closing date to Completion of
application processing | Completion of application processing to Completion of
independent reviewer stage | Completion of independent reviewer stage to Completion of
assessment panel meeting (final evaluation stage) | Final evaluation to Final decision
making | Final decision making to Funding awarded
____________ | ____ months | ____ months | ____ months | ____ months | ____ months | ____ months
(please complete one row per research funding instrument)
Please note any additional comments
Q9 How do applicants apply to funding instruments?
Please tick the appropriate box
… Submit paper application by post
… Submit application via email
… Electronic online application process
If an electronic online process is in place, please indicate:
Date of implementation ____________________
If an electronic online process is not in place, please tick where appropriate
It is planned within the next five years …
There are no current plans …
Q10 Language
Please tick the appropriate boxes
Call for proposals
… National language
Please state__________________
… English
… Other language (please state)
________________________
… Other language (please state)
________________________
Applications submitted in
… National language
Please state___________________
… English
… Other language (please state)
________________________
… Other language (please state)
________________________
Q11 Are non-nationals eligible to apply for funding?
Please complete a table for each funding instrument
Title of funding instrument: ______________________
… No, non-nationals are ineligible
… Yes, non-nationals are eligible
… Yes, open to EU nationals only
… Yes, open to international nationals
… Yes, open to EU nationals, if resident in state for ______ number of years
… Yes, open to international nationals, if resident in state for ______ number of years
… Other criteria (please state) ________________________________
Title of funding instrument: ______________________
… No, non-nationals are ineligible
… Yes, non-nationals are eligible
… Yes, open to EU nationals only
… Yes, open to international nationals
… Yes, open to EU nationals, if resident in state for ______ number of years
… Yes, open to international nationals, if resident in state for ______ number of years
… Other criteria (please state) ________________________________
Title of funding instrument: ______________________
… No, non-nationals are ineligible
… Yes, non-nationals are eligible
… Yes, open to EU nationals only
… Yes, open to international nationals
… Yes, open to EU nationals, if resident in state for ______ number of years
… Yes, open to international nationals, if resident in state for ______ number of years
… Other criteria (please state) ________________________________
Q12 Do applicants supply references with their applications?
Please tick the appropriate box
… Yes, written reference(s) is(are) included
… Yes, referee is identified by the applicant and contacted by the organisation
… No
If yes, how many references are required to be submitted per application ______
Q13 Success Rate
Please indicate the % on an annual basis
If 100% = total number of applications received for all funding instruments
Then ______% = number of applications that are actually funded
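For illustration only (hypothetical figures, not drawn from any partner Council), the
calculation implied here is:
    success rate = (applications funded / applications received) × 100
    e.g. 100 / 400 × 100 = 25%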
Q14 Applications received per annum
Please complete the following table:
Include details per funding instrument operated: title of funding instrument, number and % of applications received per year (with gender
breakdown where available), number and % of applications that received funding (with gender breakdown where available).
*T = total
*M = male
*F = female
*Mean = average number and % of applications
*% = percentage
*No. = number
****PLEASE NOTE THAT ONLY ESTIMATES ARE REQUIRED HERE
Title of funding instrument: ______________________
Columns: 2003 (T / M / F) | 2004 (T / M / F) | 2005 (T / M / F) | Mean (T / M / F)
Rows to complete per funding instrument:
  No. applications received
  % of applications received (100%)
  No. applications funded
  % of applications funded
(Please complete one block per funding instrument.)
Please note any additional comments:
SECTION 5: Peer Review
Q15 Peer Review objectives
Please rate (from 1 to 5) the objectives that relate to your organisation
(1 = highest rating, 5 = lowest rating)
Objectives | Rating (1 to 5) | Additional comments
Evaluate applications
Make funding recommendations
Fund research excellence
Transparent process
Impartial process
International benchmarking
Other (please state) ____________________
Q16 Overview of Peer Review Process
Please tick the appropriate box
… One-phase process
Applications are submitted in full and reviewed
… Two-phase process
Applicants submit a brief summary of their research proposal; this is assessed,
and the best applications are invited to submit a detailed outline of their research
proposal
If a two-phase process is in place, who evaluates the initial summaries submitted
by candidates? ____________________________________________________
Q17 Peer review process utilised by responsive research funding
instruments
Please tick the box(es) that apply to the peer review process employed
… Independent domestic reviewers only
… Independent international reviewers only
… Independent domestic and international reviewers
… Research Council members as independent reviewers
… Panel of domestic assessors only
… Panel of international assessors only
… Panel of domestic and international assessors
… Panel of Research Council members only
… Other (please state) ______________________________
Please describe why the organisation utilises this method (or these methods).
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
Q18 Peer review process utilised by thematic research funding
instruments
Please tick the box(es) that apply to the peer review process employed
… Independent domestic reviewers only
… Independent international reviewers only
… Independent domestic and international reviewers
… Research Council members as independent reviewers
… Panel of domestic assessors only
… Panel of international assessors only
… Panel of domestic and international assessors
… Panel of Research Council members only
… Other (please state) ____________________________________
Please describe why the organisation utilises this method (or these methods).
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
Q19 Details of Independent Reviewer
(either Research Council member, domestic or international reviewer)
Please supply the following details
** If your organisation does not use this method, please tick the ‘not applicable’ box
… Not applicable (n/a)
Please list the funding instruments (and tick whether thematic or responsive) that
utilise independent reviewers
Title of funding instrument | Thematic | Responsive
1
2
3
4
5
How many independent reviewers per application? __________________
Do independent reviewers receive financial compensation for evaluations?
… Yes
… No
If yes, how much do they receive?
EURO €
Do reviewers use a standard evaluation form?
… Yes
… No
If yes, please enclose a copy (in English) with the questionnaire
Do reviewers grade (rate) the application?
… Yes
… No
Q20 Details of Domestic/Research Council member Panels
Please supply the following details
** If your organisation does not use this method, please tick the ‘not applicable’ box
… Not applicable (n/a)
** If your organisation has both domestic and international assessors on the
same panel, please tick the box below and complete both domestic and
international tables
… Domestic and international assessors on the same panel
Please list the funding instruments (and tick whether thematic or responsive) that
utilise a domestic/Council panel
Title of Funding Instrument | Thematic | Responsive
1
2
3
4
5
How many assessors on average compose a peer review panel? ____________
Does the panel size (number of assessors) differ per funding instrument?
… Yes
… No
If yes, please elaborate ______________________________________________
_________________________________________________________________
How many panel members are male _____ female ______
Indicate the average gender balance: ____% male and ____% female members
Do panel members receive financial compensation for evaluations?
… No
… Yes, a flat fee + travel expenses
… Yes, a fee per application reviewed (per written evaluation completed) +
travel expenses
… Yes, a flat fee only
… Yes, a fee per application reviewed (per written evaluation completed) only
… Yes, travel expenses only
If applicable, please indicate the flat fee
EURO €
If applicable, please indicate the fee per application reviewed
EURO €
Do panel members complete a standard evaluation form per application?
… Yes
… No
If yes, please enclose a copy (in English) with the questionnaire
Do panel members grade (rate) the applications?
… Yes
… No
Do panel members rank the applications?
… Yes
… No
Q21 Details of International Panels
Please supply the following details
** If your organisation does not use this method, please tick the ‘not applicable’ box
… Not applicable (n/a)
** If your organisation has both domestic and international assessors on the
same panel, please tick the box below and complete both domestic and
international tables
… Domestic and international assessors on the same panel
Please list the funding instruments (and tick whether thematic or responsive) that
utilise an international panel
Title of Funding Instrument | Thematic | Responsive
1
2
3
4
5
How many assessors on average compose a peer review panel? ____________
Does the panel size (number of assessors) differ per funding instrument?
… Yes
… No
If yes, please elaborate ______________________________________________
_________________________________________________________________
How many panel members are male _____ female ______
Indicate the average gender balance: ___% male and _____ % female members
Do panel members receive financial compensation for evaluations?
… No
… Yes, a flat fee + travel expenses
… Yes, a fee per application reviewed (per written evaluation completed) +
travel expenses
… Yes, a flat fee only
… Yes, a fee per application reviewed (per written evaluation completed) only
… Yes, travel expenses only
If applicable, please indicate the flat fee
EURO €
If applicable, please indicate the fee per application reviewed
EURO €
Do panel members complete a standard evaluation form per application?
… Yes
… No
If yes, please enclose a copy (in English) with the questionnaire
Do panel members grade (rate) the applications?
… Yes
… No
Do panel members rank the applications?
… Yes
… No
Q22 When are International Review panels employed?
Please tick the appropriate box(es)
… Standard procedure for all applications
… Only if no national expert is identified
… Only if a conflict of interest with national experts arises
… Only if funding instruments/applicants are of a certain academic standard
Please state ___________________________________________________
… Only if funding instruments award a certain level of funding
Please indicate the level of funding necessary:
EURO €
… Other (please state) ______________________________________________
Q23 Number of reviewers per application
Please indicate the estimated number of reviewers per single application
(consider those reviewers that evaluate the application in detail)
__________
Q24 Identify the process for selecting reviewers
Please tick the appropriate box(es)
… Evaluating applications is a designated task of Council members
… Council members are selected due to their expertise
… A reviewer is selected from a database of domestic reviewers
… A reviewer is selected from a database of international reviewers
… Council members nominate/select appropriate reviewers
… Reviewers are sourced individually by Research Council staff
Please state (for example, where reviewers are sourced via Research Council
contacts, research networks, internet etc.) ____________________________
_______________________________________________________________
… Applicants are requested to recommend appropriate reviewers
… Other (please state) ______________________________________________
Q25 Criteria for selection of reviewers
Please rate (from 1 to 5) the following criteria in terms of importance
(1 = highest rating, 5 = lowest rating)
Criteria | Rating (1 to 5) | Additional comments
Academic excellence
Previous peer review experience
Relevant disciplinary (subject area) competence
Encourage young academics
Create gender balance
Other (please state) _____________________
Other (please state) _____________________
Q26 Reviewer recognition
Please tick the appropriate box
Do reviewers receive formal recognition for their participation in the peer review
process?
… Yes
… No
If yes, please tick how reviewers are recognised
… noted on CV (Curriculum Vitae)
… noted in academic publications
… Other (please state) ______________________________________________
Q27 Reviewer Training
Do reviewers receive training?
… Yes
… No
If yes, please specify _____________________________________________
Q28 Conflict of interest
Does the organisation have a ‘conflict of interest’ policy or clause that applies to
independent reviewers?
… Yes
… No
If yes, please supply details __________________________________________
_________________________________________________________________
Does the organisation have a ‘conflict of interest’ policy or clause that applies to
domestic, Council members or international assessment panels?
… Yes
… No
If yes, please supply details __________________________________________
_________________________________________________________________
Q29 Are applications rated and ranked?
Are applications rated (graded)?
… Yes
… No
If yes, what grading system is utilised?
Is the grading system A++, A+, A, B, NR (Not Recommended) utilised?
… Yes
… No
If no, please state the grading system __________________________________
_________________________________________________________________
Are applications ranked?
… Yes
… No
Please elaborate if necessary _________________________________________
_________________________________________________________________
Q30 Application evaluation criteria
Please rate (from 1 to 5) the following criteria in terms of importance
(1 = highest rating, 5 = lowest rating)
Criteria | Rating (1 to 5) | Additional comments
Research proposal
Broad aims and objectives of the research
Proposal description
Proposed schedule of development of proposal
Location of the research proposal within the current state of research
Bibliography in the research area
Relative significance of the contribution that the research proposal will make to the
research field
Methodology
Theoretical framework
Suitability of institution proposed
Plans for publication and dissemination of the research results
Other (please state)
Other (please state)
Principal applicant
Academic record and achievements
International collaboration
Previous awards/funding
Mobility of researchers
Other (please state)
Other (please state)
Q31 Feedback to applicants
Please tick the appropriate box(es)
An independent reviewer’s written evaluation is available to the applicant?
… Yes
… No
An independent reviewer’s rating (grade) is available to the applicant?
… Yes
… No
An independent reviewer is anonymous to the applicant?
… Yes
… No
An assessment panel member’s written evaluation is available to the applicant?
… Yes
… No
An assessment panel’s rating (grade) is available to the applicant?
… Yes
… No
Assessment panel members are anonymous to the applicant?
… Yes
… No
Are discussions of the applications by assessment panel members recorded or
minuted at the meeting?
… Yes
… No
If yes, are the comments recorded or minuted available to applicants?
… Yes
… No
Is the evaluation process subject to a national ‘Freedom of Information’ Act or
similar legislation?
… Yes
… No
If yes, please state ________________________________________________
Please describe alternative feedback practices to applicants if necessary
_________________________________________________________________
Q32 Feedback to reviewers
Please tick the appropriate box
Is feedback provided to reviewers?
… Yes
… No
If yes, please state ________________________________________________
Q33 Appeals process
Please tick the appropriate box
Is an appeals process available to unsuccessful applicants?
… Yes
… No
If yes, please state ________________________________________________
Q34 Cost of peer review process
Please indicate an estimate of the overall cost of the peer review process
(estimate the cost of evaluating one research funding instrument)
EURO €
Please elaborate if necessary _________________________________________
SECTION 6: Finalisation of funding awards
Q35 Procedure for finalising funding of successful applications
Please tick the appropriate box(es)
Are the assessment panels’ recommendations final?
… Yes
… No, Council members make final recommendations
… No, please state reason __________________________________________
If yes, please identify the procedure following the assessment panels’ final
recommendations.
Please tick the appropriate box(es)
… The Council board automatically agrees to fund all final recommendations
… The Council board agrees to fund the recommendations that fall within the
funding allocated to the specific funding instrument
… The Council board agrees to fund recommendations that fall within the
organisation’s research priorities
… The Council board agrees to fund recommendations that fall within the
organisation’s designated percentage of funding allocated to certain
disciplines (responsive mode) and themes (thematic mode)
… The Council board is not required to sanction the recommendations
… The executive/management of the organisation sanctions the
recommendations
… Other (please state) ______________________________________________
Q36 Does the successful applicant/researcher enter into a formal
contract with the organisation?
Please tick the appropriate box
… Yes
… No
If yes, please supply details ___________________________________________
_________________________________________________________________
Q37 Payment of funds
Please tick the appropriate box
How are research funds transferred to successful applicants?
… Funds are transferred directly to successful applicants
… Funds are transferred to the institutions supporting successful applicants
… Other (please state) _______________________________________________
Q38 How is the public informed of applications that have been awarded
funding?
Please tick the appropriate box(es)
… Organisation website
… Other websites
… Mail-shot (a dedicated email sent to a relevant database of interested parties)
… Organisation newsletter
… Organisation brochure
… National press
… National journals
… International journals
… Award ceremony
… Other (please state) _________________
Appendix
Please supply contact details of the relevant personnel responsible for
the completion of the questionnaire. The contact person should be
available to liaise both by telephone and email with regard to the
questionnaire.
Name of contact person
Title of contact person
Position in organisation of contact
person
Telephone number of contact person
Email address of contact person
Please enclose the following, if available:
Application Procedure documentation
• Sample of call launched (e.g. press advertisement/email circulated etc.)
• Sample of application form
• Sample of evaluation form