The societal impact of applied health research
Towards a quality assessment system

Koninklijke Nederlandse Akademie van Wetenschappen
Council for Medical Sciences
Amsterdam, 2002
© 2002 Royal Netherlands Academy of Arts and Sciences
No part of this publication may be reproduced, stored in a retrieval system or
transmitted in any form or by any means, electronic, mechanical, photocopying,
recording or otherwise, without the prior written permission of the publisher.
Council for Medical Sciences
P.O. Box 19121, 1000 GC Amsterdam, the Netherlands
Telephone: 00-31-20-5510779
Fax: 00-31-20-6204941
E-mail: [email protected]
WWW-address: http://www.knaw.nl/rmw
isbn 90-6984-360-9
The paper in this publication meets the requirements of ∞ iso-norm 9706 (1994)
for permanence
Contents

Summary
Background
Assessing societal impact: experience in practice
Analysis of societal impact of health care research and propositions for assessment criteria
Suggested methods and procedures
Conclusions and recommendations
List of references
Appendix 1. Composition of the Applied Health Research Committee
Appendix 2. Invitational meeting ‘The societal impact of applied health research’, 19 June 2001
Appendix 3. ‘Measuring the social impact of research’ by R. Smith
Summary
In addition to scientific quality, societal impact is an explicit objective for important areas of applied health research. The current methods for evaluating the scientific quality of health research are in principle satisfactory. In the Netherlands, the Council for Medical Sciences of the Royal Netherlands Academy of Arts and Sciences has carried out such evaluations several times, according to a widely accepted methodology. Recently, a new national system of quality assurance for all academic research has been introduced, based on self-assessments and site visits. An accepted methodology for evaluating the societal impact of applied health research, however, has until now not been available. The latter type of evaluation should be complementary to the evaluation of scientific quality, in the sense that scientific quality is a sine qua non and that societal impact is, for applied health research, an important additional requirement. The Council for Medical Sciences considers the development of such a methodology important, since it would be an incentive for investigators to improve their performance in this respect and would prevent societal impact from being regarded as no more than an optional element.
The present report presents a general outline of an evaluation methodology for the societal impact of applied health research, including potentially suitable indicators, as a basis for further specification and development. In short, the following procedure is recommended:

The evaluation of societal impact should be included in the new national quality assessment system, which will come into force in 2003. One single external review committee should evaluate both scientific quality and societal impact. The research mission indicates how and to what extent the societal impact of the research should be evaluated. Where applicable, research institutes or groups are asked to list and describe in the self-assessment report the indicated output they consider relevant in realizing their societal mission.
Elements that are relevant when assessing societal impact are:
a. the mission of the research team or the institute;
b. its performance in relation to that mission;
c. the prospects for the future;
d. recommendations for adjustments, where appropriate.
Relevant output categories and a qualitative description of indicators of societal
impact for various possible applications are listed in the report.
Before finalizing the report, the Council for Medical Sciences discussed it with an
international forum of researchers and experts in the field of research quality assessment. In this meeting it was concluded that the new national procedure for research
assessment offers ample possibilities to include an evaluation of the societal impact
of applied health research. Most criteria listed in the report can be easily implemented as part of the self-assessment.
The issues dealt with in this report apply to those areas of health research where
an explicit part of the mission is to ensure that the research findings can be applied
within the foreseeable future. However, given the broad scope of applied health
research, the report more specifically focuses on health care research. Even so, the
recommendations can be seen in a wider perspective.
Background
Introduction
In February 1998 the Executive Board of the Council for Medical Sciences (formerly called the Medical Committee) was approached by one of its members, Prof. J.A. Knottnerus, who pointed out that there was no accepted methodology for evaluating important areas of applied health research, in particular health care research, as to their societal impact. The Board was asked to consider developing such a methodology, which would help achieve a balanced assessment and enable comparison of the performance of the many applied health research teams in the Netherlands, complementing the customary assessment of scientific quality.
The Board responded that it recognized the need for a methodology of this kind and
agreed it was important to develop one.
A practical strategy was then devised, resulting in the setting-up of an Applied
Health Research Committee, whose remit was (a) to survey methods of measuring
and assessing applied health research and any problems this gave rise to and (b) to
draw up indicators for future assessments and thus create the basis for an evaluation
methodology. The results were to be presented in a report and discussed at a workshop organized by the Council for Medical Sciences and then embodied in practical
recommendations.
This report first defines the subject under study more precisely, then presents an inventory of experiences in evaluating the societal impact of applied health research. Subsequently, possible indicators of societal impact are discussed, and a methodology for assessing applied health research is outlined.
Definition of the subject
Evaluation of health research has become increasingly important for the prospects of departments, institutes and individual careers. In the Netherlands, the Royal Netherlands Academy of Arts and Sciences (knaw) has played a central role in this context, with its extensive evaluations of, and setting the standards for, the research output of the eight Dutch medical faculties on a four-yearly basis.1,2,3 Both qualitative and quantitative output indicators were used and external peers were consulted. Furthermore, the knaw accredits the Dutch research schools, in which the strongest research groups combine their efforts in a thematic programme and organize national PhD courses. After a research school has been accredited, its research output is critically evaluated by the knaw every five years. The evaluation procedure is based on a peer-review system. Recently, a new national system of quality assurance for all academic research has been developed.28 This system, which is based on self-assessments and site visits by peers, will replace all evaluations of publicly funded research from the year 2003.
When the Discipline Plan and Discipline Reports on (Bio)Medical and Health Sciences Research were being drawn up (in 1988, 1994 and 1998),1,2,3 the need was felt each time for additional methods and criteria for assessing the societal impact of applied health research. Applied health research differs from ‘fundamental’ (bio)medical research in its dual mission, which is both scientific and societal: it is explicitly concerned not only with the acquisition of scientific knowledge as such but also with the usefulness and implementation of scientific achievements. Relevance to society is thus an important explicit objective of applied health research, and evaluation should therefore not be restricted to scientific quality. A formal evaluation would acknowledge the importance of the societal impact of the research at issue. Societal impact may include ‘implementable’ output that is ready for use as well as ‘implemented’ output that has already been applied. This kind of evaluation would be an incentive for investigators to improve their performance in this respect and would prevent societal impact from being regarded as no more than an optional element.
In 1991, the Council for Medical Sciences of the knaw set up a Health (Care)
Sciences Committee, whose remit included adopting a position on how to assess
the quality of health (care) sciences. Given the relatively close ties between the health care sciences and national systems and circumstances, and the associated publication culture,
the Committee recommended that, in addition to journals listed in the Science
Citation Index or Social Science Citation Index (sci/ssci), a list be drawn up of the
principal scientific publication media for this type of research.4 This ‘blue list’ was
composed and used for the first time in the evaluation procedure of the Discipline
Report 1994,2 and it was used again – in updated form – in the Discipline Report
1998.3
While the current methods for evaluating the scientific quality of research are in principle satisfactory, the development of a new instrument for evaluating societal impact cannot be detached from the evaluation of scientific
quality. The latter type of evaluation should be complementary to the evaluation of
scientific quality in the sense that scientific quality is a sine qua non and that societal
impact is, for applied health research, an important additional requirement.
The issues dealt with in this report apply to different areas of applied health
research. An explicit part of the mission of applied health research is to ensure that
the research findings can be applied within the foreseeable future, i.e. fairly soon
after the research is completed. Examples are applied clinical research and research
and development of medical technology. It is not possible, however, to discuss all
areas of applied health research in sufficient depth in this report. Therefore, this
report predominantly focusses on a part of applied health research that is called
health care research. Health care research, in the definition of the Committee,
includes research into the effectiveness and cost-effectiveness of health care interventions, care management strategies and guidelines, and health services research.
This report discusses aspects of societal impact specific to this area. Although this provides a good example for other types of applied health research, additional analyses will no doubt be needed in other areas to develop suitable indicators of their societal impact.
Assessing societal impact:
experience in practice
There have been various attempts in the past to evaluate the societal impact of research. The Committee examined these in order to establish what elements could
be used in this report.
In 1994 a field trial took place with a quality control system based on visitations,
supervised by the Committee for Experimental Visitations of Health Research
(bevg), which involved ten institutions and funds in the area of health research.
Each visitation comprised a management evaluation and a substantive quality
assessment which had to explicitly include an assessment of the societal and/or
applied value. A set of criteria was drawn up for this latter assessment: the checklist
can be found in the Appendix to the Supervisory Committee’s report.5 In its report
the Supervisory Committee concluded that ‘Societal relevance is best assessed on
the basis of the mission of the organization being inspected. Quantifying the societal value of research, however, is almost a contradiction in terms: its possible value
can only be estimated by a properly constituted external review committee.’
nivel (Netherlands Institute of Primary Health Care) in Utrecht was one of the
organizations that took part in the bevg field trial, which gave nivel a stimulus by
making the societal aspects of research visible and resulted in its developing tools for
monitoring and assessing the societal impact of research. Since then, it has been decided that self-evaluations will be drawn up every five years for external visitations; these deal with both the scientific and societal aspects of the quality of research. The societal impact of research findings is measured by their dissemination through professional journals, publications in the press, invitational conferences and memberships of advisory committees. By yearly consultation with all relevant stakeholders nivel ensures that both individual projects and research programmes as a whole meet the clients’ needs. Since January 2000 the research has been certified in line with iso 9001 criteria.
The mission of the tno institutes is to enhance the innovative technological
strength of industry, commerce and government by means of strategic research,
applied research and consultancy. Every four years the institutes undergo an external audit by an international committee of peers designed to assess their knowledge
status and market relevance. On the audit committee are scientists and representatives of the most important potential clients. To supplement this, customer satisfaction audits are carried out, focussing on the quality of service and accessibility.
Research in the universities has been assessed since 1993 on a disciplinary basis by
means of the vsnu’s assessment of research quality system. The assessment – which
is mainly scientific – of faculty research programmes is carried out by an international committee of peers on the basis of a self-evaluation/report on research
achievements during the past five years, stating scientific publications, theses and
dissertations, professional journals, patents and plans for the future. This committee looks at scientific quality, productivity, relevance and prospects for the future.
The self-evaluation may include a description of the societal or technological relevance of the research, enabling this relevance to be taken into account in the assessment. In fields such as the technical sciences and chemistry the
utilization of research (measurable from patents and – in some cases – concrete
products) is also taken into account. The most recent assessment of medical research carried out under the auspices of the knaw and vsnu jointly (Discipline
Report on (Bio)Medical and Health Sciences Research in the Netherlands 19983)
was done as outlined above. In addition, an evaluation of the organizations’ research management was included based on self-evaluation, taking such aspects as
mission, infrastructure, financial management and personnel and career policy into
account.
Hardly any of the quality assessments of university research published so far deal
with the societal impact of the research separately in an explicit fashion, let alone
judge it by separate criteria especially developed for this purpose. An exception is
the vsnu assessment of agricultural sciences (published in February 1999). A supporting study was carried out for this assessment which yielded information on
both the scientific and societal value of the research programmes in this field. The
study was adapted for a publication of more general interest (‘Methodology for
incorporating the societal value of research’6) and discussed in a workshop,7 organized by the National Council for Agricultural Research (nrlo) and the Consultative Committee of Sector Councils (cos). The methodology comprises three parts:
constructing a specific profile for the research programmes – the ‘Research Embedment and Performance Profile’ (repp) – a user analysis, and feedback to the research programme’s mission. The repp represents the embedment and performance (type and extent of activity) of the research programmes in the relevant
societal environment, based on specific indicators classified into five social domains:
a. Science and certified knowledge: the production of knowledge claims for validation by the scientific community (peers);
b. Education and training: the training of researchers and the generation of skills (embodied or tacit knowledge);
c. Innovation and professionalism: the production of knowledge with a view to gaining a competitive advantage;
d. Public policy: the production of knowledge and skills for policy or social purposes;
e. Collaboration and visibility: ‘internal’ orientation and performance in the contribution to the research organization’s goals as compared with orientation towards other national and international institutions.
A supplementary user survey of stakeholders then establishes the significance of the
research programmes to the stakeholders’ activities. Finally, feedback and evaluation take place on performance in relation to the research team’s mission.
This type of evaluation is designed not so much to yield a comparison of similar
teams or institutions as to provide guidance to a team or institution on its future
development.
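Purely as an illustration of the profile idea (the repp methodology itself is specified in Wamelink and Spaapen6), such a five-domain profile could be recorded as structured data. The Python sketch below assumes a hypothetical 0-3 activity scale and invented example values:

```python
# Illustrative sketch of a repp-style profile: five societal domains, each
# holding a qualitative activity level. The 0-3 scale and the example values
# are hypothetical; the actual repp uses programme-specific indicators.
from dataclasses import dataclass, field

DOMAINS = (
    "science and certified knowledge",
    "education and training",
    "innovation and professionalism",
    "public policy",
    "collaboration and visibility",
)

@dataclass
class ReppProfile:
    programme: str
    # activity level per domain: 0 (none) to 3 (extensive) -- assumed scale
    levels: dict = field(default_factory=lambda: {d: 0 for d in DOMAINS})

    def describe(self) -> str:
        """Render the profile for inclusion in a self-assessment report."""
        lines = [f"Profile for {self.programme}:"]
        for domain in DOMAINS:
            lines.append(f"  {domain}: level {self.levels[domain]}")
        return "\n".join(lines)

profile = ReppProfile("guideline implementation research")
profile.levels["public policy"] = 3           # e.g. contributions to policy documents
profile.levels["education and training"] = 2  # e.g. PhD supervision, courses
print(profile.describe())
```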
Some other examples of assessment of societal impact can be found in a survey
drawn up in 1997 by Rip and Van der Meulen for the Ministry of Education, Culture and Science.8
Measuring the societal impact of applied research is a topic in other countries too.
The Royal Academy of Engineering (London, uk) recently published a report, ‘Measuring Excellence in Engineering Research’,9 exploring the development of a
methodology for measuring excellence in engineering research. As with applied
health research, the evaluation of science and engineering research is a complex
process. The conclusion was that excellence in engineering can be identified by five
different characteristics:
a. Strategy;
b. Mode I research (fundamental research);
c. Mode II research (applied research);
d. Scholarship;
e. Vitality and Sustainability.
Measuring excellence should not be limited to the quality of published research
findings, but is to be found in the combination of these characteristics. In addition
to the five characteristics, peer review is highly regarded and must play a leading
role in the assessment process. While the focus of the report is on engineering, it is
suggested that the methodology is adaptable to different, more general needs,
although the weightings given to different measures will vary across sectors.
To summarize this survey of earlier efforts, we can conclude the following as to the assessment of the societal impact of research:
– The need for a valid methodology for assessing the societal impact of research is
an international issue;
– A research team’s mission indicates how and to what extent the societal
impact of the research should be evaluated;
– A start has already been made on developing some indicators of societal impact
of research;
– The extent to which research findings result (or may result) in innovations in
care and health care policy has hitherto played little if any part in societal impact
assessments;
– Self-evaluation and external audits are important elements in the assessment;
– Feedback from stakeholders and users of research output can make essential
contributions to the evaluation.
Analysis of societal impact of health
care research and propositions for
assessment criteria
In recent years the evaluation of research output has been almost exclusively focussed on scientific quality, and the role of bibliometric analysis of the scientific impact of international research publications has become increasingly important.10,11,12 Since the consequences of an evaluation can be profound, evaluation of the societal impact of health care research should also be considered.
Societal impact of health care research can be divided into: (1) relevance for health
care providers and the process of health care delivery, and (2) relevance for policy
makers and the process of designing, implementing and monitoring policy decisions. In assessing the societal impact of research it seems prudent to discriminate
between potential (ex ante) and realized (ex post) societal impact:
– ex ante evaluation focusses on the transformation of societal problems into
research questions;
– ex post societal impact refers to the degree to which research is able to answer
research questions, and can subsequently translate the scientific conclusions
into practical solutions or policy implications.
The assessment of the societal impact of health care research can be supported by
using indicators for the relevance and urgency of the research problem addressed
(like burden of disease, the amount of uncertainty with regard to the research question at issue, and the organizational and monetary benefits involved) and the likelihood that the research outcomes will have impact on decision-making and priority
setting in health care.13
A final objective is, of course, to improve the health outcomes of individuals and of the population. Evaluating this, however, would be a long-term challenge, requiring extensive specific research. In a more general societal impact assessment, therefore, indirect indicators are to be used.
In the following sections various aspects of societal impact will be discussed and possible indicators will be proposed. Not all of the indicators mentioned will be equally important at the same time. The criteria and indicators used for an evaluation will always depend on the mission of a specific research group or institution.
With regard to societal impact, it is important to note that for many areas of applied health research an important endpoint can be the production of output that is ready and available for implementation and decision-making. In this context, the decision actually to use such ‘implementable’ output rests with those responsible in the professional and policy fields. In other research, for example research initiated specifically to solve societal problems, the implementation itself should be evaluated (‘implemented’ output).
Bibliometric analysis of non-indexed journals
The role of non-indexed journals in quality assessments
Bibliometric analyses are, at present, generally based on counting publications or citations to papers published in journals indexed by the Science Citation Index (sci) or the Social Science Citation Index (ssci) of the Institute for Scientific Information (isi). A limitation of such analyses is, typically, that publications in non-(s)sci journals, such as peer-reviewed national scientific journals, are ignored.10,14,15 The percentage of relevant journals not included in the (s)sci probably varies per field, and might be largest in the field of applied health research (as compared with fundamental and strategic health research). In its peer-reviewed evaluation of research output, the knaw has made an effort to remedy this
situation by listing non-(s)sci journals which are relevant to health care research in the so-called ‘blue list’. The extent to which this has indeed made a meaningful contribution to the assessment of the scientific quality of health research cannot be easily evaluated. Firstly, the ‘blue list’ appears to be a somewhat heterogeneous collection of non-(s)sci English and Dutch journals, plus one French journal, which are not always peer-reviewed, and the extent to which they cater for a scientific forum is not always clear. Secondly, the contribution of the listed journals to a bibliometric analysis can only consist of publication counts, as no impact factors are available for these journals, nor are there any counts of the citations to the articles at issue. Thirdly, the ‘blue list’ focusses exclusively on health care research, thus ignoring other categories of applied health research, for which non-(s)sci publications might be equally important. Fourthly, it can be argued that there will probably be a
strong correlation between the scientific quality of the papers submitted by a department, institute or individual to indexed and non-indexed journals. There is, for instance, some evidence that the scientific quality of randomized clinical trials (rcts) published in the English language is similar to that of trials published in other languages by the same authors, although statistically significant results are much more prevalent among publications of rcts in the English language.16,17,18 On the whole, one gets the impression that a bibliometric analysis in the context of peer-reviewed evaluation of scientific quality might in most cases be restricted to (s)sci publications. However, in research fields that to a large extent focus on topics of specifically local (e.g. national) interest, such as the study of national law or policy issues, the peer-reviewed part of the ‘blue list’ media can contribute to a balanced scientific evaluation regarding these topics.
Are publications in national professional journals suitable indicators of societal impact?
Dutch investigators may publish their research output in English for an international forum, but also in their native language.19 Publications in national professional journals, both peer-reviewed and non-peer-reviewed, seem to play a very
important role in the communication of the results of applied health research by
Dutch investigators to the workers in the health profession at issue. Consequently,
these publications in journals such as the Nederlands Tijdschrift voor Geneeskunde,
Huisarts en Wetenschap, Tijdschrift voor Gezondheidswetenschappen, Tijdschrift voor
Verpleegkunde and Nederlands Tijdschrift voor Fysiotherapie may play a central role
in realizing the societal impact of the research output of Dutch investigators. Only
approximately 10% of physicians in the Netherlands read international medical
journals, which suggests that publication in a national professional journal is
important if research output is to have any impact at all on daily practice and the
quality of health care in the Netherlands.20
There are many examples of a profound societal impact of publications in national professional journals,21,22 although it is also true that these partly concern the republication of papers that first appeared in (s)sci journals. Authors of articles published in national
professional journals typically receive many reactions from their colleagues, and are
often given some coverage in the lay press. Civil servants and politicians seem to
have a similar tendency to favour professional journals in the Dutch language. This
clearly enhances the impact of the research output at issue on health policy. In
conclusion, counting the publications in national professional journals may provide a crude indication of the societal impact of the research output of a department, an institute or an individual investigator when applied health research is at
issue.23 While the ‘blue list’ seems to be of moderate importance for the evaluation
of scientific quality, the national professional journals included in this list might be
very useful for the evaluation of societal impact.
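By way of a hedged illustration of such publication counting, the sketch below classifies an output list against the relevant journal lists. The journal sets shown are tiny stand-ins for the real (s)sci index and ‘blue list’, and the record format is an assumption:

```python
# Sketch: classify an investigator's publications by journal list and count
# them per category. The journal sets are tiny stand-ins; the real (s)sci
# index and 'blue list' are far longer, and the record format is assumed.
SSCI_JOURNALS = {"Lancet", "BMJ"}
BLUE_LIST_PROFESSIONAL = {
    "Nederlands Tijdschrift voor Geneeskunde",
    "Huisarts en Wetenschap",
    "Tijdschrift voor Gezondheidswetenschappen",
}

def count_by_category(publications):
    """publications: iterable of (title, journal) tuples."""
    counts = {"(s)sci": 0, "national professional": 0, "other": 0}
    for _title, journal in publications:
        if journal in SSCI_JOURNALS:
            counts["(s)sci"] += 1
        elif journal in BLUE_LIST_PROFESSIONAL:
            counts["national professional"] += 1
        else:
            counts["other"] += 1
    return counts

output = [
    ("Trial report", "Lancet"),
    ("Dutch summary of the trial", "Nederlands Tijdschrift voor Geneeskunde"),
    ("Guideline commentary", "Huisarts en Wetenschap"),
]
print(count_by_category(output))
# -> {'(s)sci': 1, 'national professional': 2, 'other': 0}
```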
Quality improvement and implementation activities
Quality improvement (qi) is concerned with systematic and planned approaches aimed at ensuring that patients get the best care possible. It includes the formulation
of guidelines, protocols, indicators and criteria for optimal care, assessment of
actual care, and activities aimed at improving care when necessary. Implementation
of knowledge is a new concept addressing the systematic and planned process of
integrating (new) research findings or valuable procedures and techniques within
normal practice routines. It usually includes a review of current research findings,
formulation of guidelines for practice, analysis of determinants of and problems
related to using the guidelines or research, specific implementation actions or
programmes and evaluation of the use of the guideline. qi and implementation of
research findings are closely related and have become important research fields
themselves. In this respect, indicators for assessing the societal impact of these types
of research are also closely related and overlapping. The weighting given to the
different indicators for the evaluation of societal impact will therefore vary according to the area of research. Proposed indicators (an illustrative sketch of recording such criteria follows the list):
a. Guidelines and protocols
Widely used methods to implement research evidence are the development and dissemination of clinical practice guidelines and practice protocols. A considerable number of such guidelines is published in the Netherlands by a variety of organizations, e.g. the Health Council of the Netherlands (gr), the College of Care Insurances (cvz), the Dutch Institute for Quality in Health Care (cbo), the Dutch College of General Practitioners (nhg), other scientific societies of physicians, paramedics and nurses, non-academic research institutions (tno, Trimbos Institute, nivel) and hospitals. International guidelines are increasingly common. Guidelines can be important products of scientific and educational work and should be valued as such. However, not all guidelines are of good quality and, in order to be evaluated positively as a relevant product, a guideline should meet specific quality criteria.24,25 Inclusion of research work within systematically developed clinical guidelines, or participation in setting up such guidelines, can be used as an indicator of the societal impact of research work. Internationally acknowledged criteria for guideline formulation (the agree instrument) may be used in this respect.
b. Indicators and instruments for assessing the quality of care and use of guidelines
A specific and very important type of tool in the field of qi is concerned with assessing the actual care delivered to patients. A wide variety of indicators, criteria and
instruments is used (practice audit systems, surveys among patients or care providers, videotaped contacts, observation, existing data sources). Seen from a societal perspective these are important products, requiring considerable effort, which can only partly be published in the international literature. Criteria for the evaluation of this activity include whether:
– the systematic development is reported;
– the validity and reliability are reported;
– the applicability in clinical practice and acceptance by the target group are tested;
– the tools are included in a concrete product or instrument used in assessing or improving the quality of care.
c. Methods and programmes for improving care and implementing evidence or valuable procedures
The development of a method, strategy, tool or programme to change clinical practice is a very creative and relevant undertaking that is usually not valued in research evaluations, although it takes considerable time and investment from researchers. Examples: a well-tested, interactive programme for Continuous Medical Education (cme), a small-group peer-review method, a computer decision-support tool, a method for practice visits to support teams or practices, a reminder system for prescribing, a feedback method on test ordering, etc. Evaluation of the value of such activities can be related to whether:
– the systematic development and testing of the product are reported;
– collaboration with organizations that are important in spreading the method or product has been achieved;
– dissemination on a wider scale and wide accessibility have been achieved.
d. Systematic reviews
Another important product of scientific work with considerable educational impact is (participation in) the work of official review groups, particularly Cochrane Collaboration review groups. Inclusion of a review in the Cochrane Library can be seen as an indicator of esteem.
e. Indicators of esteem and key positions in relevant professional networks
This refers to the integration of the research activities within the relevant context, for instance active collaboration with organizations representing the target groups of the above-mentioned guidelines, reviews, instruments and quality improvement programmes and activities.
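As announced above, the following sketch shows one hypothetical way the checklist-style criteria under item b could be recorded in a self-assessment; the field names and the example values are assumptions, not prescribed by the Committee:

```python
# Sketch: record whether a quality-of-care product meets the evaluation
# criteria named under item b above. Field names and the example values are
# hypothetical, not prescribed by the Committee.
from dataclasses import dataclass

@dataclass
class ProductAssessment:
    product: str
    systematic_development_reported: bool
    validity_and_reliability_reported: bool
    applicability_and_acceptance_tested: bool
    included_in_concrete_instrument: bool

    def criteria_met(self) -> int:
        """Count how many of the four criteria this product satisfies."""
        return sum((
            self.systematic_development_reported,
            self.validity_and_reliability_reported,
            self.applicability_and_acceptance_tested,
            self.included_in_concrete_instrument,
        ))

audit_tool = ProductAssessment(
    product="practice audit system for test ordering",
    systematic_development_reported=True,
    validity_and_reliability_reported=True,
    applicability_and_acceptance_tested=False,  # not yet tested in practice
    included_in_concrete_instrument=True,
)
print(f"{audit_tool.product}: {audit_tool.criteria_met()} of 4 criteria met")
```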
Relevance of health care research for policy
Health care policy might benefit from health care research in developing rational or
evidence-based policy. Utilization of results of health care research by health care
policy makers can be instrumental (i.e. direct application of results), conceptual (i.e.
generating new policy ideas) or strategic (i.e. having a role in the political process).
Policy is characterized by phases: the policy cycle. The relation between research and policy differs according to these phases. In this respect, research questions as well as the assessment of policy relevance also differ, paralleling the different phases. Additionally, health care research has many different target groups with different interests. The interests of these target groups may change according to the phase of policy making.
Areas for the development of ‘ex ante’ indicators of policy relevance
In a number of areas ex ante indicators can be developed which focus on the transformation of societal problems into policy research questions:26
a. The first area is the relevance of the policy problem, which in health care research
can be defined in terms of the general goals of the health system, such as coherence, equity and accessibility, and the quality of care in relation to costs.27
b. Another important area is the translation of policy problems into research questions
addressing both the exploration of the policy problem and the possible contribution of research to the solution of the problem.
c. A third area is the feasibility of conducting research in complex field situations.
Areas for the development of ‘ex post’ indicators of policy relevance
Specific indicators can be developed for the degree to which the research output is
tuned to the policy arena. For example:
a. The degree of (semi)-governmental funding of studies, programmes or institutes.
b. The degree to which results of research reach those who have to use it. Assessment must therefore be focussed on how research output is communicated to
the target groups.
c. The degree to which target groups have had (or could have had) any use of the results, as shown, for example, by citation analysis.
d. Evaluation of the implementation strategy of an institute or research group.
Other output and products
Apart from publications in national professional journals, clinical guidelines and quality of care programmes, health care research may result in other intended products, which can be relevant for health care practice or policy.
One can think of:
a. ict-facilities and software to be used in (the evaluation and improvement of) health care practice, public health, or policy development.
Examples are: (contributions to) practice and patient information systems and
communication facilities, including their development, evaluation and improvement; and the design, development and implementation of health care
information networks to support health policy.
b. Development, evaluation and improvement of health care technologies and
services. For example: home care facilities for intravenous infusions, non-invasive diagnostic techniques applicable in primary care settings, and shared
care services.
Based on health care research and expertise, these instruments and products are
generally developed, validated and implemented for practical purposes and are
often setting-specific. Therefore, their quality and success in terms of impact on
health care and policy cannot be satisfactorily evaluated using, for example, bibliometric measures. Moreover, there is no tradition of claiming patents in health care research, and an exclusively or primarily patent-oriented approach would not work given the setting-specific and low-profit nature of most of this work. Evaluating patent claims would therefore cover neither this output nor its quality or impact.
Independence of research
Since in applied health research the interaction between research teams and contract partners (e.g. governmental and (non-)professional organizations) can be intensive, and as the results and the way these are presented are generally of great interest to these partners, special attention is needed to safeguard the scientific and professional independence of research teams. Clearly, in order to achieve valid results and justified societal impact of research, independence is a basic requirement. Whether independence has indeed been safeguarded should therefore be part of the assessment.
Possible indicators in this field are:
a. full and independent responsibility of research teams for the operationalisation
of the research topic.
b. full and independent responsibility of research teams for the research methodology used.
c. full and independent responsibility of research teams for the analysis, production and publication of the results.
Suggested methods and procedures
General approach
Certain prerequisites need to be taken into account when developing a methodology for assessing the societal impact of applied health research. Firstly, the methodology needs to link up closely with the current ways of evaluating scientific research.
Secondly, it must not only provide an assessment of the current status and level of
the research, it must also enable recommendations to be made for the future. Thirdly, it must be efficient from the point of view of both the researcher and those carrying out the assessment, enabling the necessary information to be obtained efficiently and as quickly as possible. Finally – and crucially –, it must be workable in
practice. Elements which are relevant when assessing societal impact are:
a. the research team’s / the organization’s mission;
b. its performance in relation to that mission;
c. the prospects for the future;
d. recommendations for adjustments, where appropriate.
The research mission (a) should be clearly articulated, and should be reviewed in
the light of both scientific and societal challenges in the field. As to performance (b)
it should be considered whether the profile and quality of the research programme
and its output are in line with the mission objectives. For (a) and (b), both ex ante
and ex post assessments are important. For (c) and (d), of course, ex ante assessment
is essential.
Given the breadth of the field, the absence in many cases of standardized and
easily accessible sources and performance data, and the need to tailor the approach
to specific cases, the Committee considers it neither possible nor useful to produce
detailed quantified yardsticks for general use. We shall list relevant output categories and give a qualitative description of indicators for various possible applications.
The actual criteria should be elaborated in advance for specific assessments.
Output indicators
Based on the previous chapter the following indicators of societal impact of health
care research have been mentioned:
– communication to health professional workers: publications in national professional journals;
– development and dissemination of clinical practice guidelines and practice
protocols;
– indicators and instruments for assessing the quality of care and use of guidelines;
– development of methods/programmes for improving care and implementing
evidence or valuable procedures;
– degree of (semi)-governmental funding of research;
– communication of the research output to societal target groups;
– use of the research output and feedback by the target groups;
– process evaluation of the implementation strategy, also considering whether the output should be ‘implementable’ or, where relevant, ‘implemented’;
– ict-facilities and software to be used in (the evaluation and improvement of) health care practice;
– development, evaluation and improvement of health care technologies and
services;
– independence of research teams as to the operationalisation of the research topic, the methodology, and the analysis and publication of results.
Potentially suitable indicators of societal impact are listed in Table 1.
Information regarding many of the indicators for evaluation of societal impact is
not yet routinely available and will, at least to a certain extent, be difficult and/or
expensive to obtain. While, for example, a listing of relevant teaching activities,
memberships and authorships can easily be provided by the investigators involved,
it will be a time-consuming task to make a citation analysis or – even better – a content analysis. Such a citation/content analysis will generally be institute-specific.
Therefore key indicators should be detailed for the evaluation of specific groups or
programmes. Indicators for the evaluation of societal impact will currently be mostly qualitatively assessed. However, the development of better quantification systems, using, for example, the increasingly available digital databases on the internet, should be further explored for assessment purposes.
Table 1. Criteria and indicators of societal impact of research output

content analysis: professional publications; treatment guidelines and protocols (nhg, cbo); policy documents (vws, gr, cvz); Cochrane Library; textbooks; teaching materials; lay publications; ict and software (Internet/cd-rom)

citation analysis: scientific publications (both (s)sci and non-(s)sci) cited in the documents mentioned above; citations in professional journals, policy documents, protocols and guidelines

authorships: (co-)authorships of the documents mentioned above under ‘content analysis’

products: health care technologies and services; instruments, programmes and methods for (assessment or implementation of) care

funding of research: (semi-)governmental funding

publicity: presentations for a non-scientific audience; fact sheets; public media; internet

memberships: membership of a committee issuing a policy document or a treatment guideline; membership of an advisory committee

teaching: contributions to initial and post-initial education of health care professionals based on research output

implementation strategy: membership of advisory committees; interactions between researchers and public administration; feedback from target groups

independence: operationalisation of research questions; research methodology; analysis and publication of results
The indicators of societal impact listed in Table 1
predominantly concern documents focussing on the situation in the Netherlands,
although European and other international initiatives would seem to be increasingly relevant.
Weighing the different indicators of societal impact in relation to each other is
another challenge. For publications in professional journals or policy documents
there is no ‘societal’ counterpart of the (Social) Science Citation Index. In the
overall assessment it is the total profile of a research project, programme or institute
in relation to its research mission that is important (see e.g. Wamelink and Spaapen6
and rae-report9).
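Merely to illustrate how such mission-dependent weighting might be made explicit, the sketch below attaches invented weights to a few of the Table 1 criteria; any real weights would have to be agreed in advance in relation to the research mission:

```python
# Sketch: attach mission-dependent weights to Table 1 criteria. All weights
# and scores are invented; real values would have to be agreed in advance
# in relation to the research mission.
MISSION_WEIGHTS = {
    "guideline development group": {
        "content analysis": 3, "products": 3, "teaching": 2,
        "publicity": 1, "funding of research": 1,
    },
    "policy research institute": {
        "content analysis": 2, "memberships": 3,
        "implementation strategy": 3, "publicity": 2,
    },
}

def weighted_profile(mission, scores):
    """Combine per-criterion scores (e.g. 0-3) with the mission's weights."""
    weights = MISSION_WEIGHTS[mission]
    return {criterion: scores.get(criterion, 0) * w
            for criterion, w in weights.items()}

scores = {"content analysis": 2, "products": 3, "teaching": 1}
print(weighted_profile("guideline development group", scores))
# -> {'content analysis': 6, 'products': 9, 'teaching': 2,
#     'publicity': 0, 'funding of research': 0}
```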
Procedure
Following the recommendations of the knaw/nwo/vsnu working group on Quality Assurance for Research (‘Quality obliged: towards a new quality assurance
system for scientific research’28), under the new national quality assessment system
for all academic research which comes into force in 2003, research organizations will
have to carry out regular self-evaluations of the quality of their research (at least
twice every five years). Once every five years an external review committee will visit
them to carry out a visitation on the basis of written information and interviews. If
applicable, the quality questions under the new system may also refer to the socioeconomic impact of research. In the case of applied health research, where the
societal usefulness of the research within the foreseeable future is an explicit aspect
of the mission, the Committee considers that this should be included in the assessment along with scientific quality. Performance and prospects for the future should
also be assessed, and recommendations may be made. Research institutes and
groups can be asked, in the context of their self-evaluation report, to list and describe the indicated ‘non-sci research paper’ output they consider relevant in realizing their – societal – mission. They can also be asked to present other data as to the
– potential – societal impact of the research results. In order to produce a coherent
evaluation and not burden the research teams unnecessarily it is important that
scientific quality and societal impact be assessed together by a single external review
committee. The committee’s membership needs to be geared to this task.
For the evaluation of societal impact, feedback from stakeholders is important.
Therefore, if warranted by the mission (or the way in which the research team interprets it), an external review committee may consult a panel of stakeholders, such as
professionals, patients’ organizations, health care institutions, policy-makers and
research clients. They can be approached in different ways (by letter or interview, individually or collectively) using a predetermined standard protocol. Stakeholders
can have different perspectives. Therefore, stakeholders should be carefully chosen
in relation to the subject and the objectives of the evaluation. Furthermore, for the purpose of efficiency the Committee advises confining these approaches to specific questions for the stakeholders that are relevant to the assessment, evaluation and/or feedback.
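A minimal sketch of what such a predetermined standard protocol could look like as structured data follows; the stakeholder categories come from the text above, while the questions and the record layout are assumptions:

```python
# Sketch: a predetermined standard protocol for stakeholder consultation,
# held as structured data. Stakeholder categories follow the text; the
# questions and the record layout are invented for illustration.
PROTOCOL_QUESTIONS = (
    "Did the research output reach your organization?",
    "Was the output usable for your practice or policy work?",
    "Which adjustments to the research mission would you suggest?",
)

def consult(stakeholder_type, method, answers):
    """Return one consultation record; answers must match the protocol."""
    if len(answers) != len(PROTOCOL_QUESTIONS):
        raise ValueError("one answer required per protocol question")
    return {
        "stakeholder": stakeholder_type,  # e.g. patients' organization
        "method": method,                 # e.g. letter or interview
        "responses": dict(zip(PROTOCOL_QUESTIONS, answers)),
    }

record = consult(
    "patients' organization",
    "interview",
    ("yes", "partly; the guideline needs a lay summary", "none"),
)
print(record["responses"][PROTOCOL_QUESTIONS[1]])
```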
Conclusions and recommendations
Conclusions
This report outlines an evaluation methodology for the societal impact
of applied health research. The Committee concludes that:
1. There is a strong need for a single, widely accepted methodology for evaluation
of the societal impact of (applied) health research.
2. In the Netherlands the current methods for evaluation of the scientific quality
of research are generally satisfactory. A new national system of quality assurance
for all academic research will come into force in 2003.28
3. Evaluation of societal impact should and can be integrated into this new evaluation system.
4. A research mission indicates how and to what extent the societal impact of
the research should be evaluated. The research mission itself should be clearly
articulated, and should be reviewed in the light of both scientific and societal
challenges in the field.
5. Key criteria and indicators for evaluation of societal impact and their relative
weight must be detailed in advance in relation to the research mission.
6. In the overall assessment (of both scientific quality and societal impact) it is the
total profile of a research project, programme or institute in relation to its
research mission that is important.6,9
7. External review, by one single committee, is highly regarded in the assessment of both scientific quality and societal impact.9 The membership of external review committees needs to be geared to this task; the review committee should therefore also
include expertise in evaluating societal impact.
8. For the evaluation of societal impact, feedback from stakeholders is important.
Stakeholders to be consulted should be selected in relation to the objectives of
the evaluation.
9. Special attention is needed to evaluate the independence of research in order to guarantee valid results and justified societal impact.
10. The results of the assessment of societal impact, together with the results of the
evaluation of scientific quality, will be a basis for improvement of research
quality and for further development, and can guide institutional and academic
research policy.
Recommendations
As the next steps to take, the Committee recommends:
1. Universities and research institutes are requested to implement the assessment
of societal impact of applied health research, according to the methods and
procedures suggested in the report, in the new national quality assessment
system that comes into force in 2003.
2. To guide this process, pilot reviews of institutes for applied health research should be carried out in order to fully develop the methodology and its application.
3. The outcome of societal impact assessment, together with the outcome of the
evaluation of scientific quality, should provide incentives for researchers in the
field of applied health research. Therefore, a favourable outcome of societal
impact assessment should be adopted by universities and funding organisations as an important additional criterion in deciding about research grants in
the field of applied health research.
4. Available results of societal impact assessment can be reviewed at the level of
both the institute or programme (ex post and ex ante) and the project. In this
respect zon has already made a start by requesting that results from subsidized
programmes and projects should be useful for implementation in practice.
List of references
1. Commissie Geneeskunde, knaw. Disciplineplan Geneeskunde 1988, Amsterdam: knaw, 1988.
2. Reneman R.S. and Klaassen A.B.M., et al. Discipline-advies Geneeskunde 1994,
Amsterdam: knaw, 1994.
3. Medical Committee, Royal Netherlands Academy of Arts and Sciences. Discipline Report on (Bio)Medical and Health Sciences Research in the Netherlands
1998, Amsterdam: knaw & vsnu, 1999.
4. Leenen H.J.J., van der Grinten T.E.D., Knottnerus J.A., Lamberts H., van der
Maas P.J. and Roscam Abbing E.W., Subcommissie Gezondheids(zorg)wetenschappen, Commissie Geneeskunde, knaw. Rapport van de Subcommissie
Gezondheids(zorg)wetenschappen, Amsterdam: knaw, 1991.
5. Begeleidingscommissie Experimentele Visitaties Gezondheidsonderzoek.
Kwaliteit verzekerd, advies visitaties gezondheidsonderzoek, Amsterdam: knaw,
1994.
6. Wamelink F.J.M. and Spaapen J.B. De evaluatie van universitair onderzoek.
Methodiek voor het incorporeren van de maatschappelijke waarde van onderzoek,
nrlo-rapport 99/12, Den Haag: nrlo, 1999.
7. Klep L. De maatschappelijke oriëntatie van onderzoeksgroepen. Hoe sta je in de
wereld. nrlo/cos, 2000.
8. Van der Meulen B. and Rip A. Maatschappelijke kwaliteit van onderzoek tussen
verantwoording en management. Een inventarisatie van beoordelingspraktijken,
1997.
9. Ruffles P.C., Allen R.W.K., Clarke D.W., Edwards M.F., Ion S.E., Midwinter
J., Monaghan M.L., et al. Measuring excellence in Engineering Research. London: Royal Academy of Engineering, 2000.
10. Bouter L. Meten is weten? Tijdschrift voor Gezondheidswetenschappen 1998; 76:
354-355.
11. Van Raan A.J.F. Advanced bibliometric methods as quantitative core of peer
review based evaluation and foresight exercises. Scientometrics 1996; 36: 397-420.
12. Moed H.F., de Bruin R.E. and van Leeuwen Th.N. New bibliometric tools for
the assessment of national research performance: database description, overview of indicators and first applications. Scientometrics 1995; 33: 381-422.
13. Oortwijn W.J., Vondeling H. and Bouter L. The use of societal criteria in priority settings for health technology assessment in the Netherlands: initial experiences and future challenges. Internat. J. Technol. Assessment in Health Care 1998;
14: 226-36.
14. Schoonbaert D. and Roelants G. Citation analysis for measuring the value of
scientific publications: quality assessment tool or comedy of errors? Trop. Med.
Internat. Health 1996; 1: 739-752.
15. Wouters P. The citation culture (PhD-thesis). Amsterdam, 1999.
16. Moher D., Fortin P., Jadad A.R., Jüni P., Klassen T., Le Lorier J., et al. Completeness of reporting of trials published in languages other than English: implications for conduct and reporting of systematic reviews. Lancet 1996; 347: 363-366.
17. Egger M., Zellweger-Zahner T., Schreider M., Junker C., Lengeler C. and
Antes G. Language bias in randomised controlled trials published in English
and German. Lancet 1997; 350: 326-329.
18. Gregoire G., Derderian F. and Le Lorier J. Selecting the language of the publications included in a meta-analysis: is there a Tower of Babel bias? J. Clin. Epidemiol. 1995; 48: 159-163.
19. Vandenbroucke J.P. On not being born a native speaker of English. British
Medical Journal 1989; 298: 1461-1462.
20. Visser H.K.A. Het belang van publiceren in Nederlandse wetenschappelijke tijdschriften met een extern beoordelingssysteem. Nederlands Tijdschrift voor Geneeskunde 1998; 142: 798-801.
21. Van Maldegem B.T., Walvoort H.C. and Overbeke A.J.P.M. Effecten van
artikelen gepubliceerd in het Nederlands Tijdschrift voor Geneeskunde. Nederlands Tijdschrift voor Geneeskunde 1999; 143: 1957-1962.
22. Van Maldegem B.T. and Overbeke A.J.P.M. Berichten in Nederlandse nationale kranten naar aanleiding van artikelen uit medisch-wetenschappelijke
tijdschriften. Nederlands Tijdschrift voor Geneeskunde 1999; 143: 1969-1972.
23. Bouter L.M. and Knottnerus J.A. Maatschappelijke relevantie van toegepast
gezondheidsonderzoek. Nederlands Tijdschrift voor Geneeskunde 2000; 144:
1178-1183.
24. Shaneyfelt T.M., Mayo-Smith M.F. and Rothwangl J. Are guidelines following guidelines? JAMA 1999; 281: 1900-1905.
25. Raad voor Gezondheidsonderzoek. Advies gezondheidszorgonderzoek.
’s-Gravenhage, 1994.
26. Groenewegen P.P. and Bensing J.M. Maatschappelijke kwaliteit van gezondheidszorgonderzoek. TSG 1995; 73: 245-249.
27. Hurst J.W. Reforming health care in seven European nations. Health Affairs
1991; 10: 7-21.
28. Kwaliteit verplicht; naar een nieuw stelsel van kwaliteitszorg voor het wetenschappelijk onderzoek (Quality Obliged; towards a new quality assurance system for
scientific research). Rapport van de werkgroep Kwaliteitszorg Wetenschappelijk
Onderzoek. knaw/nwo/vsnu, April 2000.
Appendix 1
Composition of the Applied Health
Research Committee of the Council
for Medical Sciences
Prof. J. André Knottnerus (chair)
General Practice, University of Maastricht
Prof. Jozien M. Bensing
Health Psychology/nivel, University of
Utrecht
Prof. Lex M. Bouter
Epidemiology, Vrije Universiteit Medical
Center, Amsterdam
Prof. Rob C.W. Burgersdijk
Dentistry, University of Nijmegen
Prof. Hans J. Geuze
Cell Biology, University of Utrecht
Prof. Richard P.T.M. Grol
Quality of Care, University of Maastricht/Nijmegen
Prof. Leo B.A. van de Putte
Rheumatology, University of Nijmegen
Prof. Richard Smith (consultant)
Editor British Medical Journal, London
Prof. Jan P. Vandenbroucke
(consultant)
Clinical Epidemiology, University of Leiden; Chairman Council for Medical Sciences, Royal Netherlands Academy of Arts and Sciences
Dr. Marij J. Stukart (secretary)
Executive Secretary, Council for Medical Sciences, Royal Netherlands Academy of Arts and Sciences

Dr. Michiel H.W. Hooiveld
Staff, Council for Medical Sciences, Royal Netherlands Academy of Arts and Sciences
Appendix 2
Invitational meeting
‘The societal impact of applied health
research’, 19 June 2001
Programme

13.00 - 13.30 hrs. Registration
13.30 - 13.35 hrs. General introduction by the Chairman of the day (Prof. Jan P. Vandenbroucke)
13.35 - 13.55 hrs. Introduction Draft Report (Prof. J. André Knottnerus)
13.55 - 14.20 hrs. Quality assured for Applied Health Research (Prof. Jan H. van Bemmel)
14.20 - 14.45 hrs. Societal impact in ex ante evaluation of scientific research (Prof. Eduard C. Klasen)
14.45 - 15.10 hrs. Medical Journals: Do they change anything? (Prof. Richard Smith)
15.10 - 15.40 hrs. Tea / coffee break
15.40 - 16.05 hrs. Assessment of uk Engineering Research in Universities – some lessons learned (Prof. John E. Midwinter)
16.05 - 16.30 hrs. What role for societal impact in quality assessment of scientific research in medical faculties? (Prof. Paul J. van der Maas)
16.30 - 16.40 hrs. Summary of presented views (Prof. J. André Knottnerus)
16.30 - 17.30 hrs. Panel discussion
17.30 - 18.30 hrs. Drinks and refreshments
List of speakers
Prof. Jan P. Vandenbroucke
Clinical Epidemiology, University of Leiden; Chairman Council for Medical
Sciences, Royal Netherlands Academy of Arts and Sciences
Prof. J. André Knottnerus
General Practice, University of Maastricht; Chairman Applied Health Research
Committee of the Council for Medical Sciences
Prof. Jan H. van Bemmel
Rector Erasmus University Rotterdam; former Chairman knaw/nwo/vsnu
Working Group Quality Assessment of Scientific Research
Prof. Eduard C. Klasen
Director Netherlands Organisation for Scientific Research
Dr. Richard Smith
Editor British Medical Journal
Prof. John E. Midwinter
Pender Professor University College London; Working Group Measuring Excellence in Engineering Research, Royal Academy of Engineering, United Kingdom
Prof. Paul J. van der Maas
Dean Erasmus Medical Centre Rotterdam; Council of Medical Faculties in the
Netherlands
Summary of lectures and discussion
After Prof. Vandenbroucke opens the invitational meeting, Prof. Knottnerus, chairman of the subcommittee that prepared the report, gives an introduction to the draft report. He summarizes its most important conclusions:
– The measurement of societal impact of research is an international issue;
– For applied health research, societal impact is not just an optional element;
– Societal impact should be evaluated in addition to scientific quality;
– Indicators can be defined in the various relevant fields;
– Assessment should and can be integrated with evaluation of scientific quality.
The subcommittee will finalise the report later this year. It also intends to perform some pilot studies, which should result in a procedure for routine assessment of the societal impact of research.
Next, Prof. Van Bemmel, former Chairman of the KNAW/NWO/VSNU Working Group Quality Assessment of Scientific Research, addresses two subjects.
First, he addresses the question whether applied health research is truly different from the other medical subdisciplines. In the Discipline Assessment of 1998, all subdisciplines belonging to the area of applied health research were assessed separately in one cluster: Public Health Research. Evaluation of the results showed that, in general, basic and some clinical health research disciplines scored better than surgical and applied health research disciplines; applied health research was rated comparably to surgery. He concludes, repeating the recommendations of the Discipline Assessment 1998, that the quality of the field would be greatly enhanced if it joined forces with basic disciplines such as epidemiology.
Secondly, Prof. Van Bemmel discusses whether the new assessment procedure, as described in the report 'Quality Obliged', can take into account aspects such as the societal impact of applied health research. In summary, the new procedure requires a self-assessment periodically, but at least every three years, and a review of the self-assessment by peers, in connection with site visits, at least every six years. The draft report mentions several possible factors that express the societal impact of research, and some of these can easily be included in the self-assessment report. He therefore concludes that the newly proposed procedure for research assessment offers ample possibilities to include the evaluation of the societal impact of applied health research.
Prof. Klasen, Director of the Netherlands Organisation for Scientific Research (NWO), addresses the fact that NWO's procedures for ex ante evaluation of research proposals in open, responsive-mode funding competitions are sometimes too narrow, for example in areas like applied health research, because proposals are judged principally on the basis of academic merit. This tends to result in lower scores for proposals in applied health research. Using three examples (STW, WOTRO and ZON), he demonstrates that taking societal impact into consideration in open funding competitions sometimes has added value. However, several methodological problems are associated with ex ante assessment of societal impact. Firstly, there are no generally accepted objective criteria. Secondly, there is no generally accepted hierarchy of peers in relation to the societal impact of research. Finally, the result of ex ante evaluation by a funding agency has to be a rigid, one-dimensional list of projects accepted for funding and projects rejected; this means that academic merit and societal impact have to be combined at some point to reach a single overall decision. Although in general the primary funding criterion should still be academic quality, the additional criterion of potential societal impact should come into play, for instance, where resource constraints mean that hard choices have to be made between otherwise very good or excellent proposals. These methodological problems might be solved, but they force NWO to be very careful and conscientious in selecting its evaluation procedures, and cautious about participating in experiments.
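The combination step described here can be made concrete. The following sketch is a minimal illustration only, not an NWO procedure: the Proposal record, the 1-5 referee scores and the half-point banding are all assumptions, chosen to show how a primary criterion (academic merit) and a tie-breaking criterion (societal impact) can yield the required one-dimensional ranking.

    from dataclasses import dataclass

    @dataclass
    class Proposal:
        title: str
        academic_merit: float   # hypothetical referee score, 1 (poor) to 5 (excellent)
        societal_impact: float  # hypothetical referee score, same scale

    def rank_proposals(proposals: list[Proposal]) -> list[Proposal]:
        """Rank primarily on academic merit; societal impact breaks ties
        between proposals whose academic merit is practically equal."""
        # Band academic merit to half points, so that "otherwise very good
        # or excellent" proposals end up competing on societal impact.
        return sorted(
            proposals,
            key=lambda p: (round(p.academic_merit * 2) / 2, p.societal_impact),
            reverse=True,
        )

    proposals = [
        Proposal("Basic mechanism study", 4.6, 2.0),
        Proposal("Guideline implementation trial", 4.5, 4.5),
        Proposal("Registry linkage study", 3.8, 4.0),
    ]
    for accepted in rank_proposals(proposals)[:2]:  # rigid cut-off: budget for two
        print(accepted.title)

Under these assumptions the two near-equal proposals are separated by societal impact, while academic quality remains the decisive criterion everywhere else.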
Prof. Smith, Editor of the British Medical Journal, starts with a couple of remarks about the draft report:
– Measuring the societal impact of research is an international issue.
– Before setting up a methodology and finalising the report, it would be wise to consult more widely and bring the subject to the attention of more people.
– The subcommittee states that the current methods for assessing scientific quality are satisfactory, which implies that peer review is well established and unproblematic; in Britain, however, there has been strong criticism of both.
He continues with a seminar on the importance of influence for the British Medical Journal (BMJ). Influence is the first part of the BMJ's mission, and profit the second. Influence matters because it serves the needs of doctors and others to have an impact on the international debate on health issues. Influence is hard to define and hard to measure; in some ways it is the polite word for power. There are different levels of influence:
– Something changes because of what the BMJ has published.
– Setting the agenda or legitimising an issue.
– Leading by example and being followed.
– Being quoted and cited.
– Being paid attention to.
– Being known about.
Based on these levels, Prof. Smith presents a scoring system that could be set up to measure influence. He concludes that it is important to try to measure the influence of journals, that achieving change is the highest level of influence and, finally, that the proposed scoring system might be the beginning of something useful.
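The report does not reproduce the actual scoring rules, but the weighting idea behind the levels of influence can be sketched as follows. In this fragment both the weights and the counted events are invented for illustration; the only deliberate property is that achieving change dominates the score.

    # Hypothetical weights per level of influence: each level counts
    # roughly twice as much as the one below it.
    INFLUENCE_WEIGHTS = {
        "change_achieved": 32,   # something changes because of a publication
        "agenda_set": 16,        # setting the agenda or legitimising an issue
        "followed": 8,           # leading by example and being followed
        "quoted_or_cited": 4,
        "attention_paid": 2,
        "known_about": 1,
    }

    def influence_score(events: dict[str, int]) -> int:
        """Sum the weighted counts of influence events over a period."""
        return sum(INFLUENCE_WEIGHTS[level] * count
                   for level, count in events.items())

    # A few documented changes outweigh many citations:
    print(influence_score({"change_achieved": 3, "quoted_or_cited": 10}))  # 136
    print(influence_score({"quoted_or_cited": 30, "known_about": 5}))      # 125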
Prof. Midwinter has experience with the assessment of engineering research in universities in the UK. He is the former Chairman of a Research Assessment Exercise (RAE) panel and shares his knowledge and experience of the exercises held in 1992 and 1996. The ultimate purpose of the RAE is to focus the available money on successful groups or departments. As in medicine, engineering has a strong societal content. This seems to require measures additional to those used for pure science, but such measures are often very difficult to obtain. In the UK a single RAE process exists for all disciplines, constrained to operate with a set of well-defined forms. This allowed the panel to get results back in a highly structured form, so that one department could quickly be compared with another. The procedure is reasonably fair, because everyone gets the same guidelines.
As a result of the 1996 engineering exercise, some important issues came up:
– the importance of the relevance of research: is the assessed group well plugged into society?
– the issue of exploitation: is the research used, does it matter?
– the use of direct and indirect peer review to get more insight into a department;
– the (un)reliability of the citation index for applied research;
– the fact that site visits are not necessary.
Based on his assessment experience, Prof. Midwinter offers some general observations:
– You need a transparent process with clear rules.
– You have to minimise the workload for both the assessors and the assessed.
– The procedure must work for a large number and range of departments.
– Rules will inevitably shape national priorities; ergo, one must be clear about the purpose of the exercise.
– The Assessment Panel must command respect.
More information on the RAE can be found at http://www.hefce.ac.uk.
Prof. Van der Maas, Dean of the Erasmus Medical Centre, Rotterdam, focusses his seminar on the role of societal impact in the quality assessment of universities. Universities are needed to educate people who are able to develop new insights; to fulfil this task, universities need independence. Society pays for a specific kind of independence, one devoted to finding and disseminating the truth. That is, in itself, relevant to society, even if it does not have a short-term application. The question is how to assess the quality of universities. In research, the obvious first assessment criterion is scientific relevance. Nevertheless, the societal impact of research is also a possible aspect to evaluate, in areas such as population health, medical practice and the health care system, and commercial application. Several groups share responsibility for bringing about societal impact. In the department of public health in Rotterdam, researchers have the responsibility to accept or apply for research grants only if the proposal offers a real possibility of doing innovative, internationally publishable work. In addition, the results should have an application in public health. To achieve optimal application of the results, the research should have strong connections to the communities, organisations and other parties responsible for implementing them.
Prof. Van der Maas is of the opinion that there would be room for evaluation of the societal impact of some parts of university research, especially if universities acknowledge that some types of research are both innovative and applicable, and if researchers are allowed to apply for grants for programmes with an explicit societal application purpose.
Some caveats, however, have to be kept in mind. Societal impact is not a value in itself, because detrimental societal impact is as easy to measure as positive societal impact. The first responsibility of a researcher is to make the intrinsic scientific quality of his research visible, and societal impact should only be evaluated if it is an explicit purpose of the research and/or the institution. The deans of the medical faculties support the notions laid down in the subcommittee's report. University research should, however, be judged primarily on scientific impact, and the assessment of scientific quality and societal impact should be one procedure.
Prior to the panel discussion, Prof. Knottnerus summarises the most important
conclusions of the seminars and discussions:
1. Integration of the assessment of societal impact in the newly designed research
assessment system (‘Quality Obliged’) is possible and preferable.
2. It is important to characterise the mission of a research group as to its relevance to societal impact.
3. Ex ante assessment of societal impact is important, but difficult. It is very positive that ZON and NWO are willing to support an experiment in this field.
4. One committee should evaluate both scientific quality and societal impact.
5. Assessing scientific quality is not beyond debate. The Netherlands has a good tradition in this respect; in the last two assessments, differences between fields of research were taken into account.
6. The BMJ's experience with indicators for measuring influence can be very useful.
7. Interaction of impact and quality: both high impact of bad research and no
impact of high quality applied research should be avoided.
8. Relevance/utility. Stakeholders can have different perspectives. Therefore, stakeholders should be carefully chosen in relation to the subject and the objectives of
the evaluation.
9. Academic focus. Academic research should be innovative and independent of the source of finance or sponsorship. Scientific relevance is a sine qua non.
10. Responsibility of researchers. When societal impact is part of the mission, it
should be evaluated properly.
11. The deans of the medical faculties support the recommendations of the subcommittee.
In the panel discussion the following issues are discussed.
– Reproducibility of the assessment procedure. In the last Discipline Assessment there was a very high correlation between the judgements of different referees. However, ex ante evaluation of ZON research proposals with respect to implementation of the results showed considerable discrepancy between peers. One expects that the reproducibility of ex ante evaluations of societal impact will be harder to achieve than that of ex post evaluations; a sketch of how such inter-referee agreement can be quantified follows after this list.
– The need for more concrete criteria of societal impact. It was discussed that a distinction should be made between the assessment of institutions, of individual research projects, and of complete programmes of activities.
– The need for pilot studies. Three possibilities were discussed: (1) preparing the integration of scientific quality and societal impact assessment in anticipation of the evaluation according to 'Quality Obliged'; (2) starting a number of pilots on the evaluation of societal impact; (3) making use of evaluation material that has been used in the past to evaluate programmes or projects on societal impact.
– The future of scientific publishing. It is expected that within a period of 5 to 10 years researchers will publish on the internet. One of the major drivers will be the desire to circumvent publishers, who make money with little benefit for researchers and the public.
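To make the reproducibility point concrete: agreement between two referees can be quantified with, for example, a simple correlation of their scores. The sketch below is illustrative only; the referee scores are invented, and a real exercise would use more referees and a chance-corrected statistic such as Cohen's kappa.

    from statistics import correlation  # available from Python 3.10

    # Hypothetical 1-5 scores given by referees to the same ten proposals.
    referee_a = [5, 4, 4, 3, 5, 2, 4, 3, 5, 4]
    referee_b = [5, 4, 3, 3, 5, 2, 4, 2, 4, 4]   # close to referee_a: high agreement
    referee_c = [3, 5, 2, 4, 3, 4, 2, 5, 3, 2]   # little relation to referee_a

    print(f"agreement a-b: r = {correlation(referee_a, referee_b):.2f}")
    print(f"agreement a-c: r = {correlation(referee_a, referee_c):.2f}")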
A full report of the meeting can be found at http://www.knaw.nl/rmw.
List of participants
Prof. dr. W.G. van Aken
Gebied Medische Wetenschappen/nwo
Dr. W.M. Ankum
Academisch Medisch Centrum, Afdeling Verloskunde en Gynaecologie
Dr. G. van Ark
Medische Wetenschappen/nwo
Dr. E.P. Beem
Medische Wetenschappen/nwo
Prof. dr. D.W. van Bekkum
Introgene B.V.
Prof. dr.ir. J.H. van Bemmel
Erasmus Universiteit Rotterdam, College van Bestuur
Prof. dr. J.M. Bensing
Nederlands Instituut voor Onderzoek van de Gezondheidszorg (nivel)
Prof. dr. A.J.M. Berns
Nederlands Kanker Instituut, Afdeling Moleculaire Genetica
Dr. P.J.E. Bindels
Academisch Medisch Centrum, Afdeling Huisartsgeneeskunde
Dr. ir. J.O. de Boer
Gebied Medische Wetenschappen/nwo
Prof. dr. L.M. Bouter
Vrije Universiteit emgo-Instituut
Prof. dr. W. van den Brink
Academisch Medisch Centrum, Psychiatrisch Centrum
Prof. dr. R.C.W. Burgersdijk
Katholieke Universiteit Nijmegen, Tandheelkunde
Dr. D.H. Collijn
Universiteit Maastricht, Faculteitsbureau
Mr. M.H.J. Coppens-Wijn
dmw/vsu
Dr. B.V.M. Crul
Hoofdredacteur Medisch Contact
S.P.H. Ellenbroek
Gebied Medische Wetenschappen/nwo
Prof. dr. H.J. Geuze
Universitair Medisch Centrum Utrecht, Disciplinegroep Celbiologie
Prof. dr. T.E.D. van der Grinten
Erasmus Universiteit Rotterdam, Faculteit der Geneeskunde en Gezondheidswetenschappen
Prof. dr. P.P. Groenewegen
nivel
Prof. dr. R. Grol
Katholieke Universiteit Nijmegen, hsv/wok
Prof. dr. J.C.J.M. de Haes
Academisch Medisch Centrum, Afdeling Medische Psychologie
Dr. L. Henkelman
zon, Programma-coördinator Zorg
Prof. dr. J.H.J. Hoeijmakers
Erasmus Universiteit Rotterdam, Faculteit der Geneeskunde en Gezondheidswetenschappen
Dr. ing. M.H.W. Hooiveld
knaw, Commissie Geneeskunde
W. Joling
zon, Algemeen Beleid en Kwaliteit
Prof. dr. L.P. ten Kate
Academisch Ziekenhuis vu, Afdeling Klinische Genetica en Antropogenetica
Dr. A.B.M. Klaassen
Nederlandse Hartstichting, Afdeling Onderzoek
Prof. dr. E.C. Klasen
Directeur nwo
Prof. dr. N.S. Klazinga
amc-uva, Afdeling Sociale Geneeskunde
Prof. dr. J.A. Knottnerus
Universiteit Maastricht, Onderzoeksinstituut ExTra-CaRe
Drs. J.M.T. Lely
Ministerie van ocenw, bmo
Prof. dr. P.J. van der Maas
Erasmus Universiteit Rotterdam, Instituut Maatschappelijke Gezondheidszorg
Prof. dr. J.W.M. van der Meer
Academisch Ziekenhuis Nijmegen, Afdeling Algemene Interne Geneeskunde
Prof. J.E. Midwinter
University College London
Dr. T. Delamothe
British Medical Journal
Dr. W.J. Oortwijn
umc St Radboud, Afdeling mta
Drs. C. de Pater
zon, Secretaris Programma Preventie
Prof. dr. L.B.A. van de Putte
Academisch Ziekenhuis Nijmegen, Interne Geneeskunde
Dr. M.M. van Rees-Wortelboer
Medische Wetenschappen/nwo
Prof. dr. R.S. Reneman
Universiteit Maastricht, Cardiovascular Research Institute
Prof. dr. J.J. van Rood
Leids Universitair Medisch Centrum, Stichting Europdonor
Prof. dr. A.H. Schene
Academisch Medisch Centrum, Polikliniek Psychiatrie
Drs. J.W. Smeenk
Stichting Sanquin Bloedvoorziening
Drs. H.J. Smid
zon, Algemeen Directeur
Prof. R. Smith
bmj Publishing Group, bma House
Prof. dr. G.B. Snow
Academisch Ziekenhuis vu, Afdeling Keel-, Neus- en Oorheelkunde
Dr. J.B. Spaapen
Bureau knaw
Dr. J.E. Speksnijder
Nederlandse Hartstichting, Coördinator Onderzoek
P. van Splunteren
zon, Staf Implementatie
Dr. M.E.A. Stouthard
Academisch Medisch Centrum
Dr. M.J. Stukart
knaw, Commissie Geneeskunde
Prof. dr. F. Sturmans
Erasmus Universiteit Rotterdam, Instituut Maatschappelijke Gezondheidszorg
Prof. dr. J.P. Vandenbroucke
Leids Universitair Medisch Centrum, Afdeling Klinische Epidemiologie
Prof. dr. C.J.H. van de Velde
Leids Universitair Medisch Centrum, Afdeling Heelkunde
Dr. E.M. ten Vergert
Academisch Ziekenhuis Groningen, Hoofd mta-Bureau
Prof. dr. H.K.A. Visser
Emeritus Hoogleraar Erasmus Universiteit Rotterdam
Prof. dr. P.C. van der Vliet
Universitair Medisch Centrum Utrecht, Afdeling Fysiologische Chemie
Prof. dr. G. van der Wal
Academisch Ziekenhuis vu, Faculteit der Geneeskunde
Prof. dr. Ch. van Weel
kun, Faculteit der Medische Wetenschappen, Afdeling Huisartsgeneeskunde
Drs. I. van der Weijden
Vrije Universiteit, Faculteit Exacte Wetenschappen
Prof. dr. D. de Wied
Emeritus Hoogleraar Universiteit Utrecht
Dr. R.A.G. Winkens
Academisch Ziekenhuis Maastricht, Transmuraal & Diagnostisch Centrum
Prof. dr. J.W. Wladimiroff
Academisch Ziekenhuis Rotterdam/Dijkzigt, Afdeling Obstetrie/Gynaecologie
Dr. J.O.M. Zaat
Redactiecommissie Huisarts en Wetenschap
Prof. dr. J. van der Zee
Universiteit Maastricht, Faculteit der Gezondheidswetenschappen
Prof. dr. W.G. Zijlstra
Emeritus Hoogleraar Rijksuniversiteit Groningen
Appendix 3