Professional Application Development

SDI assessment

Assessment/Evaluation
• A study designed and conducted to assist some audience to assess an object’s merit and worth (Stufflebeam, 2000)
• The fundamental purpose of evaluation is to create greater understanding (Taylor-Powell et al., 1996)

Assessment framework
• A system of methods and processes to support evaluation/assessment.

Questions

• Why is there a need to do an assessment?
• How are the assessment results used further?

Assessment purposes (Chelimsky, 1997)

• Accountability – to test whether the program works (summative evaluation)
• Knowledge – to better understand the program
• Developmental – to improve the program (formative evaluation)

INSPIRE and assessment

  Monitoring: quantitative, yearly
  Reporting:  qualitative, every 3 years

• Link to monitoring
Some principles of assessing SDIs

• Complex nature of an SDI:
  – truly complex problems can only be approached with complex resources (Cilliers, 1998)
  – a multi-faceted view is needed to understand a concrete SDI initiative (De Man, 2006)
• Use multiple assessment methods and approaches
• Do not oversimplify
• Incorporate different views/understandings
• Flexibility
Multi-view SDI framework

• Multi-view SDI assessment framework based on reasoning about the NSDI as a complex adaptive system (CAS)
• Characteristics:
  – several assessment approaches
  – flexible (extensible)
  – multiple methods
  – reduced bias
  – full picture of SDI performance
  – serves multiple purposes of assessment
SDI assessment results

• Framework applied in 21 countries
• Data collected by means of a survey
• Survey filled in by SDI coordinators (on behalf of the authorized SDI institution)
• 4 SDI assessment approaches were used: Clearinghouse suitability, SDI readiness, INSPIRE State of Play, Organizational
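Mechanically, each country ends up with one score per assessment approach, and the overall result reported later is their unweighted mean. A minimal Python sketch of that bookkeeping (the class and field names are illustrative, not part of the framework; scores in percent, taken from the December 2007 charts below):

from dataclasses import dataclass, field

@dataclass
class CountryAssessment:
    country: str
    scores: dict[str, float] = field(default_factory=dict)  # approach -> score in %

    def average(self) -> float:
        # Unweighted mean over the approaches that were applied.
        return sum(self.scores.values()) / len(self.scores)

# December 2007 scores of the Netherlands, taken from the charts below:
nl = CountryAssessment("Netherlands", {
    "clearinghouse": 0, "readiness": 59,
    "state_of_play": 59, "organizational": 75,
})
print(round(nl.average()))  # 48, matching the reported Dutch average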
Clearinghouse suitability approach (in %)

Argentina 49, Brazil 0, Canada 100, Chile 50, Colombia 76, Cuba 60, Denmark 0, Ecuador 47, Guyana 0, Jamaica 46, Malaysia 76, Mexico 75, Nepal 49, Netherlands 0, Poland 36, Serbia 0, Spain 96, Turkey 0, Uruguay 52
Average per sample: 41
SDI readiness approach (in %)

Argentina 53, Brazil 56, Canada 64, Chile 59, Colombia 66, Cuba 53, Denmark 65, Ecuador 42, Guyana 41, Jamaica 58, Malaysia 39, Mexico 58, Nepal 32, Netherlands 59, Poland 48, Serbia 37, Spain 70, Turkey 37, Uruguay 55
Average per sample: 50
INSPIRE State of Play approach (in %)

Argentina 52, Brazil 50, Canada 74, Chile 50, Colombia 76, Cuba 59, Denmark 59, Ecuador 59, Guyana 27, Jamaica 65, Malaysia 44, Mexico 73, Nepal 55, Netherlands 59, Poland 39, Serbia 55, Spain 71, Turkey 32, Uruguay 52
Average per sample: 56
Organizational approach (in %)

Argentina 50, Brazil 75, Canada 100, Chile 75, Colombia 100, Cuba 75, Denmark 75, Ecuador 75, Guyana 50, Jamaica 100, Malaysia 50, Mexico 75, Nepal 50, Netherlands 75, Poland 50, Serbia 50, Spain 75, Turkey 50, Uruguay 50
Average per sample: 70
Average results in percentages from all assessment approaches (December 2007)

Canada 85, Colombia 79, Spain 78, Mexico 70, Jamaica 67, Cuba 62, Chile 58, Ecuador 56, average per sample 54, Malaysia 52, Uruguay 52, Argentina 51, Denmark 50, Netherlands 48, Nepal 46, Brazil 45, Poland 43, Serbia 35, Turkey 30, Guyana 29
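The ranking above follows directly from the four per-approach tables. A small sketch, using values copied from those tables (only three countries shown; the rest are analogous):

# Per-country scores in the order: clearinghouse, readiness,
# INSPIRE State of Play, organizational (December 2007 values).
scores = {
    "Canada":      (100, 64, 74, 100),
    "Netherlands": (0, 59, 59, 75),
    "Guyana":      (0, 41, 27, 50),
    # ... remaining sample countries omitted for brevity
}

ranking = sorted(scores.items(), key=lambda kv: -sum(kv[1]))
for country, vals in ranking:
    print(f"{country:12s} {sum(vals) / len(vals):5.1f}%")
# Canada 84.5%, Netherlands 48.2%, Guyana 29.5%
# (the slides round these to 85, 48 and 29)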
Assessment results

• One year later, another measurement of the Dutch SDI was performed
Clearinghouse suitability approach (in %), December 2007 vs October 2008

• Netherlands: 0 → 79
• Average per sample: 41 → 45
• All other sample countries unchanged
Differences in the Clearinghouse suitability approach

• National Georegister “almost” ready
• The Clearinghouse suitability indicators measured the georegister
• www.nationaalgeoregister.nl
SDI readiness approach (in %), December 2007 vs October 2008

• Netherlands: 59 → 71
• Average per sample: 50 → 51
• All other sample countries unchanged
Differences in SDI readiness approach results

Indicator                            Dec 2007   Oct 2008
Politician vision regarding SDI        0.35       0.8
Institutional leadership               0.5        0.65
Umbrella legal agreement(s)            0.65       0.8
Digital cartography availability       0.65       0.8
Metadata availability                  0.5        0.5
Human capital                          0.99       0.99
SDI culture                            0.65       0.8
Individual leadership                  0.65       0.65
Web connectivity                       0.73       0.73
Telecommunication infrastructure       0.68       0.68
Geospatial software availability       0.65       0.8
Own geoinformatics development         0.65       0.8
Open source culture                    0.35       0.5
Government central funding             0.5        0.65
Return on investment                   0.35       0.65
Private sector activity                0.5        0.5
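The readiness indicators are scored on a 0–1 scale. As a rough cross-check, a plain unweighted mean of the sixteen values above already lands close to the reported Dutch readiness scores of 59% (Dec 2007) and 71% (Oct 2008); this is only a sketch, since the published SDI readiness index combines the indicators with weights rather than a plain mean:

# Indicator values from the table above (same row order).
dec_2007 = [0.35, 0.5, 0.65, 0.65, 0.5, 0.99, 0.65, 0.65,
            0.73, 0.68, 0.65, 0.65, 0.35, 0.5, 0.35, 0.5]
oct_2008 = [0.8, 0.65, 0.8, 0.8, 0.5, 0.99, 0.8, 0.65,
            0.73, 0.68, 0.8, 0.8, 0.5, 0.65, 0.65, 0.5]

for label, values in (("Dec 2007", dec_2007), ("Oct 2008", oct_2008)):
    print(f"{label}: {100 * sum(values) / len(values):.1f}%")
# Dec 2007: 58.4%   Oct 2008: 70.6%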
INSPIRE State of Play approach (in %), December 2007 vs October 2008

• Netherlands: 59 → 71
• Average per sample: 56 → 57
• All other sample countries unchanged
INSPIRE State of Play approach – changed indicators (Dec 2007 → Oct 2008)

• The SDI initiative has a long-term and clear vision about the national SDI: 0.5 → 1
• There are documented data quality control procedures applied at the level of the national SDI: 0 → 0.5
• Concern for interoperability goes beyond conversion between data formats (e.g. hardware/software/data definitions): 0.5 → 1
• One or more standardized metadata catalogues are available covering more than one data-producing agency: 0.5 → 1
• One national on-line access service for metadata (clearinghouse) is available, providing metadata of more than one data-producing agency: 0 → 0.5
• There are one or more web mapping services available for core spatial data: 0.5 → 1
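State of Play indicators are scored 0, 0.5 or 1, so each of the six changes above contributes half a point. A tiny sketch of that arithmetic (indicator labels abbreviated; this is not the complete Dutch indicator set):

# (before, after) scores for the six changed indicators above.
changed = {
    "clear long-term national SDI vision":        (0.5, 1.0),
    "documented data quality control procedures": (0.0, 0.5),
    "interoperability beyond format conversion":  (0.5, 1.0),
    "standardized metadata catalogues":           (0.5, 1.0),
    "national metadata clearinghouse":            (0.0, 0.5),
    "web mapping services for core data":         (0.5, 1.0),
}

gain = sum(after - before for before, after in changed.values())
print(f"total gain: {gain:.1f} indicator points")  # total gain: 3.0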
Organizational approach (in %), December 2007 vs October 2008

• No changes between the two measurements: the Netherlands remains at 75 and the sample average at 70
Average results in percentages from all assessment approaches (October 2008)

Canada 85, Colombia 79, Spain 78, Netherlands 74, Mexico 70, Jamaica 67, Cuba 62, Chile 58, Ecuador 56, average per sample 54, Malaysia 52, Uruguay 52, Argentina 51, Denmark 50, Nepal 46, Brazil 45, Poland 43, Serbia 35, Turkey 30, Guyana 29

The Netherlands climbs from 14th place (48%) to 4th place (74%).
Conclusions

• The Dutch SDI scored rather low in comparison with the sample countries (Dec 2007) – lack of a clearinghouse
• but…
• the Dutch GII is on the right track of development
• Much has changed in one year: Georegister, GIDEON
• A specific assessment of the Dutch GII would complement the general multi-view assessment
Conclusions

• The multi-view framework is general and therefore allows comparison with other NSDIs
• But it is not specific to the GIDEON goals:
  – need to measure the realization of the GIDEON goals
  – which indicators would best reflect the realization of the GIDEON goals?
Assessing the Dutch SDI

• Embedding geo-information in e-government
• Key geo-registers
• Implementing INSPIRE
• Supply optimization
• Cooperation
• Creating added value
• Knowledge, innovation and education
• Organization, steering, directing
Assessing GIDEON – the Dutch GII

• The Ministry of Infrastructure and the Environment (I&M) asked for yearly monitoring and reporting of GIDEON implementation progress
• Three types of GIDEON assessment were distinguished
Phase 2 assessment approach

Group 1:
• Indicator 1. The number of visitors of the Dutch national georegister
• Indicator 2. Availability of datasets and services
• Indicator 3. The use of view and download services

Group 2:
• Indicator 1. General governmental policy terms for the (re)use of geographical information
• Indicator 2. The percentage of datasets from GIDEON annex 1 that are available without any restrictions
• Indicator 3. Yearly turnover of the geo-information business in the Netherlands

Group 3:
• Indicator 1. The level of cooperation within the 5 chains of GIDEON
• Indicator 2. The use of geo-information within e-government processes

Group 4:
• Indicator 1. The number of geo-information events
• Indicator 2. The number of unfulfilled vacancies in the geo-sector
• Indicator 3. Expenditure of the Geo-ICT sector in the Netherlands on research and development
• Indicator 4. Expenditure of the research sector on R&D
• Indicator 1.1: Average number of NGR visitors per day (see the sketch below); hotspots of returning NGR users
• Indicator 1.2: Availability of datasets and services
• Indicator 2.1: General government policy for the (re)use of geo-information
• Indicator 2.2: The percentage of datasets available without use restrictions
• Indicator 2.3: Yearly turnover of the geo-business sector in the Netherlands
• Indicator 4.1: The number of geo-events in the Netherlands per year
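Indicator 1.1 can be measured directly from access logs. A hypothetical sketch (the record format is an assumption; the slides do not describe the actual NGR measurement pipeline):

from collections import defaultdict
from datetime import date

# Hypothetical access-log records: (day, visitor id).
records = [
    (date(2011, 3, 1), "u1"), (date(2011, 3, 1), "u2"),
    (date(2011, 3, 2), "u1"), (date(2011, 3, 3), "u3"),
]

visitors_per_day = defaultdict(set)
for day, visitor in records:
    visitors_per_day[day].add(visitor)  # count each visitor once per day

avg = sum(len(v) for v in visitors_per_day.values()) / len(visitors_per_day)
print(f"average visitors of the NGR per day: {avg:.2f}")  # (2+1+1)/3 = 1.33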
Lessons

• The Dutch GII goals are imprecisely defined
• It is therefore difficult to select assessment indicators
• Not only progress/goals assessment but also quality assessment is needed
Conclusions

• Assessment is often an integral part of an SDI
• Difficulties arise due to SDI complexity
• Assessment helps to monitor the implementation and achievement of goals
Questions?