
ND Community Call
Data Dashboards: Part 2
February 19, 2013
Dashboards vs. Report Cards
 What’s a dashboard?
 A navigation system that graphically represents current program performance at a glance, highlighting key areas of strength and weakness, and predicts or forewarns when programs are not on track to meet performance goals
 Supports decision-making
Essential Steps
1. Define program priorities
2. Explore existing data
3. Map current and potential data sources
4. Select performance indicators
5. Set performance targets and threshold criteria
6. Conceptually group indicators
7. Design the dashboard interface
8. Develop the dashboard
9. Implement the dashboard
Step 4: Select Performance Indicators
 Things to Consider:
 Good dashboards need good data; good data is:
 accessible
 clean
 timely
 comprehensible
 actionable
 Types of indicators (e.g., inputs, outputs, leading, lagging, student level, teacher level, classroom level, school/facility level, district level)
 The inclusion of leading indicators that correlate with
lagging indicators
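As a quick aside on the "good data" criteria above: several of them (clean, timely) can be screened programmatically before an indicator is added to the dashboard. The sketch below is a minimal, illustrative example in Python with pandas; the file name, column names, and cutoff values are hypothetical placeholders, not part of the presentation.

```python
import pandas as pd

# Hypothetical indicator extract: one row per facility per month,
# with a "report_date" column and one column per candidate indicator.
df = pd.read_csv("indicator_extract.csv", parse_dates=["report_date"])

candidate_indicators = ["course_completion_rate", "cte_certificates_earned"]

for col in candidate_indicators:
    missing_pct = df[col].isna().mean() * 100                    # "clean": how much is missing?
    last_update = df.loc[df[col].notna(), "report_date"].max()   # "timely": how fresh is it?
    age_days = (pd.Timestamp.today() - last_update).days
    print(f"{col}: {missing_pct:.1f}% missing, last updated {age_days} days ago")
    if missing_pct > 10 or age_days > 90:                        # thresholds are illustrative only
        print(f"  -> {col} may not yet be dashboard-ready")
```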
Leading and Lagging Indicators
Leading Indicators are outputs and short-term outcomes:
 Demonstrate signs of growth or change in a given direction, suggesting early wins and areas of improvement
 Provide an early read on progress towards long-term outcomes
 Measure conditions that are prerequisite to the desired outcomes (i.e., predict lagging indicators)
Lagging Indicators are long-term or desired outcomes:
 Measure the success and consequences of activities that have already occurred
 Measure achievement of the desired outcomes
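One way to test whether a candidate leading indicator actually predicts a lagging one is to correlate earlier values of the leading indicator with later values of the lagging indicator. The sketch below assumes a hypothetical facility-by-quarter extract; the column names and the one-quarter lag are illustrative only.

```python
import pandas as pd

# Hypothetical data: one row per facility per quarter.
df = pd.read_csv("facility_quarterly.csv")

# Shift the leading indicator forward one quarter within each facility,
# so each row pairs last quarter's leading value with this quarter's lagging value.
df = df.sort_values(["facility_id", "quarter"])
df["course_completion_rate_prev"] = (
    df.groupby("facility_id")["course_completion_rate"].shift(1)
)

corr = df["course_completion_rate_prev"].corr(df["cte_certificates_earned"])
print(f"Correlation of prior-quarter course completion with CTE certificates earned: {corr:.2f}")
```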
Leading and Lagging Indicators
On the next slide, identify the leading indicators and their corresponding lagging indicators.
Leading and Lagging Indicators
 Teacher turnover rate
 Number of youth who earn a CTE certificate
 Course completion rate
 Number of youth who begin a technical trade while in aftercare
 Number of disciplinary incidents
 Hours of professional development
Step 4: Select Performance Indicators
What kind of indicator is each of the following, and why? Any caveats?
 Graduation rate
 Enrollment rate
 GED enrollment rate
 Number of CTE certificates awarded
 Number of CTE certificates earned
 Recidivism rate
 Types of CTE courses offered
 Number of CTE courses offered
 Per pupil spending
 Number of youth served
 Percentage of HQT by FTE
 Bed count
 High school transcript
 Average SAT/ACT score
 Course completion rate
Step 5: Set Performance Targets and Threshold Criteria
 Things to Consider:
 In terms of your priorities, where do you want your
subgrantees and facilities to be in one year? Two years?
Three years?
 What performance benchmarks might you set to measure
their progress along the way?
 How will you know when to target a subgrantee or facility
for technical assistance? At what point might you sound the
alarm?
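The "when to sound the alarm" question lends itself to explicit threshold rules that flag subgrantees or facilities for technical assistance. A minimal sketch follows; the indicator, target, and alert values are placeholders to be replaced with the targets you set in this step.

```python
import pandas as pd

# Hypothetical current-year results, one row per subgrantee.
df = pd.read_csv("subgrantee_results.csv")

TARGET = 0.80   # example target: 80% course completion
ALERT = 0.60    # example threshold: below 60% triggers technical assistance

def status(rate: float) -> str:
    """Translate a rate into a simple traffic-light status."""
    if rate >= TARGET:
        return "on track"
    if rate >= ALERT:
        return "watch"
    return "needs technical assistance"

df["status"] = df["course_completion_rate"].apply(status)
print(df[["subgrantee", "course_completion_rate", "status"]]
      .sort_values("course_completion_rate"))
```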
Step 6: Conceptually Group Indicators
 Things to Consider:
 How might you categorize your selected indicators in a way
that makes it easier for you to identify subgrantees/facilities
that are not meeting your performance targets?
 Demographics?
 Outcomes (academic vs. transition vs. behavioral)?
 Facility features and characteristics?
 Staffing?
 Priorities?
 Common administrative challenges?
 Common program implementation problems?
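A conceptual grouping can be captured as a simple indicator-to-category mapping that later drives the dashboard layout. The sketch below uses some of the categories suggested above; the assignments are one illustrative possibility, not the presenters' answer key.

```python
# Illustrative grouping using categories from this slide; adjust to your priorities.
INDICATOR_GROUPS = {
    "Outcomes: academic": ["Graduation rate", "Course completion rate", "Average SAT/ACT score"],
    "Outcomes: transition": ["Number of CTE certificates earned", "Recidivism rate"],
    "Facility features and characteristics": ["Bed count", "Number of CTE courses offered",
                                              "Types of CTE courses offered"],
    "Staffing": ["Percentage of HQT by FTE"],
}

# Reverse lookup: which dashboard panel does a given indicator belong on?
panel_for = {ind: group for group, inds in INDICATOR_GROUPS.items() for ind in inds}
print(panel_for["Recidivism rate"])   # -> "Outcomes: transition"
```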
Step 6: Conceptually Group Indicators
How might you group the following indicators?
 Graduation rate
 Enrollment rate
 GED enrollment rate
 Number of CTE certificates awarded
 Types of CTE courses offered
 Number of CTE courses offered
 Per pupil spending
 Number of CTE certificates earned
 Number of youth served
 Recidivism rate
 Bed count
 Course completion rate
 High school transcript
 Average SAT/ACT score
 Percentage of HQT by FTE
Step 7: Design the Dashboard Interface
 Things to Consider:
 The KISS (keep it simple, Sally) principle applies
 Display high-level information that the user can
understand
 No extraneous or irrelevant details
 No meaningless color coding, variety, or decorative
elements
 Data without a context is trivia: What data are essential to
tell the story visually (i.e., without narration or analysis)?
Step 7: Design the Dashboard Interface
 Things to Consider:
 Choose the right display
 Tabular (spreadsheet), graphical, or some combination?
 Bar chart, pie chart, gauge, map or time series graph?
 Highlight important data at a glance
 Emphasize important data by its position on the
dashboard
 Emphasize important data by visual attributes like color
intensity, size, line width
 All dashboard data should be visible on a single screen
without scrolling
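The display guidance above (a single screen, emphasis through position and visual attributes, no decorative clutter) can be prototyped quickly before committing to a tool. Below is a minimal matplotlib sketch using the hypothetical subgrantee file from the Step 5 example; emphasis comes from color intensity and a target line rather than decoration.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical current results, one row per subgrantee.
df = pd.read_csv("subgrantee_results.csv").sort_values("course_completion_rate")

TARGET = 0.80  # example target carried over from Step 5

# Muted gray at or above target, saturated red below it:
# the important rows stand out by color intensity, not decoration.
colors = ["#b0b0b0" if r >= TARGET else "#c0392b" for r in df["course_completion_rate"]]

fig, ax = plt.subplots(figsize=(8, 4))   # sized to fit one screen without scrolling
ax.barh(df["subgrantee"], df["course_completion_rate"], color=colors)
ax.axvline(TARGET, linestyle="--", linewidth=1, color="black")
ax.set_xlabel("Course completion rate")
ax.set_title("Course completion vs. target")
fig.tight_layout()
plt.show()
```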
Dashboard Interfaces
 For Discussion:
 What do you like and not like about these dashboard interfaces?
Next Steps
 Homework:
 One-on-one follow-up call to discuss homework and finalize indicators, threshold criteria, and conceptual groupings
 Collect and submit sample data (scrubbed of any personally
identifiable information) associated with these indicators
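Before submitting sample data, personally identifiable information can be stripped or masked programmatically. The sketch below is illustrative only; the column names are hypothetical, and hashing an ID is not full de-identification, so follow your agency's data-sharing rules.

```python
import hashlib
import pandas as pd

df = pd.read_csv("sample_indicator_data.csv")   # hypothetical raw extract

# Drop direct identifiers outright (column names are examples).
df = df.drop(columns=["student_name", "date_of_birth", "ssn"], errors="ignore")

# Replace the student ID with a one-way hash so records can still be linked
# across files without exposing the original ID.
if "student_id" in df.columns:
    df["student_id"] = df["student_id"].astype(str).map(
        lambda s: hashlib.sha256(s.encode()).hexdigest()[:12]
    )

df.to_csv("sample_indicator_data_scrubbed.csv", index=False)
```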
Next Steps
 For Discussion:
 What resources do you have available to support the
development and implementation of a dashboard?
 Human?
 Financial?
 Technical/technological?
 For our next call, would a hands-on tutorial on Excel and/or
another decision support tool like Tableau be helpful?