Outcomes Assessment for Dietetics Educators

Outcomes Assessment
for
Dietetics Educators
Carolyn J. Haessig, PhD, RD
Armand S. La Potin, PhD
Revised Edition
Commission on Accreditation for Dietetics Education
American Dietetic Association
© 2002, The American Dietetic Association. All rights reserved. No part of this publication may be
reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior
written consent of the publisher. Printed in the United States of America.
The views expressed in this publication are those of the authors and do not necessarily reflect the policies and/or
official positions of the Commission on Accreditation for Dietetics Education of the American Dietetic Association.
The American Dietetic Association disclaims responsibility for the application of the information contained herein.
CONTENTS
Introduction to Outcomes Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
Stage 1: Setting Program Goals, Student Learning Outcomes, and Outcome Measures . . . . . . . . . .6
Stage 2: Developing a Programmatic Assessment Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .12
Stage 3: Gathering Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .17
Stage 4: Selecting Assessment Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .19
Stage 5: Establishing a Timeline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .25
Stage 6: Closing the Loop—Analyzing, Utilizing, and Reporting Assessment Findings . . . . . . . . .28
Appendixes
A. Primary Trait Analysis Scales—A Variation on the Theme . . . . . . . . . . . . . . . . . . . . . . . . .32
B. Programmatic Assessment Plan—Sample Program Goals to Be Assessed, Years 1-5 . . . . . .35
C. Programmatic Assessment Plan—Sample Learning Outcomes for Students in a DPD . . . . .38
D. Programmatic Assessment Plan—Sample Learning Outcomes for Students in a DI . . . . . .41
E. Programmatic Assessment Plan—Sample Learning Outcomes for Students in a CP . . . . . .44
F. Programmatic Assessment Plan—Sample Learning Outcomes for Students in a DT . . . . . .45
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .48
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .49
INTRODUCTION TO OUTCOMES ASSESSMENT
This handbook, Outcomes Assessment for Dietetics Educators, is designed to assist in planning,
conducting, analyzing, and reporting systematic and comprehensive assessment of program goals and
student learning outcomes. The Commission on Accreditation for Dietetics Education (CADE) of the
American Dietetic Association (ADA) mandates that dietetics education programs address continuous
program improvement in order to earn program accreditation. In December 2001, CADE adopted the
2002 Eligibility Requirements and Accreditation Standards, effective for all programs seeking or
maintaining accreditation in 2003 and beyond. As a result, this handbook was revised in 2002 to
reflect the current thinking and vocabulary of program accreditation.
OUTCOMES ASSESSMENT—WHAT’S IT FOR?
The expectation is that leaders can achieve program effectiveness and continuous improvement by
linking mission, goals, curriculum, outcomes, and evaluation in a cyclical fashion as noted in the CADE
Accreditation Handbook (1, p6). As stated in Standard One, "The dietetics education program has
clearly defined a mission, goals, program outcomes, and assessment measures and implements a
systematic, continuous process to assess outcomes, evaluate goal achievement, and improve program
effectiveness" (1, p18). Standard Two requires the dietetics education program to have "a planned
curriculum that provides for achievement of student learning outcomes and expected competence of the
graduate" and "is consistent with the mission, goals, and measurable outcomes for the program" (1,
p21). In addition, dietetics educators are expected to demonstrate "periodic evaluation of the curriculum
objectives, content, length, and educational methods, to improve educational quality" (1, p22). Periodic
evaluation should include a comparison of the current curriculum with new knowledge and technology
impacting dietetics practice.
This handbook will provide dietetics educators with techniques to articulate goals and student
learning outcomes, to formulate outcome measures, and to design, implement, and use a programmatic
assessment plan (PAP). The handbook also includes information to assist dietetics educators in applying
assessment findings and reporting assessment results. It includes charts to illustrate some of the key
assessment principles and provides numerous examples to benefit readers in accomplishing outcomes
assessment. A glossary and an annotated bibliography of assessment references are also included.
OUTCOMES ASSESSMENT—WHAT IS IT?
The task of conducting assessment is initially challenging because participants may have different
interpretations of what is intended when outcomes assessment is proposed. For students, assessment
may mean getting a certain grade, satisfying the expectations for a practicum, or passing a registration
examination. For program directors and senior administrators, the results of assessment may mean the
difference between expanding or abolishing professional academic majors and minors.
In 1992, the Assessment Forum, a group of “practitioner-students of assessment” brought together
by the American Association for Higher Education (AAHE), published “9 Principles of Good Practice
for Assessing Student Learning.” These principles are (2, pp2-3):
1. The assessment of student learning begins with educational values.
2. Assessment is most effective when it reflects an understanding of learning as
multidimensional, integrated, and revealed in performance over time.
3. Assessment works best when the programs it seeks to improve have clear, explicitly stated
purposes.
4. Assessment requires attention to outcomes but also and equally to the experiences that lead
to those outcomes.
5. Assessment works best when it is ongoing, not episodic.
6. Assessment fosters wider improvement when representatives from across the educational
community are involved.
7. Assessment makes a difference when it begins with issues of use and illuminates questions
that people really care about.
8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions
that promote change.
9. Through assessment, educators meet responsibilities to students and to the public.
From these nine principles, outcomes assessment has evolved into, and is defined for the purposes
of this handbook as, a comprehensive process for evaluating and, if needed, improving programs and
student learning.
OUTCOMES ASSESSMENT—WHAT PURPOSES DOES IT SERVE?
Improving Teaching and Learning
Outcomes assessment in dietetics education serves several purposes. First, and most important, it
provides information about what students have learned, what skills and competencies they have
developed, and what values they have acquired as a result of participating in their dietetics education
program. This information is the basis for continuously improving teaching and learning and,
consequently, dietetics education programs.
Additionally, students enrolled in a dietetics education program in which outcomes are assessed on
an ongoing basis benefit from participating in assessment activities. Outcomes assessment can enhance
student learning by giving students defined expectations and thus a clearer understanding of what is
expected of them in the learning process. This can assist students in understanding how the subject
matter in different courses is interrelated with varied practicums as well as how a comprehensive
educational program incorporates all learning experiences. Equally significant, when a faculty fully
integrates in its dietetics education program the knowledge, skills, and competencies for dietetic
technicians or dietitians (what is referred to in this handbook as “student learning outcomes”), students
know that they are being prepared in accord with the latest professional standards.
Outcomes assessment assists faculty by providing evidence of the knowledge, skills, and values
students have acquired. Faculty can use this critical information either to affirm that their content,
sequencing of experiences, and teaching methods have the desired impact on learning or to suggest
changes that can enhance learning. Unless program directors are able to demonstrate that they are using
results of outcomes assessment in this way, assessment falls short of its primary purpose.
Those who work with students also benefit as a result of applying assessment processes to teaching
and learning. Specifically, because outcomes assessment in dietetics education should be collaborative,
it can enhance partnerships among college personnel, practicum preceptors, and others with a stake in
dietetics education. In general, each gains a clearer idea of how others contribute as an integral
component in the program. For example, preceptors can provide faculty with a perspective of the
unique practicum environment so that course-based learning will prepare students to function in and
learn from practicums. Faculty can assist preceptors to understand what they can expect students to
have mastered before the practicum. Equally important, through mutual cooperation in assessing
outcomes, program directors are assured of high-quality practicum placements for their students, and
practicum preceptors can accept students knowing that they are prepared to meet their high standards of
performance.
Ultimately, clients benefit when they receive care from students and professionals who are part of
dietetics education programs that undergo continuous assessment. For example, hospital patients,
recipients of community-based services, and customers served by food service professionals have an
added level of assurance that their nutritional care is of high quality and that meals are nutritionally
balanced, aesthetically pleasing, and prepared under the highest standards of cleanliness and sanitation.
Achieving Accountability
Administrators of dietetics education programs experience simultaneous pressure to contain costs
and to maintain quality. Both are expectations of accountability. For dietetics education, three of the
most frequently asked questions with regard to accountability are:
• How well are students learning what dietetics educators claim they are learning?
• Are dietetics education programs being administered efficiently?
• Does the program effectively meet the needs of the community in educating dietetics professionals?
How well are students learning what educators claim they are learning? The assessment of student
learning for improvement of teaching and learning may also address the question of accountability.
Accrediting agencies and administrators want information that documents over time the extent of growth
in knowledge, skills, and values that students demonstrate as a result of participating in or completing
the program. Thus, the information collected to improve teaching and learning is also vital in
documenting the program's success in meeting broad program goals such as “The Program will prepare
graduates to be competent entry-level dietitians or dietetic technicians.”
Are programs being administered efficiently? Administrators and the taxpaying public are increasingly
concerned with how institutions for health care and higher education utilize money and other resources,
and both state and federal agencies are holding institutions more accountable as a condition for receiving
funds. For example, the Student Right-to-Know and Campus Security Act mandates the reporting of
rates of student transfers and the number that graduate. Federal regulations for student financial aid
stipulate the length of time that a student may receive various types of financial aid, and amendments to
the Higher Education Act of 1965 require performance measurement reports from programs receiving
grants from the Vocational and Applied Technological Education Act. The addition of these
requirements and others such as compliance with affirmative action and guidelines for human-subjects
research reflects growing concerns for accountability. Practitioners must increasingly confront the
reality that “occupancy rates” and staff-to-client ratios determine staffing levels. Likewise, faculty must
balance the need to provide relatively small laboratory or practicum classes with the expectation that
they achieve a specific faculty-to-student ratio. The use of full-time equivalents (FTEs) or other similar
measures to establish faculty and staff levels is an attempt to maintain efficiency.
Does the program effectively meet the needs of the community? Community needs are an important
consideration with regard to accountability. Dietetics educators are to consider “the need for the
program within the community” as they establish program goals (1, p19). Leadership in this area is
provided by CADE, which regularly reviews and revises the curriculum requirements based on the
knowledge, skills, and competencies for dietetic technicians and dietitians. The Commission also
provides a mechanism for doing this: the selection of a program emphasis for educators preparing
registered dietitians to address specific needs in their state and region.
OUTCOMES ASSESSMENT—HOW IS IT ACCOMPLISHED?
Assessment of outcomes, whether undertaken for improvement of teaching and learning or
accountability, or both, is part of a cycle whose stages are undertaken in sequential order. Dietetics
educators should integrate their assessment plan into the planning processes of the program and the
institution. The stages in outcomes assessment, which will be discussed individually in this handbook,
include:
Stage 1. Set program goals and student learning outcomes and formulate outcome measures.
Stage 2. Develop a PAP.
Stage 3. Identify which data are needed.
Stage 4. Select methods to collect data for assessment.
Stage 5. Establish a timeline for assessment activities.
Stage 6. Devise a system for analyzing, utilizing, and reporting assessment results.
The first stage demonstrates how to articulate broad program goals and student learning outcomes
and how to formulate outcome measures for both goals and student learning. These outcome measures
are what make the goals and learning outcomes measurable and thus assessable. The development and
importance of connections to institutional mission are identified in this stage as well.
The second stage in outcomes assessment is the development of a PAP. This plan describes what
program directors will document, specifies what activities to include in the assessment of program goals
and student learning outcomes, and identifies how, by whom, and when the necessary activities will be
accomplished. In addition, because dietetics programs are part of larger institutions, the program
director may be expected either to use the assessment plan of the organization as a whole or to integrate
the program's assessment activities into the organization's overall plan.
The third stage is to identify which data are needed and which may be already available. This
stage also includes a discussion of whose performance should be documented and assessed and whose
perceptions of the dietetics education program are important.
The fourth stage is to select the best assessment methods for determining whether the program is
achieving its goals and student learning outcomes and to determine who will collect and record needed
assessment data. Examples as well as guidelines for making these choices will be outlined. An
objective technique, commonly called primary trait analysis, will be introduced and may be used as an
optional method in outcomes assessment.
The fifth stage addresses the establishment of a timeline for assessment activities. The timeline
should include consideration of the four Cs: collaboration, coordination, cycles of assessment, and
constructive feedback. The section on constructive feedback includes definitions of formative and
summative assessment and relevant considerations with regard to their application.
The sixth stage includes a discussion of how dietetics educators might analyze, use, and report
assessment results to improve teaching and learning, and/or to address accountability issues such as
program effectiveness and efficiency.
Although this handbook provides the basic guidelines for conducting outcomes assessment,
dietetics educators who undertake the process are likely to discover that conducting assessment can be
an emotionally charged process. Some participants may construe the results as indicative that their work
is inferior, inappropriate, or somehow inadequate. However, assessment is not premised on faultfinding
but rather on the desire to continue to improve dietetics education programs. Educators undertake
outcomes assessment because they must answer the questions, “How do we know that students are
learning what we think they are?” and “How do we know that our program is meeting community
needs?” And, while there are certainly judgments to be made throughout the assessment process, the
determinations that educators make about goals and student learning outcomes are typically about
dietetics education programs rather than individual faculty, staff, or students. Regardless of one's
individual initial expectations, participation in outcomes assessment is important to all who have a
vested interest in dietetics education.
References
1. Commission on Accreditation for Dietetics Education. CADE Accreditation Handbook. Chicago, Ill:
American Dietetic Association; 2002.
2. American Association for Higher Education Assessment Forum (AAHE). 9 Principles of Good
Practice for Assessing Student Learning. Washington, DC: AAHE; 1992.
STAGE 1: SETTING PROGRAM GOALS, STUDENT LEARNING OUTCOMES,
AND OUTCOME MEASURES
The Commission on Accreditation for Dietetics Education (CADE) requires that dietetics
education program directors report assessment procedures and outcomes. As noted in the Accreditation
Standards, CADE requires the program to establish "outcomes and appropriate measures to assess
achievement of goals and program effectiveness" (1, p18) and "that its planning and evaluation process
includes evidence that data are collected and analyzed to identify the extent that goals for the program
are being achieved and feedback is incorporated to improve the program" (1, p18). Additionally, CADE
requires all dietetics education programs to have student learning outcomes and to implement "a process
to assess student progress toward achievement of student learning outcomes using a variety of methods
during and at the conclusion of the program" (1, p22). Thus, dietetics educators are expected to assess
program goals and student learning outcomes, including, but not limited to, the foundation
knowledge, skills, and/or competencies that CADE has determined are necessary for entry-level dietetic
technicians and dietitians.1
Articulation of these goals and learning outcomes is the logical starting point for the assessment
process and provides the basis for constructing the additional steps needed for their assessment.
Appropriate outcome measures and, eventually, sound assessment instruments must then be developed
from the goals and learning outcomes if the assessment process is to yield meaningful results. It is also
important to connect program goals and learning outcomes with the mission of the institution. This, too,
can facilitate the assessment process.
WHAT ARE PROGRAM GOALS AND HOW DO I ARTICULATE THEM?
A program goal is “an assessable statement of purpose or intent toward which effort is directed and
that serves as the basis of the program” (1, p74). Also, as noted in the third principle of the American
Association for Higher Education (AAHE), “Assessment works best when the programs it seeks to
improve have clear, explicitly stated purposes.” (2) Program goals help clarify the kind of program that
has been established and its direction. For example, some dietetics education programs are committed
to preparing professionals for a specific state or region, some to preparing individuals with a broad
foundation in liberal arts, and others to fostering leadership skills or service to the community. Program
goals also may address such attributes as personal development or discipline-specific career preparation.
Program goals should be broad, inclusive, realistic, and achievable.
Consider the following example: “The Program will prepare graduates to be competent entry-level
dietitians or dietetic technicians.” This goal indicates that the program will provide the essential
knowledge, skills, and values necessary to enable graduates to function as dietetics professionals.
Another goal statement that addresses the teaching and learning environment can be articulated as
follows: “Through encouragement, motivation and support, program faculty and staff will increase the
number of students who complete their dietetics program of study.” This goal statement specifically
addresses retention in the training of dietetics professionals.
Then, too, a program goal may be one that articulates the prudent use of the program's resources or
accountability. Such a program goal might be stated as: “The program's collective resources will be
used effectively and efficiently.”
_______________________________________
1For all dietetics education programs, the self-study process, as described in the CADE Accreditation Handbook, is
expected to include Examples of Evidence for Program Planning and Outcomes Assessment (Standard One), Curriculum
and Student Learning Outcomes (Standard Two), and Program Management (Standard Three). In some of the assessment
literature, these categories are integrated within, rather than addressed separately from, programmatic assessment.
A final example is: “The Program will prepare graduates to demonstrate a commitment to
community service.” This goal statement might reflect a founding premise or an essential mission of the
sponsoring educational institution. Although all of these goals are different, each is broad and
encompassing and addresses in part the objectives of the program.
WHAT ARE OUTCOME MEASURES AND HOW ARE THEY TIED TO GOALS?
All dietetics education programs must have outcome measures, defined by CADE as “standards for
determining a program's ability to meet its stated goals and the extent to which the program meets that
standard; measures of the end result or change” (1, p75). It is important to note that while program
goals are broad statements which capture program objectives that educators desire to assess, it is
outcome measures that make those goals quantifiable and hence assessable.
To formulate outcome measures from program goals, dietetics educators need to ask, “What
quantifiable criteria can be used in determining whether the goal is being met?” Let’s look again at the
previous example. “The program will prepare graduates to be competent entry-level dietitians or dietetic
technicians” addresses a basic and legitimate program goal, but assessment requires that specific
measurable criteria must be developed. One might ask, “What quantifiable criteria can be used in
determining whether entry-level dietitians or dietetic technicians are competent?” As with many
professional organizations, the American Dietetic Association (ADA) has established the Commission on
Dietetic Registration (CDR) to administer national examinations to credential graduates of professional
programs. Consequently, results of the registration examinations for dietetic technician and dietitian
(DTR and RD) can be used as an outcome measure in assessing the goal of competency. Specifically, an
average pass rate of at least 80% of graduates over a 5-year period is used as a benchmark by CADE
and can be used as an outcome measure for this goal. Thus, plausible outcome measures for this goal
might be expressed as follows:
• Alumni achieve over a 5-year period a pass rate of at least 80% on the DTR or RD exam.
• Within 12 months of completing the program, at least 60% of graduates will have passed the
DTR or RD exam, obtained employment related to their major, or enrolled in an accredited
continuing education program.
The second outcome measure reflects the faculty’s definition of “competent” as passing the DTR or RD
examination, securing employment, or gaining admission to a more advanced education program. A
more general outcome measure might also include the requirement that graduates meet all of the
competencies mandated by CADE within that program. This could be expressed specifically as follows:
Students achieve a rating of “yes” for “meets competency consistently” for all of the entry-level
competencies.
Other measurable criteria for this program goal could be formulated from the views of employers
who hire the program’s graduates. Consequently, a quantifiable outcome measure may be expressed as:
When surveyed, the mean rating for “knowledge base” that employers will give program graduates in
their employ will meet or exceed the rating of “3” or “satisfactory.”
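The arithmetic behind a benchmark such as the 5-year pass rate is simple but worth making explicit: the aggregate rate is total graduates passing divided by total graduates tested, compared against the 80% standard. The following minimal sketch illustrates this, using entirely hypothetical cohort figures:

```python
# Sketch: checking a CADE-style benchmark of an average pass rate of at
# least 80% over a 5-year period. All cohort figures are hypothetical.

def five_year_pass_rate(cohorts):
    """Aggregate pass rate across cohorts: total passers / total test takers."""
    passed = sum(p for p, _ in cohorts)
    took = sum(t for _, t in cohorts)
    return passed / took

# (passed, took) per year for years 1-5 -- illustrative numbers only
cohorts = [(9, 10), (7, 10), (8, 10), (8, 9), (10, 11)]

rate = five_year_pass_rate(cohorts)
print(f"5-year pass rate: {rate:.1%}")       # 42/50 = 84.0%
print("Meets 80% benchmark:", rate >= 0.80)  # True
```

Note that aggregating all five cohorts before dividing, rather than averaging the five yearly rates, weights each graduate equally regardless of cohort size; a program should state which convention its outcome measure uses.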
Outcome measures also can be formulated from the program goal “Through encouragement,
motivation, and support, program faculty and staff will increase the number of students who complete
their dietetics program of study.” Once again, dietetics educators need to formulate the quantifiable
criteria to determine whether the faculty and staff attained the goal. One way to formulate an outcome
measure from this goal is to assign a realistic number to assess attainment of retention, such as:
“Eighty-five percent of the students who enter the program will complete it.”
Another way might be to quantify student perceptions of those initiatives that could enhance
retention. These could be expressed as follows:
• Students indicate on surveys “satisfactory” or better scores with respect to the
encouragement, motivation, and support provided by the program’s academic advisement
staff.
• Students indicate on surveys “satisfactory” or better scores with respect to the
encouragement, motivation, and support provided by the faculty and/or preceptors.
Yet a third method to formulate outcome measures is to address and quantify a retention initiative
directly, such as: “All students at risk of not completing the program are identified within the first 25%
of the program duration.”
Now let’s derive outcome measures from the program goal “The program’s collective resources
will be used effectively and efficiently.” As before, the quantifiable criteria derived from the goal are
critical here. The outcome measures can be formulated from the perception of stakeholders regarding
the administration of the program, such as:
• Students will indicate on surveys that they perceive that resources were utilized effectively
to support instruction.
• Institutional auditors will judge the program’s use of space and budgetary resources as
efficient according to commonly accepted guidelines.
Other outcome measures could quantify generally accepted indexes used in determining effectiveness
and efficiency, such as the following:
• Dietetics education staff shall maintain a student-to-educator ratio consistent with the norm
for the institution or agency.
• The program will update at least 3% of the resource and reference materials for faculty and
student use annually.
A final example illustrates this process. “The program will prepare graduates to demonstrate a
commitment to community service” requires dietetics educators to formulate quantifiable criteria in
determining achievement of the goal. In other words, how can one measure “a commitment to
community service”? One possible criterion might be the involvement of graduates in community
service. This can be expressed as: “Graduates will indicate on the alumni survey that they participate in
one or more community service activities.”
WHAT ARE LEARNING OUTCOMES AND HOW DO I ARTICULATE THEM?
Student learning outcomes are broad statements of what dietetics educators expect students to
know, understand, and be able to do and/or value as a result of completing a dietetics program. For
dietetics educators, these learning outcomes also would include broad statements that encompass the
Foundation Knowledge and Skills and/or the Competencies for entry-level dietetic technicians or
dietitians.
One approach to formulating these broad learning outcomes would be to use the categories that
CADE provides as focal points. For example, the categories for the Foundation Knowledge and Skills
for the Didactic Component are communications, physical and biological sciences, social sciences,
research, food, nutrition, management, and health care systems. Using the first of these, communications,
a program faculty might develop the following as a learning outcome: “Students will demonstrate the
ability to communicate effectively.” Similarly, in the area of management a dietetic technician program
might opt for an outcome such as: “Students will demonstrate their understanding of various
management concepts and functions.”
A similar approach could be used for creating student learning outcomes for the supervised
practice component of dietetics education. Although CADE does not identify categories for the core
competencies for dietitians or dietetic technicians, some of those provided for the didactic component,
such as communications, management, research, food, nutrition, and health care systems could be used
to group the competencies. For example, several of the core competencies could be represented by:
“Students will demonstrate the ability to communicate effectively.” Other core competencies are
addressed by this outcome: “Students will demonstrate their ability to use efficiently and effectively the
techniques and tools for managing foodservice systems.”
Alternately, a program might opt to use categories such as nutrition therapy, community nutrition,
foodservice systems management, and education to formulate learning outcomes. Another possibility
might be to develop learning outcomes around the roles that graduates are expected to fulfill. In this
case, learning outcomes might be formulated for critical thinker, provider of nutrition care, ethical
practitioner, competent manager, and committed professional.
Numerous approaches are available for articulating learning outcomes. Program faculty members
should construct their expectations for students in a manner that is meaningful for them.
LINKING OUTCOME MEASURES TO STUDENT LEARNING OUTCOMES
As previously noted, CADE requires all dietetics education programs to assess student learning
outcomes. As with program goals, the use of outcome measures makes learning quantifiable and hence
assessable. The Foundation Knowledge and Skills and the Competency Statements are learning
outcome measures. For example, if the expectation is that “students will demonstrate the ability to
communicate effectively,” some of the outcome measures for a dietetic internship might include:
CD6. Use current technologies for information and communication activities.
CD8. Provide dietetics education in supervised practice settings.
CD37. Coordinate and modify nutrition care activities among caregivers.
CD39. Refer patients/clients to appropriate community services for general health
and nutrition needs and to other primary care providers as appropriate.
Other examples of student learning outcomes and outcome measures appear in the appendixes at
the end of the book. Appendixes C through F illustrate the components of a programmatic assessment
plan (PAP) for a didactic program in dietetics (DPD), dietetic internship (DI), coordinated program
(CP), and dietetic technician program (DT), respectively.
Articulating outcomes for student learning and making them measurable are essential components
of the assessment process. Such outcome measures have another benefit as well. They compel dietetics
educators to reach consensus on the meaning of the learning outcomes. Consider, once again, this
outcome: “Students will demonstrate the ability to communicate effectively.” Let us say that several
faculty members have included assignments in three different learning experiences to reach this
outcome. Ms. Burton interprets the requirement to mean that students can participate in mock
foodservice management conferences by verbally reporting their observations of the morning’s
production activities. Dr. Pier’s interpretation of the same outcome is to consider whether students can
successfully undertake a presentation on foods with fiber at a senior meal site. Mr. Ryder believes that a
measure of the outcome is whether students can complete several readings on group process and can
fulfill, within a student group, the usual group roles (eg, leader, scribe, and facilitator).
No one would suggest that any of these perceptions is inappropriate or that students should have
only one of these assignments or experiences as a means of learning to communicate effectively.
However, the dietetics educators should agree on the measures for this outcome, so they can select
assessment methods appropriate to the outcome measure as noted in stage 4 of outcomes assessment.
Faculty in another dietetics education program might have a different set of student learning
outcomes and/or different groupings of Foundation Knowledge and Skills and/or Competencies as
measures used to meet those outcomes. What is important is that those concerned with a program’s
curriculum reach consensus with regard to the student learning outcomes and outcome measures that
give structure to their program.
Ideally, faculty should develop this shared vision of student learning and corresponding outcome
measures as part of the program planning process. However, as faculty and preceptors change and
courses and experiences evolve over time, an individual instructor’s understanding of specific learning
outcomes can become unclear. Undertaking assessment provides an excellent opportunity for dietetics
educators to refine or affirm outcome measures relative to student learning. Then, too, dietetics
educators should not expect their students to understand what “communicate effectively” means if their
faculty members have not decided or do not agree on what it means.
There is another reason to begin assessment with a discussion of student learning and appropriate
outcome measures. It is not uncommon at this early stage of assessment for dietetics educators to
discover that each thinks others are responsible for a particular learning outcome because everyone
thought it was covered in someone else’s course or supervised practice experience. Obviously, this is
likely to result in gaps in the students’ preparation.
Alternately, faculty may discover that nearly every course or experience includes learning
experiences related to the same outcome measure. For example, the curriculum may have six different
experiences or rotations that include the outcome: “CD8. Provide dietetics education in supervised
practice settings.” Unless outcome measures are carefully articulated and consensus reached about their
placement in the curriculum, students may be taught the same things repeatedly and may not be exposed
to others. Thus, by undertaking outcomes assessment, the faculty may find that there is unnecessary
duplication of efforts to assess learning. The faculty can waste precious time if nearly everyone teaches
or assesses the same outcome while giving little attention to other outcomes.
LINKING GOALS AND LEARNING OUTCOMES TO INSTITUTIONAL MISSION
As required by CADE, the program must establish goals and demonstrate “how these goals reflect
the program’s mission statement and the environment in which the program exists” (1, p18). It is
important to demonstrate that program mission and goals are consistent with, if not derived directly
from, the institutional mission, whether the institution is a college, hospital, agency, or corporation.
An institution’s mission is “a statement of belief that guides the planning and operation of an institution”
(1, p75). Defining a relationship between the parent organization’s mission and what the program is to
accomplish establishes planning and assessment linkages that are important to dietetics educators as well
as institutional and accrediting bodies. This relationship demonstrates that conclusions derived from
assessed goals should merit consideration at the institutional policy level. Establishing these linkages
early in the assessment process also may facilitate broad institutional support to undertake assessment.
It also can prove helpful in obtaining administrative support for program improvement and possibly
ensure the very survival of the program in a changing institutional climate.
Goals can be tied to the institution’s mission by citing appropriate phrases from the
institutional mission statement that are consistent with each program goal. To illustrate from the
examples, a program goal such as “The program will prepare graduates to be competent entry-level
dietitians or dietetic technicians” could be integrated with the institution’s mission to
provide graduates with high-quality knowledge and skill development. An example of this
would be: “Graduates will meet the high standards expected of entry-level professionals.” A
goal stating “Through encouragement, motivation, and support, program faculty and staff will
increase the number of students who complete their dietetics program of study” may be linked
to references in the institutional mission statement that stress the importance of student
retention. For example, part of the institution’s mission might be “The institution is committed
to enhancing the graduation rate of its students.”
Program directors may opt to tie the student learning outcomes to the institutional mission
either directly or indirectly. To illustrate, consider how a direct connection could be made using
one of the earlier examples: “Students will demonstrate the ability to communicate effectively.”
This learning outcome may be linked to an institutional mission statement such as “graduates
will apply their knowledge of communication arts.”
However, CADE does not require that dietetics educators establish direct linkages between
student learning outcomes and institutional mission statements. And because student learning
outcomes are specific to a career or profession, whereas mission statements typically are broad, it may
be difficult if not impossible to make direct connections between the two. Thus, rather than
attempt to link each student outcome for professional competence directly to the institutional
mission statement, all may be linked indirectly as a group to the institution’s mission via the
program’s mission and program goal such as “The Program will prepare graduates to be
competent entry-level dietitians or dietetic technicians.” Dietetics educators should make either
direct or indirect linkages between their student learning outcomes and the institutional mission
as appropriate.
Now that we have demonstrated processes for articulating program goals and student learning
outcomes, formulating outcome measures for the goals and learning, and linking mission statements
to the goals and learning outcomes where applicable, it is necessary to develop a comprehensive
plan for assessment. This plan should note other components essential to the assessment
process.
References
1. Commission on Accreditation for Dietetics Education. CADE Accreditation Handbook.
Chicago, Ill: American Dietetic Association; 2002.
2. American Association for Higher Education Assessment Forum (AAHE). 9 Principles of
Good Practice for Assessing Student Learning. Washington, DC: AAHE; 1992.
STAGE 2: DEVELOPING A PROGRAMMATIC ASSESSMENT PLAN
Formulation of a programmatic assessment plan (PAP) is critical in the assessment process. A PAP
outlines the logical progression of activities for assessing whether a goal or learning outcome has been
met. It gives participants a clear understanding of the overall assessment process, expectations of them,
their function in the process, and deadlines for completing their work. A plan is important because
sharing in assessment fosters a sense of ownership for the process and the findings. Additionally, having
a written PAP can provide evidence that the process was undertaken in a planned and systematic fashion.
A PAP (Table 2.1) should document
• the program’s goals and student learning outcomes that the program will assess.
• the linkages between the program’s goals and student learning outcomes and the institutional mission where applicable.
• the outcome measures that faculty and staff will use to determine a program’s ability to meet each of its stated goals and achieve the student learning outcomes.
• the data required, and whether those data already exist or need to be collected as part of the assessment process.
• the groups that will undergo assessment.
• the assessment methods that will be used to collect any needed data as well as those individuals responsible for ensuring collection of the data.
• the timeline for collecting the necessary data.
The PAP covers stages 1 to 5 of assessment. The sixth stage includes analyzing assessment
findings, using findings to improve teaching and learning and to demonstrate accountability, and
reporting findings to various constituencies.1
Table 2.1
Basic Design for Programmatic Assessment Plan (Stage 2)

Program Goals or Student Learning Outcomes (Stage 1)
Institutional Mission Reference (Stage 1)

| Outcome Measures (Stage 1) | Data Needed (Stage 3) | Data Already Available? (Stage 3) | What Groups Will Be Assessed? (Stage 3) | Assessment Methods (Stage 4) | Who Will Conduct Assessment? (Stage 4) | Timeline (Stage 5) |
_______________________________________
1Programmatic assessment is not complete without analyzing, reporting, and using assessment findings. Refer to the
chapter on Stage 6: Closing the Loop, pages 28-31 in this handbook.
WHY INCLUDE GOALS, STUDENT LEARNING OUTCOMES, AND OUTCOME MEASURES?
As discussed in stage 1, a program’s goals, student learning outcomes, and appropriate outcome
measures provide the foundation of programmatic assessment. Recording them in a PAP helps educators
maintain the appropriate focus on these important stages of assessment. Also, as previously noted,
assessment provides an opportunity to reaffirm or refine goals, student learning outcomes, and outcome
measures to ensure that they are measurable and consistent with the program’s intentions.
Stage 1 demonstrated how to articulate program goals and student learning outcomes. As program
directors begin constructing a PAP, they must determine which program goals and student learning
outcomes need to be assessed. This includes identifying mandates.
Mandates for Goals and Expectations
In addition to complying with the requirements of the Commission on Accreditation for Dietetics
Education (CADE), dietetics educators may need to comply with the mandates of other accrediting
agencies. These mandates should be reviewed carefully to determine their impact on the PAP. Typical
expectations address improving teaching and learning, achieving accountability, administering programs
efficiently, and meeting the needs of the community served.
These parallel or additional mandates may come from regional accrediting associations for
colleges and universities1 that expect colleges and universities, including those sponsoring dietetics
education programs, to be engaged in self-study. Self-study provides evidence that the assessment of
program goals and student learning is an ongoing activity. Similarly, the staff and faculty of health care
organizations are expected to assess their accomplishment of measurable outcomes in accordance with
the expectations established by the Joint Commission on Accreditation of Healthcare Organizations
(JCAHO). Comparable accrediting agencies, legislative bodies, or funding sources may establish
assessment mandates for dietetics education programs housed in other settings, such as the armed
services or state agencies. Additional mandates may come from federal or state regulatory agencies,
such as a state department of education, the program’s sponsoring institution, and a grantor or other
funding source. Careful consideration of the agendas of groups important to the support of dietetics
education programs may reveal the need for additional program goals or even specific expectations for
students.
Selection Criteria for What to Assess
At this point, one might feel overwhelmed by the array of mandated and potential goals and
student learning outcomes that could be assessed. Additionally, program directors should be aware that
while the assessment process can enhance program quality and improve teaching and learning, its
application can require the expenditure of considerable resources. It may not be practical or even
feasible to assess many additional goals and learning outcomes, aside from the ones mandated by CADE
and other accrediting bodies. Therefore, program staff may wish to establish criteria for selection of
goals and learning outcomes to be assessed. Some examples of possible criteria follow.
_______________________________________
1New England Association of Schools and Colleges; Middle States Association of Colleges and Schools; Southern
Association of Colleges and Schools; North Central Association of Colleges and Schools; Northwest Association
of Schools and Colleges; and Western Association of Schools and Colleges.
1. Assessment of a goal or learning outcome serves more than one purpose.
2. Assessment of the goal or learning outcome can demonstrate improvement over time.
3. The assessed goal or learning outcome has a potentially significant impact on institutional or
external constituent groups.
Each of these criteria is described in further detail in the following sections.
Assessment of a goal or learning outcome serves more than one purpose. Assessed goals and learning
outcomes that serve several functions make efficient use of the assessment process and the resources
required to undertake assessment. To illustrate, consider this learning outcome: “Students will
demonstrate their understanding of the techniques and tools for managing foodservice systems.”
On the presumption that the program’s graduates are employed in part because they have this
ability, the findings from assessment of this outcome can be used to justify the program’s continued
institutional support. The results also can be used to assist dietetics educators in determining whether
their course content, sequencing of learning experiences, and teaching methods are successful. For
example, should the didactic component on quantity food preparation be completed the semester
preceding the practicum or should the two be taken concurrently? Are practicum placements in food
management as well structured as they should or could be? How do students use the information in the
prerequisite course in food preparation for this course?
It is beneficial to focus assessment efforts on those goals and learning outcomes whose findings
can be used in the assessment of multiple aspects of the dietetics education program.
Assessment of the goal or learning outcome demonstrates improvement over time. An important
CADE mandate (and one congruent with the objectives of dietetics educators) is that educators address
improvement in teaching and learning and not simply document the accomplishment of the required
knowledge and/or competency. Selection of outcome measures for which improvement over time can be
demonstrated may be helpful or even necessary for issues of accountability.
Consider from our sample learning outcome: “Students will demonstrate the ability to
communicate effectively.” Students’ accomplishment of this outcome could be assessed near the end of
the program; however, as illustrated in Appendixes C through F, assessment processes can be applied to
this outcome to show the improvement of teaching and learning over time. Thus, whereas course
assignments help determine each student’s grade in a course or evaluation in a supervised practice
setting, only certain assignments over several semesters or rotations are included in a PAP to document
both proficiency and improvement of communication skills for all students. After isolation and review
of these components or criteria, the program faculty can apply assessment techniques that yield data
documenting the evolutionary development of the students’ ability to communicate effectively. The
planned dietetics education program can be assessed with regard to its preparation of students to meet
this outcome. Dietetics educators can determine whether the content, sequencing of learning
experiences, and teaching methods succeed in helping them achieve goals and/or student learning
outcomes.
Assessed goal or learning outcome impacts institutional and/or external constituents. Typically,
assessment results are shared with CADE and other accrediting agencies as well as appropriate
administrators, program faculty, preceptors, and members of advisory committees. In addition, state
officials such as trustees, governing boards, legislators, and even members of a state’s executive branch
or an agency of the federal government may require submission of assessment reports. Such reports also
can be used to enhance the program’s connection to the institutional mission. Consider a program that
has as a goal: “Through encouragement, motivation, and support, program faculty and staff will
increase the number of students who complete their dietetics program of study.” Data gathered to assess
the goal can serve as a recruiting device for prospective students and as a solicitation for support from
alumni, area businesses, and community leaders.
Carousel Approach
After dietetics educators decide what they must assess and the criteria for determining what they
can assess, they need to narrow the selection of possible elective goals and learning outcomes, given the
available resources.
One technique that can assist in making such difficult and important decisions is a carousel
approach. This process can facilitate input from all constituent groups, including program staff,
preceptors, and faculty. Its sequential steps are as follows:
1. The program director decides who will help determine what could be assessed in addition
to the mandates and provides each participant with a list of the required assessment measures.
2. The program director divides the group into smaller “think tank” groups of no more than
five individuals each.
3. The program director gives each group 5 to 10 minutes to prepare a list of potential goals
and learning outcomes, beyond the mandates, that they think the program should assess with
regard to issues such as:
a. improving teaching and learning
b. documenting how well students are learning what faculty claim they are learning
c. documenting that the program is administered efficiently
d. documenting that the program effectively meets the needs of the community
4. After 5 minutes, lists are rotated to the next group on the “carousel.”
5. The new group reviews the list and adds its own ideas, with the process continuing until
each group has reviewed each list.
6. The groups display the lists around the room.
7. The program director gives each participant a set number of colored, circular, adhesive
labels and asks each to indicate assessment preferences by placing a “stick-on dot” beside
the item. (The program director must decide whether to ask the groups to use different
colors on each list or a single color on all lists. He or she may base the decision on whether
assessment is desired for goals and learning outcomes within each issue. See step 3.)
8. The program director reviews lists for patterns in terms of top-rated items (those with the
most dots).
9. The program director leads the group discussion and records those items ultimately selected
for assessment.
WHY IDENTIFY NEEDED DATA, AVAILABILITY, AND ASSESSMENT GROUPS?
The next step in constructing a PAP is to ascertain which data are needed for assessment and
which are available. A record of needed data—that is, information to be used as the basis for a
decision—and of the availability of these data is critical for the assessment process. Data collection
needs to be planned from the beginning, to ensure it is done in a timely fashion (stage 3). Once the
needed data have been identified, dietetics educators are able to determine which assessment instruments
are appropriate for obtaining them. Including these decisions in a PAP provides the opportunity for all
involved in a program to see how and why use of the specified assessment instruments is planned.
A PAP also should note from whom the needed information to assess the specific goals and
learning outcomes can be obtained (stage 3). For example, who is in the position of having information
relative to the competence of students or alumni?
WHY IDENTIFY ASSESSMENT METHODS AND THOSE RESPONSIBLE FOR DATA
COLLECTION?
Numerous types of assessment methods are available. The methods and tools used to conduct
assessment include tests, surveys, and simulations, among others. Conducting outcomes assessment
appropriately requires specificity and precision in selecting the best methods for particular goals or
student learning outcomes (stage 4). Equally important are identifying and recording those who will
administer or be responsible for constructing the various assessment instruments. Identifying those who
can assist with assessment instruments at the planning stage facilitates appropriate delegation and
coordination. Thus, it is necessary to record in a PAP those methods intended to yield the specific data
needed as well as the individuals responsible for collecting the data.
WHY ESTABLISH A TIMELINE?
A timeline links “due dates” to each of the various tasks planned (stage 5). A timeline helps to
ensure that needed activities are accomplished within specified scheduled periods and that all data are
available when needed for analysis. Timelines are essential in most, if not all, planning processes and
should be included in a PAP.
Now that we have described the components of the assessment process and defined the importance
of the PAP, it is necessary to identify the data needed for assessment.
STAGE 3: GATHERING DATA
After dietetics educators examine the outcome measures, they must identify the data needed to
accomplish assessment.
IDENTIFYING NEEDED DATA
The question that one typically should ask is, “What information do I need in order to know
whether the outcome measure has been achieved?” For example, consider the goal that states:
“Through encouragement, motivation, and support, program faculty and staff will increase the number
of students who complete their dietetics program of study.” One of the outcome measures for this goal
is: “Students indicate on surveys ‘satisfactory’ or better scores with respect to the encouragement,
motivation, and support provided by the program’s faculty and preceptors.” Thus, program faculty and
staff members should be sure that they develop and routinely administer a student satisfaction survey
that includes a question or questions about the students’ satisfaction with the encouragement, motivation,
and support provided by the faculty and staff. And, as will be discussed in stage 6, the program director
should regularly summarize and analyze the results of the survey in order to make a judgment about
achievement of the desired goal (Appendix B).
The same process applies when identifying the data necessary to determine the accomplishment of
student learning outcomes. For example, consider this learning outcome: “Students will demonstrate
their ability to use efficiently and effectively the techniques and tools for managing foodservice
systems.” If one of
the outcome measures is “Graduates will have knowledge of food production systems,” then the data
needed will be evidence of students’ progress and/or ability. This evidence might include any of the
following: student examinations, accomplishments in lab settings, and real-life projects in a working
foodservice operation (Appendix C).
ARE THE DATA ALREADY AVAILABLE?
Assessment efforts should make maximum use of existing data and information, and the results of
data collection should yield benefits that justify the investment of time and other resources. Program
directors often have more data immediately available to them than they realize. For example, consider
this goal: “Through encouragement, motivation, and support, program faculty and staff will increase the
number of students who complete their dietetics program of study” (Appendix B). Another outcome
measure for this goal is: “Eighty-five percent of the students who enter the program will complete it.”
It is likely that the institution’s office of research, the registrar, the admissions office, or the program
director already maintains substantial amounts of this type of retention data. If the data for the program
are not immediately available, more than likely they can be generated rather quickly once contact has
been made with the appropriate person.
Program audits and relevant data that are available for audits or as a result of audits can be used in
assessing how efficiently the program deploys its resources. For some dietetics education programs,
institutional administrators and personnel in the state’s education or comptroller’s office may have useful
information that can document student and program performance. For example, state offices may
maintain a database of the scores on licensure exams, including those earned by the program’s
graduates. Program directors should seek possible sources that might be in a position to contribute to
the assessment of program goals and student learning outcomes.
WHICH GROUPS WILL BE ASSESSED?
The next step in determining the necessary data is to ascertain which groups will be assessed, that
is, whose performance will be assessed and whose perceptions will be solicited. A program director
should list on a programmatic assessment plan the groups that fit either or both of these categories.
When a program considers student learning outcomes, groups whose performance should be assessed
include obvious ones such as current students (at various stages within the program) and alumni. With
regard to program goals, it may be appropriate to assess faculty members’ and preceptors’ instruction
and/or supervision.
Most forms of outcomes assessment require solicitation of the perceptions or opinions of program
participants or those the program has affected in some meaningful way. Program directors should
decide who is likely to have valuable perceptions of what students know, what they can do, and how
they are applying their knowledge as well as who has perceptions about the efficiency and effectiveness
of the program. For example, most educators routinely ask their students to evaluate faculty and
preceptors. Graduates of dietetics education programs frequently are asked their opinions of how well
the program prepared them to continue their education, to pass their registration examination, or to
function in the workplace. Employers are asked for their perceptions of student competence as a means
of assessing program goals via outcome measures, such as: “When surveyed, the mean rating for
knowledge base that employers will give program graduates in their employ will meet or exceed the
rating of ‘3’, or ‘satisfactory’ ” (Appendix B).
Others who have valuable perceptions include faculty advisers for student groups, faculty in
advanced dietetics education programs who have contact with a program’s graduates, and members of
facilities or agencies where students complete elective experiences. Two groups whose perceptions are
viewed as increasingly important are those who leave a program before completing it and the
“decliners.” Decliners are those who are qualified to be admitted or are actually admitted, but who
choose to go elsewhere. Although the individuals in either of these categories may be reluctant to
complete extensive surveys, some will respond to brief questionnaires or short interviews.
Now that the reader has an understanding of the data gathering process, let us consider the
methods for conducting assessment.
STAGE 4: SELECTING ASSESSMENT METHODS
Dietetics educators have many different methods they can use to assess program goals and student
learning outcomes, although, as described in this stage, each method has specific attributes that define its
applicability. The traditional methods familiar to most dietetics educators include examinations, such as
the dietetic technician and dietitian registration examinations, as well as quizzes, papers, reports,
projects, presentations (live and taped), demonstrations, chart entries, case studies, simulations, and field
experiences. Portfolios of students’ work are particularly valuable because they typically reflect the
breadth and depth of students’ preparation and the development of their knowledge and skills over time.
A portfolio also may contain a student’s self-evaluation or other reflective comments. When applied to
suitable situations, other assessment methods, such as capstone courses, surveys, exit interviews, and
audits, can be equally useful. Capstone experiences, described in this stage, are particularly valuable for
assessment because their purpose is, in part, to provide opportunities to use multiple, overlapping
assessment methods.
As dietetics educators gain experience in identifying outcome measures, they may wish to select or
design assessment methods that simultaneously provide data needed for the assessment of multiple
outcome measures. For example, if an instructor selects as an assessment method a project requiring a
group of students to determine the nutritional adequacy of several menus, the instructor could assess the
outcome measures “Graduates will have demonstrated the ability to work effectively as a team member”
and “Graduates will have demonstrated the ability to translate nutrition needs into food choices and
menus for people of diverse cultures and religions” (Appendix C).
HOW DO I DECIDE WHAT TO USE?
Table 4.1 lists specific methods that can be applied to program goals and student learning
outcomes for both formative and summative assessment. Formative assessment is administered during
an experience to allow time for corrections and improvements. Summative assessment includes the
application of end-of-experience measures and the use of data that provide a cumulative view of
achievement. Both types are discussed further in stage 5.
It may be useful here to describe some methods for outcomes assessment and discuss their
applicability to dietetics education.
Table 4.1
Methods to Assess Program Goals and Student Learning Outcomes

| Assessment Methods | Program Goals: Formative | Program Goals: Summative | Student Learning Outcomes: Formative | Student Learning Outcomes: Summative |
| Audits | X | X | | |
| Capstone experiences | | X | | X |
| Case studies | | | X | X |
| Descriptive statistics | | X | | |
| Exams, quizzes | | | X | X |
| Exit interviews | | X | | |
| Field experiences | | | X | X |
| Group activities | | | X | X |
| National RD/DTR exams* | | X | | X |
| Oral presentations | | | X | X |
| Papers, reports, projects, chart entries | | | X | X |
| Portfolios | | X | X | X |
| Simulations | | | X | X |
| Surveys | X | X | | |
| Videotapes | | | X | X |

*RD/DTR indicates registered dietitian/dietetic technician, registered.
Audits
Audits, which are generally mandated by groups beyond the immediate program staff, address
broad issues of program accountability and effectiveness. They can include, but are not limited to,
current and projected revenues from programs as well as cost analyses that demonstrate program cost
per student.
Audits provide data that are quantitative and best suited for the assessment of program goals.
Because audits generally are conducted by nonstakeholders or disinterested parties, the data resulting
from audits may be considered more objective than data derived from some other assessment methods.
Audits may be used for either formative or summative assessment.
Capstone Experiences
Capstone experiences provide opportunities for students to demonstrate their ability to integrate
and apply knowledge and broad concepts. As the name implies, these experiences typically occur near the end of a program, after students have acquired broad knowledge and diverse skills. Program
directors may develop capstone experiences specific to their needs or select from models or simulations
that may be already available.
Dietetics educators should design capstone experiences to yield data that address the students’
ability to accomplish numerous competencies. Consequently, capstone experiences are best suited to the
summative assessment of program goals and student learning related to overall student abilities. They can underscore both consistencies and inconsistencies in the attainment of program goals and student learning outcomes.
Capstone experiences may provide students broad affirmation of their knowledge and abilities and
may furnish program faculty and staff with considerable evidence for use in modifying how and when
they teach knowledge and skills.
Case Studies
Case studies challenge students to understand complex examples of clients’ conditions or
management issues, either real or hypothetical. Typically students are presented with examples and
asked a series of questions that determine both their knowledge of subject area and their cognitive
reasoning ability. Case studies provide data regarding student learning and can be used in either
formative or summative assessment.
Descriptive Statistics
Descriptive statistics characterize primarily quantitative data. Analyzing and reporting them is a commonly used assessment method in dietetics education. Relevant data include, but are not limited to,
the following:
• inquiries, applications, admissions, enrollment, retention, and graduation rates of students
• measurement of knowledge, skills, and values at entry, midpoint, and graduation
• performance on program, institutional, state, regional, and national exams
• placement of alumni in advanced educational programs or professional positions or both
• student and alumni perceptions and opinions
• the accomplishments of faculty, staff, and preceptors
• the accomplishments of the program’s students
• cost-benefit data and other financial information
From descriptive statistics, program directors should identify the data useful for assessing specific
program goals and student learning outcomes. The statistics chosen should be those that address
outcome measures rather than resources and inputs.
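Several of the measures above reduce to simple rate calculations. As a minimal sketch (all counts below are hypothetical, not data from any actual program), a program director might compute them as follows:

```python
# Illustrative sketch: turning cohort counts into descriptive statistics.
# All figures below are hypothetical; substitute the program's own records.

def rate(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against an empty cohort."""
    return 100.0 * numerator / denominator if denominator else 0.0

enrolled = 61          # students entering the dietetics program
graduated = 44         # students retained through graduation
passed_rd_exam = 38    # graduates passing the RD exam within 12 months

retention_pct = rate(graduated, enrolled)
pass_pct = rate(passed_rd_exam, graduated)

print(f"Retention through graduation: {retention_pct:.1f}%")
print(f"RD exam pass rate (12 mo): {pass_pct:.1f}%")
```

The same rates could, of course, be produced with a spreadsheet; the point is that each outcome measure should map to a clearly defined numerator and denominator before data are gathered.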
Exams and Quizzes
Examinations and quizzes provide opportunities to document students’ learning of rote or applied knowledge, and can include essay or objective tests. The data can be qualitative or, with the use of
scaling techniques (see Appendix A), quantifiable. Quizzes and examinations are well suited to the
assessment of students’ knowledge.
Exams and quizzes are easily adaptable to both formative and summative assessment.
Consequently, they can document progression of student learning, either individually or collectively.
Exit Interviews
Exit interviews are typically verbal exchanges designed to ascertain how students completing a
program perceive their dietetics education. Exit interviews can also be conducted with students who
leave the program either voluntarily or involuntarily before completing it. Program directors may
develop exit interview questions specific to their needs or select from commercially available questions.
Exit interviews provide data that are primarily impressionistic; however, some demographic data
may be collected as well. These interviews are more suited to the assessment of program goals than to
specific student learning outcomes. Interviews with all or most exiting students can serve to highlight
consistencies and inconsistencies in perceptions of the value or attainment of program goals and
program quality. The interviewer also may ask questions that yield data about the student’s future
professional plans and the anticipated timeline to professional certification. This may be useful in determining why students selected dietetics as a career or chose a specific dietetics education program.
By giving interviewees the opportunity to indicate their preferences for optional program
objectives or experiences, program directors can use the results of exit interviews to refine elective
program goals and student learning outcomes.
Field Experiences
Field experiences provide students with experience in the environments in which they will function
after they enter the dietetics profession. Field study should offer students broad exposure to clinical experiences, with the opportunity to apply the knowledge and skills they have learned. Most field
experiences are applied in conjunction with or at the conclusion of the students’ study of classroom
constructs. Field work can yield useful data in summative assessment of both program goals and
student learning outcomes.
Group Activities
Group activities engage students in collective interaction. Students may, for
example, work in groups to accomplish laboratory projects, case studies, and various presentations.
The data generated may be used in both formative and summative assessment of student
performance, specifically in knowledge application and communication skills. Group activities are not
suited to assessment of program goals.
Registration, Licensure, and Certification Exams
The Commission on Dietetic Registration (CDR) and many states use credentialing examinations
to maintain standards of professional competency for those prepared to practice dietetics. Pass rates can be
used in the summative assessment of program goals, and the scores of graduates can be used to ascertain
strengths and deficiencies of student learning. Both examination scores and pass rates can serve as
benchmark figures to compare a specific program to the national average.
Oral Presentations
Oral presentations are another method to document a student’s learning. They integrate applied
knowledge with communication skills. They can include reports on a research topic or summations of
experiential learning.
Oral presentations are similar to class exams and quizzes in that they can be scaled to yield
quantifiable data. They can be adapted to both formative and summative assessment, documenting the
learning curve in applied knowledge and communication skill competencies.
Papers, Reports, Projects, and Chart Entries
Papers, reports, projects, and chart entries are typically written examples of students’ work. All
can be used as viable assessment methods in documenting student performance in knowledge and skills
development for both formative and summative assessment.
Portfolios
A portfolio is a collection of samples of a student’s work that provides evidence of professional
growth and development over time. The portfolio could include projects, exams, papers, videotapes,
chart notes, and so on, done at the beginning of a program and at varied intervals throughout the
program. Many experts agree that students should attach a self-evaluation to each submission so that the
portfolio provides evidence of the students’ developing ability to evaluate themselves.
If educators will use portfolios to provide evidence of student competency or the attainment of
program goals and student learning outcomes, the policies and procedures regarding what will be
included, at what intervals, and in what format must be established before students enter the program.
Program directors may develop program-specific policies and procedures to guide the preparation and
use of portfolios, or they may borrow from other dietetics educators or other disciplines, such as teacher
education and the arts, that have been using portfolios for some time.
Portfolios should include evidence of both the student’s development and the depth and breadth of
knowledge and skills (competencies) attained throughout the program. As is true of capstone
experiences, if the evidence and data found in individual portfolios are summarized and analyzed
collectively, the program director should have considerable insight into how well the program’s students
were able to accomplish intended outcomes. Consequently, portfolios are suited to the summative
assessment of program goals related to overall student abilities and to both the formative and summative
assessment of student learning. The results from assessing portfolios can highlight consistencies and
inconsistencies in the attainment of program goals or student learning.
Portfolios offer students broad affirmation of their knowledge and abilities. These work samples
also provide dietetics educators considerable evidence that they can use to modify the learning
outcomes, the types and numbers of learning experiences and their sequence, and the relative merit of
assessment strategies employed to measure and document learning.
Simulations
Simulations are used when it is not feasible to demonstrate a skill in a real-world setting. Even
when real-world settings are available, some programs prefer to use actors or standardized patients to
provide consistency in assessing the capabilities of their students. Simulations allow for direct measures
of performance.
Surveys
Surveys can reveal how individuals (students and other stakeholders) perceive their dietetics
education. Students can include current learners and program alumni as well as those who fail to
complete the program. Stakeholders may include, but are not limited to, preceptors of students and
employers of graduates. Program directors may develop surveys specific to their needs or select from
commercially available survey instruments. When opting for a commercially available survey, dietetics
educators should take care to choose one that assesses the program goals, student learning outcomes, and outcome measures that they have selected.
Surveys provide data that are either impressionistic (subjective) or demographic (objective).
Consequently, they may be more suited to the assessment of the program than to specific student
learning outcomes. Surveying broad groups of diverse program participants and stakeholders can
highlight consistencies and inconsistencies in perceptions of the value or attainment of program goals
and program quality. Surveys can also provide demographic data that may be useful in determining
whether the program is meeting important program goals, such as registration of dietitians or
technicians, employment, and membership in and service to professional associations. Then too,
surveys can assist program directors in refining assessable program goals by giving survey participants
the opportunity to indicate their preferences for optional program objectives or experiences.
Videotapes
Videotapes of students’ oral assignments or interactions with patients, clients, or employees can be
used in the assessment of student performance in terms of knowledge and applied reasoning. Videos are
especially useful in the assessment of oral communication skills and students’ abilities to “think on their
feet.” Videotape recordings are ideally suited to formative and summative assessment because they
provide students with the opportunity to view their performance and identify strengths and weaknesses.
They can be used to document the improvement of student performance over time. As with other
assessment methods, taken collectively, videotapes of student presentations and their efforts to interview
or counsel clients can indicate how well a group or class of students is faring with regard to achieving
the program’s learning outcomes.
Primary Trait Analysis Scales
Primary trait analysis (PTA) scales include specific, measurable criteria arranged by degree of
attainment of the expectation. They are well suited for measuring the student’s mastery of competencies,
and can be used in the assessment of knowledge and skills as well. These scales may be used in both
formative and summative assessment. Specific examples of PTA appear in Appendix A.
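To make the idea concrete, a PTA scale can be sketched as a mapping from score levels to criterion descriptors, with individual ratings averaged to yield quantifiable data. The trait, descriptors, and scores below are hypothetical examples, not drawn from Appendix A:

```python
# Hypothetical PTA scale for one trait (e.g., menu-planning competency).
# Levels and descriptors are illustrative only, not taken from Appendix A.
pta_scale = {
    4: "Menu fully meets nutritional needs; rationale clearly documented",
    3: "Menu meets most needs; rationale documented with minor gaps",
    2: "Menu meets some needs; rationale incomplete",
    1: "Menu does not meet needs; no rationale provided",
}

def score_cohort(scores: list[int]) -> float:
    """Average the ratings to summarize attainment across a cohort."""
    return sum(scores) / len(scores)

cohort_scores = [4, 3, 3, 2, 4]   # hypothetical ratings for five students
print(f"Mean trait score: {score_cohort(cohort_scores):.2f}")
```

Scoring individual students against the same descriptors supports formative feedback, while the cohort average supports summative judgments about the program.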
WHO WILL CONDUCT ASSESSMENT?
The individual or group responsible for each phase of assessment should be identified and
recorded as part of a programmatic assessment plan (PAP) (Appendix B). The program director should
delegate the responsibility for accomplishing some of the tasks to others because sharing in assessment
fosters a sense of ownership for the process and the findings. Obvious possible participants include
program faculty and preceptors, but other individuals who may not be directly involved in the program
may make valuable contributions as well. If the dietetics education program is part of a major
institution, others may have expertise and experience in designing, selecting, and using some of the
assessment methods noted above. Examples include those who have training in educational psychology,
measurement and evaluation, institutional research, or the development and administration of surveys.
Those responsible for compiling statistical information for other programs or the institution as a whole
may be valuable experts in gathering, analyzing, and reporting new or existing data regarding the
dietetics education program.
ARE THE METHODS EASY TO APPLY?
The ease of application of any of these methods for outcomes assessment depends on how well
educators have prepared for assessment before undertaking the actual process. Methods that can
generate data to document attainment of program goals and student learning outcomes can be relatively
easy to apply if the individual who is undertaking assessment incorporates the specific outcome
measures into the method before administering it. Consider an illustration from this program goal:
“Through encouragement, motivation, and support, program faculty and staff will increase the number
of students who complete their dietetics program of study.” Since the PAP (Appendix B) notes that
students will be surveyed on a routine basis, dietetics educators should incorporate into the survey
instrument the information necessary to assess the outcome measure. In the case of a student opinion
survey, the survey designers should include questions related to the students’ satisfaction with the
“encouragement, motivation, and support provided by the program’s faculty and academic advisement
staff” and whether “resources were utilized effectively to support instruction.”
As already noted, it is easier to choose and apply assessment methods if dietetics educators reach
consensus with regard to the articulation of program goals, student learning outcomes, and the
formulation of outcome measures before they undertake assessment. Consider the following learning
outcome: “Students will demonstrate the ability to communicate effectively.” Two of the outcome
measures that the program’s educators agreed on for this expectation were: “Graduates will have
knowledge of counseling theory and methods” and “Graduates will have demonstrated the ability to
explain a public policy position regarding dietetics.” Once dietetics educators have agreed on the
measures appropriate to each learning outcome, they are ready to ask: “Given the descriptions of
assessment methods, which method or methods would best assess achievement of the outcome measure
or measures?”
Referring to Table 4.1, you can see that the methods best suited to the assessment of outcomes
requiring students or graduates to have “knowledge about …” may include class exams and quizzes;
papers, reports, and projects; group activities; oral presentations; case studies; videotapes; or field
experiences. The outcomes requiring students or graduates to have “demonstrated ability to …” might
be assessed by also including capstone experiences, simulations, and portfolios.
Before the assessment period, dietetics educators should decide collectively where and how they
can integrate assessment into their program. Whenever possible, they should draw on activities that are
already being used as standard evaluative practices. In our example noted in stage 1, the dietetics
educators in the program agreed on program goals, student learning outcomes, and the corresponding
outcome measures. Then faculty members modified the evaluative procedures in their courses to
incorporate the assessment methods described here.
After dietetics educators have established the appropriate methods for assessing the goals and
learning outcomes and have selected the individuals who will undertake the process, it is necessary to
integrate the tasks to ensure the timely completion of this round of assessment.
STAGE 5: ESTABLISHING A TIMELINE
The next stage of outcomes assessment is to establish a timeline for accomplishing the required
activities. Timelines will vary from program to program because each program director’s assessment
activities likely will take different forms depending on the purposes, goals and objectives, emphasis, and
unique features of the program; the number and backgrounds of the students; and the expectations of the
program’s stakeholders. However, regardless of individual differences, the program director or other responsible person should construct the timeline to consider the four Cs: collaboration, coordination, cycles of
assessment, and constructive feedback.
COLLABORATION
As the program director or assessment coordinator designs the timeline (Appendixes C through F),
it is important that it allow for the time required for outcomes assessment to be collaborative.
For assessment to be useful and sustaining, it should involve most if not all of those who educate
dietetics students. These educators will require time individually and collectively to examine goals;
refine or affirm outcome measures; learn about, select, and/or modify assessment instruments; and use
results for improvement of teaching and learning. The timeline should realistically reflect the time
needed to accomplish assessment collaboratively so that participants can perform their normal job-related responsibilities while meeting the established deadlines for completing the assessment of
program goals and student learning outcomes. Additionally, it is necessary to strike a balance between
the number of faculty, preceptors, and staff involved and the need to coordinate activities with groups of
manageable size.
COORDINATION
If the assessment process is to succeed and the outcomes assessment activities are to be integrated,
participants must coordinate their participation. Whoever devises the timeline (the scheduler) should
work backward from the end date when the information is needed, allowing time for each of the planned
tasks to be accomplished in sequence. For example, if an annual assessment report is due June 15, the
scheduler must decide the amount of time necessary to write the report after data collection and analysis.
If the scheduler thinks 2 weeks are adequate, those responsible for gathering and analyzing the data must
complete their work by June 1. Continuing to work back in time, the scheduler decides the amount of
time needed to apply the assessment instrument from which the data would be compiled. The scheduler
must coordinate with those responsible for devising or applying the instrument regarding the time
necessary to perform their tasks. This is especially important because this part of the assessment process
may be expected to involve student assignments in scheduled courses or supervised practice
experiences. Consequently, if the scheduler places the annual assessment report on the timeline with a June 15 due date, assessment coordinators may need to plan for assessment of the specific outcomes or
goals no later than the previous autumn.
When dietetics educators are undertaking assessment for the first time, those responsible for
coordinating program assessment should build in as much lead time as possible because it is likely to
take longer than anticipated for participants to accomplish assessment tasks. It may also prove useful to
establish a cushion for the completion of tasks. For example, if a report is due June 15, it may be
advisable to plan to have it completed by June 1.
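The backward-scheduling arithmetic described above can be sketched in a few lines of code. The task list and durations here are illustrative assumptions, not prescribed values:

```python
# Working backward from a hypothetical June 15 annual report due date.
from datetime import date, timedelta

due_date = date(2024, 6, 15)   # assumed report deadline

# Tasks in reverse chronological order, with assumed working days each.
tasks = [
    ("write report", 14),      # the 2 weeks allotted in the example above
    ("analyze data", 14),
    ("collect data", 30),
    ("administer instrument", 30),
]

deadline = due_date
for task, days in tasks:
    start = deadline - timedelta(days=days)
    print(f"{task}: begin by {start}, complete by {deadline}")
    deadline = start
```

Run with these assumed durations, the schedule pushes administration of the assessment instrument back to the preceding spring, which illustrates why assessment tied to scheduled courses may need to begin as early as the previous autumn.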
CYCLES OF ASSESSMENT
The third component in constructing a timeline for programmatic assessment is to consider
establishing specific assessment cycles. Assessment is never “finished”; thus, the tasks identified in the
programmatic assessment plan (PAP) should be repeated at regular, although not necessarily the same,
intervals. Some outcome measures may include wording that dictates the assessment timeline. For
example, consider this outcome measure: “Within 12 months of completing the program, at least 60%
of graduates will have passed the dietetic technician or dietitian registration examination, and/or
obtained employment related to their major, and/or enrolled in an accredited continuing education
program.” In this case, the outcome measure itself suggests the timeline. The assessment instrument should be
administered at least 12 months after students complete the program.
For some other goals, dietetics educators need to determine when they should first use the
assessment instruments and what intervals are appropriate for reassessment. To illustrate, when considering the goal, “The program’s collective resources will be used effectively and efficiently,” educators will have to determine whether it is important to measure utilization of resources annually or
over a more extended period.
It is difficult if not impossible to assess every goal or learning outcome at the same time. The
timeline should reflect manageable amounts of assessment activities occurring at any one time. Some
programs plan so that all activities will be completed once within a 5-year cycle, with several being
completed in each year of the cycle. The length of the cycle may correspond with the programmatic or
institutional reporting or the reaccreditation schedules.
CONSTRUCTIVE FEEDBACK—FORMATIVE AND SUMMATIVE ASSESSMENT
The fourth consideration in the development of the timeline is constructive feedback. Because the
improvement of teaching and learning is a primary purpose for assessment, it is important that
assessment activities be scheduled in such a way that the assessment findings and results are available
for improvement. Accordingly, assessment activities are typically categorized as either formative or
summative.
Formative assessment is administered during an experience so that there is time for the current
students to benefit from corrections and improvements as a result of the initial assessment. One type of formative evaluation that occurs regularly in dietetics education programs takes place when students or interns meet periodically with academic counselors or program directors to review their progress across
the whole program and to identify areas for further attention. As a result of these meetings, students
may be expected to accomplish additional learning experiences, change the sequence of courses, or include electives that add breadth to their education.
Another benefit of conducting formative assessment is that it helps to avoid creating excessive
stress for students, faculty, and preceptors because assessment occurs throughout the course or practicum
experience rather than near the end of experiences. When program goals are considered, formative
assessment typically occurs for the purpose of improving the unit’s delivery of programs or services.
In both situations, formative evaluation strategies may be employed at the beginning or at various
points throughout the students’ experiences or the program’s operation. Formative evaluations serve to
check on the progress that students or programs are making and to indicate whether adjustments in
teaching or administration are necessary.
Summative assessment refers to assembling and using data at the end of a sequence of activities in
order to provide a broad overview of teaching, learning, and overall program effectiveness or efficiency.
With regard to students, final exams and other culminating experiences are used to determine the
student’s readiness to move on to the next experience or course. Summative assessments may also
suggest ways to improve teaching and learning for future students.
At the programmatic level, summative assessments may be used to determine whether a program
should be continued, expanded, reduced, or modified. It is primarily these findings of summative
evaluation that are used to establish a program’s accountability, effectiveness, and efficiency to external
constituencies.
Assessment should not be initiated simply in response to an accreditation review or any other
mandate. Rather, the assessment process should be part of programmatic planning and ultimately
institutional planning. The creation and utilization of a PAP, as noted in stage 2, is an important
mechanism to ensure continuous improvement. Because the PAP is designed to improve teaching and
learning, as well as to demonstrate the accountability of a dietetics education program, it has the
potential to make high-level administrators more aware of the needs and positive attributes of their
program. This is why, as noted in stage 1, the linkages to institutional mission are so important.
Ultimately then, outcomes assessment will become easier to undertake as participants perfect evaluative
processes and see tangible benefits from using them.
Now that we have applied the stages outlined in a PAP, it is time to address assessment results.
STAGE 6: CLOSING THE LOOP—ANALYZING, UTILIZING, AND
REPORTING ASSESSMENT FINDINGS
As noted in the Introduction, assessment of program goals provides data or information about the
effectiveness and efficiency of the program. Assessment of student learning outcomes addresses what
students have learned, the skills and competencies they developed, and the values they acquired as a
result of participating in their dietetics education program. For an outcomes assessment cycle to be
complete, groups involved in the assessment process must analyze the data and use the data for
improving both the program and teaching and learning. Ultimately, other internal and external
constituencies may expect to be informed of the accomplishments or actions that the program leaders
plan as a result of what they learned through assessment. Two examples, a program goal and a student
learning outcome, will serve to illustrate the steps in what is often referred to as “closing the loop.”
CLOSING THE LOOP FOR A PROGRAM GOAL
Analyzing and Discussing Data
The relevant groups that should be involved in analyzing and discussing the data include those
who developed the PAP, selected the assessment methods, determined the appropriate experiences in the
program for evaluating student performance, and conducted the actual assessment. Consider a sample
program goal that addresses the question of retention: “Through encouragement, motivation, and
support, program faculty and staff will increase the number of students who complete their dietetics
program of study” (Appendix B).
Realizing that this goal would not give them insight as to how to improve retention, the dietetics
educators recognized as they began the assessment process that they would need additional information.
Consequently, as they planned their assessment, these educators knew they should collect both the data
needed to assess goal attainment and data that suggested avenues for improving retention. Hence, they
identified several questions on the institution’s broad student opinion survey that had relevance for
retention. These included areas such as academic advisement; academic programs and course-of-study
issues; and students’ satisfaction with social opportunities, facilities, and residence life opportunities.
Thus, the data to be collected relative to this goal included more than just the number and percent of
students retained, as recorded by the registrar or program director. It also included broad student
opinion information from a survey administered to a sample of enrolled students as well as data
collected from interviews with students who voluntarily left the institution before completing the
program.
After collecting these data, the program’s educators met to analyze and discuss the findings.
Analyzing the data collaboratively encouraged all to share in the results, just as they had shared in the
planning and implementation of assessment activities.
In their review of the data, they identified several important facts:
• The percent of students retained in dietetics through graduation was 72.1% (44 of 61).
• Dietetics students indicated that they are generally “satisfied” with their overall educational experience.
• Current dietetics students are most satisfied with their access to resources, such as the institution’s library and computers, as well as the availability of a broad spectrum of student groups such as the Student Dietetic Association.
• Current dietetics students are least satisfied with parking facilities, academic advisement, and the need to share terminals in courses in which students use computers extensively.
• Dietetics students who voluntarily withdrew from the program most frequently did so for one of three reasons: insufficient financial resources, perceived inability to “fit in” with other students, and unmet expectations—“the program was not what they thought it would be.”
The focus of their subsequent discussion was on how to improve student retention rather than finding
fault with any one person or aspect of the program.
Using Results for Improvement
In this example, the actual retention rate, nearly 72%, was lower than the rate the educators wanted
to achieve. However, as the faculty had planned, their assessment activities yielded data needed to
enable them to devise some specific initiatives to improve retention. Faculty members logically
concluded that the lack of satisfaction with academic advisement probably has an impact on the rate at
which students voluntarily leave the program. Therefore, faculty chose to address this problem by
changing the way they assigned students to faculty advisers. Rather than the current system of simply
assigning each new student to the next available adviser, they decided to assign students to an adviser by
class. For example, Ms. Burton will now advise juniors and Dr. Pier will advise seniors. This change
should make it possible to provide advisement for the students more as a cohort and to encourage
students to connect with each other early in the program. Another benefit of this change is that it should
now be possible for the faculty adviser to obtain an accurate count of the number of students who will
need advanced courses as well as what course scheduling patterns will minimize scheduling conflicts for
students.
Although it is true that assessment data may suggest a problem, as it did in the example cited, it is
also true that the solution may not be apparent or even attainable. Student dissatisfaction with parking, a
frequent complaint, may fall in this category. The institution may lack resources to provide additional
parking. In this example, although the dietetics educators had assessed student satisfaction, they could
not remedy the perceived parking problem and thus chose not to address this issue further.
It is equally true that assessment data may not suggest any problem at all. Had the program
educators achieved an 85% retention rate, they might have confidently continued their present efforts
without significant change. Dietetics educators should understand that assessment may validate the
attainment of goals as well as indicate a need for improvement.
These dietetics educators, having analyzed and discussed relevant assessment data and having both
formulated and implemented plans for improvement, are now well positioned to report their assessment
processes, findings, and planned actions as well as accomplished improvements.
Informing Constituencies
The analyzed data, implemented changes, and the additional recommendations from the program
staff should be shared with others in the institution and in the broader community. The program director
and faculty can use this information to facilitate broad institutional planning initiatives for the
improvement of teaching and learning and to address mandates of professional, regulatory, funding,
and accrediting agencies. The dissemination of assessment findings can demonstrate strengths and a
determination to correct weaknesses. For example, professional accrediting bodies such as the
Commission on Accreditation for Dietetics Education (CADE) will want to learn what strengths and
weaknesses in the program and student performance the faculty members identified from assessment
processes and what initiatives they are undertaking to correct them. And external stakeholders, such as
prospective employers, may be interested to learn not only what goals these dietetics educators consider
when preparing their graduates but also how they meet them.
Relative to this sample goal, after discussing findings, the dietetics educators determined that they
should work in tandem with other faculty and student development professionals across campus,
because retention was an institutionwide issue. They also concluded that although their retention rate
had not yet reached the desired level, it was higher than in previous years and high enough that parents
and prospective students would be encouraged to know that nearly three-fourths of all entering students
completed the program.
CLOSING THE LOOP FOR A STUDENT LEARNING OUTCOME
Analyzing and Discussing Data
Consider the following example of an outcome: “Students will demonstrate their understanding of the
role of nutrients and food in the achievement and maintenance of human health and well-being”
(Appendix C). After program educators established the measures for this student learning outcome
using the methods identified in Stage 4, they summarized the outcome measures as follows:
•
Graduates will have knowledge of organic chemistry, biochemistry, physiology,
microbiology, nutrient metabolism, pathophysiology related to nutrition care, and
fluid and electrolyte requirements.
•
Graduates will have knowledge about outcomes-based research.
•
Graduates will have knowledge of promotion of pleasurable eating.
•
Graduates will have demonstrated the ability to translate nutrition needs into food
choices and menus for people of diverse cultures and religions.
The methods selected to assess these outcome measures included exams and lab reports, a quiz and
a paper, group discussion and a menu planning project, a personal diet project, simulation, and a
nutrition analysis project. Let us assume that applying these assessment methods yielded the
following overall results for students beginning the dietetics program:
•
Students have adequate knowledge of organic chemistry, biochemistry, and microbiology;
however, they lack a thorough understanding of nutrient metabolism, pathophysiology
related to nutrition care, and fluid and electrolyte requirements.
•
Students do not have knowledge about outcomes-based research.
•
Students have a strong knowledge of promotion of pleasurable eating.
•
Students have demonstrated the ability to translate nutrition needs into food choices
and menus for people of diverse cultures and religions.
Clearly then, while students have mastered some parts of these Foundation Knowledge and Skills,
as a group, they have not attained an acceptable level of performance with regard to others.
At this point, once again, the educators should discuss their program’s strengths and weaknesses as
well as why any deficiencies exist, and consider what steps they can take to correct them.
Using Results for Improvement
The decision to modify the kind, amount, and/or sequence of learning experiences depends on
whether students achieve the learning outcomes. In this example, the students failed to attain an
acceptable level of performance in some outcome measures. Consequently, educators decided that they
should place more emphasis on nutrient metabolism in the beginning nutrition course. They may also
determine that the current text provides insufficient coverage of fluid and electrolyte requirements and
that a different textbook should be used. Likewise, they may determine that students may learn the
pathophysiology related to nutrition care more effectively through the addition of a computer-assisted
model. With regard to outcomes-based research, the educators may conclude that beginning students
should not be expected to have this knowledge at this point because they will not take the course that
includes most of this content until the following semester. Consequently, they will assess the outcome
measure after the students have taken this course. Certainly, based on the assessment completed thus
far, curriculum changes are not warranted.
Given that educators chose to assess beginning students, they have the opportunity to incorporate
changes in future learning experiences and then reassess the same group of students. Assessing these
students again as juniors and seniors should help determine whether the modifications were successful.
Because the faculty have also planned to address deficiencies in the learning experiences provided to
beginning students, the assessment of next year’s entering students should likewise show whether those
changes had the intended effect.
These assessment processes provide an excellent example of the benefits of formative and
summative assessment. Using formative assessment, dietetics educators have the opportunity to apply
changes that can affect the learning performance of students as they progress through their course of
study. Summative assessment can be used to determine whether these changes adequately address
student performance at the end of a course sequence.
Informing Constituencies
As explained in the example of closing the loop for a program goal, dietetics educators should
share assessment results with relevant constituencies, such as CADE. Constituencies might include the
institution’s Outcomes Assessment Task Force, the divisional dean, and alumni. Prospective employers
also might be interested in knowing that the program is strengthening students’ preparation with regard
to the program’s outcome measures.
APPENDIX A
PRIMARY TRAIT ANALYSIS SCALES—A VARIATION ON THE THEME
Primary trait analysis (PTA) is a technique for assessing students’ performance of outcome
measures, such as the Competency Statements and/or Foundation Knowledge and Skills Requirements
of the Commission on Accreditation for Dietetics Education (CADE). In PTA, educators construct
specific measurable criteria or “traits” and scales to address the attainment of outcomes.
A PTA scale, which can be used with many of the assessment methods identified in Table 4.1, has
certain inherent advantages. First, it provides dietetics educators with a means for quantifying the extent of mastery of
the competency or foundation requirement. Second, a PTA scale can aid the student in attaining the
competency or foundation requirement because it spells out the outcomes the student is supposed to
achieve. By communicating the desired outcomes directly to the students, for example by listing them
on the course syllabus or in a student handbook, a PTA gives students a clear understanding of what they
must do to master the competency or foundation requirement. Third, the use of PTA scales as a means
to communicate expected accomplishments to students benefits dietetics educators. They can save
valuable instructional time because a PTA directs students immediately to what is important and
expected.
It should be noted that CADE does not expect the PTA scaling method to be used by dietetics
educators. Furthermore, PTA may not be applicable to the assessment of every goal or student learning
outcome. Nonetheless, it may be worth considering as a viable way to undertake meaningful
assessment.
Two approaches to the use of PTA scales follow.
PTA APPROACH 1: SCALING OF VERBS
According to the introductory paragraph to CADE’s Competency Statements for the Supervised
Practice Component of Entry-Level Dietitian Education Programs, these competency statements build
on previously attained skills, expressed as verbs, which are appropriate to use as traits for PTA scales.
In the “scaling of verbs” approach, dietetics educators use these verbs to create a scale that describes
the possible levels of student performance. For example, the action verb supervise presupposes that
the student has attained the skills or traits implied in the words assist or participate, perform, or conduct
(1, p32).
Using one possible measure for the learning outcome “Students will demonstrate the ability to
communicate effectively,” we can create a PTA scale for CD10, “Graduates will supervise education
and training for target groups.” In this example, participate, conduct, and supervise characterize the
levels of performance on the scale. Using CADE’s definitions of these verbs, participate is defined as
“take part in team activities,” conduct is defined as “activities performed independently,” and supervise
is defined as “able to oversee daily operation of a unit including personnel, resource utilization, and
environmental issues; or, coordinate and direct the activities of a team or project workgroup.”
Consequently, the highest level would use the definition of supervise, and the levels below it would be
defined in terms of diminishing levels of performance. A PTA scale for this core competency might
look like this:
Table A.1
Sample PTA scale for CD10. Graduates will supervise education and training for target groups

Primary Trait: Participate
  1 = Failed to take part as a team member in the education and training of target groups.
  2 = Was a casual participant as a team member in the education and training of target groups.
  3 = Was an active participant as a team member in the education and training of target groups.

Primary Trait: Conduct
  4 = Required considerable assistance to educate and train target groups.
  5 = Required some assistance to educate and train target groups.
  6 = Was able to perform independently the education and training for target groups.

Primary Trait: Supervise
  7 = Experienced significant difficulty in coordinating and directing the activities of a team or
      project workgroup addressing the education and training for target groups.
  8 = Experienced some difficulty in coordinating and directing the activities of a team or project
      workgroup addressing the education and training for target groups.
  9 = Able to coordinate and direct the activities of a team or project workgroup addressing the
      education and training for target groups.
In this example, dietetics educators identify and scale the action verbs noted by CADE, participate
and conduct, as prerequisites for supervise, as component traits in assessing attainment of the
competency in an entry-level dietitian education program. The same approach can also be used to
scale competency statements applicable to dietetic technician programs. For example, the scaling of
verbs method in a PTA scale can be applied to DT10, “Conduct education and training for target
groups,” by using the word conduct as the highest level on the PTA scale.
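The scaling-of-verbs idea can also be summarized programmatically. The sketch below is a hypothetical illustration only (the function and constant names are invented); the 1-3, 4-6, and 7-9 bands follow Table A.1:

```python
# Hypothetical sketch: map a 1-9 PTA rating to the verb level it
# represents, following the bands in Table A.1 (1-3 = participate,
# 4-6 = conduct, 7-9 = supervise). Names are illustrative only.

PTA_BANDS = [
    (range(1, 4), "participate"),  # took part in team activities
    (range(4, 7), "conduct"),      # performed the activity independently
    (range(7, 10), "supervise"),   # coordinated/directed the activity
]

def verb_level(rating: int) -> str:
    """Return the verb level that a 1-9 PTA rating falls within."""
    for band, verb in PTA_BANDS:
        if rating in band:
            return verb
    raise ValueError("rating must be between 1 and 9")
```

For instance, a preceptor's rating of 8 maps to the supervise level, telling both student and educator that independent performance has been surpassed but some coordination difficulty remains.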
PTA APPROACH 2: OUTCOME MEASURES
Another technique is to develop a PTA scale based on the measures that are selected for the
student learning outcomes. Starting with the same outcome “Students will demonstrate the ability to
communicate effectively,” we create the PTA scale using all of its outcome measures. These include but
are not limited to:
•
Graduates will have knowledge of counseling theory and methods.
•
Graduates will have demonstrated the ability to explain a public policy position regarding
dietetics.
•
Graduates will have demonstrated the ability to work effectively as a team member.
These characteristics of effective communication were the measurable responses to the question, How
do we determine whether our graduates can communicate effectively?
Using this method, we can scale this outcome as follows:
Table A.2
Sample PTA scale for student learning outcome: “Ability to communicate effectively”

Primary Trait: Graduates will have knowledge of counseling theory and methods.
  1-2-3 = Student has little or no knowledge of counseling theory and methods.
  4-5-6 = Student has some working knowledge of counseling theory and methods.
  7-8-9 = Student has working knowledge of counseling theory and methods.

Primary Trait: Graduates will have demonstrated the ability to explain a public policy position
regarding dietetics.
  1-2-3 = Student has little or no demonstrated ability to explain a public policy position regarding
          dietetics.
  4-5-6 = Student has demonstrated some ability to explain a public policy position regarding
          dietetics.
  7-8-9 = Student has demonstrated the ability to explain a public policy position regarding dietetics.

Primary Trait: Graduates will have demonstrated the ability to work effectively as a team member.
  1-2-3 = Student has little or no demonstrated ability to work effectively as a team member.
  4-5-6 = Student has demonstrated some ability to work effectively as a team member.
  7-8-9 = Student has demonstrated the ability to work effectively as a team member.
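A scale built from outcome measures, as in Table A.2, also lends itself to a simple aggregate score. The sketch below is a hypothetical illustration only (the function, the averaging rule, and the sample ratings are invented, not prescribed by CADE): each trait receives a 1-9 rating and the traits are averaged into one overall score for the outcome:

```python
# Hypothetical sketch: average per-trait PTA ratings (1-9 each, as in
# Table A.2) into one overall score for the outcome "ability to
# communicate effectively". Ratings below are invented examples.

def overall_score(ratings: dict[str, int]) -> float:
    """Mean of the per-trait ratings; each rating must be 1-9."""
    if any(not 1 <= r <= 9 for r in ratings.values()):
        raise ValueError("each trait rating must be between 1 and 9")
    return sum(ratings.values()) / len(ratings)

student = {
    "knowledge of counseling theory and methods": 7,
    "explain a public policy position regarding dietetics": 5,
    "work effectively as a team member": 9,
}
print(overall_score(student))  # 7.0
```

Whether to average, weight certain traits more heavily, or report each trait separately is a local decision; the point is only that the scale makes the outcome quantifiable.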
APPENDIX B
PROGRAMMATIC ASSESSMENT PLAN—SAMPLE PROGRAM GOALS
TO BE ASSESSED, YEARS 1-5(a)
Program Goal
I. The program will prepare graduates to be competent entry-level dietitians or dietetic technicians.
Institutional Mission Reference (if applicable)
Graduates will meet the high standards expected of entry-level professionals.
Outcome Measure 1: Alumni achieve over a 5-year period a pass rate of at least 80% on the RD/DTR
exam.
  Data Needed: RD/DTR exam scores. Data Already Available? Yes.
  Groups To Be Assessed: Graduates/alumni.
  Assessment Methods: Exam. Who Will Conduct Assessment? CDR score summary for program.
  Timeline: Annually.

Outcome Measure 2: Within 12 months of completing the program, at least 60% of graduates will
have passed the RD/DTR exam, and/or obtained employment related to their major, and/or enrolled
in an accredited continuing education program.
  Data Needed: Demographic data. Data Already Available? Some.
  Groups To Be Assessed: Graduates/alumni.
  Assessment Methods: Survey. Who Will Conduct Assessment? Program Director.
  Timeline: Every 3 years.

Outcome Measure 3: Students achieve a satisfactory rating for all the entry-level competencies.
  Data Needed: Evidence of student attainment of competencies. Data Already Available? Yes.
  Groups To Be Assessed: Students.
  Assessment Methods: Portfolio. Who Will Conduct Assessment? Faculty/preceptors/students.
  Timeline: Twice per year.

Outcome Measure 4: The mean rating of “knowledge base” that employers will give program
graduates in their employ will meet or exceed the rating of “3” or “satisfactory” on surveys.
  Data Needed: Results of employer surveys. Data Already Available? No.
  Groups To Be Assessed: Employers.
  Assessment Methods: Survey. Who Will Conduct Assessment? Program Director.
  Timeline: Every 3 years.
APPENDIX B (continued)
Program Goal
II. Through encouragement, motivation, and support, program faculty and staff will increase the number of students who
complete their dietetics program of study.
Institutional Mission Reference (if applicable)
The institution is committed to enhancing the graduation rate of its students.
Outcome Measure 1: Eighty-five percent of the students who enter the program will complete it.
  Data Needed: Retention figures. Data Already Available? Yes.
  Groups To Be Assessed: All dietetics students.
  Assessment Methods: Descriptive statistics. Who Will Conduct Assessment? Office of Institutional
  Research/Registrar/Admissions Office.
  Timeline: Annually.

Outcome Measure 2: Students indicate on surveys “satisfactory” or better scores with respect to the
encouragement, motivation, and support provided by the program’s academic advisement staff.
  Data Needed: Results of student opinion survey. Data Already Available? Some.
  Groups To Be Assessed: All dietetics students.
  Assessment Methods: Survey. Who Will Conduct Assessment? Division of Student Affairs/Office of
  Institutional Research.
  Timeline: Every 3 years.

Outcome Measure 3: Students indicate on surveys “satisfactory” or better scores with respect to the
encouragement, motivation, and support provided by the faculty and/or preceptors.
  Data Needed: Results of student opinion survey. Data Already Available? Some.
  Groups To Be Assessed: All dietetics students.
  Assessment Methods: Survey. Who Will Conduct Assessment? Office of Institutional Research.
  Timeline: Every 3 years.

Outcome Measure 4: All at-risk students are identified within the first 25% of the program duration.
  Data Needed: Placement test results. Data Already Available? Yes.
  Groups To Be Assessed: All entering students.
  Assessment Methods: Testing. Who Will Conduct Assessment? Office of Student Support Services.
  Timeline: Each time students begin the program.
APPENDIX B (continued)
Program Goal
III. The program’s collective resources will be used effectively and efficiently.
Institutional Mission Reference (if applicable)
None
Outcome Measure 1: When surveyed, students will indicate that they perceive that resources were
utilized effectively to support instruction.
  Data Needed: Results of student opinion survey. Data Already Available? Some.
  Groups To Be Assessed: All dietetics students.
  Assessment Methods: Survey. Who Will Conduct Assessment? Division of Student Affairs/Office of
  Institutional Research.
  Timeline: Every 3 years.

Outcome Measure 2: Institutional auditors will judge the program’s use of space and budgetary
resources as efficient according to commonly accepted guidelines.
  Data Needed: Program’s use of space and fiscal resources. Data Already Available? Yes.
  Groups To Be Assessed: Not applicable.
  Assessment Methods: Audit. Who Will Conduct Assessment? Institutional auditors.
  Timeline: Annually.

Outcome Measure 3: Dietetics education staff shall maintain a student-to-educator ratio consistent
with the norm for the institution or agency.
  Data Needed: Student-to-faculty ratio. Data Already Available? Yes.
  Groups To Be Assessed: All classes taught by program educators.
  Assessment Methods: Audit. Who Will Conduct Assessment? Office of Institutional Research.
  Timeline: Annually.

Outcome Measure 4: Annually, the program will update at least 3% of the resource and reference
materials for faculty and student use.
  Data Needed: Resource/materials list and budget. Data Already Available? Some.
  Groups To Be Assessed: Library and department holdings.
  Assessment Methods: Audit. Who Will Conduct Assessment? Librarian/Program Director.
  Timeline: Annually.

Program Goal
IV. The program will prepare graduates to demonstrate a commitment to community service.
Institutional Mission Reference (if applicable)
The college will provide leadership in volunteer and community service.

Outcome Measure 1: Graduates will indicate on the alumni survey that they participate in one or
more community service activities.
  Data Needed: Demographic data. Data Already Available? No.
  Groups To Be Assessed: Graduates/alumni.
  Assessment Methods: Survey. Who Will Conduct Assessment? Program Director.
  Timeline: Every 3 years.

(a) RD/DTR indicates registered dietitian/dietetic technician, registered; CDR, Commission on
Dietetic Registration; CADE, Commission on Accreditation for Dietetics Education.
APPENDIX C
PROGRAMMATIC ASSESSMENT PLAN—SAMPLE LEARNING OUTCOMES
FOR STUDENTS IN A DPD(a)
Learning Outcomes for Students in a DPD
I. Students will demonstrate the ability to communicate effectively.
Institutional Mission Reference (if applicable)
Graduates will meet the high standards expected of entry-level professionals.
Outcome Measure: Graduates will have knowledge of counseling theory and methods. (Data Needed:
Evidence of student progress.)
  • Students in Applied Nutrition (data already available? No): Videotape of simulated counseling
    session(b); conducted by Greeley; 2nd-semester juniors.
  • Students in Foodservice Systems Management (data already available? Yes): Quiz on counseling
    theories; conducted by Burton; 2nd-semester juniors.

Outcome Measure: Graduates will have demonstrated the ability to explain a public policy position
regarding dietetics. (Data Needed: Evidence of student progress.)
  • Students in Community Nutrition (data already available? Yes): Paper, sample “testimony”;
    conducted by Crocker; 1st-semester juniors.
  • Students in Foodservice Systems Management (data already available? Yes): Essay question on
    exam; conducted by Burton; 2nd-semester juniors.

Outcome Measure: Graduates will have demonstrated the ability to work effectively as a team
member. (Data Needed: Evidence of student progress.)
  • Students in Introduction to Human Nutrition (data already available? No): Group menu analysis
    project(b); conducted by Pier; 1st-semester seniors.
  • Students in Food Science (data already available? Yes): Group presentation of lab results;
    conducted by Ryder; 1st-semester juniors.
  • Students in Human Resource Management (data already available? Partially): Group analysis of
    union issues; conducted by Burton; 1st-semester seniors.

Outcome Measures: Foundation Knowledge and Skills above, and so on.
  • Data Needed: Assessment by employers (already available? No); Groups: Employers of DPD
    graduates; Method: Survey; conducted by Program Director; 3 years after graduation.
  • Data Needed: Assessment by DI/grad school faculty (already available? Yes); Groups: DI/grad
    school faculty; Method: Survey; conducted by Program Director; 1 year after graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Focus
    group; conducted by Alumni office; 6 months and 4 years after graduation.
APPENDIX C (continued)
Learning Outcomes for Students in a DPD
II. Students will demonstrate their ability to use efficiently and effectively the techniques and tools for managing food service
systems.
Institutional Mission Reference (if applicable)
The college will offer high-quality academic programs … and maintain an effective balance of liberal arts study and career
preparation.
Outcome Measure: Graduates will have knowledge of food delivery systems. (Data Needed: Evidence
of student progress.)
  • Students in Foodservice Systems Management (data already available? Yes): Quizzes; lab reports;
    conducted by Burton; 2nd-semester juniors.

Outcome Measure: Graduates will have knowledge of food production systems. (Data Needed:
Evidence of student progress.)
  • Students in Quantity Food Production (data already available? Yes): Exams; lab reports; projects;
    conducted by Proctor; 1st-semester juniors.

Outcome Measure: Graduates will have demonstrated the ability to determine recipe/formula
proportions and modifications for volume food production. (Data Needed: Evidence of student
progress.)
  • Students in Quantity Food Production (data already available? Yes): Quantification projects(b);
    conducted by Proctor; 1st-semester juniors.

Outcome Measure: Graduates will have knowledge about financial management including accounting
principles. (Data Needed: Evidence of student progress.)
  • Students in Accounting (data already available? Yes): Final exam; conducted in all sections;
    2nd-semester sophomores.

Outcome Measures: Foundation Knowledge and Skills above, and so on.
  • Data Needed: Assessment by employers (already available? No); Groups: Employers of DPD
    graduates; Method: Survey; conducted by Program Director; 3 years after graduation.
  • Data Needed: Assessment by DI/grad school faculty (already available? Yes); Groups: DI/grad
    school faculty; Method: Survey; conducted by Program Director; 1 year after graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Focus
    group; conducted by Alumni office; 6 months and 4 years after graduation.
APPENDIX C (continued)
Learning Outcomes for Students in a DPD
III. Students will demonstrate their understanding of the role of nutrients and food in the achievement and maintenance of
human health and well-being.
Institutional Mission Reference (if applicable)
The college will offer high-quality academic programs … and maintain an effective balance of liberal arts study and career
preparation.
Outcome Measure: Graduates will have knowledge of organic chemistry, biochemistry, physiology,
microbiology, nutrient metabolism, pathophysiology related to nutrition care, and fluid and
electrolyte requirements. (Data Needed: Evidence of student progress.)
  • Students in Organic Chemistry, Physiology, Microbiology, and Beginning and Advanced
    Nutrition (data already available? Yes): Exams, lab reports; conducted by faculty academic
    advisors; 2nd-semester freshmen through 1st-semester juniors.
  • Students in Medical Nutrition Therapy (data already available? Yes): Exams, case studies;
    conducted by Pier; 2nd-semester seniors.
  • Students in Advanced Nutrition (data already available? Yes): Exams, case studies; conducted by
    Pier; 1st-semester seniors.

Outcome Measure: Graduates will have knowledge about outcomes-based research. (Data Needed:
Evidence of student progress.)
  • Students in Applied Nutrition (data already available? Yes): Quiz; conducted by Greeley;
    3rd-semester juniors.
  • Students in Medical Nutrition Therapy (data already available? No): Paper; conducted by Pier;
    2nd-semester seniors.

Outcome Measure: Graduates will have knowledge of promotion of pleasurable eating. (Data
Needed: Evidence of student progress.)
  • Students in Beginning Nutrition (data already available? No): Group discussion; conducted by
    Pier; 1st-semester freshmen.
  • Students in Quantity Food Production (data already available? Yes): Menu planning project;
    conducted by Proctor; 1st-semester juniors.

Outcome Measure: Graduates will have demonstrated the ability to translate nutrition needs into food
choices and menus for people of diverse cultures and religions. (Data Needed: Evidence of student
progress.)
  • Students in Beginning Nutrition (data already available? Yes): Personal diet project; conducted by
    Greeley; 1st-semester freshmen.
  • Students in Applied Nutrition (data already available? Yes): Simulation; nutrition analysis project;
    conducted by Pier; 3rd-semester juniors.
  • Students in Quantity Food Production (data already available? Yes): Group menu analysis project;
    conducted by Proctor; 1st-semester juniors.

Outcome Measures: Foundation Knowledge and Skills above, and so on.
  • Data Needed: Assessment by employers (already available? No); Groups: Employers of DPD
    graduates; Method: Survey; conducted by Program Director; 3 years after graduation.
  • Data Needed: Assessment by DI/grad school faculty (already available? Yes); Groups: DI/grad
    school faculty; Method: Survey; conducted by Program Director; 1 year after graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Focus
    group; conducted by Alumni office; 6 months and 4 years after graduation.

(a) DPD indicates Didactic Program in Dietetics; and DI, Dietetic Internship.
(b) Indicates evidence of student progress to be included in student’s portfolio.
APPENDIX D
PROGRAMMATIC ASSESSMENT PLAN—SAMPLE LEARNING OUTCOMES
FOR STUDENTS IN A DI(a)
Learning Outcomes for Dietetic Interns
I. Students will demonstrate the ability to communicate effectively.
Institutional Mission Reference (if applicable)
The hospital will make a valued and lasting contribution to the community … and participate in the education of future health
care professionals.
Outcome Measure: CD6. Use current technologies for information and communication activities.
(Data Needed: Evidence of student progress.)
  • Interns in first rotation (data already available? Yes): Internet project; conducted by Baker;
    Week 2.

Outcome Measure: CD8. Provide dietetics education in supervised practice settings. (Data Needed:
Evidence of student progress.)
  • Interns in diabetes rotation (data already available? Yes): Lesson plans and presentations;
    conducted by Delaney; Week 3.
  • Interns in immune disorders rotation (data already available? Yes): Lesson plans and
    presentations; conducted by Tripp; Week 18.
  • Interns in production rotation (data already available? Yes): Lesson plans and presentations;
    conducted by Hannaford; Week 7.

Outcome Measure: CD37. Coordinate and modify nutrition care activities among caregivers. (Data
Needed: Evidence of student progress.)
  • Interns in cardiovascular rotation (data already available? No): Team leader project; quiz;
    conducted by Levy; Week 10.

Outcome Measure: CD39. Refer patients/clients to appropriate community services for general health
and nutrition needs and to other primary care providers as appropriate. (Data Needed: Evidence of
student progress.)
  • Interns in diabetes rotation (data already available? Yes): Referral form and critique; conducted
    by Delaney; Week 3.
  • Interns in community nutrition rotation (data already available? Yes): Referral form and critique;
    conducted by Durkin; Week 17.

Outcome Measures: CD6, CD8, CD37, CD39, and so on.
  • Data Needed: Assessment by employers (already available? Yes); Groups: Employers of DI
    graduates; Method: Survey; conducted by Program Director; 6 months and 3 years after
    graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Survey;
    conducted by Program Director; 1 year after graduation.
APPENDIX D (continued)
Learning Outcomes for Dietetic Interns
II. Students will demonstrate their ability to use efficiently and effectively the techniques and tools for managing foodservice
systems.
Institutional Mission Reference (if applicable)
The institution will be recognized for its effective and responsible use of all its resources … and will promote such in the
implementation of its mission.
Outcome Measure: CD13. Interpret and incorporate new scientific knowledge into practice. (Data
Needed: Evidence of student progress.)
  • Interns in procurement rotation (data already available? Yes): Prepare fact sheet for safe handling
    of “high-risk” food; conducted by Brenner; Week 24.

Outcome Measure: CD17. Participate in business or operating plan development. (Data Needed:
Evidence of student progress.)
  • Interns in procurement rotation (data already available? Yes): Report and oral presentation;
    conducted by Hannaford and Quince; Week 6.

Outcome Measure: CD19. Perform marketing functions. (Data Needed: Evidence of student
progress.)
  • Students in cafeteria/employee dining rotations (data already available? Yes): Marketing
    project/survey; conducted by Quince; Week 9.

Outcome Measure: CD28. Supervise procurement, distribution, and service within delivery systems.
(Data Needed: Evidence of student progress.)
  • Students in foodservice systems management staff relief (data already available? No): Weekly
    production report; conducted by Brenner; Week 6.

Outcome Measures: CD13, CD17, CD19, CD28, and so on.
  • Data Needed: Assessment by employers (already available? Yes); Groups: Employers of DI
    graduates; Method: Survey; conducted by Program Director; 6 months and 3 years after
    graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Survey;
    conducted by Program Director; 1 year after graduation.
APPENDIX D (continued)
Learning Outcomes for Dietetic Interns
III. Students will provide comprehensive nutrition care based on accurate and complete assessment, careful planning, and
recognition of resource limitations.
Institutional Mission Reference (if applicable)
The institution will be recognized for its effective and responsible use of all its resources…and will promote such in the
implementation of its mission.
Outcome Measure: CD15. Develop and measure outcomes for food and nutrition services and
practice. (Data Needed: Evidence of student progress.)
  • Students in diabetes rotation (data already available? Yes): Quiz, report; conducted by Delaney;
    Week 3.
  • Students in community nutrition rotation (data already available? Yes): Quiz, report; conducted
    by Durkin; Week 17.

Outcome Measure: CD30. Supervise nutrition screening of individual patients/clients. (Data Needed:
Evidence of student progress.)
  • Students in prenatal and medical-surgical rotation (data already available? Yes): Screening
    reports; case studies; conducted by Zimmerman and Wiles; Week 13.

Outcome Measure: CD42. Provide nutrition care for people of diverse cultures and religions across
the lifespan, i.e., infancy through geriatrics. (Data Needed: Evidence of student progress.)
  • Students in community nutrition rotations (data already available? Some): Quizzes, case study,
    videotape; conducted by Durkin; Week 17.
  • Clinical/community staff relief (data already available? Yes): Capstone project; conducted by
    Durkin and Zimmerman; Week 25.

Outcome Measures: CD15, CD30, CD42, and so on.
  • Data Needed: Assessment by employers (already available? Yes); Groups: Employers of DI
    graduates; Method: Survey; conducted by Program Director; 6 months and 3 years after
    graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Survey;
    conducted by Program Director; 1 year after graduation.

(a) DI indicates Dietetic Internship.
APPENDIX E
PROGRAMMATIC ASSESSMENT PLAN—SAMPLE LEARNING OUTCOMES
FOR STUDENTS IN A CP(a)
Learning Outcomes for Students in a CP
I. Students will demonstrate the ability to communicate effectively.
Institutional Mission Reference (if applicable)
Graduates will apply their knowledge of communication arts.
Outcome Measure: Graduates will have knowledge of counseling theory and methods. (Data Needed:
Evidence of student progress.)
  • Students in Professional Communications (data already available? Yes): Videotape of counseling
    session; conducted by Parker; 1st-semester juniors.
  • Students in Medical Nutrition Therapy (data already available? Yes): Quiz on counseling theories;
    conducted by Brown; 2nd-semester seniors.

Outcome Measure: Graduates will have demonstrated the ability to explain a public policy position
regarding dietetics. (Data Needed: Evidence of student progress.)
  • Students in Nutrition for Communities (data already available? Yes): Presentation; conducted by
    Kraft; 2nd-semester juniors.
  • Students in Human Nutrition II (data already available? Some): Essay question on exam;
    conducted by Manes; 1st-semester seniors.

Outcome Measure: Graduates will have demonstrated the ability to work effectively as a team
member. (Data Needed: Evidence of student progress.)
  • Students in Food Science (data already available? Yes): Lab reports and group presentation;
    conducted by Duvall; 1st-semester juniors.
  • Students in Quantity Food Production (data already available? Yes): Production groups;
    conducted by Stein; 2nd-semester juniors.
  • Students in Managing Today’s Workers (data already available? No): Group problem-based
    learning; conducted by all faculty; 2nd-semester seniors.

Outcome Measure: CD6. Use current technologies for information and communication activities.
(Data Needed: Evidence of student progress.)
  • Students in Professional Communications (data already available? Yes): Internet report;
    conducted by Parker; 1st-semester juniors.

Outcome Measure: CD8. Provide dietetics education in supervised practice settings. (Data Needed:
Evidence of student progress.)
  • Students in Medical Nutrition Therapy (data already available? Some): Videotape; conducted by
    Brown; 2nd-semester seniors.

Outcome Measure: CD38. Coordinate and modify nutrition care activities among caregivers. (Data
Needed: Evidence of student progress.)
  • Students in Human Nutrition II (data already available? No): Project and oral presentation;
    conducted by Manes; 1st-semester seniors.

Outcome Measure: CD39. Refer patients/clients to appropriate community services for general health
and nutrition needs and to other primary care providers as appropriate. (Data Needed: Evidence of
student progress.)
  • Students in Nutrition for Communities (data already available? Yes): Referral reports; case
    studies; conducted by Kraft; 2nd-semester juniors.

Outcome Measures: CD6, CD8, CD38, CD39, and so on.
  • Data Needed: Assessment by employers (already available? Yes); Groups: Employers of CP
    graduates; Method: Survey; conducted by Program Director; 6 months and 3 years after
    graduation.
  • Data Needed: Perceptions of alumni (already available? Yes); Groups: Alumni; Method: Survey;
    conducted by Student Development Office; 1 year after graduation.

(a) CP indicates Coordinated Program in Dietetics.
45
APPENDIX F
PROGRAMMATIC ASSESSMENT PLAN—SAMPLE LEARNING OUTCOMES FOR STUDENTS IN A DT PROGRAM

Learning Outcomes for Students in a DT Program

I. Students will demonstrate the ability to communicate effectively.

Institutional Mission Reference (if applicable): Graduates will apply their knowledge of communication arts.

Each entry below lists the outcome measure, the data needed, whether data are already available, what groups will be assessed, the assessment method, who will conduct the assessment, and the timeline.

Outcome measure: Graduates will have demonstrated the ability to work effectively as a team member. Data needed: evidence of student progress.
- Students in Introduction to Nutrition (data already available: no): group project; assessed by Greenberg; 1st semester.
- Students in Nutrition Care IV (data already available: yes): senior care project and presentation; assessed by Jones; 4th semester.

Outcome measure: Graduates will have knowledge of interviewing techniques. Data needed: evidence of student progress.
- Students in Nutrition Care II (data already available: yes): videotape; assessed by Jones; 2nd semester.

Outcome measure: Graduates will have demonstrated the ability to present an educational session for target groups. Data needed: evidence of student progress.
- Students in Community Nutrition (data already available: yes): presentation at outpatient clinic; assessed by Lawson; 2nd semester.
- Students in Foodservice Management (data already available: yes): presentation for employees; assessed by Maguire; 3rd semester.

Outcome measure: DT10. Graduates will be able to conduct education and training for target groups. Data needed: evidence of student progress.
- Students in Community Nutrition (data already available: yes): presentation at outpatient clinic; assessed by Lawson; 2nd semester.
- Students in Foodservice Management (data already available: yes): presentation for employees; assessed by Maguire; 3rd semester.

Outcome measure: Foundation Knowledge, Skills, and Competencies above and so on.
- Data needed: assessment by employers (data already available: yes). Employers of DT graduates: survey; conducted by the Program Director; 6 months and 3 years after graduation.
- Data needed: assessment by DPD directors (data already available: no). DPD directors of graduates who enter DPDs: survey; conducted by the Program Director; 1 year after graduation.
- Data needed: perceptions of alumni (data already available: yes). Alumni: survey; conducted by the Student Development Office; 1 year after graduation.
APPENDIX F (continued)

Learning Outcomes for Students in a DT Program

II. Students will demonstrate their ability to promote consumption of foods that meet the nutritional needs of individuals and groups.

Institutional Mission Reference (if applicable): None

Outcome measure: Graduates will have knowledge about availability of food and nutrition programs in the community. Data needed: evidence of student progress.
- Students in Community Nutrition (data already available: yes): presentation at outpatient clinic; assessed by Lawson; 2nd semester.

Outcome measure: Graduates will have knowledge of food availability and access for the individual, the family, and the community. Data needed: evidence of student progress.
- Students in Community Nutrition (data already available: yes): presentation at outpatient clinic; assessed by Lawson; 2nd semester.

Outcome measure: Graduates will have knowledge of applied sensory evaluation of food. Data needed: evidence of student progress.
- Students in Foodservice Management (data already available: yes): presentation for employees; assessed by Maguire; 3rd semester.

Outcome measure: Graduates will have knowledge about health promotion and disease prevention theories. Data needed: evidence of student progress.
- Students in Introduction to Nutrition (data already available: yes): quiz; assessed by Greenberg; 1st semester.
- Students in Nutrition Care I (data already available: yes): report; assessed by Jones; 4th semester.

Outcome measure: DT22. Graduates will be able to supervise production of food that meets nutrition guidelines, cost parameters, and consumer acceptance. Data needed: evidence of student progress.
- Students in Foodservice Management (data already available: yes): production reports; assessed by Maguire; 3rd semester.

Outcome measure: DT40. Graduates will participate in nutrition care for people of diverse cultures and religions across the lifespan, from infancy through geriatrics. Data needed: evidence of student progress.
- Students in Introduction to Nutrition (data already available: some): simulations; assessed by Greenberg; 1st semester.
- Students in Nutrition Care I (data already available: yes): charting; assessed by Jones; 4th semester.

Outcome measure: Foundation Knowledge, Skills, and Competencies above and so on.
- Data needed: assessment by employers (data already available: yes). Employers of DT graduates: survey; conducted by the Program Director; 6 months and 3 years after graduation.
- Data needed: assessment by DPD directors (data already available: no). DPD directors of graduates who enter DPDs: survey; conducted by the Program Director; 1 year after graduation.
- Data needed: perceptions of alumni (data already available: yes). Alumni: survey; conducted by the Program Director; 1 year after graduation.
APPENDIX F (continued)

Learning Outcomes for Students in a DT Program

III. Students will demonstrate their understanding of various management concepts and functions.

Institutional Mission Reference (if applicable): None

Outcome measure: Graduates will have knowledge about: program planning, monitoring, and evaluation; marketing techniques; system theory, labor relations, materials, financial, and facility management; quality improvement; and risk management and diversity issues. Data needed: evidence of student progress.
- Students in Community Nutrition (data already available: no): senior care project; assessed by Lawson; 2nd semester.
- Students in Foodservice Management (data already available: yes): product sales report and quiz; assessed by Maguire; 3rd semester.
- Students in Food and Beverage Management (data already available: yes): library report and quiz; assessed by Price; 3rd semester.
- Students in Principles of Management (data already available: yes): quiz and report; assessed by faculty in all sections; 1st semester.
- Students in Principles of Management (data already available: no): quiz; assessed by faculty in all sections; 1st semester.
- Students in Social Psychology (data already available: some): case study; assessed by faculty in all sections; 1st semester.

Outcome measure: DT15. Graduates will be able to participate in organizational change and planning and goal-setting processes. Data needed: evidence of student progress.
- Students in Food and Beverage Services (data already available: yes): capstone project; assessed by Price; 3rd semester.

Outcome measure: DT28. Graduates will be able to supervise safety and sanitation issues. Data needed: evidence of student progress.
- Students in Foodservice Management (data already available: yes): capstone project; assessed by Maguire; 3rd semester.

Outcome measure: Foundation Knowledge, Skills, and Competencies above and so on.
- Data needed: assessment by employers (data already available: yes). Employers of DT graduates: survey; conducted by the Program Director; 6 months and 3 years after graduation.
- Data needed: assessment by DPD directors (data already available: no). DPD directors of graduates who enter DPDs: survey; conducted by the Program Director; 1 year after graduation.
- Data needed: perceptions of alumni (data already available: yes). Alumni: survey; conducted by the Student Development Office; 1 year after graduation.

Note: DT indicates Dietetic Technician Program; DPD, Didactic Program in Dietetics.
GLOSSARY
Assessment methods. The evaluative techniques used to yield the necessary data to document that a
program has met its stated goals and expectations for students, and the venues where those techniques
can be applied.
Core competencies. The set of specific knowledge, abilities, skills, capabilities, judgment, attitudes,
and values that entry-level practitioners are expected to possess and apply for employment in dietetics.
Formative assessment. Assessment administered during an experience to allow time for corrections
and improvements.
Goal. See program goal.
Institutional mission statement. A statement of beliefs and purpose that guides the planning and
operation of an institution.
Outcomes assessment. A comprehensive process for evaluating the achievement of program goals and
student learning.
Outcome measures. Standards for determining the ability of a program to meet its stated goals and the
extent to which the program meets that standard; measures of the end result or change.
Primary trait analysis (PTA). A criteria-specific method of scoring that can be used to assess student
performance or outcomes.
Program goal. A broad statement of purpose or intent that serves as the basis of a program; its
accomplishment can be assessed or measured.
Programmatic assessment plan (PAP). A systematic, structured guide for undertaking assessment of
goals and expectations for students.
Student learning outcomes. The anticipated performance or values students are expected to derive
from the educational program. The student learning outcomes are based on the “Foundation Knowledge
and Skills for Didactic Component of Entry-Level Dietitian Education Programs” and/or “Competency
Statements for the Supervised Practice Component of Entry-Level Dietitian Education Programs,” and the
“Foundation Knowledge and Skills for Didactic Component of Entry-Level Dietetic Technician
Programs” and “Competency Statements for Supervised Practice Component of Entry-Level Dietetic
Technician Programs” (CADE Accreditation Handbook, pp. 29-35 and 37-41).
Summative assessment. The application of end-of-experience measures and the use of data that
provide a cumulative view of achievement.
Timeline. A schedule of when activities will take place.
BIBLIOGRAPHY
American Association for Higher Education (AAHE) Assessment Forum. 9 Principles of Good Practice
for Assessing Student Learning. Washington, DC: AAHE; 1992.
This extensively cited reference offers a major set of guidelines for sound fundamental assessment
practices. The AAHE sponsors an annual conference specifically on assessment in higher education.
Commission on Accreditation for Dietetics Education. CADE Accreditation Handbook. Chicago, Ill:
American Dietetic Association; 2002.
This handbook describes the process and standards for quality program development in dietetics
education.
Angelo TA, Cross KP. Classroom Assessment Techniques: A Handbook for College Teachers. 2nd ed.
San Francisco, Calif: Jossey-Bass; 1993.
This is a “how to” reference for articulating and implementing specific learning objectives and for
applying assessment methods for evaluating student competency in attaining these objectives. It is an
excellent, indispensable source for teachers who wish to encourage more active learning.
Assessment Update: Progress, Trends, and Practices in Higher Education. San Francisco, Calif: Jossey-Bass.
A useful periodical, Assessment Update consists of eclectic articles on the application of
assessment techniques.
Banta TW, Lund JP, Black KE, Oblander FW. Assessment in Practice: Putting Principles to Work on
College Campuses. San Francisco, Calif: Jossey-Bass; 1995.
The authors apply the best current assessment methods to 165 actual cases, using principles that
should be incorporated into all effective assessment efforts, at the institutional, program, or departmental
levels.
Brown S, Race P, Smith B. 500 Tips on Assessment. London, England: Kogan Page Ltd; 1996.
This is a hands-on reference to strategies, techniques, and problems in the application of
assessment processes to classroom learning.
Nichols JO. A Practitioner’s Handbook for Institutional Effectiveness and Student Outcomes Assessment
Implementation. 3rd ed. New York, NY: Agathon; 1995.
This book is an excellent practical assessment reference. It deals with the application of
techniques rather than theoretical constructs.
Palomba CA, Banta TW. Assessment Essentials: Planning, Implementing, and Improving Assessment in
Higher Education. San Francisco, Calif: Jossey-Bass; 1999.
This introduction to assessment basics covers the entire process, from formulating the plan design
to implementing the data for program improvement or for determining validity of goal or outcome
attainment. It addresses such topics as encouraging faculty with assessment, fostering student
involvement, selecting the right assessment methods, and using specific groups in the process.
Walvoord BE, Anderson VJ. Effective Grading: A Tool for Learning and Assessment. San Francisco,
Calif: Jossey-Bass; 1998.
This text is divided into two sections: “Grading in the Classroom” and “How Grading Serves
Broader Assessment Purpose.” The authors provide a number of assessment methods, such as primary
trait analysis scales, and includes numerous resources and suggested activities to assist those assessing
student learning.
COMMISSION ON ACCREDITATION
FOR DIETETICS EDUCATION
AMERICAN DIETETIC ASSOCIATION
CHICAGO, ILLINOIS
CATN: 6106