To: Susan B. Hannah, Vice Chancellor for Academic Affairs
    Steven T. Sarratore, Associate Vice Chancellor for Academic Programs and Director of Graduate Programs
From: Erin J. Frew, Director of Assessment
Date: June 5, 2006
Subject: Assessment Council Annual Report for 2005-2006
This report summarizes the activities of the IPFW Assessment Council (AC) during the 2005-2006
academic year. The AC is charged with (1) approving school assessment plans and their updates, (2)
advising the Assessment Director (AD), (3) reviewing annual reports of the general education
subcommittee, (4) making resource recommendations to the Vice Chancellor for Academic Affairs (VCAA),
and (5) recommending changes to assessment and providing an annual report to the Educational Policy
Committee (EPC).
Members of the AC for the year were Thomas Bernard (VPA), Hal Broberg (ETCS), Erin Frew (ex-officio
member and Director of Assessment), Peter Goodmann (EPC), Julie Hook (GS), Jay Jackson (A & S),
Connie Kracher (HSC), Mark Masters (GES), Rhonda Meriwether (ACCS), James Moore (BMS), Kathleen
Murphey (EDU), Koichiro Otani (PEA) and Steve Sarratore (OVCAA). Because Jay Jackson and Hal
Broberg were unavailable during the Spring 2006 Semester, Stevens Amidon and Ken Modesitt served as
representatives of their respective schools. Thomas Bernard served as chair.
School Assessment Reports and Plan Updates
The AC reviewed unit summary assessment reports (Appendix A) and provided feedback and
recommendations as appropriate. Actions on the reports are summarized below:
Academic Counseling & Career Services (accepted 5/5/06): Advisors to athletes have not provided a
report for the last three years. The AD will follow up with Elliott Blumenthal on this.

Arts & Sciences (accepted 5/5/06): Anthropology, English and Linguistics, History, Sociology, Ethnic and
Cultural Studies, Film Studies, International Studies, Journalism, and Peace and Conflict Studies did not
provide assessment reports.

Business & Management Science (accepted 5/6/06): The SBMS has indirect measures in place; however,
it is imperative that direct measures of assessment be established to measure student learning outcomes.
Currently, the only direct measures are the Test of Understanding of College Economics and the CPA
exam pass rate. The ASB has no assessment plan. The MBA has begun implementing direct measures of
student learning, though adequate data are not available on which to base program improvement. Overall,
data are not available for constituents to close the feedback loop and improve programs. This is likely to
be problematic as the self-study and AACSB site visit approach.

Education (accepted 4/21/06): A full-time data manager position is needed to support assessment in the
SOE. K. Murphey was asked to develop a description for the position.

ETCS/OLS (accepted 2/24/06): OLS and CS are updating their assessment plans.

General Studies (accepted 5/5/06): A single assessment process underwent an initial pilot study. The pilot
measure of writing skills will be repeated in 2006-2007.

Health Sciences (accepted 4/21/06): The graduate nursing program did not submit an assessment report.

Honors Program (accepted 4/21/06): The Honors Program submitted its first assessment report.

Public & Environmental Affairs (accepted 2/24/06): No data have been collected using direct measures of
student learning; the AC advised collecting data and using them for program improvement as soon as
possible.

Visual & Performing Arts (accepted 1/20/06): The school assessment committee and the AC
recommended that programs better link learning outcomes to assessment measures. An assessment plan
was not submitted for the Bachelor of Art Education.

All departments are encouraged to review the "General Standards" in Paragraph III.B.1 of SD 98-22.
In addition, the Council reviewed revised assessment plans for the following programs:

Psychology (accepted 1/20/06): The AC reviewed and, after requesting and receiving additional
information, recommended the plan for posting to the assessment web site and submission to the EPC.

Honors (accepted 2/24/06): The Honors Program director consulted with the AD and developed a new
assessment plan that includes direct and indirect measures of student learning. The new plan was
recommended for posting to the assessment web site and submission to the EPC.

Dental Hygiene (accepted 4/21/06): The AC reviewed and offered suggestions for improving the learning
outcomes. The new plan was accepted and recommended for posting to the assessment web site and
submission to the EPC.

Accounting (accepted 5/5/06): The AC reviewed and accepted the plan, which is based largely on the
AICPA expectations. It will be posted to the assessment web site.

Advise Director of Assessment
The AC recommended the following as potential topics for the 2007 assessment workshop: 1) making
assessment easy and less time-consuming, 2) tapping into work already being done and 3) reviewing
assessment nuts and bolts. Suggestions for increasing attendance at the workshops included 1) asking
deans to send a representative from each department, 2) providing door prizes, 3) reporting on assessment
tools currently in use at IPFW, 4) recognizing and rewarding exemplary assessment, 5) hosting a party or
picnic to reward those engaged in best practices, 6) establishing a traveling trophy, 7) presenting examples
of assessment that resulted in improvement and 8) enhancing the visibility of those who do and those who
do not assess well.
Review Annual Reports of the General Education Subcommittee
No reports were necessary for this academic year.
Resource Recommendations
The AC recommended that a joint SOE and campus position be funded to provide assessment support.
Although the primary responsibility of this position would be to support the SOE, the AC envisioned it as
having secondary responsibility as a campus-wide consulting position for programs developing electronic
portfolio systems. K. Murphey drafted a description for the proposed position (Appendix B).
Recommend Changes to Assessment and Submit an Annual Report to the EPC
S. Sarratore, the Associate Vice Chancellor for Academic Programs, was asked to identify the committee or
individual responsible for the university website and to notify them of the challenges associated with
maintaining and changing university web sites so that they reflect the most current program information,
including learning outcomes and assessment. S. Sarratore agreed to follow up on this.
The report to the Educational Policy Committee (Appendix C) was submitted on June 5, 2006.
Assessment Workshop
An assessment workshop, Part & Parcel: Assessment, Program Review & Accreditation, was offered
March 15, 2006, featuring presentations by S. Sarratore and E. Frew. A panel of IPFW faculty provided
presentations on examples of integrating assessment and program review or accreditation. Approximately
58 faculty and staff attended the workshop, and evaluations were quite positive (see Appendix
D for a summary).
Assessment Conference Presentations
In April 2006 Erin Frew presented at the IPFW program review workshop. She offered information on
integrating program review and assessment.
Other
During the March 17, 2006 AC meeting, Susan Hannah reported that a “Showcase” of academic programs
may be initiated at IPFW. One of the criteria for showcase status may be evidence of accomplishing
student outcomes. She also asked that K. Murphey offer a presentation to the IPFW campus on the use of
electronic portfolios for assessment.
The AC voted to hold additional meetings during the 2006-2007 academic year to update the assessment
action plan (including the IPFW learning outcomes) and to consider methods for involving students in
assessment, in addition to its regular business.
Conclusion
The AC was active during the 2005-2006 academic year in a number of areas, including reviewing school
reports and new and revised department assessment plans. The following activities are planned for 2006-2007:
1. Continue supporting general education assessment.
2. Promote assessment mini-grants.
3. Invite revised program assessment plans in accordance with SD 98-22.
4. Plan the 2007 assessment workshop.
5. Update the assessment action plan.
APPENDIX A
A/Y 2004-2005 ASSESSMENT REPORT FOR ACADEMIC COUNSELING AND CAREER SERVICES
All departments/programs have assessment plans: All of the ACCS programs have assessment plans.

Assessment measures are linked to program goals: All assessment measures are based on information
and data collected from past assessment goals.

All departments/programs submitted reports: The advisors who work with the Undecided, Pre-business
and Guided Studies students have provided a report. The advisors who work with the student athletes will
provide a report next reporting period.

Departments/programs use assessment for program improvement: Improvements to programs are based
on data collected from past assessment measures.

Departments/programs base recommendations on data: All programs used assessment measures to
review program improvements.

Prior year recommendations were implemented: Based on last year's recommendations, student exit
surveys have been separated into the three categories of students advised: Undecided, Pre-business and
Guided Studies.

School support for assessment requested/needed: N/A

School-level rewards for continuous improvement: N/A

Plan for School-level leadership: N/A

Recommended changes to department/program plans: Not at this time.

Recommendations to Assessment Council: Not at this time.
Indiana University Purdue University Fort Wayne
A/Y 2004-05 School of Arts and Sciences Assessment Report
All departments/programs have assessment plans: N. The A&S Assessment Committee did not receive
assessment plans from the following: Anthropology, English and Linguistics, History, Sociology, Ethnic
and Cultural Studies, Film Studies, International Studies, Journalism, and Peace and Conflict Studies (see
notes for each specific department and program).

Assessment measures are linked to program goals: Y. Generally, yes. See notes for specifics.

Assessment Plan Standards in Paragraph III.B.1. of SD 98-22 have been followed: Y. Generally, yes. See
notes for specifics.

All departments/programs submitted reports: N. The A&S Assessment Committee did not receive
assessment reports from the departments and programs listed above (see notes for each specific
department and program).

Departments/programs use assessment for program improvement (please include examples from each
program): Y. Generally, yes. See notes for specifics.

Departments/programs base recommendations on data: Y. Generally, yes. See notes for specifics.

Prior year recommendations were implemented: Y. Generally, yes. See notes for specifics.

School* support for assessment requested/needed: N/A. This information is not requested as a part of the
departmental assessment report. Nevertheless, in some cases a lack of resources is indicated as a
problem with conducting proper assessment.

School*-level review effective: N/A. This information is not requested as a part of the departmental
assessment report.

University-level support for assessment requested/needed: N/A. This information is not requested as a
part of the departmental assessment report. Nevertheless, in some cases a lack of resources is indicated
as a problem with conducting proper assessment.

Recommended changes to department/program plans: Y. Some recommendations were made. See notes
for specifics.

Recommendations to Assessment Council: Y. The A&S committee needs more specific guidance for
assessing departmental plans and reports. We've started this process (e.g. evaluating each report on a
six-point scale) and will discuss with the Assessment Council.

*Includes ACCS & Honors Program. Revised and approved by the Assessment Council, February 2005.
DEPARTMENTS
Anthropology
No formal assessment report or plan was received from the department. A lack of resources (faculty) was
noted as the primary impediment to preparing a formal report.
Audiology and Speech Sciences
Excellent report. This report is clearly written and reflects sound methods and results. The committee was
impressed with the department's continuous efforts to identify areas that may need attention or
improvement and with its plans to address such areas. The master's degree in speech-language
pathology may help address some of the concerns, though it was unclear from the report when this
degree would be offered. Given the low response rate of the graduating seniors, we agree that a
different technique might be beneficial.
Biology
AA: Not applicable. No A.A. degrees were awarded in 2004-2005.
BS: Good report. Since post-test scores again failed to reach the 70% criterion (students have not met this
criterion since 1998), we concur with your decision to revisit this aspect of your assessment package. We
thought the program outcomes 1B1, 1B2, and 1B3 were excellent, but were a little confused about some of
the assessment measures. We wondered, for example, how an Alumni Satisfaction Survey can be used to
assess “knowledge of the scientific method”. We encourage you to develop more direct measures of the
outcomes.
MS: Good report. Clearly stated goals, methods, and results.
Center for Academic Support and Advancement
This is a good report. CASA is to be applauded for the steps it is taking to develop more uniform
assessment standards to determine the effectiveness of the broad range of programs CASA offers. The
development of a database which allows the center to compare data from the TutorTrak system with
student performance is an important step in this process. Once queries are developed to support this
database, the center should have the necessary tools to evaluate the effectiveness of tutoring programs in
a relatively uniform manner.
Chemistry
Good report. We concur with the plans to decrease the survey interval.
Communication
An excellent report. The report is clearly organized with data to support conclusions.
English and Linguistics
No report or plan received as of 12/20/2005. Last request was made on 11/28/05. Reply
from Richard Ramsey on this date: “ . . . our assessment committee has been meeting with Erin Frew to
revise the department's assessment plan and to review the materials we have accumulated under the old
plan, and I understand they'll be sending me a report momentarily. I'll send it along to you as soon as
possible, and in the meantime, I hope you'll accept my apologies”.
Geosciences
Good report. We concur with your plans to modify the way the exit exam is administered. You may want to
check with other departments for ideas. For example, in psychology, the exit exam is part of the senior
capstone course and all students in the course must take the exam in order to receive a course grade.
History
No report received as of 12/20/2005. Last request was made on 11/10/05. Bernd Fischer asked if the
department could turn in their extensive program review instead of an assessment report. We replied that it
would not be a good idea, since program reviews are above and beyond an annual assessment report, and
again asked for an annual assessment report and a copy of their assessment plan. We received no reply.
International Language and Culture Studies
Good report. It would be useful to have the same type of information collected from Spanish, German and
French rather than the mixture of approaches. The examination of the testing, both written and oral, is very
good. A common report would be helpful. It would be useful to know what measures are being considered.
It seems that the portfolio method of having students produce their own portfolios is not working. It may be
helpful to have faculty collect the portfolio data rather than the students. Physics uses this approach.
Mathematical Sciences
An excellent report. The department has clear assessment criteria, and is responding to deficiencies
identified by the criteria. The use of matrices to organize the data makes it very easy for the reader to
understand the information presented.
Philosophy
The only assessment reported in this year's report was an examination of the grades received by
Philosophy majors taking PHIL 150: Principles of Logic. The Philosophy Department recognizes that a
more useful assessment tool for the program is required, and reports that those tools are in place and
that steps have been taken to ensure these data are collected in the coming years.
Physics
An excellent report. The department has clear assessment criteria, and is responding to deficiencies
identified by the criteria. The use of matrices to organize the data makes it very easy for the reader to
understand the information presented. The department’s use of portfolio assessment to evaluate students
is to be commended: the portfolio materials are clearly defined, as are the standards used to evaluate the
portfolios.
Political Science
A moderately flawed report that could use some improvement. The report consists mainly of observations
on closed classes and discusses the issue of an insufficient number of faculty. The report suggests that
the fact that political science courses close is indicative of quality teaching. This may well be true, but
there are other plausible explanations for full classes. Additional indicators of teaching quality would
strengthen the argument. A survey is included, but there is very little analysis of the results, which, we
thought, rather defeats the purpose of carrying out the survey.
Psychology
An outstanding report. The report is clearly organized with data to support conclusions, and it includes a
clear description of the rubric used to score various elements of the assessment.
Sociology
No report filed for this year. However, the new Chair, Diane Taub, is eager to establish a program of regular
assessment. From her email of 9/15/2005: “I have been chair of sociology for one year. As you may know,
formal assessments have not been regularly conducted in sociology. We have just received the results of
the external and internal reviews that were conducted last year. In addition to considering the suggestions
in the reviews, we have been modifying the undergraduate and graduate curricula. I very much support the
assessment process, and expect the department to submit an assessment plan and report next year.”
PROGRAMS
American Studies
No report. However, we did receive a formal reply. American Studies has been inactive for years (since
1992, only one person has earned the AMST certificate). We therefore concur with plans to completely
renovate the program, and look forward to the new assessment plans.
Ethnic and Cultural Studies
No report. This program is apparently not active and has no coordinator.
Film Studies
No report. This program is apparently not active and has no coordinator.
Gerontology
Good report. The report was understandably limited given that only two students completed the
Gerontology certificate this academic year and only one completed the exit questionnaire. On the other
hand, a 50% return rate is pretty good!
International Studies
No report was filed. However, we did receive a thoughtful reply from the coordinator, Nancy Virtue. The
primary deterrent to producing a formal assessment report is a lack of resources, mainly personnel and
time. From Nancy’s email of 11/11/05: “I have no assessment report to give you from International Studies.
I have so much to do with this program on one course release that I'm afraid assessment ends up on the
back burner. I do have an assessment plan in place for International Studies and have made occasional
stabs at meeting its goals. I think the problem is that it is sort of impracticable given all the other demands
on this position. What you might want to say by way of report is that I'm aware that a different, more
practicable assessment plan might need to be developed eventually, but that I don't have the resources
(mainly time) to do much about it”.
Journalism
Jean Colbert, the program director of Journalism, said that she would try to get a report to the committee
by the end of the year.
Liberal Studies
Good report. The program has clearly defined criteria for assessing student papers early in the program,
and at graduation. One concern with this procedure is that papers seen from year to year may be based
on assignments that vary in the complexity of analysis and research required. If a standardized
assignment could be developed by the program for assessment purposes, it would make year-to-year
comparisons more useful.
Native American Studies
Seriously flawed report. We appreciate the effort put into developing an assessment plan and submitting a
first-ever assessment report. However, aside from a head count of students who have earned certificates,
there was little assessment of the stated program goals. For example, the first stated goal is to “provide a
forum for interaction between local Native Americans and IPFW students/faculty in an academic setting”.
The assessment report should indicate how many such opportunities have been provided over the past x
number of years, how many students attended, how the forums were evaluated, etc. The other stated goals
were likewise lacking direct or indirect assessment. The report makes it clear that the primary hindrance to
a proper assessment is a lack of resources (faculty and time). Indeed, the bulk of the assessment report is
a reasoned argument/plea for additional faculty. Without such faculty the program will likely become
inactive.
Peace and Conflict Studies
No report. We received no reply from the listed director, Patrick Ashton.
Women’s Studies
Certificate: No Students
AA: No Students
BA: Good report. Clearly delineated assessment plan. The planned group exit interviews will be useful
once they can be conducted. It is clear that Women's Studies is making progress on its goals 1 and 4. It may be
useful to develop plans to measure progress on goals 2 and 3 or to improve the program to better address
these goals.
Submitted by the Arts and Sciences Committee
Stevens Amidon
Jay Jackson (Chair)
Mark Masters
A/Y 2004-2005 School of Business & Management Sciences
All departments/programs have assessment plans: N.
Each of the five concentrations within the B.S.B. degree program has a formal assessment plan on record
but, aside from the regular use of the nationally normed Test of Understanding of College Economics
(T.U.C.E.) by the economics concentration, the operational aspects of the existing plans have been found
to be largely dysfunctional. Thus, each concentration has recognized that its existing assessment plan
must be replaced. Driven by the new AACSB-International standards, each concentration and the MBA
program is currently charged with reconstructing its assessment process to conform to the new
AACSB-International guidelines for "Assurance of Learning Outcomes"
(http://www.aacsb.edu/accreditation/business/STANDARDS.pdf, pg. 74). Aside from recognition, actual
progress on plan design has been slow until very recently.
Since AACSB-International does not review associate degree programs, the Associate of Science in
Business (A.S.B.) program has been neglected in our assessment process. In fact, the A.S.B. still does not
have a formal assessment plan adopted by the faculty. The A.S.B. is offered jointly by the three
departments in SBMS and consequently must be reviewed by faculty at the school level rather than at the
department level. The school-level Undergraduate Policy Committee (UPC) has not yet made measurable
progress on the initial draft of an assessment plan for the A.S.B. that was submitted to it. This committee
still needs to edit/modify the initial draft and bring a formal proposal to the SBMS faculty.
A newly configured MBA curriculum was adopted by the SBMS faculty during the A.Y. ’04-’05 reporting
period and began initial implementation in fall 2005. This was the culmination of previous assessments of
the content and delivery system of the prior program. The new program has an articulated set of seven
strategic learning goals for graduates and an emerging assessment plan developed through the
coordinated efforts of the SBMS Graduate Policy Committee, the MBA Director and the IPFW Assessment
Director. (A pilot use of the ETS field test for MBA programs was initiated in December 2005, clearly
outside the time frame of this report.) The intent is to complement this direct measure with a portfolio
approach in the required capstone course. Use of the student and alumni surveys of the
AACSB/Educational Benchmarking Institute (EBI) will continue to be the dominant indirect measure of
MBA core curriculum assessment, as survey questions can be directly linked to each of the specific MBA
program learning objectives. Given the embryonic nature of the new program in 2005 and the fact that the
AACSB/EBI surveys are conducted only in odd-numbered years, 2005 surveys will not likely be used; the
next round of meaningful AACSB/EBI surveys for the MBA program will be in 2007.

Assessment measures are linked to program goals: Y.
The BSB program has delineated a set of six learning objectives for the business core curriculum
(published in the Bulletin). These BSB learning goals are remanded to each of the five concentrations
within the program. Each of the five concentrations then complements the school-level learning objectives
with additional learning objectives unique to its concentration; these concentration-level learning
objectives were revisited and reviewed during the ’04-’05 academic year (actual re-design occurred in AY
’05-’06).
The new MBA program has an articulated set of seven strategic learning goals for graduates; the
emerging formal assessment plan for this program is deliberately tied to these seven stated goals.

Assessment Plan Standards in Paragraph III.B.1. of SD 98-22 have been followed:
As a result of the review of the ’04-’05 concentration/program reports, it is noted that each unit has chosen
to focus more on plan re-development than on implementation of data collection under its old and
inoperable system. Such plan re-development has involved the IPFW Assessment Director to ensure
compliance with SD 98-22 as well as other pragmatic considerations.

All departments/programs submitted reports: N.
As acknowledged above, since the A.S.B. program still has no formal assessment plan, there is no
comprehensive report for that program. Enrollment management has been the focus of A.S.B. reports to
date. All five concentrations within the B.S. Business program as well as the MBA program submitted
reports. Aside from the analysis of results from the normed Test of Understanding of College Economics,
the reports received consisted primarily of introspective reflections on the results of indirect measures,
underscoring the critical need for complementary direct metrics.

Departments/programs use assessment for program improvement (please include examples from each
program): Y.
Each program's continuing intent is to use meaningful assessment measures to drive
program/concentration improvement. For the B.S. Business program, the accounting concentration
curriculum was reconfigured, effective with the fall 2004 semester, to provide for two paths of preparation:
(i) corporate accounting and (ii) public accounting. This new program flexibility was the result of
assessment dating back to 2002. Further, during AY ’04-’05, three courses (W100, L200, and J300)
spanning two SBMS departments fell under serious examination for their functionality, content, and/or
accountability within the BSB curriculum. (The report for AY ’05-’06 will reflect a definitive
cooperative-package outcome for this early assessment-driven review.)
The faculty adoption of the newly configured MBA curriculum occurred during 2004 (with initial
implementation in fall 2005). This was the culmination of lengthy prior assessments of both the content
and delivery system of the previous MBA program.

Departments/programs base recommendations on data: Y.
SBMS programs continue to have a strong empirical orientation and factual basis (including both
qualitative and quantitative data) for curricular review and redesign. Both the BSB and MBA curricular
revisions mentioned above are grounded in previously collected data.

Prior year recommendations were implemented: Y.
The implementation of both (i) the tandem paths within accounting and (ii) the new MBA program is the
direct outgrowth of completing the loop for a cycle of planning, assessment, redesign, and implementation
that extends back a year or more.

School support for assessment requested/needed: Likely in the future.
As previously reported, the Educational Benchmarking Institute (EBI) instruments are quite expensive
(exceeding $2000). The initial EBI of ’00-’01 was funded through the accreditation account, which is now
closed. The spring 2003 and summer 2005 EBI surveys were each funded by SBMS. The start-up of ETS
field testing (as proposed in some emerging plans) would severely stretch existing budgets. While it would
be valuable to perform both the EBI exit survey and the ETS field test for graduating students annually,
such an ambitious plan would necessitate a supplemental source of funding.

School-level rewards for continuous improvement: Y.
Recognition for innovation and continuous improvement is an inner driver for most faculty. Within SBMS,
the MBA students recognize instructional and motivational prowess through their annual Excellence in
MBA Teaching award. The Delta Sigma Pi business fraternity announced plans to recognize a business
faculty member annually for instructional excellence. Unfortunately, at this time there is no formal
recognition or award for contributions to curricular assessment and innovation.

Plan for school-level leadership: Y.
A new ad hoc assessment task force, representing each academic department, has been constituted to
review the component reports and provide feedback to the units. Collectively, these individuals could
provide the leadership for an aggregate/program-level assessment and review of the ASB and BSB
programs.

Recommended changes to department/program plans: Y.
Written and verbal feedback for AY 2004-’05 has been provided to each department for its BSB
concentrations, to the MBA program, and to the UPC with respect to the A.S.B. program. The feedback
attempted to (i) provide a critical review of the current processes and (ii) suggest constructive possibilities
to examine in order to rectify deficiencies. Continuing efforts are underway to increase faculty awareness
of, and participation in, the assessment process for all programs within SBMS.

Recommendations to Assessment Council: Y.
Recognize that most faculty, misunderstanding its purpose, have a degree of paranoia about assessment.
They fear that it will reflect poorly on their performance in the classroom. Thus, education of faculty is key.
We need to continue to foster and nurture the premise that assessment is a process toward an end, not an
end in itself.
Assessment Report
School of Business & Management Sciences (SBMS)
A/Y 2004-2005
Executive Summary of the Role of Professional Business Accreditation on Assessment
Professional Accreditation:
AACSB-International's continuous-improvement framework is designed to enable a business school to
articulate its mission and strategic goals and to assess its actions and outputs against those stated
intentions. The underpinnings of the professional accreditation process include formal review, both internal
(via introspection) and external (via peers), of the operating processes within a business school.
AACSB-International recently adopted a new set of standards that will directly and immediately impact
the planning and assessment processes of member schools.
(http://www.aacsb.edu/accreditation/business/STANDARDS.pdf)
The SBMS Dean has charged each department and the MBA program policy committee with reevaluating, and restructuring as needed, their current assessment processes to be in concert with the "Assurance of Learning Outcomes" section of the new AACSB-International standards. This involves the determination of multiple learning objectives, with measures and processes, for each BSB concentration and the MBA program. While AACSB-International is concerned only with bachelor's and graduate degree programs, we recognize that we must also include the previously neglected associate degree in order to be consonant with campus assessment.
AACSB-International expects a combination of direct and indirect measures tied to each recognized learning objective:
Direct measures are defined to include such internal measures of student performance as exam scores, evaluation of student projects and papers, performance in downstream courses, capstone course activities, and student portfolios, as well as external validation of student performance such as field tests, comparative CPA Examination pass rates, student "consulting" projects assembled for regional organizations, and student competitions with business students from other institutions (e.g., the Indiana CPA Society competition for seniors and the Small Business Development Center-sponsored competition for MBA students).
Indirect measures are defined to include evaluations from intern/co-op sponsors and all surveys, whether internal (such as interim and exit surveys and our longitudinal survey of BSB students conducted 1, 3, and 10 years after graduation) or external (such as those of the Educational Benchmarking Institute; see below).
The general theme of AACSB-International's assessment-of-learning and continuous-improvement push is to unite local goals and measures with external validation and benchmarking through peer institutions.
Benchmarking:
AACSB-International is more than a mere evaluator; it is also a very proactive agent for facilitating processes that can strengthen the
quality of business education. The decision by SBMS to participate in the AACSB-International / Educational Benchmarking Institute
(EBI) project, at both the B.S.B. and M.B.A. levels, will provide access to a powerful tool to facilitate vital internal assessments of our
operating processes for issues ranging from resource allocation to learning outcomes and curriculum innovation.
From the AACSB-International/EBI database of more than 100 contributing business schools, SBMS is able to extract data from six similar schools, providing a needed frame of reference for the interpretation of the internal assessments. The EBI results enable direct comparison of SBMS undergraduate responses to those of business students at:
i) the peer group of institutions (the "Select 6"), and
ii) all participating Carnegie II institutions.
This instrument is administered biennially, in odd-numbered years; consequently, the next implementation of the EBI survey will be in 2007.
It is a given that this benchmarking project will continue to be a recommended assessment metric in the new AACSB-International standards. This form of fact-based analysis will enable us to better recognize our relative strengths and weaknesses, build on the best practices of comparable institutions, and thus continue to improve the quality of the educational service we provide. Through a continuing relationship with the benchmarking project, SBMS will benefit from substantive, externally validated data on how the educational initiatives we undertake compare with those of peer schools over time.
A/Y 2004-05 School of Education Assessment Report
Criterion
Y/N
Comments/recommendations
All departments/programs have assessment plans
Y
The SOE completed an NCATE Re-Accreditation Process in the spring of 2003. The NCATE Team recognized the assessment plan used in all programs in the SOE. They critiqued the assessment plan of the Advanced (M.S.) Elementary and Secondary Education programs, which was based on the old North Central assessment system and was not "performance-based." They also critiqued some aspects of the Educational Leadership assessment plan, and critiqued the Counseling program for not having collected data in the Data Management System. We have worked since then to develop the
UAS for each of the programs, implement them, and collect data on the assessments. We presented
this data to a second NCATE Team on November 6-8, 2005 at a Focused NCATE Visit on the Unit
Assessment Systems for the Advanced (M.S.) Programs.
The Institutional Report (IR) for the Focused Visit can be found on the School of Education website (electronic submission was strongly urged by NCATE):
http://www.ipfw.edu/educ/accreditation/Institutional_Report/IR%20Focused%20Visit%208-29-05.doc
The IR documents how we developed and implemented a Unit Assessment System for each of the
Graduate programs: School Counseling, Educational Leadership, and Elementary and Secondary
Education. The plans themselves can be located at:
http://www.ipfw.edu/prst/Counseling666.pdf http://www.ipfw.edu/prst/EDLeadUAS.pdf
http://www.ipfw.edu/prst/ElSecUAS.pdf
The Educational Leadership program has developed a cohort program option; its UAS fits that option,
too. The UAS for the Initial (B.S.) programs continues to operate as we continually fine-tune it. We are
instituting e-portfolios in all programs through TaskStream (www.taskstream.com). All programs will
have had pilot projects by the end of 2006.
Assessment measures are linked to program goals
Y
Assessment Plan Standards in Paragraph III.B.1 of SD 98-22 have been followed.
Y
All departments/programs submitted reports
Y
Early Childhood program (A.S.) goals and all program courses are linked to INTASC Standards, National Early Childhood Program Standards, the SOE Conceptual Framework, and the SOE Mission Statement. Undergraduate Elementary (Early Childhood and Middle Childhood) and Secondary program (Early Adolescence and Adolescence/Young Adulthood) goals and all program courses are linked to the INTASC Standards, the SOE Conceptual Framework, the SOE Mission Statement, and, when applicable, the Indiana Professional Standards Board's standards for specific content areas.
Educational Leadership program goals are linked to the NBPTS Standards, professional standards for
School Administrators, the SOE Conceptual Framework, and the SOE Mission Statement. Counseling
program goals are linked to the NBPTS Standards, professional standards for Marriage and Family
Therapy or School Counseling, the SOE Conceptual Framework, and the SOE Mission Statement. The newly revised Advanced (M.S.) Elementary and Secondary Education programs are aligned with their Vision Statement, the SOE Conceptual Framework, the SOE Mission Statement, and the Five Core Propositions of the National Board for Professional Teaching Standards (NBPTS), which replace the INTASC standards, which are used only for pre-service teachers.
The Unit Assessment System (UAS) Taskforce coordinates all assessment efforts. Both Department Chairs serve on the Taskforce, which is chaired by the Associate Dean. Also serving on it are the Dean, the Certification Officer, the Data Manager, and faculty representatives from both departments. As of Spring 2006, representatives from all programs are also on the UAS Taskforce. On a bi-weekly basis
the Taskforce reviews, updates, analyzes, and oversees implementation of the UAS for all programs
throughout the academic year and during the summer. A UAS has been written for each program, and
assessments are made and recorded on an ongoing basis each semester. Reports on all programs are
submitted to our accreditor, NCATE, each year.
For the graduate, or Advanced, programs, reports on various assessment measures are written by
individual program faculty at the end of the academic year. Those narrative reports along with
quantitative reports generated from our Data Management System (DMS) are all reviewed once a year
at a Faculty Retreat prior to the beginning of the Fall semester. Reports from the discussions are
generated. This information is taken to the UAS Taskforce to improve the delivery of our graduate
programs.
Departments/programs use assessment for program
improvement (examples from each program
included)
Y
For the undergraduate programs in Elementary and Secondary Education, various measures are
reported and entered into the DMS as students progress through the programs. The Exit Portfolio data
is entered each semester for students completing the programs. It is analyzed for general trends each
semester. The data is submitted to the UAS Taskforce, which coordinates changes in programs or in
the UAS itself. The Associate Dean reports UAS Taskforce findings and actions to the faculty at
monthly SOE faculty meetings.
The SOE's Unit Assessment System requires that the SOE present data to the faculty for program improvement when applicable. The UAS Taskforce analyzes data from the Data Management System and reports to the faculty. It was NCATE's criticism of the UAS for our Advanced Programs that jump-started the Program Review of the Advanced Elementary and Secondary Education programs and led to the redesign and further implementation of the UAS for all Advanced programs: School Counseling, Educational Leadership, and Elementary and Secondary Education. Given the data from the newly implemented and revised UAS for the graduate programs, shared at the Retreat, we are making changes. Some changes had already been implemented and were cited in the IR for the Focused Visit. For example, Educational Leadership, following its first assessment of student portfolios in Spring 2005, added two new assessments, "Discipline" and "Special Populations," to the portfolio and wrote Portfolio Guidelines so as to better prepare Practicum students for the portfolio they are expected to present. The Educational Leadership faculty has also initiated new measures to ensure greater consistency in the portfolio assessments by evaluators.
In order to study the data from the many newly initiated assessments in the Advanced Elementary and Secondary Programs, the faculty applied for and received IRB exempt status.
The School Counseling students graduating in Spring 2005 received atypically low scores on their
comprehensive exams. Counseling faculty members are in the process of analyzing the results to see if
adjustments need to be made in the curriculum taught in program courses.
Departments/programs base recommendations on
data
Y
The SOE Data Manager, the Associate Dean, and the rest of the UAS Taskforce collect and prepare reports based on data in the DMS and on qualitative, narrative reports written by faculty. These are reviewed at the Retreat in August every year. The UAS Taskforce then coordinates recommendations for
changes with departments and program faculty. Ultimately all changes go through Academic Affairs
Standing Committee for the SOE, and then to the Faculty as a whole.
Prior year recommendations were implemented
Y
The Unit Assessment System for each of the graduate programs was in operation as of Fall 2004. The
Program Guide & Unit Assessment System for each of the graduate programs was posted on the SOE
website and copies were distributed to all faculty and students in Fall 2005. Additionally, Portfolio
Guidelines were written for the Educational Leadership program. A Faculty Retreat was held August 18,
2005, and all faculty participated in assessing, critiquing, and making recommendations for change,
given the reports generated on the assessment data for each of the programs. This information was
then given to the UAS Taskforce to take action.
On November 6-8, 2005, the NCATE Board of Examiners (BOE) team visited for a Focused Visit on the
Unit Assessment Systems for the Advanced Programs. This is NCATE Standard 2: Unit Assessment
and Evaluation. The BOE Team concluded that we met the standard.
A new cohort program in Educational Leadership began in Summer I 2005. The cohort will complete its
work at the end of Summer II 2006. The UAS for the cohort has been aligned with the UAS for the
traditional program in Educational Leadership. Educational Leadership also held a Portfolio Day in
Spring of 2005 and another in Fall 2005 for faculty and area administrators to assess the portfolios of
students of Educational Leadership.
The Advanced Program in Special Education continues in the approval process, dependent upon IU’s
decision to accept new programs.
School* support for assessment requested/needed
Y/N
History: Our first Data Manager, Dick Powell, initiated our DMS. He worked for us half-time for two years, 2001-2003, paid for by a federal grant. When the grant ended, he continued as a consultant, about 20 hours/month. He had to work in the evenings, so this arrangement was awkward to coordinate with faculty. In the 2003-2004 academic year, and into the summer of 2004, the SOE considered e-portfolios as a way to manage our ever-expanding database. We decided to wait until after the NCATE Focused Visit in fall of 2005 to initiate them. In August 2004 we hired a Graduate Assistant, Jason Hoover, a student in the M.S. program in Computer Science, to a 20-hour/week Data Manager position. We also hired a part-time secretarial assistant to enter data into the growing DMS. She assisted four staff members who also enter data. Because the original database, created in Access, proved insufficient to handle multiple users at the same time, Jason migrated the data to Oracle.
In the summer of 2005 we again sought an e-portfolio vendor, and chose TaskStream. We built the template for the undergraduate programs' e-portfolio in Fall 2005 and currently have a pilot program in one section (35 students) of the introductory education course, F300: Invitation to Teaching. The SOE is paying for these student accounts at $25/student during the pilot. We are building the templates for the graduate programs now (Spring 2006) and will initiate pilots in them in Fall 2006.
Jason Hoover will continue to work for us for an additional year (2006-2007). We would like to have a full-time faculty or staff member in this position, but according to the VCAA we would have to give up a faculty line to make that possible. Given the ever-increasing assessments the State (Spring 2008 Program Review) and NCATE (Spring 2010 Accreditation Visit) will require us to complete, we think our need for a full-time Data Manager will only grow.
University-level support for assessment
requested/needed
N
See above.
School*-level review effective
Y
The Unit Assessment System Taskforce within the School of Education meets bi-weekly and maintains
an on-going monitoring process of program assessment for analysis and curriculum review. All of our
energies this past year went to preparing for the NCATE Focused Visit of NCATE Standard 2 for the
Advanced Programs. Program faculty members were involved in all the preparations. In the end, our
Unit Assessment Systems were judged positively by NCATE and the data showed that our students
were being assessed throughout their programs, as well as at the sequenced checkpoints. All programs
have portfolio assessment. Our Initial Programs implement the Unit Assessment System we have had
in place since Spring 2003.
Recommended changes to department/program
plans
Y
At this point we continue to fine tune the UAS systems, especially as we convert our portfolio
assessment to e-portfolio assessment. Because of the particular capabilities of TaskStream, we are in
the process of making changes in our UASs for the Advanced Programs. Changes to the UAS for the
Initial Programs were relatively minor, but the Advanced Program portfolios offer more challenges.
Recommendations to Assessment Council
Y
Continued funding for a Data Manager is critical to the future assessment requirements for the School
of Education. As the state and national accreditation agencies continue to require more extensive
evidence of program and student performance success, the need for continual monitoring of our
programs and the funds to accomplish these goals will continue to increase. Even with e-portfolios, and
to a certain extent because of them, we will continue to need the help of a Data Manager. While we are
not asking for funds at this point, at some point soon it will be impossible to meet all the demands of the
State and NCATE without a full-time Data Manager.
Submitted March 8, 2006
By Kathleen Murphey, Associate Dean, the School of Education.
Hardcopies of the following have been forwarded to the Assessment Office: NCATE IR, UAS for Advanced Programs, Webpages for Accreditation Visit.
*Includes ACCS & Honors Program
Revised and approved by the Assessment Council February 2005
Indiana University Purdue University Fort Wayne
A/Y 2004-05 ETCS Assessment Report
Criterion
Y/N
Comments/recommendations
All departments/programs have assessment plans
Y
CAET, ECET, and MIET plans were approved in 2004. The ENGR plan was approved in 2005. Updated CS and OLS plans are being developed.
Assessment measures are linked to program goals
Y
This will be fully implemented upon approval of the updated CS & OLS plans.
Assessment Plan Standards in Paragraph III.B.1. of SD 98-22
have been followed.
Y
This will be fully implemented upon approval of the updated CS & OLS plans.
All departments/programs submitted reports
Y
Currently a separate report is submitted for each program, and many of these separate reports are nearly identical, differing only in minor details (for instance, between the B.S. and A.S. versions of a program). Since departmental assessment plans include assessment practices for each program, during the next cycle of reports each department will be asked to submit a single report covering results from each of its programs, without the substantial duplication that now occurs.
Departments/programs use assessment for program
improvement (please include examples from each program).
Y
Departments/programs base recommendations on data
Y
Prior year recommendations were implemented
Y
School* support for assessment requested/needed
Y
School level support is provided to prepare for assessment-based accreditation visits.
School*-level review effective
Y
CAET, CS, ECET, MIET, and ENGR departments have nationally accredited programs. These
accreditations are assessment-based and form the basis for their assessment plans and for all
assessment activities. Accreditation requirements change the nature of school level review of
assessment. School level review essentially ensures that each department/division assessment
plan follows IPFW and, if applicable, accreditation requirements. School-level assistance can be valuable during preparations for assessment-based accreditation visits. No appropriate accrediting
agency for OLS programs has been identified. However, the Division of OLS is highly tuned to
assessment practices and leads the way in many areas of assessment, thus, school level review,
other than the annual report has been unnecessary.
University-level support for assessment requested/needed
Y
Faculty release time and/or funding for student assistance with data compilation was requested
and received.
Recommended changes to department/program plans
Y
See attached document. It is expected that changes will be made to several department
assessment plans based on the results of the accreditation visits.
Recommendations to Assessment Council
Y
See attached document
*includes ACCS & Honors Program
Revised and approved by the Assessment Council, February 2005
School of Engineering Technology and Computer Science and
Division of Organizational Leadership and Supervision
Evaluation of AY 04/05 Assessment Reports
General:
National Accreditation:
Many of the degree programs in the departments of Engineering, Technology, and Computer Science are nationally accredited by the Computing Accreditation Commission (CAC), the Engineering Accreditation Commission (EAC), or the Technology Accreditation Commission (TAC) of the Accreditation Board for Engineering and Technology (ABET).¹ All of these accreditations are assessment based, and departmental assessment plans and assessment measures used conform to accreditation requirements. For these five departments, data is collected, continuous improvement is evident in all areas, and faculty involvement is continuous and extensive, based on accreditation requirements.
The B. S. in Computer Science program was notified of its initial accreditation by CAC/ABET in August 2004 and expects to
receive notification of continuing accreditation of the program in August 2006. The A.S. in AET, A.S. in CET, B.S. in CNET, A.S.
and B.S. in EET, A.S. and B.S. in IET, and A.S. and B.S. in MET programs were notified of their re-accreditation in August 2005.
An EAC/ABET accreditation visit was conducted during Fall 2005 for re-accreditation of the B.S. in Electrical Engineering and
B.S. in Mechanical Engineering programs and for initial accreditation of the B.S. in Computer Engineering program. Notification
of the results will occur in August 2006.
Assessment Plans: Currently approved assessment plans that are on file in the office of the VCAA are available at
http://www.ipfw.edu/vcaa/Assessment/assmntinfo.htm. ENGR, CAET, MIET, and ECET departmental assessment plans have
been updated to correspond with senate document SD 98-22 (Sep 03 revision). The CS plan (circa 1997) and the OLS plan
(circa 2001) are in the review process and will be updated to correspond to the senate document.
Division of Organizational Leadership and Supervision:
• The OLS report was comprehensive, data is collected, continuous improvement is evident in all areas, and faculty
involvement is continuous and extensive.
• A revised division assessment plan to include the M.S. program and updates to make it compliant with SD 98-22 is
expected to be submitted for review by the Assessment Council during 2006.
• The M.S. in Organization Leadership and Supervision was approved by the Indiana Commission for Higher Education
during the summer of 2005.
• Notable is that OLS has developed a framework for assessing diversity and defined diversity goals for the A.S. and B.S.
programs.
Computer Science Department:
• The B.S. program in Computer Science is accredited based on a rigorous, assessment based, national accreditation
process. Notification of continuing accreditation for the program is expected in August 2006.
• The A.S. and B.S. in Computer Science, the A.S. and B.S. in Information Systems, and the M.S. in Applied Computer
Science program must be addressed in the revised CS assessment plan. The department assessment plan, circa 1997,
needs to be updated to reflect all programs and to comply with the latest revision of SD 98-22. The updated
assessment plan is expected to be submitted to the Assessment Council for approval during 2006.
¹ Accreditation Board for Engineering and Technology, 111 Market Place, Suite 1050, Baltimore, MD 21202
• Notable are the CS Continuing Improvement Actions and Responses to the assessment-based accreditation visit:
- The Department uses the results of periodic assessments to help identify opportunities for program improvement. The following figure illustrates the process:
[Figure: Program Improvement / Curriculum Change Process — analyzed assessment data, individual student feedback, and ACM/IEEE accreditation standards feed individual and curriculum recommendations through course groups, the departmental faculty (including the Academic Affairs Committee), the Associate Chair for Assessment (with the department as a committee of the whole), the CS Professional and Student Advisory Board, the school (SETCS), and University Academic Affairs, yielding new or improved curriculum, new assessment data, and faculty development activities.]
The department has "toured" all arrows and processes in this graph at least once. Input has been solicited from CS faculty and from professional, alumni, employer, graduating-senior, and other stakeholders. Many suggestions have been incorporated in the CS assessment process, for example:
- a new objective was added on leadership skills
- new courses were added that emphasize team projects for real customers
- other courses were modified to include exposure to a Unix environment
- the frequency and timing of various assessments was altered
o The department taught most of the following new/modified courses at least one time:
- human computer interface
- web development
- software engineering
- capstone design
- advanced graphics
- graphics using Linux
o The department collected student evaluation data on such courses.
o The department collected direct measures from client data for the capstone course the one time it has been offered (Spring 2005).
o The department added a new required course that addressed the new objective of "leadership skills."
o The department modified several of its survey instruments and procedures, e.g., on-line vs. hard copy.
To ensure adequate support personnel, Mr. Su Yi was hired on a 50% appointment for this position during 2004-05
and 2005-06. It is planned to make a permanent appointment for such a technical support person, beginning in
2006-07.
Engineering Department:
The B.S. programs in Electrical Engineering (EE) and Mechanical Engineering (ME) hosted re-accreditation visits in October 2005. The recently approved B.S. in Computer Engineering (CpE) hosted a simultaneous initial accreditation visit. The accreditation process is rigorous, assessment based, and nationally recognized.
• The department has a comprehensive assessment procedure implemented to satisfy national accreditation standards
and this procedure is also being used in the new program.
• The department assessment plan was approved during spring 2005. This assessment plan has been recently updated
using feedback from the ABET review team. The most recent change to the plan is the setup of a process to evaluate
and update the educational goals. The revised assessment plan is expected to be submitted for approval by the
assessment council during 2006.
An M.S.E. degree, an M.S. in Engineering degree, and a B.S. in Civil Engineering degree are in the proposal stage, and assessment measures will be developed for these new programs during the approval process.
• The engineering accreditation agency (ABET) considers a university website a binding document. The IPFW main
website contains many instances where components of the engineering programs are described in error. This is due to
the use of old information and the fact that the people who build and maintain those web pages never contact the
Department of Engineering to confirm the accuracy of what is being posted online.
- Last October the IPFW main website cited an ABET accredited engineering degree that IPFW does not offer. This
caused a severe warning from the ABET review team.
- An old set of computer engineering educational objectives was listed on this same site for almost a year despite
multiple requests to have them updated.
The Assessment Council is asked to help in establishing a mechanism that ensures that any description of the
engineering programs on the IPFW main website is accurate. The Department of Engineering website:
http://www.engr.ipfw.edu/ is continuously updated and has the most accurate information about the engineering
programs.
Notable are the Engineering Continuing Improvement Actions and Responses to the assessment-based accreditation visit. These are in progress to respond to the draft EAC/ABET findings expected in January 2006, which provide a 30-day window to address discrepancies.
Engineering Technology (General)
• A new M.S. degree in Technology has been submitted for approval and is expected to begin in Fall 06. An assessment
plan and procedure for this program will be developed during the approval process. The new degree will have principal
technology areas that correspond to specialty areas within the departments of ECET, MIET, and CAET.
Civil and Architectural Engineering Technology Department:
• The A.S. in Architectural Engineering Technology, the A.S. in Civil Engineering Technology, and the B.S. in
Construction Engineering Technology underwent assessment based accreditation visits by TAC/ABET during
September 2004 and the programs received re-accreditation in August 2005. The accreditation process is rigorous,
assessment based, and nationally recognized.
• Changes in the assessment procedures and processes will be developed based on the accreditation actions. These
changes will be implemented and, upon completion of the implementation process, will be included in a revised plan.
• Notable are the CAET Continuing Improvement Actions and Responses to the assessment-based accreditation visit:
- The program objectives were reviewed by the faculty to ensure that they accurately represent the department and
that they are measurable.
- The Industrial Advisory Board of the Department of Civil and Architectural Engineering Technology was
reorganized and met to review the program objectives.
- In accordance with the CAET Assessment and Continuous Improvement Plan, the program objectives will be
revised in response to the review of the constituencies.
- The capstone course was reviewed with the instructor to identify the program objectives and outcomes that must
be measured by this course. The course syllabus was revised to reflect the expected outcomes. A specific project
was selected for this course that provided a multi-disciplinary problem solving experience for the students.
o All members of the CAET department faculty served as “consultants” to students during the
semester.
o All members of the faculty were involved in assessment of the course.
Electrical and Computer Engineering Technology Department:
The A.S. and B.S. in Electrical Engineering Technology underwent assessment-based accreditation visits by TAC/ABET during September 2004, and the programs received re-accreditation in August 2005. The accreditation process is rigorous, assessment based, and nationally recognized.
• The recently approved B.S. in Computer Engineering Technology is expected to apply for an initial accreditation
evaluation visit in the fall of 2007.
• Notable is the following quote from the accreditation report: "There is ample evidence that the Electrical Engineering
Technology Program has adopted and practiced Continuous Improvement and Outcomes-based Assessment
techniques for a number of years. It therefore has a mature set of process that can point to improvements and
accomplishments. Among them are the introduction of new courses and other curricular improvements".
• Also notable are the ECET Continuing Improvement Actions and Responses to the assessment-based accreditation
visit:
- The alumni survey was included in the departmental newsletter (“ECET-BITS”). This newsletter circulates to
all departmental alumni once a year.
- The department revised the survey instrument.
- New program outcome metrics identifying the assessment techniques were prepared for the fall 2005
semester.
- The newly defined program outcome metrics will be used in the EET BS Program Outcomes assessment.
Mechanical and Industrial Engineering Technology Department:
The A.S. and B.S. in Industrial Engineering Technology and the A.S. and B.S. in Mechanical Engineering Technology underwent assessment-based accreditation visits by TAC/ABET during September 2004, and the programs received re-accreditation in August 2005. The accreditation process is rigorous, assessment based, and nationally recognized.
• Notable are the MIET Continuing Improvement Actions and Responses to the assessment-based accreditation visit:
- Because of poor response to the alumni and employer surveys, the survey form was modified, and the
procedure was changed to survey all alumni who graduated in the previous five years. Alumni who did not
respond to the survey were contacted by telephone and asked to complete the survey, which resulted in a 76%
response rate. Achieving this response rate required contacting alumni over a period of six months. This data
collection was completed in December of 2005. The data are currently being evaluated by the curriculum
committee.
- The departmental Industrial Advisory Committee (IAC) evaluated and approved program educational
objectives.
Indiana University Purdue University Fort Wayne
A/Y 2004-05 Assessment Report for General Studies
Criterion | Y/N | Comments/recommendations
All departments/programs have assessment plans | Y | We are the only academic unit in the Division of Continuing Studies.
Assessment measures are linked to program goals | | We are changing our program goals.
Assessment Plan Standards in Paragraph III.B.1. of SD 98-22 have been followed. | | See below.
All departments/programs submitted reports | Y |
Departments/programs use assessment for program improvement (please include examples from each program). | | We are a system-wide degree program through Indiana University's School of Continuing Studies. Final decisions regarding degree policies and procedures are not made at this campus. We may make suggestions for changes that would affect the whole system through our campus faculty representative to the School of Continuing Studies Faculty Council. Changes approved by this body would then be implemented throughout the system. The system-wide School does not have an assessment process in place; we are among the first to do so.
Departments/programs base recommendations on data | | See above.
Prior year recommendations were implemented | | We implemented a pilot program last year. Based on recommendations from the faculty review of the program, it was determined that another pilot needed to be done.
School* support for assessment requested/needed | N | The Executive Director of the FW Division of Continuing Studies is very supportive of assessment.
School*-level review effective | | There is no "school" review.
University-level support for assessment requested/needed | | The Vice Chancellor financially supported my attending the national assessment conference at IUPUI last fall.
Recommended changes to department/program plans | N | The pilot needs to be repeated.
Recommendations to Assessment Council | |
*Includes ACCS & Honors Program
Revised and approved by the Assessment Council, February 2005 - Submitted Spring 2006
The original mission/goals/assessment plan was approved in Senate Document 98-21. That plan is no longer accurate,
and we feel the following mission/goals statement more accurately reflects our program at this time.
GENERAL STUDIES DEGREE PROGRAMS
ASSESSMENT REPORT
The General Studies Degree Program is a unique program, part of a university-wide system of the Indiana University School of
Continuing Studies. Students may start and complete an Associate of Arts or Bachelor of General Studies on any of Indiana
University’s campuses or through Independent Study. Students are subject to the policies, procedures, and graduation
certification process of the system-wide School of Continuing Studies. A key characteristic of an adult-oriented program is the
flexibility that allows students to individualize the program, incorporating their academic and career goals into the degree
requirements. Students bring multiple sources of knowledge to the program, based upon personal experiences and life
responsibilities. Because of the unique nature of the program, students may be with the program for as little as 10 credit
hours (Associate of Arts) or for the entire 120 credits of the Bachelor's Degree.
GOALS FOR THE GENERAL STUDIES DEGREE PROGRAMS
*Provide students with the opportunity to complete a non-specialized curriculum based on individual choice or needs.
*Enable transfer students to maximize the number of credit hours applicable toward the degree.
*Provide students with a means of professional advancement and development of specific career related skills.
*Provide students with a degree program that offers quality, convenience, reputable advising, and personal satisfaction.
ADDITIONAL GOALS FOR ASSOCIATE OF ARTS IN GENERAL STUDIES
*Help students to view this degree as progress toward educational goals.
*Help students build confidence by excelling in college courses, no matter what their age.
ADDITIONAL GOAL FOR THE BACHELOR OF GENERAL STUDIES
*Provide students with basic preparation for many careers and graduate programs.
Direct Measures of Student Learning Outcomes
1. Compare a sample of General Studies students in General Education Area VI classes to a sample of IPFW students in Area
VI using the same rubric.
Pilot Study
• Select 3 faculty members to meet to review the rubric
• Read the same papers
• Meet again to review the inter-rater reliability
• Adjust the rubric as appropriate
Nine (9) papers were received from General Education Area VI classes. Three (3) faculty members from the General Studies
Faculty Advisory Committee reviewed the papers using two different rubrics, one analytic and one holistic. After review, the
faculty determined that this was not an adequate sample and the pilot needed to be repeated. They did determine that they
preferred the analytic rubric with the addition of one category from the holistic rubric. They also determined that it was
necessary for each of them to read all the papers to have a true score. The pilot will be repeated again this year.
Indirect Measures (Spring 2005 Enrollment/497 Students/4342 Hours)
1. Collect demographic information on General Studies students and compare to university data from semester to semester.
Collect information in such areas as: age, gender, ethnicity, GPA, enrollment status, and class standing.
Spring 2005

Gender | IPFW % | General Studies % | G.S. % of IPFW Total
Male | 43.14 | 36.82 | 4.23
Female | 56.84 | 63.18 | 5.44

Ethnicity | IPFW % | General Studies % | G.S. % of IPFW Total
Asian/Pac.Is. | 2.38 | .63 | 1.35
Black | 4.73 | 9 | 8.96
Hispanic | 2.56 | 1.05 | 2.07
Am.Ind./Alas. | .31 | .63 | 9.38

Age | IPFW % | General Studies % | G.S. % of IPFW Total
0-17 | 1.16 | 0 | 0
18-22 | 33.68 | 7.53 | 1.14
21-25 | 34.96 | 35.15 | 4.95
26-30 | 12.10 | 15.9 | 6.37
31-50 | 16.7 | 34.31 | 9.61
Over 50 | 1.41 | 7.11 | 20.73

Enrollment | IPFW % | General Studies % | G.S. % of IPFW Total
Part-time | 57.32 | 39.54 | 3.45
Full-time | 42.68 | 60.46 | 6.83

Class Standing | IPFW % | General Studies % | G.S. % of IPFW Total
Freshmen | 40.32 | 18.83 | 2.37
Sophomore | 20.88 | 19.67 | 4.65
Junior | 16.02 | 22.38 | 6.75
Seniors | 22.89 | 39.12 | 8.13

Average GPA | 2.79 | 2.84 |

2. Collect retention information from fall to fall, spring to spring, compare with University data. (not collected for spring to spring)
3. Collect information on students declaring majors in General Studies, prior majors, admission categories, and semester to
semester.
Spring 2005

Admission Category | Numbers
Adult Admit | 3
GED Admit | 1
High School Admit | 2
Permanent Intercampus Transfer | 4
Re-admit | 7
Re-entry | 46
Transfer | 25

Change of Major | Numbers
A&S | 17
BUS | 5
EDUC | 14
ET | 6
HS | 12
OLS | 1
SPEA | 4
ACCS | 18
VP | 3

4. Survey graduates, five years and one year out.
2005 General Studies Survey Results
• 285 students who graduated from General Studies in 2003-2004 and 1999-2000 (1 and 5 years out) were sent the survey
• 5% of the surveys were returned
• 73% of those who responded are still living in Indiana
• 73% of those who responded had earned their BGS and 46% had earned their AGS
• 67% of the respondents were women and 33% were men
• 93% of the respondents felt their degree expectations were met and 7% did not
• 53% of the respondents felt that their IPFW experience was good and 47% felt it was excellent
• 40% of the respondents pursued further higher education after graduation
Participants were employed by:
Self-employed
IPFW
FW Psychiatry
University of St. Francis
Sauer Land Surveying
1st National Bank of Arizona
Christ Lutheran Church
Crowe Chizek
St. Joseph’s Hospital
Manchester College
Ivy Tech
Park Center
Emergency Medicine of IN and Triad Hospital
Precimed
Professional Tutors of America
Apollo Courier of LAX
Rehab Hospital of FW
Current Job Titles Included:
Landscaping/Lawn Care
Gift processing and Database coordinator
Physician Assistant
Radiologic Technology Clinical Coordinator
Administrative Assistant
Wholesale Account Manager
Pastor
Internal Audit Consultant
Program Director, FW School of Radiology
Adjunct Faculty
MSW, Therapist 1
Physician Assistant ER
CNC Mill programmer/machinist
Tutor
Delivery Driver
Administrative Assistant
Skills Former Students Reported Using:
Tourism management
People management
General office skills
Pastoral skills
Business
Communication
Life experience combined with theory and technical knowledge
Programming
Trigonometry
Calculus
Math
Responsibility
Organization
How were your goals met?
Thought career opportunities would be better
It was a requirement for admission into a graduate program
Needed degree to keep job
Accepted into master’s degree in communication
Needed MSW for desired profession, BGS allowed me to change quickly by accepting older credits
Allowed me to focus on the classes I needed for Physician Assistance program without waste of time and money
I loved my education at IPFW and miss the campus a lot. I feel like it is my 2nd home.
What was your experience with GS/IPFW?
Sandy McMurtrie was wonderful in helping me figure out what program to apply my credits earned at IPFW!
I would like to see the GNST program offer internships or some type of job placement help.
Provides a lot of opportunities for AS degree students to obtain a BS degree. Great advisors.
GNST degrees are flexible and can be adapted easily to fit students' needs - also allows students to build on older college credits, making it easier for returning adults to complete degrees in a more timely and economic manner.
I can't say enough wonderful things about the program. The advisors and staff were so helpful and encouraging. IPFW is a great place for an education. They don't allow students to fall through the cracks. I wouldn't be where I am today if it weren't for the caring professionals at IPFW. Kudos to Mrs. Kimble - she wouldn't let me quit!
General Studies graduates have been surveyed back to 1996. Because of the low response rate this year, we have decided to
put our survey on our web page and pilot it with this group of alumni. We will send them a postcard requesting that they go to
our web site and complete the survey. A reminder postcard will be sent to those who do not complete the survey.
Concluding Thoughts
The General Studies Degree Program will be participating in the program review process this year. The system-wide General
Studies Program has also changed some degree requirements that could have an impact on our assessment process. Based
on these two things, I believe we will be changing our assessment process again. We have been collecting more complete
data for two years now, so we will have more information for comparisons in next year's report. At this point we will be
repeating the pilot program for assessment.
Indiana University Purdue University Fort Wayne
A/Y 2004-05 School of Health Sciences Assessment Report
Criterion | Y/N | Comments/recommendations
All departments/programs have assessment plans | Y | All HSC programs have assessment plans. Please see comment below on plan review.
Assessment measures are linked to program goals | Y | All HSC programs were asked to review their current plans and verify that their program goals could be measured. Any goals that could not be measured were to be changed.
Assessment Plan Standards in Paragraph III.B.1. of SD 98-22 have been followed. | Y | Per the mission and goals of SD 98-22: "Assessment plans are designed to evaluate whether the goals of the general education program, and of the respective certificate and degree programs, are being achieved. The goals of the general education program, and of each certificate and degree program, have been approved separately. Review of mission and goals is periodically undertaken by schools, divisions, and departments, culminating in Senate approval of any revisions."
All departments/programs submitted reports | N | The Graduate Nursing program did not submit a report.
Departments/programs use assessment for program improvement (please include examples from each program) | Y |
  CFS – continues to use portfolios, alumni surveys, employer surveys, and ServSafe examination results as multiple measures of effectiveness. They continue to add more student extramural sites throughout the U.S. and the world.
  Dental Assisting – is pursuing a bachelor degree with two core concentrations. They are also pursuing a dental specialty associate degree, per the VCAA's approval, since the need is evident from student, graduate, and employer survey results.
  Dental Hygiene – is pursuing a bachelor degree with two core concentrations. They plan to pursue a master's degree after the B.S. degree.
  Dental Lab Tech – has increased applicants by heavily recruiting in area high schools and high schools throughout the state. The program is pursuing a bachelor degree with two core concentrations.
  Human Services – Faculty created additional research opportunities for students to participate in professional development. Students were encouraged to present their research at regional conferences this past year.
  Nursing – is making major curriculum changes in Fall 2006 with a new B.S. (completion) nursing degree called "Career Steps": LPN > RN Mobility Degree > A.S. Nursing degree > Critical Care Certificate > B.S. Nursing degree. The graduate program director is pursuing three new graduate degrees, such as nurse practitioner models and administration.
Departments/programs base recommendations on data | Y | Documentation presented to curriculum reviewers.
Prior year recommendations were implemented | Y | Most recommendations to programs were implemented. However, some recommendations require more time to implement.
School* support for assessment requested/needed | Y | The undergraduate and graduate nursing programs made many changes to their curriculum and are in the process of adding more graduate programs.
School*-level review effective | Y |
University-level support for assessment requested/needed | N | Not requested
Recommended changes to department/program plans | Y | Nursing will be changing their assessment plan since their B.S. degree will change this year. Other HSC departments have reviewed or are in the process of reviewing their program goals and correlating their goals to the multiple measures of assessment.
Recommendations to Assessment Council | N |
Indiana University Purdue University Fort Wayne
A/Y 2004-05 Honors Assessment Report
Criterion | Y/N | Comments/recommendations
All departments/programs have assessment plans | Y/N |
Assessment measures are linked to program goals | Y/N |
Assessment Plan Standards in Paragraph III.B.1. of SD 98-22 have been followed. | Y/N |
All departments/programs submitted reports | Y/N | Not applicable
Departments/programs use assessment for program improvement (please include examples from each program). | Y/N | So far we do not have enough data to apply toward program improvement
Departments/programs base recommendations on data | Y/N | Not applicable at this time
Prior year recommendations were implemented | Y/N | Not applicable
School* support for assessment requested/needed | Y/N |
School*-level review effective | Y/N |
University-level support for assessment requested/needed | Y/N |
Recommended changes to department/program plans | Y/N | Not applicable
Recommendations to Assessment Council | Y/N | Not applicable
*Includes ACCS & Honors Program
Revised and approved by the Assessment Council, February 2005
A/Y 2004-05 HONORS PROGRAM ASSESSMENT REPORT
Introduction
The Honors Program Assessment Plan was developed during the 2004-2005 academic year and approved on April 1, 2005 by the
Honors Program Council. For assessment purposes the program has three components: Honors Courses, the H-Option contract,
and the Honors Project.
Questionnaires and project evaluation forms (assessment tools) were distributed and collected in the manner described in the
Assessment Procedure section of the Assessment Plan. Data for each category were pooled and the results are presented
below.
1. Assessment of Honors Courses
Assessment tools (Honors Course Assessment for Students and Honors Course Assessment for Faculty) used for this category
were mainly designed as indirect measures for assessing program goals listed under category B, Environment of Intellectual
Excitement and Discovery, of the Honors Program Assessment Plan.
The responses to the Honors Course Assessment for Students (page 3; items 1-4) indicate that 68 respondents generally agree
that honors courses foster student interaction with peers, enhance faculty interaction with students, provide an interdisciplinary
approach to teaching and learning, and promote the creation of a community with diverse ideas. In addition, the responses to
item 5 of the questionnaire (page 2) show that students feel that honors courses help them with the development of critical
thinking skills. Faculty response to the assessment tool was low (only three forms were returned) but positive. Data presented on
page 4 suggest that teaching an honors course was intellectually challenging for the faculty members and encouraged
faculty-student interaction. It is interesting to note that all three faculty members indicated that they would like to remain
involved with the honors program (items 6, 7 and 8). Two out of three respondents said that they would encourage other
faculty to get involved in teaching an honors course. Collectively, student and faculty responses suggest that honors
courses are meeting
Honors Program goals in category B mentioned in the above paragraph.
Honors Course Assessment for Students – 68 Forms
Please write the number that matches how much you agree or disagree with the following statements based upon the following
scale:
1—Strongly Disagree; 2—Disagree; 3—Neutral/Unsure; 4—Agree; 5—Strongly Agree
Note: The numbers in blanks represent averages from 68 forms unless otherwise indicated by the number in parentheses.
1 | 4.2 | This course encouraged me to interact with other students in the class.
2 | 4.0 | This course encouraged me to interact with my professor.
3 | 3.9 | This course incorporated material from multiple disciplines (such as biology, sociology, literature, music, etc.).
4 | 4.4 | The instructor of this course encouraged participation from all students.
5 | 4.1 | This course helped me to develop my critical thinking skills.
6 | 4.0 (67) | I would take another honors course.
7 | 3.8 | I would take another course with this professor.
8 | 3.7 | I would recommend this course to my friends.
9 | 3.8 | I would recommend this professor to my friends.
10 | 4.1 | The teaching methods were appropriate to the course material.
Honors Course Assessment for Faculty
Please write the number that matches how much you agree or disagree with the following statements based upon the following
scale:
1—Strongly Disagree; 2—Disagree; 3—Neutral/Unsure; 4—Agree; 5—Strongly Agree
Note: The numbers in blanks represent averages from 3 forms unless otherwise indicated by the number in parentheses.
1 | 4.3 | Teaching this course challenged me intellectually.
2 | 4.7 | This course encouraged me to interact with the students.
3 | 4.0 | This course incorporated material from multiple disciplines (such as biology, sociology, literature, music, etc.).
4 | 5.0 | I expected more from the students in this course than from those in a regular section.
5 | 3.0 | The students in the course performed at a satisfactory level.
6 | 5.0 | I would consider doing an H-Option.
7 | 5.0 | I would consider teaching another honors course.
8 | 5.0 | I would recommend taking this course (as an honors course) to students.
9 | 4.5 (2) | I would recommend teaching this course (as an honors course) to other faculty.
10 | 4.7 | This course was conducive to having an Honors Section.
2. Assessment of H-Option
Assessment tools and associated program goals for this category were similar to those mentioned in the Honors Courses
category. Student responses shown on an H-Option Assessment for Students form (page 6) clearly indicate that students more
than agree that H-Options enhance faculty interaction with students (items 2-4) and allow them to achieve their learning goals
under the guidance of dedicated faculty (items 1,5,6, 10 and 11). Faculty responses (page 7) reflect similar sentiments.
Collectively, the student and faculty responses suggest that the H-options have been successful in creating an environment of
intellectual excitement and discovery.
37
H-Option Assessment for Students
Please write the number that matches how much you agree or disagree with the following statements based upon the following
scale:
1—Strongly Disagree; 2—Disagree; 3—Neutral/Unsure; 4—Agree; 5—Strongly Agree
Note: The numbers in blanks represent averages from 10 forms unless otherwise indicated by the number in parentheses.
1 | 4.8 | I met my learning objectives in this H-Option.
2 | 4.7 | I was enthusiastic about meeting the requirements for this H-Option.
3 | 4.6 | The faculty member was enthusiastic about working with me.
4 | 4.3 | The H-Option encouraged more interaction between the faculty member and me.
5 | 4.6 | I was satisfied with the quality of the project or work that resulted from the H-Option.
6 | 4.1 | The H-Option was better than I had anticipated it would be.
7 | 4.6 (9) | I would consider contracting another H-Option.
8 | 4.8 (9) | I would consider contracting another H-Option with this professor.
9 | 4.5 | I would consider taking an Honors Course from this professor.
10 | 4.4 (9) | I would consider working with this professor on my Honors Project.
11 | 4.8 | I would recommend this faculty member to another student who is considering contracting an H-Option or taking an Honors Course.
H-Option Assessment for Faculty
Please write the number that matches how much you agree or disagree with the following statements based upon the following
scale:
1—Strongly Disagree; 2—Disagree; 3—Neutral/Unsure; 4—Agree; 5—Strongly Agree
Note: The numbers in blanks represent averages from 17 forms unless otherwise indicated by the number in the parentheses.
1 | 4.3 (16) | My suggestions made a significant difference in how this H-Option was structured.
2 | 4.3 | The student had clear learning objectives in this H-Option.
3 | 4.5 | The student met his or her learning objectives in this H-Option.
4 | 4.6 | I was enthusiastic about working with the student.
5 | 4.0 | The H-Option encouraged more interaction between the student and me.
6 | 4.6 | I was satisfied with the quality of the project or work that resulted from the H-Option.
7 | 3.5 | The H-Option was better than I had anticipated it would be.
8 | 4.1 | I would like to have this student in another of my classes.
9 | 4.5 | I would consider facilitating another H-Option.
10 | 4.6 | I would consider facilitating another H-Option with this student.
11 | 2.7 | I would consider making this course an Honors Course.
12 | 3.4 | I would consider teaching an Honors Course.
13 | 4.5 | I would consider working with this student on his or her Honors Project.
14 | 4.4 | I would recommend the H-Option to another faculty member.
15 | 4.5 (16) | I would recommend this student if another faculty member was considering facilitating an H-Option with them.
3. Assessment of Honors Project
Two students presented their Honors Projects and the data for those projects are shown on pages 9 and 10. Although the data
suggest that Honors Program goals are being met, given the small sample size, it is inappropriate to provide any interpretations
at this time.
Furthermore, it should be noted that the process of project evaluation was fraught with problems and the Honors Program
Council is currently revising the process. Once the evaluation process is established, the assessment process will be modified to
reflect the changes.
Evaluation of Honors Project: Presentation
Summary for Spring of 2005
Please note: The numbers in parentheses are the averages from 8 evaluation forms.
Scale: 5 Excellent, 4 Very Good, 3 Good, 2 Fair, 1 Poor

Content
1. Synthesis of ideas (4.25)
2. Clarity of rationale (3.875)
3. Support of conclusions with appropriate evidence (4.0)
4. Knowledge of subject matter (4.375)
5. Breadth and depth of treatment (4.375)
6. Independent thinking (4.375)
7. Identifiable and pertinent methodology (4.25)

Presentation
1. Sequence and organization of topics (4.25)
2. Handling of audience questions (4.375)
3. Clarity (4.375)

Overall evaluation (4.25)
(The overall evaluation should give greater weight to content than to presentation.)

Evaluation of Honors Project: Paper
Please note: The numbers in parentheses are from one evaluation form.
Scale: 5 Excellent, 4 Very Good, 3 Good, 2 Fair, 1 Poor

Merit and Content
1. Synthesis of ideas (5)
2. Clarity of rationale (5)
3. Support of conclusions by appropriate evidence (3)
4. Knowledge of subject matter (5)
5. Breadth and depth of treatment (5)
6. Independent thinking
7. Identifiable and pertinent methodology

Organization
1. Sequence and organization of topics (5)
2. Clarity of language (5)

Overall evaluation (5)
(The overall evaluation should give greater weight to content than to presentation.)
Indiana University Purdue University Fort Wayne
A/Y 2004-05 SPEA School Assessment Report
Criterion | Y/N | Comments/recommendations
All departments/programs have assessment plans | Y | NASPAA accreditation reports were due in June of 2005, and our current assessment plan was incorporated in that report. The NASPAA site visit is scheduled for January 30, 2006. The School implemented the MPM in the fall of 2005; consequently, assessment of the Master's program will not begin until Spring 2006. No assessment will be made of the current MPA since it is in the process of being phased out.
Assessment measures are linked to program goals | Y | Measures and program goals were articulated and agreed upon by faculty in the Summer of 2004.
Assessment Plan Standards in Paragraph III.B.1. of SD 98-22 have been followed. | Y |
All departments/programs submitted reports | Y |
Departments/programs use assessment for program improvement (please include examples from each program). | Y | Essay questions, as a qualitative measure, have already been finalized; multiple-choice questions, as a quantitative measure, are being assembled by participating faculty. Efforts will be made to implement both of these measures in V170 before the semester ends. Faculty are also in the process of adopting new course evaluations in fall 2006.
Departments/programs base recommendations on data | Y | Data for the undergraduate degree will be obtained once measures are finalized; data acquisition for the MPM will begin fall 2006.
Prior year recommendations were implemented | Y | The B.S. in Criminal Justice was suspended to avoid duplication and enhance the program in public affairs.
School* support for assessment requested/needed | N/A |
School*-level review effective | Y | Efforts are being made to increase the number of faculty participating in the assessment process.
University-level support for assessment requested/needed | N/A | From another school's report: Faculty release time and/or funding for student assistance with data compilation was requested and received.
Recommended changes to department/program plans | Y | B.S. in Criminal Justice suspended; MPM will be implemented Fall 2005 and assessment will begin Fall 2006.
Recommendations to Assessment Council | N |
*Includes ACCS & Honors Program
Revised and approved by the Assessment Council, February 2005
Indiana University Purdue University Fort Wayne
A/Y 2004-05 School of Visual and Performing Arts Assessment Report
Criterion
Y/N
Comments/recommendations
All departments/programs have assessment plans
Y/N
Music : Interim and exit assessment measures for all degrees.
Music is continuing to assess core curriculum common to all degrees. This process was
started from their 2003 assessment report
Theater: Interim and exit assessment measures including external reviews. Theatre has
added end of semester reviews for each student.
Visual Communication and Design: Interim and exit assessment measures.
Fine Arts: Interim and exit assessment measures
B.A. Art Education assessment plan should be sent forward to assessment committee.
Assessment measures are linked to program goals
Y
All departments have goals in place and seem to be connected to their assessment. The
VPA curriculum committee recommends that each department directly state all departmental
goals on all future assessment reports.
Assessment Plan Standards in Paragraph III.B.1. of SD 98-22
have been followed.
Y
All programs are meeting these standards. However this committee recommends all
departments review Standards in Paragraph III.B.1. of SD 98-22 for next cycle.
All departments/programs submitted reports
Y
Departments/programs use assessment for program
improvement
Y
43
Music: Music in an Outside Field has grown in numbers as a result of redefining
requirements for the degree. Better organized and scheduled audition dates has resulted
better prepared students, and more involved parents and adjunct faculty. Revised upper
divisional evaluation tool has resulted in better student preparation and performances, as
well as, better faculty evaluation of students.
Visual Communication and Design : Expectations for print materials for B.F.A. interim
assessment are now clearly stated and are being met more readily. Similar expectations will
be put in to place for digital portfolio materials. Each student enrolled in senior project
receive and complete a senior contract resulting in a better rapport with students and faculty.
Y
Fine Arts: Better clarification of B.A. and B.F.A. programs has resulted in more student
interest in B.A. program. Portfolio reviews have demonstrated success in the improvement
of student drawings resulting form continued focus on high standards of 2d instruction.
Continued use of interim assessment to develop program goals for Bachelors of Art
Education.
Implementation of senior project goals for Bachelor of Fine Arts which appear to have
resulted in a higher level of student performance: body of original work and ambitious work,
evidence of depth of thought, evidence of research, sufficient technical virtuosity, ability to
explain ideas, participation in all department senior events, professional attitude, and
keeping abreast with new developments in the field, as they pertain to their work .
Theater : has now added end of the semester reviews of all theater majors scheduled to
begin fall 2005. This is a result of students asking for more faculty feedback. Faculty and
student questionnaires are being developed for more direct data on this process. Revised
course rotation and advising guidelines are being developed. Departmental goals for all
majors have been created and emphasis objectives are under development.
Departments/programs base recommendations on data
Y
All departments are keeping accurate data, drawing plausible conclusions and making
recommendations based on data. The numbers of students are small so statistical
percentages are inappropriate, but the numbers can and are being used to aid in the
analysis of program effectiveness.
Prior year recommendations were implemented
Y
Music: has clarified content of music core sequence.
Fine Arts: Art Education assessment plan is in progress.
Theater: created Goals for B.A. program and clarified language of “mid-program
assessment.”
All departments use of direct and indirect measures continue to improve.
School* support for assessment requested/needed: N
School*-level review effective: Y
University-level support for assessment requested/needed: N
Recommended changes to department/program plans: Y
Recommendations to Assessment Council: N
*Includes ACCS and Honors Program
Revised and approved by the Assessment Council April 2004
Departments are learning from each other’s assessment processes.
It would be helpful to this committee if all departments submitted goals annually with their
assessment reports.
APPENDIX B
To:
Erin Frew, Director of Assessment, and Assessment Council
From:
Kathleen Murphey, Associate Dean, School of Education (SOE)
Date:
June 2, 2006
Re:
SOE Data Manager
________________________________________________________________________
The Indiana Department of Education’s newly created Division of Professional Standards (DPS), formerly
the Indiana Professional Standards Board (IPSB), mandates that Schools of Education maintain a Unit
Assessment System that details how the education unit ensures that its programs prepare teacher
candidates with the knowledge, skills, and dispositions necessary to be effective educators. The education
unit must provide evidence that demonstrates its effectiveness and how it uses the evidence to continually
improve its programs. The evidence is standards- and performance-based. The SOE’s accreditor, the
National Council for the Accreditation of Teacher Education (NCATE), has a partnership agreement with
the State of Indiana. NCATE also requires standards- and performance-based assessment; the State and
NCATE conduct accreditation visits together.
The data has to be aggregated and, thus, stored electronically. The Data Manager is key to providing the
mechanism to collect, store, retrieve, and analyze all of the pieces of evidence needed to ensure that the
process for reviewing and revising programs, as well as the Unit Assessment System itself, can be
maintained.
For the last NCATE/State Visit in Fall of 2005, we presented our Institutional Report and all the evidence
(e-evidence room) on a specially designed website. In the future, all Program Reviews (Spring 2008) and
accreditation visits (Spring 2010) will be submitted online. We are already in the process of preparing for
these events.
Because the portfolios that all students prepare, as part of standards- and performance-based assessment,
are becoming too cumbersome to manage and store in their hardcopy format, we are in the process of
instituting e-Portfolios with the vendor TaskStream. We have had a pilot program this past semester, Spring
2006, and will require e-Portfolios of all students in the introductory undergraduate class, EDUC F300,
Invitation to Teaching, in Fall 2006. The four graduate programs are in the process of developing templates
to begin introducing TaskStream. Developing the templates has required changes to the Unit Assessment
Systems of these programs, so they probably won’t be operational until Spring 2007. Nonetheless, the Data
Manager would play a key role working with faculty as we integrate e-Portfolios into the SOE. This would
involve working with faculty and students as we introduce it, running reports on the data collected, and
working to get ever more pieces of our evidence in the Data Management System into the e-portfolio
format.
From 2001 to 2003 we had a Data Manager, hired through a Title II grant for approximately 30 hours a
week. In 2003 he began working as a consultant for a few hours a month, after hours, because he had taken
a full-time job in industry. In August 2004 we hired a Graduate Professional Assistant, completing an M.S. in
Computer Science, for 20 hours per week. He has done an excellent job. The challenge is that we need
someone for more than 20 hours per week, and on a continuing basis, as we 1) maintain the Data
Management System we have, 2) update it as needed, 3) transfer our portfolios to TaskStream, 4) continue
to make some of our assessment systems as well as student field experience placements electronic, and 5)
work with the SOE’s Unit Assessment System Taskforce as it develops, maintains, and improves the Unit
Assessment Systems, as well as the programs that they assess. The purpose is, in the end, to prepare
teachers better, so they will be more effective in helping all children learn.
A proportion of the position should be allocated to consult with and support academic program assessment
across campus. Responsibilities will include consulting with and providing technical support for academic
programs adopting electronic portfolio systems (i.e., TaskStream). The ratio of time spent on these duties
should be approximately 25-30 percent.
School Of Education (SOE) Data Manager/E-Portfolio Campus Consultant Position Description
(Proposed June 2006)
The SOE Data Manager works with the Unit Assessment System (UAS) Taskforce and others to do the
following:
1) Maintain the SOE Data Management System, a database with all licensing and
assessment data from the SOE:
• Maintain and evolve the Praxis score recording/reporting system;
• Maintain and evolve the Licensing score recording/reporting system;
• Maintain and evolve the Criminal History information system;
• Write IACTE license report (September);
• Write Preliminary Title II Program Completer Report (November);
• Oversee assessment on Portfolio Days; assist with data entry; evolve and extend the
existing application (each semester);
• Generate Portfolio reports for Elementary and Secondary Education (each semester):
  - Send letters to students
  - Evaluator comparison
  - Portfolio stratification report
  - INTASC standards analysis
  - Secondary/Elementary comparison
  - Similar reports for Educational Leadership portfolios;
• Submit Title II Program Completer Report (January); and
• Generate reports of assessments from all programs for faculty discussion at the
Faculty Retreat (August).
2) Develop the SOE Data Management System to meet evolving requirements of the SOE,
NCATE, and the State.
3) Integrate new technologies into the SOE Data Management System as they become
available.
4) Work with faculty and students to integrate e-Portfolios (TaskStream) into the SOE.
5) Continue to develop a comprehensive system to electronically manage field experience
placements for all programs, assessments of the field experiences, and surveys of graduates and
employers.
6) Work with staff, as necessary, to ensure the smooth and accurate entry of data into the
Data Management System.
7) Develop, as time allows, administrative functions of the SOE in electronic format.
8) Write tools to query, manipulate, and transform data using Java.
9) Possess conceptual knowledge of Relational Database Management Systems, including
Structured Query Language (SQL), PL/SQL (an Oracle extension to SQL), and others.
10) Consult with and provide technical support to campus academic programs adopting
electronic portfolio systems.
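As an illustration of the kind of Java data tool called for in items 8 and 9, the sketch below aggregates score records by program. The record shape, program names, and cut score are assumptions made for illustration only; they are not the SOE's actual schema or reporting logic.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ScoreReport {
    // Hypothetical score record: program name and scaled test score.
    // (Illustrative only; not the SOE's actual data model.)
    public record Score(String program, int scaled) {}

    // Fraction of records at or above the cut score, grouped by program.
    public static Map<String, Double> passRateByProgram(List<Score> scores, int cut) {
        return scores.stream().collect(Collectors.groupingBy(
                Score::program,
                Collectors.averagingDouble(s -> s.scaled() >= cut ? 1.0 : 0.0)));
    }

    public static void main(String[] args) {
        List<Score> sample = List.of(
                new Score("Elementary", 172),
                new Score("Elementary", 158),
                new Score("Secondary", 165));
        // Elementary: 1 of 2 records at or above 160; Secondary: 1 of 1.
        System.out.println(passRateByProgram(sample, 160));
    }
}
```

A tool along these lines could feed the semester portfolio and program-completer reports described in item 1, with the actual queries run against the Data Management System rather than an in-memory list.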
APPENDIX C
TO:
Jonathan Tankel
Educational Policy Committee
Indiana University Purdue University-Fort Wayne
FROM:
IPFW Assessment Council
Thomas Bernard, Chair, School of Visual & Performing Arts
Hal Broberg, Engineering Technology & Computer Science
Erin J. Frew, Director of Assessment, Ex-officio
Peter Goodmann, Educational Policy Committee
Julie Hook, General Studies
Jay Jackson, Arts & Sciences
Connie Kracher, Health Sciences
Mark Masters, General Education Subcommittee
Rhonda Meriwether, Academic Counseling & Career Services
James Moore, R. T. Doermer School of Business & Management Sciences
Kathleen Murphey, School of Education
Koichiro Otani, School of Public & Environmental Affairs
Steve Sarratore, Office of the Vice Chancellor of Academic Affairs
DATE:
June 5, 2006
SUBJECT:
Assessment Council Annual Report, 2005-2006
The Assessment Council (AC) met during the 2005-2006 academic year and reviewed and accepted
assessment reports from the Schools of Arts and Sciences, Business and Management Sciences,
Education, Engineering Technology and Computer Science, Health Sciences, Public and Environmental
Affairs and Visual and Performing Arts. It also reviewed and accepted reports from Academic Counseling
and Career Services, General Studies, and the Honors Program. Some programs did not submit reports to
their school and therefore were not reviewed.
The following new and revised unit assessment plans were reviewed and accepted: Accounting, Dental
Hygiene, Psychology and the Honors Program. They will be forwarded to the EPC when the 90-day posting
period has concluded.
The AC recommended a new position for the School of Education (SOE) and the IPFW campus. This
position would assist programs adopting electronic portfolios and manage the process of and data from the
SOE electronic student portfolios.
In addition to its regular business, the AC will hold meetings during fall 2006 to update the
assessment action plan for IPFW and to consider methods for involving students in
assessment.
APPENDIX D
2006 IPFW Assessment Workshop:
Part & Parcel: Assessment, Program Review & Accreditation
March 15, 2006
Evaluation Summary, N=28 of approximately 58 attendees or 48%
Please help us improve the quality of future assessment workshops by providing feedback below. Please circle only
one response.
The first number in the summary below is the frequency and the second is the percentage of respondents.
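As a minimal sketch of how the percentages are derived, the snippet below divides each frequency by the number of respondents and rounds to the nearest whole percent. This rounding convention is an assumption; a few of the published figures appear to use a slightly different base or rounding.

```java
public class SurveyPercent {
    // Percentage of n respondents giving a response, rounded to the
    // nearest whole percent (assumed convention; illustrative only).
    public static long percent(int frequency, int n) {
        return Math.round(100.0 * frequency / n);
    }

    public static void main(String[] args) {
        // e.g., 11 of 28 respondents chose "Strongly Agree" on item 1.
        System.out.println(percent(11, 28) + "%"); // prints "39%"
    }
}
```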
1. The information I received at this workshop was useful.
Strongly Agree (11, 39%) Agree (16, 57%) Disagree (1, 4%) Strongly Disagree (0)
2. I learned more about the relationship of assessment to program review and accreditation at this workshop.
Strongly Agree (12, 43%) Agree (14, 50%) Disagree (2, 7%) Strongly Disagree (0)
3. I plan to use at least one of the concepts I learned about today.
Strongly Agree (7, 25%) Agree (19, 67%) Disagree (0) Strongly Disagree (0)
“Not sure” (1, 4%) Neutral (1, 4%)
Comments: Hopefully insight into “portfolio” approach (to A). Ideas from panel presenters (to SA),
Vertical vs. horizontal assessment (to SA)
4. I feel more positive about assessment than I did prior to attending the workshop.
Strongly Agree (5, 17%) Agree (15, 52%) Disagree (5, 17%) Strongly Disagree (2, 7%)
Non-responses = 2 (7%)
One response indicated “agree” and “disagree”
Comments: I was already positive (to SD), Okay with it already (to D), I did not feel negative about it before this
workshop, so this question is not relevant for me. (to A), I was already positive (to D), I felt very positive before the
workshop! (to SD), I was already ok w/ assessment (no response), Came in with positive attitude—good to see how
much is being done (to A), But also more overwhelmed (to A)
Please list topics of interest to you for future workshops:
• Integrated assessment measures (not “added” just for assessment)
• E-portfolios
• Creating rubrics; action plans after assessment
• Process for establishing metrics with validity and reliability
• How to engage faculty; effective execution of plan
• Theory behind outcomes assessment, along with the nuts and bolts of how to assess outcomes using rubrics
• I would be interested in a discussion on the relation between pedagogical effectiveness and perceptions of
varying relevance of department goals/discipline perspective in the working world (which might also include
the university as a viable option for future employment).
• Grants; a Midwest or national assessment conference or workshop; engineering and technology assessment
• “Real” faculty (not chairs, associate deans, directors, etc.) who become “converts” based on positive
experiences at IPFW (a “success” story)
• Making assessment efficient and useful!
• History of the assessment “movement”; place of assessment activities within faculty evaluation areas (teaching,
research, service)
• Help with statistics and analysis
• Drawing on institutional resources for assessment: what’s available, what should be used, benefits of
incorporating support services in review (library, CELT, others? Testing office? ITS?)
• Tips for keeping assessment progress going, like project-management-type tips to keep depts. making
progress throughout the year rather than a last-minute rush (tools like milestones, rewards, project-mgt
software)
Other comments:
• Too many talking heads for this much time—schedule a break—don’t read to us, especially lists.
• Panelists were good—except for Nancy, who just read the handout. Interesting to see the different
performance indicators different depts. use. Also good to see different stages of development.
• Assessment is a required part of our job. The info presented was fairly enlightening, but hardly useful.
More specific info on university requirements would have been.
• It helped to hear about what was or wasn’t happening in the other areas.
• Get industry people to be on the panel. Get people from other institutions to be on the panel.
• Kudos to Erin for keeping Steven T. Sarratore on schedule. VPA processes may be difficult for non-artistic
academic units to draw ideas from.
• More interaction with each other would have been helpful. Too much detail for me on what others are doing.
• Not sure a whole morning was necessary—perhaps ID specific weak areas/depts., etc., and focus
effort/learning there. Yes—working with the depts., as McCrosky said, was helpful to her/them.
• Perhaps present only 1 or 2 department/school examples and allow for more questions.
• Thank you for breakfast. How does a faculty member become a member of an assessment team for
accreditation, e.g., ABET? Is there a training institute that ABET holds?
• Very impressive program development and presentation.
• Response to Question #4: More positive toward improvements assessment can bring to a program, less
positive toward generating “meaningless” data to justify assessment.
• Erin’s voice needs to be amplified. Reading PowerPoints is a drag; the cycle of program assessment lends
itself to better visual representation. Relationships between curriculum, plans, and review could all be shown
graphically.