MSEM Annual Assessment Report 09-30-10

Annual Assessment Report to the College 2009-2010
College: College of Engineering and Computer Science (CECS)
Department: MSEM
Program: MSE
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2010.
You may submit a separate report for each program which conducted assessment activities.
Liaison: Ileana Costea
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
The overall MSEM assessment process is presented in figure 1 at the end of the report.
The assessment plan for the MSEM Department follows the plan the Department maintains for the ABET engineering accreditation board. The MSE program was accredited for the full six years in 2007 and will be visited by ABET again in 2013.
The department's full-time faculty, with the full support of the Department Chair, participate in the continuous assessment process as needed and required by the ABET engineering accreditation board. The Department assessment liaison is expected to serve on the College Assessment Committee and the University Assessment Committee and to report back to the rest of the MSEM faculty at each biweekly department meeting. The Assessment/ABET liaison coordinates the assessment work of the rest of the faculty. All faculty must create ABET course folders for the course(s) they are responsible for (either teaching, or coordinating when a part-timer teaches the course), following the schedule of the Department's ABET plan. As part of the ABET course folders, an Assessment Matrix and Report is created, from which assessment of the SLOs results. All instructors, full-time and part-time, must assess the MSE course they teach at the end of the semester, using a simplified assessment form distributed by the department chair. See figure 2 at the end of this report.
MSEM Annual Assessment Report 2009-2010 – prepared by Ileana Costea, Ph.D.
Page 1
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
The actual assessment process followed the intended methodology. However, the work fell behind schedule, for two reasons: the furlough situation of the 2009-2010 academic year, and the limited amount of assessment work completed the previous year.
The following were achieved during the academic year 2009-2010:
• Assessment folders including ABET matrices and reports were created for the following courses listed in the ABET plan: MSE101, MSE227, MSE227L, MSE304OL, MSE407, MSE488A, and MSE488B.
• The Department carried out a partial assessment of SLO 5, SLO 13, SLO 7, and SLO 14, based on the courses for which folders were created this academic year. For course MSE410L, SLO 5 and SLO 7 were assessed. All SLOs will be assessed by 2012, as per the Department's ABET plan.
• The MSE Program Objectives were re-confirmed by the MSEM faculty and will soon also be reviewed by the MSEM Industry Liaison Committee (Department Chair in charge). Faculty revised the program outcomes and concluded that they remain valid for the current program and should be kept as they are.
• The action items from the MSE Annual Assessment Report to the College 2008-2009 were addressed: improve MSE 227 reports; develop new experiments for MSE 412/L; improve the uniformity of rubrics for SLO 7 and SLO 13; reinforce design process instruction in MSE 415; expand ethics instruction components in MSE 101/L and MSE 488 A&B; and improve the MSE 412/L Lab Manual. The assessment of the on-line course MSE304OL has been completed; the comparison of assessment results between the on-line and on-campus MSE304 offerings remains to be done.
• All faculty were involved in the assessment process: all instructors, full-time and part-time, are to assess each semester the MSE program courses they teach using the new simplified assessment form (figure 2), and all full-time faculty in the MSEM Department work on course assessment folders, including ABET/Assessment matrices and ABET/Assessment reports with analysis of the numerical results in the matrices and conclusions about what needs to be changed and improved.
More work is still needed to get in line with the department's ABET accreditation plan. More intensive assessment work is planned for the academic year 2010-2011, now that the furlough period is over. A Department Assessment Committee will be established, and special assessment working sessions outside department meetings will be arranged regularly.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below.
2a. Which Student Learning Outcome was measured this year?
The MSEM Department completed assessment of SLO 5 and SLO 13. (See also section 1b above).
2b. What assessment instrument(s) were used to measure this SLO?
Direct measures were used for assessment.
Department-approved target levels of contribution of each course learning objective to the program SLOs were compared with the actual contributions based on student performance. Evidence of student performance consisted of samples of homework assignments, class projects, reports, and examinations evaluated and graded by faculty.
Each year, EBI surveys are produced and processed by Educational Benchmarking Inc. At the end of the year, the department chair conducted exit interviews with students in the culminating experience design course, MSE488.
The data for assessing an SLO are obtained from the courses that contribute to that SLO; the performance of all students in these courses is measured. The Department maintains a master table showing which courses contribute to which program SLOs. Each course then has a target Assessment Matrix that shows the target contributions of the course objectives to the program SLOs. The possible levels of contribution are: 1 = incidental, 3 = low level, 5 = moderate level, 7 = high level, 9 = exceptionally high level. Only levels 3, 5, 7, and 9 are used in the target matrix. A sample target Assessment Matrix for an MSE course assessed this academic year (the culminating experience design course MSE488) is given in figure 3 at the end of this report. The actual performance on each of these contribution entries is assessed based on the performance of all students in the course, yielding numerical values that may differ from the target values. A total value for each SLO for that course is obtained by summing all the entries in that column, and is compared with the target total. If the actual sum for an SLO is equal to or greater than the target value, the conclusion is that performance on that SLO is satisfactory for the course and no change or improvement is needed. A portion of the master assessment target table for all courses and SLOs is given in figure 4 at the end of this report, indicating the courses considered as contributing to SLOs 5, 7, 13, and 14.
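The column-sum comparison described above can be sketched in a few lines of code. This is an illustrative sketch only: the course objectives, SLO labels, and contribution values below are hypothetical examples, not the department's actual matrix data.

```python
# Contribution levels: 1 = incidental, 3 = low, 5 = moderate,
# 7 = high, 9 = exceptionally high (targets use only 3, 5, 7, 9).
# Each list holds the contributions of a course's objectives to one SLO.
target = {
    "SLO5": [7, 7, 3],   # hypothetical target contributions
    "SLO7": [9, 7, 5],
}
actual = {
    "SLO5": [7, 9, 7],   # hypothetical observed contributions
    "SLO7": [7, 5, 5],
}

def slo_satisfactory(target, actual):
    """For each SLO, sum the column of contribution entries and compare
    the actual total with the target total; the SLO is satisfactory for
    the course when the actual sum meets or exceeds the target sum."""
    result = {}
    for slo in target:
        result[slo] = sum(actual[slo]) >= sum(target[slo])
    return result

print(slo_satisfactory(target, actual))
# SLO5: actual 23 >= target 17, satisfactory
# SLO7: actual 17 < target 21, improvement needed
```

With these example values the method flags SLO 7 for improvement while SLO 5 needs no change, mirroring the decision rule stated above.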
Note: To achieve uniformity among instructors and courses, the department plans in the future to use rubrics established by the College Assessment Committee. So far, teamwork, written communication, and oral communication rubrics have been created at the College level.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
Assessment was done based on performance of all students in the courses that contribute to the program assessed SLOs. See also Section 2b above and
figures 3 and 4 at the end of this report.
The student work samples are faculty evaluations of student performance on the SLOs during Fall 2009 and Spring 2010; the courses are MSE101, MSE227, MSE227L, MSE304OL, MSE407L, and MSE488B.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
A cross-sectional comparison was used, comparing the performance of all students from freshmen to seniors.
The assessment results were expected to drive the following improvements: improve MSE 227 reports; develop new experiments for MSE 412/L; improve uniformity of rubrics for SLOs 7 and 13; reinforce design process instruction in MSE 415; expand ethics instruction components in MSE 101/L and MSE 488 A&B; and improve the MSE 412/L Lab Manual.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
The following needs were identified based on the assessment results: more technician support; more uniformity among the various sections of the same course; and better use of the College Assessment Committee rubrics for teamwork, written communication, and oral communication.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
• A strategy to catch up with the ABET plan is needed. All SLOs will be assessed by 2012.
• A tutor should be hired to help students improve their writing.
• The results of the MSE227 assessment indicate considerable emphasis on SLO 5 (identify, formulate, and solve engineering problems) but less on SLO 7 (effective written and oral communication). In the future, efforts will be made to balance these requirements by placing additional emphasis on writing and public speaking.
Some programs assess multiple SLOs each year. If your program assessed an additional SLO, report the process for that individual SLO below. If
you need additional SLO charts, please cut & paste the empty chart as many times as needed. If you did NOT assess another SLO, skip this
section.
3. How do your assessment activities connect with your program’s strategic plan?
Everything we do for assessment supports our ABET accreditation effort and our strategic plan.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
The culminating experience design course, MSE488, has been modified over the years to enrich the student learning experience. All the projects now require students to work cohesively in teams, to operate numerous types of shop equipment (such as TIG and MIG welders, plasma cutters, and manual and CNC mills and lathes), and to use advanced 3-D modeling software. Because of these project modifications, students spend more time in the Automation/MSE lab, which contains heavy-duty industrial fabrication equipment. For safety reasons, the presence of a technician is required for more hours.
5. Other information, assessment or reflective activities not captured above.
Lists of MSE undergraduate alumni were compiled. The lists were solicited by Ileana Costea, the MSEM Assessment representative for 2009-2010, and were obtained from the Department Chair, the Associate Dean, Alumni, and the Institutional Research Office.
These lists should be used to send a questionnaire about the program to alumni, and through them to contact their respective employers via email. This process should be continued in the following academic years.
• A questionnaire was created for the MSE304 course, to be distributed to students at the end of the semester.
• Continuous improvement of the program is the basis of the assessment endeavor.
• Faculty understand the value of assessment for program improvement and better student learning.
• Faculty education in assessment is a department-wide issue. This year the Assessment Liaison attended the annual ABET/Assessment workshop, providing an opportunity to learn how assessment is done at other institutions nationwide.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
No
Figure 1. MSEM Assessment Activities
MSEM - Course Evaluation Form

Course Number: MSE 488
Instructor: R.D. Conner w/ T. Shraibati in F 2009
Semester/year: Fall & Spring, 2009-10

The purpose of this form is to document the achievement of course objectives and program outcomes in the courses that you instruct. Answers to the questions below should cite supporting evidence from your own observations, student performance on assignments and examinations, and other feedback.

First time course taught by this instructor / Course taught previously: Course taught previously [X]
Course prerequisite(s): MSE 365, MSE 408/L

• Were the students adequately prepared by prerequisite courses? Yes [X] No [ ]
• Were changes implemented since the last time this course was taught? Yes [X] No [ ]

If Yes, what changes were made since the last time this course was taught? Did these changes improve the course?
Changes made since last time: Included section on Engineering Ethics.
Effects of change: Ethics work was well received by students, as evidenced through discussion.

• Are changes called for the next time this course is taught? Yes [X] No [ ]

If Yes, what changes should be made the next time this course is taught?
Changes recommended for next time (with purpose of changes):
• Clearly define course milestones and expectations for written and oral work. (Provide additional guidance and keep students focused.)
• Provide revised rubrics on writing expectations and evaluation. (Provide additional clarification on expectations.)
• Establish and equitably distribute specific tasks to students on projects. (Load all students equally.)
• Provide closer scrutiny of engineering analysis and calculations. (Much of the engineering analysis, e.g., statics and strength of materials, was poorly understood and performed incorrectly.)

Most useful comments from students:
Students indicated that workloads were not evenly distributed. The 80/20 rule was enforced: 20 percent of the students did 80 percent of the work.

Achievement of Course Objectives/Demonstration of Program Outcomes
Did the students demonstrate achievement of the course objectives and program outcomes specific to this course? In the table below, rate achievement of objectives/outcomes using evidence from direct assessment of student work, student surveys, etc.
If sampling, please indicate the approximate percent of the class sampled: 100
Use assessment rubrics for determining program outcome assessment.
Course Objectives/Program Outcomes, with the instructor's means of direct assessment and direct outcome assessment rating (9 = Excellent to 1 = Poor; values 9, 7, 5, 3, 1):
• SLO 5 (Outcome E): an ability to identify, formulate, and solve manufacturing systems engineering problems. Evidence: Sr. Design Project reports/presentations and evaluations. Rating: 7.
• SLO 7 (Outcome G): effective oral and written communication. Evidence: senior design reports/presentations/evaluations. Rating: 9.
• SLO 13 (Outcome M): an understanding of the design of products and the equipment, tooling, and environment necessary for their manufacture. Evidence: senior design reports. Rating: 9.
• SLO 14 (Outcome N): an understanding of the creation of competitive advantage through effective management of contemporary manufacturing enterprises. Evidence: Gantt charts, work breakdown structure, Sr. Design presentations. Rating: 7.
Fig. 2. Sample of the simplified assessment course form to be used by all instructors teaching MSE courses, all semesters.
ABET COURSE ASSESSMENT Report (page 1 of 2)
Course: MSE 227 - Materials Engineering
Prepared by: D. Conner, May 11, 2010

Outcome-related course learning objectives:
1. Find relevant sources of information necessary to complete design projects.
2. Utilize appropriate equipment/software to perform project work.
3. Identify various aspects of the dilemma in a case study related to an ethical situation that a professional engineer has encountered. Define, discuss, and justify possible solutions.
4. Function effectively on a design team, with effectiveness being determined by the instructor and peer ratings.
5. Write an effective project report which communicates the work that was performed.
6. Deliver an oral presentation of work performed to faculty and/or other experts.
7. Demonstrate ability to utilize program-related knowledge in an open-ended professional design experience.
8. Develop a prototype or process design within economic constraints, a compressed project schedule, and the social and political context inherent when multiple organizations are involved.
9. Write weekly activity reports and submit via e-mail.

[The matrix of contribution values per outcome (Outcomes A through P), the demonstrated contributions to program outcomes, the program outcome targets, and the comparisons with targets are not reproduced here.]

Note: ABET uses the term Outcomes (with letters) instead of SLO (with numbers); e.g., Outcome A = SLO 1.
Objective contributes to Outcome: 1 = incidental, 3 = low level, 5 = moderate level, 7 = high level, 9 = exceptionally high level.
Figure 3. Sample target Assessment Matrix for the culminating experience MSE course, MSE488.
MSE courses contributing to SLO 5 (Outcome E), SLO 7 (Outcome G), SLO 13 (Outcome M), and SLO 14 (Outcome N): MSE 101/L, 227, 227L, 304, 319/L, 362, 402, 403CS, 407, 409/L, 410/L, 412/L, 415, and 488 A & B. [The per-course x-marks indicating which course contributes to which SLO are not reproduced here.]

Legend:
SLO 5: an ability to identify, formulate, and solve manufacturing systems engineering problems.
SLO 7: an ability to communicate effectively in both the written and spoken modes.
SLO 13: an understanding of the design of products, and the equipment, tooling, and environment necessary for their manufacture.
SLO 14: an understanding of the creation of competitive advantage through effective management of contemporary manufacturing enterprises.

Figure 4. Sample table showing which MSE courses contribute to each SLO (Outcome).
Note that ABET engineering accreditation uses the term letter-Outcome instead of number-SLO.