A Program-level Assessment Exam: In-house Development and Implementation
March 2011, IACBE National Conference, Las Vegas, Nevada
Dr. Kevin Schieuer, Finance
Dr. Jena Shafai, DS and SCM
College of Business, Bellevue University, Nebraska

Purpose of Presentation
Review the foundations of assessment and share our experiences developing and initially implementing an in-house program-level assessment exam.

Agenda
1. Foundations of assessment
2. Program exam options: vendor versus in-house
3. Development process
4. Other considerations: demographics
5. Implementation process
6. Results, conclusions, and recommendations
7. Questions and discussion

Bellevue University
• Four-year private, not-for-profit university
• Founded 1966 by the Chamber of Commerce
• Located in the Omaha, NE metropolitan area
• Programs: ~40 bachelor's, ~18 master's, and a Ph.D. in Human Capital Management
• Enrollment: ~6,000 undergraduate, ~2,500 graduate
• Online and residential
• Accreditation: HLC; IACBE for the College of Business
• Focus: Real Learning for Real Life, business orientation

Foundations of Assessment
Assessment
• …the systematic collection & analysis of information to IMPROVE STUDENT LEARNING (OAPA, UMass, 2001).

Foundations of Assessment
Assessment asks:
• What should students be learning? (goals and objectives)
• What are students actually learning? (meeting goals and objectives)
• What should faculty do to improve learning? (continuous quality improvement)
(OAPA, UMass, 2001)

Foundations of Assessment
• Direct measures versus indirect measures
• Program level versus course level
• Assessment tools commonly used
• Program mapping of learning outcomes

Foundations: Direct versus Indirect Measures
Direct methods of assessment
• "require students to display knowledge & skills as they respond to the instrument itself."
• Objective tests, essays, presentations, and classroom assignments all meet this criterion.
(Palomba & Banta, 1999, 11-12; OAPA, UMass, 2001, 22)

Foundations: Direct versus Indirect Measures
Direct methods of assessment
• are "tangible, visible, self-explanatory, and compelling evidence of exactly what students have and have not learned." (Suskie, 2009, 20)

Foundations: Common Direct Measures
• Exams, quizzes, pre-tests and post-tests
• Projects, reports, presentations
• Portfolios of student work
• Simulation results
• Field experience / internship ratings
• Observations of student dynamics and behaviors
• Summaries/assessments of content discussion threads
• Capstone projects, reports, presentations
• Professional certifications/exams
• Business plans
• Program exams

Foundations: Direct versus Indirect Measures
Indirect methods of assessment
• "ask students to reflect on their learning, rather than to demonstrate it."
• Surveys (students/employers), reflection journals, discussion papers, and interviews may all meet this criterion.
(Palomba & Banta, 1999, 11-12; OAPA, UMass, 2001, 22)

Foundations: Direct versus Indirect Measures
Indirect methods of assessment
• "consist of proxy signs that students are probably learning."
• Less clear and less convincing than direct measures.
• Examples: retention, graduation rates.
(Suskie, 2009, 20)

Foundations: Common Indirect Measures
• Program surveys: students / alumni / employers
• University surveys: student satisfaction
• Focus group discussions and feedback
• Net Promoter Scores
• Course evaluations
• Exit interviews
• Summaries/assessments of reflection discussion threads
• Placement rates (employment/graduate school)
• Course / program reflection papers
• Learning journals (reflections)
• Discussion papers
• Faculty-student discussions (documents, email)
• Student honors, awards, and scholarships

Foundations: Program versus Course
• Course-level assessment:
  - more specific to individual course objectives,
  - generally administered during the course.
• Program-level assessment:
  - more general to overall program outcomes,
  - integrates multiple course areas within a program,
  - time delay: courses may be completed several terms before assessment,
  - elements may be embedded within courses.
• Use a level of detail appropriate to the level of assessment.
• Consider objectives / outcomes at each level.
• Mapping / rubrics relate course objectives to program outcomes.

Foundations: Program Mapping
Enhances program design, learning, and assessment.
Multiple mapping alternatives (a sketch of one follows):
• Course assessments / activities to course objectives
• Course / program content to body-of-knowledge lists
• Course objectives to program outcomes
• Program outcomes to university goals
• Program assessment tools to program outcomes
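To make the mapping idea concrete, here is a minimal Python sketch of the third alternative: course objectives mapped to program outcomes. All course numbers, objectives, and outcome names are hypothetical illustrations, not Bellevue University's actual curriculum map.

```python
# Minimal sketch of a program map: course objectives linked to program
# outcomes. All names here are hypothetical, not an actual curriculum map.
program_map = {
    "PO1: Apply quantitative reasoning to business decisions": [
        "FIN 301: Evaluate projects using discounted cash flow",
        "MGMT 340: Build and interpret a project network",
    ],
    "PO2: Communicate business analyses effectively": [
        "MKTG 300: Present a marketing plan",
        "BUS 450: Defend a capstone strategy report",
    ],
}

# Invert the map to see which program outcomes each course supports --
# a quick completeness check that no course objective is orphaned.
course_to_outcomes = {}
for outcome, objectives in program_map.items():
    for objective in objectives:
        course = objective.split(":")[0]
        course_to_outcomes.setdefault(course, []).append(outcome)

for course, outcomes in sorted(course_to_outcomes.items()):
    print(course, "supports", len(outcomes), "program outcome(s)")
```

The same two-level structure extends naturally to the other mapping alternatives, such as rolling program outcomes up to university goals.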
Program Exam Options: Vendor vs. In-house Development
• Vendor exams:
  - external standard, comparative data,
  - reduced development and processing time,
  - support & updates from the vendor may be a consideration,
  - risk of less alignment with the curriculum, though adaptable,
  - risk of less faculty engagement and ownership,
  - cost may be a factor.

Program Exam Options: Vendor vs. In-house Development
• Major vendor exams:
  - Educational Testing Service (ETS): Major Field Test in Business (ets.org)
    * 120 questions, multiple-choice, 2 hours,
    * 4 answer options per question,
    * Martell (2007): 15% of surveyed AACSB schools indicated use of the ETS exam (46% of the 154 respondents, out of 469 surveyed).
• Content (disciplines included):
  - Accounting (15%)
  - Management (15%)
  - Finance (13%)
  - Economics (13%)
  - Marketing (13%)
  - Quantitative Business Analysis (11%)
  - Information Systems (10%)
  - Legal/Social Environment (10%)
  - International (embedded)

Program Exam Options: Vendor vs. In-house Development
• Major vendor exams:
  - Comprehensive Business Exam (CBE): (fbla-pbl.org)
    * 88 questions, multiple-choice, 1.5 hours (most finish in 30-60 minutes),
    * 4 answer options per question.
• Content (disciplines included):
  - Accounting (22%)
  - Management (17%)
  - Economics (10%)
  - Marketing (14%)
  - Finance (14%)
  - Social Environment (10%)
  - Legal Environment (10%)
  - International (10%)
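As a side note on how such content weights behave, the sketch below translates the ETS percentages above into per-discipline question counts for a 120-question exam. The largest-remainder rounding is our own illustration of one reasonable allocation rule, not ETS's documented procedure.

```python
# Translate published content weights into question counts for a
# 120-question exam (the ETS Major Field Test length cited above).
# The weights come from the slide; the rounding rule is illustrative.
weights = {
    "Accounting": 0.15, "Management": 0.15, "Finance": 0.13,
    "Economics": 0.13, "Marketing": 0.13,
    "Quantitative Business Analysis": 0.11,
    "Information Systems": 0.10, "Legal/Social Environment": 0.10,
}
TOTAL_QUESTIONS = 120

# Take the integer floor of each share, then hand leftover questions to
# the areas with the largest fractional remainders (ties break
# arbitrarily) so the counts sum exactly to the exam length.
raw = {area: w * TOTAL_QUESTIONS for area, w in weights.items()}
counts = {area: int(share) for area, share in raw.items()}
leftover = TOTAL_QUESTIONS - sum(counts.values())
for area in sorted(raw, key=lambda a: raw[a] - counts[a], reverse=True)[:leftover]:
    counts[area] += 1

for area, n in counts.items():
    print(f"{area}: {n} questions")
assert sum(counts.values()) == TOTAL_QUESTIONS
```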
Program Exam Options: Vendor vs. In-house Development
• In-house development:
  - reduced external validation and comparative data,
  - collaborative development experience, which takes time,
  - enhances cross-discipline interaction and program knowledge,
  - enhances faculty engagement, buy-in, and ownership,
  - enhances alignment with the curriculum,
  - updates may be easier,
  - cost savings & control may be factors to consider.

In-house Development Process: Faculty Engagement and Buy-in
• Direct involvement in development and implementation
• Inclusion of all faculty (not just a few)
• Faculty in the same disciplines collaborating to write exam questions
• Enhanced faculty interactions on content

Development Process
• Multi-disciplinary business content areas,
• Development promoted broad understanding of the BA program experience among faculty,
• Faculty small-group interactions, review, and feedback,
• Small-group interaction dynamics.

Development Process: Multi-disciplinary Business Content
1. Critical Thinking
2. Computer Applications
3. Accounting - Financial
4. Accounting - Managerial
5. Management Principles
6. Organizational Behavior
7. Marketing Principles
8. Finance
9. Management Science (Quantitative Methods)
10. Operations Management
11. Business Ethics
12. Business Law
13. Strategy and Business Policy

Development Process: Small Group Review and Interactions
• 5-6 questions submitted by content experts
• Two sessions held during monthly College meetings
• ~8 small groups composed of 3-4 faculty
• Groups completed a feedback form for each question reviewed by their group

Development: Review/Feedback Form
Question Review, Evaluation, and Recommendations by Faculty Teams
Goal: questions should
• be conceptually oriented,
• assess achievement of program-level knowledge and skills,
• be relevant to professional careers,
• be demonstrable at the conclusion of the BA program.

Development Process: Review Form
Question Review, Evaluation, and Recommendations by Faculty Teams
1. Does the question frame the concept and/or topic being addressed?
   1 = Yes, 2 = Somewhat yes, 3 = Unclear/neutral, 4 = Somewhat no, 5 = No
2. Is the intent of the question clearly stated?
   1 = Yes, 2 = Somewhat yes, 3 = Unclear/neutral, 4 = Somewhat no, 5 = No
3. Does the question address a relevant and/or significant business skill or concept generally used in business?
   1 = Yes, 2 = Somewhat yes, 3 = Unclear/neutral, 4 = Somewhat no, 5 = No
4. Does the question address an application of conceptual business knowledge or skills appropriate for a program-level exam?
   1 = Yes, 2 = Somewhat yes, 3 = Unclear/neutral, 4 = Somewhat no, 5 = No
5. Does the question require memorization or recall of details that successful business administration graduates may not generally know without reference resources?
   1 = Yes, 2 = Somewhat yes, 3 = Unclear/neutral, 4 = Somewhat no, 5 = No
6. Is the question too easy or too hard?
   1 = Too easy, 2 = Somewhat easy, 3 = About right, 4 = Somewhat hard, 5 = Too hard

Development Process: Review Form (continued)
Question Review, Evaluation, and Recommendations by Faculty Teams
• Recommendations for improving the question:
  Retain question as is? Yes or No. If NO, indicate (circle one): 1. REVISE & RETAIN, or 2. OMIT
• Recommended revisions (please provide specific wording and revision recommendations and/or ideas):
• Recommendations of other ideas or key concepts to consider for a question:
• Other comments:
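To show how the completed forms could be tallied, here is a small sketch that averages the six 1-5 ratings for each question and counts the retain / revise / omit recommendations, the kind of summary reported on the next slide. The record layout and all data below are hypothetical.

```python
from collections import Counter
from statistics import mean

# Hypothetical feedback-form records: one per reviewing group per
# question. "ratings" holds the six 1-5 items from the form above;
# "action" is the group's recommendation. Data are invented for
# illustration, not actual review results.
forms = [
    {"question": "FIN-03", "ratings": [1, 2, 1, 2, 4, 3], "action": "revise and retain"},
    {"question": "FIN-03", "ratings": [2, 1, 1, 1, 5, 3], "action": "retain as is"},
    {"question": "MKT-01", "ratings": [4, 4, 2, 3, 2, 2], "action": "omit"},
]

# Group the forms by question, then report per-item mean ratings
# (on items 1-4, lower is better) and the recommendation tally.
by_question = {}
for form in forms:
    by_question.setdefault(form["question"], []).append(form)

for qid, records in by_question.items():
    item_means = [round(mean(vals), 2) for vals in zip(*(r["ratings"] for r in records))]
    actions = Counter(r["action"] for r in records)
    print(qid, "item means:", item_means, "actions:", dict(actions))

# Overall recommendation percentages, as in the feedback summary table.
overall = Counter(form["action"] for form in forms)
total = sum(overall.values())
for action, n in overall.items():
    print(f"{action}: {n} ({n / total:.0%})")
```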
Development: Group Dynamics
• Discipline/area faculty were separated and distributed among groups
• Groups did not review questions written by their own members
• Authors of questions were not revealed
• Groups of 4-5 faculty reviewed 8-10 questions
• Small-group discussion: ~60 minutes
• College discussion: ~30 minutes
• Feedback forms sent to discipline experts for consideration

Development: Group Dynamics
• Feedback forms given to faculty for consideration
• Feedback summary:

  Action              Number of Questions   Percent of Questions
  Retain as is        20                    31%
  Revise and retain   32                    49%
  Omit                13                    20%
  Total               65                    100%

Development: Group Dynamics
• Authors of questions (discipline experts) had discretion regarding question modifications
• Session 2 review:
  - repeat of the session 1 process and dynamics,
  - focused on revised or new questions from session 1,
  - result: selection of 5 questions per discipline area.

Sample Finance Question
Which of the following variables are essential to evaluating a project's financial viability and value?
a. Net income
b. Cash flow
c. Risk level
d. Book value of equipment
e. B and C only

Sample Management Science Question
When managing projects, one must often reduce project time to meet deadlines. This requires expediting (i.e., crashing) certain project activities. If the goal is to reduce project time at minimum cost, the project manager should:
a. Crash all critical path activities
b. Crash critical path activities with the least cost of crashing
c. Crash the largest-duration activities
d. Crash the lowest-cost activities
e. Crash the activities that are easiest to finish

Development: Group Dynamics
Bonus outcomes of the process:
• Faculty gained appreciation of other discipline areas
• Interaction enhanced question-writing skills
• Collegiality: active, engaged, passionate discussions
• Greater appreciation of what students should learn from our business degree (i.e., program outcomes)

Other Considerations
Demographic factors:
• Gender
• Age
• Work experience
• Transfer status: hours overall, BA, AC
• Major: Business, Accounting, other
• GPA

Other Considerations
• Timing and course-sequence considerations
• Transfer-student implications
Fall 2010: added 13 questions regarding completion of core courses.
Example: Have you completed AC 205, Financial Accounting?
a. Yes, at Bellevue University
b. Yes, at another university/college
c. No, but I am currently enrolled in this course
d. No, I have not had this course

Implementation Process
Administration of the exam:
• In the capstone course: Strategy and Business Policy
• Residential sections only: ~90 minutes
• Winter 2009, spring 2010, and fall 2010 (the 13 transfer/completion items were added in fall)
• Some course credit provided at the professor's discretion
• Implementation challenges: faculty health complications

Results

                                                    Winter '09   Spring '10   Fall '10
  n (class size)                                    20           25           16
  Average % of students answering questions
  correctly                                         44%          51%          55%
  Min, Max % of students answering a question
  correctly                                         5%, 95%      0%, 93%      0%, 100%

Results
Correlation of percent of questions answered correctly across terms (a sketch of these calculations follows):

  Correlation   Winter '09   Spring '10   Fall '10
  Winter '09    1.0
  Spring '10    0.78         1.0
  Fall '10      0.73         0.77         1.0
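The summary rows and correlation matrix above can be computed directly from per-question percent-correct data. A minimal sketch, using made-up numbers rather than the actual exam results (note: statistics.correlation requires Python 3.10+):

```python
from statistics import correlation, mean  # correlation: Python 3.10+

# Hypothetical fraction of the class answering each of the same five
# questions correctly in each term. Invented numbers, not the real data.
pct_correct = {
    "Winter '09": [0.05, 0.40, 0.55, 0.60, 0.95],
    "Spring '10": [0.00, 0.48, 0.60, 0.63, 0.93],
    "Fall '10":   [0.10, 0.50, 0.58, 0.70, 1.00],
}

# One summary row per term, mirroring the results table above.
for term, scores in pct_correct.items():
    print(f"{term}: avg {mean(scores):.0%}, min {min(scores):.0%}, max {max(scores):.0%}")

# Pairwise Pearson correlations across terms: persistently high values
# suggest the same questions stay hard (or easy) term after term.
terms = list(pct_correct)
for i, a in enumerate(terms):
    for b in terms[i + 1:]:
        r = correlation(pct_correct[a], pct_correct[b])
        print(f"corr({a}, {b}) = {r:.2f}")
```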
Observations
• Scores were lower than expected; possible explanations:
  - questions too hard,
  - students not learning or retaining knowledge,
  - students have not yet had the courses (pre-requisite enforcement),
  - transfer courses not meeting expectations?
• Fall 2010: ~70% of students transferred in more than ~50% of core courses
• Recency effect: the highest percent correct (~82%) was in Strategy and Business Policy, the course in which the exam was administered
• Some questions are consistently among the most missed across terms

Continuous Improvement
Assessment
…the systematic collection & analysis of information to IMPROVE STUDENT LEARNING (OAPA, UMass, 2001).

Continuous Improvement
Faculty reflections: how to improve student learning
• Are the exam questions appropriate for program assessment?
• Are the exam questions addressing appropriate course and program learning outcomes?

Future Considerations
• Enforcement of pre-requisites
• Administering the exam online (via Survey Monkey?)
• Currently considering vendor options for external validation and comparisons

Future Considerations
Closing the assessment loop: faculty involvement
• Efforts to improve student learning and performance
• Feedback to discipline-area faculty
• Review / update course delivery and focus in light of low scores
• Review / update exam content as changes are made to program / course content
• Track program exam performance in discipline areas against associated course grades

Bibliography
• Angelo, Thomas A., and K. Patricia Cross. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed. Jossey-Bass, 1993.
• Martell, K. (2007). Assessing student learning: Are business schools making the grade? The Journal of Education for Business, 82(4), 189-195.
• Office of Academic Planning & Assessment (OAPA). Program-Based Review and Assessment: Tools and Techniques for Program Improvement. University of Massachusetts Amherst, 2001.
• Palomba, Catherine A., and Trudy W. Banta. Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education. Jossey-Bass, 1999.
• Suskie, Linda. Assessing Student Learning: A Common Sense Guide, 2nd ed. Jossey-Bass/Wiley, 2009.
• White, Bonnie J. (2007). Perspectives on program assessment: An introduction. The Journal of Education for Business, 82(4), 187-188.

Questions and Discussion
Kevin Schieuer  [email protected]
Jena Shafai  [email protected]