Learning Spaces
Diana G. Oblinger, Editor

ISBN 0-9672853-7-2
©2006 EDUCAUSE. Available electronically at www.educause.edu/learningspaces

Part 1: Principles and Practices

Chapter 1. Space as a Change Agent (Diana G. Oblinger)
Chapter 2. Challenging Traditional Assumptions and Rethinking Learning Spaces (Nancy Van Note Chism)
Chapter 3. Seriously Cool Places: The Future of Learning-Centered Built Environments (William Dittoe)
Chapter 4. Community: The Hidden Context for Learning (Deborah J. Bickford and David J. Wright)
Chapter 5. Student Practices and Their Impact on Learning Spaces (Cyprien Lomas and Diana G. Oblinger)
Chapter 6. The Psychology of Learning Environments (Ken A. Graetz)
Chapter 7. Linking the Information Commons to Learning (Joan K. Lippincott)
Chapter 8. Navigating Toward the Next-Generation Computer Lab (Alan R. Cattier)
Chapter 9. Trends in Learning Space Design (Malcolm Brown and Philip Long)
Chapter 10. Human-Centered Design Guidelines (Lori Gee)
Chapter 11. Designing Blended Learning Space to the Student Experience (Andrew J. Milne)
Chapter 12. Sustaining and Supporting Learning Spaces (Christopher Johnson)
Chapter 13. Assessing Learning Spaces (Sawyer Hunley and Molly Schaller)
• Assessment Framework
• Assessment Targets
• Assessment Methods
• An Example of Assessment
• Practical Implications
• Endnotes
• About the Authors

Part 2: Case Studies

Chapter 14. Learning How to See (Diana G. Oblinger)
Chapter 15. City of London: Sir John Cass Business School (Clive Holtham)
Chapter 16. Denison University: MIX Lab (Scott Siddall)
Chapter 17. Duke University: Perkins Library (Marilyn M. Lombardi and Thomas B. Wall)
Chapter 18. Eckerd College: Peter H. Armacost Library (J. Michael Barber)
Chapter 19. Estrella Mountain Community College: The Learning Studios Project (Homero Lopez and Lori Gee)
Chapter 20. Hamilton College: Science Center (Nikki Reynolds and Douglas A. Weldon)
Chapter 21. Indiana University-Purdue University Indianapolis: The ES Corridor Project (Nancy Van Note Chism)
Chapter 22. Iowa State University: LeBaron Hall Auditorium (Jim Twetten)
Chapter 23. London School of Economics: BOX (Andrew Harrison)
Chapter 24. Messiah College: Boyer Hall (Dennis Lynch)
Chapter 25. Michigan Technological University: Center for Integrated Learning and Information Technology (Paul Urbanek)
Chapter 26. MIT: The Brain and Cognitive Sciences Complex (Phillip D. Long)
Chapter 27. MIT: Steam Café (Scott Francisco)
Chapter 28. North Carolina State University: Flyspace (Hal Meeks)
Chapter 29. North Carolina State University: SCALE-UP (Robert Beichner)
Chapter 30. Northwestern University: The Information Commons (Bob Davis and Denise Shorey)
Chapter 31. The Ohio State University: The Digital Union (Victoria Getis, Catherine Gynn, and Susan E. Metros)
Chapter 32. Olin College of Engineering: Academic and Olin Centers (Joanne Kossuth)
Chapter 33. The Pennsylvania State University: Smeal College of Business (Peter Nourjian)
Chapter 34. St. Lawrence University: Center for Teaching and Learning (Sondra Smith and Kim Mooney)
Chapter 35. Stanford University: GroupSpaces (Richard Holeton)
Chapter 36. Stanford University: Wallenberg Hall (Dan Gilbert)
Chapter 37. The University of Arizona: Manuel Pacheco Integrated Learning Center (Christopher Johnson)
Chapter 38. University of British Columbia: The Irving K. Barber Learning Centre (Simon Neame and Cyprien Lomas)
Chapter 39. University of Central Florida: Collaboration and Multimedia Classrooms (Ruth Marshall)
Chapter 40. University of Chicago: The USITE/Crerar Computing Cluster and Cybercafé (Shirley Dugdale and Chad Kainz)
Chapter 41. The University of Georgia: The Student Learning Center (William Gray Potter and Florence E. King)
Chapter 42. Virginia Tech: The Math Emporium (Barbara L. Robinson and Anne H. Moore)
Chapter 43. Virginia Tech: Torgersen Hall (J. Thomas Head and Anne H. Moore)

Chapter 13
Assessing Learning Spaces
Sawyer Hunley and Molly Schaller
University of Dayton

An eloquent case can be made for the relationship between learning spaces and learning. But how do we know when a learning space enhances learning? We need assessment data to answer this question. The answer, in turn, provides guidance for developing learning spaces and for monitoring their impact on learning.

We cannot assess the impact of learning spaces without addressing instructional and programmatic issues, which requires a multifactor, multimethod analysis. The analysis determines the learning space characteristics that enhance student learning and support the faculty's pedagogical strategies. Data can then be used to establish a set of principles or guidelines to inform learning space development, while a monitoring system evaluates space effectiveness. This system should take into account learning outcomes and space utilization and should be sensitive to change over time.

Assessment Framework

Three issues must be addressed in the assessment design. First, it must be clear whether assessment focuses on teaching or learning. Second, the audience(s) for the assessment information must be identified to ensure the assessment blends with existing requirements, such as accreditation. Third, assessment of learning space must take into account the fact that learning and instruction are no longer confined to the classroom.

This chapter provides a framework for assessing the impact of learning spaces on learning. Assessment targets and methods are identified and then contextualized with an example of one university's approach.

Focus of Assessment

While the goal of higher education is to help students learn and develop, there is a difference between a learning focus and a teaching focus. An institution with an emphasis on learning measures its success through assessment of student learning outcomes. While the assessment of teaching might include evaluation of student learning outcomes, it is often limited to the assessment of student satisfaction with courses or peer observation of teaching performance, neither of which directly addresses learning.

Assessment should integrate the evaluation of instruction and learning. Learning is facilitated through the pedagogical efforts of the faculty; both faculty and learners are supported by learning space. Therefore, appropriate assessment targets are learning outcomes, teaching methods, and use of learning space.

Accountability

Traditional accountability methods include reports on quality to federal, regional, or state agencies and accreditation bodies. The audience for systematic assessment of institutional quality lies outside the institution.
Quality indicators generally are based on indirect measures of academic performance such as selectivity, academic expenditures, faculty-student ratios, and Carnegie classification. Because these measures do not adequately represent the net effects or value added of higher education, alternatives must be sought.

Full accountability is not limited to external audiences. Internal examination of effectiveness is important for institutional growth and development. Pascarella and Terenzini1 found that educationally effective institutions are differentiated by student involvement in the academic and nonacademic systems; the nature and frequency of student contact with peers and faculty members; interdisciplinary or integrated curricula; pedagogies that facilitate learning engagement and application; campus environments that emphasize scholarship and provide opportunities for encounters with diverse individuals and ideas; and environments that support exploration. These factors are linked to student learning and can be measured in terms of engagement in learning activities and use of space. Assessment of the relationship between learning spaces and academic engagement aligns closely with accountability and can be included in the overall assessment plan for the institution.

Informal Learning

Informal learning, which occurs outside the formal, instructor-facilitated setting, is now recognized as an important part of the overall learning environment. Informal settings include libraries and physical spaces that facilitate group and individual academic activities and computer-assisted learning. Technology has redefined the meaning of learning space by changing our notions of place and time. Place is defined by both physical and virtual settings. Learning time has become more flexible and can be formally scheduled or individually selected by the learner. The structure and content of learning can be formally structured and facilitated within a program or course, or it can be self-directed.

Assessment Structure

The assessment structure is extended with the inclusion of informal learning activities. Thus, a comprehensive assessment of learning space addresses the use of physical space that accommodates formal as well as informal and technologically based learning (see Table 1).

Table 1. Assessment Structure

Characteristic   Formal Learning                          Informal Learning
Environment      Physical and virtual                     Physical and virtual
Time             Scheduled, self-selected, and flexible   Scheduled, self-selected, and flexible
Structure        Facilitated                              Self-directed
Content          Program-directed                         Self-directed

Assessment Targets

Institutions should determine assessment targets based on their own missions, goals, and culture. Models, theories, and research suggest relevant targets. In their general model for assessing change in college students, Pascarella and Terenzini2 suggested using university-wide targets to determine student growth. Strange and Banning3 pointed to the importance of the person-environment interaction. Huba and Freed4 emphasized learning outcomes as a direct measure of learning. Astin's theory of involvement5 makes the case for measuring student engagement as an indicator of student learning.
General Model for Assessing Change

Pascarella and Terenzini's6 model approaches growth and development as a function of student background and precollege traits, structural and organizational features of the institution, interaction with agents of socialization, and quality of the study effort. Precollege traits and background can be addressed through the process of selectivity but are less relevant to the discussion of how students develop once they enter the academy. For the purpose of measuring the impact of learning space, organizational and structural features are translated into programmatic, pedagogical, and environmental factors. Student growth and development are affected by students' level of engagement and the quality of their study efforts. Thus, learning space assessment targets the facilitation or inhibition of student interactions with faculty and peers within formal and informal environments. The academic and cocurricular program, the pedagogical approaches used by faculty, and the environment become critical elements affecting engagement and are targets for assessment.

Person-Environment Interaction

Person-environment interaction models can help focus learning space assessment. Strange and Banning7 identified four person-environment themes:

• Physical surroundings encourage or constrain behavior.
• The collective socialization by individuals creates or defines environments.
• Organizational goals, complexity, centralization, formalization, stratification, production, and efficiency influence environments.
• Environmental pressure, social climate, and campus cultures influence perceptions of settings.

Measures that target the frequency and type of space use identify factors of the physical environment that encourage or constrain engagement. Focus groups, interviews, and surveys provide descriptive information regarding interactions between individuals, instructional characteristics, institutional climate, and other relevant structures. Together, quantitative and qualitative assessment methods reveal multiple aspects of the relationship between physical space and learning.

Learning Outcomes

Learning outcomes are observable and measurable indicators of student learning. Huba and Freed8 suggested that statements of learning outcomes usually begin with the phrase "Students will be able to...." Maki9 classified four levels of learning outcomes: institutional outcomes are general and reflect students' entire educational experience; program outcomes reflect work within a specific program; course outcomes reflect the type of work within a particular course; and individual outcomes come from data collected on the same individual over time.

Direct measures of learning outcomes are the most valid and reliable indicators of academic gains, but direct measures of the impact of learning spaces on learning are fraught with complexity. For example, students generally participate in courses and learning activities that are not confined to one type of learning environment. Individual courses may be taught by multiple instructors using a variety of methods. Institution-wide learning goals measured at discrete points during students' matriculation cannot fully account for the impact of their various experiences. Measures of learning specific to courses probably are not sensitive enough to detect differences due to instructional or setting variables. Individual measures collected over time would be costly and would differ across individuals, making the data difficult to interpret.
One alternative measure of student learning is student engagement.

Engagement

Astin's theory of involvement10 asserts that "students learn by becoming involved." A general consensus in the literature finds student engagement to be a valid indicator of educational effectiveness and a good indicator of learning. Research based on the National Survey of Student Engagement (http://nsse.iub.edu/index.cfm) validates this assessment target. The flexibility of the concept of engagement makes it useful for investigating the relationship between learning space and learning for several reasons:

• The relationship between the learning environment and the individuals occupying that environment can be determined.
• The involvement of students in learning activities within formal and informal learning environments can be measured.
• Engagement can be measured through direct (observation) and indirect (survey, focus group) methods.
• Measures of engagement are sensitive to changes over time.

Assessment Methods

We have identified three general targets for assessing the impact of learning spaces on learning: academic engagement, teaching methods, and use of learning spaces. Issues such as validity and reliability would be problematic if the assessment system relied on a single method; a multifactor, multimethod assessment approach, however, allows for the aggregation and verification of outcomes across measures. Qualitative measures provide insight and a depth of understanding into how individuals respond to space, as well as into their needs and whether those needs are being met. Quantitative measures reveal statistical relationships between specific types of space and their uses. Consistent patterns across the analyses demonstrate the impact of the space on learning. Three of the most useful methods are briefly described here.

Focus Groups and Interviews

Focus groups and interviews explore users' experience of spaces. They provide insight into how faculty and students respond to a particular space, how their views of each other change in different spaces, and how their views of learning are related to a specific space. While this approach relies on individual memory and interpretation, it also allows for a deeper understanding of individual reactions to spaces.

Surveys

While focus groups and interviews produce a rich understanding of users' experience of space, surveys can tap the perspectives of a larger number of students and validate findings from other measures. The National Survey of Student Engagement (http://nsse.iub.edu/index.cfm), for example, assesses the engagement of students across multiple institutions; annually developed norms can be used to compare institutions. Surveys administered repeatedly within a single institution can target specific questions and monitor changes in perceptions over time. Well-designed surveys generate both quantitative and qualitative information.

Photographic Studies

Photography, as a direct observational method, can determine usage patterns in learning spaces. Photographic studies capture observational data across time and in multiple settings with minimal intrusiveness and modest resources. This approach quantifies students' use of space, including their interactions with the physical and human environment. Direct observation offers a validity check on interpretations drawn from other measures.
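As a minimal sketch of how such observational data might be turned into usage patterns, the Python fragment below tallies coded photographic observations into average occupancy by zone and hour and a simple activity mix. It is illustrative only: the Observation record, its field names, the zone labels, and the sample data are hypothetical assumptions, not the coding scheme or tooling used in this chapter.

# Hypothetical sketch: tallying coded photographic observations into space-usage patterns.
# The record structure and sample data are illustrative assumptions, not the chapter's instrument.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

@dataclass
class Observation:
    taken_at: datetime   # photo timestamp (e.g., a quarter past each hour)
    zone: str            # coded location, e.g., "Library floor 3, zone B"
    occupants: int       # number of students visible in the photo
    activity: str        # coded activity, e.g., "group study", "individual study"

def occupancy_by_zone_and_hour(observations):
    """Average occupant count for each (zone, hour-of-day) pair."""
    counts = defaultdict(list)
    for obs in observations:
        counts[(obs.zone, obs.taken_at.hour)].append(obs.occupants)
    return {key: mean(values) for key, values in counts.items()}

def activity_mix(observations):
    """Share of observed occupants coded under each activity."""
    totals = defaultdict(int)
    for obs in observations:
        totals[obs.activity] += obs.occupants
    grand_total = sum(totals.values()) or 1
    return {activity: count / grand_total for activity, count in totals.items()}

if __name__ == "__main__":
    sample = [
        Observation(datetime(2004, 11, 1, 14, 15), "LTC studio", 6, "group study"),
        Observation(datetime(2004, 11, 1, 22, 15), "LTC studio", 9, "group study"),
        Observation(datetime(2004, 11, 1, 22, 15), "Library zone A", 3, "individual study"),
    ]
    for (zone, hour), avg in sorted(occupancy_by_zone_and_hour(sample).items()):
        print(f"{zone} at {hour:02d}:00  average occupancy {avg:.1f}")
    for activity, share in activity_mix(sample).items():
        print(f"{activity}: {share:.0%} of observed occupants")

Summaries of this kind (occupancy by zone and hour, or the share of occupants coded as group versus individual study) supply the quantitative usage patterns that focus groups and surveys can then help explain.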
An Example of Assessment

Over the past decade, the University of Dayton has worked to improve the overall campus environment through planning, renovation, and construction. The multiyear Learning Living Assessment Project examines the relationship among the built environment, academic programs, and learning/engagement in three innovative campus learning spaces and the library. A two-stage model first identifies the characteristics of learning spaces and academic programming that positively impact learning, then incorporates the findings to develop space and programs while monitoring their effectiveness. The project was launched in fall 2004.

Two unique living/learning spaces that opened on campus in fall 2004 were included in the study. Conceived and developed over a three-year period, ArtStreet includes student townhouses, music practice rooms, classrooms, and studios. The Marianist Hall learning/living space (attached to a residence hall) was designed with a specific integrated learning community in mind. This space has two large classrooms, two smaller meeting rooms, many smaller spaces for faculty and students, and a rotunda for large group meetings. The Learning Teaching Center (LTC), the third space included in the study, is an established space holding an experimental classroom, one large and one small meeting room, a coffee shop, meeting space, and personal study spaces. The study also included the campus library study spaces to encompass a wider range of informal learning spaces on campus.

Methods

The first year of the project focused on determining the relationship among learning, academic programs, and physical learning space. Multiple methods and multiple sources were used in developing a streamlined and user-friendly assessment system. In most cases the measures were administered in the fall and in the spring to identify response patterns over time. Both qualitative and quantitative approaches were used. Engagement served to represent learning outcomes, as measured through the National Survey of Student Engagement (http://nsse.iub.edu/index.cfm), focus groups, surveys, and photographic studies of both formal and informal learning spaces. The data identified programmatic and pedagogical characteristics that increased both student and faculty engagement in the learning process. These data were also used to generate quantitative information about space usage and qualitative information clarifying why certain spaces were preferred.

The National Survey of Student Engagement data were obtained from the spring 2004 administration of the survey to first-year students and seniors. These data suggested an overall pattern of student engagement for the university. Surveys were administered in the fall and spring to students who lived or took courses in the three innovative living/learning spaces and to a nonparticipant control group. The surveys provided both quantitative and qualitative data regarding perceptions of physical spaces on campus and of academic programs. Focus groups recorded the in-depth perceptions of students and faculty who participated in the innovative spaces or programs.

Photographic studies were conducted in the three innovative spaces and the library, with a layout of each space used to select photo spots. For ArtStreet, Marianist Hall, and the LTC, still pictures taken on a digital camera were stored on a computer hard drive and then transferred to compact discs.
Photographers took a picture at each designated spot, chosen according to the arrangement of furniture and the configuration of the environment. Photos taken on the library's six floors using a video camera were converted to DVDs. Photographers began on the top floor and followed a designated path throughout the building. Floors were divided into "zones" determined by changes in furniture. Photographs were taken by the primary researchers, volunteer students, student staff, or professional staff every hour, beginning at approximately a quarter after the hour, for one week. The library space, the LTC, and Marianist Hall were each photographed in the fall of 2004. ArtStreet was not photographed at that time because construction delays pushed the facility's opening to late in the fall term. Marianist Hall and ArtStreet were each photographed in spring 2005. The photographic study will continue to record changes in space use for the next two years.

Insights

Based on our first year of data collection and analysis, we have discovered relationships among learning space, instructional practices, and learning. Academic engagement was encouraged by learning spaces that were comfortable, open, flexible, and appealing. For example, students described classes in one of the innovative spaces as requiring more accountability on their part because there were few physical barriers between themselves and faculty. Students were most engaged in settings and in academic activities that encouraged interpersonal interaction and were supported by technology. In comparison, in more traditional classrooms with seats arranged in rows and the instructor at the front of the room, they felt they had less responsibility for participation. Engagement was discouraged by poor air circulation, uncomfortable temperatures, distractions, and noninteractive pedagogical practices. In addition, our photographic studies showed students using our newest and perhaps most innovative spaces late into the night for individual and group study. Students reported that they felt at home in these spaces and that they could stay focused on academics while there.

The results also revealed that no one physical structure accommodated all types of learning needs. A balanced environment facilitates both group and individual activities, with features that support computer access and spaces that allow for a break from focused academic work.

The learning space often limits the faculty's pedagogical repertoire, while faculty discussions or communities of practice expand their awareness of pedagogical options. Faculty who are comfortable leading case studies, discussions, or small group activities in flexible spaces do not believe they can accomplish these same activities in traditional spaces. They prefer flexible space with movable furniture and seamless technology. Faculty who were not comfortable with a range of pedagogical approaches tended to alter our most innovative spaces to obtain a lecture-room feel. In one classroom with no tables, just comfortable chairs in a circle, one faculty member consistently pulled a table in front of her seat and lectured from that position. To expand faculty pedagogy, we cannot simply build or design new spaces; faculty also need opportunities to discuss and explore new approaches for engaging students.

A key to academic engagement is to minimize the separation between living and learning.
Learning takes place in all environments, so a complete assessment of the impact of learning environments must include informal as well as formal academic settings. Formal settings are most engaging when they encourage learning through social interaction and are relevant to students' lives. Informal settings must be flexible and comfortable and accommodate a variety of learning activities. This understanding of the relationship between living and learning led our research team to adopt the motto "Bring life to learning; bring learning to life."

Practical Implications

Higher education has significant investments in learning spaces, with the expectation of making a positive impact on learning. Well-designed assessments provide the information needed to determine the impact of learning spaces on learning. The process must account for the complex interaction among learning spaces, pedagogical practices, and student outcomes. Problems in interpreting the results can be mitigated through a system that incorporates data gathered over time from multiple factors, multiple methods, and multiple sources. A two-stage model for assessment provides both a set of criteria that guides space development and helps identify measurable targets, and a process for monitoring the impact of space on key learning and engagement targets over time. While concepts about the impact of learning space on learning are certainly generalizable, assessment procedures should be conducted by each institution to account for individual differences. Higher education must assess its own performance, address its weaknesses, build on its strengths, and promote high-quality experiences. The main advantage of an assessment strategy is the enhancement of student learning, the goal of every college and university.

Endnotes

1. Ernest T. Pascarella and Patrick T. Terenzini, How College Affects Students: A Third Decade of Research, vol. 2 (San Francisco: Jossey-Bass, 2005).
2. Ibid.
3. C. Carney Strange and James H. Banning, Educating by Design: Creating Campus Learning Environments That Work (San Francisco: Jossey-Bass, 2001).
4. Mary E. Huba and Jann E. Freed, Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning (Needham Heights, Mass.: Allyn & Bacon, 2000).
5. Alexander W. Astin, "Involvement: The Cornerstone of Excellence," Change, vol. 17, no. 4 (1985), pp. 35–39.
6. Pascarella and Terenzini, op. cit.
7. Strange and Banning, op. cit.
8. Huba and Freed, op. cit.
9. Peggy L. Maki, Assessing for Learning: Building a Sustainable Commitment Across the Institution (Sterling, Va.: Stylus Publishing, 2004).
10. Astin, op. cit.

About the Authors

Sawyer Hunley is coordinator of the School Psychology Program and a learning/teaching fellow at the University of Dayton in Ohio. As program coordinator, she has advanced the concept of learning and behavioral assessment in a prevention and response-to-intervention model. In her role as chair of the National Association of School Psychologists Certification Board, she has integrated this model into the requirements for national credentialing. Hunley is currently investigating the impact of learning space on learning at the University of Dayton.

Molly Schaller is an assistant professor and coordinator of the College Student Personnel Program and a fellow in the Learning Teaching Center at the University of Dayton.
She and Hunley are engaged in a multiyear, comprehensive study of the relationship between space and learning. She holds a master's degree in college student personnel from Miami University and a PhD in higher education administration from Ohio University. Her research focuses on college student development, with a special emphasis on sophomore students.