OSU Social Robotics Syllabus

ROB-568, Fall 2017
Social Robotics
Instructor: Heather Knight
Office: TBD
E-Mail: [email protected]
CATALOG DESCRIPTION
In-depth exploration of the leading research, design principles, and challenges in
Human-Robot Interaction (HRI), with an emphasis on socially interactive robots.
Topics include social embodiment, multi-modal communication, human-robot
teamwork, social learning, aspects of social psychology and cognition, as well as
applications and evaluation with human subjects. Requires participation, lightning
talks, student-led lectures, written critiques of class readings, and a group project
involving a hypothetical social robotics study.
Pre/Co-requisites and Enforced Prerequisites: none
Level Limitations: +01 (Graduate)
Course Credits: 4 (220 minutes of combined lecture and discussion per week)
Required text: None. All papers are available online.
Tentative list of topics:
• Introduction: Human-Robot Interaction (HRI)
• Intentional Action
   o Intention Parsing
   o Intention Expression
   o Legible Robot Motion
• Social Navigation
   o Navigation Planning
   o Kalman and Particle Filters
   o Social Navigation
• Nonverbal Behavior in HRI
   o Gestures and Body Language
   o Expressive Motion
   o Gaze and Eye Contact
• Collaboration
   o Human-Robot Collaboration
   o Handovers
• Learning in HRI
   o Reinforcement Learning and Supervised Learning
   o Learning from Demonstration
   o Socially Guided Robot Learning
• HRI Experiments
   o Experimental Design
   o Design Research in Social HRI
   o Performance Methodologies in HRI
• Applications for HRI
   o Socially Assistive Robots
   o Logistics
   o Robot Ethics
Learning Outcomes:
By the end of the course students will be able to:
• Articulate the core theoretical challenges in Social Robotics.
• Read and assess technical papers.
• Structure and present research experiments.
• Motivate and produce Social Robotics study designs.



Evaluation of Student Performance:
Attendance (10%): Students who attend 18 or more of the 20 sessions receive the
full 10%; those who attend 16 or 17 receive 5%; fewer receive 0%. Permission to
miss a class without penalty must be requested at least 48 hours in advance.
Lightning Talks (10%): Students receive up to 5 points for each of the two
lightning talks (see schedule): 1 point each for motivation, experiment, results,
visuals, and entertainment value.
Research Blog (30%): Before each lecture, students will read at least one of the
assigned papers and write 200-300 words about the upcoming topic, due one hour
before the start of class. Each entry is worth 2-3 points, one point for each of the
following:
• Focus on critique and reactions to the readings, not merely a summary.
• Present a thoughtful opinion on the posed question(s); compare and contrast
ideas and concepts across readings.
• (optional) Go beyond the assigned readings to find papers you feel are
particularly informative or insightful.
Student Lecture (30%): Each student will create a 30-minute presentation for
their assigned lecture topic and prepare 3-5 questions to lead class discussion on
the topic:
• The presentation is worth 20%: motivation of the topic, completeness,
clarity, visuals, and entertainment value.
• The discussion questions and the quality of the discussion are worth 5%,
determined by an anonymous class vote.
• Posting the slides on the class website is worth 5% if posted the same day,
decreasing by one percentage point per day after that.
Final Project (20%): Students will form teams around a theoretical research topic,
perform a literature search to motivate the topic (5%), outline a prospective
research experiment (5%), create a video illustrating “simulated” examples of study
outcomes (5%), and share reflections on the “results” and overall process (5%).
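
Because these components combine by simple addition, a short worked example may
help make the weighting concrete. The Python sketch below is illustrative only,
not an official grading tool; the function names and the sample student's numbers
are hypothetical, and the scaling of blog entries to the 30% cap is an assumption.

def attendance_score(sessions_attended):
    """Attendance (10%): 18+ of 20 sessions earns 10, 16-17 earns 5, else 0."""
    if sessions_attended >= 18:
        return 10.0
    if sessions_attended >= 16:
        return 5.0
    return 0.0

def slide_posting_score(days_late):
    """Slide posting (5% of Student Lecture): full credit the same day,
    minus one percentage point per day late, floored at zero."""
    return max(0.0, 5.0 - days_late)

def course_grade(sessions_attended, talk_points, blog_points,
                 lecture_points, discussion_points, days_late, project_points):
    """Sum the components, each already on its stated scale:
    talk_points: two lightning talks at up to 5 points each (up to 10)
    blog_points: research blog total (up to 30; per-entry scaling assumed)
    lecture_points: student lecture presentation (up to 20)
    discussion_points: discussion questions/quality (up to 5)
    project_points: four 5-point final-project parts (up to 20)"""
    return (attendance_score(sessions_attended) + talk_points + blog_points
            + lecture_points + discussion_points
            + slide_posting_score(days_late) + project_points)

# Hypothetical student: 19 sessions attended, 9/10 on lightning talks,
# 27/30 blog, 18/20 lecture, 4/5 discussion, slides one day late,
# 17/20 project: 10 + 9 + 27 + 18 + 4 + 4 + 17 = 89.0
print(course_grade(19, 9, 27, 18, 4, 1, 17))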
SCHEDULE
Class   Date    Topic
1-1     Sep-21  Intro: Social HRI + Course Overview
1-2     Sep-26  Sample Lecture: Early Social Robotics
2-1     Sep-28  Student Lightning Talks
2-2     Oct-3   1. Social Navigation
3-1     Oct-5   2. NVB: Gestures
3-2     Oct-10  3. NVB: Expressive Motion
4-1     Oct-12  4. NVB: Gaze
4-2     Oct-17  5. Human-Robot Collaboration
5-1     Oct-19  6. Handovers
5-2     Oct-24  7. Multimodal Dialog
6-1     Oct-26  8. Learning from Demonstration
6-2     Oct-31  9. Socially Guided Robot Learning
7-1     Nov-2   10. Design Methods
7-2     Nov-7   11. Improvisation & Performing Arts
8-1     Nov-9   12. Ethics in Social Robotics
8-2     Nov-14  Project Working Session I
9-1     Nov-16  Literature Search Lightning Talks
9-2     Nov-21  Project Working Session II
10-1    Dec-5   Project Presentations I
10-2    Dec-7   Project Presentations II
Working Together and Expectations of Student Conduct: Collaboration with
fellow students is mandatory. Using freely available resources on the web (and
elsewhere) is highly encouraged. You are, however, responsible for clearly
documenting material or ideas that come from others. Specifically, when conducting
peer-to-peer evaluations and documenting projects, you must delineate who was
responsible for what, where external material came from, and how it was used.
Follow the University’s Statement of Expectations for Student Conduct.
Academic Dishonesty: You are permitted, and to a great extent encouraged, to
work with others on homework sets. However, there is an obvious difference
between constructive discussion of a particular problem and copying. Acts of
academic dishonesty will not be tolerated and will be handled according to
university policy. (See http://studentlife.oregonstate.edu/studentconduct/offenses0 for details.)
Other Policies:
Students with Disabilities: Accommodations are collaborative efforts between
students, faculty and Disability Access Services (DAS). Students with approved DAS
accommodations should contact the instructor by the first week of the term.
Students who believe they are eligible for accommodations but have not yet
obtained approval through DAS should contact DAS immediately at 737-4098.
READINGS
Papers marked with the solemn emoticon (-_-) are required readings.
Papers marked with the zen-like emoticon (^_^) should be emphasized in the lecture presentation.
0. Early HRI
• Breazeal, C. (2004). Social Interactions in HRI: The Robot View. IEEE
Transactions on SMC, Part C, 34(2), 181–186.
• Breazeal, C. (2002). Designing Sociable Machines: Lessons Learned. In K.
Dautenhahn, A. H. Bond, L. Canamero, & B. Edmonds (Eds.), Socially Intelligent
Agents (pp. 149–156). Norwell, MA: Kluwer Academic Publishers.
1. Social Navigation
• Helbing & Molnar (1995). Social force model for pedestrian
dynamics. Physical Review E, 51(5), 4282. [PDF] (-_-)
• Sisbot et al. (2007). A human aware mobile robot motion planner. IEEE
Transactions on Robotics, 23(5), 874-883. [PDF available with University
login] (-_-)
• Papadakis et al. (2014). Adaptive spacing in human-robot interactions.
In IEEE/RSJ International Conference on Intelligent Robots and Systems
(IROS’14), 2627–2632. [PDF] (^_^)
• Murakami et al. (2014). Destination unknown: walking side-by-side without
knowing the goal. In Proceedings of the ACM/IEEE international conference on
Human-robot interaction (HRI’14), 471–478. [PDF] (^_^)
2. Nonverbal Behavior: Gestures
• Ekman & Friesen (1969). The repertoire of nonverbal behavior: Categories,
origins, usage, and coding. Semiotica, 1(1), 49-98. [PDF] (-_-)
• Calinon & Billard (2007). Incremental learning of gestures by imitation in a
humanoid robot. In Proc. of the ACM/IEEE international conference on
Human-robot interaction (HRI’07), pp. 255-262. [PDF] (^_^)
• Brooks & Breazeal (2006). Working with robots and objects: Revisiting
deictic reference for achieving spatial common ground. In Proceedings of the
1st ACM SIGCHI/SIGART conference on Human-robot interaction (HRI’06),
297–304. [PDF] (^_^)a
• Brooks & Arkin (2007). Behavioral overlays for non-verbal communication
expression on a humanoid robot. Autonomous Robots, 22(1), 55-74.
[PDF] (^_^)b
• Ou & Grupen (2010). From manipulation to communicative gesture.
In Proceedings of the 5th ACM/IEEE international conference on Human-robot
interaction (HRI’10), 325–332. [PDF]
3. Nonverbal Behavior: Expressive Motion
• Heider, F., & Simmel, M. (1944). An experimental study of apparent
behavior. The American Journal of Psychology, 57(2), 243–259.
• Sirkin, D., Mok, B., Yang, S., & Ju, W. (2015). Mechanical ottoman: how robotic
furniture offers and withdraws support. In Proc. of the Tenth Annual
ACM/IEEE International Conference on Human-Robot Interaction (HRI’15), pp.
11-18.
• Knight, H., Thielstrom, R., & Simmons, R. (2016, October). Expressive path
shape (swagger): Simple features that illustrate a robot's attitude toward its
goal. In Intelligent Robots and Systems (IROS), 2016 IEEE/RSJ International
Conference on (pp. 1475-1482). IEEE.
• Knight, H., & Simmons, R. (2016, May). Laban head-motions convey robot
state: A call for robot body language. In Robotics and Automation (ICRA), 2016
IEEE International Conference on (pp. 2881-2888). IEEE.
• Knight, H., & Simmons, R. (2014, August). Expressive motion with x, y and
theta: Laban effort features for mobile robots. In Robot and Human
Interactive Communication, 2014 RO-MAN: The 23rd IEEE International
Symposium on (pp. 267-273). IEEE.
4. Nonverbal Behavior: Gaze
• Argyle et al. (1973). The different functions of gaze. Semiotica, 7(1), 19-32.
[PDF] (-_-)
• Kuno et al. (2006). Museum guide robot with communicative head motion.
In 15th IEEE International Symposium on Robot and Human Interactive
Communication (RO-MAN’06), 33-38. IEEE. [PDF] (^_^)
• Andrist et al. (2014). Conversational gaze aversion for humanlike robots.
In Proceedings of the ACM/IEEE international conference on Human-robot
interaction (HRI’14), 25-32. [PDF] (^_^)
• Argyle & Dean (1965). Eye-contact, distance and affiliation. Sociometry, 28(3),
289-304. [PDF]
• Admoni et al. (2014). Deliberate delays during robot-to-human handovers
improve compliance with gaze communication. In Proc. of the 9th ACM/IEEE
international conference on Human-robot interaction (HRI’14). [PDF]
5. Human-Robot Collaboration
• Bratman, M. E. (1992). Shared cooperative activity. The Philosophical
Review, 101(2), 327-341. [PDF] (-_-)
• Hoffman & Breazeal (2004). Collaboration in Human-Robot Teams. In Proc. of
the AIAA 1st Intelligent Systems Technical Conference, Chicago, IL,
USA. [PDF] (^_^)a
• Hoffman & Breazeal (2007). Cost-based anticipatory action selection for
human–robot fluency. IEEE Transactions on Robotics, 23(5), 952–961. [PDF] (^_^)b
• Shah et al. (2009). Fast Distributed Multi-agent Plan Execution with Dynamic
Task Assignment and Scheduling. International Conference on Automated
Planning and Scheduling (ICAPS’09). [PDF] (^_^)
• Sebanz, N., et al. (2006). Joint action: bodies and minds moving
together. Trends in Cognitive Sciences, 10(2), 70-76. [PDF]
• Hawkins et al. (2014). Anticipating human actions for collaboration in the
presence of task and sensor uncertainty. In Proc. of the IEEE International
Conference on Robotics and Automation (ICRA 2014), pp. 2215-2222. [PDF]
• Cohen, P. R., & Levesque, H. J. (1991). Teamwork. Noûs, 25(4), 487-512. [PDF
available with University login]
6. Handovers
• Strabala et al. (2013). Towards seamless human-robot handovers. Journal of
Human-Robot Interaction, 2(1), 112-132. [PDF] (-_-)
• Sisbot & Alami (2012). A human-aware manipulation planner. IEEE
Transactions on Robotics, 28(5), 1045-1057. [PDF available with University
login] (^_^)
• Yamane et al. (2013). Synthesizing object receiving motions of humanoid
robots with human motion database. In IEEE International Conference on
Robotics and Automation (ICRA’13), 1629-1636. [PDF] (^_^)
• Shi et al. (2013). A Model of Distributional Handing Interaction for a Mobile
Robot. In Robotics: Science and Systems (RSS’13). [PDF]
7. Multimodal Dialog
• Clark & Tree (2002). Using uh and um in spontaneous speaking. Cognition,
84(1), 73-111. [PDF] (-_-)
• Chao & Thomaz (2012). Timing in multimodal turn-taking interactions:
Control and analysis using timed petri nets. Journal of Human-Robot
Interaction, 1(1). [PDF] (^_^)
• Knepper et al. (2015). Recovering from failure by asking for
help. Autonomous Robots, 39(3), 347-362. [PDF] (^_^)
8. Learning from Demonstration
• Argall, B. D., Chernova, S., Veloso, M., & Browning, B. (2009). A survey of
robot learning from demonstration. Robotics and Autonomous Systems,
57(5), 469-483. [PDF] (-_-)
• Akgun, B., Cakmak, M., Yoo, J. W., & Thomaz, A. L. (2012). Trajectories and
keyframes for kinesthetic teaching: A human-robot interaction perspective.
In Proc. of the seventh annual ACM/IEEE international conference on Human-Robot
Interaction (HRI’12), pp. 391-398. [PDF] (^_^)
• Chernova, S. and Veloso, M. (2010). Confidence-based multi-robot learning
from demonstration. International Journal of Social Robotics, 2(2), 195–215.
[PDF] (^_^)
• Meltzoff, A. N., & Decety, J. (2003). What imitation tells us about social
cognition: a rapprochement between developmental psychology and
cognitive neuroscience. Philosophical Transactions of the Royal Society of
London B: Biological Sciences, 358(1431), 491-500. [PDF]
9. Socially Guided Robot Learning
• Meltzoff (2007). ‘Like me’: a foundation for social cognition. Developmental
Science, 10(1), 126-134. [PDF] (-_-)
• Knox et al. (2013). Training a robot via human feedback: A case study.
In International Conference on Social Robotics (ICSR’13), 460-470. [PDF] (^_^)
• Thomaz & Cakmak (2009). Learning about objects with human teachers.
In Proc. of the 4th ACM/IEEE international conference on Human robot
interaction (HRI’09), pp. 15-22. [PDF] (^_^)
• Cakmak & Thomaz (2012). Designing robot learners that ask good
questions. In Proc. of the seventh annual ACM/IEEE international conference
on Human-Robot Interaction (HRI’12), pp. 17-24. [PDF]
10. Design Methods
• Kahn, P. H., Freier, N. G., Kanda, T., Ishiguro, H., Ruckert, J. H., Severson, R. L.,
& Kane, S. K. (2008). Design patterns for sociality in human-robot interaction.
In Proc. of the 3rd ACM/IEEE international conference on Human robot
interaction (HRI’08), pp. 97-104.
• Hoffman, G., & Ju, W. (2014). Designing robots with movement in mind.
Journal of Human-Robot Interaction, 3(1), 89-122.
11. Improvisation & Performing Arts
• Knight, H. (2011). Eight lessons learned about non-verbal interactions
through robot theater. In Social Robotics (pp. 42–51). Springer.
• Dixon, S. (2004). Metal performance: humanizing robots, returning to nature,
and camping about. The Drama Review, 48(4).
• Jochum, E., Vlachos, E., Christoffersen, A., Nielsen, S. G., Hameed, I. A., & Tan,
Z. H. (2016). Using Theatre to Study Interaction with Care
Robots. International Journal of Social Robotics, 8(4), 457-470.
• Hoffman, G., & Weinberg, G. (2010, May). Gesture-based human-robot jazz
improvisation. In Robotics and Automation (ICRA), 2010 IEEE International
Conference on (pp. 582-587). IEEE.
12. Robots and Ethics
• Wallach, W., & Allen, C. (2009). Moral machines: Teaching robots right from
wrong. Oxford University Press.
• Malle, B. F., Scheutz, M., Arnold, T., Voiklis, J., & Cusimano, C. (2015). Sacrifice
One for the Good of Many? People Apply Different Moral Norms to Human
and Robot Agents. In Proceedings of the Tenth Annual ACM/IEEE International
Conference on Human-Robot Interaction (pp. 117-124). ACM.
• Kahn Jr, P. H., Kanda, T., Ishiguro, H., Gill, B. T., Ruckert, J. H., Shen, S., … &
Severson, R. L. (2012, March). Do people hold a humanoid robot morally
accountable for the harm it causes? In Proceedings of the seventh annual
ACM/IEEE international conference on Human-Robot Interaction (pp. 33-40).
• Kahn Jr, P. H., Kanda, T., Ishiguro, H., Freier, N. G., Severson, R. L., Gill, B. T., …
& Shen, S. (2012). “Robovie, you’ll have to go into the closet now”: Children’s
social and moral relationships with a humanoid robot. Developmental
Psychology, 48(2), 303.