NRI: Real Time Observation, Inference and Intervention of Co-Robot Systems Towards Individually Customized Performance Feedback Based on Students' Affective States

Project: Research project

Project Details

Description

This NSF National Robotics Initiative project will investigate the potential of a cycle of observation, inference and intervention by co-robot systems to enhance students' affective states and improve their performance on engineering laboratory tasks. Co-robots are robots that work side-by-side with humans, assisting them and adapting to their needs. The two-way exchange of knowledge between students and co-robots creates a reciprocal relationship, in which each party learns from the other in service of a common goal. Affective states, such as frustration and engagement, play a major role in students' performance on everyday learning tasks. A student who is overly stressed or distracted may commit errors that would otherwise be easy to avoid. A co-robot system that is cognizant of students' affective states can intervene to prevent these errors.

The results of this project may provide a template for skill-based instruction on topics well beyond engineering. Currently, such learning requires extensive interaction between a student and an instructor, with the instructor providing intensive feedback at all times. In many cases, personality mismatches or other issues between instructor and student can lead to frustration, learning difficulties, and eventual dropout. Furthermore, one-on-one learning is limited by scalability challenges: an increase in the number of students, without a proportional increase in trained instructors, can reduce the quality and quantity of instructor time allocated to each student. Co-robot learning systems can mitigate these challenges by providing real-time, scalable feedback that adapts to the individual needs of students and helps to minimize the amount of human instructor time each student requires.

This research will acquire facial, auditory, and body gesture data from students using the co-robot's integrated visual, audio and depth sensory system. The system will make statistical inferences about students' affective states, based on machine learning classification of facial and body language data. Visual feedback will be used to present students with interventions (visual instructions and commentary) intended to enhance their affective state and improve their performance on laboratory tasks. The project will assess the impact of the co-robots' ability to improve students' affective states and enhance students' performance on laboratory tasks over repeated iterations of learning and testing. This project will lead to a better understanding of how students interact and function during potentially stressful laboratory activities. The co-robot systems proposed in this work will help discover the correlations that exist between students' affect and task performance. Co-robots will actively adapt to the manner in which students learn complex engineering tasks and the affective states that accompany that learning. Co-robot systems that predict the effectiveness of specific intervention strategies for each student and situation will lead to individually tailored student feedback that serves both students and instructors toward enhancing student performance over time. This proposal advances the impact of co-robots into educational research and practice and extends knowledge of how to succinctly represent the complexities of human behavior in digital form.
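The observation-inference-intervention cycle described above can be sketched as a simple loop. This is a minimal illustration only: the feature names, thresholds, affective-state labels, and intervention messages below are assumptions made for the sketch, standing in for the project's actual sensor features and trained classifier.

```python
from dataclasses import dataclass

# Affective states of interest (frustration and engagement are named in the
# abstract; "distracted" is added here as an illustrative third label).
STATES = ("engaged", "frustrated", "distracted")

@dataclass
class Observation:
    """Placeholder features standing in for facial/audio/gesture data."""
    brow_furrow: float   # hypothetical facial feature, 0..1
    speech_pitch: float  # hypothetical auditory feature, 0..1
    fidget_rate: float   # hypothetical body-gesture feature, 0..1

def infer_state(obs: Observation) -> str:
    """Toy rule-based stand-in for the ML classifier of affective state."""
    if obs.brow_furrow > 0.7 and obs.speech_pitch > 0.6:
        return "frustrated"
    if obs.fidget_rate > 0.7:
        return "distracted"
    return "engaged"

# Illustrative visual interventions keyed by inferred state.
INTERVENTIONS = {
    "frustrated": "Display a simplified, step-by-step hint for the current task.",
    "distracted": "Highlight the current task step on screen.",
    "engaged": "No intervention; continue monitoring.",
}

def observe_infer_intervene(obs: Observation) -> tuple[str, str]:
    """One pass of the cycle: classify the observation, select feedback."""
    state = infer_state(obs)
    return state, INTERVENTIONS[state]
```

In a real system the rule-based `infer_state` would be replaced by a classifier trained on labeled facial and body-language data, and the loop would run continuously over the sensor stream during the laboratory task.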

Status: Finished
Effective start/end date: 9/1/15 – 8/31/18

Funding

  • National Science Foundation: $342,574.00
