Co-robots are robots that work alongside their human counterparts toward the successful completion of a task or set of tasks. In the context of engineering education, co-robots have the potential to aid students toward the successful completion of an engineering assignment by providing students with real-time feedback regarding their performance, technique, or safety practices. However, determining when and how to provide feedback that advances learning remains an open research question for human-co-robot interactions. Toward addressing this knowledge gap, this work describes the data types available to both humans and co-robots in the context of engineering education. Furthermore, this work demonstrates how these data types can potentially be utilized to enable co-robot systems to provide feedback that advances students' learning or task performance. The authors introduce a case study pertaining to the use of a co-robot system capable of capturing students' facial keypoint and skeletal data, and providing real-time feedback. The co-robot is created using commercially available, off-the-shelf components (e.g., Microsoft Kinect) in order to expand the reach and potential availability of these systems in engineering education. This work analyzes the facial expressions exhibited by students as they received instructions about how to complete a task, and feedback about their subsequent performance on that task. This allows the authors to explore the influence that co-robot visual feedback systems have in changing students' behavior while performing a task. The results suggest that students' facial keypoint data differ significantly depending on the feedback provided (p-value < 0.005). Moreover, the results suggest there is a statistically significant relationship between students' facial keypoint data while receiving instructions on how to complete a task, and their subsequent performance on that task (p-value < 0.005).
These findings suggest that students' facial keypoint data can be utilized by a co-robot system to learn about state changes in students as they complete a task, and to provide interventions when patterns are discovered that have the potential to reduce students' learning or task performance. The outcomes of this paper contribute to advancing the National Academy of Engineering's Grand Challenge of personalized learning by demonstrating how students' facial expression data can be utilized in an effective manner to advance human-co-robot interactions and improve the capability of co-robot systems to provide feedback that advances students' performance.
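The statistical comparison described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's actual analysis pipeline: the feature (a single per-frame facial-keypoint measurement), the two feedback-condition labels, the synthetic data, and the choice of an independent-samples t-test are all assumptions made here to show how facial keypoint data captured across feedback conditions might be tested for a significant difference.

```python
# Hypothetical sketch of testing whether a facial-keypoint feature differs
# between two feedback conditions. The feature, condition names, and data
# are synthetic stand-ins; the paper's own features and tests may differ.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic per-frame feature values (e.g., a normalized keypoint distance)
# recorded under two hypothetical feedback conditions.
positive_feedback = rng.normal(loc=12.0, scale=1.0, size=200)
corrective_feedback = rng.normal(loc=11.4, scale=1.0, size=200)

# Independent-samples t-test: are the condition means different?
t_stat, p_value = stats.ttest_ind(positive_feedback, corrective_feedback)
print(f"t = {t_stat:.3f}, p = {p_value:.5f}")
```

A real co-robot system would extract such features from live Kinect keypoint streams and could trigger an intervention when the observed pattern matches one associated with reduced task performance.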
|Original language||English (US)|
|Journal||ASEE Annual Conference and Exposition, Conference Proceedings|
|State||Published - Jun 24 2017|
|Event||124th ASEE Annual Conference and Exposition - Columbus, United States|
Duration: Jun 25 2017 → Jun 28 2017