TY - JOUR
T1 - When to provide feedback? Exploring human-co-robot interactions in engineering environments
AU - Lopez, Christian Enmanuel
AU - Tucker, Conrad
N1 - Funding Information:
This research is funded by the National Science Foundation NSF NRI #1527148. Any opinions, findings, or conclusions found in this paper are those of the authors and do not necessarily reflect the views of the sponsors.
Publisher Copyright:
© American Society for Engineering Education, 2017.
PY - 2017/6/24
Y1 - 2017/6/24
AB - Co-robots are robots that work alongside their human counterparts towards the successful completion of a task or set of tasks. In the context of engineering education, co-robots have the potential to aid students towards the successful completion of an engineering assignment by providing students with real-time feedback regarding their performance, technique, or safety practices. However, determining when and how to provide feedback that advances learning remains an open research question for human-co-robot interactions. Towards addressing this knowledge gap, this work describes the data types available to both humans and co-robots in the context of engineering education. Furthermore, this work demonstrates how these data types can potentially be utilized to enable co-robot systems to provide feedback that advances students' learning or task performance. The authors introduce a case study pertaining to the use of a co-robot system capable of capturing students' facial keypoint and skeletal data, and providing real-time feedback. The co-robot is created using commercially available, off-the-shelf components (e.g., Microsoft Kinect) in order to expand the reach and potential availability of these systems in engineering education. This work analyzes the facial expressions exhibited by students as they received instructions about how to complete a task, and feedback about their subsequent performance on that task. This allows the authors to explore the influence that co-robot visual feedback systems have in changing students' behavior while performing a task. The results suggest that students' facial keypoint data is statistically significantly different, depending on the feedback provided (p-value < 0.005). Moreover, the results suggest there is a statistically significant relationship between students' facial keypoint data while receiving instructions on how to complete a task, and their subsequent performance on that task (p-value < 0.005). These findings suggest that students' facial keypoint data can be utilized by a co-robot system to learn about the state changes in students as they complete a task, and to provide interventions when certain patterns are discovered that have the potential to reduce students' learning or task performance. The outcomes of this paper contribute to advancing the National Academy of Engineering's Grand Challenge of personalized learning by demonstrating how students' facial expression data can be utilized effectively to advance human-co-robot interactions and improve the capability of co-robot systems to provide feedback that advances students' performance.
UR - http://www.scopus.com/inward/record.url?scp=85030569607&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85030569607&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85030569607
SN - 2153-5965
VL - 2017-June
JO - ASEE Annual Conference and Exposition, Conference Proceedings
JF - ASEE Annual Conference and Exposition, Conference Proceedings
T2 - 124th ASEE Annual Conference and Exposition
Y2 - 25 June 2017 through 28 June 2017
ER -