When to provide feedback? Exploring human-co-robot interactions in engineering environments

Christian Enmanuel Lopez, Conrad Tucker

Research output: Contribution to journal › Conference article

1 Citation (Scopus)

Abstract

Co-robots are robots that work alongside their human counterparts towards the successful completion of a task or set of tasks. In the context of engineering education, co-robots have the potential to aid students towards the successful completion of an engineering assignment by providing them with real-time feedback regarding their performance, technique, or safety practices. However, determining when and how to provide feedback that advances learning remains an open research question for human-co-robot interactions. Towards addressing this knowledge gap, this work describes the data types available to both humans and co-robots in the context of engineering education. Furthermore, this work demonstrates how these data types can potentially be utilized to enable co-robot systems to provide feedback that advances students' learning or task performance. The authors introduce a case study pertaining to the use of a co-robot system capable of capturing students' facial keypoint and skeletal data and providing real-time feedback. The co-robot is created using commercially available, off-the-shelf components (e.g., Microsoft Kinect) in order to expand the reach and potential availability of these systems in engineering education. This work analyzes the facial expressions exhibited by students as they received instructions on how to complete a task, as well as feedback about their subsequent performance on that task. This allows the authors to explore the influence that co-robot visual feedback systems have in changing students' behavior while performing a task. The results suggest that students' facial keypoint data differs significantly depending on the feedback provided (p-value < 0.005). Moreover, the results suggest there is a statistically significant relationship between students' facial keypoint data while receiving instructions on how to complete a task and their subsequent performance on that task (p-value < 0.005).
These findings suggest that students' facial keypoint data can be utilized by a co-robot system to learn about state changes in students as they complete a task, and to provide interventions when patterns are discovered that have the potential to reduce students' learning or task performance. The outcomes of this paper contribute to advancing the National Academy of Engineering's Grand Challenge of personalized learning by demonstrating how students' facial expression data can be utilized effectively to advance human-co-robot interactions and improve the capability of co-robot systems to provide feedback that advances students' performance.
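The abstract's central statistical claim is that a facial keypoint feature differs between feedback conditions (p-value < 0.005). The sketch below illustrates what such a comparison can look like; the data, the feature name, and the choice of a permutation test are hypothetical stand-ins, not taken from the paper.

```python
import random
import statistics

def permutation_test(a, b, n_perm=5000, seed=0):
    """Two-sample permutation test on the absolute difference of means.

    Returns the fraction of label shufflings whose mean difference is at
    least as large as the observed one (an empirical p-value).
    """
    rng = random.Random(seed)
    observed = abs(statistics.fmean(a) - statistics.fmean(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.fmean(pooled[:len(a)]) -
                   statistics.fmean(pooled[len(a):]))
        if diff >= observed:
            count += 1
    return count / n_perm

# Synthetic stand-in for a facial keypoint feature (e.g., a normalized
# eyebrow displacement) recorded under two feedback conditions.
rng = random.Random(42)
feedback_a = [rng.gauss(0.50, 0.05) for _ in range(40)]  # condition A
feedback_b = [rng.gauss(0.56, 0.05) for _ in range(40)]  # condition B

p = permutation_test(feedback_a, feedback_b)
print(f"p-value = {p:.4f}")
```

A nonparametric test like this makes no normality assumption about the keypoint feature, which is convenient for noisy sensor-derived measurements; the paper itself does not specify which test the authors used.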

Original language: English (US)
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
Volume: 2017-June
ISSN: 2153-5965
State: Published - Jun 24 2017
Event: 124th ASEE Annual Conference and Exposition - Columbus, United States
Duration: Jun 25 2017 - Jun 28 2017


All Science Journal Classification (ASJC) codes

  • Engineering(all)

Cite this

Lopez, C. E., & Tucker, C. (2017). When to provide feedback? Exploring human-co-robot interactions in engineering environments. ASEE Annual Conference and Exposition, Conference Proceedings, 2017-June.