The JUSThink project aims to improve children’s computational thinking skills by exercising algorithmic reasoning with and through graphs, where graphs are posed as a way to represent, reason about, and solve a problem. It aims to foster children’s understanding of abstract graphs through a collaborative problem-solving task.
Concretely, children participating in the experiment interact, in teams, with a setup consisting of two input modalities (mice, touch screens, or tangible robots called Cellulos) and a humanoid robot such as a QTrobot or a NAO robot, in the presence of an observer. The learning activity lets the children solve an instance of the minimum spanning tree problem together with a robot that provides motivational support and guidance. For example, in the concrete scenario of a mining company that wants to connect its gold mines, the minimum spanning tree problem corresponds to connecting all the mines while spending as little as possible on building the roads.
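The mining scenario above maps directly onto the classic minimum spanning tree problem: pick the subset of roads that connects every mine at the lowest total cost. As a minimal sketch, here is Kruskal's algorithm on a small hypothetical mine network (the node names and edge costs are made up for illustration and are not taken from the actual activity):

```python
def find(parent, x):
    # Path-compressing find for the union-find structure.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def minimum_spanning_tree(nodes, edges):
    """Return (total_cost, chosen_edges) using Kruskal's algorithm.

    edges: iterable of (cost, u, v) tuples.
    """
    parent = {n: n for n in nodes}
    total, chosen = 0, []
    for cost, u, v in sorted(edges):          # cheapest roads first
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:                          # this road creates no cycle
            parent[ru] = rv
            total += cost
            chosen.append((u, v, cost))
    return total, chosen

# Hypothetical network of four gold mines and candidate roads.
mines = ["A", "B", "C", "D"]
roads = [(1, "A", "B"), (4, "A", "C"), (3, "B", "C"),
         (2, "B", "D"), (5, "C", "D")]
cost, tree = minimum_spanning_tree(mines, roads)
print(cost, tree)  # → 6 [('A', 'B', 1), ('B', 'D', 2), ('B', 'C', 3)]
```

The greedy rule, always take the cheapest road that does not close a cycle, is exactly the intuition the activity asks children to build while connecting the mines.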
We aim to use the data to model the interaction, and then use that model to adapt the robot’s behavior in real time in various research contexts, including but not limited to engagement and mutual modelling, in an educational setting to improve learning outcomes.
The latest setup described above focuses on solving an instance of the minimum spanning tree problem with touch screens as the input modality. In the initial phase of the project, however, a few experiments were conducted with a different activity, a path-planning scenario with Cellulo as the input modality. In that activity, children in pairs were asked to find the best path from a home to a destination, where the learning outcome is to understand the basic notion of cost. More details can be found in the publication titled “What Do Human-Robot Interaction Traces Tell Us About Learning?”.
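Finding the best path from a home to a destination on a weighted map is the shortest-path problem, and the "cost" the children reason about is the sum of the edge weights along a route. As a minimal sketch, here is Dijkstra's algorithm on a small hypothetical map (the place names and costs are invented for illustration, not taken from the actual activity):

```python
import heapq

def cheapest_path(graph, start, goal):
    """Dijkstra's shortest-path search on a weighted graph.

    graph: dict mapping node -> list of (neighbour, cost) pairs.
    Returns (total_cost, path), or (inf, []) if goal is unreachable.
    """
    queue = [(0, start, [start])]   # (cost so far, node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, step in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + step, neighbour, path + [neighbour]))
    return float("inf"), []

# Hypothetical map: two candidate routes from "home" to "school".
roads = {
    "home": [("park", 2), ("shop", 5)],
    "park": [("shop", 1), ("school", 6)],
    "shop": [("school", 2)],
}
print(cheapest_path(roads, "home", "school"))
# → (5, ['home', 'park', 'shop', 'school'])
```

The direct-looking route via the shop alone costs 7, while the detour through the park costs only 5, which is precisely the kind of counter-intuitive cost comparison the activity is built around.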
A Social Robot That Looks For Productive Engagement
2021. Robots for Learning workshop at the 16th Annual ACM/IEEE International Conference on Human-Robot Interaction, online conference, March 9-11, 2021.
What if Social Robots look for Productive Engagement?
International Journal of Social Robotics
When Positive Perception of the Robot Has No Effect on Learning
2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
2020-08-31. 29th IEEE International Conference on Robot and Human Interactive Communication (IEEE RO-MAN), virtual conference, Aug 31 – Sept 4, 2020.
DOI : 10.1109/RO-MAN47096.2020.9223343
Is There ‘ONE way’ of Learning? A Data-driven Approach
2020. 22nd ACM International Conference on Multimodal Interaction, virtual event, Netherlands, October 25-29, 2020.
You Tell, I Do, and We Swap until we Connect All the Gold Mines!
2020-01-01. Vol. 2020, num. 120, p. 22-23.
Robot Analytics: What Do Human-Robot Interaction Traces Tell Us About Learning?
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
2019-10-14. IEEE RO-MAN 2019 – The 28th IEEE International Conference on Robot & Human Interactive Communication, New Delhi, India, October 14-18, 2019.
DOI : 10.1109/RO-MAN46459.2019.8956465
Learning By Collaborative Teaching: An Engaging Multi-Party CoWriter Activity
2019. The 28th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN 2019), New Delhi, India, October 14-18, 2019.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 765955.