Research



Eye-gaze as a form of Human-Machine Communication

Principal Investigator: Rhonda McEwen. Co-Investigator: Michelle Lui, Postdoctoral Fellow

This study analyzes the role of the machine as a communicative partner for children with complex communication needs as they use eye-tracking technology to communicate. We ask: to what extent do eye-tracking devices serve as functional communication systems for children with complex communication needs? We followed 12 children with profound physical disabilities in a special education classroom over three months. An eye-tracking system was used to collect data from software that assisted the children in facial recognition, task identification, and vocabulary building. Results show that eye gaze served as a functional communication system for the majority of the children. We found voice affect to be a strong determinant of communicative success between students and both of their communicative partners: the teachers (humans) and the technologies (machines). Keywords: human-machine communication, interpersonal communication, eye-gaze communication, communicative partner, communication disability, special education, educational technology.



Virtual Reality, Sensors and Cognition

Principal Investigator: Rhonda McEwen. Co-Investigators: Michelle Lui, Postdoctoral Researcher, and Martha Mullaly, Associate Professor, Carleton University

Educators are recognizing the potential power of immersive virtual reality (IVR) to allow learners to experience first-hand phenomena that were previously intangible, such as atoms and molecules. In this study, an IVR simulation of a complex gene regulation system was co-designed with an undergraduate microbiology course instructor. The course, with 234 students, was taught using active learning strategies, including peer instruction and exposure to a two-dimensional computer simulation. A subset of 34 students from the course participated in an interactive IVR experience using head-mounted displays. We assessed students' conceptual understanding using tests, multimodal data collected during the IVR sessions (including video analysis in combination with physiological sensor data, eye-tracking data, and interaction logs), and semi-structured interviews. We found that students who were seated while in IVR demonstrated significantly higher conceptual understanding of gene regulation at the end of the course, and higher overall course outcomes, compared to students who experienced the course as originally designed (control). However, students who experienced IVR in a standing position performed similarly to the control group. In addition, learning gain appears to be influenced by a combination of the level of prior knowledge and how IVR is experienced (i.e., sitting vs. standing). Implications for the connections between sensorimotor systems and cognition in IVR are discussed. Keywords: virtual reality, embodied cognition, physiological sensing, multimodal analysis, simulation, molecular biology.



Robots and Communication

Principal Investigator: Rhonda McEwen. Co-Investigators: Zhao Zhao, Postdoctoral Researcher, and Yaxi Zhao, PhD Student

In an era in which robots take part in our daily living activities, trust in these robots becomes a crucial issue. Previous research on trust in automation and in human-computer interaction (HCI), fields in which trust has been studied more extensively, may provide insights and implications for trust in human-robot interaction (HRI). However, robots differ from automated machines in their embodiment and their ability to perform human-like behaviors (facial expressions, gestures, etc.). Views differ as to whether interpersonal trust can be applied to human-robot trust. Some research suggests that trust between humans is different from trust between humans and robots (Lee and See, 2004), whereas Madhavan and Wiegmann (2007) argue that individuals can have trust relationships with technology that are similar to human trust relationships, with the key difference being decision making. Thompson and Gillian (in Barnes & Jentsch, 2010, p. 78) suggest that drivers of interpersonal trust may also drive human-robot trust. Therefore, in this research we ask: "Do human-robot interactions follow a similar trust pattern as human-human interactions?"

Communicating during COVID-19 - China and the diaspora

Principal Investigator: Rhonda McEwen. Co-Investigators: Zhao Zhao, Yaxi Zhao & Tony Tang

After the 2019 novel coronavirus (COVID-19) was first detected in China in December 2019, it spread rapidly, threatening lives as well as social, economic, and emotional well-being in one of the world’s most densely populated countries. The virus has severely affected daily life around the world, but China felt the effects first. As the rest of the world watched, the Chinese government implemented social distancing, leading to a complete shutdown of the country’s main industries for more than two months. During this time, the country engaged in one of the most ambitious quarantine strategies in world history, with at least 760 million people confined largely to their homes. In contrast to normal daily life, people relied more heavily on technologies to stay connected to each other, to shop online, and to acquire information. We are fundamentally interested in how people use technologies during a global pandemic, with the China experience as a case study.
