Multimodal interaction in a collaborative virtual environment for medical training


Introduction

The following user research case was part of my internship at the IBISC laboratory in Paris-Saclay during my master's degree. This research was not teamwork; everything I describe here is entirely my own work. The work was supervised by Amine Chellali, Associate Professor.

The purpose of this work was to explore multimodal collaboration techniques to improve the teaching of technical medical skills. The methodology followed a user-centred design approach, with local anaesthesia chosen as a case study. The work was a journey that can be divided into five main phases: literature review, observation, interaction design, prototype development, and evaluation. The internship work was part of a broader project called Show-Me.

 

[Image: Surgery interaction]

What is the project Show-Me?

Before discussing my research work, it is useful to describe the main project behind it. The Show-Me project aims to design innovative multimodal and collaborative interaction techniques and user interfaces that allow expert mentors to demonstrate their skills, supervise, and guide a mentee through a VR training environment, in order to better transfer technical skills in healthcare. More specifically, the project seeks to determine the impact of interactive and multimodal collaborative training in Virtual Reality on the transfer of technical skills in healthcare. It is based on the hypothesis that multimodal mentee-mentor interactions inside a shared virtual environment will improve technical skills transfer and the trainee's learning experience compared to traditional teaching methods. The research project is funded by the "Agence Nationale de la Recherche" (ANR) JCJC program.

[Image: Surgery interaction, part 2]

Problem Statement


The project was born from the observation that medical errors are a critical issue in the healthcare system. In France, for example, around 10,000 deaths per year are due to human error. The number is considerably larger in the USA, where around 440,000 deaths per year are attributed to human error.

Currently, medical students are trained during hospital internships supervised by experts; the learning method is therefore mentoring-based. Mentoring is known to be an effective learning technique, but it means students face a real patient without any prior experience. Given the statistics on human error, there is a need for medical simulation that supports mentoring. Starting from this need, some studies have identified multimodal collaboration in a virtual environment as a possible solution. My research aims to investigate this still insufficiently explored field.

 

 


What was the goal of my work?

When I started my work, the project was still in an early phase, and my internship goal was not clearly defined. I therefore focused the first part of my internship on exploring the current learner-teacher approach in a specific field of medical education: anaesthesia. In this phase, I carried out observations in hospitals, which allowed me to define the objective of my research, namely investigating modalities for teaching distance information in a collaborative virtual environment. To conduct the research, I designed and developed a virtual reality prototype. My investigation aims to provide insights for the broader project, whose final goal is a simulator that students and teachers can use to transfer medical skills in a collaborative virtual environment.

 


Scope and constraints


During the internship, I had to face some issues. I initially planned to start observations in the hospital in March 2021 and finish in June 2021. Unfortunately, I was able to carry out only two observations in March; then, for two months, pandemic restrictions prevented me from going to the hospital. I did two more observations in June but then had to stop due to time constraints: my master's thesis was due in August, which left no time to continue the observations. In addition, local anaesthesia involves two kinds of spatial movement: distance and orientation. The haptic device the IBISC laboratory could provide supported only distance variations, so my project was limited to investigating that part.

 


Work organization

The first step was gaining a better understanding of the context. For that, a literature review was done. It can be divided into two main sections:

• Medical education: it touches upon the current teaching process, explaining the role of simulation, teachers as coaches, and a hint of the recent multimodal interactions in this field.

• Multimodal collaborative interaction for learning: this review was approached in a top-down manner, starting from a broad understanding of multimodal interaction in human-computer interaction and then defining more specifically the input and output possibilities. The design and evaluation of multimodal interaction approaches were also explored. With that understanding in place, it was possible to integrate a collaborative component: after a more general review of collaboration, more focused research on collaborative virtual environments (CVE) was performed. Finally, the central topic of the review is multimodal collaborative interaction for learning.

Then, I focused my research on understanding the user. For that, I carried out observations, and I was surprised by how many insights they produced: enough to define a Hierarchical Task Analysis (HTA) and the research question. Once I had a research question, I designed and developed the environment to investigate it, then performed experiments with users and analyzed the data obtained. The next sections give a short summary of the project; if you want to explore it in more detail, I will be glad to discuss it.

 

[Images: Anesthesia simulation; Device for anesthesia simulation]

Observations

Four observations were performed to get insights into current teaching simulations and the multimodal interaction between expert doctors and students. The first two visits were at LabForSims, a medical simulation centre at the Paris-Saclay Faculty of Medicine. The other two were at the Centre Hospitalier Sud Francilien (CHSF) in Évry - Corbeil-Essonnes. During the simulations, I observed from the same room without interfering with the sessions, and at the end of each session I could interact with teachers and students to ask questions and clarifications. The needle insertions were performed using a blue phantom block model, which can be seen in the images above. Thanks to these observations, I was able to establish a hierarchical task analysis of needle insertion (click here to see more).

From these insights I extrapolated a research question: "Which modality among voice, gestural and haptic is the most effective to teach spatial information in a VR teaching environment?". To answer this question, a first prototype of a teaching CVE was designed and developed, and a user study was conducted.


Design of the prototype

A first prototype was developed to compare the effectiveness of spatial information teaching using virtual reality and a multimodal experience. To simulate spatial information teaching, a new task was defined. Since the objective was to understand the use of communication modalities for teaching rather than to simulate a real anaesthesia procedure, the task was simplified: it consisted of picking up a 3D sphere from a starting point and dropping it at a target position using a virtual tool. The participant received indications concerning the target position through three different modalities: vocal, gestural, and haptic. This task mimics the manipulation of a needle with a mentor guiding a mentee, and the sphere positions mimic the different structures the learner must reach during needle manipulation. The main research hypothesis was that the communication modality used to guide the learner would have an impact on user performance and user experience.

The prototype was developed in Unity3D, which permits the implementation of a multimodal virtual experience and is compatible with the different tools (haptic device, VR headset, etc.) needed to address the research question. The application consisted of five different scenes. All scenes contained two colourful plane surfaces, one horizontal yellow and one vertical red, which determined the range of movements users were allowed to make with the haptic device.
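As an illustration, the movement restriction imposed by the two planes can be sketched as a simple clamp on the virtual tool position. This is a minimal sketch in Python; the plane positions and axis conventions are my assumptions, and the actual prototype was implemented in Unity3D.

```python
def clamp_to_workspace(pos, floor_y=0.0, wall_z=0.0):
    """Keep the virtual tool above the horizontal (yellow) plane and on
    one side of the vertical (red) plane. Plane positions are assumed."""
    x, y, z = pos
    return (x, max(y, floor_y), max(z, wall_z))
```

Applying such a clamp every frame keeps the haptic cursor inside the allowed workspace regardless of how far the physical arm moves.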

[Image: Example of experiment]

Experimental design

The study was planned to evaluate which modality among voice, gestural, and haptic was the most effective for spatial guidance in a VE. Hence, participants performed the user tasks in the VE prototype, with twelve trials for each modality. A trial is composed of (1) one instruction given through the current modality and (2) the performance of the manipulation task by the user following the received instruction. The participants received four movement instructions along each axis (X, Y, Z). The user had to follow the instruction, focusing on the amplitude of the movement instead of the starting and ending positions. Indeed, the starting and ending positions changed for each trial; this decision was made to prevent users from basing their actions on the ending point rather than on the instruction.
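The trial structure above (twelve trials per modality, four amplitude instructions per axis, randomized start positions) can be sketched as follows. The amplitude values, units, and workspace bounds are hypothetical; only the counts come from the study design.

```python
import random

AXES = ("X", "Y", "Z")

def make_trials(modality, amplitudes=(2.0, 4.0, 6.0, 8.0), seed=0):
    """Build the 12 trials for one modality: four amplitude instructions
    per axis, each with a randomized starting position so that users must
    follow the instructed amplitude rather than aim at a fixed end point.
    Amplitude values and start-position bounds are illustrative."""
    rng = random.Random(seed)
    trials = []
    for axis in AXES:
        for amplitude in amplitudes:
            trials.append({
                "modality": modality,
                "axis": axis,
                "amplitude_cm": amplitude,
                "start": tuple(rng.uniform(-5.0, 5.0) for _ in range(3)),
            })
    rng.shuffle(trials)
    return trials
```

Shuffling the trial list prevents participants from anticipating which axis or amplitude comes next.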

To answer the research question, both objective and subjective measurements of the user experience were used. Users' performance was monitored automatically during the test: each time the user acted, the application saved several data points to a CSV file. For the subjective measurements, I mostly used questionnaires.
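The automatic logging can be sketched as a small append-only CSV writer. The field names below are my assumptions for illustration; the source only states that several data points were saved per user action.

```python
import csv
import os

# Hypothetical field names; the actual logged variables are not listed in the text.
FIELDS = ["participant", "modality", "trial", "axis",
          "instructed_amplitude", "achieved_amplitude", "duration_s"]

def log_trial(path, record):
    """Append one trial record to a CSV file, writing the header row
    first if the file does not exist yet."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(record)
```

Appending one row per action keeps the data safe even if the application crashes mid-session.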

 

[Image: Haptic device interaction in a real environment]

Participants


In total, 21 users took part in the experiment. The participants were recruited at the university and included interns and PhD students from other laboratories, university staff, and others. All participants were French or English speakers.


Experimental conditions

The experiment followed a within-subject design including one factor (instruction modality) with three conditions: voice, gesture, and haptics. A Latin square was used to counterbalance the presentation order of the experimental conditions, to prevent learning effects between modalities from affecting the results. The three conditions relied on different human senses to transfer the instructions. In the gestural condition, the instruction was performed by a virtual hand in the prototype.

The voice instruction was given by the experimenter. In the haptic condition, the haptic arm was blocked until the user was ready; the experimenter then said "go", and the haptic arm started to move automatically.
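The Latin square counterbalancing mentioned above can be sketched with a simple cyclic square, where each condition appears once in every row and every column. This is an illustrative sketch; a cyclic square balances the position of each condition but not immediate carry-over effects, for which all six permutations of three conditions could be used instead.

```python
def latin_square_orders(conditions):
    """Cyclic Latin square: each condition appears exactly once per row
    and once per column, giving k presentation orders for k conditions."""
    k = len(conditions)
    return [[conditions[(row + col) % k] for col in range(k)]
            for row in range(k)]

def order_for_participant(participant_index, conditions):
    """Assign presentation orders to participants in rotation."""
    square = latin_square_orders(conditions)
    return square[participant_index % len(conditions)]
```

With 21 participants and 3 conditions, each of the three orders is used exactly seven times.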

 


Analysis


The teaching experience is composed of technical and social aspects, which are equally important in a learning environment. Hence, the experiments aim to analyse both user performance and users' feelings under the different modalities.

In this project, I used the SPSS software for the first time. It was useful for analysing the data collected in the experiment. The data were analyzed using the most suitable methods for each type of data collected. I had the opportunity to use different tests, in detail:

  • ANOVA test
  • Pairwise comparisons with Bonferroni correction
  • Non-parametric Friedman test
  • Wilcoxon signed-rank test with Bonferroni-adjusted significance levels
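To illustrate one of the tests above: for a within-subject design with n participants and k conditions, the Friedman statistic ranks each participant's scores across conditions and compares the rank sums. The actual analysis was done in SPSS; this is a minimal pure-Python sketch that assumes no tied values within a participant (ties would require a correction factor).

```python
def friedman_statistic(data):
    """Friedman chi-square for n subjects x k conditions.
    data: one list of k condition values per subject. Assumes no ties."""
    n = len(data)
    k = len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        # rank the k condition values within this subject (1 = smallest)
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 * sum(r * r for r in rank_sums) / (n * k * (k + 1))
            - 3.0 * n * (k + 1))
```

The resulting statistic is compared against a chi-square distribution with k - 1 degrees of freedom; significant results are then followed up with pairwise Wilcoxon signed-rank tests.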

Results and Future Work

The experiment gave different results across the objective and subjective measurements. The study revealed various insights about the components of a collaborative environment that can be useful for future work. In particular, it suggests that a multimodal experience could improve the learning experience, which confirms previous studies presented in the literature review. This opens up the opportunity to conduct further experiments with instructions communicated through a combination of modalities, instead of a single modality at a time as in this experiment. Besides, this investigation focused on teaching distance estimation, but needle insertion movements mix distance precision and orientation. Although needle orientation was not explored in this work, it is an integral part of the needle insertion experience, and with a different teaching focus, such as needle orientation, another modality might prove more suitable. For this reason, future experiments should take orientation into consideration.


What did I learn? What can I reuse?

These six months put me to the test. It was my first time approaching HCI research, and I did it without a team. I learned from zero how to do literature reviews and organize the papers I read. I had the opportunity to read plenty of interesting papers that gave me deep knowledge of multimodal interaction, knowledge I can reuse in my next projects. The observations in the hospital were challenging: I had to understand a complex context like medical education, and doing so improved my ability to adapt to different contexts. During the internship, I also improved my skills in designing and developing virtual environments in Unity3D, and I had the opportunity to work with new interaction devices such as the Geomagic Touch haptic device.

Besides, I learned how to analyse data with SPSS and improved my project management skills.


Let's break the wall that divides your product from success together!