11/23/2022
By Darlene Barker

The Richard A. Miner School of Computer & Information Sciences invites you to attend a doctoral dissertation proposal defense by Darlene Barker on "Creation of Thelxinoë: Emotions and Touch in Virtual Reality."

Ph.D. Candidate: Darlene Barker
Date: Thursday, Dec. 8, 2022
Time: 2 p.m.
Location: Via Zoom

Committee Members:

  • Haim Levkowitz (advisor), Department Chair, Department of Computer Science
  • Tingjian Ge (member), Professor, Department of Computer Science
  • Ashleigh Hillier (member), Professor, Department of Psychology; Co-Director of the Center for Autism Research and Education (CARE)

Abstract:
This work creates a framework for touch in virtual reality (VR), combining the processing of emotions from multiple sensors with the generation of touch sensations. To accomplish this, we use a collection of shallow machine learning and deep learning models to process aggregated emotion data from brain waves, physiological changes, pupillometry, facial recognition, speech, and background effects, recreating a person's emotional state in VR. The purpose of this work is to enrich our experience of the world, whether in real life or in VR, by letting us connect with our environment. Emotion and touch are part of the glue that enhances in-person communication; adding both to the virtual world pulls more of the real world into the experience and can only enhance it. Touch is the first sense we experience at birth, yet it has been simulated in VR mainly through virtual tools and haptic devices that manipulate the world, leaving the whole body out of the equation. Today, sight and hearing are used in diverse ways to compensate for the missing sense of touch and to convince us that we are touching. Our framework works toward adding emotion and sensation to the VR experience, bringing the human body back into the mix, where it can give a much stronger and more immersive sense of presence.

The contribution of this work is Thelxinoë, a multisensory collection framework that gathers emotion data from sensors including brain waves, facial expression, pupillometry, speech, and body movement; aggregates that data within a black box whose output matches the emotions at the time of collection; and uses that output to facilitate touch between the parties by generating sensations based on the initiated touch.
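The abstract describes the aggregation step only as a black box, so the sketch below is purely illustrative: a minimal late-fusion scheme in Python in which each sensor-specific model emits a probability distribution over a shared set of emotion labels, and a weighted average combines them into a single estimated state. The sensor names, label set, weights, and the fuse_emotions function are all assumptions made for illustration and are not taken from the dissertation.

    # Hypothetical sketch of multi-sensor emotion fusion; not the dissertation's code.
    # Each per-sensor model emits a probability distribution over a shared emotion
    # label set; a weighted average ("late fusion") aggregates them into one state.

    import numpy as np

    EMOTIONS = ["neutral", "happy", "sad", "angry", "fearful"]  # assumed label set

    def fuse_emotions(sensor_probs: dict[str, np.ndarray],
                      weights: dict[str, float]) -> tuple[str, np.ndarray]:
        """Aggregate per-sensor emotion distributions into a single estimate.

        sensor_probs maps a sensor name (e.g. "eeg", "pupillometry", "speech")
        to a probability vector over EMOTIONS; weights reflect how much each
        sensor is trusted. Both the sensors and the weights are illustrative.
        """
        fused = np.zeros(len(EMOTIONS))
        total = 0.0
        for name, probs in sensor_probs.items():
            w = weights.get(name, 1.0)
            fused += w * np.asarray(probs)
            total += w
        fused /= total  # renormalize so the result is again a distribution
        return EMOTIONS[int(fused.argmax())], fused

    if __name__ == "__main__":
        # Mock per-sensor outputs standing in for real EEG, pupillometry,
        # facial-expression, and speech classifiers.
        readings = {
            "eeg":          [0.10, 0.60, 0.10, 0.10, 0.10],
            "pupillometry": [0.20, 0.50, 0.10, 0.10, 0.10],
            "face":         [0.05, 0.70, 0.10, 0.10, 0.05],
            "speech":       [0.30, 0.40, 0.10, 0.10, 0.10],
        }
        weights = {"eeg": 2.0, "face": 1.5, "pupillometry": 1.0, "speech": 1.0}
        label, dist = fuse_emotions(
            {k: np.array(v) for k, v in readings.items()}, weights)
        print(label, np.round(dist, 3))

In a fuller pipeline of the kind the abstract outlines, the per-sensor distributions would come from trained shallow and deep models over the brain-wave, pupillometry, facial, and speech data, and the fused estimate would drive the touch-sensation generation stage.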