RoomPathY is an intelligent VR environment that adapts itself to the user’s emotions. The name comes from the union of the words Room + Empathy. It works in two steps: first it evaluates the user’s mood with a few questions, then it changes the colors of the lights and the sounds of the VR environment according to the evaluated mood; during the session it also adjusts the lights’ intensity according to the user’s heartbeat. The application is developed with Unity 2017.2p3 and built for the Oculus Gear VR; to use it you also need a compatible smartphone and the Gear VR controller.
I chose to develop for the Gear VR for several reasons:
- A mobile VR application is something that many people can use easily, without the need to buy an expensive PC or HMD.
- Oculus offers many useful APIs that can also take advantage of the official Gear VR controller.
- I already have a Galaxy Note 8 with a Gear VR headset and controller, so I could test at any moment without needing more hardware.
In this application the user should sit in a comfortable place without moving, so I don’t have to deal with one of the biggest problems of mobile headsets: positional tracking and movement inside a room.
The frame rate and audio quality in the YouTube video are low because of streaming problems; in the app itself there is no lag and no audio quality issue.
Psychology behind RoomPathY
RoomPathY is based on psychological studies that aimed at understanding a person’s mood and improving it with specific light colors and sounds. There are many theories about emotion taxonomy and detection: I combined different aspects of them to create a foundation for RoomPathY. One of the most important and fascinating theories is the Wheel of Emotion.
The Wheel of Emotion, designed in 1980 by the American psychologist Robert Plutchik, visualizes eight main emotions arranged in opposite pairs:
- joy – sadness
- anger – fear
- anticipation – surprise
- trust – disgust
and derivative emotions, each composed of two basic ones. The basic emotions are represented in the middle circle, while the inner and outer circles display stronger and weaker versions of them. For RoomPathY I used a simplified two-dimensional model relying heavily on the detection of the first two pairs.
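The four opposite pairs above can be encoded as a simple symmetric lookup table. A minimal Python sketch (the data layout is my own illustration, not the app’s actual structure):

```python
# Plutchik's four opposite pairs from the Wheel of Emotion.
PAIRS = [
    ("joy", "sadness"),
    ("anger", "fear"),
    ("anticipation", "surprise"),
    ("trust", "disgust"),
]

# Build a symmetric lookup: each emotion maps to its opposite.
OPPOSITE = {}
for a, b in PAIRS:
    OPPOSITE[a] = b
    OPPOSITE[b] = a
```

With this table, finding the origin-symmetric emotion on the wheel is a single dictionary lookup.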
Regarding mood detection, I used two different methods: one conscious and one unconscious. The first is done with direct, explicit questions. For the second, I did some research and found a study in which experts assigned emotion values to abstract pictures. I mixed these two pieces of information to find the correct mood profile for the user.
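One plausible way to merge the conscious and unconscious signals is a weighted average per emotion. A hedged sketch, assuming both methods produce a score in [0, 1] per emotion (the function name and default weight are illustrative assumptions, not the app’s actual algorithm):

```python
def mix_scores(questionnaire, pictures, w_conscious=0.6):
    """Combine explicit questionnaire scores with picture-based scores.

    Both inputs map emotion name -> score in [0, 1]. The conscious
    (questionnaire) signal is weighted slightly higher by default.
    """
    w_unconscious = 1.0 - w_conscious
    return {
        emotion: w_conscious * questionnaire.get(emotion, 0.0)
                 + w_unconscious * pictures.get(emotion, 0.0)
        for emotion in set(questionnaire) | set(pictures)
    }
```

For example, `mix_scores({"joy": 1.0, "sadness": 0.0}, {"joy": 0.5, "sadness": 0.5})` blends a strong explicit “joy” answer with an ambivalent picture-based reading.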
A self-evaluation of how the user feels is essential and unavoidable for a successful emotion analysis. As a future improvement I will also add physiological data, such as ECG and GSR, to the evaluation, in order to have a more accurate mood analysis.
Once the mood is detected, the system has to pair it with the correct environment setup.
I crossed studies on neuroarchitecture and synesthesia to assign a color to every mood. For the sound, based on third-party statistical studies, I adapted an algorithm that analyzes musical spectra in relation to their influence on mood.
After the mood has been detected and profiled, RoomPathY acts in a symmetric way: for example, to soften a sad state it applies the environment setup corresponding to the origin-symmetric point on the emotion wheel, joy in this case. If the algorithm detects the same value for two emotions, it sets up an environment with the two light colors related to those emotions.
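The selection step described above (take the opposite of the dominant emotion, keep both targets on a tie) can be sketched as follows; the opposite table and function name are my own illustration:

```python
# Opposite pairs for the two couples the simplified model relies on.
OPPOSITE = {
    "joy": "sadness", "sadness": "joy",
    "anger": "fear", "fear": "anger",
}

def target_emotions(scores):
    """Return the emotion(s) whose environment setup should be applied.

    `scores` maps detected emotion -> value. The target is the opposite
    of the highest-scoring emotion; a tie yields two targets, and the
    room is then lit with both corresponding colors.
    """
    top = max(scores.values())
    dominant = [e for e, s in scores.items() if s == top]
    return [OPPOSITE[e] for e in dominant]
```

So a clearly sad user gets the “joy” setup, while an even sadness/fear split produces a two-color environment.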
The connection between mood and colors is not static. Since everyone can react differently to certain colors or sounds, depending on past memories or experiences, I added a feedback process that runs when the user wants to end a session. RoomPathY asks how they felt before and after the session; if the “after” value is lower than the “before” value, the user can change the lights’ colors through a color picker and the system saves the changes. The user can reset the color assignments whenever they want.
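The feedback logic amounts to a small per-user override store on top of the default mood/color mapping. A minimal sketch, assuming hex color strings and 1–10 feedback values (the class, default colors, and method names are all illustrative assumptions):

```python
# Illustrative default colors; the app's real mapping comes from the
# neuroarchitecture/synesthesia studies mentioned above.
DEFAULT_COLORS = {"joy": "#FFD700", "sadness": "#4169E1"}

class ColorPreferences:
    """Per-user color overrides, updated via end-of-session feedback."""

    def __init__(self):
        self.overrides = {}

    def color_for(self, emotion):
        return self.overrides.get(emotion, DEFAULT_COLORS[emotion])

    def feedback(self, emotion, before, after, picked_color):
        # Only save a user-picked color when the session did not help,
        # i.e. the "after" value is lower than the "before" value.
        if after < before:
            self.overrides[emotion] = picked_color
            return True
        return False

    def reset(self):
        """Restore the default color assignments."""
        self.overrides.clear()
```

A session that made the user feel worse triggers the color picker; `reset()` corresponds to the “reset the color assignments” option.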
I had already developed a “real”, physical version of RoomPathY, but three big problems stopped further development:
- The hardware was too expensive (Philips Hue lights + Raspberry Pi server + speakers for each bundle).
- The installation could be hard, because each room is different. Moreover, it wasn’t portable: it was limited to the room where it was installed.
- It was difficult to completely isolate a room from external sounds and lights, especially during the day.
So, I decided to build a virtual version of RoomPathY that can be used easily by everyone, anywhere, with only a smartphone, a headset and a controller, without the need for expensive ad hoc hardware.
In order to take advantage of all the features of Gear VR (especially for the controller), I had to learn and use the Oculus SDK.
Environment, motion and interaction
I’ve built a room environment with three variations of the interior decoration:
- Normal version, furnished with the kind of things I would put in my own room
- Christmas version
- Halloween version
Each version has different sounds and some different furniture. The sound differs only before the session starts; during the session, the sound is chosen by the mood detection algorithm.
I tried using multiple audio sources, but after some experimentation I decided it was better to use a single audio source covering the whole scene. The reason is that the application should be used with headphones for better immersion, so I wanted to recreate a similar effect and avoid sounds coming from different points, which could disorient the user and negatively influence the session.
The only other audio source in the scene is the fire sound coming from the fireplace.
Before starting the session, the user can choose whether to listen to an intro audio voice at the beginning of the session that helps them relax. It comes with a tutorial voice that explains the benefits the user can obtain by listening to it.
Regarding movement inside the room, I first tried using the touchpad of the Gear VR controller to move around, but I realized it was useless and could cause motion sickness. So I decided to use teleporting, just to let the user move around and get a better view of the different parts of the room. I added a line renderer projected from the controller, with a circle pointer icon at its end. When the user presses the trigger button, they are teleported to the circle pointer’s position.
I put a bed, a sofa and chair-like furniture in each version of the room. These are common objects the user may also have in the real environment during the session. Since the user should relax and not move during the session, they can teleport onto the object most similar to the one they are using in the real world (e.g. if they are sitting on a sofa, they can move onto the sofa in the virtual room too), for a better match between the virtual and real environments.
To answer the questions, start the session, change the room, etc., the user interacts with the TV screen in the room using the controller’s touchpad. When the session starts, the screen turns off; if the user wants to end the session, they can press the back button to see the feedback screen and finish.
As a plus, during the session RoomPathY takes the user’s heartbeat as input and changes the light intensity to follow the BPM. In this first version I don’t have the hardware to capture live data, so to show how it works I downloaded a sample HR recording.
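The BPM-to-intensity step can be as simple as a clamped linear mapping. A sketch under assumed ranges (the rest/max BPM bounds and intensity range are my own illustrative parameters, not the app’s tuned values):

```python
def intensity_from_bpm(bpm, bpm_rest=60.0, bpm_max=120.0, lo=0.3, hi=1.0):
    """Map a heart-rate sample (BPM) to a light intensity in [lo, hi].

    BPM values are clamped to [bpm_rest, bpm_max], then linearly
    rescaled, so a resting heart rate gives dim, calm lighting and a
    high heart rate gives full intensity.
    """
    bpm = max(bpm_rest, min(bpm_max, bpm))
    t = (bpm - bpm_rest) / (bpm_max - bpm_rest)
    return lo + t * (hi - lo)
```

Feeding the downloaded sample HR trace through this function frame by frame is enough to demo the effect without live sensor hardware.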
Conclusions and future works
This is only a first version with a lot of room for improvement, but it can already be a nice tool for people to relax and feel better, just by using it and spending some time in this virtual environment.
Hardware is expensive, but thanks to virtual reality it can sometimes be emulated to obtain the same effect (or a better one, in this case, also thanks to isolation, portability, replication, etc.) at zero additional cost. That’s what I did with RoomPathY: I remade a project designed for the real world in a virtual one, and improved it a lot in the process!
But that’s not all; these are the future steps I want to focus on:
- To evaluate the mood, I want to use a better, more accurate algorithm that also relies on physiological parameters such as ECG and GSR. According to some studies, each emotion can be represented in a valence/arousal Cartesian plot. My objective is to research accurate ways of evaluating both the arousal and valence values and then find the emotion corresponding to them.
- I want to detect more moods, using all the emotions present in the Wheel of Emotion.
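The valence/arousal idea above can be sketched as a nearest-anchor lookup: place each emotion at a point on the plane and classify a measurement by its closest anchor. The coordinates below are illustrative assumptions, not validated values from the literature:

```python
import math

# Assumed anchor points on the valence/arousal plane, both axes in [-1, 1].
ANCHORS = {
    "joy": (0.8, 0.5),
    "sadness": (-0.8, -0.5),
    "anger": (-0.6, 0.7),
    "fear": (-0.7, 0.6),
}

def classify(valence, arousal):
    """Return the emotion whose anchor is closest to the given point."""
    return min(ANCHORS, key=lambda e: math.dist(ANCHORS[e], (valence, arousal)))
```

A real implementation would need anchor coordinates taken from actual valence/arousal studies, plus the ECG/GSR-to-valence/arousal estimation itself.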
How to build
To try the experience from the Unity Editor (instead of the Gear VR build):
- In EventSystem, toggle on the Standalone Input Module and off the OVR Input Module.
- In both IntroCanvas and MenuCanvas toggle on Graphic Raycaster and off OVR Raycaster.
- In OVRCameraRig toggle on the “In PC” value in the MyPlayerController Component.
- In SessionController toggle on the “In PC” value in the SessionController Component.
In “PC mode”, move and rotate with WASD (or the arrow keys), press space to activate/deactivate the menu canvas during the session, and use the mouse to interact with the canvas menus.
In “Gear VR mode”, teleport to the position of the ring with the trigger, press the back button to activate/deactivate the menu canvas during the session, and use the touchpad to interact with the canvas menus.
The Android apk can be downloaded here.
The source code is available here.
Sounds: YouTube for the intro scene, royalty-free sounds for the session.