Looxid Labs Development Kit is the world’s first VR headset embedded with miniaturized eye and brain sensors. The Development Kit provides information about where users have looked and how their brains are activated in VR, enabling early adopters and developers to create unique and incredible experiences for VR users.
Brain Wave + Eye Information + Open SDK + Emotion Analysis
Looxid Labs Research Kit is a mobile VR headset embedded with a unique sensor module that measures users’ brainwaves, eye movements, and pupil information. Once users wear our Research Kit, researchers and scientists can analyze the users’ emotional status in reaction to VR content using our API. Our Research Kit helps researchers and scientists acquire robust bio-signals from users, including brainwaves and pupil information, reveal users’ genuine emotions from those bio-signals, and readily classify the signals into different emotional states with machine learning algorithms.
Open API & SDK for Developers.
Acquire robust bio-signals.
Independent Component Analysis based Algorithm.
Eliminate unwanted noise from bio-signals (see the sketch after this list).
Machine Learning Algorithm.
Accurately classify a person’s emotional states.
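As a rough illustration of the ICA-based denoising idea, here is a minimal sketch using scikit-learn’s FastICA on simulated two-channel data. It shows the general technique only; the simulated signals and the artifact-selection heuristic are illustrative assumptions, not Looxid Labs’ proprietary algorithm.

```python
# Generic illustration of ICA-based artifact removal for multichannel
# bio-signals (e.g., EEG). A sketch of the general technique only,
# not Looxid Labs' algorithm. Requires NumPy and scikit-learn.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Simulated sources: a 10 Hz "brain" rhythm and an eye-blink-like artifact.
brain = np.sin(2 * np.pi * 10 * t)
blink = (np.abs(np.sin(2 * np.pi * 0.5 * t)) > 0.99).astype(float)
S = np.c_[brain, blink]

# Mix the sources into two "electrode" channels and add sensor noise.
A = np.array([[1.0, 0.6], [0.8, 1.0]])
X = S @ A.T + 0.05 * rng.standard_normal((2000, 2))

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)  # unmix into independent components

# Zero out the artifact component (crude peak-amplitude heuristic),
# then reconstruct the cleaned electrode channels.
artifact_idx = np.argmax(np.abs(components).max(axis=0))
components[:, artifact_idx] = 0.0
cleaned = ica.inverse_transform(components)
print(cleaned.shape)  # (2000, 2): artifact-suppressed channels
```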
Create Interconnected Reality
through Users’ Emotional Interaction in VR
Looxid Labs’ ultimate emotion recognition system will introduce human emotions into the virtual environment and enable users to emotionally engage with virtual characters that are aware of the users’ emotional states. This will be a significant leap forward in improving users’ VR immersion and sense of presence.
What Happens When Artificial Intelligence Can Read Our Emotion in Virtual Reality
Being surrounded by machines that understand our emotions is one of many ‘what ifs’ that is kind of creepy to even think about. Don’t be surprised, though: owing to technological advances, we will get to that future sooner or later. But how?
How does a machine ‘sense’ our emotion?
At Apple’s September keynote, the iPhone X showed off its slick design to the world for the first time, and iPhone lovers couldn’t help but shout “hooray!” with enthusiasm. What unexpectedly caught people’s eyes, among other features, was Animoji, a dozen different animal emojis that mirror users’ facial expressions and can be shared with others. Animoji seems interesting for sure, but what does it really mean for our communication in a digital world?
Nowadays, an overwhelming amount of human-to-human communication happens every second via different digital platforms, but it is quite often void of the essence of human nature: emotion. To facilitate machine-mediated communication, many tech giants are spending a great deal of time and effort on finding the right sensors to empower digital machines to interpret our emotions. At least for smartphones, since we take pictures and talk on the phone on a daily basis, it comes naturally to engineers to use the camera (facial recognition) and microphone (virtual assistants such as Siri, Google Assistant, or Amazon Alexa) to ‘sense’ our emotions.
What about in VR?
Social Virtual Reality (VR) is an emerging digital platform that offers a virtual space where people, through their avatars, can interact with others. But how do we add an emotional texture to VR? That brings us to the Massachusetts Institute of Technology (MIT) Media Lab.
MIT Media Lab decided to add an extra layer of emotional skin to a virtual avatar. The researchers created an ‘emotional beast’ in VR that changes its appearance in response to a user’s emotional state. In order to detect a user’s emotion in VR, the team integrated a physiological sensing module into the mask of a VR headset, including electrodes for galvanic skin response (GSR) data collection and photoplethysmogram (PPG) sensors for heart rate data collection. GSR data reflects a user’s emotional arousal, but it is not enough to determine whether the arousal is positive or negative. A PPG sensor, which uses light to track the rate of blood flow and gauge a user’s anxiety and stress levels (negative arousal), is therefore needed to complement the GSR data. Basically, these physiological sensors act as a medium for emotion recognition, just as the camera and microphone do in smartphones.
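To make the sensor fusion concrete, here is a minimal sketch of how normalized GSR (arousal) and PPG-derived stress (a proxy for negative valence) readings could be combined into a coarse emotional state. The thresholds, labels, and function names are illustrative assumptions, not the MIT team’s actual pipeline.

```python
# Minimal sketch of fusing GSR (arousal) and PPG-derived stress
# (negative valence) into a coarse emotional state. Thresholds and
# labels are illustrative assumptions, not the researchers' code.

def classify_emotion(gsr_arousal: float, ppg_stress: float) -> str:
    """Map normalized readings in [0, 1] to an emotion quadrant.

    gsr_arousal: skin-conductance level, higher = more aroused
    ppg_stress:  heart-rate-derived stress, higher = more negative
    """
    aroused = gsr_arousal >= 0.5
    negative = ppg_stress >= 0.5
    if aroused and not negative:
        return "happy"    # high arousal, positive valence
    if aroused and negative:
        return "angry"    # high arousal, negative valence
    if not aroused and not negative:
        return "neutral"  # low arousal, positive valence
    return "sad"          # low arousal, negative valence

print(classify_emotion(0.8, 0.2))  # -> "happy"
```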
The researchers crafted two types of ‘emotional beasts’: fur-based and particle-based.
The fur-based ‘emotional beast’ can contract and grow its fur to visually express a user’s happiness. Based on Lang’s model, the team evaluated the four emotional states on a scale of 0 to 1. The fur-based beast grows its fur to full length if the evaluated emotion is ‘happy’, whereas the fur stays within the inner skin, resulting in a smooth outer skin, if the emotion is evaluated as ‘neutral’.
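As a toy illustration of this mapping (the maximum fur length and the linear scaling are assumptions; the description above only specifies the 0-to-1 emotion scale):

```python
# Illustrative sketch of the fur mapping: a happiness score in [0, 1]
# scales fur length from fully retracted (smooth skin) to full length.
# MAX_FUR_LENGTH and the linear mapping are assumptions.

MAX_FUR_LENGTH = 1.0  # full fur length in arbitrary scene units

def fur_length(happiness: float) -> float:
    """Linearly map a happiness score in [0, 1] to fur length."""
    happiness = max(0.0, min(1.0, happiness))  # clamp to valid range
    return happiness * MAX_FUR_LENGTH

print(fur_length(1.0))  # 'happy'   -> 1.0 (fur at full length)
print(fur_length(0.0))  # 'neutral' -> 0.0 (fur stays within the skin)
```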
The particle-based ‘emotional beast’, on the other hand, takes two variables into account: brightness and color. The user’s arousal level is estimated on a scale of 0 to 1. At a high arousal level the particles illuminate, while at a neutral state they are almost invisible. In a similar manner, a user can express frustration and anger to other avatars as the color of the particles shifts from blue to red.
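A similarly minimal sketch of the particle mapping, where arousal drives brightness and frustration blends the color from blue toward red; the linear blends are assumptions:

```python
# Illustrative sketch of the particle mapping: arousal in [0, 1] drives
# brightness, and frustration/anger blends particle color from blue
# toward red. The linear blends are assumptions.

def particle_appearance(arousal: float, frustration: float):
    """Return (brightness, (r, g, b)) for the particle cloud."""
    arousal = max(0.0, min(1.0, arousal))
    frustration = max(0.0, min(1.0, frustration))
    brightness = arousal  # near-invisible at neutral, luminous when aroused
    blue = (0.0, 0.0, 1.0)
    red = (1.0, 0.0, 0.0)
    color = tuple(b + (r - b) * frustration for b, r in zip(blue, red))
    return brightness, color

print(particle_appearance(0.9, 0.8))  # bright, mostly red: agitated user
```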
Indeed, MIT Media Lab has crafted visually scintillating artwork. These colorful and vibrant creatures enable users to express their emotions in the most vivid way possible, elevating a surface-level VR experience into an emotional human-to-human interaction (see the video here).
How Can Emotion AI Revolutionize VR?
Yet what works behind the ‘emotional beast’ is a machine learning algorithm. The researchers let the system learn from the physiological data sets and predict a person’s emotional states. Without this process, GSR and PPG data are just a bunch of numbers that tell us nothing. In fact, any system that detects emotion from user-provided data necessarily entails a machine learning process.
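As a rough illustration of what such a learning step might look like, here is a generic supervised pipeline: physiological features in, emotional-state labels out. The feature layout, labels, and classifier choice are assumptions, not the researchers’ actual setup.

```python
# Minimal sketch of the kind of supervised pipeline described above.
# Features, labels, and classifier are assumptions. Requires scikit-learn.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Each row: [mean GSR, GSR peak rate, mean heart rate, heart-rate variability]
X = np.random.rand(200, 4)  # placeholder sensor features
y = np.random.choice(["happy", "angry", "neutral", "sad"], size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)

clf = RandomForestClassifier(n_estimators=100)
clf.fit(X_train, y_train)  # learn the feature -> emotion mapping
print("held-out accuracy:", clf.score(X_test, y_test))
```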
Although the “emotional beast” project has successfully portrayed how emotion detection technology can be used in VR, human-to-human communication within VR may become of little interest to us once Artificial Intelligence (AI) comes into play, because VR coupled with Emotion AI will eventually touch every part of our lives and bring up so many ‘what ifs’.
“What if AI can gauge your preference towards all the products you’ve seen in a virtual shopping mall and then suggest a purchase list of the preferred products or even automatically purchase them for you?”
“What if AI can measure the concentration and excitement level of a middle school student listening to a lecture in VR and come up with a customized curriculum specifically for that student?”
These ‘what if’ scenarios of AI reading our emotions will no longer remain a creepy pipe dream.
Looxid Labs is a tech startup developing the world’s first technology that seamlessly integrates an emotion recognition system with VR (virtual reality) using an eye-brain interface, contributing to a completely new VR interaction technique.
5F Tipstown, 165, Yeoksam-ro, Gangnam-gu, Seoul 06247
9F Daejeon CCEI, KAIST Yuseong-gu, Daejeon 34141