Explore User Emotion in VR

Make a Real Impact at Every User Touchpoint in VR

Unlock New Levels of VR Experience
by Extending Users’ Minds

Developer KIT

Brain Wave + Eye Information + Open SDK

Looxid Labs Development Kit is the world’s first VR headset embedded with miniaturized eye and brain sensors. The Development Kit provides information about where users look in VR and how their brains are activated, enabling early adopters and developers to create unique and incredible experiences for VR users.

LEARN MORE

Research KIT

Brain Wave + Eye Information + Open SDK + Emotion Analysis

Looxid Labs Research Kit is a mobile VR headset embedded with a unique sensor module that measures users’ brainwaves, eye movements, and pupil information. Once users wear our Research Kit, researchers and scientists can analyze their emotional state in reaction to VR content using our API. The Research Kit helps researchers and scientists acquire robust bio-signals from users, including brainwaves and pupil information, reveal users’ genuine emotions from those bio-signals, and readily classify the signals into different emotional states with machine learning algorithms.

LEARN MORE

Coming Soon

Accessory KIT

Additional Mount for VR Devices

Next Year

Delve into Our Technical Excellence

Data Acquisition

Open API & SDK for Developers.
Acquire robust bio-signals.

Bio-signal Processing

Independent Component Analysis-based Algorithm.
Eliminate unwanted noise from bio-signals (see the sketch below).

Algorithm

Machine Learning Algorithm.
Accurately classify a person’s emotional states.
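
As a rough illustration of the bio-signal processing step, here is a minimal Python sketch of ICA-based denoising on synthetic multi-channel EEG using scikit-learn’s FastICA. The channel count, sampling rate, and kurtosis-based artifact heuristic are assumptions made for the example, not Looxid Labs’ actual algorithm.

    # Illustrative sketch only: ICA-based denoising of synthetic EEG.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n_channels, n_samples = 6, 2560                      # e.g. 6 channels, 10 s at 256 Hz
    true_sources = rng.laplace(size=(n_samples, n_channels))
    true_sources[:, 0] += 30.0 * (rng.random(n_samples) < 0.01)   # blink-like spikes
    eeg = true_sources @ rng.standard_normal((n_channels, n_channels))  # stand-in recording

    # 1) Unmix the recording into statistically independent components.
    ica = FastICA(n_components=n_channels, random_state=0)
    components = ica.fit_transform(eeg)                  # shape: (n_samples, n_components)

    # 2) Zero out spiky, blink-like components; excess kurtosis above a fixed
    #    threshold is a crude stand-in for a real artifact detector.
    centered = components - components.mean(axis=0)
    kurtosis = (centered**4).mean(axis=0) / (centered**2).mean(axis=0) ** 2 - 3.0
    components[:, kurtosis > 5.0] = 0.0

    # 3) Reconstruct the cleaned multi-channel signal.
    eeg_clean = ica.inverse_transform(components)
    print(eeg.shape, eeg_clean.shape)                    # both (2560, 6)

In practice, the unmixing matrix would typically be fit once and then applied to later windows of the stream, so the denoising step adds little latency at run time.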

Potential Uses

Create Interconnected Reality
through Users’ Emotional Interaction in VR

Looxid Labs’ ultimate emotion recognition system will introduce human emotions into the virtual environment and enable users to emotionally engage with a virtual character that knows their emotional states. This will be a significant leap forward in improving users’ immersion and sense of presence in VR.

NEWS

NEWS
2017-12-15

The Oculus Education team recently announced that it will collaborate with Cornell, MIT, Yale, and several other research institutions to find ways to measure students’ learning outcomes and improve learning efficiency in VR environments.

VR is drawing attention as a new educational platform because it enhances the experiential quality of education, but quantitatively assessing students’ learning in a VR environment is not easy.

How can we quantitatively evaluate students’ cognitive abilities in VR and improve their learning outcomes?

Find the answer to this question in our latest Medium post.

#VR #EducationTechnology #Engagement #Oculus #EyeTracking #EEG

Source: https://goo.gl/s8Koos


Since Mark Zuckerberg opened Facebook’s door for Oculus’ VR technology, there has been a growing trend for the use of VR for business for…

NEWS
2017-12-14

How do you judge whether someone is telling the truth?

A flicker of the eyes? A tremble in the voice? Or a gesture?

In behavioral and cognitive research, as well as in market research, surveys are the primary tool for gauging people’s opinions and reactions.

However, surveys have inherent limits on collecting accurate data: some respondents hide their true feelings to conform to social norms and values, and failing to capture respondents’ intentions can lead to distorted results.

If we could capture and analyze bio-signals such as gaze and brainwaves, could we also uncover the hidden intentions and cognitive abilities behind people’s behavior that are hard to grasp through surveys?

#EyeTracking #EEG #Psychology #Neuroscience #Cognitive

Source: https://goo.gl/NxmePA


Have you ever given others the benefit of doubt? If you have, on what grounds? Their facial expression? Their gesture? Their tone?

BLOG
2017-12-13

What Happens When Artificial Intelligence Can Read Our Emotion in Virtual Reality

Apple: Animoji

Being surrounded by machines that understand our emotions is one of many ‘what ifs’ that is a little creepy even to think about. Don’t be surprised: thanks to technological advances, we will reach that future sooner or later. But how?

How does a machine ‘sense’ our emotion?

At Apple’s September keynote, the iPhone X showed off its slick design to the world for the first time, and iPhone lovers couldn’t help but shout “hooray!” with enthusiasm. What unexpectedly caught people’s eyes, among other things, was Animoji: a dozen different animal emojis that mirror users’ facial expressions and can be shared with others. Animoji certainly seems interesting, but what does it really mean for our communication in a digital world?

Nowadays, an overwhelming amount of human-to-human communication happens every second via different digital platforms, but it is quite often devoid of the essence of human nature: emotion. To facilitate machine-mediated communication, many tech giants are spending a great deal of time and effort on finding sensors that can empower digital machines to interpret our emotions. For smartphones at least, since we take pictures and talk on the phone on a daily basis, it comes naturally to engineers to use the camera (facial recognition) and the microphone (virtual assistants such as Siri, Google Assistant, or Amazon Alexa) to ‘sense’ our emotions.

What about in VR?

Facebook Social VR

Social virtual reality (VR) is an emerging digital platform that offers a virtual space where people, through their avatars, can interact with one another. But how do we add an emotional texture to VR? That brings us to the Massachusetts Institute of Technology (MIT) Media Lab.

A: circuit board with Bluetooth connection, B: PPG sensor, C: GSR electrode

The MIT Media Lab decided to add an extra layer of emotional skin to a virtual avatar. The researchers created an ‘emotional beast’ in VR that changes its appearance in response to a user’s emotional state. To detect a user’s emotion in VR, the team integrated a physiological sensing module into the mask of a VR headset: electrodes for galvanic skin response (GSR) data collection and photoplethysmogram (PPG) sensors for heart rate data collection. GSR data reflects a user’s emotional arousal, but it is not enough to determine whether that arousal is positive or negative. A PPG sensor, which uses light to track the rate of blood flow and gauge a user’s anxiety and stress levels (negative arousal), is therefore needed to complement the GSR data. In essence, these physiological sensors act as a medium for emotion recognition, just as the camera and microphone do in smartphones.
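
As a rough sketch of that division of labor (GSR for arousal, PPG-derived heart rate for telling positive from negative arousal), the Python snippet below turns synthetic GSR and PPG samples into toy arousal and valence estimates. The thresholds and scaling are invented for the illustration and are not taken from the MIT paper.

    # Illustrative sketch only: GSR -> arousal score, PPG heart rate -> valence sign.
    import numpy as np

    def arousal_from_gsr(gsr_microsiemens):
        """Map mean skin conductance to a 0-1 arousal score (toy linear scaling)."""
        return float(np.clip((gsr_microsiemens.mean() - 2.0) / 10.0, 0.0, 1.0))

    def heart_rate_from_ppg(ppg, fs=64.0):
        """Estimate beats per minute by counting rising edges of the pulse wave."""
        above = ppg > ppg.mean()
        beats = np.count_nonzero(above[1:] & ~above[:-1])
        return beats * 60.0 * fs / len(ppg)

    def valence_from_heart_rate(bpm):
        """Toy rule: an elevated heart rate is read as negative arousal (stress)."""
        return "negative" if bpm > 90.0 else "positive"

    fs = 64.0
    t = np.arange(0, 10, 1 / fs)                       # ten seconds of samples
    gsr = 6.0 + 0.5 * np.sin(0.2 * t)                  # slowly varying conductance (uS)
    ppg = np.sin(2 * np.pi * 1.2 * t)                  # ~72 bpm pulse wave
    bpm = heart_rate_from_ppg(ppg, fs)
    print(arousal_from_gsr(gsr), bpm, valence_from_heart_rate(bpm))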

The researchers crafted two types of ‘emotional beasts’: fur-based and particle-based.

A fur-based emotional beast

The fur-based ‘emotional beast’ can contract and grow its fur to visually express a user’s happiness. Based on Lang’s model, the team scored four emotional states on a scale of 0 to 1. The beast grows its fur to full length if the evaluated emotion is ‘happy’, whereas the fur stays inside the inner skin, leaving a smooth outer surface, if the emotion is evaluated as ‘neutral’.

A particle-based emotional beast

The particle-based ‘emotional beast’, on the other hand, takes two variables into account: brightness and color. The user’s arousal level is estimated on a scale of 0 to 1: at high arousal the particles illuminate, while at a neutral state they are almost invisible. In a similar manner, a user can express frustration and anger to other avatars as the particles’ color shifts from blue to red.
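
To make the mapping in the last two paragraphs concrete, here is a minimal Python sketch that converts scored emotions (on the 0-to-1 scale described above) into the avatar’s visual parameters. The specific curves and the blue-to-red interpolation are guesses for the illustration, not the Media Lab’s actual shaders.

    # Illustrative sketch only: emotion scores -> fur length / particle appearance.
    def fur_length(happiness, max_length=1.0):
        """Fur grows with happiness and stays inside the skin when neutral."""
        return max(0.0, min(happiness, 1.0)) * max_length

    def particle_appearance(arousal, negativity):
        """Brightness follows arousal; color shifts from blue to red with negativity."""
        brightness = max(0.0, min(arousal, 1.0))
        n = max(0.0, min(negativity, 1.0))
        color = (n, 0.0, 1.0 - n)                      # (R, G, B): blue calm, red angry
        return brightness, color

    print(fur_length(0.9))                             # happy user -> nearly full-length fur
    print(particle_appearance(0.8, 0.7))               # aroused, frustrated -> bright, reddish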

Indeed, the MIT Media Lab has crafted visually scintillating artwork. These colorful and vibrant creatures let users express their emotions in the most vivid way possible, elevating a surface-level VR experience into emotional human-to-human interaction (see the video here).

How Can Emotion AI Revolutionize VR?

Yet what works behind the ‘emotional beast’ is a machine learning algorithm. The researchers let the system learn from physiological data sets in order to predict a person’s emotional states. Without this process, GSR and PPG data are just a bunch of numbers that tell us nothing. In fact, any system that detects emotion from user-provided data necessarily involves a machine learning process.
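
That supervised step can be sketched in a few lines of Python: a classifier is fit on labelled physiological feature vectors so that new GSR and heart-rate readings map to an emotional state. The feature values, labels, and choice of model below are fabricated for the example, not the researchers’ actual setup.

    # Illustrative sketch only: learn emotion labels from GSR/heart-rate features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row: [mean GSR (uS), heart rate (bpm)]; labels: 0 neutral, 1 happy, 2 stressed.
    X_train = np.array([
        [2.1, 62], [2.4, 65], [2.0, 60],      # neutral
        [5.8, 78], [6.1, 80], [5.5, 75],      # happy: aroused, moderate heart rate
        [6.5, 102], [7.0, 110], [6.2, 98],    # stressed: aroused, elevated heart rate
    ])
    y_train = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(model.predict([[6.0, 105]]))        # most likely classified as stressed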

Although the “emotional beast” project has successfully shown how emotion detection technology can be used in VR, human-to-human communication within VR may become of little interest to us once artificial intelligence (AI) comes into play, because VR coupled with emotion AI will eventually touch every part of our lives and raise so many ‘what ifs’.

“What if AI can gauge your preference towards all the products you’ve seen in a virtual shopping mall and then suggest a purchase list of the preferred products or even automatically purchase them for you?”

“What if AI can measure the concentration and excitement level of a middle school student listening to a lecture in VR and come up with a customized curriculum specifically for that student?”

“What if…”

These ‘what if’ scenarios of AI reading our emotions will no longer remain a creepy pipe dream.

Reference

  1. Emotional Beasts: Visually Expressing Emotions through Avatars in VR
  2. Apple: Animoji

MEET US AT

TechCrunch SF 2017. Oct

SFN 2017. Nov

CES 2018. Jan

Please Leave Your E-mail to Get Notified