Combination of Virtual Reality and Eye Tracking: Explore the Mind of Consumers


What do you think is the most relevant information about a product — the information that can successfully induce a consumer’s purchase? If you were a promotion manager at a granola-selling company, how would you try to understand your potential consumers’ purchase behavior and the psychology behind it?

Eye tracking: Keeping track of consumer attention

Source: bluekiteinsight.com

Eye tracking, the sensor technology that enables a device to measure exactly where and when people’s eyes are focused, is known to provide a better understanding of consumers’ visual attention. People tend to stare longer and look more often at objects that interest them, and their visual path reveals much about their cognitive flow. Therefore, carefully investigating consumers’ visual logs — eye tracking data, in other words — can give those looking for ways to promote particular products significant insight. From the consumer’s perspective, the general public might also enjoy a far better shopping experience through real-time recommendation systems based on their eye gaze information.
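To make this concrete, here is a minimal Python sketch of how raw gaze logs might be aggregated into the attention metrics mentioned above: total dwell time and visit count per area of interest (AOI). The sample format, AOI layout, and coordinates are illustrative assumptions, not the API of any particular eye tracker.

```python
# A minimal sketch: aggregate raw gaze samples into per-product attention
# metrics. Sample format and AOI boxes are hypothetical, for illustration only.
from collections import defaultdict

# Rectangular AOIs: product name -> (x_min, y_min, x_max, y_max) in screen pixels
AOIS = {
    "granola_a": (0, 0, 200, 300),
    "granola_b": (200, 0, 400, 300),
}

def aoi_at(x, y):
    """Return the name of the AOI containing the gaze point, or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def attention_metrics(samples):
    """samples: time-sorted list of (timestamp_s, x, y) gaze points.
    Returns total dwell time (s) and number of separate visits per AOI."""
    dwell = defaultdict(float)
    visits = defaultdict(int)
    prev_aoi, prev_t = None, None
    for t, x, y in samples:
        aoi = aoi_at(x, y)
        if prev_aoi is not None:
            dwell[prev_aoi] += t - prev_t   # credit elapsed time to the previous AOI
        if aoi is not None and aoi != prev_aoi:
            visits[aoi] += 1                # gaze newly entered this AOI
        prev_aoi, prev_t = aoi, t
    return dict(dwell), dict(visits)

# Example with three ~60 Hz samples: gaze rests on granola_a, then jumps to granola_b
print(attention_metrics([(0.000, 50, 50), (0.016, 60, 55), (0.033, 250, 60)]))
```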

But how can we track consumer attention in the real world?

Recall your shopping experience. As you enter a supermarket and stand in front of a shelf stocked with the product category you were looking for, you skim through several products and finally put one of them into your cart. In fact, the whole process of making a purchase decision happens within seconds. Consequently, it is highly important for retailing researchers to investigate consumers’ natural attentional process “in situ.”

The majority of current research, however, even when analyzing eye tracking data, is undertaken in laboratory settings. The laboratory environment makes it easy to exercise experimental control and investigate exactly what you want to know. On the other hand, maintaining experimental control inevitably leads to a low level of ecological validity. If ecological validity is poor, even a well-analyzed result may become worthless, as we cannot guarantee that similar effects would occur in the wild. This trade-off between control and ecological validity has long been a serious issue for researchers.

Virtual reality mobile eye tracking: A new research opportunity

Fortunately, the advent of virtual reality (VR) is extending this trade-off frontier for existing research. VR not only allows various levels of experimental control but also makes it possible to build a shopping experience that feels real. This sort of experimental environment brings research to a point where a near-optimal combination of experimental control and ecological validity can be achieved. Therefore, with the help of VR, eye tracking technology can be used far more effectively to capture the user’s visual attention with better reliability. This week’s research — “Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research” — reviews how mobile eye tracking can be used in virtual reality and discusses the pros and cons of applying eye tracking technology in different experimental environments. In particular, the research focuses on three kinds of environments: the conventional 2-D monitor-based setting, virtual reality, and the real environment. The authors also propose an experiment in a virtual reality setting to discuss the validity of using mobile eye tracking in VR to study consumer behavior.

Fig.1. Interacting in virtual reality

First of all, the paper sets up a list of criteria and rates the relative superiority and inferiority of the three experimental settings on each criterion. The resulting ratings, shown in the table below, can serve as a useful guideline for deciding which equipment to use and how to design eye tracking experiments. As the table shows, “desktop eye tracking” has a relative advantage over “mobile eye tracking in the field” on the criteria concerned with experimental control (“Ease of creating/using non-existing stimuli”, “Ease of controlling and randomizing treatment and extraneous factors”, “Naturalness of the eye tracking task”, “Ease of analyzing and reacting to respondent’s attention and behavior in real time”, “Ease of generating large sample sizes”, “Ease of obtaining retailer permission to record”, “Ease of data preparation”, “Reliability of AOI coding”, and “Reproducibility of experimental setting”). In contrast, “mobile eye tracking in the field” outrates “desktop eye tracking” on the criteria concerning ecological validity (“Realism of stimulus display” and “Realism of interaction”).

Table.1. Criteria for deciding which environment to use — eye tracking specific criteria are highlighted in grey

How about “mobile eye tracking in virtual reality”? Interestingly, it appears to be a compromise that combines the relative advantages of both sides (“desktop eye tracking” and “mobile eye tracking in the field”). “Mobile eye tracking in virtual reality” scores highly on almost every criterion where “desktop eye tracking” outperforms “mobile eye tracking in the field.” What is more, unlike “desktop eye tracking,” it also earns improved scores on “Realism of stimulus display” and “Realism of interaction.” Although it still has to tackle the problem of cost-effectiveness and meet further technological requirements for realistic visualization and a convincing presentation of the setting, mobile eye tracking in VR is anticipated to open many new research opportunities.

Fig.2. Trade-off between experimental control and ecological validity

Observing shopper behavior with eye tracking data in a virtual supermarket

Here is one of the new studies that brought eye tracking in virtual reality into a new field: shopper research. To demonstrate how mobile eye tracking in virtual reality can contribute to answering unresolved questions in retailing research, the team designed a virtual store to test whether additional information about a product can change consumers’ final purchase decisions.

In the virtual supermarket, which was designed to create a realistic shopping experience, several shelves were filled with an assortment of granola and baking-mix products. The supermarket was presented in a virtual reality lab equipped with the front projection screen of a CAVE environment, and respondents went through the experiment wearing SMI eye tracking glasses. They underwent three successive stages. In the first stage, they had to choose their most preferred product out of 20 on the shelf. Then, the same set of products reappeared with a red frame highlighting the initially chosen product. Soon after that, six other recommended products were highlighted with a blue frame. A pop-up bubble with additional product information also appeared next to any product the respondent gazed at for more than 200 ms. In the end, the subjects were asked whether they would stay with their initial product choice.

Fig.3. Example scenes of the virtual supermarket
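The 200 ms dwell rule above is a simple example of gaze-contingent interaction. Here is a hedged Python sketch of how such a trigger might work; the class, the sample stream, and the printed action are our own illustrative assumptions, not the study’s actual implementation.

```python
# A sketch of the 200 ms dwell rule: show a product's info bubble once the
# gaze has rested on its AOI long enough. Names and values are illustrative.
DWELL_THRESHOLD_S = 0.2  # 200 ms, as reported in the paper

class DwellTrigger:
    def __init__(self, threshold_s=DWELL_THRESHOLD_S):
        self.threshold = threshold_s
        self.current_aoi = None
        self.enter_time = None
        self.fired = False

    def update(self, t, aoi):
        """Feed one gaze sample: time t (s) and the AOI it hits (or None).
        Returns the AOI name exactly once when the dwell threshold is crossed."""
        if aoi != self.current_aoi:              # gaze moved to a different AOI
            self.current_aoi, self.enter_time, self.fired = aoi, t, False
            return None
        if aoi is not None and not self.fired and t - self.enter_time >= self.threshold:
            self.fired = True
            return aoi                           # fire the pop-up bubble once
        return None

trigger = DwellTrigger()
for t, aoi in [(0.00, "granola_a"), (0.10, "granola_a"), (0.21, "granola_a")]:
    hit = trigger.update(t, aoi)
    if hit:
        print(f"show info bubble for {hit} at t={t:.2f}s")
```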

The results showed that some subjects changed their preference across the stages. In other words, their decisions were affected by the additional information provided in VR, which in turn implies that the virtual supermarket induced quite an active interaction between human and technology, and that such an experimental setting is helpful for testing and observing consumer responses.

Soon, when eye tracking technology is integrated into everyday electronic devices, far more innovations in retailing research and people’s shopping experiences will become possible. For instance, a gaze-based assistant system that provides individualized recommendations based on a consumer’s preferences might change our expectations of what shopping should be in the future.

Try out your research with virtual reality and eye tracking

Although the paper focused mainly on shopping, a gaze-based assistant system that reflects the user’s real-time preferences in a virtual environment can be widely used in any area where exploring people’s minds matters. If you are still uncertain of its validity, check out the available technology that has successfully combined virtual reality with eye tracking and try it out on whatever you want to investigate. A great deal of valuable but so-far-hidden information — consumers’ complicated in-store decision processes, the interior design elements that significantly influence people’s mood, and more — could be in your hands.

If you are interested in using a brain and eye interface in virtual reality, visit our website www.looxidlabs.com for information on our newly released product, the world’s first mobile VR headset with an interface for both the eyes and the brain.

LooxidVR

In addition, we periodically send out a newsletter on VR trends and VR research. Subscribe if you are interested in receiving it.

Reference

  1. Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research | Journal of Business Research
  2. How Eye Tracking Works | Blue Kite Insight


Getting to Know Your Working Stress Level through EEG in Virtual Reality


How can a CEO ascertain that all of his or her team members are working at full capacity?
Are you sure that your brain is not overloaded by your everyday working conditions?

Source: DEFACTO BLOG

Now, with the combined use of electroencephalography (EEG) and virtual reality (VR), you can find out the mental workload and stress levels of yourself and your co-workers. Adopting this combined tool can lead to far more efficient operational decisions and a fair distribution of workload and responsibility among workers.

What affects job performance?

In fact, a multitude of variables, from workplace culture to the size of work equipment, hinders our ability to thoroughly assess any situation, which ultimately affects job performance. In particular, fatigue and stress are critical human factors that should not be taken lightly. EHS Today reported that about half of US workers suffer from fatigue, and this is not just their story: the global workforce complains of tiredness. The most critical problem with stress at work is that excessive workload and the corresponding stress can directly lead to safety issues and sometimes severe injuries. In other words, tiredness affects our judgment and can put our health at risk; therefore, workers’ perceived levels of mental stress and workload should be continuously monitored and evaluated to protect them from industrial accidents. Though this may sound costly for managers, the effort is imperative in the sense that managing workers’ stress levels contributes to their overall work effectiveness.

Bio-signals would help you check your mental workload

Yet how can we assess one’s workload? There are three main types of workload assessment methodologies: subjective measures, performance measures, and physiological measures. Conventionally, people had to rely on subjective assessment, in which workers determine and rate for themselves how mentally overloaded they are; several versions of the Subjective Workload Assessment Technique (SWAT) have been developed for this purpose. Nonetheless, such methods suffer from a fundamental weakness: they are not sensitive enough to capture subtle mental workloads, which, if accumulated, can lead to chronic fatigue. Performance measures, which record performance scores and use them as an indicator of task demand or difficulty, are far more objective, but they are hard to use widely because they intrude on many work settings.

Source: Sensors | Using Psychophysiological Sensors to Assess Mental Workload During Web Browsing

The best and most straightforward way to diagnose our physical state is to look at bio-signals. Apart from conventional methods such as statistical analysis of special events or keeping track of workers’ complaints, physiological information can be used to evaluate human factors. Among bio-signals, the electroencephalogram (EEG) is well known for its high time resolution, its ability to continuously monitor brain stress with adequate accuracy, and, most importantly, its use in recognizing human emotion, stress, vigilance, and more. That is, EEG can be used to monitor workers’ mental workload, emotion, and stress while they perform a task. Still, some of you might worry about how to collect EEG signals from a real working environment, which is hard to simulate physically. However, now that virtual reality (VR) technology is fairly well advanced, simulating your working conditions in a virtual environment is no longer a problem.
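As a concrete illustration, one commonly used EEG workload indicator is the ratio of theta (4–8 Hz) to alpha (8–13 Hz) band power, which tends to rise as mental workload increases. The Python sketch below computes such an index for a single channel; the sampling rate and the synthetic signal are assumptions for illustration, and this is not the specific method of the paper reviewed next.

```python
# A sketch of a simple spectral workload index: theta/alpha band-power ratio.
# Illustrative only; not the classifier used in the reviewed study.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 256  # sampling rate in Hz (a device-dependent assumption)

def band_power(freqs, psd, lo, hi):
    """Integrate the power spectral density over the [lo, hi) Hz band."""
    mask = (freqs >= lo) & (freqs < hi)
    return trapezoid(psd[mask], freqs[mask])

def theta_alpha_ratio(eeg_window):
    """eeg_window: 1-D array of raw samples from a single EEG channel."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS * 2)
    return band_power(freqs, psd, 4, 8) / band_power(freqs, psd, 8, 13)

# Synthetic example: 10 s of noise with a strong 6 Hz (theta) oscillation
t = np.arange(0, 10, 1 / FS)
signal = np.sin(2 * np.pi * 6 * t) + 0.2 * np.random.randn(t.size)
print(f"theta/alpha ratio: {theta_alpha_ratio(signal):.2f}")  # well above 1 here
```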

Measurement of stress recognition of crew members by EEG in a Virtual Environment

This week’s research review illustrates the measurement of mental workload through EEG in a virtually simulated environment: “EEG-based Mental Workload and Stress Recognition of Crew Members in Maritime Virtual Simulator: A Case Study.” The research team focused on the maritime industry, where human factors are considered one of the leading causes of accidents, contributing to nearly 96% of all maritime accidents. Even though the industry has achieved notable improvements in ship equipment and overall systems, human factors have not been considered enough to raise the overall safety level. The research therefore aimed to study the causes and effects of crew members’ human errors by monitoring the mental workload, emotion, and stress levels of maritime trainees.

Fig.1. Simulator at SMA

To be more specific, in order to study the relationship between maritime trainees’ mental workload, stress levels, and task performance, the research team conducted an experiment with four maritime trainees forming a crew. Consisting of an officer on watch (OOW), a steersman, a captain, and a pilot, each assigned duties corresponding to those of a real crew member, the crew had to navigate a vessel to its destination within SMA’s Integrated Simulation Centre (ISC), where a highly realistic environment was simulated. During the voyage, each subject’s emotion (positive, neutral, negative), workload (no, minimal, moderate, high), and stress (low, medium low, moderate low, medium, medium high, moderate high, high, very high) were observed and further analyzed after the experiment.

Fig.2. OOW, captain, and pilot in the simulator during the experiment
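Since the emotion, workload, and stress levels above are ordinal labels, analyzing them typically starts by encoding them numerically. The sketch below shows one hypothetical way to encode the workload and stress scales and average them per crew role; the mappings and sample records are illustrative, not the paper’s data.

```python
# A hypothetical sketch: turn the study's ordinal labels into numeric scores
# so levels can be averaged and compared across crew roles. Not the paper's data.
WORKLOAD = {"no": 0, "minimal": 1, "moderate": 2, "high": 3}
STRESS = {"low": 0, "medium low": 1, "moderate low": 2, "medium": 3,
          "medium high": 4, "moderate high": 5, "high": 6, "very high": 7}

# (role, workload label, stress label) observations over the voyage
observations = [
    ("captain", "high", "medium high"),
    ("captain", "moderate", "high"),
    ("steersman", "minimal", "low"),
]

def mean_levels(records):
    """Average the encoded workload and stress scores for each role."""
    totals = {}
    for role, wl, st in records:
        w, s, n = totals.get(role, (0, 0, 0))
        totals[role] = (w + WORKLOAD[wl], s + STRESS[st], n + 1)
    return {role: (w / n, s / n) for role, (w, s, n) in totals.items()}

print(mean_levels(observations))  # {'captain': (2.5, 5.0), 'steersman': (1.0, 0.0)}
```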

The following describes the results of the analysis. The OOW, who constantly had to maintain watch-keeping, was in the most negative emotional state; the captain, who was required to give out orders to the crew and carried the greatest responsibility, showed the highest workload; and the captain and the pilot, who had relatively higher responsibility than the OOW and the steersman, recorded higher stress levels as well.

Though the experiment is still at a preliminary stage of studying human factors, the success in monitoring emotion, mental workload, and stress implies that the proposed approach can be applied far beyond the maritime domain. EEG-based human factors evaluation tools can be used in any industry that involves multiple people working together. It is also anticipated that such a mechanism can broaden research on human-machine interaction.

LooxidVR: The all-in-one device with a VR-compatible EEG sensor and eye tracking camera

LooxidVR

Then what should be the next step? To achieve a more accurate measurement of human factors in a far more immersive environment, the data-collecting sensors and the simulated environment should be integrated as closely as possible. LooxidVR, winner of the CES 2018 Best of Innovation Award, is here to provide robust acquisition of the user’s brain activity and eye movement data in a VR environment. Made by Looxid Labs, LooxidVR is the world’s first mobile VR headset to provide an interface for both the eyes and the brain. Looxid Labs is ready to provide an integrated solution to anyone interested in exploring the user’s mind. It will be especially helpful for researchers interested in recognizing the user’s diverse emotional states, such as stress, mental workload, and preference.

LooxidVR pre-orders began on February 1st. For more information, visit our website www.looxidlabs.com and do not miss the pre-order opportunity to enrich your current research and study.

Also, we periodically send out a newsletter on VR trends and VR research. Subscribe if you are interested in receiving it.

Reference

  1. EEG-based Mental Workload and Stress Recognition of Crew Members in Maritime Virtual Simulator: A Case Study | http://ieeexplore.ieee.org/document/8120300/
  2. Human Factors In Safety: How do stress and fatigue affect work? | https://www.pro-sapien.com/blog/2017/10/human-factors-safety-how-stress-fatigue-affect-work/
  3. Workload Assessment | https://www.ergonomicsblog.uk/workload-assessment/



The Virtual Environment-based Adaptive System Helps Children with Autism to Enhance Social Functioning

According to estimates from the CDC (Centers for Disease Control and Prevention)’s Autism and Developmental Disabilities Monitoring (ADDM) Network, about 1 in 68 children is affected by Autism Spectrum Disorder (ASD), a developmental disability that can cause significant social problems, including difficulties communicating and interacting with others. Specifically, children with ASD show impairment in understanding complex facial emotional expressions and are slow to process people’s faces. In other words, they can hardly get a sense of context when interacting with people, which can later cause more severe communication problems.

Unfortunately, little is known about the diagnosis and treatment of ASD; there is currently no cure, only evidence that early intervention services can improve a child’s development. These services refer to therapy that helps the child talk, walk, and interact with others. However, the real problem that prevents children with ASD from overcoming social interaction impairments lies in the limited accessibility of therapy. The traditional intervention paradigm, which requires a professional therapist to sit next to the child, is not accessible to the vast majority of the ASD population. There are not enough trained therapists to assist all the children in need of help, and even where therapists are available, most households with a child with ASD find the intervention costs burdensome.

Technology can help children with ASD to overcome social interaction disabilities

There is good news, though. Recent advances in computer and robotic technology are introducing innovative assistive technologies for ASD therapy. Among all the emerging technologies, virtual reality (VR) is the most promising, since it has the potential to individualize autism therapy and offer genuinely useful technology-enabled therapeutic systems. As children with ASD manifest social deficits that vary from one individual to another, it is exceedingly important to provide proper help to each of them through personalized therapy; a VR-based intervention system that keeps track of the child’s mental state can fulfill this need for customization. Moreover, a number of studies have indicated that many children with ASD are in favor of advanced technology, which suggests that a new intervention paradigm such as VR can be well adopted by them.

Multimodal Adaptive Social Interaction in Virtual Environment

To that point, this week’s research review covers a new VR-based intervention system: Multimodal Adaptive Social Interaction in Virtual Environment (MASI-VR) for children with ASD. The study presents the design, development, and a usability study of the MASI-VR platform. It first aimed to design a multimodal VR-based social interaction platform that integrates eye gaze, EEG signals, and peripheral psychophysiological signals. The research team then demonstrated the usefulness of the designed system, particularly for an emotional face processing task. Through this review, we hope you get a sense of how a virtual environment-based technological system works as a whole to help improve overall social functioning in autism.

Synthesizing different aspects of a social interaction

The research team designed a VR system that incorporates various aspects of emotional social interaction. The system, in turn, aims to help children with ASD learn the proper processing of emotional faces.

Fig.1. System architecture of MASI-VR

It mainly consists of three parts: the VR task engine and dialog management module; the central supervisory controller; and the peripheral interfaces that monitor eye gaze, EEG, and peripheral physiological signals to assess the subject’s affective state. As the central controller synchronizes events between the other two parts, the subject undergoes various social tasks while his or her physiological information is collected and analyzed in real time. These signals then serve as the primary determinant of the next stage within the virtual environment, allowing the whole process to become individualized.
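To give a feel for this closed loop, here is a deliberately simplified Python sketch: gaze and physiological estimates are fused into an affective state, and that state selects the next task difficulty. All names, thresholds, and the fusion rule are our own illustrative assumptions, not the actual MASI-VR controller.

```python
# A toy sketch of a supervisory control loop: estimate the subject's state
# from gaze and EEG features, then adapt the task. Illustrative only.
from dataclasses import dataclass

@dataclass
class AffectiveState:
    engagement: float  # 0 (disengaged) .. 1 (fully engaged)
    anxiety: float     # 0 (calm) .. 1 (anxious)

def estimate_state(gaze_on_face_ratio, eeg_arousal):
    """Toy fusion: engagement from gaze behavior, anxiety from an EEG arousal score."""
    return AffectiveState(engagement=gaze_on_face_ratio, anxiety=eeg_arousal)

def next_task_difficulty(state, current_level):
    """Simple adaptation rule: step up when engaged and calm,
    step down when anxious, otherwise hold the current level."""
    if state.anxiety > 0.7:
        return max(1, current_level - 1)
    if state.engagement > 0.6:
        return current_level + 1
    return current_level

level = 2
for gaze, arousal in [(0.8, 0.2), (0.3, 0.9)]:
    level = next_task_difficulty(estimate_state(gaze, arousal), level)
    print(f"gaze={gaze}, arousal={arousal} -> level {level}")
```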

Fig.2. Various emotion and gestural animations

To be more specific, seven teenage characters in total were presented in the virtual environment, and they could change their facial expressions among seven emotions (enjoyment, surprise, contempt, sadness, fear, disgust, and anger) in line with the situational context. In the pre-set VR cafeteria environment, the subject wanders around the virtual space and meets one of the characters, who wishes to interact. The subject can then choose whether or not to start a conversation with the avatar. If the subject decides to communicate, different kinds of conversational dialog missions take place. After each session, a training trial begins in which the subject practices recognizing the character’s emotional state by observing its facial expression. At the end of each dialog, the character’s face is presented with an oval occlusion. The occlusion gradually disappears following the subject’s gaze, providing adaptive gaze feedback, as sketched below. This process encourages children with ASD to look at the critical parts of the face that determine one’s emotional state, such as the areas around the eyes and mouth. If the subject pays enough attention to those parts, the face reveals the emotion and the subject is asked to identify it.

Fig.3. The VR cafeteria environment for the social task
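The sketch below illustrates the adaptive gaze feedback just described: the oval occlusion’s opacity decreases only while the gaze rests on the informative facial regions. The AOI names, fade rate, and frame timing are illustrative assumptions rather than the platform’s real parameters.

```python
# A sketch of gaze-contingent occlusion fading: the face is revealed in
# proportion to accumulated gaze on its informative regions. Values are assumed.
FADE_PER_SECOND = 0.5     # occlusion opacity removed per second of on-face gaze
FACE_AOIS = {"eyes", "mouth"}

def update_occlusion(opacity, gazed_aoi, dt):
    """Advance one frame: reduce opacity only while gaze rests on a face AOI."""
    if gazed_aoi in FACE_AOIS:
        opacity = max(0.0, opacity - FADE_PER_SECOND * dt)
    return opacity

opacity = 1.0  # fully occluded at the start of the trial
for frame_aoi in ["eyes"] * 30 + ["background"] * 10 + ["mouth"] * 40:
    opacity = update_occlusion(opacity, frame_aoi, dt=1 / 60)  # 60 fps
print(f"remaining occlusion opacity: {opacity:.2f}")  # face revealed once it hits 0
```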

Effectiveness of MASI-VR in improving eventual social functioning

To prove the usability and effectiveness of the gaze-sensitive system, a nearly identical system without the gaze feedback was also tested by a control group. The performance difference showed that the adaptive system was significantly more helpful in enhancing the subject’s engagement with the social task as well as the accuracy of recognizing the character’s facial emotions. In other words, MASI-VR is considerably useful for training the core deficit areas of children with ASD. Though the study is still at a preliminary stage, the findings suggest that a VR-based social interactive environment can be used to help improve the eventual social functioning of those with ASD.
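For readers curious how such a group difference is typically tested, the snippet below sketches an independent-samples t-test on per-subject recognition accuracy. The accuracy values are hypothetical placeholders; the paper reports its own statistics.

```python
# A sketch of comparing recognition accuracy between the gaze-feedback group
# and the control group. The numbers are hypothetical placeholders.
from scipy import stats

feedback_accuracy = [0.82, 0.75, 0.90, 0.78, 0.85]  # hypothetical per-subject scores
control_accuracy = [0.60, 0.72, 0.65, 0.58, 0.70]

t, p = stats.ttest_ind(feedback_accuracy, control_accuracy)
print(f"t = {t:.2f}, p = {p:.4f}")  # p < .05 would indicate a reliable difference
```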

LooxidVR monitors eye gaze and EEG in the virtual environment

Now that the effectiveness of Multimodal Adaptive Social Interaction in Virtual Environment for children with social communication disabilities has been demonstrated, which device should be chosen to further enrich such studies and improve the quality of the therapy?

LooxidVR

In the study, several different devices were used simultaneously to monitor the corresponding physiological signals from the subject. However, installing and setting up all of those devices causes some inconvenience; it would be best if all the data could be collected and analyzed by a single VR device. Though it may sound like a future dream yet to be realized, there is already a device that enables concurrent measurement of a person’s eye gaze and EEG data in VR. LooxidVR, the world’s first mobile VR headset to provide an interface for both the eyes and the brain, allows robust data acquisition through VR-compatible sensors that measure the user’s brain activity and eye movement. Having recently won the Best of Innovation Award at CES 2018, Looxid Labs is ready to provide an integrated solution to anyone interested in exploring the user’s mind. With LooxidVR, further development of in-person therapy to enhance the social functioning of children with ASD could come true.

LooxidVR pre-orders will start on February 1st, 2018. For more information, visit our website www.looxidlabs.com and do not miss the pre-order opportunity to enrich your current research and study.

Also, we periodically send out a newsletter on VR trends and VR research. Subscribe if you are interested in receiving it.

Reference

  1. Multimodal adaptive social interaction in virtual environment (MASI-VR) for children with Autism spectrum disorders (ASD) | Virtual Reality (VR), 2016 IEEE
  2. Autism Spectrum Disorder (ASD) | Centers for Disease Control and Prevention
