Eye Tracking Virtual Reality

If you want to analyze subjects' gaze behavior in Virtual Environments presented on Head-Mounted Displays, D-Lab Eye-Tracking Virtual Reality is the right software for you. It matches the virtual video with the eye tracking data, allowing automatic and precise analysis of where a person is looking in a Virtual Environment. Powerful visualizations and metrics provide all information about the subjects' pupil movement and gaze behavior.

Plan – Measure – Analyze
Eye tracking in a virtual environment is easy to achieve: just use D-Lab Eye-Tracking Virtual Reality, put on the HMD with our built-in eye tracker, and you are ready to go. Pre-defined areas of interest in the VR scene can even be used to automatically calculate the related metrics.
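To illustrate what "automatically calculated metrics" for pre-defined areas of interest can look like, here is a minimal sketch that computes dwell time per AOI from logged gaze samples. The AOI names, the sample format, and the normalized 2D coordinate convention are assumptions for illustration, not the D-Lab data format.

```python
# Hypothetical sketch: dwell time per area of interest (AOI) from gaze samples.
# Sample format and AOI rectangles are illustrative assumptions.

def dwell_times(samples, aois, sample_rate_hz=60):
    """samples: list of (x, y) gaze coordinates, normalized to 0..1.
    aois: dict mapping AOI name -> (x_min, y_min, x_max, y_max).
    Returns seconds of gaze spent inside each AOI."""
    dt = 1.0 / sample_rate_hz                  # duration of one sample
    totals = {name: 0.0 for name in aois}
    for x, y in samples:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dt
    return totals

# Example: one second of data (60 samples) split between two shelf AOIs
aois = {"shelf_left": (0.0, 0.0, 0.5, 1.0),
        "shelf_right": (0.5, 0.0, 1.0, 1.0)}
samples = [(0.25, 0.5)] * 45 + [(0.75, 0.5)] * 15
print(dwell_times(samples, aois))
```

In a real study the AOIs would come from the VR scene definition and the samples from the recorded eye tracking stream; the accumulation logic stays the same.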


– Quick and easy calibration
– Synchronization with motion capturing systems
– Gaze control by virtual reality screen fixation coordinates
– Automated offline optimization of pupil detection
– Integrated scripting language

Product Features

D-Lab Eye-Tracking Virtual Reality offers several key features:

– Stand-alone module or synchronized with D-Lab Audio, CAN Bus, Data Stream, Head Tracking, ...
– Recorded data available in raw format
– Realtime hub providing all data via the relay function
– Various visualization options such as heat map, shadow map, gaze path, and bee swarm
– Realtime access to VR screen fixation coordinates
– Realtime access to marker positions
– Eye cam videos available as separate videos
– Access to pupillometry
– Scripting language to correlate gaze data with other channels or sensors
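The realtime access described above can be consumed by external tools. As a rough sketch, the client below reads newline-delimited fixation coordinates from a relay-style TCP stream; the host, port, and the "x;y" line format are assumptions for illustration, not the documented D-Lab relay protocol.

```python
# Hypothetical sketch: consuming realtime fixation coordinates from a
# relay-style TCP stream. Host, port, and the "x;y\n" line format are
# illustrative assumptions, not the documented D-Lab protocol.
import socket

def parse_fixation(line):
    """Parse one 'x;y' line into an (x, y) coordinate pair."""
    x, y = (float(v) for v in line.strip().split(";"))
    return (x, y)

def stream_fixations(host="localhost", port=3000):
    """Yield (x, y) fixation coordinates as they arrive on the stream."""
    with socket.create_connection((host, port)) as conn:
        buf = b""
        while chunk := conn.recv(4096):
            buf += chunk
            while b"\n" in buf:
                raw, buf = buf.split(b"\n", 1)
                yield parse_fixation(raw.decode())
```

A consumer would simply iterate over `stream_fixations()` and feed the coordinates into its own logic, e.g. gaze-controlled interaction or live AOI checks.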

Application Examples

– Behavior in virtual environment
– Games testing
– Testing of virtual shelves and virtual supermarkets
– Evaluation of virtual design concepts
– First test of new product ideas
– Test of early interface designs
– Market research
– Design clinics
– Behavioral studies in virtual reality
– Factory planning
– Military research
– Developmental psychology
– Neuroscience
– Flight safety research