Eye Tracking in Virtual Reality

Viviane Clay et al. J Eye Mov Res. 2019.

Abstract

The intent of this paper is to provide an introduction to the burgeoning field of eye tracking in virtual reality (VR). VR itself is an emerging technology on the consumer market that will create many new opportunities in research. It offers a lab environment with high immersion and close alignment with reality. An experiment using VR takes place in a highly controlled environment and allows more in-depth information to be gathered about the actions of a subject. Techniques for eye tracking were introduced more than a century ago and are now an established tool in psychological experiments, and recent developments have made them versatile and affordable. In combination, these two technologies allow unprecedented monitoring and control of human behavior in semi-realistic conditions. This paper explores the methods and tools that can be applied when implementing experiments using eye tracking in VR, following the example of one case study. Accompanying the technical descriptions, we present research that demonstrates the effectiveness of the technology and shows what kind of results can be obtained when using eye tracking in VR. The paper is meant to guide the reader through the process of bringing VR in combination with eye tracking into the lab and to inspire ideas for new experiments.

Keywords: Eye movement; VR; eye tracking; gaze; region of interest; smooth pursuit; virtual reality.

Conflict of interest statement

The author(s) declare(s) that the contents of the article are in agreement with the ethics described in http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html and that there is no conflict of interest regarding the publication of this paper.

Figures

Figure 1.

Our HTC Vive setup. (A) Participant sitting in a swivel chair during a session. In our experiment, walking in the virtual world is done with the controller. (B) The flexible cable management system keeps the cables from tangling and getting in the subject's way during the session. (C) Pupil Labs eye tracker inside the HTC Vive headset. Image source: https://pupil-labs.com/vr-ar/ (accessed 21.10.2017).

Figure 2.

(Left) In the real world, the vergence distance equals the focal distance. (Right) In VR, the focal distance always stays the same, while the vergence distance varies.

Figure 3.

Ray cast (yellow) inside Unity, going from the player to the reference object. The first intersection with an object is taken as the hit point. Colored spheres visualize previous hit points of the player's gaze.
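
Such a gaze ray cast can be set up with Unity's physics engine. The following is a minimal C# sketch, assuming the eye tracker delivers a gaze direction in camera coordinates; the class, field, and prefab names are illustrative, not the authors' actual code.

```csharp
using UnityEngine;

// Minimal sketch of a gaze ray cast in Unity (illustrative, not the authors' code).
public class GazeRaycaster : MonoBehaviour
{
    public Camera playerCamera;        // VR camera (the subject's head)
    public GameObject hitMarkerPrefab; // small sphere visualizing a hit point
    public float maxDistance = 500f;   // assumed maximum ray length in Unity units

    // gazeDirectionLocal: gaze direction from the eye tracker, in camera space.
    public void CastGazeRay(Vector3 gazeDirectionLocal)
    {
        // Transform the camera-space gaze direction into world space.
        Vector3 worldDirection = playerCamera.transform.TransformDirection(gazeDirectionLocal);
        Ray gazeRay = new Ray(playerCamera.transform.position, worldDirection);

        // Physics.Raycast returns the first intersection with a collider.
        if (Physics.Raycast(gazeRay, out RaycastHit hit, maxDistance))
        {
            // Visualize the ray (yellow) and the hit point, as in the figure.
            Debug.DrawRay(gazeRay.origin, worldDirection * hit.distance, Color.yellow);
            Instantiate(hitMarkerPrefab, hit.point, Quaternion.identity);
            Debug.Log($"Gaze hit {hit.collider.name} at distance {hit.distance:F1}");
        }
    }
}
```

Note that only the first collider intersection is returned, so every object that should be able to catch the gaze needs a collider.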

Figure 4.

Seahaven. Surrounded by an ocean, the subject is confined to the city. Many different houses and streets can be explored.

Figure 5.

Design of the task. Two statements have to be rated for each picture; here, the familiarity rating is shown. After the response, the same picture is shown again with the navigation question.

Figure 6.

Points used for calibration (black) and validation (blue). The fixation point is shown at one of the locations at a time.
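
Validation quality can be quantified as the angular difference between the tracked gaze direction and the known direction to each fixation point. A minimal sketch, assuming world-space gaze directions are already available; the function and parameter names are hypothetical.

```csharp
using UnityEngine;

// Sketch of a validation-error measure (illustrative, not the authors' code):
// mean angle between tracked gaze directions and the true directions to the
// displayed fixation points. Both arrays are assumed to have the same length.
public static class GazeValidation
{
    public static float MeanAngularError(Vector3 eyePosition,
                                         Vector3[] fixationPoints,  // known target positions
                                         Vector3[] gazeDirections)  // tracked directions, world space
    {
        float sum = 0f;
        for (int i = 0; i < fixationPoints.Length; i++)
        {
            Vector3 trueDirection = (fixationPoints[i] - eyePosition).normalized;
            // Vector3.Angle returns the angle between two directions in degrees.
            sum += Vector3.Angle(trueDirection, gazeDirections[i]);
        }
        return sum / fixationPoints.Length; // mean error in degrees
    }
}
```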

Figure 7.

Size relations in Unity. (A) A 10×2×1 unit cube. (B) A 1×2×1 unit cube standing next to a bus stop; this is also the size of the player.

Figure 8.

(A) Map of a single subject's walking path during a 30-minute session. (B) Map with the walking paths of 27 subjects. (C) Number of subjects that visited each area of the city.

Figure 9.

Excerpt from the timeline of a full session. Houses are represented on the x-axis (names are numbers between 001 and 200, plus a rotation of 0, 3, or 6). Yellow blocks represent time spans during which a certain house was looked at. Orange arrows point to the corresponding house on the map.
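
A timeline like this can be derived by collapsing the per-frame log of gaze hits into contiguous spans during which the same house was hit. A minimal C# sketch, assuming a per-frame list of (timestamp, house name) samples; all names are hypothetical.

```csharp
using System.Collections.Generic;

// Sketch of building a gaze timeline (illustrative, not the authors' code):
// collapse a per-frame log of looked-at houses into contiguous time spans.
public struct GazeSpan
{
    public string House;
    public float Start;
    public float End;

    public GazeSpan(string house, float start, float end)
    {
        House = house; Start = start; End = end;
    }
}

public static class GazeTimeline
{
    // samples: (timestamp, name of the house hit by the gaze ray; null if none)
    public static List<GazeSpan> BuildSpans(IReadOnlyList<(float Time, string House)> samples)
    {
        var spans = new List<GazeSpan>();
        string current = null;
        float start = 0f;

        foreach (var (time, house) in samples)
        {
            if (house == current) continue;      // still looking at the same house
            if (current != null)
                spans.Add(new GazeSpan(current, start, time));
            current = house;
            start = time;
        }
        // Close the final span at the last sample's timestamp.
        if (current != null && samples.Count > 0)
            spans.Add(new GazeSpan(current, start, samples[samples.Count - 1].Time));
        return spans;
    }
}
```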

Figure 10.

Visualization of gaze points from one subject. Spheres represent hit points of the gaze vector, color-coded by the distance from which each point was looked at during the session (far → close = red → blue).
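
This color coding can be implemented by linearly interpolating between the two endpoint colors over the observed distance range. A minimal Unity sketch; the function name and the distance range in the usage line are assumptions.

```csharp
using UnityEngine;

// Sketch of the distance color coding (illustrative, not the authors' code):
// map a gaze distance onto a blue (close) to red (far) gradient.
public static class GazeColoring
{
    public static Color DistanceColor(float distance, float minDistance, float maxDistance)
    {
        // t = 0 at minDistance (close), t = 1 at maxDistance (far), clamped in between.
        float t = Mathf.InverseLerp(minDistance, maxDistance, distance);
        return Color.Lerp(Color.blue, Color.red, t);
    }
}
```

When instantiating each sphere, one could then set, e.g., `renderer.material.color = GazeColoring.DistanceColor(hit.distance, 1f, 100f);` (the range 1-100 units is an assumption).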

Figure 11.

Houses that were looked at for a longer time received an overall higher familiarity rating (and navigation rating, not depicted).

Figure 12.

(Top) Average distance from which a house was seen. (Bottom) Average variance in distances.

Figure 13.

3D heat map of one subject. Points are colored according to the number of close neighbors and thereby encode the density of points in the area (yellow = high density, blue = low density). The close-far axis shows log(distance). (Top) Original gazes show a concentration at the center of the visual field at a medium distance, with maximum density at a distance of 23 units (ln(23) ≈ 3.1 on the log axis). (Bottom) Randomly distributed gaze vectors in the x and y plane show the intrinsic properties of the VR environment.
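
The density coloring described here can be computed with a simple neighbor count per hit point. A naive O(n²) C# sketch (radius and names are assumptions); for large point sets a spatial grid or k-d tree would be used instead, but the naive version conveys the idea.

```csharp
using UnityEngine;

// Sketch of the density estimate (illustrative, not the authors' code):
// for each gaze hit point, count how many other points lie within a radius.
// The counts can then be mapped onto a blue (low) to yellow (high) color scale.
public static class GazeDensity
{
    public static int[] NeighborCounts(Vector3[] points, float radius)
    {
        float r2 = radius * radius; // compare squared distances, avoiding sqrt
        int[] counts = new int[points.Length];
        for (int i = 0; i < points.Length; i++)
            for (int j = 0; j < points.Length; j++)
                if (i != j && (points[i] - points[j]).sqrMagnitude <= r2)
                    counts[i]++;
        return counts;
    }
}
```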

Figure 14.

Difference between the normal heat map and the heat map of hit points resulting from shuffled gaze vectors (one subject). It shows a systematic bias toward larger distances.
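
The shuffled baseline can be produced by permuting the recorded gaze directions across frames (keeping the head poses fixed) and re-casting the rays, so that the resulting hit points reflect only the intrinsic structure of the environment. A minimal Fisher-Yates shuffle sketch (illustrative, not the authors' code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the shuffled-gaze baseline (illustrative, not the authors' code):
// shuffle the recorded gaze directions in place, then re-cast each direction
// from its (unshuffled) frame's head pose, e.g., via the GazeRaycaster above.
public static class GazeShuffle
{
    public static void ShuffleInPlace(IList<Vector3> gazeDirections, System.Random rng)
    {
        // Fisher-Yates shuffle: uniform random permutation in O(n).
        for (int i = gazeDirections.Count - 1; i > 0; i--)
        {
            int j = rng.Next(i + 1); // uniform index in [0, i]
            (gazeDirections[i], gazeDirections[j]) = (gazeDirections[j], gazeDirections[i]);
        }
    }
}
```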

Figure 15.

Sample data from two subjects. (A) Heat map of gazes while the subject is not turning. (B) Gazes in 10-frame time windows while the subject is making a right turn of >20 degrees. (C) Gazes in 10-frame time windows while the subject is making a left turn of >20 degrees.
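
Turns can be detected from the change in head yaw across the 10-frame window; in Unity, a positive yaw difference corresponds to a right turn. A minimal sketch (the threshold and all names are assumptions):

```csharp
using UnityEngine;

// Sketch of turn classification (illustrative, not the authors' code):
// classify a 10-frame window by the signed change in head yaw.
public enum TurnClass { None, Left, Right }

public static class TurnDetector
{
    // yawPerFrame: head yaw in degrees per frame; assumes startFrame + 9 is valid.
    public static TurnClass Classify(float[] yawPerFrame, int startFrame,
                                     float thresholdDeg = 20f)
    {
        // Mathf.DeltaAngle wraps the difference to [-180, 180] degrees.
        float delta = Mathf.DeltaAngle(yawPerFrame[startFrame], yawPerFrame[startFrame + 9]);
        if (delta > thresholdDeg) return TurnClass.Right;
        if (delta < -thresholdDeg) return TurnClass.Left;
        return TurnClass.None;
    }
}
```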

Figure 16.

Gaze during walking compared to gaze during standing. (A) Scatter plot of the two gaze classes (orange: walking; blue: standing). (B) Difference heat map: (gazes while standing) minus (gazes while walking).
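
The two classes can be separated by thresholding the player's horizontal speed between frames. A minimal sketch; the speed threshold is an assumption.

```csharp
using UnityEngine;

// Sketch of walking/standing classification (illustrative, not the authors' code):
// label a frame as walking if the player's horizontal speed exceeds a threshold.
public static class MotionClassifier
{
    public static bool IsWalking(Vector3 previousPos, Vector3 currentPos,
                                 float deltaTime, float speedThreshold = 0.1f)
    {
        Vector3 delta = currentPos - previousPos;
        delta.y = 0f;                              // ignore vertical jitter
        float speed = delta.magnitude / deltaTime; // units (≈ meters) per second
        return speed > speedThreshold;             // threshold value is an assumption
    }
}
```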
