Eye Tracking in Virtual Reality
Viviane Clay et al. J Eye Mov Res. 2019.
Abstract
The intent of this paper is to provide an introduction to the burgeoning field of eye tracking in Virtual Reality (VR). VR itself is an emerging consumer technology that will create many new opportunities in research: it offers a lab environment with high immersion and close alignment with reality. An experiment conducted in VR takes place in a highly controlled environment and allows more detailed information to be gathered about a subject's actions. Eye tracking was introduced more than a century ago and is now an established technique in psychological experiments, and recent developments have made it versatile and affordable. In combination, these two techniques allow unprecedented monitoring and control of human behavior under semi-realistic conditions. This paper explores the methods and tools that can be applied when implementing experiments using eye tracking in VR, following the example of one case study. Accompanying the technical descriptions, we present research that demonstrates the effectiveness of the technology and shows what kind of results can be obtained when using eye tracking in VR. It is meant to guide the reader through the process of bringing the combination of VR and eye tracking into the lab and to inspire ideas for new experiments.
Keywords: Eye movement; VR; eye tracking; gaze; region of interest; smooth pursuit; virtual reality.
Conflict of interest statement
The author(s) declare(s) that the contents of the article are in agreement with the ethics described in http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html and that there is no conflict of interest regarding the publication of this paper.
Figures
Figure 1.
Our HTC Vive setup. (A) Participant sitting in a swivel chair during a session. Walking in the virtual world is done with the controller in our experiment. (B) The flexible cable management system keeps the cables from tangling up and getting in the way of the subject during the session. (C) Pupil Labs Eye-Tracker inside the HTC Vive headset. Image source: (21.10.2017)
Figure 2.
(Left) In the real world, vergence distance equals focal distance. (Right) In VR, the focal distance always stays the same while the vergence distance varies with the fixated object.
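As an illustration of the vergence-accommodation mismatch the caption describes (a sketch, not the authors' code; the interpupillary distance and the fixed optical focal distance are assumed values), the vergence angle for both eyes fixating a point at distance d is 2·atan(IPD / 2d), while a headset's accommodation demand stays constant:

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    """Vergence angle (degrees) when both eyes fixate a point at distance_m.
    ipd_m is the interpupillary distance; 0.063 m is an assumed average."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# In the real world, vergence and focal (accommodation) distance co-vary.
# In a headset, the focal distance is fixed by the optics (e.g. ~2 m is a
# commonly assumed value), no matter how near the virtual fixation point is.
for d in (0.5, 1.0, 2.0, 10.0):
    print(f"fixation at {d:4.1f} m -> vergence {vergence_angle_deg(d):5.2f} deg")
```

Note how the vergence demand changes by several degrees between near and far fixations while the focal demand in the headset does not change at all; that discrepancy is the mismatch sketched in the figure.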
Figure 3.
Ray cast (yellow) inside Unity going from the player to the reference object. The first intersection with an object is taken as the hit point. Colored spheres visualize previous hit points of the player's gaze.
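The first-hit logic of such a gaze ray cast can be sketched outside Unity as well. The following is a minimal geometric analogue in Python with spheres as stand-in scene objects; it is not the Unity `Physics.Raycast` API, whose colliders and layer masks are richer:

```python
import math
from typing import Optional, Tuple

def ray_sphere_t(origin, direction, center, radius) -> Optional[float]:
    """Smallest positive ray parameter t where the ray hits the sphere, or None.
    The direction vector is assumed to be unit length (so the quadratic's a = 1)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * sum(d * o for d, o in zip(direction, (ox, oy, oz)))
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-9 else None  # ignore hits behind the origin

def first_hit(origin, direction, spheres) -> Optional[Tuple[float, float, float]]:
    """Take the nearest intersection along the gaze ray as the hit point,
    mirroring how the first intersection is used in the figure."""
    best = None
    for center, radius in spheres:
        t = ray_sphere_t(origin, direction, center, radius)
        if t is not None and (best is None or t < best):
            best = t
    if best is None:
        return None
    return tuple(origin[i] + best * direction[i] for i in range(3))

# Player at the origin gazing along +z; the nearer object occludes the farther one.
scene = [((0, 0, 5), 1.0), ((0, 0, 12), 1.0)]
print(first_hit((0, 0, 0), (0, 0, 1), scene))  # -> (0.0, 0.0, 4.0)
```

Logging such hit points per frame, together with the identity of the object hit, is what yields the gaze-on-object timelines and hit-point maps shown in the later figures.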
Figure 4.
Seahaven - Surrounded by an ocean, the subject is forced to stay in the city. Many different houses and streets can be explored.
Figure 5.
Design of the task. Two statements had to be rated for each picture; here the familiarity rating is shown. After the response, the same picture is shown again with the navigation question.
Figure 6.
Points used for Calibration (black) and Validation (blue). The fixation point is shown in one of the locations at a time.
Figure 7.
Size relations in Unity. (A) A 10×2×1 unit cube. (B) A 1×2×1 unit cube standing next to a bus stop; this is also the size of the player.
Figure 8.
(A) Map of a single subject's walking path during a 30-minute session. (B) Map with walking path of 27 subjects. (C) Number of subjects that visited a certain area of the city.
Figure 9.
Excerpt from the timeline of a full session. Houses are shown on the x-axis (names are numbers between 001 and 200 plus a rotation of 0, 3, or 6). Yellow blocks represent timespans during which a certain house was looked at. Orange arrows point to the house on the map.
Figure 10.
Visualization of gaze points from one subject. Spheres represent hit points of the gaze vector, color-coded by the distance from which they were looked at during the session (far → close = red → blue).
Figure 11.
Houses that were looked at for a longer time had an overall higher familiarity (and navigation, not depicted) rating.
Figure 12.
(Top) Average distance from which a house was seen. (Bottom) Average variance in distances.
Figure 13.
3D heat map of one subject. Points are colored according to the number of close neighbors of a point and thereby code the density of points in this area (yellow = high density, blue = low density). The close-far axis shows log(distance). (Top) Original gazes show a concentration at the center of the visual field at a medium distance, with maximum density at a distance of ln(23) = 3.1. (Bottom) Gaze vectors randomly distributed in the x and y plane show the intrinsic properties of the VR environment.
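The neighbor-count coloring used for such a heat map can be sketched as follows (a brute-force illustration, not the authors' code; the neighborhood radius is an assumed parameter, and a spatial index would be used for large point clouds):

```python
import math

def density_scores(points, radius=1.0):
    """For each 3D point, count the other points within `radius`.
    The counts serve as the per-point density values that the heat map
    maps to colors (high count = yellow, low count = blue)."""
    scores = []
    for i, p in enumerate(points):
        n = sum(
            1
            for j, q in enumerate(points)
            if i != j and math.dist(p, q) <= radius
        )
        scores.append(n)
    return scores

# A dense cluster near the origin versus one isolated point:
pts = [(0, 0, 0), (0.2, 0, 0), (0, 0.3, 0), (5, 5, 5)]
print(density_scores(pts))  # -> [2, 2, 2, 0]
```

Comparing these densities for real gaze hit points against those for shuffled gaze vectors (as in the bottom panel and the next figure) separates genuine gaze behavior from biases intrinsic to the environment's geometry.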
Figure 14.
Normal heat map: heat map of hit points resulting from shuffled gaze vectors (one subject). It shows a systematic bias toward larger distances.
Figure 15.
Sample data of two subjects. (A) Heat map of gazes while the subject is not turning. (B) Gazes in 10-frame time windows while the subject is making a right turn > 20 degrees. (C) Gazes in 10-frame time windows while the subject is making a left turn > 20 degrees.
Figure 16.
Gaze during walking compared to gaze during standing. (A) Scatterplot of the two gaze classes (Orange - walking, Blue - standing). (B) Heat map of (gazes while standing) – (gazes while walking).
Similar articles
- Virtual Reality Analgesia With Interactive Eye Tracking During Brief Thermal Pain Stimuli: A Randomized Controlled Trial (Crossover Design). Al-Ghamdi NA, Meyer WJ 3rd, Atzori B, Alhalabi W, Seibel CC, Ullman D, Hoffman HG. Front Hum Neurosci. 2020 Jan 23;13:467. doi: 10.3389/fnhum.2019.00467. PMID: 32038200. Free PMC article.
- Eye Tracking in Virtual Reality. Anderson NC, Bischof WF, Kingstone A. Curr Top Behav Neurosci. 2023;65:73-100. doi: 10.1007/7854_2022_409. PMID: 36710302.
- An eye tracking based virtual reality system for use inside magnetic resonance imaging systems. Qian K, Arichi T, Price A, Dall'Orso S, Eden J, Noh Y, Rhode K, Burdet E, Neil M, Edwards AD, Hajnal JV. Sci Rep. 2021 Aug 11;11(1):16301. doi: 10.1038/s41598-021-95634-y. PMID: 34381099. Free PMC article.
- A review of virtual reality technologies in the field of communication disability: implications for practice and research. Bryant L, Brunner M, Hemsley B. Disabil Rehabil Assist Technol. 2020 May;15(4):365-372. doi: 10.1080/17483107.2018.1549276. PMID: 30638092. Review.
- The impact of perception and presence on emotional reactions: a review of research in virtual reality. Diemer J, Alpers GW, Peperkorn HM, Shiban Y, Mühlberger A. Front Psychol. 2015 Jan 30;6:26. doi: 10.3389/fpsyg.2015.00026. PMID: 25688218. Free PMC article. Review.
Cited by
- Let's get it started: Eye Tracking in VR with the Pupil Labs Eye Tracking Add-On for the HTC Vive. Josupeit J. J Eye Mov Res. 2023 Jun 19;15(3):10.16910/jemr.15.3.10. doi: 10.16910/jemr.15.3.10. PMID: 39139654. Free PMC article.
- A systematic performance comparison of two Smooth Pursuit detection algorithms in Virtual Reality depending on target number, distance, and movement patterns. Freytag SC, Zechner R, Kamps M. J Eye Mov Res. 2023 May 29;15(3):10.16910/jemr.15.3.9. doi: 10.16910/jemr.15.3.9. PMID: 39139653. Free PMC article.
- Animated VR and 360-degree VR to assess and train team sports decision-making: a scoping review. Jia Y, Zhou X, Yang J, Fu Q. Front Psychol. 2024 Jul 15;15:1410132. doi: 10.3389/fpsyg.2024.1410132. PMID: 39077210. Free PMC article.
- Wearable Near-Eye Tracking Technologies for Health: A Review. Zhu L, Chen J, Yang H, Zhou X, Gao Q, Loureiro R, Gao S, Zhao H. Bioengineering (Basel). 2024 Jul 22;11(7):738. doi: 10.3390/bioengineering11070738. PMID: 39061820. Free PMC article. Review.
- Immersive virtual reality for interdisciplinary trauma management - initial evaluation of a training tool prototype. Hanke LI, Vradelis L, Boedecker C, Griesinger J, Demare T, Lindemann NR, Huettl F, Chheang V, Saalfeld P, Wachter N, Wollstädter J, Spranz M, Lang H, Hansen C, Huber T. BMC Med Educ. 2024 Jul 18;24(1):769. doi: 10.1186/s12909-024-05764-w. PMID: 39026193. Free PMC article.