Abstract |
---|
Behavior-based authentication methods are actively being developed for XR. In particular, gaze-based methods promise continuous authentication of remote users. However, gaze behavior depends on the task being performed, and identification rates are typically highest when comparing data from the same task. In this study, we compared authentication performance using VR gaze data collected during random dot viewing, 360-degree image viewing, and a nuclear training simulation. We found that within-task authentication performed best for image viewing (72%). The implication for practitioners is to integrate image viewing into a VR workflow to collect gaze data that is viable for authentication. |
Year | DOI | Venue
---|---|---
2022 | 10.1109/VRW55335.2022.00223 | 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2022)

Keywords | DocType | Citations
---|---|---
Eye Tracking, Virtual Reality, Authentication, Future of Work | Conference | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 5
Name | Order | Citations | PageRank |
---|---|---|---
Karina LaRubbio | 1 | 0 | 0.34 |
Jeremiah Wright | 2 | 0 | 0.34 |
Brendan David-John | 3 | 0 | 0.68 |
Andreas Enqvist | 4 | 0 | 0.34 |
Eakta Jain | 5 | 0 | 0.34 |