Using Eye Gaze to Differentiate Internal Feelings of Familiarity in Virtual Reality Environments: Challenges and Opportunities

EasyChair Preprint 15269
6 pages • Date: October 18, 2024

Abstract

Our group previously reported a feasible approach to detect the internal state of familiarity with eye-gaze features [1]. Utilizing an existing paradigm [2], we examined participants’ feelings of familiarity during immersion within virtual reality (VR) scenes, some of which had had their spatial layout familiarized through prior presentation of a different scene with the same configuration. While immersed in a test scene, participants indicated the onset of familiarity via a button press on a handheld controller, then verbally reported whether or not they could state the source of the familiarity. A potential issue is that machine learning models may have detected eye-gaze features reflecting the act of pressing the button rather than features associated with the internal state of familiarity. Although in [1] we addressed this challenge by including a buffer period between the button press and the window of data used for model training, it remains uncertain within what time frame features associated with the button press may persist. Here, we introduce an approach for potentially overcoming the confounding effects of the button press by holding it constant. We examine machine learning models’ ability to detect whether a scene’s layout had been experimentally familiarized, using only instances where subjective familiarity was reported. We then repeat this method for instances where no familiarity was reported. Finally, we examine experimentally familiarized scenes where familiarity was reported to detect recall-success vs. recall-failure for the familiarity’s source.

Keyphrases: Intelligent Virtual Tutoring Systems, Internal State Detection, Virtual Reality, familiarity, machine learning
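To make the "hold the button press constant" idea concrete, the sketch below illustrates one way the classification could be set up: keep only trials in which subjective familiarity was reported (so every instance shares the same button-press context) and train a model to predict whether the scene's layout had been experimentally familiarized. This is a minimal, hypothetical illustration, not the authors' pipeline; the feature set, classifier choice, and random placeholder data are assumptions made for the example.

```python
# Hypothetical sketch: classify experimentally-familiarized vs. novel layouts
# using only trials where subjective familiarity was reported, so every
# instance shares the same button-press context.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: rows are trials, columns are eye-gaze features
# (e.g., fixation duration, saccade amplitude, pupil-diameter statistics),
# computed from a window that ends a buffer period before the button press.
n_trials, n_features = 120, 12
X = rng.normal(size=(n_trials, n_features))                 # gaze features per trial
y_familiarized = rng.integers(0, 2, size=n_trials)          # 1 = layout was pre-exposed
reported = rng.integers(0, 2, size=n_trials).astype(bool)   # participant pressed the button

# Hold the button press constant: keep only familiarity-reported trials.
X_sub, y_sub = X[reported], y_familiarized[reported]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X_sub, y_sub, cv=5)
print(f"Mean CV accuracy (reported-familiarity trials only): {scores.mean():.2f}")
```

The same structure could be reused for the other two analyses described in the abstract: repeating the classification on trials with no reported familiarity, or relabeling the familiarity-reported, experimentally familiarized trials as recall-success vs. recall-failure.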