Published on 11.Nov.2025 in Vol 27 (2025)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/75327, first published .
Wearable Augmented Reality for Nystagmus Examination in Patients With Vertigo: Randomized Crossover Usability Study


Research Letter

1Department of Otolaryngology, Kaohsiung Chang Gung Memorial Hospital, Kaohsiung, Taiwan

2School of Traditional Chinese Medicine, College of Medicine, Chang Gung University, Taoyuan, Taiwan

3Department of Otolaryngology, Kaohsiung Municipal Ta-Tung Hospital, Kaohsiung, Taiwan

4Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan, Taiwan

5Graduate Institute of Clinical Medical Sciences, College of Medicine, Chang Gung University, Taoyuan, Taiwan

6School of Medicine, College of Medicine, National Sun Yat-sen University, Kaohsiung, Taiwan

7Department of Electronic Engineering, National Kaohsiung University of Science and Technology, Kaohsiung, Taiwan

Corresponding Author:

Ming-Che Chen, Prof Dr

Department of Electronic Engineering

Southern Taiwan University of Science and Technology

1 Nantai St

Yungkang District

Tainan, 710301

Taiwan

Phone: 886 913661700

Email: jerryhata@stust.edu.tw




Vertigo commonly arises from benign vestibular dysfunction but may also have a central cause such as stroke (in approximately 10% of cases) [1]. Nystagmus analysis is key to differentiating these disorders [2], yet conventional video-oculography (VOG) requires specialized laboratories and personnel, limiting access [3]. We developed a wearable augmented reality (AR)–based system delivering standardized oculomotor stimuli with real-time eye tracking. This study reports its design and preliminary validation.


Overview

This feasibility study evaluated the usability, accuracy, and tolerability of a wearable AR-based nystagmus system in a hospital clinical setting. The system integrated hardware and software to simulate conventional oculomotor testing with real-time eye tracking and automated data processing.

For comparison, the VNG Ulmer system (Synapsys; Multimedia Appendix 1) performs 6 standardized tests of 3 stimulus types: (1) gaze-evoked nystagmus at ±15° (60 s total; horizontal/vertical axes), (2) saccades with fixed displacements every 4 seconds over 30 seconds (8-9 trials; horizontal/vertical axes), and (3) smooth pursuit at 0.25 Hz for 30 seconds (7-8 cycles; horizontal/vertical axes) [4].
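As an illustration of the pursuit protocol above, the target trajectory can be modeled as a sinusoid at the stated 0.25 Hz over 30 seconds. This is a minimal sketch, not the system's actual stimulus code; the 15° amplitude is a hypothetical value, since the source specifies only frequency and duration for pursuit.

```python
import math

PURSUIT_FREQ_HZ = 0.25  # from the protocol: 0.25 Hz smooth pursuit
DURATION_S = 30.0       # 30-second test
AMPLITUDE_DEG = 15.0    # hypothetical amplitude; not specified in the source

def pursuit_target(t: float) -> float:
    """Horizontal target position (degrees) at time t seconds."""
    return AMPLITUDE_DEG * math.sin(2 * math.pi * PURSUIT_FREQ_HZ * t)

# 0.25 Hz over 30 s yields 7.5 cycles, consistent with the reported 7-8 cycles.
n_cycles = PURSUIT_FREQ_HZ * DURATION_S
```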

The wearable AR system (Multimedia Appendix 2) comprised J7EF Gaze smart glasses, an Android-based portable device (APD), and a back-end platform. The structural components included dual Si-OLED displays, a 30 Hz infrared eye-tracking sensor, and an optional magnetic light shield to replicate Frenzel goggles (Multimedia Appendix 3). The APD, connected via USB Type-C, ran Unity 3D software to generate a virtual 1-meter display.

In-house software delivered 6 standardized stimuli (fixation, saccades, and smooth pursuit in the horizontal and vertical axes), consistent with the VOG protocol. Real-time gaze data were transmitted via Wi-Fi for secure storage and automated analysis. This setup reproduced conventional vestibular assessments while enabling portable, automated nystagmus evaluation (Figure 1).

Figure 1. This figure illustrates the architecture of the wearable augmented reality–based nystagmus examination system. The J7EF Gaze smart glasses incorporate near-eye displays and infrared eye-tracking sensors to present standardized visual stimuli and capture real-time gaze positions. A virtual screen simulates a 1-meter viewing distance, displaying moving light dots for oculomotor assessment. Data are processed on an Android portable device and transmitted via Wi-Fi to a back-end information platform for visualization and further analysis.

Participants and Study Procedures

Nine patients with vertigo were enrolled (October 2024 to January 2025); 8 completed both AR and VOG examinations in a randomized crossover design with a 30-minute washout (Figure 2). After each examination, participants rated discomfort using a visual analog scale (VAS; range 0-10). All waveform outputs were pooled and blindly interpreted by a board-certified otologist. The primary outcome was diagnostic concordance, quantified as percentage agreement [5]. Secondary outcomes were VAS scores and diagnostic performance metrics (accuracy, sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]), each reported with 95% CIs (Clopper-Pearson exact method).
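The Clopper-Pearson exact intervals used for the secondary outcomes can be reproduced by inverting the binomial CDF. The sketch below is illustrative (not the authors' analysis code), using only the Python standard library and bisection in place of a beta-quantile routine.

```python
from math import comb

def binom_sf(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def binom_cdf(k: int, n: int, p: float) -> float:
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(0, k + 1))

def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact two-sided 95% CI for a binomial proportion k/n."""
    def bisect(f, lo, hi, tol=1e-9):
        # f must be increasing on [lo, hi] with a sign change
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if f(mid) > 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2
    lower = 0.0 if k == 0 else bisect(lambda p: binom_sf(k, n, p) - alpha / 2, 0.0, 1.0)
    upper = 1.0 if k == n else bisect(lambda p: alpha / 2 - binom_cdf(k, n, p), 0.0, 1.0)
    return lower, upper
```

For example, `clopper_pearson(9, 11)` returns bounds matching the sensitivity interval reported below (approximately 48.2% to 97.7%).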

Figure 2. This figure illustrates the workflow of the augmented reality–based oculomotor examination, including software-generated visual stimuli and real-time eye movement tracking. The light dot, white stripe, and gaze trajectories are displayed on the back-end information platform, enabling comparative analysis of expected vs actual eye movement responses.

Ethical Considerations

This study was approved by the institutional review board of Kaohsiung Chang Gung Memorial Hospital (202202194B0C501). All participants provided written informed consent. Data were anonymized, encrypted, and securely stored. No compensation was given.


One participant with prior cataract surgery could not calibrate the AR glasses, leaving 8 valid cases (mean age 60.4, SD 8.9 years; range 46-72). No significant discomfort or adverse effects were reported (Multimedia Appendix 4).

A total of 48 oculomotor data points were analyzed (saccades, pursuit, and gaze fixation). Agreement rates between AR and VOG ranged from 62.5% to 87.5% (Multimedia Appendix 5). Overall diagnostic accuracy was 77.1%, with sensitivity 81.8% (9/11; 95% CI 48.2%-97.7%), specificity 75.7% (28/37; 95% CI 58.8%-88.2%), PPV 50.0% (9/18; 95% CI 26.0%-73.9%), and NPV 93.3% (28/30; 95% CI 77.9%-99.2%) (Multimedia Appendix 6). For central pathology, sensitivity reached 83.3% and specificity 100%. VAS scores did not differ significantly between AR and VOG, supporting the system's tolerability.
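The reported fractions imply the confusion-matrix counts TP=9, FP=9, FN=2, and TN=28, from which each headline metric follows directly. A quick arithmetic check:

```python
# Counts recovered from the reported fractions: 9/11, 28/37, 9/18, 28/30
TP, FP, FN, TN = 9, 9, 2, 28

sensitivity = TP / (TP + FN)                    # 9/11  ~ 81.8%
specificity = TN / (TN + FP)                    # 28/37 ~ 75.7%
ppv         = TP / (TP + FP)                    # 9/18  = 50.0%
npv         = TN / (TN + FN)                    # 28/30 ~ 93.3%
accuracy    = (TP + TN) / (TP + FP + FN + TN)   # 37/48 ~ 77.1%
```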


This study demonstrates the feasibility of wearable AR glasses for nystagmus examination, showing diagnostic consistency comparable to VOG, particularly in ruling out central abnormalities. Results align with recent AR-based HINTS (head impulse, nystagmus, test of skew) assessments using head-mounted devices [6] and smartphone nystagmus apps with approximately 82% sensitivity [7].

Although enrollment spanned 4 months, only 8 participants completed paired assessments due to the 2- to 3-month wait for conventional VOG. This explains the small sample and illustrates the clinical bottleneck the AR system seeks to address. Patient tolerance was favorable, with no significant discomfort, consistent with prior AR studies [8]. One participant with prior cataract surgery could not be calibrated, likely due to altered ocular optics, underscoring the need for adaptive algorithms [9]. Unlike stationary VOG laboratories, wearable AR systems are portable and deployable in clinics, emergency care, or telemedicine, enabling point-of-care testing without specialized infrastructure [10]. With automated guidance, real-time tracking, and potential artificial intelligence integration, they may reduce reliance on experts and support decision-making.

Limitations include the small sample, yielding wide CIs for sensitivity (81.8%; 48.2%-97.7%) and specificity (75.7%; 58.8%-88.2%), reducing precision. The moderate PPV (50.0%; 26.0%-73.9%) highlights the need for improved processing. Diagnostic concordance was measured by percent agreement, which does not adjust for chance; future studies should apply Cohen κ and multiple raters. Finally, as a single-center pilot with one clinician, multicenter validation is required to confirm generalizability.
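As the limitation above notes, percent agreement can be inflated by chance, whereas Cohen κ subtracts the agreement expected if the two methods labeled cases independently. A minimal sketch of the two statistics (the labels below are illustrative, not the study data):

```python
def percent_agreement(a: list, b: list) -> float:
    """Fraction of cases where the two methods assign the same label."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a: list, b: list) -> float:
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = percent_agreement(a, b)
    labels = set(a) | set(b)
    # Expected agreement if the two methods' label frequencies were independent
    p_e = sum((a.count(lab) / n) * (b.count(lab) / n) for lab in labels)
    return (p_o - p_e) / (1 - p_e)

# Illustrative only: "central" vs "peripheral" calls by two methods
ar  = ["central", "central", "peripheral", "peripheral"]
vog = ["central", "peripheral", "peripheral", "peripheral"]
```

Here `percent_agreement(ar, vog)` is 0.75 while `cohens_kappa(ar, vog)` is only 0.5, showing how κ discounts chance-level matches.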

Acknowledgments

This study was supported by a grant from the National Science and Technology Council, Taiwan (112-2314-B-182A-094-MY3). The funders had no role in the study design, data collection, data analysis, decision to publish, or manuscript preparation. We thank the Biostatistics Center at Kaohsiung Chang Gung Memorial Hospital for assistance with the study design and statistical analysis. Portions of the manuscript were edited for grammar and style using OpenAI's ChatGPT (GPT-5); all scientific content and interpretation were reviewed and approved by the authors.

Data Availability

All deidentified data generated or analyzed during this study are included in this published article and its supplementary files.

Authors' Contributions

CNW contributed to conceptualization, project administration, formal analysis, writing—original draft, and data curation. CYC contributed to methodology and technical support. HHC contributed to data curation and technical support. SDL contributed to investigation. CFH contributed to validation. WJC contributed to conceptualization. MCC contributed to conceptualization, resources, supervision, and writing—review and editing.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Full conventional video-oculography protocol, including test parameters and classification criteria.

DOCX File , 22 KB

Multimedia Appendix 2

Detailed system design and hardware/software specifications.

DOCX File , 23 KB

Multimedia Appendix 3

Structural components of the J7EF Gaze smart glasses.

DOCX File , 140 KB

Multimedia Appendix 4

Patient characteristics and oculomotor examination results.

DOCX File , 23 KB

Multimedia Appendix 5

Heatmap of agreement rates between augmented reality and video-oculography signals.

DOCX File , 4694 KB

Multimedia Appendix 6

Confusion matrix of augmented reality–based vs video-oculography–based classifications of suspected central vestibular pathology.

DOCX File , 20 KB

  1. Lui F, Foris L, Tadi P. Central Vertigo. Treasure Island, FL. StatPearls Publishing; 2025.
  2. Saha K. Vertigo related to central nervous system disorders. Continuum (Minneap Minn). Apr 01, 2021;27(2):447-467. [CrossRef] [Medline]
  3. Winnick A, Chen C-C, Chang T-P, Kuo Y-H, Wang C-F, Huang C-H, et al. Automated nystagmus detection: accuracy of slow-phase and quick-phase algorithms to determine the presence of nystagmus. J Neurol Sci. Nov 15, 2022;442:120392. [CrossRef] [Medline]
  4. Wu C, Luo S, Chen S, Huang C, Chiang P, Hwang C, et al. Applicability of oculomotor tests for predicting central vestibular disorder using principal component analysis. J Pers Med. Feb 02, 2022;12(2):203. [FREE Full text] [CrossRef] [Medline]
  5. McHugh ML. Interrater reliability: the kappa statistic. Biochem Med (Zagreb). 2012;22(3):276-282. [FREE Full text] [Medline]
  6. Sadok N, Luijten G, Bahnsen FH, Gsaxner C, Peters L, Eichler T, et al. Performing the HINTS-exam using a mixed-reality head-mounted display in patients with acute vestibular syndrome: a feasibility study. Front Neurol. 2025;16:1576959. [FREE Full text] [CrossRef] [Medline]
  7. van Bonn SM, Behrendt SP, Pawar BL, Schraven SP, Mlynski R, Schuldt T. Smartphone-based nystagmus diagnostics: development of an innovative app for the targeted detection of vertigo. Eur Arch Otorhinolaryngol. Dec 2022;279(12):5565-5571. [FREE Full text] [CrossRef] [Medline]
  8. Zhang J, Che X, Chang E, Qu C, Di X, Liu H, et al. How different text display patterns affect cybersickness in augmented reality. Sci Rep. May 22, 2024;14(1):11693. [FREE Full text] [CrossRef] [Medline]
  9. Krösl K, Elvezio C, Luidolt L, Hürbe M, Karst S, Feiner S, et al. CatARact: simulating cataracts in augmented reality. In: IEEE International Symposium on Mixed and Augmented Reality. 2020. Presented at: ISMAR; November 9-13, 2020; Porto de Galinhas, Brazil. [CrossRef]
  10. Wu C, Luo S, Lin H, Huang J, Lee C, Liu S, et al. Eligibility for live, interactive otolaryngology telemedicine: 19-month experience before and during the COVID-19 pandemic in Taiwan. Biomed J. Oct 2021;44(5):582-588. [FREE Full text] [CrossRef] [Medline]


APD: Android-based portable device
AR: augmented reality
HINTS: head impulse, nystagmus, test of skew
NPV: negative predictive value
PPV: positive predictive value
VAS: visual analog scale
VOG: video-oculography


Edited by A Coristine; submitted 01.Apr.2025; peer-reviewed by C-F Wang, A Hungbo; comments to author 11.Jun.2025; revised version received 28.Jun.2025; accepted 08.Sep.2025; published 11.Nov.2025.

Copyright

©Ching-Nung Wu, Ming-Che Chen, Chien-Yan Chien, Hsiang-Han Chang, Sheng-Dean Luo, Chung-Feng Hwang, Wan-Jung Chang. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 11.Nov.2025.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.