Research Letter
Abstract
We demonstrate the feasibility of a wearable, augmented reality–based nystagmus examination system, showing its preliminary diagnostic agreement with conventional video-oculography and its potential for portable vestibular assessment in patients with vertigo.
J Med Internet Res 2025;27:e75327. doi: 10.2196/75327
Introduction
Vertigo commonly arises from benign vestibular dysfunction but may also have a central cause such as stroke (in approximately 10% of cases) []. Nystagmus analysis is key to differentiating these disorders [], yet conventional video-oculography (VOG) requires specialized laboratories and personnel, limiting access []. We developed a wearable augmented reality (AR)–based system delivering standardized oculomotor stimuli with real-time eye tracking. This study reports its design and preliminary validation.
Methods
Overview
This feasibility study evaluated the usability, accuracy, and tolerability of a wearable AR-based nystagmus system in a hospital clinical setting. The system integrated hardware and software to simulate conventional oculomotor testing with real-time eye tracking and automated data processing.
For comparison, the VNG Ulmer system (Synapsys) performs 6 standardized tests of 3 stimulus types: (1) gaze-evoked nystagmus at ±15° (60 s total; horizontal/vertical axes), (2) saccades with fixed displacements every 4 seconds over 30 seconds (8-9 trials; horizontal/vertical axes), and (3) smooth pursuit at 0.25 Hz for 30 seconds (7-8 cycles; horizontal/vertical axes) [].
The wearable AR system () comprised J7EF Gaze smart glasses, an Android-based portable device (APD), and a back-end platform. The structural components included dual Si-OLED displays, a 30 Hz infrared eye-tracking sensor, and an optional magnetic light shield to replicate Frenzel goggles (). The APD, connected via USB Type-C, ran Unity 3D software to generate a virtual 1-meter display.
In-house software delivered 6 standardized stimuli (fixation, saccades, and smooth pursuit in the horizontal and vertical axes), consistent with the VOG protocol. Real-time gaze data were transmitted via Wi-Fi for secure storage and automated analysis. This setup reproduced conventional vestibular assessments while enabling portable, automated nystagmus evaluation ().

Participants and Study Procedures
Nine patients with vertigo were enrolled (October 2024 to January 2025); 8 completed both AR and VOG examinations in a randomized crossover design with a 30-minute washout (). After each examination, participants rated discomfort using a visual analog scale (VAS; range 0-10). All waveform outputs were pooled and blindly interpreted by a board-certified otologist. The primary outcome was diagnostic concordance, quantified as percentage agreement []. Secondary outcomes were VAS scores and diagnostic performance metrics (accuracy, sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]), each reported with 95% CIs (Clopper-Pearson exact method).

Ethical Considerations
This study was approved by the institutional review board of Kaohsiung Chang Gung Memorial Hospital (202202194B0C501). All participants provided written informed consent. Data were anonymized, encrypted, and securely stored. No compensation was given.
Results
One participant with prior cataract surgery could not calibrate the AR glasses, leaving 8 valid cases (mean age 60.4, SD 8.9 years; range 46-72). No significant discomfort or adverse effects were reported ().
A total of 48 oculomotor data points were analyzed (saccades, pursuit, and gaze fixation). Agreement rates between AR and VOG ranged from 62.5% to 87.5% (). Overall diagnostic accuracy was 77.1%, with sensitivity 81.8% (9/11; 95% CI 48.2%-97.7%), specificity 75.7% (28/37; 95% CI 58.8%-88.2%), PPV 50.0% (9/18; 95% CI 26.0%-73.9%), and NPV 93.3% (28/30; 95% CI 77.9%-99.2%) (). For central pathology, sensitivity reached 83.3% and specificity 100%. VAS scores did not differ significantly between AR and VOG, confirming tolerability.
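As an illustration, the reported performance metrics can be reproduced from the underlying 2×2 counts (TP=9, FN=2, FP=9, TN=28, implied by the fractions above), with the exact Clopper-Pearson intervals obtained by bisection on the binomial CDF. This is a stdlib-only sketch of the standard calculations, not the authors' analysis code:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided 95% CI for k successes in n trials,
    located by bisection, since the binomial CDF is monotone in p."""
    def bisect(flips_true, lo=0.0, hi=1.0, iters=60):
        for _ in range(iters):
            mid = (lo + hi) / 2
            if flips_true(mid):
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2
    # lower bound: smallest p with P(X >= k | p) >= alpha/2
    lower = 0.0 if k == 0 else bisect(lambda p: 1 - binom_cdf(k - 1, n, p) >= alpha / 2)
    # upper bound: p where P(X <= k | p) falls to alpha/2
    upper = 1.0 if k == n else bisect(lambda p: binom_cdf(k, n, p) < alpha / 2)
    return lower, upper

# 2x2 counts implied by the reported fractions (positive = suspected pathology)
tp, fn, fp, tn = 9, 2, 9, 28
n_total = tp + fn + fp + tn                 # 48 oculomotor data points

sensitivity = tp / (tp + fn)                # 9/11  ~ 81.8%
specificity = tn / (tn + fp)                # 28/37 ~ 75.7%
ppv = tp / (tp + fp)                        # 9/18  = 50.0%
npv = tn / (tn + fn)                        # 28/30 ~ 93.3%
accuracy = (tp + tn) / n_total              # 37/48 ~ 77.1%

lo, hi = clopper_pearson(tp, tp + fn)       # CI for sensitivity
print(f"sensitivity {100*sensitivity:.1f}% (95% CI {100*lo:.1f}%-{100*hi:.1f}%)")
```

Running this reproduces the sensitivity interval reported above (≈48.2%-97.7%); the same `clopper_pearson` call applied to the other fractions yields the remaining CIs.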
Discussion
This study demonstrates the feasibility of wearable AR glasses for nystagmus examination, showing diagnostic consistency comparable to VOG, particularly in ruling out central abnormalities. Results align with recent AR-based HINTS (head impulse, nystagmus, test of skew) assessments using head-mounted devices [] and smartphone nystagmus apps with approximately 82% sensitivity [].
Although enrollment spanned 4 months, only 8 participants completed paired assessments due to the 2-to-3-month delay for conventional VOG. This explains the small sample and illustrates the clinical bottleneck the AR system seeks to address. Patient tolerance was favorable, with no significant discomfort, consistent with prior AR studies []. One participant with prior cataract surgery could not be calibrated, likely due to altered ocular optics, underscoring the need for adaptive algorithms []. Unlike stationary VOG laboratories, wearable AR systems are portable and deployable in clinics, emergency care, or telemedicine, enabling point-of-care testing without specialized infrastructure []. With automated guidance, real-time tracking, and potential artificial intelligence integration, they may reduce reliance on experts and support decision-making.
Limitations include the small sample, yielding wide CIs for sensitivity (81.8%; 48.2%-97.7%) and specificity (75.7%; 58.8%-88.2%), reducing precision. The moderate PPV (50.0%; 26.0%-73.9%) highlights the need for improved processing. Diagnostic concordance was measured by percent agreement, which does not adjust for chance; future studies should apply Cohen κ and multiple raters. Finally, as a single-center pilot with one clinician, multicenter validation is required to confirm generalizability.
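To make the percent-agreement limitation concrete, Cohen κ can be computed from the same 2×2 counts implied by the reported metrics (TP=9, FN=2, FP=9, TN=28). This is an illustrative sketch of the chance-corrected analysis proposed for future studies, not a result from the study itself:

```python
# Cohen's kappa for the 2x2 AR-vs-VOG counts implied by the reported
# metrics; "positive" = suspected pathology on that reading.
tp, fn, fp, tn = 9, 2, 9, 28
n = tp + fn + fp + tn                        # 48 paired readings

po = (tp + tn) / n                           # observed agreement (= 77.1% accuracy)
# expected chance agreement from each method's marginal positive rate
p_ar_pos = (tp + fp) / n                     # AR calls positive:  18/48
p_vog_pos = (tp + fn) / n                    # VOG calls positive: 11/48
pe = p_ar_pos * p_vog_pos + (1 - p_ar_pos) * (1 - p_vog_pos)

kappa = (po - pe) / (1 - pe)
print(f"kappa = {kappa:.2f}")                # prints "kappa = 0.47"
```

Under these assumed counts, κ ≈ 0.47 (moderate agreement), noticeably lower than the raw 77.1% agreement, which is exactly why chance correction matters when one class dominates.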
Acknowledgments
This study was supported by a grant from the National Science and Technology Council, Taiwan (112-2314-B-182A-094-MY3). The funders had no role in the study design, data collection, data analysis, decision to publish, or manuscript preparation. We thank the Biostatistics Center at Kaohsiung Chang Gung Memorial Hospital for assistance with the study design and statistical analysis. Portions of the manuscript were edited for grammar and style using OpenAI’s ChatGPT (GPT-5); all scientific content and interpretation were reviewed and approved by the authors.
Data Availability
All deidentified data generated or analyzed during this study are included in this published article and its supplementary files.
Authors' Contributions
CNW contributed to conceptualization, project administration, formal analysis, writing—original draft, and data curation. CYC contributed to methodology and technical support. HHC contributed to data curation and technical support. SDL contributed to investigation. CFH contributed to validation. WJC contributed to conceptualization. MCC contributed to conceptualization, resources, supervision, and writing—review and editing.
Conflicts of Interest
None declared.
Full conventional video-oculography protocol, including test parameters and classification criteria. (DOCX File, 22 KB)
Detailed system design and hardware/software specifications. (DOCX File, 23 KB)
Structural components of the J7EF Gaze smart glasses. (DOCX File, 140 KB)
Patient characteristics and oculomotor examination results. (DOCX File, 23 KB)
Heatmap of agreement rates between augmented reality and video-oculography signals. (DOCX File, 4694 KB)
Confusion matrix of augmented reality–based vs video-oculography–based classifications of suspected central vestibular pathology. (DOCX File, 20 KB)
References
- Lui F, Foris L, Tadi P. Central Vertigo. Treasure Island, FL: StatPearls Publishing; 2025.
- Saha K. Vertigo related to central nervous system disorders. Continuum (Minneap Minn). Apr 01, 2021;27(2):447-467. [CrossRef] [Medline]
- Winnick A, Chen C-C, Chang T-P, Kuo Y-H, Wang C-F, Huang C-H, et al. Automated nystagmus detection: accuracy of slow-phase and quick-phase algorithms to determine the presence of nystagmus. J Neurol Sci. Nov 15, 2022;442:120392. [CrossRef] [Medline]
- Wu C, Luo S, Chen S, Huang C, Chiang P, Hwang C, et al. Applicability of oculomotor tests for predicting central vestibular disorder using principal component analysis. J Pers Med. Feb 02, 2022;12(2):203. [FREE Full text] [CrossRef] [Medline]
- McHugh ML. Interrater reliability: the kappa statistic. Biochem Med (Zagreb). 2012;22(3):276-282. [FREE Full text] [Medline]
- Sadok N, Luijten G, Bahnsen FH, Gsaxner C, Peters L, Eichler T, et al. Performing the HINTS-exam using a mixed-reality head-mounted display in patients with acute vestibular syndrome: a feasibility study. Front Neurol. 2025;16:1576959. [FREE Full text] [CrossRef] [Medline]
- van Bonn SM, Behrendt SP, Pawar BL, Schraven SP, Mlynski R, Schuldt T. Smartphone-based nystagmus diagnostics: development of an innovative app for the targeted detection of vertigo. Eur Arch Otorhinolaryngol. Dec 2022;279(12):5565-5571. [FREE Full text] [CrossRef] [Medline]
- Zhang J, Che X, Chang E, Qu C, Di X, Liu H, et al. How different text display patterns affect cybersickness in augmented reality. Sci Rep. May 22, 2024;14(1):11693. [FREE Full text] [CrossRef] [Medline]
- Krösl K, Elvezio C, Luidolt L, Hürbe M, Karst S, Feiner S, et al. CatARact: simulating cataracts in augmented reality. In: IEEE International Symposium on Mixed and Augmented Reality. 2020. Presented at: ISMAR; November 9-13, 2020; Porto de Galinhas, Brazil. [CrossRef]
- Wu C, Luo S, Lin H, Huang J, Lee C, Liu S, et al. Eligibility for live, interactive otolaryngology telemedicine: 19-month experience before and during the COVID-19 pandemic in Taiwan. Biomed J. Oct 2021;44(5):582-588. [FREE Full text] [CrossRef] [Medline]
Abbreviations
APD: Android-based portable device
AR: augmented reality
HINTS: head impulse, nystagmus, test of skew
NPV: negative predictive value
PPV: positive predictive value
VAS: visual analog scale
VOG: video-oculography
Edited by A Coristine; submitted 01.Apr.2025; peer-reviewed by C-F Wang, A Hungbo; comments to author 11.Jun.2025; revised version received 28.Jun.2025; accepted 08.Sep.2025; published 11.Nov.2025.
Copyright©Ching-Nung Wu, Ming-Che Chen, Chien-Yan Chien, Hsiang-Han Chang, Sheng-Dean Luo, Chung-Feng Hwang, Wan-Jung Chang. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 11.Nov.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.

