Original Paper
Abstract
Background: Diabetic retinopathy (DR), a common complication of diabetes mellitus, is the leading cause of impaired vision in adults worldwide. Smartphone ophthalmoscopy involves using a smartphone camera for digital retinal imaging. Utilizing smartphones to detect DR is potentially more affordable, accessible, and easier to use than conventional methods.
Objective: This study aimed to determine the diagnostic accuracy of various smartphone ophthalmoscopy approaches for detecting DR in diabetic patients.
Methods: We performed an electronic search on the Medical Literature Analysis and Retrieval System Online (MEDLINE), EMBASE, and Cochrane Library for literature published from January 2000 to November 2018. We included studies involving diabetic patients, which compared the diagnostic accuracy of smartphone ophthalmoscopy for detecting DR to an accurate or commonly employed reference standard, such as indirect ophthalmoscopy, slit-lamp biomicroscopy, and tabletop fundus photography. Two reviewers independently screened studies against the inclusion criteria, extracted data, and assessed the quality of included studies using the Quality Assessment of Diagnostic Accuracy Studies–2 tool, with disagreements resolved via consensus. Sensitivity and specificity were pooled using the random effects model. A summary receiver operating characteristic (SROC) curve was constructed. This review is reported in line with the Preferred Reporting Items for a Systematic Review and Meta-analysis of Diagnostic Test Accuracy Studies guidelines.
Results: In all, nine studies involving 1430 participants were included. Most studies were of high quality, except one study with limited applicability because of its reference standard. The pooled sensitivity and specificity were as follows: any DR, 87% (95% CI 74%-94%) and 94% (95% CI 81%-98%); mild nonproliferative DR (NPDR), 39% (95% CI 10%-79%) and 95% (95% CI 91%-98%); moderate NPDR, 71% (95% CI 57%-81%) and 95% (95% CI 88%-98%); severe NPDR, 80% (95% CI 49%-94%) and 97% (95% CI 88%-99%); proliferative DR (PDR), 92% (95% CI 79%-97%) and 99% (95% CI 96%-99%); diabetic macular edema, 79% (95% CI 63%-89%) and 93% (95% CI 82%-97%); and referral-warranted DR, 91% (95% CI 86%-94%) and 89% (95% CI 56%-98%). The area under the SROC curve ranged from 0.879 to 0.979. The diagnostic odds ratio ranged from 11.3 to 1225.
Conclusions: We found heterogeneous evidence showing that smartphone ophthalmoscopy performs well in detecting DR. The diagnostic accuracy for PDR was highest. Future studies should standardize reference criteria and classification criteria and evaluate other available forms of smartphone ophthalmoscopy in primary care settings.
doi:10.2196/16658
Keywords
Introduction
Diabetic retinopathy (DR) is the leading cause of impaired vision worldwide [
]. One in three patients with diabetes mellitus (DM) has DR [ ]. DR includes proliferative DR (PDR) and various levels of nonproliferative DR (NPDR). PDR, characterized by retinal neovascularization at the disc and elsewhere, displays signs of angiogenesis in response to retinal tissue hypoxia. Neovascularization potentially leads to preretinal and vitreous hemorrhage, resulting in visual loss and, eventually, tractional retinal detachment. It may also cause iris neovascularization with a resultant increase in intraocular pressure, eventually leading to neovascular glaucoma [ ]. Typical clinical features of NPDR include the following: (1) microaneurysms and intraretinal hemorrhages from weak capillary walls; (2) hard exudates from vascular protein leakage; and (3) cotton wool spots, caused by ischemic infarcts leading to fluid accumulation. Diabetic macular edema (DME), caused by the thickening of and fluid accumulation in the retina, can occur at any stage of DR [ ].

Diabetic eye disease is treatable. Treatments include vascular endothelial growth factor inhibitors, panretinal or focal photocoagulation, and vitrectomy [
]. Strict glycemic and blood pressure control can also delay the development of DR or reduce DR severity [ ]. Available treatments are more effective at halting or slowing visual loss than at reversing visual impairment. Yet, most patients remain asymptomatic until the advanced stages of DR. Therefore, early detection of DR before irreversible loss of visual acuity is crucial to ensure better patient outcomes [ ].

The gold standard diagnostic test for DR is the Early Treatment Diabetic Retinopathy Study (ETDRS) 7-field stereoscopic color fundus photography or fluorescein angiography [
]. However, fundus cameras are nonportable, expensive, and operator dependent, often requiring patients to sit upright [ , ]. Moreover, fluorescein angiography is invasive, costly, and associated with prominent side effects [ ]. Thus, these methods are impractical for screening in primary care or mobile settings. Other accurate [ ] and frequently employed DR identification approaches include the following: (1) ophthalmoscopy; (2) slit-lamp biomicroscopy; and (3) other forms of fundus photography [ ]. Optical coherence tomography is an emerging technology that reliably identifies DME by quantifying retinal thickness [ ], but it is expensive and bulky, and it cannot accurately grade DR severity.

Smartphone ophthalmoscopy, the use of a smartphone’s in-built camera for retinal imaging, could be a valuable method for detecting DR because of its affordability, portability, and ease of use compared with traditional approaches [
]. Various health care workers could potentially operate a smartphone-based retinal imaging device, without limiting this procedure to highly specialized staff. Images acquired by smartphones can be easily shared with and graded remotely by ophthalmologists or other trained graders via telemedicine. These benefits are particularly important in resource-constrained health care settings, such as rural areas in developing countries lacking medical equipment and trained health care professionals [ ]. Several literature reviews [ - ] have discussed smartphone retinal imaging technology and underscored the huge potential of smartphone ophthalmoscopy for detecting DR. Given the potential of this novel approach, we performed a scoping review to systematically collate and assess evidence regarding the accuracy of smartphone ophthalmoscopy for DR identification.

Methods
Reporting Guidelines
This scoping review was reported in line with the Preferred Reporting Items for a Systematic Review and Meta-analysis of Diagnostic Test Accuracy Studies (PRISMA-DTA) guidelines [
] and conducted according to the Cochrane Handbook for Systematic Reviews of Diagnostic Test Accuracy [ ]. We adopted a scoping review approach [ , ] because of our broad set of inclusion criteria. The protocol for this review was published in BMJ Open [ ]. We were unable to register this protocol with PROSPERO, as PROSPERO does not accept scoping reviews.

Search Strategy
We performed a librarian-assisted search on the Medical Literature Analysis and Retrieval System Online (MEDLINE) (Ovid), EMBASE (Ovid), and the Cochrane Library for papers published from January 2000 to November 2018. Articles published before 2000 were excluded because smartphone technology was limited before then. We used both medical subject headings (MeSH) and keywords relating to DR (eg, “diabetic retinopathy,” “macular edema,” and “diabetic maculopathy”) and to smartphones (eg, “mobile health,” “mobile phones,” and “applications”) or AI (eg, “artificial intelligence” and “machine learning”;
). We also explored the bibliographies of both primary articles and reviews to identify potentially eligible studies missed by the electronic search.

Study Selection
The inclusion criteria were as follows: (1) studies evaluating the diagnostic test accuracy of smartphone ophthalmoscopy for detecting DR in patients with type 1 or 2 DM; (2) studies utilizing a smartphone’s in-built camera for retinal imaging, including the use of any attachments externally fitted to the smartphone; (3) studies comparing smartphone ophthalmoscopy with any acceptable and commonly employed reference standard, such as fundus photography, indirect ophthalmoscopy, slit-lamp biomicroscopy, or fluorescein angiography; (4) studies employing any kind of health care professional to acquire the smartphone images. Language was not an exclusion criterion.
Examples of eligible smartphone ophthalmoscopy techniques include the following:
- Direct ophthalmoscopy: An adaptor is externally attached to a smartphone’s camera. These adaptors usually contain polarizers that reduce artifacts from corneal reflections. The arrangement of polarizers, beam-splitters, and lenses produces an annular illumination pattern.
- Indirect ophthalmoscopy: This simpler, monocular design involves a single lens (eg, 20 D condenser) placed between the smartphone camera and eye. It can be mounted on the phone via hardware or manually held in position.
Covidence software (Veritas Health Innovation, Melbourne, Australia) was used to remove duplicate studies [
]. After a pilot screening of 20 citations to calibrate the judging criteria, two reviewers independently screened all articles retrieved from the search strategy by title and abstract, using Covidence. Subsequently, we screened the full text of the remaining articles and performed data extraction using a prepiloted form. Any disagreements were resolved through consensus.

Data Collection
A data extraction form (
) was created and piloted to record the following data from each study: (1) study author and date published; (2) sample size; (3) participant characteristics (eg, age, duration and type of DM); (4) information regarding imaging techniques (eg, details about smartphones and adaptors used, image resolution); (5) health care professional performing smartphone ophthalmoscopy; (6) reference standard used; and (7) test results (eg, true positives [TP], false positives [FP], true negatives [TN], and false negatives [FN]). Corresponding authors were contacted for additional details or missing data required to construct a 2×2 table. Two reviewers independently extracted study data using a data extraction template created in Microsoft Excel, with disagreements resolved via consensus.

Quality Assessment
The Quality Assessment of Diagnostic Accuracy Studies tool, QUADAS-2, consisting of descriptions and signaling questions, was used to assess the risk of bias and applicability of all included studies in four domains pertaining to (1) patient selection, (2) index test, (3) reference standard, and (4) flow of participants through the study and timing between the index test and reference standard [
]. Two reviewers independently assessed study quality, and disagreements were resolved via discussion until a consensus was reached.

Statistical Analysis
We constructed 2×2 tables based on data from each study. The sensitivity, specificity, positive likelihood ratio (LR+), negative likelihood ratio (LR−), diagnostic odds ratio (DOR), and area under summary receiver operating characteristic (SROC) curve were calculated using a random effects model because of the high expected heterogeneity [
]. We constructed SROC curves using the bivariate model where possible. Being both a hierarchical and a random effects model, the bivariate model is preferred to the Moses-Littenberg SROC curve because it accounts for between-study heterogeneity. For SROC curves employing the bivariate model, elliptical 95% confidence regions were obtained by joining the individual confidence regions for logit sensitivity and logit specificity via parametric representations [ , ].

Heterogeneity was evaluated using the chi-square (χ²) and I² values of the likelihood ratio test (LRT) or the DOR, with I²<25%, 25%-75%, and >75% representing a low, moderate, and high degree of inconsistency, respectively. The threshold effect was measured using the Spearman correlation coefficient ρ between the logits of sensitivity and specificity, with ρ closer to −1 indicating a stronger threshold effect and a better fit of the SROC curve. If information regarding a condition’s prevalence was available from the literature, we calculated the posttest probability using the Fagan nomogram. A P<.05 was considered statistically significant. All analyses were performed using Review Manager version 5.3 from the Cochrane Collaboration [
], METANDI and MIDAS commands in Stata 15.1 (StataCorp, College Station, Texas), Meta-DiSc version 1.4 (Ramón y Cajal Hospital, Madrid, Spain) [ ], and the mada package in R.
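To make these quantities concrete, the following minimal Python sketch computes the per-study metrics from a 2×2 table, the Spearman ρ threshold-effect check on the logits of sensitivity and specificity, and Cochran's Q with I² on the log DOR. It is illustrative only: the study counts are hypothetical, a 0.5 continuity correction is assumed, and the bivariate pooling actually performed with METANDI/MIDAS and mada is not reproduced here.

```python
# Illustrative sketch only; not the authors' RevMan/Stata/Meta-DiSc/mada pipeline.
# Counts below are hypothetical (TP, FP, FN, TN), not data from the included studies.
import math
from scipy.stats import spearmanr

def metrics(tp, fp, fn, tn, cc=0.5):
    """Per-study sensitivity, specificity, LR+, LR-, DOR, and var(log DOR).

    A 0.5 continuity correction (cc) is applied to every cell to avoid division
    by zero when a cell is 0 (an assumption for this sketch, not the paper's rule).
    """
    tp, fp, fn, tn = tp + cc, fp + cc, fn + cc, tn + cc
    se = tp / (tp + fn)                       # sensitivity
    sp = tn / (tn + fp)                       # specificity
    lr_pos = se / (1 - sp)                    # positive likelihood ratio
    lr_neg = (1 - se) / sp                    # negative likelihood ratio
    dor = lr_pos / lr_neg                     # diagnostic odds ratio = (tp*tn)/(fp*fn)
    var_log_dor = 1/tp + 1/fp + 1/fn + 1/tn   # variance of log DOR
    return se, sp, lr_pos, lr_neg, dor, var_log_dor

def logit(p):
    return math.log(p / (1 - p))

studies = [(40, 6, 8, 90), (25, 3, 10, 70), (55, 10, 5, 120), (30, 2, 12, 60)]
rows = [metrics(*s) for s in studies]

# Threshold effect: Spearman rho between logit(sensitivity) and logit(specificity);
# values close to -1 suggest a threshold effect, per the convention used in this review.
rho, _ = spearmanr([logit(r[0]) for r in rows], [logit(r[1]) for r in rows])

# Heterogeneity of the DOR: Cochran's Q on log DOR with inverse-variance weights,
# and I^2 = (Q - df) / Q, truncated at 0.
log_dor = [math.log(r[4]) for r in rows]
w = [1 / r[5] for r in rows]
pooled = sum(wi * d for wi, d in zip(w, log_dor)) / sum(w)
q = sum(wi * (d - pooled) ** 2 for wi, d in zip(w, log_dor))
i2 = max(0.0, (q - (len(rows) - 1)) / q) if q > 0 else 0.0

print(f"rho = {rho:.2f}, Q = {q:.1f}, I^2 = {100 * i2:.0f}%")
```

This univariate summary is only a rough exploratory stand-in; the bivariate model used in the review jointly pools logit sensitivity and specificity with study-level random effects.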
Results

Study Selection and Study Characteristics
Our search strategy yielded 1571 unique records. Of those records, the full text for 41 articles was assessed, and nine studies [
- ] met the inclusion criteria ( ). Two [ , ] of the included studies were conference abstracts. A total of 1430 diabetic patients (at least 2743 eyes) were recruited among these studies.

All studies reported smartphone fundoscopy techniques involving mydriatic, color, and nonstereoscopic imaging (
and , ). A total of four studies originated from India, three from the United States, and one from Italy. All studies that reported data on gender recruited both males and females. The smartphone-acquired fundus photographs were graded by ophthalmologists, retinal specialists, or artificial intelligence (AI). In all, seven studies utilized direct ophthalmoscopy to acquire smartphone fundus images, while two studies [ , ] used indirect ophthalmoscopy.

A total of five studies [
, , , , ] employed slit-lamp biomicroscopy as the reference standard, of which two studies complemented the examination with dilated indirect ophthalmoscopy [ , ]. In all, two studies [ , ] utilized 7-field mydriatic fundus photography using a tabletop fundus camera; one study [ ] used traditional in-clinic diagnosis including a dilated eye examination; and one study [ ] utilized ophthalmologists’ grading of the same smartphone-acquired images as the reference standard. Overall, four studies [ , , , ] utilized the International Clinical DR Disease Severity Scale to grade DR; three studies [ , , ] employed the Airlie House or modified ETDRS criteria; and one study [ ] used the United Kingdom’s National Health Service (NHS) guidelines. Referral-warranted DR (RWDR) was defined as moderate NPDR or worse, or DME; vision-threatening DR (VTDR) as severe NPDR, PDR, or DME; and sight-threatening DR (STDR) as PDR or DME. Health professionals performing smartphone ophthalmoscopy included medical students, interns or assistants, retinal specialists, ophthalmologists, and ophthalmic photographers. Most studies reported no funding sources or conflicts of interest, while such information was unavailable for two studies [ , ]. In all, two studies [ , ] received funding, one of which disclosed multiple authors holding positions in DigiSight Technologies, Inc.

Study author, year | Country, setting | Sample size (patients/eyes) | Age (years), mean (SD) | Diabetes duration (years), mean (SD) | Diabetes duration (years), range | Diabetic retinopathy severity scale | Reference standard
Bhat, 2016 [ ] | N/Aa | 80/N/A | N/A | N/A | N/A | ICDRb severity scale; no referral defined as no or mild signs of DRc. | Slit-lamp exam
Kim, 2017 [ ] | United States, Retina Clinic | 72/144 | N/A | N/A | N/A | Referable DR defined as moderate NPDRd or worse, or DMEe. | Slit-lamp biomicroscopy
Kim, 2018 [ ] | United States, Retina Clinic | 71/142 | 56.7 (16.9) | N/A | N/A | Airlie House ETDRSf criteria; RWDRg defined as moderate NPDR or worse, or DME. | Gold standard dilated eye examination, with optical coherence tomography for DME
Rajalakshmi, 2015 [ ] | India, Tertiary care diabetes hospital | 301/602 | 53.5 (9.6) | 12.5 (7.3) | N/A | Modified ETDRS criteria; STDRh defined as PDRi or DME | Mydriatic 7-standard field digital retinal photography
Rajalakshmi, 2018 [ ] | India, Tertiary care diabetes hospital | 301/602 | N/A | N/A | N/A | ICDR severity scale; STDR defined as severe NPDR, PDR, or DME; RDRj defined as moderate NPDR or worse, or DME. | Remidio Fundus On Phone images graded by ophthalmologists
Russo, 2015 [ ] | Italy, Diabetic center | 120/240 | 58.8 (16.4) | 11.6 (9.7) | N/A | ICDR severity scale; ETDRS criteria for DME; RWDR defined as moderate NPDR or worse, regardless of DME status. | Dilated slit-lamp biomicroscopy by a retinal specialist
Ryan, 2015 [ ] | India, Ophthalmology clinic of a tertiary diabetes care center | 300/600 | 48.0 (11.0) | N/A | 0.1-37.2 years | Modified ETDRS criteria; VTDRk defined as severe NPDR or worse, or DME. | Mydriatic 7-field fundus photography by trained optometrists
Sengupta, 2018 [ ] | India, Aravind Eye Hospital | 135/233 | 54.1 (8.3) | 10.7 (5.1) | N/A | National Health Service guidelines; VTDR defined as R2-level or worse (severe NPDR, PDR), or DME. | Dilated slit-lamp biomicroscopy (+90 D lens) and indirect ophthalmoscopy by retinal specialists
Toy, 2016 [ ] | United States, Health care safety-net ophthalmology clinic | 50/100 | 60.5 (10.6) | 11.9 (8.4) | N/A | ICDR severity scale; RWDR defined as moderate NPDR or worse, or ungradable images. | Slit-lamp exam + dilated ophthalmoscopy by technicians
aN/A: not available.
bICDR: International Clinical Diabetic Retinopathy.
cDR: diabetic retinopathy.
dNPDR: nonproliferative diabetic retinopathy.
eDME: diabetic macular edema.
fETDRS: Early Treatment Diabetic Retinopathy Study.
gRWDR: referral-warranted diabetic retinopathy.
hSTDR: sight-threatening diabetic retinopathy.
iPDR: proliferative diabetic retinopathy.
jRDR: referable diabetic retinopathy.
kVTDR: vision-threatening diabetic retinopathy.
Study author, year | Attachment used | Imaging technique | Smartphone used | Ungradable |
Bhat, 2016 [ ] | Ocular Cellscope | Up to five fields, 50°; AIa: EyeArt v1.2 software used to grade images; acquired by: medical interns and assistants. | iPhone 5S | N/Ab
Kim, 2017 [ ] | Cellscope Retina | Both human and AI (EyeApp) graders employed. | N/A | N/A
Kim, 2018 [ ] | Cellscope Retina | 5-field, 50°; fields imaged: central, inferior, superior, nasal, and temporal retina; images were digitally stitched, creating a 100° image; pixels per retinal degree: 52.3; acquired by: medical students or interns. | iPhone 5S | 2 (1.7%) images/eyes
Rajalakshmi, 2015 [ ] | Remidio Fundus on Phone (FOP) | 4-field, 45°; fields imaged: macula, disc and nasal to optic disc, superior-temporal, inferior-temporal retina; autofocus function of smartphone was used. | Android phone | 0
Rajalakshmi, 2018 [ ] | Remidio Fundus on Phone (FOP) | 4-field, 45°; fields imaged: macula centered, disc centered, superior-temporal, and inferior-temporal retina; AI: EyeArt software used to grade images. | Android phone | 5 (1.7%) patients
Russo, 2015 [ ] | D-Eye (Si14 SpA, Padova, Italy) | 20°; videography and digital images acquired, comprising the posterior pole, macula, optic disc, and peripheral retina; resolution: 3264×2448 pixels; pixels per retinal degree: 150; acquired by: a retinal specialist. | iPhone 5 | 9 (3.8%) eyes
Ryan, 2015 [ ] | 20 D lens | Videography and then screenshots to obtain the best images of optic nerve and macula; resolution: 3264×2488 pixels; FilmIc Pro app used to adjust focus and zoom independently; acquired by: a medical student with limited training. | iPhone 5 | 11 (1.8%) photographs
Sengupta, 2018 [ ] | Remidio FOP | 3-field, 45°; fields imaged: posterior pole (macula centered), nasal, and superotemporal field; resolution: 441 pixels per inch; acquired by: ophthalmic photographer without special training. | HTC One (M8) | 1.7-2.1% of images
Toy, 2016 [ ] | Volk Digital ClearField lens mounted on Paxos Scope posterior segment hardware adapter | Variable number of fields, 45°; acquired by: an ophthalmologist. | iPhone 5S | 2 (2%) eyes
aAI: artificial intelligence.
bN/A: not available.
Quality Assessment
We carried out the quality assessment of the included studies using the QUADAS-2 criteria (
, ). Most studies were of high quality, with a low risk of bias and few applicability concerns. A total of four studies had an unclear risk of bias for patient selection because of the lack of information regarding patient sampling or inappropriate exclusions. The two abstracts were of lower quality than the other studies because of the limited amount of information available. One study raised applicability concerns because it employed ophthalmologists’ grading of smartphone fundoscopy images as the reference standard; it was excluded from the meta-analysis.

Meta-Analysis
Any Diabetic Retinopathy
In all, six studies (977 participants;
and ; ) presented data on detecting any DR [ , , - ]. I²LRT was 96.8% (95% CI 94.6%-99.1%), χ²₅=63.3, and ρ=−0.332. DOR was 100 (95% CI 27.4-368). Sensitivity and specificity ranged from 50% to 94% and 40% to 99%, respectively. The pooled sensitivity was 87.1% (95% CI 73.9%-94.2%); pooled specificity was 93.7% (95% CI 80.9%-98.1%). LR+ was 13.8 (95% CI 4.37-43.6); LR− was 0.138 (95% CI 0.066-0.287). The area under the curve (AUC) was 0.957 (95% CI 0.936-0.972). Considering a pretest probability of 35.4% in diabetic patients [ ] and using the Fagan nomogram, the posttest probabilities for a positive and a negative result were 88% and 7%, respectively.

We performed subgroup analysis by removing studies individually and investigating the effect on both I² and ρ. When one study [
] was removed, I² decreased to 93.0% (95% CI 86.8%-99.3%) and ρ decreased to −1.00, implying that this study contributed to the heterogeneity. However, neither the type of ophthalmoscopy (direct vs indirect) nor the reference standard used accounted for the heterogeneity.
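The posttest probabilities quoted above follow from converting the pretest probability to odds, multiplying by the pooled likelihood ratio, and converting back, which is what a Fagan nomogram does graphically. A minimal Python check using the reported figures for any DR (pretest probability 35.4%, LR+ 13.8, LR− 0.138):

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Bayes' theorem in odds form, as applied by a Fagan nomogram."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

pretest = 0.354                                         # reported pretest probability of any DR
print(round(posttest_probability(pretest, 13.8), 2))    # ~0.88 after a positive result
print(round(posttest_probability(pretest, 0.138), 2))   # ~0.07 after a negative result
```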
DRa staging | Studies, n | Overall pooled sensitivity, % (95% CI) | Overall pooled specificity, % (95% CI) | Positive likelihood ratio (95% CI) | Negative likelihood ratio (95% CI) | Diagnostic odds ratio (95% CI) | Area under summary receiver operating characteristic curve (95% CI)
Any DR | 6 | 87 (74-94) | 94 (81-98) | 14 (4.4-44) | 0.14 (0.06-0.29) | 100 (27.4-368) | 0.957 (0.936-0.972)
Mild NPDRb | 4 | 39 (10-79) | 95 (91-98) | 8.6 (3.6-20) | 0.64 (0.32-1.3) | 13.6 (3.14-58.5) | 0.939 (0.915-0.957) |
Moderate NPDR | 4 | 71 (57-81) | 95 (88-98) | 15 (4.9-43) | 0.31 (0.20-0.49) | 46.9 (10.6-208) | 0.879 (N/A) |
Severe NPDR | 5 | 80 (49-94) | 97 (88-99) | 28 (6.1-133) | 0.21 (0.069-0.65) | 134 (17.5-1040) | 0.965 (0.945-0.978) |
PDRc | 5 | 92 (79-97) | 99 (96-99) | 97 (22-425) | 0.079 (0.027-0.23) | 1225 (117-12,800) | 0.979 (N/A) |
DMEd | 4 | 79 (63-89) | 93 (82-97) | 11 (4.2-30) | 0.22 (0.12-0.42) | 49.8 (13.7-180) | 0.925 (0.898-0.945) |
RWDRe (moderate NPDR or worse) | 4 | 91 (86-94) | 89 (56-98) | 8.1 (1.6-41) | 0.11 (0.072-0.16) | 75.8 (13.9-414) | 0.921 (0.894-0.941) |
RWDR, VTDRf, STDRg | 6 | 87 (77-92) | 96 (71-99) | 24 (2.6-226) | 0.14 (0.087-0.23) | 171 (25.9-1142) | 0.929 (0.903-0.949) |
AIh (RWDR) | 2 | 91 (84-96) | 50 (38-62) | 1.8 (1.4-2.3) | 0.17 (0.088-0.32) | 11.3 (4.92-26.1) | N/Ai |
aDR: diabetic retinopathy.
bNPDR: nonproliferative diabetic retinopathy.
cPDR: proliferative diabetic retinopathy.
dDME: diabetic macular edema.
eRWDR: referral-warranted diabetic retinopathy.
fVTDR: vision-threatening diabetic retinopathy.
gSTDR: sight-threatening diabetic retinopathy.
hAI: artificial intelligence.
iN/A: not available.
Mild Nonproliferative Diabetic Retinopathy
In all, four studies (542 participants) presented data on detecting mild NPDR [
, , , ]. I²LRT was 81.5% (95% CI 60.6%-100%), χ²₃=10.8, and ρ=−0.862. DOR was 13.6 (95% CI 3.14-58.5). Sensitivity and specificity ranged from 0% to 75% and 91% to 99%, respectively. The pooled sensitivity was 39.4% (95% CI 10.1%-79.0%); pooled specificity was 95.4% (95% CI 91.3%-97.6%). LR+ was 8.60 (95% CI 3.64-20.3); LR− was 0.635 (95% CI 0.323-1.25). One study [ ] using a lens mounted on the Paxos Scope yielded a sensitivity of 0%. The AUC was 0.939 (95% CI 0.915-0.957).

Moderate Nonproliferative Diabetic Retinopathy
A total of four studies (542 participants) presented data on detecting moderate NPDR [
, , , ]. DOR was 46.9 (95% CI 10.6-208; I²DOR=85.4%; χ²₃=20.6). Sensitivity and specificity ranged from 53% to 82% and 83% to 99%, respectively. The pooled sensitivity was 70.5% (95% CI 56.6%-81.4%); pooled specificity was 95.1% (95% CI 87.8%-98.2%). LR+ was 14.5 (95% CI 4.89-43.2); LR− was 0.310 (95% CI 0.195-0.492). The AUC was approximately 0.879.

One study [
] assessed the sensitivity and specificity of smartphone ophthalmoscopy in detecting R1 disease (ie, mild and moderate NPDR) to be 88.2% (95% CI 85.7%-91.6%) and 83.4% (95% CI 78%-87%), respectively.

Severe Nonproliferative Diabetic Retinopathy
Overall, five studies (677 participants) presented data on detecting severe NPDR [
, , , , ]. I²LRT was 94.0% (95% CI 88.9%-99.1%), χ²₄=33.4, and ρ=−0.111. DOR was 134 (95% CI 17.5-1039). Sensitivity and specificity ranged from 55% to 100% and 84% to 100%, respectively. The pooled sensitivity was 79.5% (95% CI 48.6%-94.1%); pooled specificity was 97.1% (95% CI 87.7%-99.4%). LR+ was 28.4 (95% CI 6.06-133); LR− was 0.211 (95% CI 0.0688-0.645). The AUC was 0.965 (95% CI 0.945-0.978).

Removing one study [
] employing medical students and interns for smartphone ophthalmoscopy led to the greatest decrease in ρ, to −0.639, indicating that the remaining studies fitted well within the SROC curve. However, removing the study utilizing indirect ophthalmoscopy [ ] resulted in a decrease in both ρ and I², to −0.464 and 93.8% (95% CI 88.5%-99.2%), respectively. Thus, our subgroup analysis for severe NPDR was inconclusive.

Proliferative Diabetic Retinopathy
A total of five studies (677 participants) presented data on detecting PDR [
, , , , ]. DOR was 1225 (95% CI 117-12,800; I²DOR=78.0%; χ²₄=18.2). Sensitivity and specificity ranged from 72% to 100% and 94% to 100%, respectively. The pooled sensitivity was 92.1% (95% CI 79.1%-97.4%); pooled specificity was 99.0% (95% CI 96.1%-99.8%). LR+ was 96.6 (95% CI 21.9-425); LR− was 0.0789 (95% CI 0.0273-0.228). The AUC was approximately 0.979.

Removing one study [
] decreased I²DOR to 0.0%. This study employed a medical student and an intern to acquire smartphone ophthalmoscopy images, potentially resulting in heterogeneity. Removing the only study [ ] using 7-field ETDRS fundus photography as a reference standard, or another study [ ] utilizing indirect ophthalmoscopy, did not reduce I²DOR.

Diabetic Macular Edema
Although the diagnosis of DME generally requires stereoscopic retinal imaging, these studies used substitute markers, such as the presence of hard exudates or laser photocoagulation scars.
In all, four studies (627 participants) presented data on detecting DME [
, , , ]. I²LRT was 87.9% (95% CI 75.5%-100%), χ²₃=16.6, and ρ=−0.038. DOR was 49.8 (95% CI 13.7-180). Sensitivity and specificity ranged from 47% to 87% and 76% to 98%, respectively. The pooled sensitivity was 79.2% (95% CI 63.2%-89.4%); pooled specificity was 92.9% (95% CI 82.3%-97.4%). LR+ was 11.1 (95% CI 4.22-29.5); LR− was 0.224 (95% CI 0.119-0.422). The AUC was 0.925 (95% CI 0.898-0.945). Considering a pretest probability of 7.48% in diabetic patients [ ] and using the Fagan nomogram, the posttest probabilities for a positive and a negative result were 47% and 2%, respectively.

Referral-Warranted Diabetic Retinopathy
In all, four studies (313 participants) presented data on detecting RWDR [
, , , ]. I²LRT was 94.3% (95% CI 89.6%-99.1%), χ²₃=35.3, and ρ=−1.00. DOR was 75.8 (95% CI 13.9-414). Sensitivity and specificity ranged from 88% to 93% and 57% to 98%, respectively. The pooled sensitivity was 90.5% (95% CI 85.5%-93.8%); pooled specificity was 88.9% (95% CI 56.2%-98.0%). LR+ was 8.13 (95% CI 1.63-40.5); LR− was 0.107 (95% CI 0.0721-0.159). The AUC was 0.921 (95% CI 0.894-0.941).

Referral-Warranted Diabetic Retinopathy, Vision-Threatening Diabetic Retinopathy, and Sight-Threatening Diabetic Retinopathy
Overall, six studies (914 participants) presented data on detecting RWDR, VTDR, and STDR [
- , , , ]. I²LRT was 98.6% (95% CI 97.8%-99.4%), χ²₅=139, and ρ=−1.00. DOR was 171 (95% CI 25.9-1142). Sensitivity and specificity ranged from 59% to 93% and 57% to 100%, respectively. The pooled sensitivity was 86.5% (95% CI 77.1%-92.4%); pooled specificity was 96.4% (95% CI 71.1%-99.7%). LR+ was 24.1 (95% CI 2.58-226); LR− was 0.140 (95% CI 0.0865-0.228). The AUC was 0.929 (95% CI 0.903-0.949). Owing to a good fit of the SROC curve, subgroup analysis was not performed.

One study excluded from the analysis found the agreement for detecting VTDR to be high, κ=0.76 (95% CI 0.68-0.85) [
].

Artificial Intelligence in Smartphone Ophthalmoscopy
In all, two studies (152 participants) presented data on detecting RWDR using AI to grade retinal images acquired via smartphone ophthalmoscopy compared with conventional slit-lamp biomicroscopy [
, ]. I²LRT was 0.0%. DOR was 11.3 (95% CI 4.92-26.1). Specificity ranged from 46% to 57%. The pooled sensitivity was 91.2% (95% CI 84.3%-95.7%); pooled specificity was 50.0% (95% CI 38.3%-61.7%). LR+ was 1.80 (95% CI 1.42-2.28); LR− was 0.167 (95% CI 0.088-0.316). Owing to the limited number of included studies, the fixed-effects Moses-Littenberg SROC curve was employed for this analysis, and a 95% confidence region was not available.

Another study (301 participants) compared an AI’s grading of smartphone ophthalmoscopy images with the reference standard ophthalmologists’ grading of the same images [
]. It reported high sensitivities of 95.8% (95% CI 92.9%-98.7%), 99.1% (95% CI 95.1%-99.9%), and 99.3% (95% CI 96.1%-99.9%), and specificities of 80.2% (95% CI 72.6%-87.8%), 80.4% (95% CI 73.9%-85.9%), and 68.8% (95% CI 61.5%-76.2%) for any DR, STDR, and RWDR, respectively.

Discussion
Summary of Results
Overall, smartphone ophthalmoscopy performed well in detecting DR, with accuracy that varied by DR severity. Progressing from mild NPDR to PDR, we observed an increasing trend in smartphone ophthalmoscopy’s sensitivity, specificity, and DOR. In addition, smartphone ophthalmoscopy had the best performance in detecting PDR, RWDR, VTDR, and STDR; these are important categories to detect as they can significantly affect vision. The lowest sensitivity was observed for detecting mild NPDR, mainly caused by one study enrolling only 7 participants with RWDR. The DOR was lowest for AI’s detection of RWDR. There was also a low percentage of ungradable images across most studies, implying that smartphone ophthalmoscopy is relatively reliable. Common causes of ungradable images included cataracts, poor pupil dilation, vitreous hemorrhages, and poor image focus.
Most studies performed smartphone direct ophthalmoscopy utilizing one of four different attachments. The included studies also assessed two methods of indirect ophthalmoscopy. Smartphone ophthalmoscopy in the included studies surpasses the UK NHS targets requiring DR retinal imaging equipment to have a minimum resolution of 6 megapixels or 30 pixels per retinal degree [
, ]. Two studies [ , ] assigned DR grades at a patient level instead of assessing each eye individually. In some cases, smartphone apps were used to digitally stitch the multiple images obtained per eye or to enhance the image acquisition process by facilitating ergonomic focusing and capturing of images. Videography was used in two studies [ , ]. Different grading criteria and reference standards were applied across the included studies. High heterogeneity among studies was observed for most types of DR—especially for moderate NPDR and PDR—except for studies employing AI to detect RWDR. In the other analyses of non-AI smartphone ophthalmoscopy (mild NPDR; RWDR only; and RWDR, VTDR, and STDR combined), a significant proportion of the heterogeneity can be attributed to the threshold effect.

The diagnostic accuracy of AI in grading smartphone ophthalmoscopy images was unexpectedly low. In two studies, the specificity and DOR of AI in detecting RWDR were lower than those of human graders (retinal specialists and ophthalmologists). Nevertheless, one of those studies employed both human and AI graders to grade identical smartphone-acquired images; the specificity of AI was higher than that of humans. In contrast, a 2015 study demonstrated that AI detects RWDR in smartphone ophthalmoscopy images with 100% sensitivity and 80% specificity (AUC 0.94) [
]. In addition, a recent review revealed that AI software achieved high sensitivity and specificity for detecting DR in datasets of fundus images acquired from other imaging modalities [ ]. Finally, IDx-DR was the first commercially approved AI-based autonomous diagnostic system for DR detection. In a prospective study of 900 participants, this system attained high sensitivity and specificity of 87.2% (95% CI 81.8%-91.2%) and 90.7% (95% CI 88.3%-92.7%), respectively, in detecting more than mild DR [ ].

Smartphone ophthalmoscopy is a safe means of acquiring retinal images [
]. One study [ ] surveyed patients on their comfort levels while undergoing retinal imaging and revealed that all participants felt more comfortable with the light from Cellscope Retina than the light from slit lamps. Other studies [ , ] employing either an intrinsic smartphone light source or external light sources reported lower luminance than conventional fundus cameras.

Comparison to Existing Studies
To our knowledge, this is the first meta-analysis evaluating the diagnostic accuracy of smartphone ophthalmoscopy for detecting DR in diabetic patients. A meta-analysis [
] evaluated the agreement between smartphone retinal imaging and retinal cameras encompassing multiple eye pathologies such as DR, glaucoma, and ocular hypertension. It reported excellent image quality in 84.7% of smartphone images, with good diagnostic accuracy; the combined κ agreement was 77.8% (95% CI 70.34%-83.70%), with an AUC of 0.86. However, the patient selection was not limited to diabetic individuals. A large study [ ] involving 1460 participants (2920 eyes) had previously evaluated the diagnostic accuracy of smartphone ophthalmoscopy for optic disc imaging. Videography was performed with the Peek Retina adaptor attached to an 8.0-megapixel Samsung SIII smartphone. This technique demonstrated excellent agreement (weighted κ=0.69) with a reference standard tabletop fundus camera in measuring the vertical cup-disc ratio. Using smartphone ophthalmoscopy, 79.5% of eyes were gradable, compared with 86.4% for tabletop retinal imaging. Furthermore, there was no significant difference in image quality between photographs acquired by professional and by inexperienced photographers. This study reported a lower percentage of gradable eyes compared with most studies included in our scoping review. This could be attributed to inherent differences in the process of DR grading (which requires examination of the retina in general) compared with measuring the cup-disc ratio (which specifically examines the optic disc). Regardless of sample size, the agreement of smartphone ophthalmoscopy with a well-established reference standard remains high.

Strengths
This scoping review aimed to provide a comprehensive analysis of the available literature in this field. Correspondingly, we had broad inclusion criteria encompassing different smartphone ophthalmoscopy techniques, reference standards, DR severity scales, and health care professionals. Smartphone retinal imaging is an emerging technology, and we wanted to capture as much of the available evidence as possible (
). The included studies were published relatively recently, from 2015 to 2018, heralding future breakthroughs in the diagnostic accuracy of smartphone retinal imaging as an affordable and accessible means of DR detection.

Our study employed a comprehensive search strategy and examined studies from different countries involving different types of diabetic patients. At least two reviewers performed quality assessment and data extraction independently, following Cochrane methodology. Based on the QUADAS-2 tool, most studies possessed minimal risk of bias and few applicability concerns. In particular, all included studies blinded or masked the graders.
Limitations
Although the protocol for this scoping review was published in BMJ Open, this protocol was not registered. Three studies [
, , ] utilized two graders, thereby creating two separate 2×2 tables; to avoid double-counting, we averaged the TP, FP, TN, and FN values and rounded those averages to the nearest whole number for analysis. For one study [ ], we assumed none of the four excluded eyes had DME. Owing to the small number of studies and limited information available, we were not able to conduct a meta-regression analysis or assess for publication bias.

The large 95% CIs for most SROC curves indicate imprecision. Although only studies involving diabetic patients were included, most studies were conducted in tertiary health care settings: eye or diabetes clinics. These settings can afford tabletop or portable fundus cameras; smartphone ophthalmoscopy is instead more relevant for screening in primary care settings or resource-constrained countries. All studies required mydriasis, despite the availability of nonmydriatic smartphone ophthalmoscopy attachments [ ].
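As a minimal illustration of the double-grader handling described above, the sketch below averages two graders' 2×2 counts and rounds to whole numbers; the counts are hypothetical, not data from the included studies.

```python
def average_tables(grader1, grader2):
    """Average two graders' (TP, FP, TN, FN) counts and round to whole numbers."""
    return tuple(round((a + b) / 2) for a, b in zip(grader1, grader2))

# Hypothetical counts for one study, one tuple per grader: (TP, FP, TN, FN)
print(average_tables((41, 7, 88, 9), (37, 5, 90, 13)))  # -> (39, 6, 89, 11)
```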
].Implications for Future Research
Future studies on smartphone ophthalmoscopy could utilize more consistent reference standards, such as the gold standard 7-field ETDRS stereoscopic color photographs, and standardize the DR classification criteria. Such standardization minimizes bias and heterogeneity between studies. In addition, ultrawide-field retinal imaging may detect DR features outside the 7-field ETDRS field of view, which may be of clinical significance [
]. More studies could focus on (1) indirect ophthalmoscopy or ultrawide-field retinal imaging; (2) nonmydriatic techniques; (3) AI; and (4) primary health care settings, where the comorbidities and the prevalence of DR differ.

Conclusions
Smartphone ophthalmoscopy may have an important role in identifying DR in areas with limited access to expensive retinal imaging equipment and trained staff. Our findings show that smartphone ophthalmoscopy performs well in detecting DR. However, the included studies were few and heterogeneous, and their findings were imprecise. Future studies should use more consistent reference standards and DR classification criteria, evaluate other available forms of smartphone ophthalmoscopy, and recruit participants from primary care settings.
Acknowledgments
The authors thank Ms Soong Ai Jia for her contribution to the data extraction stage of this review. The authors gratefully acknowledge funding support from Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore.
Authors' Contributions
LTC conceived the idea for this study. CHT and BK screened the articles, extracted the data, and performed the analysis. CHT and LTC wrote the manuscript. BK, HS, CT, and LTC revised the manuscript critically.
Conflicts of Interest
None declared.
Medical Literature Analysis and Retrieval System Online (MEDLINE), EMBASE, and Cochrane Library search strategy.
PDF File (Adobe PDF File), 54 KB
Data extraction sheet.
PDF File (Adobe PDF File), 70 KB
Additional details of included studies.
PDF File (Adobe PDF File), 58 KB
Details of quality assessment of included studies using Quality Assessment of Diagnostic Accuracy Studies-2.
PDF File (Adobe PDF File), 38 KB
Additional study details received from study investigators.
PDF File (Adobe PDF File), 173 KB

References
- Cheung N, Mitchell P, Wong TY. Diabetic retinopathy. Lancet 2010 Jul 10;376(9735):124-136. [CrossRef] [Medline]
- Lee R, Wong TY, Sabanayagam C. Epidemiology of diabetic retinopathy, diabetic macular edema and related vision loss. Eye Vis (Lond) 2015;2:17 [FREE Full text] [CrossRef] [Medline]
- Rodrigues GB, Abe RY, Zangalli C, Sodre SL, Donini FA, Costa DC, et al. Neovascular glaucoma: a review. Int J Retina Vitreous 2016;2:26 [FREE Full text] [CrossRef] [Medline]
- Cohen S, Gardner T. Diabetic retinopathy and diabetic macular edema. Dev Ophthalmol 2016;55:137-146 [FREE Full text] [CrossRef] [Medline]
- Stewart MW. Treatment of diabetic retinopathy: Recent advances and unresolved challenges. World J Diabetes 2016 Aug 25;7(16):333-341 [FREE Full text] [CrossRef] [Medline]
- Mohamed Q, Gillies MC, Wong TY. Management of diabetic retinopathy: a systematic review. J Am Med Assoc 2007 Aug 22;298(8):902-916. [CrossRef] [Medline]
- Ellis D, Burgess PI, Kayange P. Management of diabetic retinopathy. Malawi Med J 2013 Dec;25(4):116-120 [FREE Full text] [Medline]
- Falavarjani KG, Wang K, Khadamy J, Sadda SR. Ultra-wide-field imaging in diabetic retinopathy; an overview. J Curr Ophthalmol 2016 Jun;28(2):57-60 [FREE Full text] [CrossRef] [Medline]
- Panwar N, Huang P, Lee J, Keane PA, Chuan TS, Richhariya A, et al. Fundus Photography in the 21st Century--a review of recent technological advances and their implications for worldwide healthcare. Telemed J E Health 2016 Mar;22(3):198-208 [FREE Full text] [CrossRef] [Medline]
- Shields CL, Materin M, Shields JA. Panoramic imaging of the ocular fundus. Arch Ophthalmol 2003 Nov;121(11):1603-1607. [CrossRef] [Medline]
- Lira RP, Oliveira CL, Marques MV, Silva AR, Pessoa CD. Adverse reactions of fluorescein angiography: a prospective study. Arq Bras Oftalmol 2007;70(4):615-618 [FREE Full text] [CrossRef] [Medline]
- Ahmed J, Ward TP, Bursell S, Aiello LM, Cavallerano JD, Vigersky RA. The sensitivity and specificity of nonmydriatic digital stereoscopic retinal imaging in detecting diabetic retinopathy. Diabetes Care 2006 Oct;29(10):2205-2209. [CrossRef] [Medline]
- Goh JK, Cheung CY, Sim SS, Tan PC, Tan GS, Wong TY. Retinal imaging techniques for diabetic retinopathy screening. J Diabetes Sci Technol 2016 Feb 1;10(2):282-294 [FREE Full text] [CrossRef] [Medline]
- Massin P, Girach A, Erginay A, Gaudric A. Optical coherence tomography: a key to the future management of patients with diabetic macular oedema. Acta Ophthalmol Scand 2006 Aug;84(4):466-474 [FREE Full text] [CrossRef] [Medline]
- Hong SC. 3D printable retinal imaging adapter for smartphones could go global. Graefes Arch Clin Exp Ophthalmol 2015 Oct;253(10):1831-1833. [CrossRef] [Medline]
- Jones S, Edwards RT. Diabetic retinopathy screening: a systematic review of the economic evidence. Diabet Med 2010 Mar;27(3):249-256. [CrossRef] [Medline]
- Micheletti JM, Hendrick AM, Khan FN, Ziemer DC, Pasquel FJ. Current and next generation portable screening devices for diabetic retinopathy. J Diabetes Sci Technol 2016 Feb 16;10(2):295-300 [FREE Full text] [CrossRef] [Medline]
- Fenner BJ, Wong RL, Lam W, Tan GS, Cheung GC. Advances in retinal imaging and applications in diabetic retinopathy screening: a review. Ophthalmol Ther 2018 Dec;7(2):333-346 [FREE Full text] [CrossRef] [Medline]
- Bolster NM, Giardini ME, Bastawrous A. The diabetic retinopathy screening workflow: potential for smartphone imaging. J Diabetes Sci Technol 2015 Nov 23;10(2):318-324 [FREE Full text] [CrossRef] [Medline]
- McInnes MD, Moher D, Thombs BD, McGrath TA, Bossuyt PM, the PRISMA-DTA Group, et al. Preferred reporting items for a systematic review and meta-analysis of diagnostic test accuracy studies: The PRISMA-DTA Statement. J Am Med Assoc 2018 Jan 23;319(4):388-396. [CrossRef] [Medline]
- Higgins JP, Green S. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. Hoboken, New Jersey: Wiley; 2011.
- Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010 Sep 20;5:69 [FREE Full text] [CrossRef] [Medline]
- Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005 Feb;8(1):19-32. [CrossRef]
- Tan CH, Quah W, Tan CS, Smith H, Tudor Car L. Use of smartphones for detecting diabetic retinopathy: a protocol for a scoping review of diagnostic test accuracy studies. BMJ Open 2019 Dec 8;9(12):e028811 [FREE Full text] [CrossRef] [Medline]
- Covidence - Better systematic review management. URL: https://www.covidence.org/home [accessed 2020-03-16]
- Whiting PF, Rutjes AW, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med 2011 Oct 18;155(8):529-536. [CrossRef] [Medline]
- Glas AS, Lijmer JG, Prins MH, Bonsel GJ, Bossuyt PM. The diagnostic odds ratio: a single indicator of test performance. J Clin Epidemiol 2003 Nov;56(11):1129-1135. [CrossRef] [Medline]
- Reitsma JB, Glas AS, Rutjes AW, Scholten RJ, Bossuyt PM, Zwinderman AH. Bivariate analysis of sensitivity and specificity produces informative summary measures in diagnostic reviews. J Clin Epidemiol 2005 Oct;58(10):982-990. [CrossRef] [Medline]
- Harbord RM, Deeks JJ, Egger M, Whiting P, Sterne JA. A unification of models for meta-analysis of diagnostic accuracy studies. Biostatistics 2007 Apr;8(2):239-251. [CrossRef] [Medline]
- Cochrane Community. Copenhagen: The Nordic Cochrane Centre: The Cochrane Collaboration; 2014. Review Manager (RevMan) [Computer program]. Version 5.3 URL: https://community.cochrane.org/help/tools-and-software/revman-5 [accessed 2020-03-16]
- Zamora J, Abraira V, Muriel A, Khan K, Coomarasamy A. Meta-DiSc: a software for meta-analysis of test accuracy data. BMC Med Res Methodol 2006 Jul 12;6:31 [FREE Full text] [CrossRef] [Medline]
- Bhat S, Bhaskaranand M, Ramachandra C, Qi O, Liu L, Apte R, et al. Automated image analysis for Diabetic Retinopathy Screening with iPhone-based fundus camera. Invest Ophthalmol Vis Sci 2016 Sep;57(12):5964 [FREE Full text]
- Kim T, Li P, Niziol LM, Bhaskaranand M, Bhat S, Ramachandra C, et al. Comparison of automated and expert human grading of diabetic retinopathy using smartphone-based retinal photography. Invest Ophthalmol Vis Sci 2017;58(8):659 [FREE Full text]
- Kim TN, Myers F, Reber C, Loury PJ, Loumou P, Webster D, et al. A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging. Transl Vis Sci Technol 2018 Sep;7(5):21 [FREE Full text] [CrossRef] [Medline]
- Rajalakshmi R, Arulmalar S, Usha M, Prathiba V, Kareemuddin KS, Anjana RM, et al. Validation of smartphone based retinal photography for diabetic retinopathy screening. PLoS One 2015;10(9):e0138285 [FREE Full text] [CrossRef] [Medline]
- Rajalakshmi R, Subashini R, Anjana RM, Mohan V. Automated diabetic retinopathy detection in smartphone-based fundus photography using artificial intelligence. Eye (Lond) 2018 Jun;32(6):1138-1144 [FREE Full text] [CrossRef] [Medline]
- Russo A, Morescalchi F, Costagliola C, Delcassi L, Semeraro F. Comparison of smartphone ophthalmoscopy with slit-lamp biomicroscopy for grading diabetic retinopathy. Am J Ophthalmol 2015 Feb;159(2):360-4.e1. [CrossRef] [Medline]
- Ryan ME, Rajalakshmi R, Prathiba V, Anjana RM, Ranjani H, Narayan KM, et al. Comparison Among Methods of Retinopathy Assessment (CAMRA) Study: smartphone, nonmydriatic, and mydriatic photography. Ophthalmology 2015 Oct;122(10):2038-2043 [FREE Full text] [CrossRef] [Medline]
- Sengupta S, Sindal MD, Baskaran P, Pan U, Venkatesh R. Sensitivity and specificity of smartphone-based retinal imaging for diabetic retinopathy: a comparative study. Ophthalmol Retina 2019 Feb;3(2):146-153. [CrossRef] [Medline]
- Toy BC, Myung DJ, He L, Pan CK, Chang RT, Polkinhorne A, et al. Smartphone-based dilated fundus photography and near visual acuity testing as inexpensive screening tools to detect referral warranted diabetic eye disease. Retina 2016 May;36(5):1000-1008. [CrossRef] [Medline]
- Bhat S, Bhaskaranand M, Ramachandra C, Margolis T, Fletcher D, Solanki K. Fully-automated diabetic retinopathy screening using cellphone-based cameras. Invest Ophthalmol Vis Sci 2015;56(7):1428 [FREE Full text]
- Ting DS, Pasquale LR, Peng L, Campbell JP, Lee AY, Raman R, et al. Artificial intelligence and deep learning in ophthalmology. Br J Ophthalmol 2019 Feb;103(2):167-175 [FREE Full text] [CrossRef] [Medline]
- Abràmoff MD, Lavin PT, Birch M, Shah N, Folk JC. Pivotal trial of an autonomous AI-based diagnostic system for detection of diabetic retinopathy in primary care offices. NPJ Digit Med 2018;1:39 [FREE Full text] [CrossRef] [Medline]
- Kim DY, Delori F, Mukai S. Smartphone photography safety. Ophthalmology 2012 Oct;119(10):2200-1; author reply 2201. [CrossRef] [Medline]
- Vilela MA, Valença FM, Barreto PK, Amaral CE, Pellanda LC. Agreement between retinal images obtained via smartphones and images obtained with retinal cameras or fundoscopic exams - systematic review and meta-analysis. Clin Ophthalmol 2018;12:2581-2589. [CrossRef]
- Bastawrous A, Giardini ME, Bolster NM, Peto T, Shah N, Livingstone IA, et al. Clinical validation of a smartphone-based adapter for optic disc imaging in Kenya. JAMA Ophthalmol 2016 Feb;134(2):151-158 [FREE Full text] [CrossRef] [Medline]
- Silva PS, Cavallerano JD, Haddad NM, Kwak H, Dyer KH, Omar AF, et al. Peripheral lesions identified on ultrawide field imaging predict increased risk of diabetic retinopathy progression over 4 years. Ophthalmology 2015 May;122(5):949-956. [CrossRef] [Medline]
Abbreviations
AI: artificial intelligence
AUC: area under curve
DM: diabetes mellitus
DME: diabetic macular edema
DOR: diagnostic odds ratio
DR: diabetic retinopathy
ETDRS: Early Treatment Diabetic Retinopathy Study
FN: false negatives
FP: false positives
LR−: negative likelihood ratio
LR+: positive likelihood ratio
LRT: likelihood ratio test
NHS: National Health Service
NPDR: nonproliferative diabetic retinopathy
PDR: proliferative diabetic retinopathy
QUADAS: Quality Assessment of Diagnostic Accuracy Studies
RWDR: referral-warranted diabetic retinopathy
SROC: summary receiver operating characteristic
STDR: sight-threatening diabetic retinopathy
TN: true negatives
TP: true positives
VTDR: vision-threatening diabetic retinopathy
Edited by G Eysenbach; submitted 11.10.19; peer-reviewed by M Tse, R Raman, G Lim; comments to author 16.01.20; revised version received 12.02.20; accepted 16.02.20; published 15.05.20
Copyright©Choon Han Tan, Bhone Myint Kyaw, Helen Smith, Colin S Tan, Lorainne Tudor Car. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.05.2020.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.