
Diagnostic accuracy of a smartphone-based device (VistaView) for detection of diabetic retinopathy: A prospective study

  • Rida Shahzad,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Supervision, Validation, Writing – original draft

    Affiliation Shahzad Eye Hospital, Karachi, Pakistan

  • Arshad Mehmood,

    Roles Investigation

    Affiliation Shahzad Eye Hospital, Karachi, Pakistan

  • Danish Shabbir,

    Roles Investigation, Software

    Affiliation Shahzad Eye Hospital, Karachi, Pakistan

  • M. A. Rehman Siddiqui

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Resources, Supervision, Writing – review & editing

    rehman.siddiqui@gmail.com

    Affiliations Shahzad Eye Hospital, Karachi, Pakistan, Department of Ophthalmology and Visual Sciences, Aga Khan University, Karachi, Pakistan

Abstract

Background

Diabetic retinopathy (DR) is a leading cause of blindness globally. The gold standard for DR screening is stereoscopic colour fundus photography with tabletop cameras. VistaView is a novel smartphone-based retinal camera which offers mydriatic retinal imaging. This study compares the diagnostic accuracy of the smartphone-based VistaView camera with that of a traditional desk-mounted fundus camera (Topcon Triton). We also compare inter-grader agreement for DR grading between VistaView images and Topcon images.

Methodology

This prospective study took place between December 2021 and June 2022 in Pakistan. Consecutive diabetic patients were imaged following mydriasis using both VistaView and Topcon cameras at the same sitting. All images were graded independently by two graders based on the International Classification of Diabetic Retinopathy (ICDR) criteria. Individual grades were assigned for severity of DR and maculopathy in each image. Diagnostic accuracy was calculated using the Topcon camera as the gold standard. Agreement between graders for each device was calculated as intraclass correlation coefficient (ICC) (95% CI) and Cohen’s weighted kappa (k).

Principal findings

A total of 1428 images were available from 371 patients with both cameras. After excluding ungradable images, a total of 1231 images were graded. The sensitivity of VistaView for any DR was 69.9% (95% CI 62.2–76.6%) while the specificity was 92.9% (95% CI 89.9–95.1%); the positive predictive value (PPV) and negative predictive value (NPV) were 80.5% (95% CI 73–86.4%) and 88.1% (95% CI 84.5–90.9%) respectively. The sensitivity of VistaView for referable DR (RDR) was 69.7% (95% CI 61.7–76.8%) while the specificity was 94.2% (95% CI 91.3–96.1%); PPV and NPV were 81.5% (95% CI 73.6–87.6%) and 89.4% (95% CI 86–92%) respectively. The sensitivity for detecting maculopathy with VistaView was 71.2% (95% CI 62.8–78.4%), while the specificity was 86.4% (95% CI 82.6–89.4%). The PPV and NPV for detecting maculopathy were 63% (95% CI 54.9–70.5%) and 90.1% (95% CI 86.8–92.9%) respectively. For VistaView, the ICC of DR grades between the two graders was 78% (95% CI 75–82%) and that of maculopathy grades was 66% (95% CI 59–71%). The Cohen’s kappa for retinopathy grades of VistaView images was 0.61 (95% CI 0.55–0.67, p<0.001), while that for maculopathy grades was 0.49 (95% CI 0.42–0.57, p<0.001). For images from the Topcon desktop camera, the ICC of DR grades was 85% (95% CI 83–87%), while that of maculopathy grades was 79% (95% CI 75–82%). The Cohen’s kappa for retinopathy grades of Topcon images was 0.68 (95% CI 0.63–0.74, p<0.001), while that for maculopathy grades was 0.65 (95% CI 0.58–0.72, p<0.001).

Conclusion

The VistaView offers moderate diagnostic accuracy for DR screening and may be used as a screening tool in LMICs.

Author summary

Diabetic retinopathy (DR) is a highly prevalent retinal disease globally which can lead to irreversible loss of vision if left untreated. Therefore, it is essential that efficient systematic screening processes be established to facilitate timely diagnosis and management to prevent loss of vision. Standard methods of DR screening require heavy and expensive equipment operated by trained professionals and are often inaccessible to marginalised communities. In this study, we investigated the diagnostic accuracy of a lightweight, portable and relatively inexpensive smartphone-based retinal camera to detect DR, compared to a standard tabletop imaging device. We found that diagnostic accuracy needs to improve further to make these devices a suitable option for DR screening, especially in low- and middle-income countries where access to healthcare has several barriers.

Introduction

Diabetic retinopathy (DR) is a leading cause of blindness globally within the working-age group [1]. The prevalence of diabetes mellitus is increasing exponentially every year, especially in low- and middle-income countries (LMICs) including Pakistan and India. Consequently, the burden of DR is also rising, and its global prevalence is estimated to reach 160.5 million by 2045 [2]. DR remains asymptomatic until it reaches an advanced stage when loss of vision occurs. To prevent blindness, it is essential that DR is detected and treated in its early stages, as timely management can reduce the risk of severe visual loss by up to 90% [3]. DR is conventionally detected through retinal examination of diabetic patients by ophthalmologists or through colour fundus photography [4,5]. The gold standard for DR screening is seven-field stereoscopic colour fundus photography. This requires trained personnel and bulky, immovable desk-mounted fundus cameras, which are costly and not universally accessible in the community. However, single-field posterior fundus imaging has been reported to be comparably accurate for DR screening [6,7].

Recently, smartphone-based retinal photography for DR screening has gained traction. This modality has the potential to be efficient in terms of time, cost, and space [8–10]. VistaView is a smartphone-based, portable, handheld camera manufactured by Volk Optical Inc. (Mentor, Ohio) which enables quick mydriatic retinal imaging. It captures single-field, 55-degree fundus images at a resolution of 28.4 pixels/degree which can be instantly viewed and analysed.

The aim of this study was to evaluate the diagnostic accuracy of the VistaView (the index test) against a standard Topcon Triton (Topcon, Tokyo, Japan) desk-mounted fundus camera for the detection of diabetic retinopathy.

Materials and Methods

This prospective cohort study was carried out at a tertiary eye hospital in Karachi, Pakistan between December 2021 and June 2022. We recruited 375 consecutive patients (714 eyes) above the age of 16 years with known type 1 or type 2 diabetes attending the eye clinic. Exclusion criteria were pre-diabetes and previous treatment for proliferative DR (PDR) with laser or vitrectomy, as such patients would have retinal scars from treatment, would likely be aware of their advanced stage of DR, and/or would already be under a hospital eye care service.

This manuscript followed the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015 guidelines [11]. The ethics review board of the hospital approved the study (Ref# 605685). Informed written consent was obtained from all patients. The research adhered to the tenets of the Declaration of Helsinki. We collected demographic information from each patient including age, gender, and duration of the disease. Each recruited patient underwent dilation with Tropicamide 1% and Cyclopentolate Hydrochloride 1%, and colour fundus photography of both eyes was performed by trained technicians using the standard Topcon camera and the handheld VistaView device at the same sitting in a random sequence. The Topcon fundus camera captures high-resolution fundus photographs enhanced by PixelSmart technology. It offers a field of view of up to 45 degrees. Its resolution on the fundus is 60 lines/mm in the centre, 40 lines/mm in the middle, and 25 lines/mm in the periphery. The VistaView acquires images with an output resolution of 3072 x 2122 pixels and a field of view of up to 55 degrees. All images were anonymised and uploaded to a protected cloud database.

Diabetic retinopathy grading

Images acquired from both devices were analysed and graded independently by two certified graders using the International Classification of Diabetic Retinopathy (ICDR) system [12]. The DR stages included: no retinopathy, mild nonproliferative DR (NPDR), moderate NPDR, severe NPDR, and PDR (Fig 1). The images were also graded for the presence or absence of maculopathy (exudates, hemorrhages or apparent thickening within 1 disc diameter of the fovea). The images acquired using VistaView were graded before the Topcon images to avoid information bias. Consensus was reached for any disagreements; if required, arbitration by a fellowship-trained vitreoretinal specialist determined the final grades. The final DR grades assigned to images acquired with the desktop Topcon camera were considered the gold standard.

Fig 1. Examples of stages of diabetic retinopathy based on ICDR system.

(A) No DR; (B) Mild NPDR; (C) Moderate NPDR; (D) Severe NPDR; (E) PDR. DR, diabetic retinopathy; NPDR, nonproliferative diabetic retinopathy; PDR, proliferative diabetic retinopathy. ICDR, International Classification of Diabetic Retinopathy.

https://doi.org/10.1371/journal.pdig.0000649.g001
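
For readers who want to handle the grading output programmatically, the following is a minimal sketch (not the authors' code) of how each graded image could be encoded: an ordinal ICDR stage plus a separate maculopathy flag, mirroring the scheme described above. The stage-to-integer coding and the identifier format are assumptions.

```python
from dataclasses import dataclass

# Assumed ordinal coding of the ICDR stages described above (not specified by the authors)
ICDR_STAGES = {
    0: "No DR",
    1: "Mild NPDR",
    2: "Moderate NPDR",
    3: "Severe NPDR",
    4: "PDR",
}

@dataclass
class ImageGrade:
    image_id: str        # hypothetical identifier
    icdr_stage: int      # 0-4, per ICDR_STAGES above
    maculopathy: bool    # exudates, hemorrhages, or thickening within 1 disc diameter of the fovea

# Example: a VistaView image graded as moderate NPDR with maculopathy
example = ImageGrade(image_id="VV_0001", icdr_stage=2, maculopathy=True)
print(ICDR_STAGES[example.icdr_stage], example.maculopathy)
```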

Image quality

The photographs taken with both cameras were graded for image quality on a 5-level scheme as follows (Fig 2) [13,14]:

  • Grade 0: Ungradable (no retinal details visible due to media opacities such as dense cataract)
  • Grade 1: Poor (only gross retinal changes visible such as hemorrhages and dense hard exudates)
  • Grade 2: Satisfactory (major retinopathy details visible; minor degrees of retinopathy and subtle new vessels not clearly detectable)
  • Grade 3: Good (most of retinopathy changes clear)
  • Grade 4: Excellent (all lesions clearly visible)
Fig 2. Examples of image quality grades.

(A) Grade 0; (B) Grade 1; (C) Grade 2; (D) Grade 3; (E) Grade 4.

https://doi.org/10.1371/journal.pdig.0000649.g002

Statistical analysis

Continuous variables such as age were reported as means (± standard deviation); categorical variables such as gender were reported as frequencies and percentages. Statistical analysis was performed using SPSS version 27 (IBM, SPSS Inc.). The sensitivity and specificity of the VistaView were calculated against the Topcon Triton fundus images (the gold standard). Diagnostic accuracy was evaluated at two dichotomous cut-offs: any DR and referable DR (RDR). RDR was defined as moderate NPDR or higher (ICDR > 1) and/or maculopathy. The level of agreement between the two graders was assessed using Cohen’s kappa statistic for both Topcon images and VistaView images. Inter-rater reliability between the two graders was also calculated as intraclass correlation coefficients (ICCs). Gradeability of images was reported as descriptive frequencies.
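
As an illustration of the analysis described above (which was performed in SPSS, not in code released with the paper), the sketch below shows how the dichotomised outcomes, the 2x2 accuracy metrics against the Topcon reference, and the inter-grader agreement statistics could be computed in Python. All data arrays are hypothetical, and the linear weighting for Cohen’s kappa is an assumption, as the weighting scheme is not stated in the paper.

```python
import numpy as np
import pandas as pd
import pingouin as pg
from sklearn.metrics import cohen_kappa_score

def referable_dr(icdr_stage: int, maculopathy: bool) -> bool:
    """RDR as defined above: moderate NPDR or worse (ICDR > 1) and/or maculopathy."""
    return icdr_stage > 1 or maculopathy

# Hypothetical per-eye grades (ICDR stage, maculopathy flag) for each device
topcon_grades    = [(2, True), (0, False), (3, False), (1, True), (0, False), (0, False), (4, True), (1, False)]
vistaview_grades = [(2, True), (0, False), (1, False), (2, False), (0, False), (2, False), (4, True), (0, False)]

reference  = np.array([referable_dr(s, m) for s, m in topcon_grades], dtype=int)    # gold standard
index_test = np.array([referable_dr(s, m) for s, m in vistaview_grades], dtype=int)  # index test

tp = int(np.sum((index_test == 1) & (reference == 1)))
tn = int(np.sum((index_test == 0) & (reference == 0)))
fp = int(np.sum((index_test == 1) & (reference == 0)))
fn = int(np.sum((index_test == 0) & (reference == 1)))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"Sens {sensitivity:.2f}  Spec {specificity:.2f}  PPV {ppv:.2f}  NPV {npv:.2f}")

# Inter-grader agreement on the ordinal DR grades (0 = no DR ... 4 = PDR)
grader1 = [0, 1, 2, 2, 4, 3, 0, 1]   # hypothetical grades by grader 1
grader2 = [0, 1, 2, 3, 4, 3, 1, 1]   # hypothetical grades by grader 2
kappa = cohen_kappa_score(grader1, grader2, weights="linear")  # weighting scheme assumed

# ICC from long-format data: one row per (image, grader) pair
long = pd.DataFrame({
    "image":  list(range(len(grader1))) * 2,
    "grader": ["grader1"] * len(grader1) + ["grader2"] * len(grader2),
    "grade":  grader1 + grader2,
})
icc = pg.intraclass_corr(data=long, targets="image", raters="grader", ratings="grade")
print(f"Weighted kappa {kappa:.2f}")
print(icc[["Type", "ICC", "CI95%"]])
```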

Results

Demographics

A total of 371 patients were enrolled in the study. Of these, 194 (52%) were male. The mean age of the overall cohort was 59 years (Table 1). The mean duration of diabetes in the study population was 10.5 years.

Gradeability

A total of 1428 images were available from 371 patients with both cameras (Fig 3). After excluding ungradable images, a total of 1231 images were graded. Numbers and percentages of retinopathy and maculopathy grades detected with each device are given in Table 2.

Table 2. Retinopathy and maculopathy stages detected with both devices with percentages.

NPDR: nonproliferative diabetic retinopathy, PDR: proliferative diabetic retinopathy, M: maculopathy, U: ungradable.

https://doi.org/10.1371/journal.pdig.0000649.t002

Diagnostic accuracy

The overall sensitivity of VistaView for any DR was 69.9% (95% CI 62.2–76.6%) while the specificity was 92.9% (95% CI 89.9–95.1%) (Table 3). The positive predictive value (PPV) and negative predictive value (NPV) were 80.5% (95% CI 73–86.4%) and 88.1% (95% CI 84.5–90.9%) respectively. The receiver operating characteristic (ROC) curve for any DR is given in Fig 4. The sensitivity of VistaView for RDR was 69.7% (95% CI 61.7–76.8%) while the specificity was 94.2% (95% CI 91.3–96.1%); PPV and NPV were 81.5% (95% CI 73.6–87.6%) and 89.4% (95% CI 86–92%) respectively. The sensitivity for detecting maculopathy with VistaView was 71.2% (95% CI 62.8–78.4%), while the specificity was 86.4% (95% CI 82.6–89.4%). The PPV and NPV for detecting maculopathy were 63% (95% CI 54.9–70.5%) and 90.1% (95% CI 86.8–92.9%) respectively (Table 3).

Table 3. Diagnostic accuracy for retinopathy and maculopathy detection by VistaView in comparison to Topcon.

DR: diabetic retinopathy, RDR: referable diabetic retinopathy.

https://doi.org/10.1371/journal.pdig.0000649.t003
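
As a worked illustration of how confidence intervals such as those reported in Table 3 could be reproduced, the snippet below applies a Wilson score interval to a hypothetical 2x2 count. The paper does not state which CI method was used, so both the method and the counts are assumptions chosen only to illustrate the calculation.

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical counts; the study's actual 2x2 counts are not reported here,
# and the Wilson score method is an assumption.
true_positives, reference_positives = 125, 179
sens = true_positives / reference_positives
ci_low, ci_high = proportion_confint(true_positives, reference_positives, alpha=0.05, method="wilson")
print(f"Sensitivity {sens:.1%} (95% CI {ci_low:.1%} to {ci_high:.1%})")
```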

Agreement

For VistaView, the ICC of DR grades between the two graders was 78% (95% CI, 75–82%) and that of maculopathy grades was 66% (95% CI, 59–71%). The Cohen’s kappa for retinopathy grades of VistaView images was 0.61 (95% CI, 0.55–0.67, p<0.001), while that for maculopathy grades was 0.49 (95% CI, 0.42–0.57, p<0.001). For images from the Topcon desktop camera, the ICC of DR grades was 85% (95% CI, 83–87%), while that of maculopathy grades was 79% (95% CI, 75–82%). The Cohen’s kappa for retinopathy grades of Topcon images was 0.68 (95% CI, 0.63–0.74, p<0.001), while that for maculopathy grades was 0.65 (95% CI, 0.58–0.72, p<0.001) (Fig 5, Table 4).

Fig 5. Agreement metrics of graders for both devices.

DR, diabetic retinopathy; M, maculopathy.

https://doi.org/10.1371/journal.pdig.0000649.g005

Table 4. Inter-rater reliability expressed as intraclass correlation coefficient (ICC) and Cohen’s kappa agreement.

https://doi.org/10.1371/journal.pdig.0000649.t004

Image quality

Image quality scores are summarized in Table 5. For VistaView images, grader 1 scored 83 images (11.6%) as ungradable, 85 images (12%) as poor, 247 (34.6%) as satisfactory, 233 (32.6%) as good, and 66 (9.2%) as excellent. Grader 2 scored 106 images (14.8%) as ungradable, 261 images (36.5%) as poor, 258 images (35.9%) as satisfactory, 77 images (10.8%) as good, and 12 (1.7%) as excellent.

For the desktop camera images, grader 1 scored 56 images (7.8%) as ungradable, 66 images (9.2%) as poor, 123 images (17.2%) as satisfactory, 273 images (38.2%) as good, and 196 (27.4%) as excellent. Grader 2 scored 51 images (7.1%) as ungradable, 35 images (4.9%) as poor, 37 images (5.1%) as satisfactory, 155 images (21.7%) as good, and 436 (61.1%) as excellent.

Discussion

This study evaluates the diagnostic accuracy of VistaView, a smartphone-based fundus camera, for the detection of diabetic retinopathy. To the best of our knowledge, there are no published diagnostic accuracy studies evaluating the VistaView against a gold standard for detecting DR. We obtained a sensitivity of 69.9% and a specificity of 92.9% for any DR. For RDR, the sensitivity and specificity were 69.7% and 94.2% respectively. Our findings are consistent with previously published reports validating the use of smartphone-based devices for DR screening, including a recent meta-analysis by Tan et al. [15]. They reported pooled sensitivities and specificities of 87% and 94% for smartphone-based devices for any DR, 91% and 89% for RDR, and 79% and 93% for diabetic macular edema respectively, with colour fundus photographs as the gold standard. Studies comparing such devices with clinical examination have also found high sensitivities and specificities. Sengupta et al. reported the sensitivity and specificity of smartphone-based retinal imaging for DR screening, compared to dilated fundus examination, to be approximately 93% and 89% respectively [16]. Zhang et al. also reported comparable findings in an earlier, similar study [17]. The British Diabetic Association has established cut-offs of 80% sensitivity and 95% specificity for a viable screening system, and National Institute for Clinical Excellence (NICE) guidelines recommend similar standards for sensitivity and specificity [18]. Although our study showed a sensitivity lower than these cut-offs, owing to some limiting factors of the VistaView discussed hereafter, the specificity we obtained met these criteria; a high specificity reduces the burden on hospital eye services by limiting unnecessary specialist referrals of patients with false-positive results.

Image quality can be a limiting factor to the usefulness of smartphone-based DR screening systems, as poor image quality may lead to ungradable images. Eighteen percent of the VistaView images were labelled as ungradable. Gradeability also depends on other factors, such as the presence of media opacities, which can lower the overall diagnostic accuracy of the device. The percentage of ungradable desktop camera images (10.2%) was markedly lower than that of the smartphone-based camera (18.6%). Reasons for poor-quality images include the low internal resolution of the device and the focusing process of handheld cameras. These factors may contribute to the VistaView’s sensitivity being lower than that of other handheld cameras in the literature, and lower than the recommended 80%. Additionally, it may be difficult to achieve well-focused images with smartphone-based devices because, unlike standard desktop cameras, they do not come with a built-in chin stabilizer. It is worth noting that some studies in the literature which have reported higher diagnostic accuracy metrics than the VistaView excluded patients with media opacities [8,13,19]. Our study included such patients, which makes it more representative of real-world settings. Interestingly, both certified graders had good agreement (k > 0.60) for retinopathy grades with VistaView images (k = 0.61), comparable to the agreement achieved with the standard desktop camera (k = 0.68).

Patient preference and comfort are important factors to consider when developing a pathway for DR screening. Studies have reported acceptable patient comfort with smartphone-based fundus cameras due to the lower light intensity of the LED compared to the high-intensity flash of traditional fundus cameras [19]. Design simplicity and portability are additional factors which contribute to higher general acceptability among patients. These factors may improve compliance with screening and the overall effectiveness of such a system. In future studies, patient satisfaction with the VistaView smartphone-based camera may be explicitly evaluated.

Our technical staff received comprehensive training on the VistaView before initiating retinal imaging for data collection. Studies have shown that the duration of training with smartphone-based fundus cameras for DR screening is associated with higher image quality and reduced examination time [20–22].

Vision loss has significant financial and economic implications, especially in LMICs such as Pakistan. Because LMICs account for the majority of the world’s diabetic population [23], systematic screening needs to be implemented to prevent vision loss secondary to DR in these countries. Smartphone-based screening of DR is particularly valuable in these regions, where there are several barriers to the availability of and access to health services, including cost, poor infrastructure, lack of equipment, and shortages of trained personnel. Smartphone-based fundus imaging has the potential to lower the burden of DR screening on hospital eye services, particularly in LMICs.

Studies show that eye care is imperative to achieving the United Nations 2030 Sustainable Development Goals (SDGs). Improving access to eye care will help achieve many of these SDGs, including reduction of poverty and increased economic productivity, educational performance, and equity [24]. The World Health Organization (WHO) and the International Diabetes Federation have acknowledged the role of low-cost smartphone-based devices for DR screening, especially by non-physicians [23,25–27]. This can be attributed to their cost-effectiveness, portability, low computational power and space requirements, ease of use, and lower training requirements [28]. Tele-ophthalmology may benefit from these attributes of the VistaView to provide efficient and cost-effective DR screening programmes. When introduced at the primary care level, effective screening may be delivered at the point of care and the risk of vision loss secondary to DR may be significantly lowered.

Recent advances in technology have allowed the integration of artificial intelligence (AI) into DR screening systems. AI algorithms trained to make automated diagnoses of DR on retinal images acquired from smartphone-based fundus cameras have been shown to have high diagnostic accuracy in high-prevalence settings when compared with human grading [29–31]. This could potentially replace the need for human grading of DR in screening programmes. In future, this could diminish the burden of DR screening on ophthalmologists, whose expertise could then be redirected to appropriately managing advanced disease.

There were certain limitations to our study. Firstly, the requirement for mydriasis adds to the overall image acquisition time and is associated with the side effect of blurred vision for several hours. Another limitation was the use of convenience sampling, as all recruited participants were attending a tertiary care eye hospital; therefore, the results may not be generalizable to community settings. Lastly, we did not perform a patient acceptability survey.

In conclusion, smartphone-based fundus cameras are a potential option for systematic DR screening. Further studies are needed to evaluate their sensitivity and specificity in various populations.

Supporting information

S1 File. Infographic: Diagnostic accuracy of VistaView for diabetic retinopathy detection.

https://doi.org/10.1371/journal.pdig.0000649.s001

(PDF)

S2 File. STARD checklist: Completed STARD checklist.

https://doi.org/10.1371/journal.pdig.0000649.s002

(PDF)

Acknowledgments

We would like to acknowledge the contributions of the doctors of Shahzad Eye Hospital, Dr. M. H. Shahzad and Dr. Harris Shahzad, for referring diabetic patients for DR screening.

References

  1. Leasher JL, Bourne RR, Flaxman SR, Jonas JB, Keeffe J, Naidoo K, et al. Global estimates on the number of people blind or visually impaired by diabetic retinopathy: a meta-analysis from 1990 to 2010. Diabetes Care. 2016;39(9):1643–9. pmid:27555623
  2. Teo ZL, Tham Y-C, Yu M, Chee ML, Rim TH, Cheung N, et al. Global prevalence of diabetic retinopathy and projection of burden through 2045: systematic review and meta-analysis. Ophthalmology. 2021;128(11):1580–91. pmid:33940045
  3. Vashist P, Singh S, Gupta N, Saxena R. Role of early screening for diabetic retinopathy in patients with diabetes mellitus: an overview. Indian Journal of Community Medicine. 2011;36(4):247–52. pmid:22279252
  4. Pieczynski J, Grzybowski A. Diabetic retinopathy screening methods and programmes adopted in different parts of the world: further insights. 2015.
  5. Fenner BJ, Wong RL, Lam W-C, Tan GS, Cheung GC. Advances in retinal imaging and applications in diabetic retinopathy screening: a review. Ophthalmology and Therapy. 2018;7:333–46. pmid:30415454
  6. Farley TF, Mandava N, Prall FR, Carsky C. Accuracy of primary care clinicians in screening for diabetic retinopathy using single-image retinal photography. The Annals of Family Medicine. 2008;6(5):428–34. pmid:18779547
  7. Srinivasan S, Shetty S, Natarajan V, Sharma T, Raman R. Development and validation of a diabetic retinopathy referral algorithm based on single-field fundus photography. PLoS One. 2016;11(9):e0163108. pmid:27661981
  8. Ryan ME, Rajalakshmi R, Prathiba V, Anjana RM, Ranjani H, Narayan KV, et al. Comparison Among Methods of Retinopathy Assessment (CAMRA) study: smartphone, nonmydriatic, and mydriatic photography. Ophthalmology. 2015;122(10):2038–43. pmid:26189190
  9. Queiroz MS, de Carvalho JX, Bortoto SF, de Matos MR, das Graças Dias Cavalcante C, Andrade EAS, et al. Diabetic retinopathy screening in urban primary care setting with a handheld smartphone-based retinal camera. Acta Diabetologica. 2020;57:1493–9. pmid:32748176
  10. Piyasena MMPN, Yip JL, MacLeod D, Kim M, Gudlavalleti VSM. Diagnostic test accuracy of diabetic retinopathy screening by physician graders using a hand-held non-mydriatic retinal camera at a tertiary level medical clinic. BMC Ophthalmology. 2019;19:1–13.
  11. Das T, Takkar B, Sivaprasad S, Thanksphon T, Taylor H, Wiedemann P, et al. Recently updated global diabetic retinopathy screening guidelines: commonalities, differences, and future possibilities. Eye. 2021;35(10):2685–98. pmid:33976399
  12. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig L, et al. STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. Radiology. 2015;277(3):826–32. pmid:26509226
  13. Prathiba V, Rajalakshmi R, Arulmalar S, Usha M, Subhashini R, Gilbert CE, et al. Accuracy of the smartphone-based nonmydriatic retinal camera in the detection of sight-threatening diabetic retinopathy. Indian Journal of Ophthalmology. 2020;68(Suppl 1):S42–S6. pmid:31937728
  14. Scanlon PH, Malhotra R, Greenwood R, Aldington S, Foy C, Flatman M, et al. Comparison of two reference standards in validating two field mydriatic digital photography as a method of screening for diabetic retinopathy. British Journal of Ophthalmology. 2003;87(10):1258–63. pmid:14507762
  15. Tan CH, Kyaw BM, Smith H, Tan CS, Tudor Car L. Use of smartphones to detect diabetic retinopathy: scoping review and meta-analysis of diagnostic test accuracy studies. Journal of Medical Internet Research. 2020;22(5):e16658. pmid:32347810
  16. Sengupta S, Sindal MD, Baskaran P, Pan U, Venkatesh R. Sensitivity and specificity of smartphone-based retinal imaging for diabetic retinopathy: a comparative study. Ophthalmology Retina. 2019;3(2):146–53. pmid:31014763
  17. Zhang W, Nicholas P, Schuman SG, Allingham MJ, Faridi A, Suthar T, et al. Screening for diabetic retinopathy using a portable, noncontact, nonmydriatic handheld retinal camera. Journal of Diabetes Science and Technology. 2017;11(1):128–34. pmid:27402242
  18. British Diabetic Association. Retinal photography screening for diabetic eye disease. London: BDA; 1997.
  19. Kim TN, Myers F, Reber C, Loury P, Loumou P, Webster D, et al. A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging. Translational Vision Science & Technology. 2018;7(5):21. pmid:30280006
  20. Krieger B, Hallik R, Kala K, Ülper K, Polonski M. Validation of mobile-based funduscope for diabetic retinopathy screening in Estonia. European Journal of Ophthalmology. 2022;32(1):508–13. pmid:33164567
  21. Jansen LG, Shah P, Wabbels B, Holz FG, Finger RP, Wintergerst MW. Learning curve evaluation upskilling retinal imaging using smartphones. Scientific Reports. 2021;11(1):12691. pmid:34135452
  22. Adam MK, Brady CJ, Flowers AM, Juhn AT, Hsu J, Garg SJ, et al. Quality and diagnostic utility of mydriatic smartphone photography: the smartphone ophthalmoscopy reliability trial. Ophthalmic Surgery, Lasers and Imaging Retina. 2015;46(6):631–7. pmid:26114843
  23. The Diabetic Retinopathy Barometer report: global findings. Toronto, ON, Canada: International Federation on Ageing; 2017.
  24. Burton MJ, Ramke J, Marques AP, Bourne RR, Congdon N, Jones I, et al. The Lancet Global Health Commission on global eye health: vision beyond 2020. The Lancet Global Health. 2021;9(4):e489–e551. pmid:33607016
  25. Sabanayagam C, Yip W, Ting DS, Tan G, Wong TY. Ten emerging trends in the epidemiology of diabetic retinopathy. Ophthalmic Epidemiology. 2016;23(4):209–22. pmid:27355693
  26. World Health Organization. Global Initiative for the Elimination of Avoidable Blindness: action plan 2006–2011. 2007.
  27. Wintergerst MW, Mishra DK, Hartmann L, Shah P, Konana VK, Sagar P, et al. Diabetic retinopathy screening using smartphone-based fundus imaging in India. Ophthalmology. 2020;127(11):1529–38. pmid:32464129
  28. Rajalakshmi R, Prathiba V, Arulmalar S, Usha M. Review of retinal cameras for global coverage of diabetic retinopathy screening. Eye. 2021;35(1):162–72. pmid:33168977
  29. Malerbi FK, Andrade RE, Morales PH, Stuchi JA, Lencione D, de Paulo JV, et al. Diabetic retinopathy screening using artificial intelligence and handheld smartphone-based retinal camera. Journal of Diabetes Science and Technology. 2022;16(3):716–23. pmid:33435711
  30. Natarajan S, Jain A, Krishnan R, Rogye A, Sivaprasad S. Diagnostic accuracy of community-based diabetic retinopathy screening with an offline artificial intelligence system on a smartphone. JAMA Ophthalmology. 2019;137(10):1182–8. pmid:31393538
  31. Hasan SU, Siddiqui MR. Diagnostic accuracy of smartphone-based artificial intelligence systems for detecting diabetic retinopathy: a systematic review and meta-analysis. Diabetes Research and Clinical Practice. 2023:110943. pmid:37805002