Decentralised screening in ophthalmology
A smartphone-based handheld optical device that captures retinal images automatically, with intelligent guidance during acquisition. Computer-aided diagnosis of eye diseases runs on-device, as part of the fundus camera.
An estimated 217 million people worldwide live with moderate to severe vision impairment, and Age-related Macular Degeneration, Glaucoma and Diabetic Retinopathy (DR) are among the leading causes. DR can lead to complete vision loss and is tied to the systemic nature of diabetes, including factors such as glycemic control and hypertension. Preventing its progression requires extensive screening. The recommendation of yearly retinal imaging for the diabetic population is often not followed, however, due to high costs and a lack of trained personnel.
The Ophtha portfolio by Fraunhofer Portugal AICOS includes the EyeFundusScope, a solution for portable, accessible and non-mydriatic retinal imaging (without drug-induced pupil dilation). The eye-screening technology consists of an affordable, 3D-printed optical device that can be operated by non-experts in ophthalmology.
In line with our contribution to reducing the prevalence of avoidable blindness worldwide, the Ophtha portfolio includes a mobile application that employs parallel computation and artificial intelligence offline to control and simplify retinal image acquisition, through real-time guidance and evaluation of the right moment for auto-capture. Computer-Aided Diagnosis running on-device rates the risk level of DR, visualizes computer-detected annotations of exudates and microaneurysms, and classifies images as pathology-free or DR cases.
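As a rough illustration of the classification step described above, the sketch below aggregates per-lesion detection counts into a coarse risk rating. The function, thresholds and lesion categories here are hypothetical assumptions for illustration only, not the actual EyeFundusScope pipeline:

```python
# Hypothetical sketch: turning counts of detected lesions (exudates,
# microaneurysms) into a coarse DR risk label. The thresholds are
# illustrative assumptions, not the real on-device model.

def rate_dr_risk(exudates: int, microaneurysms: int) -> str:
    """Map lesion counts to a coarse risk label (illustrative only)."""
    lesions = exudates + microaneurysms
    if lesions == 0:
        return "pathology-free"
    # Any detected lesion flags a DR case; the count grades the risk.
    return "DR: high risk" if lesions >= 5 else "DR: low risk"

print(rate_dr_risk(0, 0))  # pathology-free
print(rate_dr_risk(1, 2))  # DR: low risk
```

In practice the on-device model works on image features rather than bare counts; this sketch only shows the shape of the pathology-free vs. DR decision.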
Almost 300 samples with laboratory-grade quality have been acquired from real patients, during medical appointments at the ophthalmology service of Centro Hospitalar do Porto (CHP). Our machine learning algorithms have been trained on diverse public databases of conventional table-top fundus cameras, with up to 35,000 samples. In predicting DR, our models achieved 87% Sensitivity (SENS) and 66% Specificity (SPEC) on a balanced dataset of 1,200 samples acquired with a 45° field-of-view (FOV). On the unbalanced CHP dataset of mobile acquisitions at 25° FOV, 73% SENS and 94% SPEC were achieved. Further data collections are being planned to validate the mobile fundus camera with 40° FOV in the screening context, in patients with diabetes.
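For readers less familiar with the reported metrics, sensitivity and specificity are computed from the binary confusion matrix of the DR classifier. The sketch below shows the standard definitions; the counts are illustrative values consistent with a balanced 1,200-sample set, not the actual study data:

```python
# Sensitivity = TP / (TP + FN): fraction of DR cases correctly flagged.
# Specificity = TN / (TN + FP): fraction of healthy retinas correctly cleared.
# The counts below are illustrative, not the real confusion matrix.

def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Example: 1,200 samples, balanced (600 DR, 600 pathology-free).
sens, spec = sens_spec(tp=522, fn=78, tn=396, fp=204)
print(f"SENS={sens:.0%}, SPEC={spec:.0%}")  # SENS=87%, SPEC=66%
```

The trade-off between the two figures matters in screening: higher sensitivity misses fewer DR cases, while higher specificity reduces unnecessary referrals.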