An 11-year-old girl from Kerala, Leena Rafeeq, has created an artificial intelligence (AI)-based application that detects eye diseases and other conditions through a unique scanning method using an iPhone. Ms Rafeeq has named the application ‘Ogler EyeScan’ and developed it at the age of 10.
She took to LinkedIn to explain how her application works, along with a video demonstration. She said that her application can analyse various parameters like light and colour intensity, distance and look-up points to locate the eyes within the frame using “advanced computer vision and machine learning”. “It also identifies any light burst issues and if the eyes are positioned exactly inside the scanner frame,” she added.
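The frame checks she describes, validating lighting, glare and eye position before a scan is accepted, could be sketched in Swift roughly as follows. Everything here (type names, thresholds, and the metrics themselves) is an illustrative assumption, not Ogler EyeScan's actual code:

```swift
import Foundation

// Hypothetical summary of one camera frame; the field names and units
// are illustrative assumptions, not taken from Ogler EyeScan itself.
struct FrameMetrics {
    let meanBrightness: Double   // 0.0 (dark) ... 1.0 (blown out)
    let glareFraction: Double    // fraction of pixels near saturation
    let eyesInsideFrame: Bool    // did the eye detector land inside the overlay?
}

enum ScanCheck {
    case ok
    case retake(reason: String)
}

// A minimal gate before running the diagnostic models: reject frames
// that are badly lit, glary, or where the eyes are not positioned
// inside the scanner frame, and ask the user to re-scan.
func validate(_ m: FrameMetrics) -> ScanCheck {
    guard m.eyesInsideFrame else {
        return .retake(reason: "Position both eyes inside the scanner frame")
    }
    guard m.meanBrightness > 0.2 && m.meanBrightness < 0.9 else {
        return .retake(reason: "Lighting is too dark or too bright")
    }
    guard m.glareFraction < 0.05 else {
        return .retake(reason: "Light burst detected, please re-scan")
    }
    return .ok
}
```

In a real capture loop these metrics would come from the camera pipeline; the point is simply that frames failing the checks prompt a re-scan instead of being passed on to the diagnostic models.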
After the scan is captured correctly, trained models are used to diagnose potential eye diseases or conditions like Arcus, Melanoma, Pterygium and Cataracts. Ms Rafeeq said, “This App was developed natively with SwiftUI without any third-party libraries or packages, and it took me six months of research and development to bring this innovative app to life.”
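The diagnosis step, as described, hands a validated scan to trained models and reports the likely condition. A minimal sketch of the post-processing, assuming the models emit per-label probabilities (the condition labels come from the article; the threshold handling is an assumption):

```swift
import Foundation

// Condition labels mentioned in the article, plus a healthy class;
// the confidence handling below is an illustrative assumption, not
// Ogler EyeScan's actual pipeline.
let labels = ["Arcus", "Melanoma", "Pterygium", "Cataract", "Healthy"]

// Pick the top-scoring label from a model's probability output,
// falling back to an inconclusive result below a confidence threshold.
func diagnose(probabilities: [String: Double],
              threshold: Double = 0.5) -> String {
    guard let best = probabilities.max(by: { $0.value < $1.value }),
          best.value >= threshold else {
        return "Inconclusive - please re-scan"
    }
    return best.key
}
```

For example, `diagnose(probabilities: ["Cataract": 0.8, "Healthy": 0.2])` yields `"Cataract"`, while low-confidence outputs fall through to a re-scan prompt rather than a diagnosis.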
“I learned more about different eye conditions, computer vision, algorithms, machine learning models, and advanced levels of Apple iOS development, including sensors data, AR, CreateML, CoreML, and more,” she continued.
However, it is to be noted that Ogler EyeScan is only supported on iPhone 10 and above with iOS 16+. She added that her app is currently under review on the App Store. “I am hopeful that it will be listed soon,” the 11-year-old added.
Many users were left extremely impressed with her innovation and congratulated her on achieving this feat at such a young age.
“Good example of how we can reduce / curtail equity in health using AI. Great work,” said a user.
“You deserve a huge congrats Leena. Such an awesome job at the age of 10,” said another person.
A third person said, “Wow, that’s amazing! You have accomplished so much with your creation of Ogler EyeScan and I wish you the best of luck and some amazing appstore reviews.”
Replying to a user, Ms Rafeeq said that the model accuracy is “almost 70 per cent”. She said, “However, I am encountering difficulties with the presence of glare and burst from lights, caused by the distance required for capturing scans through the phone device. To address this, we have implemented metrics and detection for light-related issues, which prompt users to conduct a re-scan. I am currently focused on training more sophisticated models, and once Ogler is accepted by the Appstore, we will release an update.”