Remember that song by Bryan Adams that went “Look into my eyes… And when you find me there, you’ll search no more”? Google’s new AI algorithm goes one better — it can look into your eyes, search, and find signs of cardiovascular risk.
Developed by researchers from Google and its health-tech subsidiary Verily, the technique is described in a paper titled “Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning”, published in an international science journal.
The algorithm can correctly infer information such as an individual’s age, smoking habits and blood pressure by examining scans of the back of the person’s eyes. Using this information, it can predict the risk of a severe cardiac event, such as a heart attack, with accuracy close to that of current diagnostic procedures.
Luke Oakden-Rayner, a medical researcher at the University of Adelaide, told a website, “They’re taking data that’s been captured for one clinical reason and getting more out of it than we currently do.” He added that the method would not replace doctors but would instead expand their current diagnostic abilities.
The researchers from Google trained the algorithm by feeding retinal scans of around 300,000 patients from the United Kingdom and the US into neural networks. The networks were trained to look for patterns in the scans and to associate those telltale signs with the criteria needed to predict the probability of cardiovascular danger.
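At its core, this is ordinary supervised learning: each fundus image is paired with known labels (age, smoking status, blood pressure), and the model learns the mapping from pixels to labels. The paper uses a deep convolutional network; the toy sketch below replaces it with a single linear layer fit by least squares on synthetic data, purely to illustrate the image-in, risk-factor-out framing. All data and numbers here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 1,000 tiny "retinal scans" (16x16 grayscale) and a
# continuous label (say, age) loosely tied to overall pixel intensity.
n, h, w = 1000, 16, 16
images = rng.random((n, h, w))
ages = 40 + 30 * images.mean(axis=(1, 2)) + rng.normal(0, 1, n)

# Flatten each image into a feature vector and fit one linear layer by
# least squares -- a drastic simplification of the paper's deep network.
X = images.reshape(n, -1)
X = np.hstack([X, np.ones((n, 1))])  # bias column
weights, *_ = np.linalg.lstsq(X, ages, rcond=None)

predicted = X @ weights
mae = np.abs(predicted - ages).mean()
print(f"mean absolute error on training data: {mae:.2f} years")
```

The real system differs in every practical respect (architecture, scale, validation), but the training loop has the same shape: images and ground-truth risk factors go in, a predictor of those risk factors comes out.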
This opens up a new avenue for the use of AI in healthcare, especially for diagnosis without invasive methods. The notion of using retinal scans to predict cardiac issues is not unfounded, either: conditions such as diabetes and high blood pressure cause visible changes to an individual’s retina, and both conditions inherently raise cardiovascular risk.
“Diagnosis is about to get turbo-charged by technology. And one avenue is to empower people with rapid ways to get useful information about their health,” said Harlan Krumholz, a cardiologist at Yale University.
However, the new method has a long way to go before it sees clinical use, as it is not without its flaws. When shown scans from two different patients, one who suffered a cardiovascular event and one who did not, the algorithm picked out the at-risk patient 70 percent of the time. That is two points short of the 72 percent accuracy registered by SCORE, the commonly used invasive method.
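That 70 percent figure is the standard pairwise way of scoring a risk model (often called the c-statistic): take one patient who had an event and one who did not, and ask how often the model assigns the higher risk score to the patient who had the event. The snippet below computes this statistic from scratch; the risk scores in it are invented solely to demonstrate the calculation.

```python
def concordance(scores_event, scores_no_event):
    """Fraction of (event, non-event) patient pairs where the model scores
    the patient who had the cardiac event higher -- the c-statistic."""
    pairs = 0
    concordant = 0.0
    for s1 in scores_event:
        for s0 in scores_no_event:
            pairs += 1
            if s1 > s0:
                concordant += 1
            elif s1 == s0:
                concordant += 0.5  # ties count as half a pair
    return concordant / pairs

# Hypothetical model scores, for illustration only.
had_event = [0.9, 0.7, 0.6, 0.4]
no_event = [0.8, 0.5, 0.3, 0.2, 0.1]

print(concordance(had_event, no_event))  # -> 0.8
```

A score of 0.5 would mean the model is guessing; 1.0 would mean it ranks every at-risk patient above every healthy one. On this measure the Google algorithm reached roughly 0.70 against SCORE’s 0.72.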
With Google already working on using deep learning for the diagnosis of diseases such as diabetic retinopathy and cancer, the technology’s contributions to healthcare may grow significantly in the years ahead.