
A new Google algorithm can predict heart disease just from scanning your eyes

Eye scans processed by machine learning could be used to spot early signs of heart disease, according to the research.

Scientists at Google have discovered a new way to assess someone’s risk of developing heart disease – a scan of a patient’s eye that uses machine learning to predict their chances of being affected.

The technology takes data from a scan of the eye and deduces information about the patient, such as their age, gender, blood pressure and whether or not they smoke.

From this, the software is then able to predict how likely they are to suffer from heart issues.
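To make the idea concrete: a risk score built from factors like those named above might resemble a simple logistic model. This is purely an illustrative sketch with made-up weights, not Google's method, which is a deep neural network trained on retinal images.

```python
import math

# Hypothetical weights for illustration only; Google's actual model and its
# parameters are not public in this form.
WEIGHTS = {"age": 0.04, "male": 0.3, "systolic_bp": 0.02, "smoker": 0.7}
BIAS = -6.0  # made-up intercept

def heart_risk(age, male, systolic_bp, smoker):
    """Return a 0-1 pseudo-probability of heart issues from risk factors."""
    z = (BIAS
         + WEIGHTS["age"] * age
         + WEIGHTS["male"] * male
         + WEIGHTS["systolic_bp"] * systolic_bp
         + WEIGHTS["smoker"] * smoker)
    return 1 / (1 + math.exp(-z))  # logistic function squashes score to 0-1

# A 60-year-old male smoker with systolic blood pressure of 140:
print(heart_risk(age=60, male=1, systolic_bp=140, smoker=1))
```

The higher the returned score, the greater the predicted risk; real models of this kind learn their weights from patient data rather than having them set by hand.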

The technology has been built in collaboration with Verily, the health tech subsidiary of Google’s parent firm Alphabet.

Details of the research have been published in the journal Nature Biomedical Engineering, and explain how the scientists trained their machine learning algorithm to make its predictions using data from almost 300,000 patients.

An original image of the interior rear of the eye (left) and a second where Google’s algorithm has highlighted blood vessels in green to predict blood pressure (Google/Verily)

Eye scans have been used in medical research before to build a more general picture of a patient’s health – the number of blood vessels visible in microscopic scans of the back of the eye makes it a good place to gain insight into the body’s overall condition.

In their research, the scientists said the method was beneficial because data can be obtained “quickly, cheaply and non-invasively”.

The use of artificial intelligence and machine learning to improve medical treatment, particularly around diagnosis and the identification of ailments, is becoming steadily more common. Last year, researchers at Stanford University unveiled an algorithm that could identify skin cancer.

That software had been trained by analysing more than 100,000 images of moles and rashes.

Google’s Verily wing also has a history of focusing on the human eye to aid diagnosis. A smart contact lens prototype that could monitor glucose levels in the body was announced in 2014, but has yet to be seen.