Google Neural Network Can Predict Your Health Status From Your Retina

Machine learning can be used to recognize faces, drive cars, and even spot exoplanets, but now Google is teaching its computers to do something even more unexpected. Researchers at Google have developed a way to predict a person’s blood pressure, age, and smoking status from an image of their retina, according to Scientific American. This data may even be enough to determine when someone is at high risk of a heart attack.

Google’s research used a convolutional neural network, the same biologically inspired design used to identify objects in photos. These networks can do plenty of other things, though, if you train them on different data sets. Rather than treating each pixel in isolation, a convolutional network learns local patterns — edges, textures, shapes — that it can recognize anywhere in the frame, which lets it build up an understanding of the image as a whole. These networks are getting very good at it.
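The core operation is the convolution itself: a small grid of weights (a kernel) slides across the image, and each output value summarizes one local patch. Here is a minimal sketch in pure Python (the image and the edge-detecting kernel are made-up toy data, not anything from Google's model):

```python
def conv2d(image, kernel):
    """Slide a small kernel over the image, producing a feature map.

    Each output value is the weighted sum of one local patch, so the
    same pattern detector is applied at every position in the image.
    """
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Weighted sum of the kh-by-kw patch anchored at (i, j).
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A tiny synthetic image with a vertical boundary between dark (0) and
# bright (1) regions, and a kernel that responds to that boundary:
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [
    [1, -1],
    [1, -1],
]
feature_map = conv2d(image, edge_kernel)
# The feature map lights up (nonzero) only where the dark-to-bright
# edge sits, and is zero over the flat regions.
```

In a real network, many such kernels are learned from data and stacked in layers, so later layers detect combinations of the simple patterns found by earlier ones.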

Google’s research arm had the idea to apply neural network design to biological problems, but it didn’t start with the retina. In a past study, Google created a tool called DeepVariant that could scan a DNA sequence to find small mutations that would be missed by other methods. Outside of Google, researchers from the Allen Institute for Cell Science in Seattle are using convolutional neural networks to automatically identify cellular organelles in 3D images from microscopes. The components are colored by the computer, which eliminates the need to stain cells.

Deep neural networks have at least one hidden layer, and often hundreds. Each layer adds another round of large matrix multiplications, which makes these networks expensive to run on conventional hardware.
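A hidden layer is essentially a weight matrix applied between the input and the output, followed by a nonlinearity. A minimal forward pass through a stack of hidden layers might look like the following sketch (the layer sizes and random weights are illustrative only):

```python
import random

def forward(x, layers):
    """Pass input x through each hidden layer in turn.

    Every layer performs a full matrix-vector multiply followed by a
    ReLU nonlinearity -- this per-layer matrix math is what makes deep
    stacks costly on general-purpose processors.
    """
    for weights in layers:
        x = [max(0.0, sum(xi * w for xi, w in zip(x, row)))
             for row in weights]
    return x

def make_layer(n_in, n_out, rng):
    """Random weight matrix mapping n_in inputs to n_out outputs."""
    return [[rng.uniform(-1, 1) for _ in range(n_in)]
            for _ in range(n_out)]

rng = random.Random(0)
# Three hidden layers of 8 units each, fed a 4-value input vector:
layers = [make_layer(4, 8, rng), make_layer(8, 8, rng), make_layer(8, 8, rng)]
output = forward([0.5, -0.2, 0.1, 0.9], layers)
```

With hundreds of layers and millions of weights, the same loop becomes billions of multiply-adds per input, which is why this work is typically offloaded to GPUs or other accelerators.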

To develop its retina-scanning neural network, Google needed a lot of data. It trained the network on retinal images from 284,335 patients, then validated it against two independent data sets of 12,026 and 999 patients. This was an important step, as it showed the model could accurately predict health metrics. From retinal images alone, the model can determine age to within about 3 years, gender (97 percent accuracy), smoking status (71 percent accuracy), blood pressure (within 11.23 mm Hg), and how likely it is that someone will have an “adverse cardiac event.” The model predicted that last one with 70 percent accuracy. It’s not a sure thing, but that’s impressive when you consider it’s only looking at blood vessels in the eye.

The study is still a preprint and has not been peer reviewed. Other researchers will need to examine the models and validate the results before we know the real impact, but this approach could be a boon to medicine. Even if it’s not perfectly accurate, a retina scan is a simple, noninvasive procedure that could give doctors more data to work with.
