Robot Radiologists Will Soon Analyze Your X-Rays

X-ray image | Source: Wired

Google is getting really, really good at recognizing photos and videos of cats. All it took was supplying millions of examples so that the company’s software—based on a branch of artificial intelligence called deep learning—could start recognizing the difference between cats and other furry creatures. But Jeremy Howard wants to use deep learning for something a little more practical: diagnosing illnesses. And he’s finally getting his chance.

Today Howard’s company, Enlitic, said it was going to start working with Capitol Health Limited, a radiology clinic with locations across Australia, to have its software look at X-rays.

Enlitic won’t replace radiologists. Instead, its software is designed to help them work more quickly and make fewer mistakes. First, it checks each submitted file to make sure the image matches what the technicians say it’s supposed to be: if an image is tagged as a left knee, for example, it verifies that the image isn’t actually of a right knee. Then it looks for anomalies in the image.
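To make that two-step screening concrete, here is a minimal sketch of the flow in Python. Enlitic hasn’t published its code or API, so every name here (Study, predicted_body_part, find_anomalies, and so on) is hypothetical, and the model calls are stubbed out; the point is only to show the label-consistency check followed by the anomaly scan.

```python
# Hypothetical sketch of the screening flow described above. The real system
# would run deep-learning models where the placeholder functions sit.
from dataclasses import dataclass


@dataclass
class Study:
    image_id: str
    tagged_body_part: str  # what the technician labeled, e.g. "left knee"


def predicted_body_part(study: Study) -> str:
    """Stand-in for an image classifier that infers which body part the
    X-ray actually shows. Here it simply echoes the tag."""
    return study.tagged_body_part


def label_matches(study: Study) -> bool:
    # Step 1: confirm the image matches what the technician says it is,
    # e.g. that a file tagged "left knee" isn't really a right knee.
    return predicted_body_part(study) == study.tagged_body_part


def find_anomalies(study: Study) -> list[str]:
    """Stand-in for the anomaly-detection model; returns suspected findings."""
    return []


def screen(study: Study) -> dict:
    # Step 2: only images that pass the label check are scanned for anomalies.
    if not label_matches(study):
        return {"image_id": study.image_id, "status": "label_mismatch"}
    return {
        "image_id": study.image_id,
        "status": "screened",
        "findings": find_anomalies(study),
    }


if __name__ == "__main__":
    print(screen(Study(image_id="XR-0001", tagged_body_part="left knee")))
```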

Depending on what it finds, it assigns a priority to the X-ray and routes it to a radiologist. For example, if it finds nodules on an image of a pair of lungs, it will…
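The article says the software assigns each X-ray a priority and routes it to a radiologist, but doesn’t describe the scheme in detail. The sketch below shows one plausible way to do that with a simple priority queue: flagged studies jump ahead of routine ones in a radiologist’s worklist. The scoring rule and class names are invented for illustration, not taken from Enlitic.

```python
# Hedged sketch of the prioritize-and-route step; the priority rule is assumed.
import heapq
from itertools import count

URGENT, ROUTINE = 0, 1  # lower number = read sooner


def priority_for(findings: list[str]) -> int:
    # Hypothetical rule: any suspected finding makes the study urgent.
    return URGENT if findings else ROUTINE


class Worklist:
    """Priority queue of screened studies awaiting a radiologist."""

    def __init__(self) -> None:
        self._heap = []
        self._order = count()  # tie-breaker keeps FIFO order within a priority

    def route(self, image_id: str, findings: list[str]) -> None:
        heapq.heappush(
            self._heap,
            (priority_for(findings), next(self._order), image_id, findings),
        )

    def next_study(self):
        _, _, image_id, findings = heapq.heappop(self._heap)
        return image_id, findings


worklist = Worklist()
worklist.route("XR-0002", findings=[])                         # routine film
worklist.route("XR-0003", findings=["suspected lung nodule"])  # flagged study
print(worklist.next_study())  # the flagged study is read first
```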

Read the full story at Wired.