
Google's DeepMind to create product to spot sight-threatening disease

Published Mon, Aug 13, 2018 · 03:26 PM

[LONDON] DeepMind, the London-based artificial intelligence company that is owned by Alphabet, plans to develop a medical product that will help doctors to detect more than 50 sight-threatening conditions from a common type of eye scan.

DeepMind trained artificial intelligence software to detect signs of disease better than human doctors, according to a study published Monday in the scientific journal Nature Medicine.

DeepMind and its partners in the research, London's Moorfields Eye Hospital and the University College London Institute of Ophthalmology, said they plan to run prospective clinical trials of the technology in 2019. If those trials are successful, DeepMind said it would seek to create a regulator-approved product that Moorfields could roll out across the UK.

It said the product would be free for an initial five-year period. The software would mark the first time a machine-learning algorithm from DeepMind has made its way into a healthcare product. Alphabet has several initiatives aimed at using artificial intelligence to improve healthcare.

Earlier this year, Verily, an Alphabet company that says its goal is to extend human lifespans, teamed up with AI experts from Alphabet's Google to develop an algorithm that could spot a range of cardiovascular issues from a different kind of retinal image. DeepMind itself has an entire division devoted to healthcare, and has research projects with the UK's National Health Service and with the US Department of Veterans Affairs, among others.

DeepMind's work with the UK's National Health Service has been controversial. Last year, the UK data privacy watchdog said a different NHS trust, the Royal Free Hospital, had illegally provided 1.6 million patient records to DeepMind to help it develop a mobile app that would alert doctors if patients were at risk of developing acute kidney injury.

In the eye scan study, DeepMind and its NHS partners took steps to avoid similar issues. Pearse Keane, the senior doctor who led the Moorfields team working on the project, said in an interview that the hospital "did everything we could" to anonymise the 16,000 eye scans it used both to train and test the algorithms DeepMind developed.

This process was approved and audited by the hospital's information governance department, and DeepMind was barred from trying to re-identify patients whose scans were being used.

The NHS also stressed that Moorfields owns the data used to develop the software, and can use it freely in other research.

The DeepMind-Moorfields research looked at a type of eye scan called optical coherence tomography (OCT) that can be used to diagnose age-related macular degeneration (AMD), now the leading cause of blindness in the developed world, as well as other retinal disorders linked to conditions such as diabetes.

But, Keane said, the use of OCT scanners has outstripped the training of experts who can correctly interpret their imagery. As a result, almost any abnormality picked up on an OCT scan leads to a referral to an ophthalmologist for further review. Between 2007 and 2017, ophthalmology referrals in the UK increased by 37 per cent.

This has led to waiting times that make it difficult to treat quickly those patients who genuinely need urgent intervention to prevent blindness.

To benchmark the system, DeepMind tested the software on 1,000 scans that were not used to train the AI, and compared its performance to that of four senior ophthalmologists and four optometrists who had also been specifically trained to interpret OCT scans.

The researchers found their AI could make the correct referral decision for over 50 eye diseases with 94 per cent accuracy - better than most of the humans. Critically, the software had zero false negatives - cases where it missed indicators of disease - and only two false positives, in which it recommended urgent assessment for patients whose doctors had advised simply monitoring their symptoms. This was better than any of the human experts.

DeepMind's software used two separate neural networks, a kind of machine learning loosely based on how the human brain works. One neural network labels features in OCT images associated with eye diseases, while the other diagnoses eye conditions based on these features. Splitting the task means that - unlike an individual network that makes diagnoses directly from medical imagery - DeepMind's AI isn't a black box whose decision-making rationale is completely opaque to human doctors, Keane said.
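The published paper gives the full architecture; purely as an illustration of the general idea, the sketch below shows what such a two-stage pipeline can look like in PyTorch. The class names (SegmentationNet, ClassificationNet), layer sizes and category counts are invented for this example and are not DeepMind's actual model.

import torch
import torch.nn as nn

class SegmentationNet(nn.Module):
    # Stage 1 (illustrative): label each voxel of the OCT volume with a tissue/pathology class.
    def __init__(self, n_tissue_classes=15):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, n_tissue_classes, kernel_size=1),
        )

    def forward(self, scan):  # scan: (batch, 1, depth, height, width)
        return self.net(scan).softmax(dim=1)  # per-voxel class probabilities

class ClassificationNet(nn.Module):
    # Stage 2 (illustrative): map the labelled tissue map to a referral decision,
    # e.g. urgent, semi-urgent, routine, observation only.
    def __init__(self, n_tissue_classes=15, n_decisions=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool3d(1)
        self.head = nn.Linear(n_tissue_classes, n_decisions)

    def forward(self, tissue_map):
        pooled = self.pool(tissue_map).flatten(1)  # (batch, n_tissue_classes)
        return self.head(pooled)  # logits over referral decisions

# Toy usage with a random stand-in for a 3D OCT volume.
scan = torch.randn(1, 1, 32, 64, 64)
tissue_map = SegmentationNet()(scan)              # inspectable intermediate output
decision_logits = ClassificationNet()(tissue_map)
print(decision_logits.shape)  # torch.Size([1, 4])

Because the intermediate tissue map can itself be inspected, a clinician can in principle see which labelled features drove the referral recommendation, which is the interpretability benefit Keane describes.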

BLOOMBERG

