In the next few years, you will probably have your first interaction with a medical artificial intelligence (AI) system. The same technology that powers self-driving cars, voice assistants in the home, and self-tagging photo galleries is making rapid progress in the field of health care, and the first medical AI systems are already rolling out to clinics.
Thinking now about how we will interact with medical AI, what benefits the technology offers, and what challenges we might face will prepare you well for your first encounter with a non-human health care worker.
How AI can diagnose illness
The technology behind these advances is a branch of computer science called deep learning, an elegant process that learns from examples to understand complex forms of data. Unlike previous generations of AI, these systems are able to perceive the world much like humans do: through sight, sound, and the written word.
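To make "learning from examples" concrete, here is a toy sketch of the idea at the heart of deep learning. A real diagnostic system uses deep networks with millions of artificial neurons trained on vast medical datasets; this sketch uses a single neuron (a perceptron) and invented data, so every number in it is an illustrative assumption, not part of any actual medical AI.

```python
import random

# Toy "learning from examples": a single artificial neuron (perceptron)
# learns to separate two clusters of 2-D points. No rules are hand-coded;
# the weights are nudged whenever a prediction disagrees with a labelled
# example, which is the same basic principle deep learning scales up.

random.seed(0)

# Labelled examples: points near (0, 0) are class 0, points near (5, 5) are class 1.
examples = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(50)] + \
           [((random.gauss(5, 1), random.gauss(5, 1)), 1) for _ in range(50)]

w = [0.0, 0.0]  # weights, adjusted during training
b = 0.0         # bias

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Training loop: nudge the weights toward each misclassified example.
for _ in range(20):
    for x, label in examples:
        error = label - predict(x)
        w[0] += 0.1 * error * x[0]
        w[1] += 0.1 * error * x[1]
        b += 0.1 * error

accuracy = sum(predict(x) == label for x, label in examples) / len(examples)
print(f"training accuracy: {accuracy:.2f}")
```

Swap the 2-D points for pixels of a retinal photograph and the single neuron for a deep network, and you have the shape of the systems described below.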
While most people take these skills for granted, they underpin much of human expertise in fields like medicine. Because deep learning gives computers these same abilities, many medical tasks can now be handled by artificial intelligence.
In the last 12 months, researchers have revealed computer systems that can diagnose diabetic eye disease, skin cancer, and arrhythmias at least as well as human doctors. These examples illustrate three ways patients will interact with medical AI in the future.
The first of these three ways is the most traditional, and will occur where specialised equipment is needed to make a diagnosis. You will make an appointment for a test, go to the clinic and receive a report. While the report will be written by a computer, your experience as a patient will be largely unchanged.
Google’s diabetic eye disease AI is an example of this approach. It was trained to recognise the leaky, fragile blood vessels that occur at the back of the eye in poorly controlled diabetes, and the AI is now working with real patients in several Indian hospitals.
The second way of interacting with medical AI will be the most radical, because many diagnostic tasks don’t need any special equipment at all. The Stanford team that created a skin cancer detector as accurate as dermatologists is already working on a smartphone app.
Before long, people will be able to take their own skin lesion selfies and have their blemishes analysed on the spot. This AI is leading the race to become the first app that can reliably assess your health without a human doctor involved.
The third method of interaction sits somewhere in between. Detecting abnormal heart rhythms requires an electrocardiogram (ECG), but ECG sensors can be built into cheap wearable devices and connected to a smartphone. A patient could wear a monitor every day, record every heartbeat, and only occasionally see their doctor to review the results. If something serious occurred and the rhythm changed suddenly, the patient and their doctor could be notified immediately.
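The monitoring workflow above can be sketched in a few lines. This is a toy illustration only, not a medical device: the window size, the alert threshold, and the idea of flagging any beat interval that deviates sharply from the recent baseline are all simplifying assumptions, standing in for the far more sophisticated arrhythmia models the research describes.

```python
from statistics import mean, stdev

# Toy sketch: flag a sudden rhythm change in a stream of beat-to-beat
# intervals (in milliseconds) from a hypothetical wearable ECG.

def detect_rhythm_change(intervals_ms, window=30, z_threshold=4.0):
    """Return the index where an interval deviates sharply from the
    recent baseline, or None if the rhythm stays steady."""
    for i in range(window, len(intervals_ms)):
        baseline = intervals_ms[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(intervals_ms[i] - mu) / sigma > z_threshold:
            return i  # in a real system: notify the patient and doctor
    return None

# Steady resting rhythm (~800 ms between beats) followed by a sudden jump.
steady = [800 + (i % 5) for i in range(60)]   # small natural variation
event = steady + [400]                        # abrupt halving of the interval
print(detect_rhythm_change(steady))  # None: no alert
print(detect_rhythm_change(event))   # 60: alert at the sudden change
```

The design choice this illustrates is the one in the paragraph above: the wearable does the continuous watching, and a human doctor is pulled in only when something looks wrong.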