Interventional gastroenterology

AI won’t replace the endoscopist – but it will change the way you work


Associate Professor Tyler Berzin

Artificial intelligence (AI) is going to change things up in the endoscopy suite – and it’s not all about polyp detection.

Associate Professor Tyler Berzin, from Harvard Medical School and the Center for Advanced Endoscopy at the Beth Israel Deaconess Medical Center, told the GESA AGW 2020 meeting that the potential for AI went beyond computer-aided detection and diagnosis.

He said the biggest problem for many clinicians was not polyp detection but the non-clinical workload, such as documentation and patient data entry.

“I think we need to acknowledge that these are real issues to address with AI as well. There is potential to reduce documentation burden in a big way.”

He said voice recognition and ambient clinical intelligence meant AI could listen to clinician-patient conversations and automatically document the visit intelligently, not just word for word.

Natural language processing and AI could also be used to pull data from charts and generate lists for upload to a registry or to send a reminder to patients who were overdue for a colonoscopy.
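The reminder half of that workflow is straightforward date arithmetic once the last-procedure dates have been extracted from charts – the NLP extraction is the hard part. A minimal sketch, with an assumed ten-year surveillance interval and entirely hypothetical patient data:

```python
from datetime import date

def overdue_for_colonoscopy(patients, today, interval_years=10):
    """Return names of patients due a reminder.

    `patients` is a list of (name, last_colonoscopy_date) tuples,
    as an NLP pipeline might extract them from charts.
    """
    overdue = []
    for name, last in patients:
        # Due on the anniversary of the last procedure plus the interval.
        due = last.replace(year=last.year + interval_years)
        if due <= today:
            overdue.append(name)
    return overdue

patients = [
    ("Patient A", date(2008, 5, 1)),   # more than 10 years ago
    ("Patient B", date(2015, 3, 12)),  # within the interval
]
print(overdue_for_colonoscopy(patients, today=date(2020, 10, 1)))  # ['Patient A']
```

The interval and data layout here are illustrative only; real surveillance intervals depend on prior findings and local guidelines.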

“That would save hundreds of hours in administrative work and we are hopeful that these types of technology will be available to us in the future. We would love to see these types of technologies simplify what we do and make our practices more enjoyable and more effective.”

Computer vision

Associate Professor Berzin said AI was just reaching the point where, for very specific tasks, it was starting to match and in some cases exceed human performance.

“And that can include playing chess, classifying skin lesions, looking at X-rays … and now polyp detection.”

He said endoscopists were already good at detecting subtle polyps but computer-aided detection (CADe) would encourage endoscopists to “go back and take a second look”.

“CADe for polyps isn’t perfect, and if you begin using this in the endoscopy suite you are going to see a few funny behaviours – false positives are one of them.”

“It’s usually not too hard to say ‘well, that is a bubble’ and these false positive experiences are not typically very distracting and I find them not to be a huge issue.”

“I also make the point that even with the help of a computer, good practice for cleaning and fully inflating the lumen is still important. The computer doesn’t like it and frankly we don’t like it as endoscopists when the lumen is not fully inflated and sometimes you get noise using the computer vision software in this setting.”

He said the technical challenge of computer-aided polyp detection was largely a solved problem.

“I would argue that what we really need going forward in 2020 is better and higher quality clinical validation.”

He said the first prospective RCT of computer-aided polyp detection, in 1,000 patients, had found a higher adenoma detection rate (ADR) with CADe than with standard colonoscopy.

“There are some key weaknesses to this study which are important to acknowledge. Crucially most of the polyps detected were very small. A major part of the ADR benefit was driven by finding diminutive adenomas. How important is this during screening colonoscopy?”

Associate Professor Berzin said polyp detection was not “the only game in town”.

Instead, endoscopists might expect to see computer-aided detection and diagnosis integrated during colonoscopy within a few years. However, computer-aided diagnosis required magnification colonoscopy, which was not yet widely available.

He said some computer vision software would be able to assess prep quality, distension and other measures of adequacy – “a fingerprint for the quality of each procedure”.

These might be a more reliable way of measuring quality than current measures such as withdrawal time.

AI in gastroenterology ahead of the rest

Associate Professor Berzin said a Nature Medicine article had shown four of the five RCTs of AI in medicine were in the field of gastroenterology.

Most concerned polyp detection, but there was ongoing work in capsule endoscopy, Barrett’s oesophagus, monitoring blind spots during esophagogastroduodenoscopy, and more.

An ASGE position statement, led by Associate Professor Berzin, has also outlined a framework for prioritising future work and collaborative efforts.

Balancing high clinical value against questions most likely to be solvable by AI, it recommends developing AI algorithms for the detection of gastric cancer precursor lesions and of dysplasia in IBD.

“In GI, because of all this clinical activity, we also have a very rich technology ecosystem for developing AI, so I am very hopeful that, given these advancements, expertise and collaborations between industry, clinicians and academics, we are going to continue leading the field over the next few years,” he said.

Challenges but confidence remains

Associate Professor Berzin said a 15-minute endoscopy procedure generated about 27,000 images – a rich source of data, but labour intensive to label.
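That figure is simply duration times frame rate. Assuming a standard 30 frames-per-second video feed (the frame rate is not stated in the talk), the arithmetic works out:

```python
# Rough frame count for an endoscopy recording: duration x frame rate.
# 30 fps is an assumed standard video frame rate.
minutes = 15
fps = 30
frames = minutes * 60 * fps
print(frames)  # 27000 - matches the ~27,000 images cited for a 15-minute procedure
```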

He said the huge amount of human and medical effort required to obtain, curate and label endoscopic images was a limiting factor for medical AI.

“One of the challenges in this field is going to be comparing the behaviour of different AI detection softwares … and what that means for accuracy. I don’t think we have great ways to compare one software to the next.”

Associate Professor Berzin said AI required regulatory approval to move forward. While Japan and Europe had initial approval for some of these technologies, the FDA was moving more slowly, waiting for more data from US clinical trials.

However, he was optimistic that AI had the potential not just to improve patient care but to improve clinical practice.

“The last few years have seen some really dramatic advances for AI and computer vision. We are really at a point on the technology side where we are very ready to start applying this to some of our own clinical challenges.”

“This is not AI replacing physicians. This is AI augmenting our workflow and making our attentional weaknesses a little less weak, and hopefully also helping with some of the painful work we do, such as documentation, which really is not MD-level work. I would love to see AI helping us to do a little bit less of that.”


© 2022 the limbic