Doctors Were Worse at Spotting Cancer After Leaning on AI, Study Finds

Artificial intelligence tools have been shown to help doctors detect pre-cancerous growths in the colon, but don't even think about taking those tools away once you've introduced them. A new study published this week in The Lancet Gastroenterology & Hepatology found that doctors who are given AI tools to assist with spotting potential cancer risks in patients get worse at making those same observations when they go back to working without AI's help.

The study looked at four endoscopy centers in Poland, tracking detection rates for pre-cancerous growths during the three months before AI tools were introduced and the three months after. Once AI was available, colonoscopies were randomly assigned to be performed either with or without AI support. The researchers found that doctors who performed colonoscopies without AI, after having had its assistance available, saw their detection rates drop to roughly 20% below their pre-AI baseline.

Making the results all the more troubling, the 19 doctors who participated in the study were all highly experienced, each having performed more than 2,000 colonoscopies. If those doctors can fall prey to de-skilling, watching their own abilities erode through reliance on AI tools, the outcomes for less experienced doctors could be even worse.

There is little doubt that AI tools can help in medical settings. Numerous studies suggest that AI can facilitate everything from the detection of cancers to the diagnosis of illnesses based on a patient's medical history. Analyzing information against a wealth of previous examples is kinda the bread and butter of AI (you know, as opposed to generating braindead content slop), and there is evidence that humans can augment their own abilities by using AI tools. Studies in medical settings have found that doctors who use these tools can produce better outcomes for their patients.

But no one, including doctors, is immune to the risk of shutting their brain off and relying on AI rather than their own skills. Earlier this year, Microsoft published a study that found knowledge workers who lean on AI stop thinking critically about the work they are doing, confident that the AI's assistance will be enough to get the job done. Researchers at MIT similarly found that relying on ChatGPT for essay writing resulted in less critical engagement with the material. In the long term, there's a real risk that reliance on AI will erode our ability to problem-solve and reason, which is not ideal given that AI continues to generate bad information.

The American Medical Association found that about two in three physicians have already adopted AI to augment their abilities. Hopefully, they’re still able to identify when it does something like hallucinating a body part that doesn’t exist.
