
Skilled practitioners find their abilities deteriorating due to reliance on artificial intelligence, according to a concerning report

Research shows that incorporating AI into medicine diminishes doctors' diagnostic skills, stirring concern among scientific professionals.

Skills eroding among doctors due to AI usage, warns startling research

A study by Polish researchers, published in The Lancet Gastroenterology & Hepatology, has shed light on a potential drawback of artificial intelligence (AI) in medicine. The study, conducted at endoscopy centers in Poland, investigated the impact of AI on the diagnostic skills of experienced doctors.

The research, led by Dr. Marcin Romanczyk of the Academy of Silesia, analysed over 1,400 colonoscopies and found a roughly 20% relative drop in the detection rate of precancerous lesions after just three months of exposure to AI systems. The finding exposes a fundamental tension in clinical practice: how to adopt new technologies without sacrificing professional autonomy and expertise.

The participating endoscopists were all highly experienced, each having performed more than 2,000 colonoscopies. The impact was measured when doctors performed procedures without the assistance system, revealing a clear loss of ability. The decline in diagnostic skill is directly attributable to technological dependence, a phenomenon described as "de-skilling".

Dr. Catherine Menon of the University of Hertfordshire warned that doctors accustomed to support systems will perform worse if those systems become unavailable due to cyberattacks or technical failures. This is the first clinical evidence that AI can make doctors worse at tasks they once mastered.

While some medical organizations discourage routine use of AI in colonoscopy screening, others promote it based on hypothetical patient preferences rather than performance data. The issue is not simply whether to introduce the technology, but knowing when and how to use it, so that the doctor does not become a mere operator of a machine.

Dr. Omer Ahmad of University College London called the study a warning and said that AI-based medicine needs very clear rules. AI should not replace human intelligence but amplify it, which requires ethical design, continuous training, and clear limits. Otherwise, when the machine turns off, there may be no one left who knows how to heal.

In ultra-efficient but fragile health systems that lean heavily on AI, a server failure could trigger a chain of human errors. The case of AI-assisted colonoscopies raises a disturbing question: if the technology disappears, will the doctor still know how to look, see, and diagnose?

According to the researchers, the detection rate of precancerous adenomas without AI fell from 28.4% to 22.4% after just three months of assisted use. This suggests that while AI improves short-term results, it degrades the professional's skills in the long run, as Romanczyk's team concluded.
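The arithmetic behind these figures is worth spelling out: the fall from 28.4% to 22.4% is a 6-percentage-point absolute drop, which works out to roughly a 21% decline relative to the original rate, consistent with the "roughly 20%" figure cited in coverage of the study. A minimal sketch of the calculation, using the rates reported in the article:

```python
# Adenoma detection rates (%) reported in the article.
before = 28.4  # before exposure to AI assistance
after = 22.4   # without AI, after three months of assisted use

absolute_drop = before - after                   # in percentage points
relative_drop = (before - after) / before * 100  # as a share of the original rate

print(f"Absolute drop: {absolute_drop:.1f} percentage points")
print(f"Relative drop: {relative_drop:.1f}%")
```

The distinction matters when reading headlines: a "20% drop" here refers to the relative decline, not a fall of 20 percentage points.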

Research in other areas, such as clinical psychology or cardiology, has begun to detect similar phenomena where cognitive AI assistants cause greater degradation of human skills than conventional automated systems. This underscores the need for careful consideration and ethical guidelines in the integration of AI into clinical practice.

In conclusion, while AI holds great potential for improving healthcare outcomes, it is crucial to balance its adoption with the preservation of human professional autonomy and expertise. The study serves as a reminder that AI should be used as a tool to augment, not replace, human intelligence in the medical field.
