The Pathologist / Issues / 2022 / Sep / Diagnosis: Uncert(AI)n
Digital and computational pathology · Software and hardware · Technology and innovation · Research and Innovations

Diagnosis: Uncert(AI)n

The old phrase “garbage in, garbage out” still rings true

By George Francis Lee 09/06/2022 News 1 min read


Diagnostic artificial intelligence (AI) is developing rapidly – but concerns that the technology may reinforce pre-existing biases are keeping pace. Although there is still debate over when AI can perform reliably in a clinical setting, skepticism toward the concept’s overall validity appears to be growing – with datasets in the direct line of fire.

A notable 2021 study by Seyyed-Kalantari and colleagues found that widely used models trained on chest X-ray datasets showed a disparity between ethnic groups in accurately detecting disease known to be present. Specifically, Black, Hispanic, and other underserved groups received significantly more false “healthy” classifications than their White counterparts (1). In this case, the AI models appeared to parrot – if not exacerbate – known human biases in healthcare settings.

That study was a clear catalyst for wider discourse, spurring a number of papers that sought to respond to the team’s findings. One comment raised the study’s potential limitations and the original researchers’ inability to identify a cause of the bias (2), then went on to note that disparities are a likely result of using a single prediction threshold in the presence of an underlying prevalence shift, as was the case in the original study.

Like an academic tennis match, the original authors returned with a reply of their own (3). They agreed with various points raised, particularly with regard to prevalence shift, the difficulties of training on biased data, and the use of a natural language processing tool to flag when the AI model had “no finding.” Most notably, they reiterated the importance of their findings – that biases are present in the datasets and must be addressed before AI can be deployed clinically.


References

  1. L Seyyed-Kalantari et al., Nat Med, 27, 2176 (2021). PMID: 34893776.
  2. M Bernhardt et al., Nat Med, 28, 1157 (2022). PMID: 35710993.
  3. L Seyyed-Kalantari et al., Nat Med, 28, 1161 (2022). PMID: 35710992.

About the Author(s)

George Francis Lee

Interested in how disease interacts with our world. Writing stories covering subjects like politics, society, and climate change.

