How has pathology training and retention changed with the introduction of AI? We connected with Amal Asar, Consultant Histopathologist at Northern Care Alliance, after her talk at DP&AI: Europe 2025 to discuss the minefields and bottlenecks in maintaining competencies.
From your perspective, what are the most significant challenges facing pathology education today?
Pathology training, like all medical training, is fundamentally hands-on. Trainees need exposure to a wide range of cases and clinical scenarios before they can practice independently with confidence and competence. However, increasing pressures are beginning to challenge this traditional training model.
Workload demands on consultant pathologists have been rising for years, driven by staff shortages and growing case volumes. These pressures have been further intensified by post-COVID recovery efforts and the need to reduce long waiting lists. As a result, structured teaching sessions have become less frequent, simply because consultants have less time available for education.
At the same time, the centralization of pathology services and increased subspecialization have made training more fragmented. In some regions, trainees may not encounter certain workflows or subspecialties unless they rotate through specialized centers – and even then, exposure may be limited to just one or two months across an entire training program.
This creates clear gaps in experience and risks setting a difficult precedent for the future: if trainees enter consultant roles without consistent teaching support, education and mentorship may become increasingly deprioritized, continuing the cycle for the next generation.
Given these resource pressures, how can we best prepare trainees for modern diagnostic practice?
Digital pathology and, eventually, AI tools may help address some of these challenges. Many trainees are already comfortable with technology, and digital workflows could expand opportunities for teaching, collaboration, and case exposure. However, this will also require new competencies within training programs, including foundational knowledge of computational pathology and the safe use of digital systems.
What specific risks or pitfalls should labs be aware of as AI becomes more integrated into pathology workflows?
When discussing AI integration in pathology, I remain cautiously optimistic. As diagnosticians, we cannot ignore the scale of change AI is already driving in other sectors. In only a short time, tools such as ChatGPT have reshaped the way many industries operate, in some cases reducing the need for human input.
A common view is that medicine will be protected from similar disruption because of safety requirements and strict regulation. However, it is reasonable to ask what the long-term future could look like. If, over the next few decades, AI infrastructure becomes fully established, adoption barriers are removed, and regulatory frameworks mature, healthcare systems may be tempted to employ fewer pathologists – using clinicians primarily to approve outputs, handle exceptions, and troubleshoot problems. AI may be seen as more efficient and cost-effective than expanding the workforce.
While this represents a worst-case scenario, we cannot fully dismiss it without robust evidence on how AI affects staffing, efficiency, costs, and diagnostic performance in real-world settings. Ethical and legal frameworks, workplace culture, and public expectations may ultimately play an important role in shaping how far automation is allowed to go.
More immediately, the most significant and likely risk is workforce de-skilling. Over-reliance on machine-generated outputs could lead to automation bias, reduced critical thinking, and diminished confidence in independent diagnosis. This needs to be addressed proactively.
How might those risks be mitigated?
Practical safeguards could include:
Audit trails and justification prompts, requiring pathologists to document why they accept or override AI results
Regular blind-read sessions, where cases are assessed without AI support
Internal quality assurance slide sets and mandatory External Quality Assessment participation to maintain standards
Minimum microscope reporting hours each year to preserve core diagnostic skills
Clear guidance and oversight from professional bodies, such as the Royal College of Pathologists, to help standardize safe use across departments
These steps would help ensure AI supports diagnostic practice without eroding professional expertise.
How should pathology training programs balance traditional diagnostic skills with emerging competencies in digital workflows and computational tools?
It is difficult to predict exactly how training will strike that balance, especially with digital and AI-based practice still in its infancy. There will likely be many iterations, and training bodies worldwide will need to learn from one another to develop a curriculum that is balanced, safe, and adaptable. However, there are clear guiding principles.
Training must preserve diagnostic autonomy. Trainees should be able to make independent diagnoses, generate appropriate differentials, and understand which ancillary tests can support a final diagnosis – regardless of whether AI or computer-aided tools are available within digital viewers.
Should qualifying examinations include AI-supported diagnosis, in your opinion?
Qualifying examinations, where AI support is not available, naturally reinforce independent diagnostic skills. However, if clinical practice becomes heavily AI-dependent while examinations remain AI-free, this would create an unhelpful disconnect: exams risk becoming artificial obstacles rather than meaningful assessments of clinical competence.
In my view, exposure to AI-supported diagnosis should be introduced later in training, potentially from around the fourth year. By that stage, trainees should already be able to work independently, remain competent at the microscope, and manage scenarios where digital systems may not be available – such as scanner failures, power outages, or system maintenance.
Once trainees have achieved core competencies through routine supervision and annual review processes, AI can then be introduced as a supervised supplement. This would allow them to experience how AI fits into real clinical workflows, learn how to troubleshoot AI-related issues, and maintain independent judgment before transitioning to consultant practice.
What types of assessment tools or competency frameworks do you believe are needed to ensure trainees can safely and effectively use digital pathology and AI tools in clinical settings?
The emerging vision is that pathologists will be responsible for the safe use of AI tools, and that final accountability for diagnostic decisions will still rest with them. This will require maintaining the core knowledge, skills, and professional standards already built into training programs, but with an added layer of competencies suited to a more technology-driven working environment.
I do not expect current training requirements to change radically. Instead, they will likely be expanded through additional frameworks focused on AI and digital pathology, including:
AI governance competencies, such as assessing clinical utility, performing validation, and managing quality control after deployment
Legal and ethical considerations of using AI in clinical practice
Self-awareness and reflective practice, including recognizing limitations or blind spots when using AI, supported by supervisor and multisource feedback
Foundational informatics and computational pathology knowledge, embedded into curricula and assessments
There will also be a growing need to attract trainees into AI leadership roles within pathology departments – people who can help implement, oversee, and maintain these tools in routine practice.
This may lead to additional training pathways for interested candidates, including placements in clinical informatics teams or industry settings, supported by hands-on experience and structured projects. These pathways could also include additional certification in leadership or computational pathology.
What opportunities does AI create for innovation in pathology education?
For trainees, AI could support more self-directed learning. Instead of searching through multiple atlases – or relying on Google for reference images – trainees could access annotated image repositories. Over time, AI could act as a teaching aid by explaining why an image fits one diagnosis rather than another, what additional work-up would confirm it, and what common pitfalls to avoid.
AI could also make teaching more efficient. For example, it could help educators build presentations more quickly, allowing teaching sessions to happen more often. With the right tools, a teacher could enter a diagnosis – or upload an image – and ask the system to retrieve similar cases from laboratory archives to create a ready-made teaching set.
In the same way, AI could generate slide collections for mock exams based on specific criteria. It may also support exam preparation by helping with question writing, calibration, and marking. Overall, AI has the potential to expand both the quality and accessibility of pathology training resources.
What skills – beyond medical knowledge – do you believe the next generation of pathologists will need to thrive in an AI-enhanced diagnostic environment?
As the professional identity of the pathologist evolves – although much of this is still speculative – there may be a greater need to engage with wider fields such as cell biology and bioethics. In parallel, stronger foundations in research methods, including study design, statistics, and data analysis, may become increasingly important so future pathologists can adapt to changing expectations and technologies.
Finally, there needs to be a willingness to engage with the technology sector. The future of pathology will likely be shaped by both healthcare systems and health technology companies. Proactive leadership and collaboration with industry will be important for guiding how digital tools are implemented and how pathology services develop over time.
What is the most important balance pathology must strike as AI continues to evolve over the next decade?
The profession has always been shaped by its leadership. If we approach the future with the goal of using AI to support and augment pathologists, and if hospital culture, regulators, and government policy keep the human clinician at the center of decision-making, we can move forward safely. The key is to let clinical purpose drive the technology, rather than allowing technology to dictate our priorities.
Like any major innovation, AI is only as good as the way it is used. The challenge will be finding the right balance between improving efficiency and maintaining professional autonomy, judgment, and integrity in medical practice.
