Executive Summary
This document outlines a comprehensive training and certification pathway for AI-enabled imaging teams in the UK healthcare context. The framework establishes clear roles, responsibilities, and competencies for radiologists and radiographers working with artificial intelligence systems in medical imaging, ensuring patient safety while maximising the benefits of AI technology.
Phase 1 – Entry Criteria
Before anyone even opens the AI sandbox, we make sure they are already safe and competent practitioners. Radiologists must hold a CCT or FRCR so that we know they have completed specialist training, while radiographers must be on the HCPC register at Band 6 or above with at least two years’ experience running CT or MRI lists.
During on-boarding, each learner refreshes core digital-health skills—secure log-ins, image-sharing etiquette, rudimentary scripting—and demonstrates a working knowledge of the Ionising Radiation (Medical Exposure) Regulations. That IR(ME)R check is crucial: it reminds every participant that regardless of what an algorithm recommends, the legal duty to justify and optimise a radiation dose still rests with them.
A concise portfolio (CV, certificates, CPD log) is submitted to the programme lead, and the clinical director signs an entry-clearance form that stays on file as proof that the baseline has been met.
Phase 2 – Foundation: AI Literacy (3 days)
With the regulatory ground cleared, both radiologists and radiographers dive into the essentials of artificial intelligence. Over two days of NHS Digital Academy e-learning modules—bite-sized videos and self-tests—they discover what neural networks, transformer architectures and computer-vision pipelines really do, why dataset bias creeps in, and how small technical decisions can have big patient-safety consequences.
Day three brings everyone together in a hands-on vendor sandbox, letting them tweak thresholds, adjust prompt settings and watch in real time how the outputs change. The module ends with a forty-question multiple-choice exam; scoring 80 percent or more demonstrates that the delegate can speak the common language of ISO 42001 risk management, NHS DTAC procurement rules and MHRA software-as-a-medical-device classifications.
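To make the sandbox exercise concrete, the sketch below (in Python, with purely illustrative scores, labels and thresholds, none taken from any real product) shows how moving a triage model's operating threshold trades sensitivity against specificity, which is the behaviour delegates explore live.

```python
# Minimal sketch of the sandbox threshold exercise: how moving the operating
# point of a (hypothetical) triage model trades sensitivity against specificity.
# Scores and labels are illustrative, not drawn from any real product.

def sensitivity_specificity(scores, labels, threshold):
    """Flag each study whose score meets the threshold, then compare against
    ground-truth labels (1 = abnormal, 0 = normal)."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# Illustrative model outputs for ten studies.
scores = [0.92, 0.15, 0.67, 0.81, 0.05, 0.44, 0.73, 0.22, 0.95, 0.38]
labels = [1,    0,    1,    1,    0,    0,    1,    0,    1,    0]

for threshold in (0.3, 0.5, 0.7):
    sens, spec = sensitivity_specificity(scores, labels, threshold)
    print(f"threshold={threshold:.1f}  sensitivity={sens:.2f}  specificity={spec:.2f}")
```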
Phase 3 – Safety & Governance (3 days)
Next, we turn theory into the guard-rails that keep patients safe. Seminars led by the Trust's information-governance leads, quality-improvement team and a guest speaker from the MHRA walk learners through the nuts and bolts of GDPR, the practicalities of the UK's 2024 post-market surveillance amendment to the medical device regulations, and the realities of incident reporting via Datix or the Yellow Card scheme.
Participants then write an 800-word reflective plan that explains—step by step—how they would set up a drift-monitoring dashboard for a stroke-detection model, decide when its performance has slipped and trigger a stop-use notice. By the end of the third day they understand that “algorithm maintenance” is not an IT luxury but a statutory obligation.
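As a minimal sketch of the logic such a reflective plan might describe, the Python snippet below assumes weekly AUROC audit figures are already computed elsewhere; the baseline, tolerance and window values are illustrative assumptions, not vendor or MHRA thresholds.

```python
# Minimal sketch of a drift check for the reflective exercise: compare the
# model's rolling performance against its validated baseline and raise a
# stop-use flag when it slips too far. All figures are illustrative.

from statistics import mean

BASELINE_AUROC = 0.94   # performance accepted at deployment (assumed)
TOLERATED_DROP = 0.03   # locally agreed tolerance before escalation (assumed)
WINDOW = 4              # number of recent weekly audits to average

weekly_auroc = [0.93, 0.94, 0.92, 0.90, 0.88, 0.87]  # hypothetical audit results

rolling = mean(weekly_auroc[-WINDOW:])

if rolling < BASELINE_AUROC - TOLERATED_DROP:
    print(f"Rolling AUROC {rolling:.3f} below tolerance "
          f"({BASELINE_AUROC - TOLERATED_DROP:.3f}): issue stop-use notice, "
          "log a Datix incident and escalate to the governance board.")
else:
    print(f"Rolling AUROC {rolling:.3f} within tolerance: continue routine monitoring.")
```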
Phase 4 – Clinical Application Modules (4 weeks)
Theory now meets the ward list. Radiographers take the operator track, learning to launch jobs, triage studies and flag anything that looks suspicious, while radiologists follow the clinical-lead path, interrogating heat-maps, editing LLM-generated report drafts and refining local policy.
For the first fortnight everyone works inside a high-fidelity simulator; for the second, they move onto a supervised live list and log fifty consecutive AI-assisted cases. Those cases are more than numbers: each is discussed, dissected and signed off through mini-CEX and DOPS assessments. By week four, trainees can weave the AI queue into a busy NHS workflow without letting a single urgent scan slip through the net.
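A minimal sketch of that queueing behaviour, with hypothetical accession numbers and AI flags, is shown below: urgent AI-flagged studies jump the reading queue while every other study keeps its place, so nothing is dropped.

```python
# Minimal sketch of the worklist behaviour trainees practise in week four:
# AI-flagged urgent studies are bumped to the front of the reading queue
# without any study ever being dropped. Study data are hypothetical.

import heapq

def build_worklist(studies):
    """Order studies by (urgency, arrival order): AI-flagged urgent cases
    first, everything else in the order it arrived."""
    heap = []
    for order, (accession, ai_urgent) in enumerate(studies):
        priority = 0 if ai_urgent else 1
        heapq.heappush(heap, (priority, order, accession))
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

arrivals = [
    ("CT-1001", False),
    ("CT-1002", True),   # AI flags a possible large-vessel occlusion
    ("CT-1003", False),
    ("CT-1004", True),   # AI flags a possible intracranial bleed
]

print(build_worklist(arrivals))  # ['CT-1002', 'CT-1004', 'CT-1001', 'CT-1003']
```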
Phase 5 – AI-Assisted Prescribing (2 days)
Imaging AI increasingly nudges prescribing decisions—contrast doses, sedation regimes, even thrombolysis suggestions—so we devote a focused block to it. Radiologists, who already hold independent prescribing rights, and radiographers, who may administer contrast under patient group directions, practise applying the GMC's Good Practice in Prescribing guidance to AI outputs.
In an ePrescribing sandbox they review AI-generated scripts, spot hallucinated drug interactions, adjust doses for renal impairment and then co-sign electronically with a virtual pharmacist. A clinical-pharmacy workshop reinforces pitfalls such as paediatric weight-based dosing and peri-operative anticoagulation. Competence is tested in an OSCE station where the candidate must defend and correct an AI-suggested prescription before an examiner—making clear that the human prescriber, not the algorithm, owns the final order.
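As a minimal illustration of the renal-impairment check practised in the sandbox, the sketch below uses the standard Cockcroft-Gault estimate of creatinine clearance; the drug name, dose and review threshold are hypothetical placeholders rather than prescribing guidance, and the final decision stays with the human prescriber.

```python
# Minimal sketch of the kind of renal-function check practised in the
# ePrescribing sandbox. The Cockcroft-Gault estimate is a standard formula;
# the drug, dose and threshold below are illustrative placeholders only.

def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (mL/min) by Cockcroft-Gault."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def review_ai_dose(drug, proposed_dose_mg, crcl_ml_min, renal_threshold_ml_min=30):
    """Flag an AI-suggested dose for human review when renal function is poor.
    The prescriber, not the algorithm, decides what (if anything) to change."""
    if crcl_ml_min < renal_threshold_ml_min:
        return (f"REVIEW: {drug} {proposed_dose_mg} mg suggested but estimated "
                f"CrCl is {crcl_ml_min:.0f} mL/min - check renal dosing advice "
                "before co-signing.")
    return f"No renal flag for {drug} {proposed_dose_mg} mg (CrCl {crcl_ml_min:.0f} mL/min)."

# Hypothetical patient and AI suggestion.
crcl = cockcroft_gault(age_years=78, weight_kg=62, serum_creatinine_mg_dl=2.1, female=True)
print(review_ai_dose("example-drug", 500, crcl))
```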
Phase 6 – High-Stakes Assessment
Only radiologists sit this capstone examination, because it confers the title of Responsible Clinician for AI (RC-AI)—the person legally empowered to accept or override algorithmic advice and to approve software upgrades. In a three-station OSCE they navigate a breast-screening list seeded with an AI false positive, draft a re-validation plan for a major model upgrade and merge new NICE Evidence Standards Framework guidance into a local standard operating procedure.
An RCR examiner and the Trust's Caldicott Guardian mark each performance as a pass or fail, ensuring that new RC-AIs have both the technical judgment and the ethical backbone the role demands.
Phase 7 – Certification & Scope of Practice
Successful delegates, radiologists and radiographers alike, are then formally listed on the Trust’s AI register. Each approved algorithm—along with its exact version number—is recorded against the individual’s profile, creating a clear audit trail. A digital signature card, loaded into PACS, means that AI functions only unlock when the correct, credentialed user is logged in.
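A minimal sketch of how such a register lookup could gate AI functions is given below; the field names, users and algorithm versions are hypothetical, and a real deployment would sit behind the Trust's identity and access-management infrastructure rather than a standalone script.

```python
# Minimal sketch of the Trust AI register lookup that gates AI functions in
# PACS. Users, algorithms and versions are hypothetical assumptions.

from dataclasses import dataclass, field

@dataclass
class RegisterEntry:
    user_id: str
    role: str                                               # "RC-AI" or "AI-Operator"
    approved_algorithms: dict = field(default_factory=dict)  # name -> approved version

ai_register = {
    "jsmith": RegisterEntry("jsmith", "RC-AI", {"stroke-cnn": "2.3.1"}),
    "apatel": RegisterEntry("apatel", "AI-Operator", {"stroke-cnn": "2.3.1"}),
}

def may_use(user_id, algorithm, version):
    """Unlock an AI function only for a registered user credentialed on this
    exact algorithm version; anything else stays locked and is auditable."""
    entry = ai_register.get(user_id)
    return bool(entry) and entry.approved_algorithms.get(algorithm) == version

print(may_use("apatel", "stroke-cnn", "2.3.1"))   # True: credentialed on this version
print(may_use("apatel", "stroke-cnn", "2.4.0"))   # False: upgrade not yet re-validated
print(may_use("unknown", "stroke-cnn", "2.3.1"))  # False: not on the register
```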
Every year the practitioner signs a safety declaration confirming they have followed guidelines, reported incidents and stayed within their scope—turning accountability into an annual ritual rather than a forgotten checkbox.
Phase 8 – Post-Certification Obligations
Learning does not stop at the certificate. Radiologists commit to leading a quarterly audit that tracks model drift and to logging twenty AI-specific CPD credits every two years; radiographers maintain an anomaly log and gather ten AI CPD credits each year. Both groups keep their skills sharp through vendor webinars, journal clubs and the Royal College’s quarterly AI bulletin.
The audit findings are presented to the Trust governance board, and minutes are filed as living evidence that ISO 42001’s continuous-improvement loop is more than a slogan—it’s daily practice.
Together these eight phases move a clinician from baseline competence to accountable AI stewardship, knitting advanced technology into patient care without ever relinquishing human judgment or legal responsibility.
Key Design Features & Rationale
1. Role-Specific but Overlapping Tracks
- Radiographers become operators—running the tool, flagging anomalies, and escalating.
- Radiologists gain the legally accountable Responsible Clinician (RC-AI) credential, empowered to override AI outputs and adapt local imaging policy.
2. Limited Automation, Physician Primacy
AI may triage studies or pre-populate structured reports, but final interpretation and any cross-patient lifetime-data queries are locked behind RC-AI smart-card authentication, preserving decision authority in line with GMC Good Medical Practice and HCPC scope-of-practice statements.
3. Embedded Post-Market Surveillance Skills
Training emphasises building a drift dashboard, interpreting AUROC trends and initiating “stop-use” procedures—aligning with 2024 MHRA PMS amendments that become enforceable in June 2025.
4. Alignment to Emerging International Standards
ISO 42001:2023 clauses on risk, transparency and human oversight are mapped to each competence domain, ensuring individuals slot into a wider organisational AI-management system.
5. Independent Assessment & Re-Validation
Examinations are run by the Royal College of Radiologists with external lay observers, mirroring FRCR governance, and certification lapses unless the holder can evidence active practice and up-to-date CPD.
6. Scalable Delivery
Core e-learning is shareable across trusts; high-fidelity simulator time is pooled in regional imaging networks to contain cost.
Implementation Checklist for a Trust
- Establish an AI Faculty (clinical lead, physicist, IG officer, vendor liaison)
- Adopt ISO 42001 or equivalent AIMS framework
- Commission or license training content (Digital Academy / RCR)
- Procure sandbox & audit dashboards before go-live
- Populate an AI operator register in the Q-Pulse or Datix system
- Review every AI-assisted incident quarterly at the Trust Patient Safety & Effectiveness Meeting
- Report serious AI incidents to MHRA Yellow Card & NRLS within 48 hours
By structuring competence, assessment and ongoing surveillance in this way, a radiology department can make safe, regulator-ready use of powerful AI tools while keeping the final clinical judgment—and the accompanying accountability—squarely in human hands.
Duty Split for Clinical AI Between Radiologists and Radiographers
Task / Decision Point | Radiologist (RC-AI) | Radiographer (AI-Operator) | Why This Split? |
---|---|---|---|
Create & update AI guidelines | A/R – drafts, versions, signs-off, links to NICE & RCR advice | C/I – tests practicality, suggests workflow tweaks | The GMC places ultimate care-pathway accountability on the doctor; IR(ME)R expects written operator procedures authored by the Employer/Practitioner |
Day-to-day operation | I | R – runs CNN triage, launches LLM report drafts, verifies input data, flags anomalies | Mirrors existing modality supervision: radiographers routinely operate CT/MR hardware within written procedures |
Clinical interpretation & override | A/R – reviews heat-maps, edits LLM output, can ignore AI, can query full EHR | C – prepares the AI outputs for review | Keeps decision authority with GMC-licensed practitioner; limits data-minimisation risk by restricting full-record access |
Post-market surveillance | A – sets KPIs, analyses trends, escalates to governance board | R/C – extracts metrics, runs weekly dashboards, enters Datix/Y-Card when thresholds breached | Radiographers already collect QA data; radiologist interprets clinical impact & triggers stop-use |
Software upgrades / model re-validation | A/R – approves re-validation plan, signs release | R – conducts scripted acceptance tests, documents results | Aligns with IR(ME)R “employer’s procedures” where operators follow tests authorised by practitioner |
Patient communication & medico-legal disclosure | R – explains AI role in diagnosis, documents in report, responds to complaints | I | Follows GMC consent & duty-of-candour obligations |
Audit & CPD | R – presents quarterly AI audit; ≥10 CPD credits per model family | R – maintains usage log; ≥5 CPD credits per model family | Ensures both keep skills current while reflecting heavier interpretive load on radiologist |
Guideline lifecycle management system | A – owns change-control register, chairs annual guideline review | C – collates operator feedback, highlights workflow issues | Safeguards the guideline as a “single source of truth” document |
Key: A = Accountable, R = Responsible, C = Consulted, I = Informed
Training Syllabus
(Model-agnostic; aligned to UK regulations & professional standards)
Module | Duration & Format | Learning Objectives | Key Content & Activities | Assessment & Evidence |
---|---|---|---|---|
0. Pre-course onboarding | ½ day self-service | • Verify entry criteria (CCT/FRCR or HCPC reg.) • Digital-identity setup | Portfolio check; enrol on Learning Mgt System (LMS) | Signed eligibility form |
1. Core AI literacy | 3 days • 2 days self-paced e-learning • 1 day live workshop | 1. Explain CNN, ViT, transformer & diffusion architectures 2. Contrast imaging vs LLM failure modes 3. Summarise ISO 42001, NHS DTAC & GMC/HCPC expectations | • Mini-lectures • “Build a tiny CNN” hands-on • Prompt-engineering lab • Case study: RAG pipeline for lifetime EHR Q&A | 40-item MCQ ≥ 80% to pass |
2. Safety, governance & law | 3 days seminars & breakout groups | 1. Map UK MDR 2002 & 2024 PMS amendment to clinical duties 2. Design a post-market-surveillance plan 3. Apply GDPR/Data Protection Act to lifetime-record queries | • Talks from MHRA & Trust IG • Group task: draft incident-response SOP • Dashboard walkthrough | Reflective essay (1,500 words) designing PMS for two model types |
3. Model-specific clinical application | 4 weeks (160 h) split tracks but shared labs | • Operate CNN triage queue & interpret heat-maps • Edit LLM-generated structured reports • Formulate full-record RAG prompts (radiologists) | • Vendor sandbox with cases • Adversarial-image drill • Live lists on PACS under supervision | Log-book & mini-CEX/DOPS |
4. Responsible-Clinician certification (radiologists only) | 2 days | 1. Lead AI guideline authorship 2. Re-validate models after upgrade 3. Integrate NICE ESF evidence into local policy | • OSCE stations: heat-map false-positive, LLM hallucination correction, upgrade scenario | OSCE (Pass/Fail panel) |
5. Continuous improvement | 3 days workshops | 1. Maintain AI guideline “single source of truth” 2. Run quarterly audits 3. Coach radiographers on workflow tweaks | • Change-control simulation • KPI visualisation • Stakeholder-engagement role-play | Team presentation of audit & revised guideline |
6. Post-course obligations | Ongoing | • Radiographer: 10 CPD credits/yr/model family • Radiologist: 20 CPD credits/yr, quarterly audit lead | Integrated into appraisal & revalidation | Annual CPD certificate & audit report |
Teaching & Resource Stack
Element | Tools / Resources |
---|---|
LMS e-learning | NHS Digital Academy modules + RSNA AI Curriculum |
Simulator | Cloud PACS with anonymised DICOM, vendor sandbox for CNN |
Reading pack | ISO/IEC 42001 overview sheets, RCR AI Deployment Fundamentals 2024, RSNA Best Practices for LLMs in Radiology |
Assessment bank | 200 MCQ items; OSCE checklists; model-drift synthetic datasets |
Faculty & Roles
- Programme lead (radiologist) – owns syllabus, chairs OSCE panel
- AI physicist / data scientist – delivers technical labs
- Radiographer tutor – oversees operator skills log-book
- Information-governance officer – GDPR & data-ethics sessions
- Vendor liaison – sandbox maintenance
Rationale for Including IR(ME)R in Entry Requirements
1. Legal Accountability Can’t Be Delegated to an Algorithm
The Ionising Radiation (Medical Exposure) Regulations 2017 name four human duty-holders (Employer, Referrer, Practitioner and Operator) who must justify every exposure, optimise the dose and audit outcomes. These statutory duties remain in force even when an AI tool recommends or auto-protocols a scan.
2. AI Can Change Referral Patterns and Dose Indices
Decision-support CNNs and LLMs may escalate referrals or modify acquisition protocols, directly affecting radiation burden. Understanding IR(ME)R's optimisation clause ensures staff spot when an algorithm's choice breaches local diagnostic reference levels.
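As a minimal illustration of that optimisation check, the sketch below compares the dose index of an AI-chosen protocol against a local diagnostic reference level; the DRL figures and protocol names are assumed local values for illustration, not national reference data.

```python
# Minimal sketch of the optimisation check described above: compare the dose
# index of an AI-chosen protocol against the local diagnostic reference level
# (DRL). The DRL figures and exam names are illustrative assumptions only.

LOCAL_DRLS_CTDI_VOL = {     # mGy, per examination type (assumed local values)
    "CT head": 60.0,
    "CT chest": 12.0,
}

def check_against_drl(exam, ctdi_vol_mgy):
    """Flag any AI-protocolled exposure whose CTDIvol exceeds the local DRL,
    so the IR(ME)R operator/practitioner can review it before scanning."""
    drl = LOCAL_DRLS_CTDI_VOL.get(exam)
    if drl is None:
        return f"No local DRL recorded for '{exam}': refer to the practitioner."
    if ctdi_vol_mgy > drl:
        return (f"REVIEW: AI protocol for {exam} gives CTDIvol {ctdi_vol_mgy} mGy, "
                f"above the local DRL of {drl} mGy.")
    return f"{exam}: CTDIvol {ctdi_vol_mgy} mGy is within the local DRL ({drl} mGy)."

print(check_against_drl("CT chest", 15.2))
print(check_against_drl("CT head", 55.0))
```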
3. Incident Reporting and Stop-Use Triggers
IR(ME)R requires employers to investigate and notify the CQC/MHRA about "accidental or unintended exposures." AI-related mishaps must follow this pathway.
4. Aligns Operator/Practitioner Roles with New AI Duty Split
Our pathway designates radiographers as AI-operators and radiologists as Responsible Clinicians (RC-AI), mapping cleanly onto IR(ME)R’s operator and practitioner categories.
5. CQC Inspections Use IR(ME)R as Their Yard-Stick
The Care Quality Commission inspects imaging departments for IR(ME)R compliance. Demonstrable competence satisfies inspectors that introducing AI workflows will not erode radiation-protection culture.
Strategic Vision: Radiology as Guardian of Clinical AI
Artificial-intelligence systems are already matching—and, in targeted tasks, surpassing—human experts in image interpretation. More than 190 imaging-AI products have regulatory clearance, yet hospitals still struggle to move beyond small pilots. Someone must take stewardship before algorithms become an unmanaged black box.
Radiology is the only branch of medicine that has long depended on advanced computer analysis—from Fourier transforms in MRI to deep-learning CAD—making it the natural, ready-equipped specialty to carry this mantle.
Why Radiology Should Claim Guardianship
- Digital native culture: Radiology workflows are already fully digitised, desk-based and built for remote reading
- Proven data-governance expertise: Radiologists routinely balance IR(ME)R legal duties, dose optimisation and multi-disciplinary audit
- Leadership in computer analysis: Radiology has evolved hand-in-hand with advanced computation
The Master Plan
- Build a credentialed workforce with RC-AI roles and AI-Operator credentials
- Run a living governance framework with Trust AI registers and ISO 42001-aligned dashboards
- Leverage radiology’s remote model for scalable AI implementation
- Map duties to existing law with clear IR(ME)R alignment
- Embed continuous improvement through quarterly audits and mandatory AI CPD
Strategic Pay-Offs
- Patient safety & public trust: Named guardians can override flawed outputs early
- Productivity boost: AI triage frees humans for complex reasoning
- Regulatory head-start: Early stewardship meets forthcoming MHRA, CQC and ISO demands
- Professional primacy: Radiology cements itself as the NHS digital vanguard
- Exportable template: Proven governance model extends to pathology, cardiology and genomics
Conclusion
This structured training and certification pathway provides a comprehensive framework for safely integrating AI into UK medical imaging. By establishing clear roles, rigorous training standards, and ongoing governance mechanisms, we can harness the power of AI while maintaining the highest standards of patient safety and professional accountability.
The pathway ensures that both radiologists and radiographers are equipped with the knowledge, skills, and legal understanding necessary to work effectively with AI systems, while preserving human judgment and clinical decision-making at the centre of patient care.