There is no denying the rapid growth of artificial intelligence (AI) tools such as ChatGPT and Gemini, nor the increasing integration of such technology into everyday life and professional practice. As with many sectors, the use of AI within the medico-legal field presents both significant opportunities and notable risks.
An article published by The Law Society Gazette on 10th November 2025 reported that a High Court judge has issued a warning to expert witnesses, stating that a solicitor who insisted on the use of an AI-generated report would be in breach of their professional duty.
Speaking at the Bond Solon Expert Witness Conference last Friday, Mr Justice Waksman described such a request as “…a gross breach of duty on the part of the solicitor.” These remarks were made only days after the publication of the updated Artificial Intelligence (AI) – Judicial Guidance (October 2025).
Why This Matters to Clinical Expert Witnesses
The role of clinical expert witnesses instructed in personal injury and clinical negligence cases is to provide independent, evidence-based opinion. They are not a conduit for a solicitor’s preferred outcome.
If used incorrectly, or relied upon entirely, AI could replace the professional judgment of the expert, undermine the integrity of their opinion, and compromise the entire process.
What the October 2025 Judicial Guidance Emphasises
The latest guidance highlights several fundamental points which are directly relevant to expert witnesses:
- Before using AI tools, ensure you have a basic understanding of their capabilities and potential limitations.
- Uphold confidentiality and privacy: do not share any case-specific confidential or sensitive material.
- AI tools may be helpful for administrative tasks (such as document summarising) but are not suitable for legal research or analysis that requires professional judgment.
- Any output from AI must be carefully verified. AI “hallucinations” (for example, made‑up cases or authorities) are specifically flagged.
- Using AI does not relieve the expert of their responsibility; the expert must remain accountable for the work produced in their name.
Practical Implications for Clinical Expert Witnesses
In the first instance, expert witnesses should always maintain their independence and produce reports based solely on their clinical expertise, the facts, and the available evidence. Any requests from a solicitor that appear to compromise this independence should raise a red flag.
Whilst the use of AI tools may be acceptable for non-opinion administrative tasks, care must be taken to ensure that any output is accurate, fully understood by the expert, and subject to their professional review, correction, and final sign-off.
Citations and sources suggested by AI tools should always be checked for authenticity and verified. Earlier this year, the High Court cautioned UK lawyers against the misuse of AI after fabricated case law citations were submitted to the courts, some of which were entirely fictitious or contained invented passages.
All cases must be handled with the highest levels of sensitivity and confidentiality to protect all parties involved. Inputting confidential information, privileged documents, or case-sensitive material into AI tools carries a genuine risk that such data could be exposed or published online.
If AI tools are used to assist with report preparation, it is good practice to document the process clearly, so that the expert can confidently explain how AI was used if questioned by solicitors or the court.
Delivering Premier Medical Expert Services
The clinical expert witnesses at McCollum Consultants provide independent expert witness reports for legal cases and deliver clear, objective analysis.
With a commitment to excellence, we maintain the highest standards of expert testimony, dedicated to clarity, accuracy and integrity.
To instruct an expert witness, please contact our case coordinators:
Tel: +44 (0)161 218 0223
Email: info@exp-w.com