Exploring the use of ChatGPT by Damages Experts

Anna Kelly

Senior Consultant

ChatGPT: Revolutionary tool or dangerous shortcut?

We asked ChatGPT, an advanced artificial intelligence (AI) language model developed by OpenAI, to write an article on the use of ChatGPT by damages expert witnesses. The result was 650 words of coherent and contextually relevant text produced in mere seconds. In the introduction, ChatGPT boldly claimed, “the emergence of advanced language models like ChatGPT has revolutionized the way expert witnesses operate” and “ChatGPT has become an invaluable tool for damages expert witnesses.”

While HKA does use certain AI tools, it has never used ChatGPT in its work and has no intention of doing so. This article explores the potential benefits and risks of using AI models, such as ChatGPT, in expert reports for legal proceedings, including legal and ethical considerations.

Through our conversations with ChatGPT, it told us it could assist damages experts in the following ways:

  • Efficiency and speed – ChatGPT can provide rapid responses to technical queries related to damages calculations or financial theory, increasing productivity and reducing cost;
  • Support for damages calculations – although ChatGPT cannot perform the damages calculations itself, it can provide guidance and generate relevant text. For example, the expert could ask ChatGPT to explain the formula for discounting cash flows;
  • Real-time case preparation – ChatGPT can simulate cross-examinations, allowing the expert to practice responding to challenging questions and strengthening their arguments; and
  • Improved communication with non-experts – one of the challenges faced by damages expert witnesses is effectively communicating complex concepts to non-experts, such as arbitrators, judges, or opposing counsel. ChatGPT’s ability to generate human-like text can aid in simplifying technical terminology and presenting it in a more accessible manner.
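By way of illustration for the discounting example above, the standard present value formula (a generic textbook statement, not output produced by ChatGPT) can be written as:

```latex
PV = \sum_{t=1}^{T} \frac{CF_t}{(1 + r)^t}
```

where CF_t is the cash flow in period t, r is the discount rate, and T is the number of periods. An expert asking ChatGPT to explain such a formula would still need to verify the explanation against authoritative sources.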

In an industry producing complex written reports and industry research, often under time pressure, the temptation to use a tool such as this is clear. However, users should beware: the tool has many limitations, including a propensity to produce inaccurate information.

Despite this risk, the extent to which ChatGPT is already being used in the legal industry is unclear. For example, the BBC reports that millions of people have used ChatGPT since it launched in November 2022.[1] Further, a US lawyer is facing sanctions after submitting a filing found to include several “bogus judicial decisions with bogus quotes and bogus internal citations”, which the lawyer later admitted to having researched using ChatGPT.[2]

Expert witnesses have a duty to declare to the Court or Tribunal that the information and facts in their reports are true and accurate, and that any opinions expressed are impartial, their own, and within their areas of expertise. It is therefore unclear if, and how, ChatGPT could be ethically incorporated into their reports. The answer may depend on whether the expert is: i) using the tool purely as a time-saver, with the generated text verified and reviewed for accuracy by a human with the relevant expertise; or ii) using it to cover a gap in their knowledge. Nevertheless, because ChatGPT is a text generator, the line between research tool and plagiarism is blurred. Expert reports that have relied on ChatGPT therefore ought to cite it as a source.

There are also critical data privacy risks. To simulate a cross-examination, for example, an expert witness would need to provide ChatGPT with relevant background information about the case and the specific topic to be addressed. Because ChatGPT learns from user input, any confidential information entered into it could potentially be shared with third parties.

As is often the case, technology has advanced more quickly than regulation and policy. Companies, experts, and legal teams should consider implementing ChatGPT policies, including whether they will accept its use and, if so, what safeguards they will put in place.



This publication presents the views, thoughts or opinions of the author and not necessarily those of HKA. Whilst we take every care to ensure the accuracy of this information at the time of publication, the content is not intended to deal with all aspects of the subject referred to, should not be relied upon and does not constitute advice of any kind. This publication is protected by copyright © 2024 HKA Global Ltd.

