1. LLMs are at greater risk of providing wrong information, by hallucinating or making errors, given their limited access to accurate and up-to-date information about the law. One study found that general-purpose LLMs hallucinate on legal queries between 58% and 82% of the time on average. Relying on legal information provided by an LLM without checking that it is accurate would be very dangerous. In the case referred to at the start of this course, ChatGPT invented cases in its response to a legal query.

2. LLMs are not reasoning tools. Even where they have access to accurate and up-to-date information about the law, they cannot apply the law to a complex set of facts in the way that an adviser or lawyer would. This can lead to the wrong advice being given about how the law applies to a given query.

3. LLMs are programmed to be as helpful as possible, and will write in a confident and persuasive way. If wrong information is given to an LLM in the prompt, it will not necessarily correct it. When asked to check whether the advice given was accurate, LLMs will typically confirm that it was, even where it is fabricated. Again, this happened in the case referred to at the start of the course: the lawyers asked ChatGPT whether the cases were real, and it replied that they were!

4. Many LLMs are developed by American companies and trained on American data. The legal advice they give can default to American law, which is very different from the law of England and Wales. A 2023 study found that 20% of queries about English law initially produced outputs based on American law.

5. Information put into an LLM can be re-used by companies as part of the tool's training data. Legal queries often contain very personal, sensitive information which you would not want to be made publicly available. In particular, legal professionals have a duty to keep client information confidential.

6. Solicitors, barristers and chartered legal executives are regulated by professional bodies, which require relevant training and expertise and can provide compensation if legal professionals give incorrect advice. LLMs are not regulated, so if they provide wrong information or advice, there is no organisation that can help or provide a remedy.