And if you aren’t careful, this could end up costing you your business and your hard-earned reputation.
Risks posed by LLMs
LLMs are well known to produce “hallucinations” – false or illogical responses that can fabricate content or, in a legal context, cite non-existent cases. LLMs can also be prone to so-called regressive bias – favouring old and outdated information simply because it appears more frequently in the web-based data the LLM draws on to generate a response.
In that sense, it is important to remember that LLMs can only ever produce responses based on patterns learned from their training data. If that data does not include sufficient information on the topic or area of law relevant to your business or personal issue, the LLM’s response might seem reasonable enough, but it will be based on an incomplete or outdated picture of the actual state of the law. The LLM will essentially have guessed at the answer. Your guess is probably just as good!
Aside from the risk of inaccuracy, don’t forget that in relying exclusively on ChatGPT for legal advice, you miss that key human element – a chance to engage with a real person, to ask follow-up questions, and to build a genuine rapport with someone who can join you on your journey to build your business, protect your assets and your legacy for your family, and act as a trusted advisor when things go wrong. Having that “go-to” person just a phone call away whom you can trust and depend on is priceless. AI cannot understand the nuances of personal relationships, or your unique concerns and interests. Nor is it in any sense a human brain – it cannot apply the forensic and strategic scrutiny needed to deliver the kind of bespoke solution that a real lawyer can.
In speaking to a real lawyer, you can also have peace of mind that lawyers are subject to strict professional and ethical responsibilities, and are officers of the court. They are required to maintain the utmost standards of honesty and integrity, and they are backed by a comprehensive professional indemnity insurance regime when providing legal services to clients. The rigour of the regulatory regime applying to lawyers is for the ultimate benefit of you as a client – and this is something you will never get from AI. Lawyers have accountability, and ChatGPT simply does not.
Another key issue to be mindful of – before using an LLM to address your legal query – is the risk to your confidentiality, and the potential loss of control over sensitive information (such as trade secrets and personal data), depending on the detail of the material you include in your query. Depending on the provider’s terms of service, anything you enter into an LLM may be retained and used to train future models, meaning your information could inform responses generated for other users. So think carefully before accepting that your business and personal information may be shared with unspecified third parties if you rely on an LLM to answer an important legal question.
Real world examples
Some real-life examples of where it can all go wrong when relying on AI as a complete replacement for engaging a real lawyer are:
- The self-represented sole trader who receives a letter of demand from a supplier and chooses to defend the case themselves in court – relying on ChatGPT to find supporting legal authorities that turn out to be fake. This is readily apparent to the supplier’s lawyers and the court, resulting in the sole trader not only losing the case, but being ordered to pay the supplier’s costs.
- The small business owner who wants to enter into a joint venture with another party and uses an LLM to draft the joint venture contract. The contract includes standard clauses weighted in favour of the other party, requiring the small business owner to act in the other party’s best interests in relation to any key business decision. The small business owner does not “read the fine print” and later finds he is unable to terminate the joint venture to pursue other, more profitable opportunities – effectively locking him into the arrangement for the foreseeable future even as his business loses money by the day.
- The company that wants to acquire an interstate business with a view to expanding its services beyond Queensland, but uses an LLM to conduct due diligence on the target to save on legal costs. In the process, the LLM misses unresolved litigation pending against the business and various licensing breaches. The company proceeds with the acquisition and is then forced to assume liability for the outstanding claims and breaches.
Takeaways
You want the best for your business, your family and your own personal life. So why make potentially life-changing decisions based on output from an artificial source that carries no risk for the LLM or its operators, but instead offloads all the risk onto you – and could wind up costing you your business, your house or worse?
We are not suggesting that LLMs should never be used. On the contrary, we can all appreciate the powerful ways in which LLMs, and AI more broadly, can improve our lives. When used appropriately, LLMs can empower us, improve our efficiency and communication, enable us to reimagine how we access and process complex information, aid our learning and understanding, and spark new conversations and lines of inquiry.
However, it is best to treat content produced by ChatGPT and other LLMs with a degree of healthy scepticism – a useful starting point to see what your options are in a particular matter, but far from the end of the story.
You should closely scrutinise any purported “legal” content produced by an LLM, and in all cases have the content independently verified by a professional (and human!) lawyer before making a decision that could have a serious impact on your life and your business.
The simple message is: proceed with caution, and always talk to a real lawyer!

