Artificial intelligence has developed rapidly over the last few years. Twelve months ago, many of us were unfamiliar with ChatGPT, Google Gemini and Copilot as everyday assistants. Today, for many of us, these tools have replaced Google as the “go‑to” source for answers to everyday questions. Some lean heavily on AI for instruction and guidance where traditional search engines might have fallen short.
Generative AI derives its outputs from a combination of publicly available information, licensed content and data produced by human trainers. It distils that material into responses that might at first glance appear helpful and authoritative. This can save significant time and provide immediate direction that would otherwise have been incredibly difficult to locate or digest.
However, AI is only ever as good as the information it is drawing from and the prompt that is put into it. This becomes critically important when people attempt to rely on AI to short‑circuit solutions to legal problems.
There has been much publicity about AI making things up, or hallucinating. There have been several reported cases of lawyers relying too heavily on AI and citing legal authorities that simply do not exist. In England and Wales, barristers have been referred to the Bar Standards Board and solicitors to the Solicitors Regulation Authority for presenting fabricated case law to the courts.
While AI has undoubtedly come a long way, it routinely falls short of some of the most basic elements that are second nature to experienced family lawyers and critically important to clients.
AI Does Not Know Your Life
The kind of nuance that makes a difference to the outcome of a divorce case is often obvious to an experienced family solicitor but completely unknown to a client at the outset. It is only through detailed discussion of the history of the relationship, the informal arrangements the couple have made in the past, any power imbalances and a myriad of other variables, each relevant to different degrees in different situations, that family solicitors come to understand the lived reality of any client. That is something no AI model dependent on patterns and source information can ever replicate.
The Discretion of the Family Court
This extends to the family court, whose jurisdiction in England and Wales is discretionary and some way off being formulaic. It does not apply rigid rules, and two cases that might look very similar on paper can often produce very different outcomes owing to factors that, again, AI would struggle to identify. While AI might produce what it considers to be the correct solution, there is always a range of possible and legitimate outcomes.
Disclosure
In family law, what has not been disclosed can be just as important as what has. Courts are alert to inconsistencies, red flags and gaps in evidence. An AI system that can only work with the information it is given will struggle to identify what is missing — often the most important question of all.
Over‑confidence
AI also has a tendency to be overconfident in a position in order to seem helpful and authoritative. That confidence can conceal errors and oversimplify points that warrant far more nuance. One of the most important traits of a junior family lawyer is knowing when they do not know something and not being afraid to say so. That humility is largely missing from AI systems, and the resulting over‑confidence can rub off on the user.
Children Issues
Particularly when it comes to the welfare of children, safeguarding and emotional dynamics, AI is a wholly inappropriate tool for navigating risk and vulnerability. Decisions affecting children require a human understanding of welfare and of the impact those decisions will have, something AI simply does not possess.
Missed Markers
Divorce settlements are shaped by strategic judgement as much as by law. They are affected by the parties, their lawyers and what has gone before; subtle things like nuance and tone can be incredibly important. All of this is missing from AI, which makes over‑reliance on it incredibly risky.
Accountability
If AI gets something wrong, there is no accountability. There is no professional duty, no insurance and no recourse when it produces a bad output. Lawyers, on the other hand, are regulated, insured and responsible for the advice they give. There are consequences if it is wrong.
There are already areas of everyday family law where there is little question AI is creeping in in a worrying way. I am aware of cases where it has been used to poor effect by litigants in person, providing them with the answers they want to hear rather than the answers they need. I also know of one occasion on which a solicitor asked an AI model to assess prospects of success and, having been told they were poor, abandoned what was in fact a very strong case.
We should all be incredibly careful around the use of AI in legal matters. Placing blind faith in this technology when dealing with such diverse and very human problems can lead to false confidence, wasted costs and disappointment. Particularly in family law, where the profession works hard to reduce conflict and fallout, we would do well to develop a stronger voice around the risks it brings.
If you need to talk to a lawyer regarding family matters then please get in touch with our Family Law team.