Could AI improve the experiences of families and professionals in the family justice system?

A new briefing from the Nuffield Family Justice Observatory explores how artificial intelligence could improve experiences for families and professionals in the family courts, while warning that concerns around bias, accuracy, privacy and accountability must be carefully addressed.

06/03/26

A new briefing has highlighted both the potential benefits and significant risks of using artificial intelligence in the family justice system, urging policymakers and practitioners to consider how the technology could be used safely to improve the experiences of families and professionals.

The report, published by the Nuffield Family Justice Observatory, aims to prompt discussion about how artificial intelligence (AI) might support the family justice system while ensuring appropriate safeguards are in place.

The updated 2026 briefing builds on a first edition published in 2024 and examines emerging uses of AI in areas including improving families’ experiences of legal processes, increasing administrative efficiency and supporting decision making.

The authors note that while AI technologies are increasingly present across society, the family justice sector is still at an early stage in considering how they might be applied. They argue that more reflection is needed on both the opportunities and the challenges the technology presents.

According to the report, AI could improve families’ experiences of the justice system in several ways. Some organisations are already experimenting with digital tools designed to help people navigate legal processes or access information more easily.

For example, the Children and Family Court Advisory and Support Service (Cafcass) has introduced an online virtual agent on its website to help users find guidance and answer common questions. The tool uses algorithms and natural language processing to provide responses based on a defined set of information and refers users to a human adviser when it cannot answer a query. Data from December 2024 suggests it was responding to around 2,500 queries each month.

The report also highlights the potential for large language models to make legal documents more accessible by translating complex legal language into plain English or other languages, which could help families – including children – better understand court proceedings.

Beyond supporting families directly, the briefing suggests AI could help reduce administrative burdens within the justice system. Potential uses include reviewing large volumes of documents, transcribing hearings or meetings, summarising evidence and supporting case management processes.

Some courts have already begun introducing AI-supported transcription tools to civil and family proceedings, with the aim of speeding up the production of transcripts and improving public access to hearings.

The briefing also explores how AI might support decision making in areas such as dispute resolution, predictive analysis and risk assessment.

Online dispute resolution platforms are one example. These systems can use AI tools to help parties negotiate settlements by analysing information about a dispute and suggesting possible compromises based on historical case data. One example cited in the report is Modria, a platform used in some jurisdictions to help resolve disputes relating to issues such as divorce, separation, employment or housing. The system gathers information from users, identifies areas of agreement and disagreement, and proposes potential solutions before moving to mediation or arbitration if necessary.

The authors also discuss predictive analytics systems that analyse historical data to estimate the likelihood of certain outcomes, such as a child requiring social care intervention or a young person becoming vulnerable to exploitation. While predictive modelling has been used in social care for some time, newer machine-learning approaches are increasingly being explored.

However, evidence about their effectiveness remains limited. Research published by What Works for Children's Social Care in 2020 found that a series of machine-learning models developed to identify children at risk using local authority social care data did not perform well against the researchers’ success measures.

Risk assessment tools using machine learning have also produced mixed results. Some studies suggest they can improve the accuracy of certain assessments, such as evaluating the risk of domestic abuse. But the report notes that algorithmic systems have also produced discriminatory outcomes in some contexts, including systems that have been shown to overestimate risks for particular demographic groups.

The authors warn that these examples illustrate wider concerns about bias within AI systems. Because many tools are trained on large datasets that may contain historical or societal biases, they can reproduce or amplify inequalities if not carefully designed and monitored.

Other risks highlighted in the briefing include issues relating to privacy, accountability and transparency. Family justice cases often involve highly sensitive personal information, meaning that using generative AI tools could create significant data protection concerns if confidential data is entered into systems that store or reuse information.

Accuracy is also a key issue. Generative AI systems can produce incorrect or fabricated information – sometimes described as “hallucinations” – and research cited in the briefing suggests such errors can occur in between 3% and 30% of outputs depending on the model used.

The authors note that inaccuracies are not limited to written responses. AI transcription systems may also struggle to capture accents, tone or non-standard speech patterns, meaning human review is essential before outputs are relied upon.

The report also raises concerns about unequal access to AI-supported services. Data from Ofcom suggests around 2.8 million people in the UK do not have internet access, while others may lack the confidence or digital skills needed to use online tools. This could create new inequalities if AI-enabled services become more widely used in legal processes.

Trust and transparency are also highlighted as important factors. Surveys cited in the briefing suggest the public is more comfortable using AI for relatively routine legal tasks, such as reviewing contracts, than for emotionally complex issues such as divorce or child arrangements.

The authors emphasise that any adoption of AI in the family justice system must therefore be accompanied by strong governance and clear accountability. Although the UK currently has no single law regulating AI specifically, existing legislation – including data protection law – already applies to many aspects of its use.

Ultimately, the briefing concludes that artificial intelligence could offer benefits for children, families and professionals if used responsibly, but warns that the stakes are particularly high in family justice because of the impact decisions can have on people’s lives.

The authors argue that developing “human-led” approaches to AI will be essential if the technology is to support a family justice system that is easier to navigate and understand, while maintaining fairness, transparency and public trust.

Read the full report: https://www.nuffieldfjo.org.uk/news/news-ai-in-the-family-justice-system-2026

You can also attend The Social Work Show, a free conference for professionals, featuring sessions on the use of AI in a range of social work settings: https://www.compassjobsfair.com/Events/Birmingham/Book-Tickets
