‘Biased outputs’ and ‘hallucinations’: Ethical risks in social workers using AI

New research from the Ada Lovelace Institute finds widespread enthusiasm for AI-powered transcription in social care, alongside growing concerns about accuracy, ethics and a lack of robust evaluation.

16/02/26

The rapid roll-out of AI transcription tools across social work is bringing clear efficiency benefits for practitioners, but current approaches to ethics, evaluation and governance remain limited and light-touch, according to new research from the Ada Lovelace Institute.

The study explored how social workers and senior managers are adopting AI-powered transcription software to record case notes and meetings, with many reporting significant time savings in pressured services. However, researchers warned that a focus on speed and efficiency risks overlooking wider impacts on people who draw on care, including concerns around bias, accuracy and so-called “hallucinations”, where AI systems generate incorrect or misleading information.

Researchers said the summarising feature in AI transcription tools carried a risk of producing biased outputs that would be harmful in social care contexts. For example, one study looking at summaries of long-term care records from an English local authority found that some models consistently downplayed women’s health issues and needs compared with men’s, with summaries about men much more likely to contain descriptors such as ‘disabled’ and ‘complex’.

In another example, social workers described cases where AI-generated documents included hallucinations that could have had significant implications for the people they support. One social worker recounted an instance where, when using an AI transcription tool to create a summary, the tool had incorrectly “indicated that there was suicidal ideation”, even though “at no point did the client actually, you know, talk about suicidal ideation or planning, or anything.”

Speaking about the findings, Lara Groves, Senior Researcher at the Institute and co-author of the report, said:

“AI may be able to help make aspects of public services more efficient, and so policymakers should absolutely be looking at how technology can help public sector workers. However, delivering time savings is not necessarily the same thing as delivering public benefit, especially if these come at the cost of inaccuracy or unaccountability. In the rush to adopt AI in the public sector, it is essential that policymakers don’t lose sight of the wider risks to people and society and the need for responsible governance.”

Researchers interviewed 39 social workers from 17 local authorities who had experience using AI transcription tools, alongside senior staff involved in procuring and evaluating the technology. Their responses were thematically analysed to develop five key insights.

These included findings that financial pressures in social care are driving widespread piloting and adoption of AI tools, with local authorities largely evaluating success through efficiency measures rather than outcomes for people receiving support. While almost all social workers reported meaningful benefits, the advantages were not experienced evenly, and responsibility for accuracy and oversight often fell squarely on frontline practitioners.

The study also highlighted wide variation in views about reliability and when it is appropriate to use AI transcription in statutory processes, pointing to a lack of shared standards or clear guidance across the sector.

Oliver Bruff, Researcher at the Institute and co-author of the research, said the current approach to testing and evaluation is too narrow:

“The safe and effective use of AI technologies in public services requires more than small-scale or narrowly scoped pilots. Ensuring that AI in the public sector works for people and society requires taking a much deeper and more systematic approach to evaluating the broader impacts of AI, as well as working with frontline professionals and affected communities to develop stronger regulatory guidelines and safeguards.”

Alongside its findings, the report sets out a series of recommendations aimed at strengthening oversight and responsible use of AI transcription tools in social care.

These include requiring local authorities to record their use of such technologies through the Algorithmic Transparency Reporting Standard, expanding government pilots across a wider range of settings, and establishing a dedicated What Works Centre for AI in Public Services to build evidence on impact.

The researchers also call for greater collaboration between policymakers, civil society, communities and regulators to assess systemic effects and develop clear guidance for the use of AI in statutory processes, alongside stronger expectations for councils to define intended outcomes when procuring new technologies.

While enthusiasm for AI transcription is high among social workers facing heavy workloads, the Institute warns that without robust governance and evaluation, the sector risks embedding tools that prioritise efficiency over accountability, accuracy and public benefit.

Read the full report: https://www.adalovelaceinstitute.org/report/scribe-and-prejudice/

Attend the Social Work Show conference for free, where a number of sessions will discuss AI and social work ethics: https://www.compassjobsfair.com/Events/Birmingham
