Can Artificial Intelligence be used to identify perpetrators of domestic abuse?
A research project from four universities is testing whether machine learning can be used to analyse data such as text and social media messages, both to bring cases to court more quickly and to address concerns about misogyny in police forces.
A new research project is looking at how artificial intelligence (AI) might identify perpetrators of domestic abuse through coercive control by analysing the text messages they send.
The research could help bring rape cases to court more quickly and address victims' concerns about misogyny within the police.
The project is a collaboration between four universities – London South Bank (LSBU), Brighton, De Montfort, and Edge Hill – and has been backed with a grant of £115,046 by the Home Office.
A member of the research team, Tirion Havard, Associate Professor in the Department of Social Work at LSBU, said that it takes on average 1,000 days for a rape case to come to trial.
“In that time, many survivors withdraw as they cannot face the re-traumatising effect of going to court. One reason it takes so long is that there is so much digital data that it takes ages to analyse it all.
“This project is a feasibility study of using AI, and we want to test whether it will work in the way we hope. We take texts sent to women from mobile phones and analyse them.”
“The idea is that by using AI to analyse the words in those text messages, we can see whether a sentence in a text is written sarcastically, and then decide the emotion attached to it – whether it is written in jest, or in anger and so on.”
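As a loose illustration of the kind of tone analysis described above, the sketch below scores a message against small word lists to label it as anger, jest, or neutral. This is only a toy example: the word lists are invented for illustration, and a real system of the kind the project envisages would use trained language models rather than fixed lexicons.

```python
# Toy illustration only: label a message's likely tone by scoring it
# against small, invented emotion lexicons. A real system would use
# trained language models, not word lists.

ANGER_WORDS = {"never", "stupid", "hate", "warned", "dare"}  # hypothetical lexicon
JEST_WORDS = {"haha", "lol", "joking", "kidding"}            # hypothetical lexicon

def label_tone(message: str) -> str:
    """Return a rough tone label ('anger', 'jest', or 'neutral') for a message."""
    words = set(message.lower().split())
    anger = len(words & ANGER_WORDS)
    jest = len(words & JEST_WORDS)
    if anger > jest:
        return "anger"
    if jest > anger:
        return "jest"
    return "neutral"

print(label_tone("Don't you dare leave, I warned you"))  # anger
print(label_tone("haha only joking see you later"))      # jest
```

The point of the sketch is simply that the same words can be scored differently depending on context, which is why the researchers want to test whether AI can reliably separate jest from anger.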
Professor Havard said that the project is also looking at the feasibility of using AI to address concerns about misogyny in police forces. A recent investigation by the Independent Office for Police Conduct concluded that misogyny is a major problem in the Metropolitan Police, and cultural change within the force is needed to address it.
Professor Havard said that women are often reluctant to give their phones to police for investigations, and that AI could address this. The research will look at ways of analysing and, in the early stages of an investigation, anonymising data before it is passed to police.
“In that way the victim’s identity would be hidden and the computer, not the police, would do the initial analysis. It would be very, very quick. The information on the phones is converted into a series of digital characters, and this would mean that the identity of the victim would be hidden from the police in this initial analysis.
“We need to have survivors [of sexual violence] on board to see what reservations they have about this approach, and work with them to address those.
“We have launched a short survey asking survivors of domestic abuse to tell us, anonymously, about their own attitudes to their phones being used in this way, and would appreciate any help social workers can give us by passing on the details to those who might want to complete the survey.”
The LSBU survey for survivors of domestic abuse can be found online at https://lsbu.onlinesurveys.ac.uk/using-artificial-intelligence-to-identify-perpetrators-of
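For readers curious how converting information into "a series of digital characters" can hide a victim's identity, one common technique is salted hashing, which replaces an identifier with a fixed-length token. The sketch below is purely illustrative and is not the project's actual pipeline; the function name, salt, and phone number are invented for the example.

```python
import hashlib

def pseudonymise(identifier: str, salt: str) -> str:
    """Replace a name or phone number with a fixed string of characters,
    so analysis sees a consistent token but not the real identity."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:12]

# The same identifier always maps to the same token, so patterns across
# messages are preserved even though the identity itself is hidden.
token = pseudonymise("+44 7700 900123", salt="project-secret")  # invented example values
print(token)
```

Because the mapping is consistent, an automated first pass could still link messages from the same sender without anyone at the police seeing who that sender is.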