
Is it right to use AI to identify children at risk of harm?


Technology has advanced enormously in the 30 years since the introduction of the Children Act 1989, which shaped the UK’s system of child safeguarding.

Copyright by www.theguardian.com


Today, computer-generated analysis can help social workers assess the probability of a child coming on to the at-risk register. It can also help show how they might prevent that happening.

But with technological advances come dilemmas unimaginable back in 1989. Is it right for social workers to use computers to help promote the welfare of children in need? If it is right, what data should they draw on to do that?

Maris Stratulis, national director of the British Association of Social Workers England, first voiced concerns last year. She remains worried. “Machine learning in social care still raises significant issues about how we want to engage with children and families,” she says. “Reports on its use in other countries, such as New Zealand, have shown mixed results including potential unethical profiling of groups of people.”

Stratulis is also concerned at the role of profit-making companies in the new techniques. “Rather than focusing on learning from machines and algorithms, let’s focus on good, relationship-based social work practice,” she says.

Machine learning is an application of artificial intelligence (AI). Computer systems enable councils to number-crunch vast amounts of data from a variety of sources, such as police records, housing benefit files, social services, education or – where it is made available – the NHS. In children’s services, a council may ask for analysis of specific risk factors which social workers would otherwise not know, such as a family getting behind on the rent, which can then be triangulated with other data such as school attendance.
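The triangulation the paragraph describes can be sketched in miniature. The code below is purely illustrative: every field name, record and threshold is hypothetical, and real council systems involve data governance, consent and statutory verification far beyond anything shown here. The point is only the shape of the logic – a family is flagged for human review when independent indicators from separate databases co-occur, not on any single signal.

```python
# Illustrative sketch only. All identifiers, records and thresholds are
# invented for the example; they do not describe any real council system.

# Hypothetical per-family records drawn from two separate data sources,
# keyed by an anonymised family ID.
housing = {
    "fam-01": {"rent_arrears_weeks": 6},
    "fam-02": {"rent_arrears_weeks": 0},
}
education = {
    "fam-01": {"school_attendance_pct": 71},
    "fam-02": {"school_attendance_pct": 96},
}

def triangulate(family_id, housing, education,
                arrears_threshold=4, attendance_threshold=85):
    """Flag a family for human review only when indicators from two
    independent sources co-occur; one signal alone is not enough."""
    h = housing.get(family_id, {})
    e = education.get(family_id, {})
    arrears = h.get("rent_arrears_weeks", 0) >= arrears_threshold
    low_attendance = e.get("school_attendance_pct", 100) < attendance_threshold
    return arrears and low_attendance

print(triangulate("fam-01", housing, education))  # both indicators present: True
print(triangulate("fam-02", housing, education))  # neither present: False
```

Even in this toy form, the output is a prompt for further checks by a social worker, not a decision – which mirrors the caution the councils quoted below describe.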

“We don’t decide what databases to trawl – the client does,” says Wajid Shafiq, chief executive officer at Xantura, a company he set up 11 years ago which has recently been working with Thurrock council and Barking and Dagenham council in east London. “And the public sector is very aware of the ethical issues.”

Most councils trialling predictive analysis are using commercial organisations to set up and run the analyses. Only one, Essex, is known to be using its own purpose-built database collection. Thurrock is working with Xantura in using data analytics to help, in the words of a council spokesperson, “better identify those most in need of help and support, and to reduce the need for statutory interventions”.

Such is the sensitivity of the issue, however, that all councils dipping their toes into the machine-learning water are at pains to stress the caution they are adopting. “It is important to emphasise that data analytics systems are only part of the process,” says the Thurrock spokesperson. “Further verification and checks are carried out in line with statutory requirements prior to any intervention.” […]
