RSA publishes report calling for early engagement with people in an effort to build confidence in the technology’s use.
A majority of people are uncomfortable with the idea of artificial intelligence being used in decision making in public services, according to the results of a poll carried out for the Royal Society of Arts, Manufactures and Commerce (RSA). In an accompanying report, the RSA has urged that the public be engaged early on the increased use of automated decision systems, in an effort to build confidence in the responsible use of the technology.
A survey carried out for the RSA by polling firm YouGov, which took in responses from 2,000 people, showed that small numbers were aware of the role of automated decision systems in public services: 9% for criminal justice, 14% for immigration, 18% for healthcare and 19% for social support.
When asked whether they supported the use of such systems, the numbers were only slightly higher, or in one case lower: 12% for criminal justice, 16% for immigration, 20% for healthcare and 17% for social support. In most service areas, a majority of 50-60% said they opposed its use, with the rest unsure, although for healthcare the figure in opposition was 48%.
The main concerns about the increased use of automated decision systems are that the technology lacks the empathy required to make important decisions affecting people and communities (61%), that it reduces people’s responsibility and accountability (31%), and that there is a lack of oversight or government regulation of the decisions (26%).
Even with various safeguards – such as the introduction of common principles and the right to request an explanation of a decision – only a third or fewer of the respondents said these would increase their support for using automated decision systems in a range of public and private sector services.
When asked if they were comfortable with the idea of automated systems being used more over time as their accuracy and consistency improve, only 26% said they were comfortable to any degree, with 64% saying they were uncomfortable and the remainder being unsure. […]