
How to build AI that society wants and needs

While AI holds great promise for society, the speed of its advancement has far outpaced the ability of businesses and governments to monitor and assess the outcomes properly. The relative ambiguity of regulatory oversight throughout the world prevents AI from directly reflecting society’s needs. It is important that organizations take steps to enable and showcase trustworthiness to all stakeholders and build the reputation of the organization’s AI.

Trust in AI starts with stakeholders understanding that a particular organization uses AI responsibly. It is unlikely that external stakeholders will identify individual AI systems as “trustworthy” or “untrustworthy”; rather, an organization is considered trustworthy or not, and its AI systems inherit the organization’s reputation. In the same way that an organization’s human staff showcases the organization’s values, the behaviours of an AI system are both a manifestation of and an influence on the organization’s reputation.

Training staff is a familiar challenge for most organizations, but the challenges of implementing ethical and trustworthy AI are new and different. They are, however, well documented, with more than 90% of surveyed companies reporting some ethical issues with AI. How can an organization do better?

In the current ambiguous regulatory environment, the complexity of AI systems drives organizations to seek new means to support their development. Most sophisticated technology companies, at a bare minimum, indicate that they place the fairness, ethics, accountability and transparency (FEAT) principles at the centre of AI development, and more than 170 organizations have published AI and data principles. However, many organizations tend to “build until something breaks” without considering who is affected by the break.

We believe that tangible progress toward the responsible use of technology can be made by taking advantage of people, process and technology to bridge these gaps. Here’s how:

1. People

Organization leaders – from managers to the C-suite and board of directors – often have little understanding of the assumptions and decisions made throughout the development and implementation of AI. Regardless of leaders’ understanding of the AI, they own the reputational and financial outcomes (both positive and negative). Data scientists, on the other hand, can find it challenging to take all the guidelines, regulations and organizational principles into account during the development process.

In both cases, the challenge is not generally a lack of understanding of what it means to be responsible; it is a lack of insight into what factors are important at different levels of the organization and how they affect outcomes. Finding ways (processes and tools) to bring disparate groups together can be highly effective. […]
