When we talk about “robots”, people might think of humanoid C3PO-like figures. But the robots taking over many aspects of modern workplaces are better explained as robotic process automation (RPA) – software that can be programmed to do basic, repetitive tasks.
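To make that concrete, a typical RPA task is no more exotic than a short script that files documents and keeps a running log. The sketch below (in Python, with hypothetical folder and file names) is purely illustrative of the kind of basic, repetitive clerical step such software automates; real RPA platforms wrap this sort of logic in their own tooling.

    import csv
    import shutil
    from pathlib import Path

    # Hypothetical locations, for illustration only
    INBOX = Path("invoices_inbox")       # where new invoice files arrive
    PROCESSED = Path("invoices_done")    # where filed copies are kept
    LOG = Path("invoice_log.csv")        # running record of what was processed

    def process_invoices():
        """File each new invoice and append a line to the log -
        the kind of repetitive step an RPA 'robot' typically performs."""
        PROCESSED.mkdir(exist_ok=True)
        with LOG.open("a", newline="") as log:
            writer = csv.writer(log)
            for invoice in sorted(INBOX.glob("*.pdf")):
                shutil.copy(invoice, PROCESSED / invoice.name)
                writer.writerow([invoice.name, invoice.stat().st_size])
                invoice.unlink()  # remove from the inbox once filed

    if __name__ == "__main__":
        process_invoices()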
Automation has been part of many organisations for decades, but increasing levels of adoption, combined with the development of artificial intelligence, mean it is now a key issue for many businesses to consider.
One aspect has hitherto been neglected: how do you audit robots?
The benefits of automation are fairly evident. It can reduce cost, help eliminate errors, and increase profitability. As wasteful manual processes are replaced by automated ones, margins can increase. Robots can execute processes that otherwise consume thousands of staff hours – freeing up skilled employees to work on more complex, added-value tasks – and they are also scalable, making it easier to cope with increased demand. For example, robots can be deployed 24/7 if necessary. Moreover, automation offers a secure alternative to outsourcing or offshoring, as it does not create potential supply chain problems and is more easily subject to internal controls. But there are inherent risks to increasing automation, and auditors need to develop ways to provide assurance that those risks are being managed to an appropriate level.
How many robots are too many robots?
One issue is how many robots an organisation can realistically manage. Multiple departments creating and maintaining their own robots, with varying standards of risk and control, could mean a fragmented – and potentially vulnerable – business, and IT departments cannot be responsible for them all. Strong internal governance frameworks, perhaps with centralised oversight, would be key to ensuring this does not become a business risk.

Another issue is how to decide which processes are suitable for automation. Internal auditors need to consider areas where levels of subjectivity, complexity or variability mean it is better to have people in charge. Even the best artificial intelligence cannot yet “think” like humans, mixing instinct and intuition with logic. It cannot make “common sense” checks, does not assess irregularities or anomalies in the same way, and cannot spot where something operationally useful might be socially or ethically disastrous. […]
read more – copyright by www.cityam.com