The trust gap around data and analytics (D&A) and artificial intelligence (AI) persists: just 35 percent of executives say they have a high level of trust in the way their organization uses D&A. Concern over the risks of D&A and AI is correspondingly high: more than 65 percent of executives have some reservations about, or active mistrust in, their data and analytics, and 92 percent are concerned about the negative impact of D&A on corporate reputation. Yet a majority of senior executives (62 percent) say that technology functions, not the C-level and functional areas, bear responsibility when a machine or an algorithm goes wrong, according to a new survey from KPMG International.
KPMG International’s Guardians of trust report suggests that the growing interrelationship between humans and machines calls for stronger accountability at the C-level rather than within technology functions, along with proactive governance and the strategic and operational controls needed to ensure and maintain trust. As companies shift to fully digital, analytically driven enterprises, the survey also asserts, the management of machines is becoming as important as the management of people.
“Once analytics and AI become ubiquitous, it will be imperative and more difficult to manage trust,” said Thomas Erwin, Global Head of KPMG Lighthouse - Center of Excellence for D&A and Intelligent Automation. “With the rapid take-up of predictive analytics, we should prepare now to bring appropriate governance to this Wild West of algorithms. The governance of machines must become a core part of the governance of the whole organization, with the goal being to match the power and risk of D&A with the wisdom to use it well.”
The research, which surveyed 2,190 senior executives across nine countries, also found that executives in the US and the UK are the least likely to trust their D&A, with distrust at 42 percent in the US and 43 percent in the UK. Only small percentages of respondents in Brazil and India mistrust their D&A (15 and 8 percent, respectively).
Who’s responsible when things go wrong?
Even with widespread concern over the reputational and financial risks of analytics errors or misuse, respondents were not clear about who should be accountable if a poor business decision results in financial loss or the loss of customers.
In addition to the 62 percent who said primary responsibility should lie with technology functions within their organizations, 25 percent thought it rested on the shoulders of the core business, and 13 percent pointed to regulatory and control functions.
Taking a closer look at which roles within the C-suite should hold the blame when analytics go wrong, the broad distribution of responses suggests a lack of clarity: only 19 percent named the CIO, 13 percent the Chief Data Officer, and just 7 percent C-level executive decision makers such as the CEO.
“Our survey of senior executives is telling us that there is a tendency to absolve the core business for decisions made with machines,” said Brad Fisher, US Data & Analytics Leader and a partner with KPMG in the US. “This is understandable given technology’s legacy as a support service and the so-called ‘experts’ in all technical matters. However, it’s our view that many IT professionals do not have the domain knowledge or the overall capacity required to ensure trust in D&A. We believe the responsibility lies with the C-suite.”
What should good governance look like?
Respondents’ uncertainty about who is accountable raises the question of what proactive governance should be in place to ensure and safeguard the use of analytics.
“As organizations begin to think about behavior of machines as parallel to the behavior of people, they should also consider new models of governance to support the leaps in trust that the human-machine workforce requires,” Erwin said. “At a fundamental level, accountability of machines must be held firmly by the CEO and functional leaders.”
Respondents’ recommendations strongly indicate that any governance framework should include standards and controls beyond the technical, covering strategic, cultural and ethical areas – the domains of the C-suite.
The five recommendations for building trust within an organization, according to respondents in the survey, are:
1) Develop standards to create effective policies and procedures for all organizations
2) Improve and adapt regulations to build confidence in D&A
3) Increase transparency of algorithms and methodologies
4) Create professional codes for data scientists
5) Strengthen internal and external assurance mechanisms that validate and identify areas of weakness
“Building and ensuring trust across the analytics/AI lifecycle requires organized, scalable and distributed approaches,” Erwin said. “We are seeing a lot of businesses experimenting in this area which will likely drive future standards and new governance frameworks.”
For additional information, or to view the report “Guardians of trust - Who is responsible for trusted analytics in the digital age?”, visit https://goo.gl/ZwB7oG.