One year on from the launch of ChatGPT, and with another war and more record-breaking weather events, risk leaders around the world have much prioritising to do. Recent research into risk culture by ACCA reveals growing concern about leadership in the face of today’s sudden surprises and exponential change, and shows the extent to which ethics is at the core.
Throughout 2023, our survey responses have shown a steady decline in trust in the board’s and senior management’s ability to define their sustainability and generative artificial intelligence (AI) strategies.
‘The underestimated risk is the potential market disruption caused by AI to parts of our customer base and our lack of readiness for the impacts,’ said an ACCA member from New Zealand.
A member in Saudi Arabia, where there is tremendous investment in AI, reported: ‘There are skills and capability shortages across the sector, at all levels, including the board.’
Keeping up
The amount of change and innovation over the past year or two is equivalent to what we’d normally expect in 20 years, according to a chief risk officer (CRO) in the Middle East in response to ACCA’s recent banking sector risk report, the first in a series of industry-specific reports due out next month.
‘Everyone is in constant learning mode, and there are major challenges for leaders from generative AI, ChatGPT and the metaverse,’ says the CRO. ‘Banks are not just competing with each other anymore but with big payment systems, whether Apple, Google or Amazon.
‘Boards need to be thinking about how they are going to tackle the implications and how these affect their existence – what talent will be wiped out over the next few years and what talent will be needed. There is some serious decision-making that is going to take place in our industry.’
A member in the UK banking sector said it would ‘take some time to really understand how AI systems and their use cases, including for ESG [environmental, social and governance] reporting, can be used responsibly and ethically.
‘We need to think about the ability to implement them at scale, with levels of precision in integrating them into our business processes, and how that will withstand the discipline and rigour of the first through to the third line of defence.’
Culture can support
As with sustainability in recent years, there’s bound to be a tidal wave of regulatory changes and reporting requirements relating to AI. Risk culture and governance can help point organisations in the right direction as all this pans out.
It is vital for human values and corporate culture to be aligned with governance. Accounting professionals are trained to understand how business models work, how to assess risks and how to ensure effective controls and governance are in place, all underpinned by our codes of ethics. That makes us well placed to support the mindset shifts needed to steer the organisation in the right direction.
‘Boards need to have the visibility to understand how to implement AI-led businesses, how to assess the opportunities, and of course the risks, and how to support management in such implementation while, importantly, remaining sceptical and providing the necessary oversight,’ said Alan Johnson FCCA, former chair of the International Federation of Accountants and new chair of the Good Governance Academy (GGA), speaking at the GGA’s colloquium, ‘The risks and opportunities with generative AI’.
They can also strike the balance between what the numerical data tells us and what is happening in practice, which is invaluable for informing major decisions and maintaining trust and transparency during rapid disruption.
For example, as the 2030 emissions targets draw near, organisations that are not on track to comply will be under increased scrutiny to explain what they are or are not accomplishing.
Moral compass
The board has a fiduciary duty to create the organisation’s moral and risk compass and to define its plan of action. Accounting professionals can help weigh the different scenarios, good and bad, and short term versus long term. But getting conversations about risks and ethics cascaded down and elevated up the organisation is the secret sauce of a successful risk culture.
Ethics is central to building strong risk cultures – not only in terms of creating guardrails but also in maintaining curiosity and critical thinking. The challenge is understanding how business models are being forced to transform, and how to get everyone to adhere to an ethics code so that organisations roll out their products and services in the way we desire and in line with financial statements.
Organisations need to create transparency and accountability for the use of AI applications, ensuring there is human control over the data going in and the outputs coming out, and find a way to measure and incentivise a risk culture that everyone owns and benefits from.
More information
Listen to ACCA’s risk culture podcast series. In episode 3, Dr Roger Miles, a member of ACCA’s risk culture special interest group, says accounting professionals have a very important job ‘to forage for the truth’.
Watch on demand the ‘Accounting for the Future’ conference session ‘Fraud risk: who’s in charge?’