Companies need to govern AI risks as adoption outpaces oversight: KPMG, IIA Singapore

KPMG and internal auditors’ body launch a guide on AI organisational transformation

Meera Pathmanathan

Published Tue, May 12, 2026 · 04:43 PM
    Jonathan Ho, partner and head of risk consulting at KPMG, and Ong Wei Han, vice-president on the board of governors at IIA Singapore, at the launch of the playbook. PHOTO: KPMG IN SINGAPORE

    [SINGAPORE] Companies are adopting artificial intelligence at a pace that governance and internal audit functions are struggling to match, noted a report by the Institute of Internal Auditors (IIA) on Tuesday (May 12).

    This is raising concerns about firms’ abilities to manage the risks emerging from the use of AI.

    The 2026 Risk in Focus Singapore report by the IIA highlighted that nearly three in four respondents viewed digital disruption and AI as severe risks to companies, but only about half believed internal audit coverage was sufficient to address these risks.

    “The question is no longer just whether control exists, but whether oversight can keep pace with the speed of AI,” said Jonathan Ho, partner and head of risk consulting at KPMG in Singapore.

    He added that the AI-related risks that organisations viewed as most severe were “precisely those that their internal audit functions are least prepared to address”.

    “In Singapore, the gap is especially stark,” he said, at the launch of The Agentic Opportunity: Governing AI for Trust, Integrity and Impact, a playbook jointly developed by KPMG and the IIA Singapore for AI organisational transformation.

    AI adoption accelerating across organisations

    Ong Wei Han, vice-president on the board of governors at IIA Singapore, said AI adoption had become widespread in organisations and was accelerating rapidly.

    “We have greater reliance on AI-generated outputs. Everyone is relying on some form of AI-generated outputs with less independent challenge on those outputs that appear structured and consistent,” he said, adding that human oversight has become increasingly important.

    The report found that while 57 per cent of respondents believe that human capital poses a severe risk to companies, only 8 per cent believe there is sufficient internal audit coverage to assess these risks. 

    “It is important for us to continue to emphasise the need for oversight – human oversight, not AI oversight. It changes how oversight is exercised, especially in how rigorously outputs are questioned, validated and retested,” said Ong.

    Calling AI adoption “a workforce shift, not a technology shift alone”, he noted that internal audit functions would play an important role as AI becomes embedded in business decision-making.

    “The focus is no longer on controls only. You need clarity, accountability and sound judgment in the decisions organisations make,” he said. “This is where, as a profession, we continue to matter in helping organisations navigate change with confidence and integrity.”

    Companies know the risks, but execution remains a challenge

    The annual Risk in Focus report surveys senior internal audit leaders on the top risks facing their organisations. The findings show that while many companies recognise AI as a significant risk area, the difficulty lies in translating awareness into practical governance and risk management measures.

    “We know that’s a risk, but the challenge is how we are going to address these risks,” said Sunita Kaur, technical director at IIA Singapore. She added that the playbook was designed to help organisations bridge this gap.

    Among the recommendations outlined in the playbook was the creation of a unified AI trust framework that integrates AI risk and security.

    Eunice Tan, director of Governance Risk & Compliance at KPMG Singapore, said the unified framework would enable organisations to establish a consistent approach to AI oversight.

    “There needs to be that single-layer framework to drive consistency,” she said.

    She added that AI governance should not be viewed solely as an information technology responsibility.

    “It’s not just a pure IT or personnel responsibility. It is everyone in the business, and this is a business decision,” she said. “If we do it well, we’ll get a competitive advantage.”

    During the session’s fireside chat, panellists discussed how internal audit functions would need to evolve beyond traditional assurance roles to become strategic partners.

    Cynthia Cheong, founder of C3 Consulting Services, said governance structures needed clearly defined responsibilities and accountability.

    “With clear responsibilities, you have accountability,” she said. “We need to expand to be a strategic partner. Both assurance and advisory are just as important.”

    The playbook has also been selected for use at Workforce Singapore’s programmes this year, extending the conversation into broader organisational and workforce transformation efforts across Singapore.

