On April 14, 2026, KPMG International and the INSEAD Corporate Governance Centre jointly released the AI Governance Principles for Boards, a globally scoped framework designed to help directors ask the right questions, balance opportunity and risk, and exercise meaningful oversight without overstepping into management. The release lands at a moment when the pressure on boards to demonstrate AI competence is no longer theoretical. Proxy advisors, institutional investors, and regulators are watching, and the gap between boards that are prepared and those that are not is widening fast.

The Development

KPMG and INSEAD developed the principles through a collaboration between KPMG’s global AI practice and INSEAD’s Corporate Governance Centre, informed by experienced board members across industries and jurisdictions. The framework is designed to be adaptable, recognizing that regulatory obligations differ by region while the governance challenges boards face are increasingly shared. The principles address AI’s implications across three dimensions: corporate strategy, operational deployment, and the board’s own functioning as a governing body. The launch reflects a growing institutional consensus that AI governance is no longer a technology question. It is a leadership question, and boards are being held to account.

Why It Matters to the Board

The data behind this launch is sobering. KPMG’s own Global AI Pulse Survey found that nearly three-quarters of boards are perceived to have only moderate or limited AI expertise. A separate analysis revealed that only 39 percent of Fortune 100 boards have any formal AI oversight structure, whether a dedicated committee, a director with AI credentials, or an ethics board. Across the broader S&P 500, only 13 percent of companies have at least one director with AI-related expertise. Glass Lewis, the influential proxy advisory firm, has identified AI oversight as the defining theme of the 2026 proxy season. What was once a best practice is becoming a standard of fitness against which shareholders will evaluate director elections. Boards that cannot articulate how they oversee AI procurement, deployment, and monitoring face a credibility problem that will surface publicly in proxy filings and shareholder meetings.

The Risk If You Wait

The reputational and fiduciary risk of inaction is no longer speculative. When AI systems fail, whether through biased outputs, operational errors, or uncontrolled agent behavior, liability questions travel directly to the board. Directors who cannot demonstrate that they asked the right questions, received adequate reporting, or maintained appropriate oversight will face increased scrutiny. Axios reported in early April 2026 that AI governance gaps in the boardroom are becoming a visible reputational liability. Companies perceived as ungoverned on AI are facing valuation discounts and talent headwinds as executives and investors alike signal preference for organizations with credible governance structures. Waiting for a crisis to force the conversation is a strategy that has already cost several boards dearly. The KPMG-INSEAD principles give boards a documented starting point. Boards that ignore it will have no equivalent defense.

What Other Boards Are Doing

The boards moving fastest are not waiting for regulation to mandate action. Leading organizations are forming dedicated AI oversight committees with external expert advisors, commissioning independent AI risk assessments, and requiring quarterly management briefings that cover AI deployment scope, failure incidents, and mitigation status. The National Association of Corporate Directors has launched AI oversight certification programs specifically for sitting directors. Several Fortune 500 boards have added AI fluency as an explicit criterion in director recruitment. Microsoft’s February 2026 Security Blog noted that 80 percent of Fortune 500 companies are now running active AI agents across their operations. Boards that are not receiving structured reporting on those agents are governing blind.

The Governance Question

The central question this framework puts before every board is direct: Do we have the information, the expertise, and the processes in place to govern AI responsibly on behalf of our shareholders? Answering yes requires more than having a technology committee or citing the CTO’s credentials. It requires that the board itself can evaluate AI risk reports, challenge management assumptions, and make informed decisions about AI-driven strategy. The KPMG-INSEAD principles provide a structured path to that capability. They are not a compliance checklist. They are a governance architecture. The boards that treat them as such will be better positioned to protect enterprise value, maintain stakeholder trust, and demonstrate the kind of informed oversight that institutional investors are beginning to demand as a condition of continued support.

Intelligence Bottom Line

The release of the KPMG-INSEAD AI Governance Principles for Boards on April 14, 2026, is the most consequential board governance development of this proxy season. It establishes a public, peer-reviewed framework that proxy advisors, institutional investors, and regulators will use to benchmark board performance on AI oversight for years to come. Directors who have not yet placed AI oversight on the board agenda are now operating behind a documented standard. The organizations that move immediately to assess their current posture against this framework will gain a governance advantage that compounds over time. The organizations that do not will find themselves explaining the gap under conditions they did not choose. The framework is available through KPMG International and the INSEAD Corporate Governance Centre. The time to read it is now.