Governing AI – Six Questions Every Board Must Answer

If your board cannot answer these six questions with confidence, you are not negligent. But you are exposed. And that exposure is growing every week.

Over the past few months, I have outlined the key AI risks that boards tend to underestimate – from shadow AI to competitive drift, from diffused accountability to the gap between policy and practice. This blog brings it all together into something practical: six questions that every board should be able to answer, and a short diagnostic to test how exposed you really are.

I originally framed this as four questions, but as my work with boards has deepened and the landscape has shifted, I have sharpened it into six. These are not theoretical. They are the questions I ask in boardrooms, and the ones that most often reveal the governance deficit.

The Six Questions

One: Where is AI already influencing decisions, formally or informally? This is always where I start. Most boards can describe their approved AI initiatives. Far fewer can account for the informal use happening across the organisation right now.

Two: Who is explicitly accountable when those decisions are challenged? If you cannot name the person, the accountability is diffused. And diffused accountability is no accountability at all.

Three: Is our governance keeping pace with AI adoption, or lagging behind behaviour? Be honest. If your governance processes still run on annual cycles while AI adoption moves weekly, the answer is already clear.

Four: Do directors have sufficient AI literacy to provide informed challenge? Delegation is not the same as understanding. A board that cannot challenge AI decisions is a board that has outsourced its own judgement.

Five: Where might shadow AI be emerging due to unclear or restrictive governance? If people believe AI use is prohibited or politically sensitive, they will experiment quietly. The absence of safe territory creates invisible risk.

Six: What value or efficiency gains might we be missing through hesitation or inertia? This is the one boards forget to ask. The cost of inaction is just as real as the cost of getting it wrong.

Boards that cannot answer these questions with confidence are not failing. But they are exposed. And the purpose of AI governance is to make that exposure visible and manageable before it becomes a crisis.

 

A Diagnostic Reflection: How Exposed Are We?

I designed this short diagnostic to provoke honest reflection rather than provide comfort. It works best when each director considers it individually first, and then the board discusses it collectively. That sequence matters, because group discussion too early tends to smooth over the discomfort, and the discomfort is where the value lies.

Could we list all material AI use cases across the organisation today? If the answer is no, governance is operating without a complete picture.

Do we know which decisions are influenced by AI outputs? This is about understanding where AI shapes judgement, not just where it automates tasks.

Is accountability for those decisions named, documented, and understood? Not assumed. Not implied. Named.

Would directors feel confident explaining our AI oversight to a regulator or a journalist? This is a good stress test. If the answer creates hesitation, that tells you everything.

Are people clear about where AI use is encouraged, restricted, or prohibited? Ambiguity is the breeding ground for shadow AI.

Have we quantified the cost savings, time gains, or efficiency improvements AI could unlock? If you have not measured the opportunity, you cannot govern it or pursue it strategically.

Are we actively choosing our current pace of adoption, or defaulting into it? There is a difference between a deliberate decision to proceed cautiously and simply not having made a decision at all.

 

If these questions feel uncomfortable, that discomfort is useful. It signals exactly where governance attention is needed.

I often say to boards that diagnostics are not about judgement. They are about clarity. And in an AI-influenced world, clarity is what allows boards to act with confidence rather than react under pressure.

The governance deficit is real, and it is growing. But it is also entirely within the board’s power to close. That is what good governance has always been about – moving from conformance to performance.

This is the conversation we need to have. The AI Wake-Up Call is a one-day immersive experience that equips board members to answer these six questions with confidence, using governance as the foundation for accountable AI adoption.
