News analysis

Can I use AI in the boardroom?

Can I use AI in the boardroom? Really? It’s the biggest innovation to hit corporate governance in years, but how much can we actually depend on it?

A recent article in the Financial Times discussed boards appointing bots to sit among them, and how close we are to that reality. While not every AI-related conversation will be that dramatic for directors, it is important to have a realistic conversation about where artificial intelligence can be of practical help. 

So, let’s dive into some of the most common boardroom tasks and see where AI fits.

Can I use AI for that?

1. Digesting board packs

Board packs are essential collections of information (data, narrative background, etc.) provided to directors before a board meeting. They’re supposed to aid decision-making, but a persistent grievance among directors is that they’re too long, with too much to digest in too little time. 

Enter AI – a tool that can summarise even the most complex documents in seconds. With the right configuration, a standard AI tool like ChatGPT, Gemini or Copilot can not only give you a brief synopsis of a board pack but also compare it to other documents, say the minutes from a previous board meeting, and highlight discrepancies or issues that remain unresolved. 
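As a rough illustration of how this kind of comparison could be scripted rather than typed into a chat window, here is a minimal sketch. The prompt wording and the helper function are assumptions, not any vendor’s recommended workflow, and the actual API call is shown only as a comment because it requires an account with a provider.

```python
# Sketch: composing a single prompt that asks a general-purpose LLM to
# summarise a board pack and flag discrepancies against previous minutes.
# The prompt-building is the point; the API call itself is commented out.

def build_comparison_prompt(board_pack: str, previous_minutes: str) -> str:
    """Assemble a prompt asking the model to summarise the pack and list
    discrepancies or unresolved issues versus the previous minutes."""
    return (
        "You are assisting a company director.\n"
        "1. Summarise the board pack below in five bullet points.\n"
        "2. Compare it with the previous minutes and list any "
        "discrepancies or issues that remain unresolved.\n\n"
        f"--- BOARD PACK ---\n{board_pack}\n\n"
        f"--- PREVIOUS MINUTES ---\n{previous_minutes}"
    )

prompt = build_comparison_prompt(
    board_pack="Q3 revenue up 4%; new ESG policy attached for approval.",
    previous_minutes="Board asked for a revised ESG policy by Q3.",
)
# A real call might then look like this (assumed client, not runnable as-is):
# response = client.chat.completions.create(
#     model="gpt-4o", messages=[{"role": "user", "content": prompt}]
# )
print(prompt)
```

The value of scripting it is repeatability: the same comparison runs the same way before every meeting, rather than depending on how each director happens to phrase the request.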

Verdict: Can I use AI for that? Yes, with limitations. 

AI-generated summaries help to contextualise information and give you a zoomed-out view of the big picture, but they do not give you licence to ignore the actual board pack. 

No matter how well you prompt your AI tool, its summary may miss key data or background information, or simply come out “vague” because that’s exactly how the board pack was written in the first place. 

Moreover, failing to read the board pack for any reason puts you straight into “breach of fiduciary duty” territory, because you’re not an informed director in the traditional sense. 

If you continue to have problems with the board pack’s length, the best solution is to raise the issue with the chair and the company secretary. There may be opportunities to condense it the traditional way. If there aren’t, AI is unfortunately not the silver bullet that solves the problem.

2. The financial forensics

Boards, and their sub-committees like the audit committee, are increasingly relied on for financial oversight. This means more numbers and more data to wade through, and a lack of suitably skilled personnel within the ranks often means this data can seem overwhelming. 

Verdict: Can I use AI for that? Yes, with limitations. 

AI tools can be excellent for spotting odd trends, mistakes, red flags for fraud and other financial risks. Historically, humans have spotted these things through occasional forensic analysis or, failing that, random sampling. AI isn’t limited in that way; it can forensically check everything in minutes and flag anything that looks out of the ordinary. 
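To make the idea concrete, here is a minimal sketch of the kind of statistical screen such a tool might run under the hood. The two-standard-deviation threshold and the sample figures are illustrative assumptions, not a forensic standard.

```python
# Sketch: a simple screen over every transaction, flagging values far from
# the norm as candidates for human forensic follow-up.
from statistics import mean, stdev

def flag_outliers(amounts: list[float], threshold: float = 2.0) -> list[int]:
    """Return indices of transactions more than `threshold` standard
    deviations from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [i for i, a in enumerate(amounts) if abs(a - mu) > threshold * sigma]

# Routine payments with one anomalous entry:
payments = [1020.0, 980.0, 1005.0, 995.0, 1010.0, 990.0, 25000.0, 1000.0]
print(flag_outliers(payments))  # → [6] (the 25,000 payment)
```

Note that this only surfaces candidates; deciding whether a flagged payment is an error, fraud, or a legitimate one-off remains a human judgement, which is exactly the division of labour described above.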

But that’s as far as the opportunity goes. Beyond that, financials are absolutely a human responsibility. Every financial statement and decision must be approved by humans, based on information that humans have verified. There is zero wiggle room here. Any indication that financial matters were signed off based solely on AI work would land directors in serious legal trouble. 

3. A digital director

You may well have read about this in the news because it’s the most content-worthy angle: an AI-powered director with a seat at the board table. Some versions of this idea even describe it as a physical android sitting amongst colleagues, joining in the decision-making. 

Verdict: Can I use AI for it? Yes, but no votes.

If you’re eager to explore the potential, you could use an AI model to make recommendations around key policy areas, provide strategic context to decisions, summarise data, or even just provide structure to a board meeting. 

But AI tools cannot act as full voting directors. There’s no legal basis for it. Changing the law to allow it will likely take years. In the meantime, just don’t go there. 

And be warned: using AI in the boardroom this way means you have to be really good at prompting the tool to act the way you want it to. AI has vast potential, but that potential is wasted if you don’t train the model properly. Any AI-based boardroom participant would need training data on governance processes, company strategy, and any other core elements of decision-making that you want input on. 

And then, as always, any information you get from it must be verified.

4. Scenario modelling

Your company might be on the brink of a major strategic decision and wonder if it’s making the right choice. This is especially difficult where there’s no analogue or precedent, such as an acquisition or merger. 

A common answer is scenario modelling, which can be a huge task. 

Verdict: Can I use AI for that? Yes, for simulations, with limitations.

AI models can be programmed to stress-test different scenarios with the best available data. Potential outcomes like market responses and integration difficulties can become clearer through this kind of scenario planning, giving boards something to work from. 
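As a toy illustration of the technique, the sketch below runs a Monte Carlo stress test of a hypothetical acquisition. Every figure and distribution in it is an invented assumption, standing in for what would really be the company’s own data.

```python
# Sketch: Monte Carlo stress test of an acquisition decision. The
# distributions and figures are illustrative assumptions only.
import random

def simulate_acquisition(n_runs: int = 10_000, seed: int = 42) -> dict:
    """Simulate the net value of a deal under an uncertain market response
    and uncertain integration cost; return summary statistics (in $m)."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_runs):
        revenue_uplift = rng.gauss(50.0, 20.0)      # uncertain demand
        integration_cost = rng.uniform(20.0, 60.0)  # uncertain effort
        outcomes.append(revenue_uplift - integration_cost)
    outcomes.sort()
    return {
        "mean": sum(outcomes) / n_runs,
        "p05": outcomes[int(0.05 * n_runs)],   # downside scenario
        "p95": outcomes[int(0.95 * n_runs)],   # upside scenario
        "loss_prob": sum(o < 0 for o in outcomes) / n_runs,
    }

summary = simulate_acquisition()
print(summary)
```

Even a toy model like this makes the board’s question sharper: instead of “is this a good deal?”, the discussion becomes “are we comfortable with roughly this probability of a loss and this downside scenario?”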

However, AI models are also inherently backwards-looking. The “best available data” is always from the past, and will miss black swan events or other shocks that have no historical equivalent. The 2020s have been full of shocks like this, so bear that in mind if you ever use AI in scenario planning.

5. Culture/well-being oversight

Some companies are toying with the idea of using AI to take a zoomed-out look at corporate culture through whatever metrics they consider relevant. 

The goals here may be to gauge the overall mood of employees or other stakeholders, or simply pinpoint areas of “cultural rot”. 

Verdict: Can I use AI for that? No, not really.

It might sound like a nice idea on paper, but using AI for a very human task will inevitably give the impression of carelessness, coldness and a lack of engagement. Stakeholders might feel they’re being subjected to some kind of “AI spy”, designed to monitor rather than gather feedback. 

In particular, you should always avoid using AI to carry out tasks related to individual performance, whether for employees, the C-suite or fellow board members. This kind of monitoring is a human activity and should remain so. If AI ever comes into play here, it should be for simple reasons, like processing sales or revenue numbers.

Remember

Did you notice that most of the verdicts above came with a “with limitations” caveat? Get used to it. AI will always have limitations when it comes to a high-stakes area like corporate governance. 

Ultimately, human directors are the crucial part of the governance machine. While AI might seem like a perfect answer to problems of capacity, data overload and burnout, it will only ever be a tool that you must keep full control of. That’s why it is so important to be clear about where the boundary lies.

About this author

Dan Byrne MA BA is a journalist, writer, and editor specialising in corporate governance and ESG topics. As the Content Manager at The Corporate Governance Institute, Dan creates engaging, insightful content designed to inform and educate global audiences about the latest developments in corporate governance and sustainability.

With a strong focus on research and analysis, Dan consistently delivers compelling narratives that resonate with industry professionals and stakeholders interested in responsible governance and environmental, social, and governance (ESG) issues.

Tags
  • AI governance
  • AI in the boardroom