
This article originally appeared in our September 25 edition of the Diligent Minute Newsletter. For more insights like these, delivered straight to your inbox, subscribe here.
Since the rise of consumer-facing generative AI tools, many corporate directors have been watching the headlines, navigating the risks, tracking the regulatory landscape and discussing ways to incorporate GenAI into business strategy.
Board members have also started using GenAI to prepare for board meetings — which spurs a few questions. Has the board been provided with the right tools and policies around "acceptable use" of GenAI for board work? How well are directors minding those rules of the road? And how can companies ensure the board is keeping its AI journey on a safe, sustainable track?
Diligent Institute’s recent “pulse check,” a focus-group-style survey conducted with Corporate Board Member, revealed some notable findings.
First of all, a full 64% of respondents listed AI advancement and adoption as their top business priority. That ranking — ahead of M&A and supply chain diversification — says something, given the investor-pleasing nature of many deals right now and the ongoing adventure today’s supply chain has become.
Another big development: In their own AI habits, corporate directors have moved from cautious experimentation to more frequent use. Two-thirds of survey respondents told us they employed some form of AI in their board activities — a significant jump from 2023, when less than half (44%) said their organizations were using AI at all.
We also asked directors how they’re using AI. Half of our survey respondents reported using AI tools to prepare for meetings. Roughly 4 out of 10 (39%) use it for intelligence summarization and 26% for benchmarking.
At least for now, comparatively few (only 13%) cited using GenAI for advanced strategic functions, like predictive analysis or real-time risk monitoring.
And here are the findings that concern me most: Nearly half (46%) of directors using AI for board work reported using a system like ChatGPT or Gemini, both of which offer free, consumer-facing models. Less than one-quarter (22%) reported having formal AI governance, ethics or risk policies in place. And a full one-third haven’t even addressed the topic yet.
Keith Enright, who co-chairs the Artificial Intelligence Practice Group at Gibson Dunn, explained in our report how such a scenario — prevalent use of free or consumer-focused AI services, insufficient attention to governance — is risky business.
Imagine a busy director faced with a long flight, a 700-page board packet and a board meeting the next day. It’s tempting to “use one of the popular free consumer-facing AI tools to summarize the packet,” Enright said. But it’s a bad idea for board members, given the sensitive nature of board work.
“There are a number of legal risks to be considered,” Enright pointed out about corporate directors’ vulnerability. “They may risk inadvertently waiving attorney-client privilege. They might unknowingly share sensitive confidential company information with that third-party service provider without appropriate legal protections in place.”
Samantha Kappagoda, an independent director with Credit Suisse Funds, offered steps a board could take to mitigate their risk.
“By extension of the company’s technology adoption policy, directors should consider a policy that governs board use of Generative AI, including which system(s), and for what purpose(s),” she said.
Kappagoda also recommended “a robust discussion on potential discoverability of such use,” along with consideration of how AI tools are deployed: on employer-provided or personal devices either at work or at home (including VPN deployment), for example. “I’d also add choice of platforms, data security, and regulatory compliance to this discussion list, along with continued board education on AI risks and responsible use,” she said.
“Generative AI is evolving very quickly, so it’s also reasonable to expect that board directors will also continue to expand beyond their current use,” Kappagoda cautioned. She advised her peers on corporate boards to “more carefully consider the types of AI tools that they deploy, and for what purpose, and the implications of those choices.”
Read more from Enright, Kappagoda and our survey respondents in the full report: A pulse check of AI in the boardroom.