Florin Rotar, chief AI officer at Avanade, warns that there is a growing divide between organizations whose boardrooms are getting up to speed on generative artificial intelligence and those that need to play catch-up.
“I’m a little concerned that we’re going to see this divide grow, and some left behind,” Rotar says during a virtual conversation for Fortune’s The Modern Board series, presented in partnership with Diligent.
Avanade, an IT services and consulting firm, has worked with hundreds of organizations, and some boardrooms are getting “quite sophisticated in terms of using AI themselves” in those conversations, Rotar says. Use cases that have been deployed include relying on generative AI to better prepare for board meetings, running simulated activist-investor exercises, and conducting AI-powered tabletop exercises to better plan for business risks.
Risks without proper AI governance
But as board members weave generative AI into their workflows, it can pose risks to companies if proper AI governance isn’t in place, including clear guidelines on how to follow security policies and procedures without disclosing sensitive company information. Over the past two years, employers have had to move quickly to set policies around the safe use of AI, especially after the explosion of consumer interest that followed the launch of the chatbot ChatGPT.
The thinking was that employees were going to use generative AI whether it was blessed by management or not, so HR and IT teams established guardrails, offered classes and other forms of training, and set up internal AI playgrounds where employees could experiment safely. Governance experts say the same logic should apply to board members as well.
“I think what we’re seeing is definitely a need for a more fundamental understanding of the basics with the board,” says Nithya Das, chief legal and administrative officer at Diligent, a governance, risk, and compliance company. “You have to assume that they’re going to find their own tools, and that can raise various security and privacy concerns for you as an organization, given the sensitivity of the board’s work and the board’s content.”
Das says training classes could be helpful in getting boards up to speed on AI, similar to the education done in recent years as cybersecurity threats came into focus. One such course, recommended by Rotar, is Stanford University’s “The AI Awakening: Implications for the Economy and Society.”
AI is a growing priority for corporate directors
Diligent previewed a soon-to-be-published survey from the company’s research arm showing that generative AI ranks sixth on the priority list for board directors of U.S.-based public companies in 2025, trailing priorities such as driving growth and improving financial performance, but ranking ahead of cybersecurity and workforce planning.
A sixth-place ranking might not sound like much, but Das says it’s an indication that AI is top of mind. Leaders are still sorting out how well versed their management teams are on AI, working through concerns about data privacy, and grappling with hallucinations, which occur when AI models create false information that isn’t grounded in valid data.
“We think most boards and companies are at the beginning of their AI journey, but they are definitely very curious about AI,” says Das. “We expect this to continue to be a focus for 2025.”
Even at digitally native companies, management needs to explain to their boards the difference between generative AI technologies and the more traditional AI and machine learning the company has previously deployed, says Fiona Tan, chief technology officer at e-commerce furniture and home goods retailer Wayfair.
“For the board, it’s really understanding some of the nuances between predictive AI… what the generative capabilities are, what the capabilities of large language models are, and what the risks are,” Tan says. From there, directors can think about where to deploy generative AI. For a company like Wayfair, that can include tailoring content to each specific shopper’s needs and creating more personalized experiences.
Tan says the management team should be responsible for identifying opportunities to grow the business with generative AI and articulating that vision to the board. That should also include a closer look at emerging AI startups developing solutions that may be better to buy than to build internally from scratch.
Looking for ways to disrupt your own company
“For the board, it’s pushing to make sure we’re taking a bit of an outside-in approach,” Tan says. “Where do we need to go and disrupt ourselves?”
Omar Khawaja, chief information security officer at Databricks, a data and AI software company, says that board members and management shouldn’t be avid users of AI without also understanding how these systems work and how they can be applied to the business.
“In fact, a trap I often see boards and other leaders fall into is: ‘I’ve used AI, I know how it works, it’s been three months, why haven’t you done x, y, and z? Why aren’t the problems magically solved?’” says Khawaja.
He likens this common miscalculation about AI development to watching cooking videos on TikTok: it may take only a few minutes to watch an influencer prepare a dish, but hours to make the same dish at home.
“Managing, governing, and curating your data is where 90% of the work is,” says Khawaja. The rest, he says, is about training a model and applying it to the appropriate use cases.