8 steps to make your organization AI resilient
This article is brought to you thanks to the collaboration of The European Sting with the World Economic Forum.
Author: Andrea Bonime-Blanc, Founder and CEO, GEC Risk Advisory
- How does the world govern for the unknowns of exponential tech, including generative AI (GenAI)?
- Building organizational resilience around the development and deployment of GenAI is essential.
- From embedding GenAI metrics into performance measurement to fostering understanding of GenAI at the highest levels of the organization, here are eight ways to build organizational AI resilience.
Here are two quotes to consider around the development of Artificial Intelligence (AI):

From Kent Walker, C-suite member at Google: “AI is much more than a chatbot. It’s a once-in-a-generation technology shift, one that may be on par with electricity as a general-purpose tool that will improve lives across the globe.”

From Tristan Harris, co-founder of the Center for Humane Technology: “With AI, it’s now the race to deploy and onboard humanity onto your AI platform faster than your competitors. It means don’t do the safety testing, don’t do the risk analysis, because if you wait too long, you just lose market share (…) it’s a race to recklessness.”

So, which is it? It’s both. Let me explain.
How is the World Economic Forum creating guardrails for Artificial Intelligence?
In response to the uncertainties surrounding generative AI and the need for robust AI governance frameworks to ensure responsible and beneficial outcomes for all, the Forum’s Centre for the Fourth Industrial Revolution (C4IR) has launched the AI Governance Alliance. The Alliance will unite industry leaders, governments, academic institutions, and civil society organizations to champion responsible global design and release of transparent and inclusive AI systems.
In the past few years, we have witnessed a steady infusion of exponential technologies – artificial intelligence (AI), crypto, blockchain, synthetic biology, nanotech and more – into every walk of life, and we are bracing ourselves for more soon. The release of ChatGPT and other generative AI (GenAI) into the global wild in late 2022 underscored this phenomenon.

As a non-technologist focused on the governance, risk, ethics and impact of tech trends and megatrends, I ask: how does the world govern for the unknowns of exponential tech? How do businesses, governments, NGOs and communities manage the risk of the seemingly ungovernable while simultaneously leaving room for creativity?

In this article, I apply my model of organizational resilience from Gloom to Boom: How Leaders Transform Risk into Resilience and Value, which I have deployed frequently, to the phenomenon of GenAI.
Suppose we apply the notion of organizational resilience to GenAI. In that case, we might arrive at the following definition: GenAI resilience is an organization’s ability to understand and sustainably maintain, build and deliver intended AI-enhanced business outcomes – both in products and services using AI and GenAI tools and techniques – while deliberately avoiding risks, downsides and adverse AI- and GenAI-infused events (such as the use and abuse of GenAI tools to create and deploy more effective and powerful cyberattacks).
Building organizational resilience for GenAI
As with cybersecurity, building organizational resilience around the development and deployment of GenAI is essential. While in the case of cybersecurity, we are talking largely about risk management and liability prevention, in the case of GenAI, we’re talking about a more complex and multifaceted phenomenon filled with opportunity as well as risk.
The eight steps to GenAI organizational resilience

1. ‘Lean-in’ GenAI governance and leadership
The tone is always set from the top, but the best tone is set from a top that is emotionally intelligent and continuously educated about the promises and perils of the marketplace, including relevant exponential tech and GenAI. Leaders must lean in to understand how GenAI affects their business and strategy and how it can increase capacity, productivity, talent, workforce, products, services and the business model itself.
2. Create a deliberate culture of AI and tech ethics
Every organization must consciously and proactively consider, identify and integrate the ethical and cultural implications of its tech footprint – including its purchase, use, development and deployment of GenAI. At the top and throughout the entity – through performance management metrics and incentive systems – several critical internal actors (starting with the CEO, ethics, compliance, talent and/or risk officers among others) must lead in the design of a measurable culture of ethics when it comes to technology.
3. Understand GenAI stakeholders
Another vital element in understanding the issues, risks and opportunities embedded in the deployment of GenAI within your organization is to identify the stakeholders and understand their positions, expectations, fears, red lines and deepest desires when it comes to the use of their and others’ data and your use of that data in algorithms, products and services. An organization’s reputation risk and opportunity may well depend on this dynamic.
4. Adopt agile, systematic risk management

All organizations – small, medium and large – must have some form of systematic risk management customized to the entity, tuned into the changing environment and adaptable in an agile and savvy way. This means having a structured system of enterprise risk management that picks up the nuances of technological developments relevant to the business and can identify and triangulate the most likely and impactful substantive issues, risks and opportunities relevant to the company.
5. Integrate GenAI into ESGT/business strategy
If your organization does not yet encompass the full spectrum of relevant ESG plus tech (or what I call ESGT) in its strategy development and implementation, it’s time to change that. Integrating an entity’s most relevant and material ESGT issues, risks and opportunities into business strategy is now a critical exercise, not an optional one. Depending on your sector and business footprint, GenAI and other relevant tech issues, like cyber and data privacy and collection, need to be part of the overall strategy brainstorm.
6. Embed GenAI metrics into performance management
The World Economic Forum recently published a major contribution to measuring digital trust in an organization encompassing GenAI. Entities must develop and implement relevant digital trust metrics to ensure proper guardrails and policies are in place. These metrics should address both organizational and individual performance – from the CEO to newly onboarded staff.
7. Deploy tech-related crisis preparedness
Any resilient organization will have a set of policies, practices and teams that deal with – and are continuously scanning the horizon and rehearsing – scenarios including the most likely or consequential digital risks that could turn into crises, including considering scenarios involving the downsides of exponential tech and its impact on crucial stakeholders. For example, Google deploys a “red-team” approach to understanding the downsides of insecure AI. They call it “Google’s Secure AI Framework,” which consists of six practices, from expanding strong security foundations to harmonizing controls across the ecosystem.
8. Harness a continuous improvement and innovation ethos for GenAI
A final element in building a GenAI-resilient organization is to ensure that throughout all previous steps there is a feedback loop for lessons learned, root-cause analysis and the sharing of data. This feeds into continuous improvement of enterprise policies, management and performance, as well as an innovation ethos that creates improved and new products and services.

As exponential technologies infuse our everyday life, we need to act nimbly and thoughtfully on their governance, risk, ethics and impact implications. Consciously and conscientiously building these concepts into the eight steps of organizational resilience can provide a useful roadmap to accomplish this.