Banks and credit unions are exploring the impact of AI at different paces. But as the industry works through this process, it is important to remember that exposure to AI is not limited to a financial institution’s (FI’s) own initiatives. FIs do not exist in a vacuum; they are part of an ecosystem of solutions and technology providers. So even if an FI doesn’t currently have an AI strategy, that doesn’t mean it has no exposure. Third-party AI risk refers to the risk a bank or credit union faces through its vendors’ use of AI.
According to bankers we’ve talked with, many vendors have implemented AI capabilities, and they don’t always notify their clients or allow them to turn those capabilities on and off. This exposes FIs to risks they may not be aware of, such as algorithmic bias and data security issues. The problem is especially pertinent for FIs that control their own infrastructure, but there are implications for mainstream institutions as well. As stated in the Financial Stability Oversight Council’s 2024 Annual Report:
“Financial institutions’ relationships with service providers may introduce new risks or amplify existing ones, in part because reliance on a third party may reduce an institution’s access to, and its direct control and oversight of its data or systems.”
The report goes on to say:
“Smaller firms, such as community banks and credit unions, may have lesser negotiating power to obtain certain contractual rights, fewer resources and ability to conduct due diligence on and monitor a service provider’s practices (e.g., information security, internal controls, assurance testing), and lesser ability to terminate and substitute services in case of operational challenges. And yet, due to their relatively small size, these institutions are increasingly relying on third parties for essential lending, compliance, technology, and operational-related matters.”
As time goes on, this issue will only become more apparent for FIs of all sizes. While many FIs may depend on third-party reviews, some will conduct their own audits of their vendor ecosystems and put processes in place for testing and approval. Regardless of approach, here are a few considerations, based on our research:
While you can argue that not every bank or credit union needs an AI strategy (yet), executives should not ignore the implications of the technology permeating their ecosystems. For some institutions, the risk will be lower because they have fewer vendors and therefore fewer exposure points, but it is unlikely to be zero. A good first step is to create an AI governance policy that, at a minimum, defines how employees may use tools like ChatGPT and Microsoft’s Copilot, where available. For more progressive institutions, taking stock of the vendor environment sooner rather than later is a worthwhile exercise.