Banks Aren’t Prepared for the AI Compliance Challenge

January 30, 2024

By: Tyler Brown

Financial institutions (FIs) need a risk, legal, and compliance framework for artificial intelligence (AI) to meet regulators’ expectations. Officials from the Office of the Comptroller of the Currency (OCC), Federal Deposit Insurance Corporation (FDIC), Consumer Financial Protection Bureau (CFPB), and the Federal Reserve stressed this at a symposium two weeks ago. Per the ABA Banking Journal, they said banks will need to ensure their use of the technology complies with existing law, and that responsibility for how it is deployed lies with them, regardless of vendor involvement.

Regulators’ expectations are a challenge for two primary reasons:

  1. Few FIs are equipped to handle technological needs without the support of vendors, much less cutting-edge technology that’s in the early stages of development. And, as the National Institute of Standards and Technology (NIST) notes in its guidelines for AI risk management, among other things, risks to users of AI come from third-party software, hardware, and data.

  2. FIs are still on the hook for risk and compliance obligations based on government rules and regulations, no matter the measures the FI and its vendor can or can’t agree on. Third-party risk then poses a huge problem as AI-driven services are implemented across the industry.

Risk tolerance for AI may differ between partners in the development of an AI-based product, and that can be complicated by how an FI or other organization integrates and uses AI products and services, according to an analysis by Thomson Reuters. Managing that risk is harder still because, as NIST notes, the risk frameworks a vendor applies to its AI systems may not match those used by the organization. An organization and its vendors may not even agree on metrics for measuring AI-related risks; according to NIST, that may be because those risks are still difficult to quantify in the first place.

FIs sit in an uncomfortable spot. They most likely know they need to be ready for the risk and compliance issues AI may introduce, and they are aware of the general risks and measures related to vendor management. But at this stage, a majority probably do not have a firm idea of what the specific AI-related risks are, and they are likely relying on vendors’ expertise to understand how to make use of the technology. As a result, many FIs may not know what they don’t know when it comes to ensuring everyone is on the same page from a risk and compliance standpoint.

The goal, then, should be to get educated. After several years of compliance lapses related to relationships with fintechs, FIs are on notice that new technologies demand robust risk and compliance management. That’s particularly true when a partner is responsible for expanding the bank’s capabilities and business interests. FIs will need to take the steps necessary to fully understand AI and to develop a strategy and framework for using it that includes the right controls. By ensuring it has the right expertise to pursue AI, an institution can reduce its reliance on vendors and advocate for its own risk appetite.
