Financial institutions (FIs) need to consider a risk, legal, and compliance framework for artificial intelligence (AI) to meet regulators’ expectations. Officials from the Office of the Comptroller of the Currency (OCC), Federal Deposit Insurance Corporation (FDIC), Consumer Financial Protection Bureau (CFPB), and the Federal Reserve stressed this at a symposium two weeks ago. Per the ABA Banking Journal, they said that banks will need to ensure their use of the technology complies with existing law, and that responsibility for how it is deployed lies with the banks — regardless of vendor involvement.
Regulators’ expectations are a challenge for two primary reasons:
First, few FIs are equipped to handle their technological needs without the support of vendors, much less cutting-edge technology that is still in the early stages of development. And, as the National Institute of Standards and Technology (NIST) notes in its guidelines for AI risk management, risks to users of AI come from, among other things, third-party software, hardware, and data.
Second, FIs remain on the hook for risk and compliance obligations under government rules and regulations, no matter what measures the FI and its vendor can or can’t agree on. Third-party risk therefore poses a huge problem as AI-driven services are implemented across the industry.
Risk tolerance for AI may differ between partners in the development of an AI-based product, and that can be complicated by how an FI or other organization integrates and uses AI products and services, according to an analysis by Thomson Reuters. Managing that risk is also a challenge because the risk frameworks a vendor uses for AI systems may not match those used by the organization, according to NIST. An organization and its vendors may not even agree on metrics for measuring AI-related risks, perhaps because, as NIST notes, those risks are hard to quantify at this stage in the first place.
FIs sit in an uncomfortable spot. They most likely know they need to be ready for the risk and compliance issues AI may introduce, and they are aware of the general risks and measures related to vendor management. But at this stage, a majority probably do not have a firm idea of what the specific AI-related risks are. And they are likely relying on vendors’ expertise to help them understand how to make use of the technology. As a result, many FIs may not know what they don’t know when it comes to ensuring everyone is on the same page from a risk and compliance standpoint.
The goal, then, should be to get educated. After several years of compliance lapses related to relationships with fintechs, FIs are on notice that new technologies demand robust risk and compliance management. That’s particularly the case when a partner is responsible for expanding the bank’s capabilities and business interests. FIs will thus need to take the steps necessary to fully understand AI and develop a strategy and framework for using it that includes the right controls. By ensuring it has the right expertise to pursue AI, an institution can reduce its reliance on vendors and advocate for its own risk appetite.
Banks Aren’t Prepared for the AI Compliance Challenge
January 30, 2024
By: Tyler Brown