CCG Catalyst Commentary

Whose Data Is It? 

February 17, 2026

Banks and the Fight for Data Ownership, Extraction, and Modernization

Banks and credit unions generate enormous volumes of customer and transaction data every day. That data, properly leveraged, is the foundation of AI-driven analytics, personalized customer experiences, open banking integrations, and every meaningful modernization initiative. The problem is that most community banks, regional banks, and credit unions cannot freely access, extract, or use their own data. Core Service Providers (CSPs) hold it in proprietary formats behind restrictive APIs and contractual walls, effectively turning a bank’s most strategic asset into a source of dependency. In this fifth installment of our series on the OCC’s Request for Information (RFI), I examine the data ownership challenge, the barriers to extraction and modernization, and what the OCC and community banks themselves can do to break the cycle.

This is the “data trap”: in principle, the data belongs to the bank and its customers; in practice, the CSP controls it. Legacy core systems store data in proprietary formats that resist extraction without significant mapping, cleansing, and validation work. APIs, where they exist, are often limited in scope, throttled, or priced in ways that make comprehensive data access prohibitively expensive. Contracts may technically acknowledge data ownership but impose fees, timelines, or procedural barriers that make exercising that ownership impractical.

This is not an abstract governance concern. It has direct operational consequences. A bank that wants to deploy AI-powered credit analytics needs clean, comprehensive, real-time data. A bank pursuing a cloud migration needs to extract and restructure decades of historical records. A bank exploring open banking or fintech partnerships needs data in formats that modern systems can consume. In each case, the CSP’s data architecture becomes the bottleneck and the CSP has limited economic incentive to remove it. Data lock-in reinforces vendor lock-in, and vendor lock-in is good for providers, not for banks.

Getting your data out is often more complex and costly than most banks expect. When banks do attempt data extraction or conversion, the barriers are formidable. From our advisory work, the challenges fall into predictable categories. Technical complexity is the first hurdle: legacy data often contains inconsistencies, duplicates, and format variations accumulated over decades, requiring extensive cleansing before it can be used in modern environments. Mapping proprietary schemas to standardized formats is labor-intensive and error-prone, and the risks during migration, such as data loss, corruption, and operational disruption, are real.
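To make the mapping-and-cleansing work concrete, here is a minimal sketch in Python. It assumes a hypothetical legacy core export with inconsistent date and balance encodings and maps it to a simplified standardized schema; the field names, formats, and validation rules are illustrative assumptions, not any specific provider’s layout.

```python
from datetime import datetime

# Hypothetical legacy core export: field names, date formats, and balance
# encodings are illustrative assumptions, not any specific CSP's layout.
legacy_records = [
    {"ACCT_NO": "0012345 ", "OPEN_DT": "03151998", "CUR_BAL": "000012550", "STAT": "A"},
    {"ACCT_NO": "0012345",  "OPEN_DT": "03151998", "CUR_BAL": "000012550", "STAT": "A"},  # duplicate
    {"ACCT_NO": "0067890",  "OPEN_DT": "19991231", "CUR_BAL": "N/A",       "STAT": "C"},
]

def parse_date(raw: str):
    """Legacy extracts often mix date layouts; try each known format."""
    for fmt in ("%m%d%Y", "%Y%m%d"):
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    return None  # flag for manual review rather than guessing

def to_standard(record: dict) -> dict:
    """Map one legacy record to a simplified standardized schema."""
    balance_raw = record["CUR_BAL"].strip()
    return {
        "account_id": record["ACCT_NO"].strip().lstrip("0"),
        "opened_on": parse_date(record["OPEN_DT"].strip()),
        "balance_cents": int(balance_raw) if balance_raw.isdigit() else None,
        "status": {"A": "active", "C": "closed"}.get(record["STAT"].strip(), "unknown"),
    }

# Deduplicate on the normalized account id, then route clean rows and exceptions.
seen, converted, exceptions = set(), [], []
for rec in legacy_records:
    row = to_standard(rec)
    if row["account_id"] in seen:
        continue
    seen.add(row["account_id"])
    (exceptions if None in row.values() else converted).append(row)

print(f"converted={len(converted)} exceptions={len(exceptions)}")
```

In a real conversion this logic runs against millions of rows across dozens of file types, and the exception queue is what drives the manual cleansing effort, which is where much of the cost and timeline risk lands.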

Cost is the second. Data extraction and conversion projects routinely run into six figures, encompassing consulting, technology, testing, and provider deconversion fees. For a bank or credit union, these expenses compete directly with lending capacity, staff investments, and customer-facing initiatives. Provider contracts often compound the problem through opaque fee structures for data access, penalties for early termination, or charges for delivering data in non-proprietary formats. The economics are designed to discourage movement.

The third barrier is fear of regulatory scrutiny. Banks have told us, and told the OCC in its listening sessions, that they worry data conversion projects could invite heightened examiner attention. The TPRM Guidance emphasizes thorough due diligence and oversight of any significant changes in third-party relationships, and banks fear that a conversion that encounters problems could generate supervisory criticism. This creates a perverse incentive: banks defer modernization not because the business case is weak, but because the regulatory risk feels disproportionate to the reward.

Some banks are exploring alternatives to CSP dependency by developing their own technology solutions, through subsidiaries, joint ventures, or shared service entities. The logic is sound: if the provider market will not deliver what you need, build it yourself. But the path is not straightforward. Most community and regional banks lack the scale or capital to build proprietary technology independently. Regulatory uncertainty about permissible investment structures adds friction. And even banks that want to invest face questions about how far they can go without triggering heightened supervisory expectations around operational risk, model governance, or capital allocation.

Joint ventures and consortia offer a more practical path. When multiple banks pool resources to develop shared platforms, open-source tools, or collaborative data infrastructure, the per-bank cost drops dramatically and the risk is distributed. We have seen early-stage models of this approach succeed, particularly where banks partner with fintechs on specific capabilities rather than attempting to replace the entire core. The challenge is coordination, governance, and, again, regulatory clarity about how such arrangements are supervised.

Can your regulator help? The short answer is yes: the regulators (OCC, FDIC, and FRB) have meaningful levers they can pull. From CCG Catalyst’s perspective:

  1. Issue safe harbors for data extraction and modernization that meet predefined risk management criteria, such as standardized protocols for secure transfers and phased testing. This would directly address the fear of supervisory blowback that deters many banks from acting.

  2. The regulators should also clarify that banks may invest in or form technology-focused subsidiaries and joint ventures, publishing interpretive letters that affirm permissibility under the National Bank Act, paired with proportionate risk management expectations.

  3. Reduce redundant reporting requirements, as the regulators have already begun doing with BSA/AML adjustments; this would free resources that banks could redirect toward data projects. For example, programs like the OCC’s Project REACh could expand to support consortia-based data modernization, providing a supervised but innovation-friendly framework for collaborative investment.

  4. On the provider side, the regulators should use their Bank Service Company Act examination authority to scrutinize data access practices, including fees, format restrictions, and contractual barriers, as part of routine oversight.

  5. My last recommendation: create an open, publicly searchable database of provider data portability practices. This would inject transparency into a market that currently operates in the dark.


Data is the raw material of modern banking. Banks that cannot access, extract, and leverage their own data will be unable to adopt AI, pursue open banking, personalize customer experiences, or modernize operations in any meaningful way. The current dynamic, where providers hold the keys to banks’ most valuable asset, is unsustainable. Breaking the cycle requires action from all sides: banks must negotiate harder on data rights and pursue collaborative alternatives; providers must recognize that data portability is a competitive differentiator, not a threat; and regulators must create an environment where modernization is encouraged rather than feared.

CCG Catalyst helps banks and credit unions develop data strategies, negotiate extraction terms, and plan modernization initiatives that align with both business goals and regulatory expectations. Reach out to our team for tailored guidance. Stay tuned for the next installment in our series: interoperability, API barriers, and cybersecurity challenges.
