Detangling Data in the Pursuit of AI
Here at Celent we have just published the results of our Celent Technology Insight and Strategy Survey, 2023. For the corporate banking market, we asked executives from 214 banks around the world to rank their top priorities for technology and product strategy. The results are available to Celent clients in the full report, Corporate Banking Global IT Priorities and Strategy in 2023, but a headline synopsis of themes is available to anyone via a brief webinar.
We ran this survey at the end of Q1 and into early Q2 2023. At that time, of course, there were breaking news stories about generative AI and ChatGPT almost every day, and the impact on a few high-flying tech stocks remains pronounced to this day. Perhaps unsurprisingly, advanced analytics and ML (1st overall) and AI and NLP (4th overall) were both ranked very highly as corporate banking technology priorities.
As several of my colleagues have written, AI has huge potential to transform banking, but we cannot overlook the role of the underlying data. In many industries, data is a byproduct of operations: some physical activity occurs to build, distribute, or sell inventory, and data is used to measure that activity and predict business performance. For banks, data is the business. Banking operations consist of live data origination and processing to create and modify financial transactions. That data is the lifeblood of the operational organization.
The objectives of data strategy and data management are not merely to warehouse data for reporting and to expose it through insight and analytics tools. It is essential that data management strategy and architecture recognize the difference between transactional/operational data used for high-volume transaction execution and data at rest used for reporting, business intelligence (BI), and analytics. AI can be applied in both scenarios. Defining and implementing an infrastructure and engineering architecture that combines transactional processing with AI/ML is perhaps the hardest state to achieve: it requires the agility of advanced analytics in a real-time environment, delivered at industrial strength.
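To make that distinction concrete, the sketch below is a minimal, purely illustrative Python example contrasting the two paths: an ML model called inline during transaction execution versus the same records analyzed at rest. The PaymentInstruction record, the score_transaction and nightly_analytics functions, and the model's predict interface are hypothetical, not drawn from any specific bank platform.

```python
# Illustrative sketch only: real-time scoring in the transaction path
# versus batch analytics over data at rest. All names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class PaymentInstruction:
    payment_id: str
    debtor_account: str
    creditor_account: str
    amount: float
    currency: str


def score_transaction(payment: PaymentInstruction, model) -> float:
    """Operational path: the model is invoked inline, so latency,
    availability, and data quality must meet transaction-processing SLAs."""
    features = {
        "amount": payment.amount,
        "hour_of_day": datetime.now(timezone.utc).hour,
    }
    # Hypothetical model interface returning, e.g., a fraud or credit-risk score.
    return model.predict(features)


def nightly_analytics(transactions: list[PaymentInstruction]) -> dict:
    """Data-at-rest path: the same records, copied to a warehouse or lake,
    feed reporting, BI, and model training on a relaxed schedule."""
    total_value = sum(t.amount for t in transactions)
    return {"count": len(transactions), "total_value": total_value}
```

The point of the contrast is that the first path imposes operational constraints (millisecond latency, industrial-strength resilience, clean data at source), while the second tolerates batch timelines, which is why combining the two is the hardest architecture to get right.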
I recently dug into this with Detangling Data: the Art and Science of Managing Banking Data. While it is encouraging to see banks eagerly push deeper (and more broadly) into AI technologies, doing so effectively requires a well-defined and well-executed data strategy aligned to data management policy and standards. The promise of AI cannot be achieved at scale without significant effort in data engineering and data stewardship. If data is the "raw material" of banking, data management is the "scaffolding" within which banks can protect and elevate the value of their data assets through sound governance and good technology decisions.
Furthermore, better data quality at source reduces the effort needed to manage and fix issues downstream. As banks evaluate business solutions that acquire data (for example, through document automation, KYC validation, or loan underwriting and risk calculations), they should assess how well those solutions support overall data strategy and data quality objectives. This could become a criterion in business case evaluation as banks aim to improve data quality and analytics capabilities. Leading banks therefore place high importance on the convergence of product management, operations, data strategy, and architecture, all aligned to a data management strategy that protects data assets and leverages data and AI for growth.