Clouded Judgement: Key Considerations for Financial Institutions Still Debating Migrating to Cloud

Last week, Suki Dhuphar and I talked about the surge of new public cloud partnerships with financial services firms that took place in July (HSBC & AWS, National Australia Bank & Azure, Deutsche Bank & GCP). While the month appears to have been a turning point for cloud adoption, Suki raised an important question: for those financial companies that haven't made the change, what's stopping them?

Here are a few considerations that we believe are top of mind.

  1.   Uncertainty about data sovereignty

Unsurprisingly, the issue of data sovereignty remains a primary concern, particularly for our customers in Europe. Even when cloud providers are used, the physical location of the data centre must be considered to comply with each country's data privacy regulations. In 2019, the European Banking Authority issued outsourcing guidance noting, quite vaguely, that "institutions should adopt a risk-based approach to data storage and data processing location(s) and information security considerations." GDPR taught us that while initiatives might be housed in the Union, privacy and regulatory decisions have far-reaching implications for global players with European exposure. In February, the EU released a proposal for the creation of a European single market for data under the European Data Strategy, part of a three-part package within the EU's Digital Strategy. Its central theme is data sovereignty, and it shows the continued will within the EU to reduce dependency on foreign data infrastructure and "net states" – powerful non-state digital actors such as Google, Amazon and Microsoft.

A key solution we offer our customers is data classification: to ensure compliance, financial institutions must be aware of the contextual use of their data and look to classify it accurately. Traditional rules-based approaches to labelling simply don't scale to the breadth of systems that a cloud transition encompasses. While data governance has many elements to consider, good data quality is at the foundation.
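As a minimal sketch of why rules-based labelling struggles to scale (all rules, labels and sample values below are invented for illustration), consider a classifier built on hand-written patterns – every new source system, field or format means another rule to write and maintain:

```python
import re

# Hypothetical rules-based classifier: each label is a hand-written pattern.
# Adding a new system or data format means adding (and maintaining) new rules,
# which is why this approach struggles across a broad cloud migration.
RULES = {
    "IBAN": re.compile(r"^[A-Z]{2}\d{2}[A-Z0-9]{11,30}$"),
    "EMAIL": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

def classify(value: str) -> str:
    """Return the first rule label that matches, or 'UNCLASSIFIED'."""
    for label, pattern in RULES.items():
        if pattern.match(value):
            return label
    return "UNCLASSIFIED"

print(classify("alice@example.com"))       # EMAIL
print(classify("DE89370400440532013000"))  # IBAN
print(classify("Frankfurt"))               # UNCLASSIFIED
```

Across hundreds of migrating systems the rule catalogue grows without bound, and any value the rules don't anticipate falls through as unclassified – the gap that machine-learning-driven classification, which learns from context rather than patterns alone, aims to close.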

  2.   Talent acquisition

For financial institutions that have long relied on waterfall methods for data management while competing with the tech-giant cloud providers for talent, internal enablement is not easy. The major investment banks have been investing heavily in cloud-focused engineers. In 2018, JP Morgan Chase launched an engineering hub in Seattle – home of Amazon and Microsoft – to focus on recruiting cloud security developers. A migration to the cloud underlines the broader need to collect, store, analyze, use and disseminate data easily at scale – the need for a DataOps mentality. We have been working with our customers to help them adopt an agile approach to building next-generation data infrastructure.

  3.   Up-front costs

While cost savings are often a primary motivator behind cloud migrations at financial institutions, the initial numbers might not appear so attractive. Legacy infrastructure takes time and resources to transition, which means there will be a lag before return on investment (ROI) is achieved. The sector is expected to continue to face major financial headwinds through the second half of 2020, and loan loss provisions skyrocketed in the Q2 announcements. Such pressure can, understandably, cause short-sightedness and cuts or delays to project budgets, even if the marginal costs are favourable.

However, delaying the move to cloud ultimately delays the path to potential return, as the hidden costs and business-model burden of legacy systems mount. According to Bloomberg, Deutsche Bank expects to make a cumulative return on investment of €1 billion (US$1.1 billion) from its partnership with Azure – it is not a one-year calculation. Financial institutions must fight the urge to focus on sunk costs and instead take into account the full cost and opportunities enabled by their data infrastructure choices. While it's easy to do an apples-to-apples comparison of current data usage across on-prem, hybrid and public cloud, doing so undersells the value of cloud, because the current status quo of data use shouldn't be assumed going forward. The elastic compute infrastructure that cloud enables empowers banks to be nimble in deploying machine learning at scale. In this turbulent year, it is time for CIOs and CDOs to act and use the short-term focus on digitization and margins as a pivot to increase public-cloud usage for long-term sustainability.
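The multi-year nature of cloud ROI can be illustrated with a toy calculation (every figure below is invented for illustration, not Deutsche Bank's numbers): a large up-front migration cost, followed by savings that ramp up as workloads move over.

```python
# Hypothetical illustration: cumulative ROI of a cloud migration.
# All figures are assumed, in €m; a one-year view sees only the loss,
# while the multi-year view reaches break-even and beyond.
migration_cost = 300.0                      # up-front cost (assumed)
annual_savings = [40, 80, 120, 160, 200]    # ramp-up per year (assumed)

cumulative = -migration_cost
break_even_year = None
for year, saving in enumerate(annual_savings, start=1):
    cumulative += saving
    if break_even_year is None and cumulative >= 0:
        break_even_year = year
    print(f"Year {year}: cumulative ROI €{cumulative:+.0f}m")

print(f"Break-even in year {break_even_year}")
```

Under these assumed numbers, the first year shows a deep loss and break-even only arrives in year four – exactly the kind of horizon a budget frozen on one-year marginal cost will never see.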

How Tamr Helps Financial Institutions Benefit from the Move to the Cloud

The pay-as-you-go cost advantages of cloud might be the initial driver behind the move. However, the most important value it unlocks, given the flexibility of compute, is the speed at which data can be applied to end-business problems using machine learning models. This relies upon the data being mastered: siloed, messy data, whether on-prem or in the cloud, makes those business challenges hard to address. Data mastering through Tamr has enabled Scotiabank to address regulatory challenges for AML, one of the world's largest money managers to integrate external reference data sources, and Santander to gain a holistic customer view for credit lending decisions.

We believe our customers should always have the flexibility to choose the data storage strategy that best fits their needs, whether it's on-prem, public cloud or a hybrid. However, in our experience to date, helping customers master data in the cloud helps them realize value faster than ever before. Tamr is a cloud-native vendor – we are banking on a cloudy outlook.