I recently attended the Snowflake Financial Services Data Summit, where some of the most influential people in financial services discussed how “organizations meet ever-changing customer expectations while automating routine and mission-critical processes, meeting and exceeding regulatory requirements, and transforming their business models with data to unlock new revenue streams.”
Speakers at the summit included industry veterans like Shelly Swanback (President of Product & Platform at Western Union), Stacey Cunningham (President at NYSE), Sudhir Nair (Global Head of the Aladdin Business at BlackRock), Naresh Shetty (Chief Architect at Cognizant), Spiros Giannaros (EVP at State Street) and Marc Rind (Chief Technology Officer of Data at Fiserv).
While the speakers shared lots of great content and wisdom, a few themes stood out. Below are my five key takeaways about how these industry leaders, product visionaries, and technical specialists use the power of data to redefine their businesses and tackle the data challenges faced in their industry.
1. Always possess a customer-centric mindset
Understanding your customers at a granular level is key to success, especially when providing services directly to consumers. Western Union is a good example. Their core business is helping people and businesses move money around the globe.
Western Union explained how they use dynamic pricing and sophisticated models to leverage more than 2,000 variables from internal data sources and several external data sources. At the event, they revealed that they are in the process of developing data models that understand where the customer friction points are so that their 500,000+ agent locations can provide the best experience for their 150 million retail and digital customers.
Bringing a customer-centric mindset to the creation and management of all their data projects gives Western Union a competitive edge over its peers.
2. Enrich data with external sources
Using external data to drive business results in financial services is not new; it began with hedge funds but has quickly expanded to other alternative asset investors. Early adopters tried to mine every type of data source they could get their hands on in an attempt to beat the market. Nowadays, this quest for data and insights spans the entire spectrum of financial services firms. According to McKinsey, over 90% of life insurance policies are now underwritten using external or third-party prescription histories, and according to Gartner, by 2022, 35% of large organizations will either sell or buy data via formal online data marketplaces.
As the President of NYSE, Stacey Cunningham, explained at the Snowflake financial services event, the key is to provide a data marketplace that’s flexible and accessible to everybody in the firm because it’s hard to predict what data could be valuable and how it could be used.
3. Prepare your architecture for multiple regions and multiple clouds
Many financial services firms operate globally and are subject to different regulatory requirements across jurisdictions, which legally mandate how they process and store data. As a result, putting data physically in one place may not be feasible or desirable. Firms need to plan for these localized variables and idiosyncratic details when designing their data architecture.
From a governance perspective, firms need to have a global framework in place for data governance. The framework needs to engage local teams, understand their different requirements, and implement different policies accordingly.
From an engineering perspective, because their data must reside in different clouds or regions, firms need to account for this by providing a data marketplace through which analysts at headquarters or in other regions can access the data they need. In addition, data replication services and the corresponding data pipelines need to be in place to move data around without affecting local operational and analytical use of the same data.
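The interplay between governance policy and engineering can be sketched in a few lines. The following is a hypothetical illustration, not any vendor's API: dataset names, regions, and the replica-approval table are all invented to show the idea of routing analysts to a local, policy-approved replica instead of moving the primary copy across borders.

```python
# Hypothetical sketch: route analyst queries to a local replica when data
# residency rules forbid accessing the primary copy from another region.
# All dataset names, regions, and mappings below are illustrative.
RESIDENCY = {"eu_customers": "eu-west", "us_trades": "us-east"}  # dataset -> home region
REPLICAS = {("eu_customers", "us-east"): "eu_customers_replica"}  # approved replicas

def resolve_table(dataset: str, analyst_region: str) -> str:
    """Return the table an analyst should query from their own region."""
    home = RESIDENCY[dataset]
    if analyst_region == home:
        return dataset  # local access to the primary copy
    replica = REPLICAS.get((dataset, analyst_region))
    if replica is None:
        # No approved replica: the global governance framework blocks access.
        raise PermissionError(f"{dataset} may not be accessed from {analyst_region}")
    return replica  # a replication pipeline keeps this copy in sync

print(resolve_table("eu_customers", "eu-west"))   # primary copy
print(resolve_table("eu_customers", "us-east"))   # approved replica
```

In a real deployment this policy check would live in the governance layer, with replication handled by the platform; the point is that local rules and global access can coexist when both are made explicit.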
4. Consolidate data to break down silos
This point may seem to contradict the previous one, but financial institutions should also bring data from different sources into one place for consumption.
We don’t need to revisit the decades-old debate about aggregated vs. federated data strategies for breaking down silos. The vital point is that siloed insights prevent financial institutions from assessing enterprise risk accurately, which can be disastrous for the whole market. That’s also why the SEC implemented the Consolidated Audit Trail, one of many regulations on data consolidation for exchanges, vendors, and brokers to find bad actors and manage their risks.
NYSE is helping its clients analyze the climate risks of specific zip codes by overlaying different data sources, and allowing them to combine ADP (payroll) data across business lines for better risk management.
In another example of this idea in action, State Street provides investment firms and asset owners a unified, actionable view of their internally generated data, enriched with external third-party sources. In short, financial institutions need to provide a zero-friction environment for blending and joining data together with high performance in order to compete with digital-native firms like Square and Upstart.
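The "blend and join" pattern described above can be sketched with standard SQL. This is a minimal illustration using Python's built-in sqlite3 as a stand-in for a cloud warehouse; the tables, columns, and risk scores are invented, not actual NYSE or State Street data.

```python
import sqlite3

# In-memory SQLite stands in for a cloud data warehouse.
# Tables and values are purely illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE positions (account_id TEXT, ticker TEXT, market_value REAL);
CREATE TABLE climate_risk (ticker TEXT, risk_score REAL);  -- external third-party feed
INSERT INTO positions VALUES ('A1', 'XYZ', 1000000), ('A1', 'ABC', 500000);
INSERT INTO climate_risk VALUES ('XYZ', 0.8), ('ABC', 0.2);
""")

# Blend internal holdings with external risk scores in a single join --
# no file exports or brittle point-to-point feeds required.
rows = con.execute("""
    SELECT p.account_id, p.ticker, p.market_value, r.risk_score,
           p.market_value * r.risk_score AS value_at_risk
    FROM positions p JOIN climate_risk r ON p.ticker = r.ticker
""").fetchall()
for row in rows:
    print(row)
```

Once both datasets live in the same consumption layer, enrichment is a query rather than an integration project, which is exactly the zero-friction environment the speakers described.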
5. Move everything to the cloud
According to Tamr co-founder and Turing Award winner Dr. Michael Stonebraker, not moving everything to the cloud is the number one blunder in big data analytics. And according to Forbes, 83% of workloads will run in the cloud by 2021. Modern cloud database systems are designed to scale out natively and to simplify the operation and maintenance of large quantities of data.
From a technical perspective, the raw compute power and cheap storage of the cloud let data engineers eliminate risky and unnecessary ETL and do ELT instead (see ETL vs. ELT). Data analysts can collaborate through data sharing and accelerate their projects through faster iteration and experimentation.
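The ETL-to-ELT shift can be made concrete with a small sketch. Here Python's built-in sqlite3 again stands in for a cloud warehouse, and the trade records are invented: raw data is loaded untouched, and the "T" happens inside the warehouse as a query that analysts can rewrite at will.

```python
import sqlite3

# Illustrative raw extract: dates, tickers, and amounts arrive as strings.
raw_trades = [
    ("2021-06-01", "xyz", "1000.50"),
    ("2021-06-01", "abc", "250.00"),
]

con = sqlite3.connect(":memory:")  # stand-in for a cloud warehouse

# ELT: load the raw data as-is. In classic ETL, casting and normalization
# would happen in the pipeline first, so changing the transformation logic
# means re-running extraction; here the pipeline stays dumb and cheap.
con.execute("CREATE TABLE raw_trades (trade_date TEXT, ticker TEXT, amount TEXT)")
con.executemany("INSERT INTO raw_trades VALUES (?, ?, ?)", raw_trades)

# The "T" runs in the warehouse: cast, normalize, and aggregate on demand.
total = con.execute("""
    SELECT UPPER(ticker) AS ticker, SUM(CAST(amount AS REAL)) AS total
    FROM raw_trades GROUP BY UPPER(ticker) ORDER BY ticker
""").fetchall()
print(total)
```

Because the transformation is just SQL over already-loaded data, analysts can iterate on it in minutes, which is the faster-experimentation benefit the paragraph above describes.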
Overall, the Snowflake Financial Services Data Summit did a great job highlighting some of the pressing challenges facing the industry and bringing forward potential solutions, both technical and strategic, to address them. It’s clear that the largest firms recognize the value of data and are looking to maximize its potential to grow their businesses. Some of these same financial services organizations use Tamr to master their business-critical data to better serve existing customers, expand their business, and help reduce risk.