As those of you in the data business know, last week was the annual Gartner Data & Analytics Summit. If you attended in person, we hope you had a chance to visit our booth and use our flight simulator to become a data maverick. The summit was not only a venue for people to share what they've learned, but also a celebration of the achievements of the data and analytics industry.
While the speakers shared great content and wisdom across more than 200 sessions, a few themes stood out. Below are my seven key takeaways about how these industry leaders, product visionaries, and technical specialists use the power of data to redefine their businesses and tackle the data challenges faced in their industries.
- Analytics and BI are Becoming More Automated
In a recent survey, Gartner asked data leaders “what is the most important emerging capability you’ll require when selecting a new analytics and BI platform?” The response: 44% answered “automated insights” and 25% replied “data storytelling.” According to Rita Sallam, Distinguished VP Analyst at Gartner, “by 2025, data stories will be the most widespread way of consuming analytics, and 75% of stories will be automatically generated using augmented analytics techniques.”
On the one hand, business complexity is increasing, bringing more of everything: more uncertainty, more disruption, more data volume, more diverse data, and more complex scenarios. At the same time, more people, often with less skill, require more advanced types of analysis and faster responses. On the other hand, the world is becoming consumerized and highly contextualized, and analytics and BI must follow suit.
The recommendations? Gartner suggests augmenting the consumer by incorporating new, “beyond the dashboard” capabilities into your strategy and operating model. Companies should expand their impact by making data literacy and analytics skills a baseline requirement for analytics and BI (ABI) access. Gartner also recommends fostering collaboration and building communities. From an organizational perspective, they suggest reducing the impact of change management by planning for possible changes to roles, responsibilities, and skills.
- Be Value-Centric in Your Data Strategy
According to Gartner, “by 2025, 70% of public companies that outperform competitors on key financial metrics will also report being data and analytics centric.” As we have said before, making the business case for data management is essential for the success of any innovation program.
Do not fall into the trap of only expressing IT value. Define, measure, and communicate data and analytics value creation and its impact on critical business KPIs. Work with the CFO to report data and analytics (D&A) business impact. Make business outcomes the center of your D&A strategy. Establish a foundation of financial governance and build prioritization models to determine budget allocation. And connect the cost of workloads to business value.
Implement a framework and tool for assessing “net business value” to rank, prioritize, and select the best portfolio of data and analytics initiatives, so you can optimize for business value and align with strategic priorities. Use value stream mapping to identify potential business collaborators. Establish cross-functional teams and agile practices within value streams, evolve them into fusion teams, and measure results.
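To make the idea of a "net business value" prioritization model concrete, here is a minimal sketch. The initiative names, dollar figures, and the alignment-weighted scoring formula are all hypothetical illustrations, not a framework Gartner or anyone else prescribes:

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    name: str
    business_value: float  # estimated annual value (e.g. in $k) -- hypothetical
    cost: float            # estimated annual cost (e.g. in $k) -- hypothetical
    alignment: float       # strategic-alignment weight, 0.0-1.0 -- hypothetical

def net_business_value(i: Initiative) -> float:
    # One possible scoring rule: net value (value minus cost),
    # scaled by how well the initiative aligns with strategy.
    return (i.business_value - i.cost) * i.alignment

def prioritize(portfolio: list[Initiative]) -> list[Initiative]:
    # Rank the portfolio so the highest-scoring initiatives come first.
    return sorted(portfolio, key=net_business_value, reverse=True)

portfolio = [
    Initiative("customer-360 MDM", business_value=900, cost=400, alignment=0.9),
    Initiative("legacy-report migration", business_value=300, cost=250, alignment=0.4),
    Initiative("supplier data quality", business_value=500, cost=200, alignment=0.7),
]

for i in prioritize(portfolio):
    print(f"{i.name}: {net_business_value(i):.0f}")
```

A real model would fold in risk, time to value, and workload cost, but even a simple weighted score like this forces the conversation about which initiatives actually move business KPIs.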
- Master Data Management (MDM) is Front and Center of Data Innovation
MDM is a critical enabler of digital transformation strategies, and it was clear others agree: Gartner analysts, industry leaders, vendors, and practitioners dedicated multiple sessions to MDM. According to Gartner, MDM is a dependency of any digital strategy, and data and analytics leaders must clearly articulate its value to ensure stakeholder support.
Our Chief Product Officer, Anthony Deighton, together with Harveer Singh, the Chief Data Architect and Global Head of Engineering at Western Union, shared how trusted data powers customer service initiatives with Western Union’s more than one billion customer records.
Making a connection between investments in master data and the delivery of business outcomes is a requirement for MDM program success. Both IT and the business benefit when you have well-defined metrics of MDM program success.
Gartner recommends focusing on measurable business outcomes and prioritizing your MDM roadmap based on business rewards. Be agile and take an MVP approach that minimizes time to value. Collaborate with stakeholders to build a business case for each phase of the program. And continually measure, collectively build, and widely communicate the value of MDM.
- Speed up your MDM Implementation with Best Practices
In a world of digital acceleration, speed matters. Traditional MDM programs can take months or even years to launch, but it doesn’t have to be that way. A sound strategy built on fundamental data management best practices can not only help organizations embark on a much-needed MDM journey but also accelerate it. Avoid letting perfection get in the way of MDM progress. Staying laser-focused on time to value will ensure you make the right choices.
Gurpinder Dhillon and Joseph Santos from Dun & Bradstreet also shared their recommended MDM best practices so that your MDM journey does not have to be complex. Assess master data readiness and benchmark data quality early to establish a baseline of understanding. Create common identifiers for entities so you can build a single, trusted view of a business entity throughout the relationship lifecycle. Implement deliberate hierarchies so you have clarity on critical business relationships and can provide a holistic view of these key entities. Create connections between legacy, modern, and new architectures, platforms, and agendas. And ensure that real-time access to value-added data exists to support more informed decision-making.
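The "common identifier" practice above can be illustrated with a toy sketch. This is not Dun & Bradstreet's or Tamr's actual matching logic; the normalization rules and sample records are invented for illustration, and real mastering handles far messier variation:

```python
import re
from collections import defaultdict

def normalize(name: str) -> str:
    # Crude illustrative normalization: lowercase, strip punctuation,
    # and drop common corporate suffixes to form a common key.
    key = re.sub(r"[^\w\s]", "", name.lower())
    key = re.sub(r"\b(inc|llc|ltd|corp|co)\b", "", key)
    return " ".join(key.split())

def assign_common_ids(records: list[dict]) -> dict[str, list[dict]]:
    # Group records from different source systems under one
    # common identifier per real-world entity.
    entities: dict[str, list[dict]] = defaultdict(list)
    for rec in records:
        entities[normalize(rec["name"])].append(rec)
    return dict(entities)

records = [
    {"source": "crm", "name": "Acme Corp."},      # hypothetical source data
    {"source": "billing", "name": "ACME Corp"},
    {"source": "crm", "name": "Globex Inc."},
]
entities = assign_common_ids(records)
print(len(entities))  # two distinct entities despite three source records
```

Once every source record resolves to the same identifier, downstream systems can build the trusted, lifecycle-wide view of each entity that the practice describes.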
- Operationalize or Marginalize
According to Peter Krensky, Director Analyst, and Farhan Choudhary, Analyst, at Gartner, “By 2023, ease of migration, interoperability, and coherence will be deciding factors in 90% of data science, machine learning and AI platform buying decisions” and “by 2024, organizations that lack a sustainable data and analytics operationalization framework will have their initiatives set back by up to two years.”
New opportunities and risks will heighten the need to shift and scale quickly. Enterprises report spending 90% or more of their time on data engineering. Moreover, many data quality, MDM, and data catalog initiatives create silos. That’s why many analytics and AI projects fail: operationalization is an afterthought.
The top barrier to scaling is integration complexity. Market immaturity leads to overlapping capabilities across analytics and AI pipeline operationalization tools and techniques. Shifts to the cloud and D&A market convergence create new opportunities to leverage coherent ecosystems to drive value, allowing companies to avoid overreliance on a single commercial platform.
- Data Mastering, Data Mesh, and Data Fabric
Nghi Ho, Head of Data & AI Platform and Enterprise Data Governance at Gilead Sciences, introduced their architecture and journey of data mesh. According to Nghi, monolithic data platforms and approaches do not scale, so organizations need a data mesh strategy; and top-down governance does not work, so they need a federated governance approach.
What’s inherent in data mesh is the belief that data is more distributed. And because it’s more distributed, you’ll need a way to resolve differences in the data. This is really hard. Data fabrics and data mesh help provide the semantic view across the silos, but they lack centralized governance. A data mastering strategy unifies data by entity and establishes shared agreements on top of the data fabric and data mesh.
According to Ehtisham Zaidi, VP Analyst at Gartner, “by 2025, active metadata assisted automated functions in the data fabric will reduce human effort by half and quadruple data utilization efficiency.” To elevate a data-driven culture and drive new revenue, you need to treat data and analytics as an asset across the enterprise to help deliver and democratize access to trusted data and analytics – with speed and efficiency.
- Data + Technology for the Win
Our very own Matt Holzapfel, together with Liz Barrette from Dun & Bradstreet, showed how Tamr and Dun & Bradstreet are working together to make Master Data as a Service a reality. MDM began when technologies like machine learning (ML) were science fiction. Today, that’s no longer the case. Tamr’s machine learning is a proven technology that helps clean and curate data in record time. Combined with Dun & Bradstreet’s world-class commercial enrichment data and their expert advisory services, there has never been a faster path to clean, curated master data.
For several weeks, Dun & Bradstreet and Tamr have been pre-training the ML model with Dun & Bradstreet data and sample customer data as well as pre-configuring the platform to create an out-of-the-box solution. Dun & Bradstreet’s advisory services will be the human guides observing the out-of-the-box model and, when appropriate, will optimize the Dun & Bradstreet match and tune the clustering model.
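To give a feel for what "tuning the clustering model" means in a mastering context, here is a toy sketch. It uses `difflib.SequenceMatcher` as a crude stand-in for a trained ML match model, and the names and the similarity threshold are invented; Tamr's actual pre-trained models work very differently:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Pairwise string similarity, standing in for a learned match score.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def cluster(names: list[str], threshold: float = 0.85) -> list[set[str]]:
    # Greedy clustering: attach each record to the first cluster
    # containing a member it matches above the threshold.
    clusters: list[set[str]] = []
    for name in names:
        for c in clusters:
            if any(similarity(name, member) >= threshold for member in c):
                c.add(name)
                break
        else:
            clusters.append({name})
    return clusters

# Hypothetical records: a typo variant of one company plus a distinct one.
names = ["Acme Corporation", "Acme Corporaton", "Globex LLC"]
print(len(cluster(names)))
```

Raising or lowering `threshold` is the simplest form of the tuning described above: too strict and duplicates survive as separate entities, too loose and distinct companies collapse into one cluster, which is exactly the trade-off a human advisor reviews.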
Overall, this year’s Gartner Data & Analytics Summit did a great job highlighting some of the pressing challenges faced by the industry and bringing forward some potential solutions — both technical and strategic — to address them. It’s obvious that the world’s leading organizations recognize the value of data and are looking to maximize its potential to grow their businesses. In fact, some of these same organizations use Tamr to master their business-critical data to better serve existing customers, expand their business, and help reduce risk.