Managing the New Currency: Developing a High-Performing Data Management Organization

Don Loden, Managing Director Data and Analytics

These days, the hot analogy in the analytics industry is that “data is the new oil.” Like oil, data must be found, extracted, refined and distributed. More companies are investing heavily in cutting-edge technologies like machine learning, artificial intelligence and big data processing to help them harvest, groom and apply data to the correct use case for maximum value. So why, then, in the midst of this prospectors’ rush, do studies of business intelligence (BI) implementations repeatedly indicate that 60 to 85 percent of BI projects fail?

While technology is changing rapidly, most data management practices have stagnated. Traditionally, the IT team has been seen as an all-knowing and all-capable “data priest,” producing the exact report requested by the business. We’ve seen businesses focus heavily on acquiring and storing data as cheaply as possible while neglecting the equally important aspects of business use and governance. As a result, data management organizations (DMOs) are often unable to withstand the waves of change from sources such as new technology, organizational drivers and government regulations like the General Data Protection Regulation (GDPR).

Armed with that historical knowledge, I want to offer a few considerations for organizations to take into account when analyzing their DMOs.

The Business Knows Best

As many businesses can attest, end users who do not understand the data they receive from an analytical solution will take matters into their own hands, often with chaotic consequences. This happens because they either do not trust the data or cannot use it for their particular use case. We have found that bringing business experts who use the data frequently into the planning and implementation of a BI initiative is fundamental to that effort’s success. Not only do people in the business create the data, they are also the ones who can accurately dictate use cases and predict future needs. This, however, is a major cultural shift in traditional data management, as the company’s IT department moves from the role of data developer to that of enabling data access for end users.

The experts to include in a high-performing DMO are individuals who not only understand the data and how it is organized but also carry business-process expertise. As the amount of data in today’s organizations skyrockets, these experts must be able to separate the relevant information from the background noise. A structured data stewardship program is also crucial for maintaining data standards, definitions and accountability. Finally, executive buy-in is a key success factor for any initiative, particularly this one.

Data Quality Is King

No matter how well-structured a data organization may be, poor data quality can severely hamper end-user adoption. After all, what good is data if it cannot be used? Organizations should not overlook issues like data duplication, normalization inconsistencies, stale data or conversion errors. Fixing these issues should be recognized as a priority, and the payoff is measurable over time. Data standardization can also be a huge boon to a DMO; once definitions are standardized, business users can stop wondering, once and for all, how sales revenue is calculated.
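To make these issues concrete, the following is a minimal sketch of the kinds of checks involved. It assumes a hypothetical pandas DataFrame of customer records; the column names and thresholds are illustrative, not prescriptive.

```python
import pandas as pd

# Hypothetical customer snapshot; column names are illustrative.
records = pd.DataFrame({
    "customer_id":  [101, 101, 102, 103],
    "state":        ["TX", "tx", "New York", "CA"],
    "last_updated": pd.to_datetime(
        ["2023-01-15", "2023-01-15", "2019-06-01", "2023-03-20"]),
})

# Data duplication: flag rows that repeat the business key.
duplicates = records[records.duplicated(subset="customer_id", keep=False)]

# Normalization inconsistencies: standardize free-text state values.
state_map = {"tx": "TX", "new york": "NY", "ca": "CA"}
records["state"] = (
    records["state"].str.strip().str.lower().map(state_map)
    .fillna(records["state"].str.upper())
)

# Stale data: flag records untouched for more than two years.
cutoff = pd.Timestamp.today() - pd.DateOffset(years=2)
stale = records[records["last_updated"] < cutoff]

print(f"{len(duplicates)} duplicate rows, {len(stale)} stale rows")
```

None of this replaces an enterprise tool, but even lightweight checks like these surface the problems named above before they erode trust in the data.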

In our practice, we work with clients from a variety of backgrounds, and we deploy a number of tools to address data quality challenges, including:

  • SAP® Data Services – A quick and easy extract, transform, load (ETL) tool for cleansing and standardizing data. A large number of data connectors and a straightforward workflow make this a well-rounded tool.
  • SAP Information Steward – An enterprise-grade data quality tool that configures business rules to measure data against specific conditions and track those scores over time.
  • Informatica Cloud Data Quality Radar – A full-range, cloud-based tool that creates scorecards and business rules and measures data quality scores over time (a generic sketch of this scorecard idea follows the list).
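Each of these products has its own configuration model, so rather than reproduce any vendor’s API, here is a generic sketch of the business-rule scorecard idea they share: each rule is a row-level condition, a score is the share of rows that pass, and scores are appended to a history so quality can be tracked over time. The rule names and columns are hypothetical.

```python
import pandas as pd

# Each rule is a name plus a row-level predicate; the score is the
# percentage of rows in a snapshot that satisfy the predicate.
rules = {
    "revenue_not_negative": lambda df: df["revenue"] >= 0,
    "customer_id_present":  lambda df: df["customer_id"].notna(),
}

def score(df: pd.DataFrame) -> dict:
    """Return the pass rate (0-100) for each rule on this snapshot."""
    return {name: 100.0 * rule(df).mean() for name, rule in rules.items()}

# Score today's snapshot and append the result to a running history,
# which is what makes quality trends visible over time.
snapshot = pd.DataFrame({
    "customer_id": [1, 2, None],
    "revenue":     [250.0, -40.0, 1200.0],
})
history = pd.DataFrame(
    [{"date": pd.Timestamp.today().normalize(), **score(snapshot)}]
)
print(history)
```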

Focus on Managing the Change

The BI and data quality tools on the market today will continue to evolve and grow in number. The one constant in the years to come, however, is the need for organizational change in data management. Protiviti’s approach to change management outlines the critical steps necessary to launch a successful, large-scale change initiative using a shared risk, shared gain approach. Some of the steps seem commonsensical, but those are often the ones that get overlooked. It is important to follow each of the steps listed below so the organization can not only implement the changes it desires but also welcome them and make them stick.

  1. Identify and Align – Identify the goal and key components of the new initiative, ensuring that it aligns with everyone’s needs and the organization’s vision as a whole.
  2. Document and Map – Involve cross-functional teams in mapping and documenting processes.
  3. Design and Develop – Perform a fit-gap analysis and assess change and implementation readiness.
  4. Implement and Adopt – Deliver training and support and monitor adoption levels.
  5. Feedback and Improvement – Develop a governance plan; establish metrics and a method for feedback and improvement.

In Closing

Most organizations already have the awareness and desire for intelligent data management; the rest is a matter of making it a priority – today. If data is, in fact, the new oil, then “analytics is the refinery, and (business) intelligence is the gasoline which drives growth,” according to Tiffani Bova, global customer growth and innovation evangelist at Salesforce. Once a DMO is empowered, it becomes the driving force that will safely lead businesses into the future of analytics.

Narjit Aujla of Protiviti’s Data Management and Advanced Analytics practice contributed to this content.
