It is said that poor data quality is costly. How costly?
According to IBM's estimate, poor-quality data cost the United States $3.1 trillion in 2016 alone. And according to Perspectium, a data integration firm, flawed data costs the national economy $3 trillion per year and costs companies an astounding 20% of their revenue.
More CIOs are investing additional time in gaining control over organizational data. For an analyst, answering questions can be very challenging when the data isn't accurate, especially when that information is used for business decisions.
Here are a few thoughts to consider when it comes to improving data quality:
• A simple question isn’t so simple.
One of an analyst’s duties is to anticipate need. One question usually turns into two or three. A question regarding how many customers are served in the West is followed up by one about what they’re buying. If that POS data is in another system with different coding, the answer might take longer to find—especially if it’s merged with a CRM tool.
• Being proactive will serve you better than procrastinating.
I hear a lot of "Oh, it will take longer to find another way" and "I am used to this way" or "I will worry about implementing better data quality later because I have to answer this now."
Well, later is now!
What if you spent 15 minutes turning a 10-minute process into a 1-minute process? Yes, you'll be in the hole for the first run or two, but you'll likely get that request four more times in the coming weeks, so the new process will quickly pay back the investment and start saving time that can be spent on other tasks.
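The back-of-envelope math above can be sketched as a quick break-even calculation. The 10-minute, 1-minute, 15-minute, and four-request figures come straight from the example; everything else is simple arithmetic:

```python
import math

# Figures from the example above
old_minutes = 10      # time per run of the manual process
new_minutes = 1       # time per run after the improvement
setup_minutes = 15    # one-time investment to build the faster process
repeat_requests = 4   # expected repeat requests in the coming weeks

savings_per_run = old_minutes - new_minutes  # 9 minutes saved each run

# How many runs until the setup time is paid back?
break_even_runs = math.ceil(setup_minutes / savings_per_run)
print(break_even_runs)  # 2 — you're "in the hole" only until the second run

# Net time saved over the expected repeat requests
net_saved = repeat_requests * savings_per_run - setup_minutes
print(net_saved)  # 21 minutes ahead after four runs
```

In other words, the 15-minute investment pays for itself by the second request, and every run after that is pure time saved.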
• Poor data quality can be prevented.
Circumstances that create data issues happen in every business. While this is normal, it doesn't mean we can't put forth an effort to improve the quality of our data by implementing processes and procedures and then executing on those action items. These processes have to be part of a company's strategic plan. If data quality isn't a priority, it won't happen.
• Ensuring data is accurate is also a team effort.
The people implementing systems need to hear from those pulling the data. Those who pull the data need to understand how it will be used and who will be using it. Those entering the data need to understand how the data will be entered so it is consistent.
• Implementing better data quality can’t be done in silos.
You’ve got data coming from every direction. First, identify who should be involved and why. If you’re thinking about leaving someone out, think again. Anyone who touches data in any way, shape, or form needs to be involved. Then break the process into pieces. Your CEO may not need to be involved in the actual development of the system, but he or she does need to give input on expectations. Does your analyst need to be involved in the storage setup? Not really, but he or she should be involved in discussions about the product to ask the right questions about reporting and analysis capabilities.
If your company approaches better data quality initiatives with these things in mind, it can be done.
Attack your plan in phases—and remember: It’s a team effort.
Using your own data is good, but it only tells you how the market is interacting with your company. To complete the picture, you need market intelligence about the entire market, and the Market Data Program from NAED is the answer.
The NAED Market Data Program is the only solution in the electrical industry that provides a way for electrical distributors to submit their monthly point of sale data in an easy, secure, and anonymous way, and then view their results alongside aggregate data of the market.