“Why didn’t this part get ordered yesterday? Who changed the setting on this field? When this happens we need to hold that person accountable!”
We were trapped in another frustrating meeting, a cross-functional team searching for the root cause behind operational failures in our facility. Some items were lost in the warehouse, others were suddenly running out of stock, orders weren’t going out to customers on time, and answers were hard to find.
The top dog at the table didn’t like the fact that our ERP system couldn’t tell him when and how data had changed over time. The purchasing manager couldn’t say why a flag had been changed that caused an item not to be ordered, and the warehouse manager didn’t know who had picked a location clean without alerting someone that we were out of stock.
Our site manager was surprised that we couldn’t easily find that information, assuming that since something happened on a computer system, we should be able to track all the details around who did what, and when.
While I can get as paranoid as the next guy, computer systems don’t always provide that tracking by default. It’s certainly possible to track user activity, but those functions have to be developed or configured. That takes time and money, for the initial setup as well as ongoing data storage and maintenance.
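To make that concrete, here is a minimal sketch of what "developing that tracking" could look like at the application layer: every update to a governed record also writes a before/after entry with the user and timestamp. The record structure, field names, and in-memory log are my own illustration for this post, not the API of any particular ERP.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any

# Hypothetical audit record: who changed which field, from what to what, and when.
@dataclass
class AuditEntry:
    table: str
    record_id: str
    field_name: str
    old_value: Any
    new_value: Any
    changed_by: str
    changed_at: datetime

audit_log: list[AuditEntry] = []  # in a real system this would be a database table

def update_field(record: dict, table: str, record_id: str,
                 field_name: str, new_value: Any, user: str) -> None:
    """Apply a change to a master-data record and log the before/after values."""
    old_value = record.get(field_name)
    if old_value == new_value:
        return  # nothing changed, nothing to audit
    audit_log.append(AuditEntry(table, record_id, field_name, old_value,
                                new_value, user, datetime.now(timezone.utc)))
    record[field_name] = new_value

# Example: the purchasing flag change everyone was asking about is now traceable.
item = {"item_id": "1001", "reorder_flag": True}
update_field(item, "item_master", "1001", "reorder_flag", False, user="jsmith")
print(audit_log[0])
```

None of that is difficult code, but somebody has to decide it matters, build it, and pay to store and maintain the history it generates.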
Data Management Isn’t Cheap, Nor Should It Be
These tracking questions usually come up when an organization hasn’t developed a plan around data governance ahead of time. They aren’t sure who has update access to master data, and under which circumstances updates can occur.
The solution is to invest the time and resources necessary to build in data quality controls up-front, establishing business rules and supporting systems that dictate who can make updates to critical information, when they can be made, and how quickly they should be made when requested. If users get stuck waiting on some other department to make a system change before they can proceed, sooner or later they will figure out a way to avoid that bottleneck, and that rarely works out well for the broader organization.
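One way to express those business rules is a simple ownership map that says which group is allowed to touch which field. The sketch below is a hypothetical example in that spirit, with made-up field names and groups rather than anything pulled from a real system.

```python
# Hypothetical ownership map: which group may update which master-data fields.
FIELD_OWNERS = {
    "item.reorder_flag": "purchasing",
    "item.primary_location": "warehouse",
    "item.standard_cost": "finance",
}

def can_update(user_group: str, field_path: str) -> bool:
    """Business rule: only the owning group may change a governed field."""
    owner = FIELD_OWNERS.get(field_path)
    return owner is None or owner == user_group

# A warehouse user trying to change a purchasing-owned flag gets stopped up front.
assert not can_update("warehouse", "item.reorder_flag")
assert can_update("purchasing", "item.reorder_flag")
```

The point isn't the code; it's that the rules exist, are written down, and are enforced by the system rather than by tribal knowledge.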
Is it worth investing in the quality of data in your ERP? Of course it is. In tightly integrated ERP solutions, data quality is essential to smooth operation. Once data quality slips, people begin to question what they are seeing and lose confidence in the system. Eventually they start managing things on their own, running the business on spreadsheets, which leads to a further erosion of ERP data quality, and the downward cycle continues.
Starting a Data Governance Practice
So when an operational manager comes to me with a request to figure out who's making changes to their data, I'll say "sure, we can do that, but first…" and launch into a conversation about data management, which might start with questions such as:
- Do they have any idea how dirty their master data is? Most places haven’t even considered that analysis.
- Have they audited which users have update access in their system? Often you'll find accounts for users who left the company long ago still in regular use, because they shared a password with a coworker.
- Is it clear which groups own data in the system, and are responsible for performing updates? Are those updates being performed correctly and in a timely manner?
- Are data dependencies well understood and communicated (e.g. if Field A = X, then Field B should be 0 or 1, but not 5)? Even a simple automated check, like the one sketched below, can surface these.
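That last kind of dependency rule is easy to automate once it's written down. Here is a rough sketch of such a check; the field names and the rule itself are the hypothetical ones from the bullet above, not real ERP fields.

```python
# Hypothetical dependency rule: if field_a is "X", then field_b must be 0 or 1.
def check_dependency(record: dict) -> list[str]:
    """Return a list of rule violations for one master-data record."""
    problems = []
    if record.get("field_a") == "X" and record.get("field_b") not in (0, 1):
        problems.append(
            f"field_b is {record.get('field_b')!r}; expected 0 or 1 when field_a = 'X'"
        )
    return problems

# A quick scan over master data flags the records that break the rule.
records = [
    {"item": "1001", "field_a": "X", "field_b": 1},   # OK
    {"item": "1002", "field_a": "X", "field_b": 5},   # violates the rule
]
for rec in records:
    for problem in check_dependency(rec):
        print(f"Item {rec['item']}: {problem}")
```

Run checks like these on a schedule and the dirty data shows up in a report, instead of in a meeting where everyone is looking for someone to blame.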
Data quality problems are rarely the result of people actively attempting to cause harm. Instead, users are simply trying to get the task in front of them done as quickly and effectively as they can, often unaware of the consequences their actions can have on other departments. Rather than create reports searching for someone to blame, managers would be better off establishing a robust Data Governance practice that allows users to trust the data they see, and rely upon each other to keep things running smoothly.