Friday, August 26, 2011


Activity Based Cost and Data Quality: Codependent Initiatives?


Summary


Activity Based Costing, or ABC, is an exercise in which costs are assigned to the business activities required to support critical business operations. While it is often used in support of a business process redesign (BPR) effort, it can also play an important role in data quality (DQ) initiatives.

Conducting a data quality initiative requires a significant investment: there are costs for hardware, software, support, and implementation resources.  Because DQ efforts are usually associated with "fixing" a previous investment, these costs are not generally accepted as capital investments.  They are usually viewed as a negative consequence of a failure to implement a system comprehensively.

 


One way to mitigate this perception is to demonstrate, in monetary terms, the return likely to be realized.  This demonstration is most effective when it links cost avoidance to improved business operations, and to draw that link you need the costs associated with those operations.

As a result, DQ becomes dependent on ABC to justify the expense.

The proof is in the pudding.  And by pudding, I mean operations


So let's talk about ABC efforts and how they can help justify why an organization needs to invest in data quality.  Rarely does an organization know how long it takes someone to perform everyday business operations.  After all, this time is often viewed as a necessary expense of doing business, and any time invested is required, so why analyze it?  Given this approach, it is not surprising that very little is known about how long it takes to perform a business operation when there are exceptions to the norm.

In other words, when there are data-related discrepancies, very little is tracked in terms of impact and cost.  Yet this is a legitimate, everyday reality.  Often two reports do not reflect the same aggregation, and someone spends countless hours tracking down the root cause of the discrepancy.  This time is money.  It is also lost opportunity, which results in further cost to the organization.  Boiling this type of event down to a cost of resolution can form the basis for justifying investments in prevention.

 


For example, if one report provides the status of product inventory and another provides a summary of sales, the two should move in direct proportion: inventory should decrement at the same rate as sales increment.  The chart below illustrates this relationship.

Inventory vs. Sales Report Relationship with High Quality Data
However, when one of these two systems suffers from poor data quality, you end up with a report that looks like the chart below.

Inventory vs. Sales Report Relationship with Low Quality Data

It doesn't take long looking at this before you realize there is a problem, especially if you are responsible for generating the report and are familiar with what the normal analysis looks like.  Logically, this individual will start down the path of determining the root cause.  At this point, the meter starts running.  In fact, not one meter but two are racing toward an unexpected cost.  One meter tracks the time, and hence money, spent fixing the issue.  The other tracks the time and money not spent performing the duties that would have been performed had the issue not existed in the first place.
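A root-cause hunt like this usually begins with a simple reconciliation of the two reports.  As a minimal sketch of that idea (the figures and function name here are hypothetical, not from any real system): with high-quality data, each period's inventory drop should equal that period's sales gain.

```python
def find_discrepancies(inventory, sales):
    """Return periods where the inventory decrement does not match
    the sales increment: (period, inventory_drop, sales_gain)."""
    issues = []
    for t in range(1, len(inventory)):
        inv_drop = inventory[t - 1] - inventory[t]
        sales_gain = sales[t] - sales[t - 1]
        if inv_drop != sales_gain:
            issues.append((t, inv_drop, sales_gain))
    return issues

# Hypothetical report extracts: end-of-period inventory vs. cumulative sales.
inventory = [100, 90, 75, 60, 50]
sales = [0, 10, 25, 40, 50]

print(find_discrepancies(inventory, sales))  # [] -> the reports agree
```

An empty result means the reports agree; any tuple it returns is exactly the kind of exception that starts the meters running.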


If you boil this individual's compensation down to dollars per minute and track both the time taken to resolve the issue and the time not spent producing in other areas, you can start calculating the cost of poor quality data.  In all likelihood, this individual does not resolve the issue alone, so you can add in additional dollars per minute for supporting team members.  Not to mention that salespeople don't have accurate inventory numbers, which impacts their ability to maximize their activities!  Before long, you get a clear picture that poor quality data is costing you, on a recurring basis, far more than the cost of fixing it.
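The arithmetic above can be sketched in a few lines.  All of the figures here are assumptions for illustration (salaries, a 2,000-hour work year, hours spent), not numbers from the post: both "meters" are charged at each resolver's per-minute rate.

```python
def per_minute_rate(annual_salary, work_minutes_per_year=120_000):
    """Dollars per minute, assuming ~2,000 work hours (120,000 minutes) a year."""
    return annual_salary / work_minutes_per_year

def incident_cost(resolver_salaries, minutes_fixing, minutes_lost):
    """Cost of one incident: both the fixing meter and the
    lost-productivity meter run for every person pulled in."""
    return sum(
        per_minute_rate(salary) * (minutes_fixing + minutes_lost)
        for salary in resolver_salaries
    )

# A $90k analyst and a $110k DBA each spend 3 hours fixing the report
# and lose 2 hours of work they would otherwise have done:
cost = incident_cost([90_000, 110_000], minutes_fixing=180, minutes_lost=120)
print(round(cost, 2))  # 500.0
```

Five hundred dollars for a single discrepancy, before counting the salespeople working from bad inventory numbers; recurring weekly, that dwarfs many prevention budgets.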


Although disturbing, this is the key to ending the vicious cycle of waste and becoming a more streamlined organization. 



Conclusion


It is a difficult decision to spend money fixing broken operations and systems; however, it is an easy decision to spend money to end waste and increase productivity.  It is just a matter of perspective.  An experienced data quality professional will help you see this effort in the right light and even help you put real numbers behind it.


If all this sounds familiar, maybe it is time to find that data quality professional and start saving some money!
