Friday, April 22, 2011
Data Quality ROI = Address Validation and Duplication Consolidation
I have had conversations recently with fellow data quality gurus that centered on DQ ROI. We all know how important it is to tie a DQ initiative to a return on investment. This is even more true of an initiative with long-term implementation objectives. During the course of the conversation I pointed out that I believe DQ ROI is all about validating addresses and consolidating duplicates, and there seemed to be a cathartic agreement that made us all feel like we weren't crazy (even if it was only a brief feeling of sanity).
Address validation provides a return by increasing revenue assurance and improving target marketing delivery. In short, mailing to a valid and deliverable address shortens the bill-to-cash cycle. In addition, it avoids return mail charges and provides assurance of bulk mail delivery status. Address validation also increases the potential for and accuracy of house-holding efforts, which can significantly reduce the cost of marketing initiatives.
Duplicate consolidation has a similar effect on cost, which in turn provides a return on investment. Consolidating duplicates reduces billing errors caused by discrepancies between customer records (duplicate records do not always contain exactly the same data). It also reduces the number of marketing pieces sent to the same customer, an obvious cost avoidance. A minimal sketch of what consolidation can look like follows below.
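To illustrate the point about duplicates not carrying identical data, here is a minimal sketch. The field names, the normalized name-plus-address matching key, and the "keep the first non-empty value" survivorship rule are all assumptions made for the example, not a prescribed implementation.

def normalize(value):
    """Lower-case and collapse whitespace so near-identical values match."""
    return " ".join(value.lower().split())

def consolidate(customers):
    merged = {}
    for record in customers:
        key = (normalize(record["name"]), normalize(record["address"]))
        if key not in merged:
            merged[key] = dict(record)
        else:
            # Survivorship rule (assumption): keep the first non-empty value.
            for field, value in record.items():
                if not merged[key].get(field):
                    merged[key][field] = value
    return list(merged.values())

customers = [
    {"name": "Jane Smith", "address": "12 Oak St", "phone": ""},
    {"name": "JANE  SMITH", "address": "12 oak st", "phone": "555-0100"},
]
print(consolidate(customers))  # one record, with the phone number retained

Notice that the two source records disagree on formatting and completeness; the consolidated record is the one you can bill and market to reliably.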
A rough ROI calculation can be determined by totaling measures like cost per marketing piece, cost of returned marketing pieces, lost marketing opportunity, lost revenue due to bill returns, cost of billing remediation, and lost revenue due to the inability to bill, then multiplying these by the number of invalid addresses and the number of duplicate customers. The exact formula is more formal than this, of course, but you get the idea of how much cost can be avoided by implementing a DQ initiative.
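To make the back-of-the-envelope math concrete, here is a minimal sketch of that cost-avoidance estimate. Every figure and record count below is a made-up assumption purely for illustration; plug in your own measured costs.

# Rough cost-avoidance estimate: per-record costs multiplied by the number
# of bad records. All figures below are illustrative assumptions.

invalid_addresses = 5_000       # records failing address validation
duplicate_customers = 2_000     # customer records identified as duplicates

cost_per_invalid_address = (
    0.50    # cost of the marketing piece itself
    + 0.45  # return-mail handling charge
    + 3.00  # lost marketing opportunity
    + 8.00  # lost or delayed revenue and billing remediation
)

cost_per_duplicate = (
    0.50    # extra marketing piece sent to the same customer
    + 5.00  # billing errors and remediation caused by conflicting data
)

potential_savings = (
    invalid_addresses * cost_per_invalid_address
    + duplicate_customers * cost_per_duplicate
)
print(f"Estimated avoidable cost: ${potential_savings:,.2f}")

Even with modest per-record costs like these, the avoidable cost lands in the tens of thousands of dollars, which is usually enough to justify the DQ initiative.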