My February 1 and May 1 columns discussed the Ten Commandments of marketing database content management:
- The data must be maintained at the atomic level.
- The data must not be archived or deleted.
- The data must be time-stamped.
- The semantics of the data must be consistent and accurate.
- The data must not be over-written.
- Post-demand transaction activity must be kept.
- Ship-to/bill-to linkages must be maintained.
- All promotional history must be kept.
- Proper linkages across multiple database levels must be maintained.
- Overlay data must be included, as appropriate.
As we have established, these Ten Commandments lead to what we at Wheaton Group refer to as Best-Practices Marketing Database Content. Everything within reason must be kept, even when its value is not immediately apparent. Best-Practices Marketing Database Content enhances data mining, often dramatically. This, in turn, allows the deep insight into behavior that is required for effective data-driven decision-making.
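To make the first several commandments concrete, here is a minimal sketch of an append-only, time-stamped, atomic-level transaction store, using SQLite purely for illustration. The table and column names are hypothetical, not a prescribed schema:

```python
import sqlite3

# A minimal sketch of a store that honors several of the commandments:
# atomic-level detail, time stamps on every row, linkages across levels,
# and no overwrites or deletes. All names here are illustrative.
conn = sqlite3.connect("marketing.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS order_item (
        customer_id INTEGER NOT NULL,  -- linkage to the customer level
        order_id    INTEGER NOT NULL,  -- linkage to the order level
        item_sku    TEXT    NOT NULL,  -- atomic level: one row per item
        quantity    INTEGER NOT NULL,
        unit_price  REAL    NOT NULL,
        recorded_at TEXT    NOT NULL   -- time stamp on every row
    )
""")

def record_item(customer_id, order_id, sku, qty, price, when):
    # Append-only: rows are inserted, never updated or deleted, so any
    # past point-in-time view of a customer can be reconstructed later.
    conn.execute(
        "INSERT INTO order_item VALUES (?, ?, ?, ?, ?, ?)",
        (customer_id, order_id, sku, qty, price, when),
    )
    conn.commit()
```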
Today's Failures
Many of today's databases do not come close to best-practices content. A key reason is that most database developers are not deep-dive data miners. Instead, they are pure technologists. Therefore, they have no first-hand experience in the nuances of what is required to support best-practices data mining. Couple that with client companies that have experienced neither Best-Practices Marketing Database Content nor best-practices data mining and you have the blind leading the blind. This is a sure path to sub-par content and the inevitable sub-par data mining. Remember, data mining is only as good as the underlying data.
Consider the following excerpt from my February 1 article, a cautionary tale of what happens when The Second Commandment of Best-Practices Marketing Database Content, "The Data Must Not Be Archived or Deleted," is not followed:
Data mining can be severely hampered when the data does not extend significantly back in time. One database marketing firm experienced this when it tried to build a model to predict which customers would respond to a Holiday promotion. Unfortunately, all data content older than thirty-six months was rolled off the database on a regular basis. Remarkably, it was not even archived. For example, the database would only reflect three years of history for a customer who had been purchasing for ten years.
The only way to build the Holiday model, of course, was to go back to the previous Holiday promotion. This reduced to twenty-four months the historical data available to drive the model. More problematic was the need to validate the model off another Holiday promotion, the most recent of which had, by definition, taken place two years earlier. This, in turn, reduced to twelve months the amount of available data. As you can imagine, the resulting model was far from optimal in its effectiveness!
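To make the arithmetic explicit, here is a minimal sketch of the shrinking-history problem; the retention window and promotion timings are taken from the example above:

```python
# With a 36-month rolling window, the history available *before* a
# promotion shrinks by however far back that promotion sits.
RETENTION_MONTHS = 36

def history_before(promotion_months_ago):
    # Only data newer than the rolling cutoff survives, so the usable
    # pre-promotion history is whatever the window has not rolled off.
    return max(RETENTION_MONTHS - promotion_months_ago, 0)

print(history_before(12))  # build sample: last Holiday -> 24 months
print(history_before(24))  # validation: the Holiday before -> 12 months
```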
Besides hampering predictive modeling, limiting data history to thirty-six months also has a negative impact on any marketer attempting most forms of customer clustering and merchandise affinity analysis. For the company in this example, a good portion of the merchandise is durable in nature, which exacerbates the problem. Within many merchandise categories, a customer will spend hundreds of dollars at a time, but then not purchase again for quite a few years. By deleting all history older than thirty-six months, many of the customers with historical category activity will not appear to have ever made a purchase. It is tough to do state-of-the-art target marketing when your database does not accurately reflect many of your targets.
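The durable-goods problem can be illustrated with a small sketch of how a rolling purge erases category history; the dates, cutoff, and purchase record here are hypothetical:

```python
from datetime import date

# A real (hypothetical) furniture buyer: one big, infrequent purchase.
TODAY = date(2007, 6, 1)       # illustrative "as of" date
CUTOFF_DAYS = 36 * 30          # rough 36-month rolling window
purchases = [("furniture", date(2003, 3, 15), 1200.0)]

def visible(purchase_date):
    # Anything older than the rolling window has been purged.
    return (TODAY - purchase_date).days <= CUTOFF_DAYS

surviving = [p for p in purchases if visible(p[1])]
print(surviving)  # [] -- the customer now looks like a category non-buyer
```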
What I did not mention in the February 1 article is that the database was built by one of our industry's largest data management firms. The development team included the full complement of project managers, database architects, database engineers, and subject matter experts from the client's marketing and IT departments. Unfortunately, no one knew that limiting the database to three years of customer history would severely hamper most data mining efforts. Sadly, many of the data management company's other marketing databases contain only two years of data.
Best-Practices Technology vs. Best-Practices Content
Last October marked my 25th anniversary in the direct and database marketing business. During that time, I have witnessed many remarkable advances, including modern marketing databases and the rise of the Web as a promotional and order channel. Today's business intelligence and campaign management software, with its GUI interfaces and eye-catching output, has revolutionized the ways that marketers experience their data.
However, all too often this impressive technology is constructed upon the rotten foundation of inferior database content. With inferior content, the most advanced business intelligence and campaign management tools in the world will not get your company where it needs to go. This is because a marketing database is only as valuable as its underlying content.
Creating and maintaining Best-Practices Marketing Database Content is hard, ugly work. There is nothing glamorous about the considerable time spent sifting through every data source you can get your hands on in order to organize, fix, and enhance it, or about implementing and religiously adhering to quality-assurance procedures during all subsequent database update cycles.
Now, do not get me wrong. I am a huge proponent of the wonderful tools available to today's database marketers. In fact, our company includes a well-known suite of data management tools in our solution set. The power that such software brings to database marketers is remarkable. However, it must be coupled with Best-Practices Marketing Database Content. Cutting-edge tools with sub-par content are a dangerous combination. It is analogous to a builder taking a pile of lumber, turning on some power tools, and throwing them onto the pile. He certainly will not end up with a house!
Unfortunately, the message of Best-Practices Marketing Database Content does not always resonate in the marketplace. It is frustrating to watch a potential data management client's eyes glaze over as I describe its virtues. All too many are impatient to see the sexy GUI interfaces and output. The exceptions, I have found, are potential clients who have experienced sub-par content for themselves. This is especially true of those who are looking to upgrade a database that does not lend itself to best-practices data mining.
Back to the Future
Some of our firm's work is pure data mining and therefore does not involve data management. All too often in such cases, I am frustrated by the limitations of the databases that we are called in to leverage. Incredibly, it can be impossible to do what I was able to accomplish almost twenty years ago with mainframe databases that employed non-table-based, proprietary technology. This is inexcusable!
During the antediluvian days of the late 1980s and 1990s, I worked for a data management firm, Wiland Services, that no longer exists. Eventually, Wiland was purchased by another company that no longer exists, and then run into the ground. But, that is another story. What is relevant is that, back then, Wiland was building systems with better content than many that are in operation today. For example:
I took over Wiland's data mining and consulting group in January 1991. A major retailer was one of my group's clients. Wiland maintained a customer database for the retailer that was composed of eleven million active customers.
The database offered no real access except for green-bar reports and extracts to statistical software packages such as SAS. But, the system met the modern standards for Best-Practices Marketing Database Content. Therefore, the past-point-in-time customer views ("states") that are essential for virtually all meaningful data mining could be easily recreated. This allowed my group to develop a large number of specialized regression models to drive the retailer's sophisticated target marketing programs, and to fine-tune or rebuild the models at will.
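As an illustration of why such content makes past states easy to recreate, here is a minimal sketch of point-in-time reconstruction from never-deleted, time-stamped history; the record layout and figures are hypothetical:

```python
from datetime import date

# Because nothing is overwritten or deleted, a customer's past "state"
# is just the atomic history filtered to the as-of date and summarized.
history = [
    {"customer_id": 1, "order_date": date(1989, 11, 20), "amount": 240.0},
    {"customer_id": 1, "order_date": date(1990, 12, 2),  "amount": 85.0},
    {"customer_id": 1, "order_date": date(1991, 6, 14),  "amount": 310.0},
]

def customer_state(customer_id, as_of):
    # Keep only activity the customer had accumulated by the as-of date.
    rows = [r for r in history
            if r["customer_id"] == customer_id and r["order_date"] <= as_of]
    return {
        "orders": len(rows),
        "total_spend": sum(r["amount"] for r in rows),
        "last_order": max((r["order_date"] for r in rows), default=None),
    }

# The customer as a model would have "seen" them before the 1990 Holiday:
print(customer_state(1, date(1990, 11, 1)))
```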
During every database update, over 12,000 lines of SAS code cost-effectively evaluated all of the billions of dollars of atomic-level order, item, and promotion history information across all eleven million customers. In addition, the code could be altered significantly and then put into production with less than one day of lead time.
The marketing database also automatically maintained up to thirty-six historical point scores and segment codes for each model. This allowed the creation of longitudinal velocity measurements, such as an indication that a group of customers was declining significantly in month-over-month scores and therefore was ripe for palliative measures. As a result of the target marketing programs supported by this best-practices content, the retailer was able to decrease general advertising expense by 25% while increasing revenue significantly.
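Here is a sketch of what such a velocity measurement might look like, assuming retained monthly scores; the window, threshold, and figures are illustrative, not the retailer's actual logic:

```python
# Longitudinal velocity from retained historical model scores.
def score_velocity(monthly_scores, window=6):
    # Average month-over-month change across the trailing window.
    recent = monthly_scores[-(window + 1):]
    deltas = [b - a for a, b in zip(recent, recent[1:])]
    return sum(deltas) / len(deltas)

def ripe_for_palliative_measures(monthly_scores, threshold=-2.0):
    # A steadily falling score suggests a customer group drifting away.
    return score_velocity(monthly_scores) <= threshold

scores = [88, 86, 83, 79, 74, 70, 65]       # newest last
print(score_velocity(scores))               # about -3.8 points per month
print(ripe_for_palliative_measures(scores)) # True
```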
Is it too much to ask that, in terms of content, today's marketing databases match the capabilities of a system built almost twenty years ago? As a data mining professional, I have no use for some of the modern databases our firm runs into, where powerful, GUI-based business intelligence and campaign management tools are driven by sub-par content. And, neither should you!