Accelerating Analytics: Decrease The Cost of Asking Useful Questions

After twenty years in the business, I am giving up on the idea of asking brilliant questions. They don’t exist. Ironically, most of the questions which have delivered serious money in the past tended to look like relatively dumb ones…

The first set of significant wins in my analytics career came in direct marketing, where I moved the campaign analysis process for a $5MM/year program in-house. From a technical perspective, this was pretty straightforward: write a SAS program to merge our mailing list with our customer file, then aggregate response and sales data. Since a common key (the finders file number) existed on both files, it was a simple matter to join them and summarize the data into an Excel pivot table. Intern-level stuff.
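For the curious, the core of that program was nothing more than a key-based join followed by an aggregation. A minimal sketch in SAS, using hypothetical dataset and field names (MAIL.CAMPAIGN for the mailing list, CUST.ORDERS for customer transactions, FINDER_NO for the common key), might look something like this:

/* Join the mailing list to customer orders on the finders file number,
   then roll results up to one row per segment. All names here are
   placeholders, not the actual production files. */
proc sql;
   create table work.campaign_summary as
   select m.segment,
          count(distinct m.finder_no)               as mailed,
          count(distinct o.finder_no)               as responders,
          coalesce(sum(o.sales_amt), 0)             as sales,
          calculated responders / calculated mailed as response_rate,
          calculated sales / calculated mailed      as sales_per_piece
   from mail.campaign as m
        left join cust.orders as o
          on m.finder_no = o.finder_no
   group by m.segment;
quit;

/* Hand the summary off to Excel for pivoting (requires SAS/ACCESS to PC Files) */
proc export data=work.campaign_summary
   outfile="campaign_summary.xlsx"
   dbms=xlsx replace;
run;

From there, the Excel pivot table does the rest of the work.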

Except in this case, we didn’t have an intern. We didn’t even have a dedicated analyst working in this problem space. The entire direct mail campaign analysis process was managed by our ad agency, which did a sloppy name-and-address merge once a year and sent us a thick hardcopy binder of standard reports. The report formats in the binder were carved in stone, with any new report costing $500 to $5,000 to run due to “data archiving” and “custom coding”… enough to stop most casual explorations.

This simple pivot table changed everything. Our cycle time to generate custom report views went from days to seconds, with our cash cost going to zero. More importantly, if we needed to pull additional fields into the report or perform advanced calculations in the SAS script, we could generally turn that around within a couple of hours. Since our analysts could code, we didn’t have to wait for IT approval or resources.

Basic economics: when the cost of something goes to zero, demand tends to increase. The same applies to asking questions: when answering them is cheap, you ask more of them. Not surprisingly, we started finding stuff. Everything from dropping junk names off our mailing list to identifying audience segments that were likely to respond and buy large orders. Some findings were counter-intuitive: certain types of junk names were actually GREAT prospects, and several high-response-rate segments were actually the worst prospects on a sales-per-solicitation basis. This process generated enough insights about the program to deliver double-digit annual growth in campaign ROI for four years.

I was asked in an interview a few years ago what my approach was to adding value as an analytics leader. My answer: “Reduce the cost of asking useful questions.” By cost reduction, I mean not only the direct cost of the technical effort to pull data but also the cycle time and effort involved in navigating the organizational bureaucracy between the business leader and the technical team. The latter is more insidious than the former: given the typical company’s ADHD personality, the longer you wait to start a project, the higher the risk that it gets quietly killed by competing distractions.

The concept of “useful questions” is also worth a few words. With few exceptions, every question should point towards a possible change in the organization’s process/policy standards or in how those standards are enforced. “Segment my customer base to find accounts with a low probability of future purchases” has many immediate applications: long-term inactives can be “reacquired” via high-value offers, highly loyal customers can be approached with more intricate programs to deepen the relationship, and the middle deciles can be steered into retention programs. On the flip side, I’ve struggled to make money from broad demographic survey views: while often intellectually interesting, they have little relevance to short-term marketing efforts and are better reserved for guiding product/brand positioning decisions with a very slow path-to-cash.
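To make that segmentation example concrete: once a purchase-probability score exists on the customer file, the mechanical step is just ranking customers into deciles so each band can be routed to the appropriate program. A hedged sketch, again with hypothetical names (CUST.SCORES for the scored file, P_PURCHASE for the model score):

/* Bucket customers into ten equal-sized groups by predicted purchase
   probability. Decile 0 holds the most likely buyers, decile 9 the least. */
proc rank data=cust.scores out=work.scored_deciles groups=10 descending;
   var p_purchase;
   ranks purchase_decile;
run;

The top deciles get the relationship-deepening programs, the middle deciles get the retention offers, and the bottom deciles get the high-value reactivation treatment.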

In a perfect world, the analytics environment would be structured so that the process owner or their assigned analyst can run the request directly from their desk, without any technical support. The big win here is cycle time: less time spent documenting requests and less time spent waiting for a developer to be free to run an ad-hoc query. This is related to my view that every analyst should learn to code. The shorter the cycle time, the faster you can iterate through potential solutions to a marketing problem.

It’s also the only objective I can claim, with a straight face, to be making progress on. Reducing the cost of asking questions is fundamentally an engineering issue, solvable with process analysis and software development. Predicting which questions are likely to pay off generally requires a Ouija board…
