August 11, 2014

When Customer Analytics Attack: How Comcast Screwed Up


"I'm really ashamed to see you go to something that can't give you what we can."


The first thing many people thought when they heard Ryan Block's recording of a Comcast "retention specialist" was something along the lines of "wow, what a jerk". It's just common sense that if you're arguing with a customer, you're probably doing something wrong.



But while Comcast tried to defuse the situation by blaming one lone, poorly trained employee, it increasingly looks like that's not what happened at all. The Verge has a fascinating collection of testimonials from current and former Comcast tech support employees who describe -- in great detail -- a systematic approach to mixing support with sales that encourages exactly the kind of behavior they allegedly disapprove of. By building their operating procedures around an incredibly shortsighted interpretation of metrics, and then creating strong incentives to adhere to these procedures at all times, Comcast brought this PR nightmare entirely on themselves.

"TELL ME WHY YOU DON'T WANT OUR SERVICE!!!!!"


Perhaps this is simply the inevitable result of being a company like Comcast. They're a for-profit organization with little competition, and provide an essential service, so it's possible that the temptation to simply tighten the screws on customers at all times is just too much to resist.

I'm not so sure about that, though. While customers may not be able to leave Comcast for a viable competitor in many cases, the company is currently facing an enormous PR challenge as they attempt to win approval for an extremely unpopular merger with Time Warner. Customers can't necessarily leave, but they can vote -- and at this crucial time, there's no way Comcast is stupid enough to barge into a PR dumpster fire like this on purpose.

How does this happen to an organization like Comcast?


As any operation grows, it eventually becomes impossible to handle everything as a one-off decision. You need basic policies to operate at any kind of scale, whether you're a freelancer who's taken on more clients than you can manage, a realtor who just hired their first administrative assistant, or an enormous telecommunications & media company. As you do this, data becomes incredibly helpful for designing, implementing, and validating your procedures. No, you can't examine every customer interaction or business operation in great detail, but you can at least identify the metrics that represent a good one, and seek to recreate them.

Every growing company goes down this path eventually, which is why highly personal, hard-to-scale solutions like Amazon's "Mayday" feature attract so much attention as outliers. When you're managing millions of customers, and thousands of very similar calls every hour, you simply have to be able to model an effective call and fix deviations from the norm if you want to maintain effectiveness. You can still do more personal outreach campaigns, but they're impossible to scale to everyone. The true test of your support won't be how many customers you can invite to throw out the first pitch at a Blue Jays game; it'll be the quality of the mundane, "I can't log in to my account" call you get a thousand times a day.



Your numbers may vary, but in general, if you can get that average call to take less than a minute, solve the problem 99.5% of the time, and make customers aware of some behavior that can prevent future problems, you're probably doing a pretty good job operating customer service at scale.
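To make that concrete, here's a minimal sketch of what tracking those three numbers might look like. Everything in it -- the record fields, the thresholds, the bar for how often a prevention tip should come up -- is an illustrative assumption, not anyone's actual operating target.

```python
# Minimal sketch: scoring a batch of support calls against the rough
# benchmarks above. Field names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class CallRecord:
    duration_seconds: float      # how long the call took
    resolved: bool               # did we actually fix the customer's problem?
    prevention_tip_given: bool   # did we mention how to avoid the issue next time?

def summarize(calls: list[CallRecord]) -> dict:
    """Aggregate the three metrics discussed above across a batch of calls."""
    n = len(calls)
    return {
        "avg_duration_seconds": sum(c.duration_seconds for c in calls) / n,
        "resolution_rate": sum(c.resolved for c in calls) / n,
        "prevention_tip_rate": sum(c.prevention_tip_given for c in calls) / n,
    }

def meets_benchmarks(summary: dict) -> bool:
    """Under a minute on average, ~99.5% resolution, and most calls
    including a prevention tip (the 0.9 bar is arbitrary)."""
    return (summary["avg_duration_seconds"] < 60
            and summary["resolution_rate"] >= 0.995
            and summary["prevention_tip_rate"] >= 0.9)

if __name__ == "__main__":
    sample = [CallRecord(45, True, True),
              CallRecord(52, True, True),
              CallRecord(71, True, False)]
    stats = summarize(sample)
    print(stats, "meets benchmarks:", meets_benchmarks(stats))
```

None of this is hard to compute; the hard part is deciding that these are the right numbers to chase in the first place, which is exactly where things start to go wrong.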

What Comcast has in common with bad internet marketing


Comcast went wrong in the same way a lot of internet marketers lose their way attempting to respond to Google's search updates. At a philosophical, non-modeled level, figuring out how to make Google happy is pretty easy. They only really want two things --

  • people who use Google to see relevant ads in their travels, and click on those ads

  • people who use Google to find what they want, quickly, so they're happy and come back to use Google again


Sometimes Google forgets this, like any other large company with competing interests (ahem, promoting Google+ results, etc.), but most of the time they're pretty good at it. So far, they've used data effectively to model true customer satisfaction, at scale, and keep people coming back for more. Google's data collection is horrifying enough to me that I don't use it as a default, but when I REALLY need to find something, I go to Google. When it comes to search, they know what they're doing.

However, some of the people who market on or via Google remind me of Comcast. Every time Google changes the way it works, something becomes all the rage -- keyword stuffing, guest-blogging, Google Authorship, whatever. Sometimes it's not even Google-related: first we were obsessed with getting Facebook likes, then Twitter followers, and now it's shares on these services. Then the model changes (or we suspect it has), and we run around tearing down most of the stuff we've put time, money, and effort into building.


We got lots of clicks! Aaaaaand.... they're gone.



The reason these things remind me of Comcast is that, like everyone's least favorite telecom juggernaut, they're dependent on extremely poor models of a good customer experience (in this case, discovery and reading). In a share-dependent model, for instance, you assume that people who share are engaged and successfully "reached", and that people who don't share are not. This is not only a wild simplification, but in many cases, an outright incorrect one. Lots of people share things just to attract followers of their own. "Lurkers" on message boards, blogs, and forums -- who read but rarely post anything -- are a huge part of any healthy community, and they're extremely difficult to measure.
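If you want to see how thin that model is, here's a toy comparison of a share-only engagement metric against one that at least looks at reading time. The reader records and the two-minute threshold are invented for illustration; neither metric is anyone's real formula.

```python
# Toy version of the "share-dependent model": if shares are the only signal,
# every lurker who read and valued the piece counts as zero reach.
# Reader records and the 120-second threshold are invented for illustration.

from dataclasses import dataclass

@dataclass
class Reader:
    read_seconds: float   # time spent on the piece
    shared: bool          # did they share it anywhere we can see?

def engagement_share_model(readers: list[Reader]) -> float:
    """The naive model: engagement = fraction of readers who shared."""
    return sum(r.shared for r in readers) / len(readers)

def engagement_read_model(readers: list[Reader], min_seconds: float = 120) -> float:
    """A still-crude alternative: fraction who actually spent time reading."""
    return sum(r.read_seconds >= min_seconds for r in readers) / len(readers)

readers = [
    Reader(read_seconds=300, shared=False),  # classic lurker: read it all, told no one
    Reader(read_seconds=280, shared=False),
    Reader(read_seconds=10, shared=True),    # shared to look well-read, barely opened it
    Reader(read_seconds=240, shared=True),
]

print("share model:", engagement_share_model(readers))  # 0.5 -- misses the lurkers
print("read model: ", engagement_read_model(readers))   # 0.75 -- at least counts them
```

And neither number captures the offline conversations and emails described below -- which is the point: the model measures what's easy to count, not what matters.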




Do people talk about what you're writing offline? Do they email things to each other? Do they write their own content that references yours, because they respect it? All of this is missing from the model, but because we want a model to exist so badly ("Google rank = traffic = money"), we overemphasize the things we CAN measure, and then optimize our customer experiences to make those things happen at scale, sometimes without noticing how ridiculous or nonsensical that experience really is. If you could increase the occurrences of whatever your "conversion event" is by doing something annoying, would you do it? How about if it wasn't just annoying, but deceitful, or predatory? What if it caused people to resent you, or actively hate you? Would you still do it then? And would you even recognize when it crossed one of these lines?

"Tell me why you don't want the fastest internet."


In Comcast's case, the answer is obviously either "yes" (to the would-you-do-it questions) or "no" (to the last one), because they had a whole handbook of things to do that are abjectly customer-hostile. The problem is, while those things are hostile to the actual customer, they're not detrimental to the customer model they were built around, which IS more likely to remain a paying customer, and IS more likely to purchase additional services, and IS ultimately more profitable to Comcast than they would be if not subjected to these "enhanced customer service techniques".



In other words, someone at a high level looked at company goals, and built a model of a successful customer interaction. That model was handed down to the mid-level people in charge of determining procedures, and they simply followed the model when doing so. Finally, the resulting ridiculous procedures were enacted by actual support and retention specialists, and this flawed model was used to determine (and enforce) what a customer interaction should look like. All of this looked like science from the boardroom, even when in reality it was completely insane to anyone on the ground, INCLUDING many Comcast employees.
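Here's what that handed-down model might look like reduced to code -- a deliberately crude sketch, with invented field names and weights, showing how a call the customer experienced as hostile can still grade out as a "success" when the scoring function never looks at the customer's experience.

```python
# Crude sketch of a "boardroom model" of a successful interaction: only
# retention and upsell count, so the customer's actual experience never
# enters the score. Fields and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Interaction:
    customer_retained: bool
    upsold: bool
    customer_satisfied: bool   # collected, but the model below ignores it

def boardroom_score(i: Interaction) -> int:
    """Score the interaction the way the flawed model does."""
    return (2 if i.customer_retained else 0) + (1 if i.upsold else 0)

# A hypothetical call: the customer gave up arguing and kept the service,
# hated every minute of it, and the model calls it a win anyway.
call = Interaction(customer_retained=True, upsold=False, customer_satisfied=False)
print("model score:", boardroom_score(call))   # 2 -- a "successful" interaction
```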

What does this say about analytics?


First of all, don't blame the tools. In fact, this is really just another example of how powerful metrics can really be -- Comcast itself has reaped massive short- and medium-term financial benefits from accurately calculating the most efficient way to flip their customers upside down and violently shake every last coin from their pockets. From that vantage point, the math didn't fail -- it succeeded spectacularly!

This data point seems like it should be relevant.


Unfortunately, the way Comcast used this spectacularly productive math resulted in something very, very bad, which is not really the math's fault. Analytics don't inherently do anything good or bad -- they don't care about your customers, protect your reputation, or even maximize your profits. It's up to you to use stats and analytics in a way that encourages the short-, medium-, and long-term outcomes you really and truly want, while (a) keeping in mind what elements of your business you're currently unable to track, and (b) taking into account what will happen when your metric goals are taken to their logical extreme. Don't think it won't happen to you -- that's probably what some ill-fated Comcast executive thought a few months ago.
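One hedge against that logical-extreme problem is to never track a target metric without a counter-metric that should suffer if you start gaming the target. Here's a rough sketch of that idea, with made-up metric names, numbers, and thresholds (these are not Comcast's real KPIs).

```python
# Sketch of pairing a target metric with a "counter-metric" so you notice
# when optimizing the first starts to poison the second. Names, numbers,
# and thresholds are all assumptions for the sake of illustration.

def check_metric_pair(target_name, target_value, target_goal,
                      counter_name, counter_value, counter_limit):
    """Return a human-readable verdict on a target/counter-metric pair."""
    hit_target = target_value >= target_goal
    counter_ok = counter_value <= counter_limit
    if hit_target and not counter_ok:
        return (f"WARNING: {target_name} looks great ({target_value:.1%}), but "
                f"{counter_name} has blown past its limit ({counter_value:.1%}). "
                f"Re-examine whatever procedures are driving it.")
    if hit_target:
        return f"{target_name} on target; {counter_name} within bounds."
    return f"{target_name} below goal ({target_value:.1%} vs {target_goal:.1%})."

# A hypothetical quarter: retention is up, but so are escalated complaints.
print(check_metric_pair("retention rate", 0.93, 0.90,
                        "escalated complaint rate", 0.08, 0.02))
```

The code is trivial; the discipline of actually choosing the counter-metric and acting on the warning is the part most organizations skip.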

Analytics allow us to automate certain parts of our businesses, which isn't just awesome -- it's actually necessary if you want to grow what you're doing past a certain size and scope. But things like benchmarks, KPIs, and other stat-based measures of company health need to be frequently reassessed for effectiveness. If you can't commit to doing this, you may actually be better off just staying away from analytics -- at least complex ones -- altogether, lest you end up building the same horrible Frankenstein Comcast did.