“Managing by the numbers” is a tired phrase.
Really, it is.
“You can’t manage what you don’t measure” is a phrase we all hear, and we all say it, like a prayer that a nun murmurs while counting rosary beads. Many people argue about who said it first, Peter Drucker or W. Edwards Deming. We hear it at the beginning of presentations at professional conferences, the obligatory first slide of many tired and boring PowerPoint decks.
Over the years, I have worked for a number of bosses who believed in using data to make decisions. These bosses commanded their minions to get the data, present the data, and then ask for more data, hoping to find data that would release them from responsibility for a bad decision. Unable to find the comfort necessary to make a judgment call, these bosses asked for still more data. Off the minions went to toil at their computers, hunting for data to answer the vague questions their bosses murmured. Even in the 1980s, the volume of data could be vast, so bosses asked for summaries. Analysts, armed with new toys like FoxGraph and something called Excel, created three-dimensional colored charts and graphs, images that sometimes defied logic and understanding but still looked cool.
Eventually, under internal duress, the boss made a decision. Or didn’t. I remember one executive who sat on a crucial decision for almost a year, until the organization exfoliated him for not making decisions. If you have worked in middle management at any large company, you know the phrase “the paralysis of analysis.” My own experience taught me to dread hearing, “I would like you to find a little more data on this,” from a boss who simply could not make a judgment call.
“I want to present the data so the decision is self-evident,” another boss chanted. Dutifully we generated the data, presenting charts and graphs, only to hear that the presentation did not go well because the participants ignored the data, discounted the data, or disputed the data.
“They present evidence in court to prove a case, and that is what we will depend on to sell this idea to the team,” another boss would say. This came from a man who never sold a thing in his life, who did not understand that the decision to buy something comes from emotion, the feeling of wanting something. Again we developed the data, and he presented that data as evidence to support the decision he wanted to make, only to face indecision from a leadership group unmoved to decide.
Three to four times a year, other consultants will call me up asking for help. Either they’ve won a project only to discover that they can’t do it, or they are bidding on a project that they know they can’t do. Recently, I got a call from a consultant asking for help. I told him to send me the project scope and required deliverables. He sent not only what I asked for, but also a Dropbox link so I could look at the data they had collected so far.
After reading the requirements for this supply chain project, spending time with the scope and the deliverables, I looked at what was in the Dropbox files. My effort uncovered two fundamental problems: the scope failed to define the desired outcome of the effort, and the consultants asked for and got too much data.
There is a trend toward collecting data in order to make data-driven decisions. That trend goes back far, perhaps to the first application of electronic computers in business. The big data movement is afoot. To me, big data means big headaches, because people don’t really know what to do with all the data they can get their hands on. More data sometimes means bad decisions, or no decisions.
The consultant tied himself up in the data: some he had asked for, and some the client had insisted he use. At first, I thought the consultant had asked for more data than he needed. That was not the case. The consultant shared with me that the client had spent significant resources generating all this data before the consulting engagement. The client’s management team could not figure out what the data meant, and asked the consultant, who was working on another project, to take on this additional one.
The consultant felt obliged to use the data the client had given him. As we spoke, I challenged this consultant to explain how he planned to use the data. He did not know how he would use it, and hoped that I could help by interpreting the data. We started to talk about the intended outcome of the process — the client’s desired outcome.
“They want us to throw it at the wall and see what sticks,” he said.
I was silent for a moment. All I could think of was what a waste this was. Throwing data up against the wall to see what sticks, the way some people test whether pasta is cooked, is counterproductive. It’s a good way to burn a lot of time, effort, and money.
By the time we finished the call, the consultant was about half convinced that most of the data was irrelevant to the project. We could not be sure, however, since the client had no specific question to answer or decision to make based on the data. Still, the consultant felt he had to use it. To me, it is unethical to spend a client’s money on data analysis that does not create value by answering a clear question.
The conversation reminded me of Mel, a man I worked with in the 1980s. Mel could add a column of five- and six-digit numbers in his head; long, multi-page lists. His accuracy was astounding. I always thought Mel was some sort of savant, but he was just a normal guy, a guy who’d learned business without the aid of an adding machine. Mel was the CFO of his company, a company he and his brother had started from scratch and sold for over $400 million 30 years later.
Mel believed in using computers, and he was the reason his company bought a computer. He worked with store managers and employees to create an incredible point-of-sale driven system. That system tracked the history of what each customer bought, when they bought it, and in combination with what other items. Managers could look at sales data by item and see how much the company sold — by store, by salesman, and by day of the week. Good salespeople could look at a customer’s sales history and predict what they might buy next.
This system did this magic in 1988, 26 years ago.
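At its core, the reporting Mel’s team built is a grouped aggregation over sales records: pick one question, pick one grouping key, sum the sales. A minimal modern sketch in Python, with a hypothetical record layout (the field names, stores, and amounts here are illustrative, not Mel’s actual schema):

```python
from collections import defaultdict

# Hypothetical point-of-sale records; amounts are in cents to avoid
# floating-point rounding. These names and figures are made up.
sales = [
    {"store": "Downtown", "salesman": "Ann", "day": "Mon", "item": "hammer", "cents": 1999},
    {"store": "Downtown", "salesman": "Bob", "day": "Mon", "item": "saw",    "cents": 3450},
    {"store": "Eastside", "salesman": "Ann", "day": "Tue", "item": "hammer", "cents": 1999},
]

def total_by(records, key):
    """Answer one question at a time: how much did we sell, grouped by `key`?"""
    totals = defaultdict(int)
    for r in records:
        totals[r[key]] += r["cents"]
    return dict(totals)

print(total_by(sales, "store"))     # sales by store
print(total_by(sales, "salesman"))  # sales by salesman
print(total_by(sales, "day"))       # sales by day of the week
```

The point is the shape of the work, not the code: each report answers exactly one question against the same records, which is how one question a day could accumulate into a system.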
Working with people in the company, I asked how they had built such a powerful system, on their own, in less than a year. Each one talked about how Mel started each day of the design process by asking a simple question. By the end of the day, they’d figured out how they would answer that question. Each day one question was asked, and each day they figured out how to answer that one question.
Mel told me that if you can’t ask a question, you can’t find an answer.