What Is a Customer?

Summary:

When we communicate, the words sound familiar so we think we understand each other. But understanding fizzles when we attribute different meanings to the words we use. In this column, Naomi Karten illustrates how differences in the way departments and companies define their terms can cause confusion, flawed conclusions, and faulty decisions. Naomi asks us to question the meanings of terms before starting a project to ensure that we understand what's called for.

A traditional definition of a customer is someone who purchases products or services from another. Simple enough. But when you're responsible for systems that analyze and report customer data, this definition falls short. Nor is it adequate when you're the end user making business decisions based on the reports those systems generate.

I was reminded of these facts by a Wall Street Journal article that began, "The Securities and Exchange Commission's probe into how telecommunications and cable companies count their customers highlights the varying and sometimes inconsistent standards companies use to keep track of that most fundamental asset."

For example, at one time cable companies differed as to whether a household with both analog and digital cable services counted as one customer or two. Companies in the wireless sector may differ as to how long prepaid customers are counted as customers if they haven't made a call in a long time or have run out of minutes. According to the article, one familiar online service once counted thousands of people as subscribers who didn't even know they had accounts. It appears these accounts were part of a bulk subscription program given to employers, some of whom never actually offered the accounts to their employees.

Count on It!
Such differences in counting methods may matter most to those who investigate evidence of abuse. But this multiplicity of criteria for what constitutes a customer raises the question, "Do you know how many customers the company you work for has?"

More often than not, the correct answer is another question: "What do you mean by customers?" That's because the definition of a customer is likely to vary not only from one company to another, but also from one department to another within a given company. For example, the definition may (or may not) include lapsed customers, customers in default of payment, customers who returned the goods, and customers whose last purchase was more than three years ago. Therefore, in preparing specs that concern "customers," you'd certainly want to ask some questions, such as, "According to whom?"
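
To make this concrete, here is a minimal Python sketch of how the same customer file can yield two different "customer counts." The records, field names, and department rules below are purely hypothetical; the point is only that each count follows from its own definition.

```python
from datetime import date, timedelta

today = date.today()

# Hypothetical customer records; field names are illustrative only.
customers = [
    {"id": 1, "lapsed": False, "in_default": False, "last_purchase": today - timedelta(days=30)},
    {"id": 2, "lapsed": True,  "in_default": False, "last_purchase": today - timedelta(days=4 * 365)},
    {"id": 3, "lapsed": False, "in_default": True,  "last_purchase": today - timedelta(days=200)},
]

def department_a_count(records):
    # One department might count anyone it could plausibly contact again.
    return sum(1 for c in records if not c["in_default"])

def department_b_count(records):
    # Another might count only non-lapsed customers, not in default,
    # whose last purchase was within the past three years.
    cutoff = today - timedelta(days=3 * 365)
    return sum(
        1 for c in records
        if not c["lapsed"] and not c["in_default"] and c["last_purchase"] >= cutoff
    )

print("Department A's customer count:", department_a_count(customers))  # 2
print("Department B's customer count:", department_b_count(customers))  # 1
```

Neither count is wrong; each is right for the definition behind it. The spec just has to say which definition applies.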

According to Whom?
Of course, it's not just software professionals who need to be aware of multiple definitions, but also the users of the resulting analyses and reports. Back when I was an IT manager, the company classified its customers into three main categories. Two of the categories were distinct from each other. Customers in the third category had attributes in common with the other two.

Predictably, certain business departments treated this third category of customers as part of the first category and others treated it as part of the second. Meetings involving representatives from these different departments often turned into shootouts over which department's customer reports had the correct information. On numerous occasions, my department was summarily accused of having created faulty reporting systems.

The truth, as we repeatedly had to remind them, was that both sets of reports were correct. The "discrepancies" merely reflected differences in the way these departments classified the third category of customers. And the way each department classified its customer data was appropriate to its specific function.

As a result, we had to make sure we accurately understood the needs and expectations of both the requester and the ultimate users of the system when defining requirements. And whenever someone asked for a report of some subset of customers, we began by asking, "What do you mean by customer?"
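
A tiny sketch of that situation, with hypothetical categories and made-up figures, shows why both reports could be right at once:

```python
# Hypothetical revenue by customer category; the figures are made up.
revenue = {"A": 100, "B": 60, "C": 25}

# One department folds category C into category A; the other folds it into B.
dept_one_view = {"A": revenue["A"] + revenue["C"], "B": revenue["B"]}
dept_two_view = {"A": revenue["A"], "B": revenue["B"] + revenue["C"]}

print(dept_one_view)  # {'A': 125, 'B': 60}
print(dept_two_view)  # {'A': 100, 'B': 85}
# Both views total 185. Neither report is faulty; they classify C differently.
```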

Luggage In, Luggage Out
Definitional ambiguity is not limited to "customers" of course. In my last StickyMinds article, Picture Perfect, I described an experience in which my luggage and I parted company. I went to Point A; my luggage went to Point B. To me, this was a clear case of lost luggage. But airlines are known for having many different definitions of lost luggage, one being luggage not returned to its owner within some specified number of days (a time period often longer than the duration of the trip for which it's missing).

This definition of lost luggage is valid, but you might see the situation differently if you are the affected "loser." Certainly if you are comparing airlines on their record of lost luggage, you'd want to know how they defined their terms—especially if you suspect that each had skillfully selected a definition that permitted it to look good relative to its competitors.

Most airlines now use terms like "delayed luggage" instead of "lost luggage," permitting a time span that can flexibly range from an hour to forever. Happily, my own luggage has had an excellent record in recent years of going where I go, so I'm less "definitionally" stressed.

One for Oil and Oil for One
We have to remember to stay on guard, because even the most familiar terms can have multiple definitions. I once heard a story about two federal government agencies that generated reports counting the barrels of oil coming into the United States each month. When members of the two agencies met, they discovered that the data in their reports didn't match. Can you guess why?

The first possible reason seemed obvious. Oil is not simply oil; it encompasses numerous grades of crude and refined oil. So they adjusted for this difference and reran the reports. Still, the data didn't match.

Then they remembered that the definition of the United States varies. In certain situations, the US encompasses not just the 50 states, but certain Caribbean islands as well. So they adjusted for the definition and reran the reports. But alas, the data still didn't match.

Hmmm, barrels of oil into the US per month. What else could it be? Month! It seems that one agency tracked barrels of oil based on calendar months. The other, for reasons that undoubtedly made sense for its specialized functions, tracked from the twentieth of each month to the nineteenth of the next.
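
Here is a minimal sketch, using made-up daily figures in Python, of how the two month conventions pull different totals from the same underlying data:

```python
from datetime import date, timedelta

# Hypothetical daily imports (date -> barrels); the figures are made up.
daily_barrels = {date(2024, 3, 15) + timedelta(days=i): 100 for i in range(60)}

def calendar_month_total(data, year, month):
    # First agency: sum everything dated within the calendar month.
    return sum(b for d, b in data.items() if d.year == year and d.month == month)

def offset_month_total(data, year, month):
    # Second agency: its "month" runs from the 20th through the 19th of the next month.
    start = date(year, month, 20)
    if month == 12:
        end = date(year + 1, 1, 19)
    else:
        end = date(year, month + 1, 19)
    return sum(b for d, b in data.items() if start <= d <= end)

print("Calendar April:", calendar_month_total(daily_barrels, 2024, 4))  # 3000
print("Offset 'April':", offset_month_total(daily_barrels, 2024, 4))    # 2400
```

Same data, same label of "April," two defensible totals.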

Adjusting for this difference, they reran their reports yet again. This time their data were (and I'm just quoting here) close enough for government work!

To minimize ambiguity, we need to question definitions and seek clarification. Invariably, it's the most familiar terms—the ones whose meanings seem obvious—that cause the greatest confusion. In developing systems and reviewing reports, it pays to have a questioning mind.
