
Customer experience, or CX, is a term used by many to reference or describe the quality of interactions between companies and their customers. It’s not the best term for that purpose, given that most interactions never rise to the level of a human experience, but c'est la vie.
It’s one of life’s ironies that many who use the CX label to describe their career don’t exhibit a technical understanding of the “x”, and this is more of an issue than it may first seem. A lack of disciplinary language tends to be a gateway problem, opening the door to the errors of practice that follow.
Over many years, though certainly in its most pronounced form through the 2010s, many who found themselves attracted to working with customers have been presented with a very simple, seemingly intuitive, proposition.
To know our customers, and understand our customer base, we must research them.
To be clear, we’re not talking about scientific, empirical, longitudinal and peer reviewed studies by independent research bodies. Rather, the proposition takes in the daily activities of those who generate and analyse customer surveys or similar. The concept has many handles: feedback, voice of customer, insight, experience data, experience management, and more. So too, it has many catchcries about making all this data “actionable”, to be an agent of change, to drive culture, to be “customer obsessed”.
All aboard the train. And make no mistake - this train is packed to its rafters, heaving with very good people, all determined to make a difference.
In 2011, the largest CX association was formed, and it’s no coincidence that it was founded by a survey software analyst. Want to know what your customers think? Want to know how they feel? Just ask them.
The metaphor for the CX movement became ‘researcher’.
In the heart of mid-winter, sheltered at an event in a windswept city, I was approached to speak to a board of directors about the nature of their customer base. At the time, I was standing beside what could well be described as a card-carrying CXer, a truly nice guy from a large financial services company whom I’d met the day before.
I asked when they wanted me, and was told that the board had already met, and were now having drinks in an adjacent room. They knew of me, knew I was in the building, and wondered if I could pop in?
My drinking buddy scoffed! He scolded the young lady, telling her that every organization is different, it would need some structured research, and time to analyse the data to get to the facts, and…
I interrupted him.
“Sure”, I said, “let’s go”. I invited my friend along.
Over a delightful drop of Glenmorangie Nectar, courtesy of the chairman, I walked the board of a company I’d never met through the general patterns and more precise laws that informed their customer base, the sort of managerial and economic implications they represent, and what to be wary of in corporate reporting. Naturally, all from a ten-thousand-foot view. We were in a bar, after all.
And there may have been more than one round.
My friend sat quietly afterwards. He told me that he’d learnt more in 40 minutes than in all his 15 years of reading popular “thought leaders” and paying his association fees. I bought him another drink.
While it certainly seems intuitive that we can just ask customers for insight, we now know that self-reports generate wholly unreliable data. This is an ongoing blind spot for many, and one that no amount of statistical analysis can paper over.
History is full of scientific discoveries that shattered what we had more intuitively believed. It was obvious to all of us that the world was flat.
Using an example that’s closer to our work: intuitively, the level of existing market penetration seems totally unrelated to ongoing loyalty. Yet the empirical evidence shows the two are tightly linked. There are many examples where science challenges what we had allowed ourselves to believe, or what was popular to think. Industrial superstitions included.
But what if, instead of marketing and customer management, I had chosen a career in the legal profession?
Had that board asked me to brief them on, let’s imagine, an aspect of commercial law, I would have done so drawing on extensive formal training and expertise. Or perhaps, had I chosen corporate finance, and they had asked for an overview of best practice in treasury departments, the same result would have ensued.
Neither profession relies on the peculiar idea that simply eliciting client feedback is a proxy for a learned, underlying technical knowledge of the field.
Thank goodness that airline pilots are also immune to such an idea. And the next time you lean back in the dentist’s chair and listen to the drill start up, say a small prayer of thanks that, if not for a reliable education, the person holding that high-pitched drill would not be there - no matter the number of surveys they had sent into the ether.
Of course, in marketing, we do use research all the time.
This is the practical outworking of the ‘market orientation’ principle, and it informs everything else that follows in the marketing method. Category entry points. Segments. Positioning. Targeting. Products. Pricing. All of it.
But that’s outbound marketing, designed for growth, à la an increase in market penetration. While research-service firms or survey software companies might want us to conflate them, marketing and customering have different operational levers and different economic outcomes – even as they share some common science.
That’s not to say that there are no valid use cases for surveys – or the limited application of research - in the customer base. There are, and in the Mini MBA in Customering, I teach them. But while research is primary in marketing, that’s simply not the case in customer management.
The better parallel is that of engineering.
Any engineer can stride into any boardroom, at any company, in any category, and in any geography, without any notice – just as I did - and explain the fundamentals of their profession and their work.
Theirs is a science, with inputs from physics, mathematics, chemistry, design, materials, quality control, and so on. They don’t need to send surveys to understand gravity. It’s learned. Not researched. And when engineers manage a bridge, it doesn’t fall.
In our work, the sciences may be different, but the application of scientific management principles is the same.
I should say that even with all this talk of ‘science’, it’s not that complicated. Give me 12 weeks and you’ll see that the customer world isn’t flat either. In the meantime, the metaphor, if we needed one, for an effective career in customer management is not researcher.
It’s engineer.
April 2026
