Culture Doesn't Train on Yesterday.
I spent 25 years as a marketer, 14 of them as a VP and CMO, working with hefty budgets and teams large and small, and I started before the dot-com boom came along. When website analytics first appeared, the greatest metric was the number of “hits” a website got. We loved it; clients did too. Hey, this was pre-SEO, the days of the Yahoo!, Lycos and AltaVista search engines. And directories. Lots of directories.
But “hits”, it turned out, was a bad metric. Just as “likes” are today. As web analytics evolved, other metrics emerged, and suddenly marketers had more data than ever before in our profession. It was heady stuff. Then, over time, marketing got drunk on data. And it’s getting worse in some ways.
One of the areas where I find it worsening is with “synthetic data” from companies using AI tools like LLMs (ChatGPT, Claude, etc.) and machine learning (ML). Some applications of synthetic data are useful and fairly accurate. But not always, and sometimes the inaccuracies matter in important ways.
A constant struggle for marketers has been proving their worth: showing how marketing outcomes contribute to ROI. And truly, if marketing isn’t contributing to growth and revenues, then it’s not doing a proper job of things.
But you can have too much data. This is where the McNamara Fallacy comes into play: precision ≠ truth; measurement ≠ understanding. And perhaps Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure. The moment you optimise for the metric, you've divorced it from the reality it was meant to represent.
And of course, there’s that age-old issue of analysis paralysis. Been there a few times. But what I find worse, perhaps more insidious as it creeps in ever so quietly, is displacement: data becomes a substitute for your often very good judgement. Data should never be a bureaucratic alibi, either.
Synthetic data, LLMs and ML have their place. But their limitations also need to be clearly understood. What no AI tool can do is predict cultural shifts and emerging trends. They can only be trained on old data. They are predictive tools, assembling things in statistical form based on what has already happened. Humans are weird. We do dumb things and really smart things. We are not predictable at scale, nor at the most individual level.
Also, too much of the analytics side of marketing today is focused on individual psychology, failing to consider culture, the multitude of cultures and how they intermix. Brands used to be culture makers (the good ones sometimes still are). Many, unfortunately, have slipped into being overly obsessed with data and have become, well, beige. No longer distinguishing themselves. In effect, becoming culturally boring and irrelevant.
The world is changing. Becoming more unpredictable. Marketers are hungry for tools to help them predict cultural shifts and emerging trends. Relying on synthetic data will only get you to the average. It can be a baseline. A useful one. And hey, surveys and focus groups aren’t much better either.
Consumers today are changing their buying habits in meaningful and significant ways, seeking trust and identity anchors, greater transparency and connection. The challenge for synthetic data and LLMs like Claude or ChatGPT is that they can’t catch these changes because they rely on lagging data. Useful, yes, but not predictors of cultural shifts.
Too often today, I see marketing teams use third-party data analysis tools, especially those promoting AI, as cover to duck behind and blame when things go sideways. This is unfortunate. There is no oracle in the data tools. They are incredibly useful, but only in context. They are not a silver bullet. Because humans are weird, and we’re in a time of monumental sociocultural shifts.
The challenge is to know when, where and how to apply synthetic data and AI tools, and when to actually take data away. I’ve helped a lot of teams reduce their data and streamline their metrics, both of which can become rather messy, rather fast.