The Language We Use To Talk About AI.
In days gone by, it seems, the phrase we called out during dinners with family or friends when a debate was bouncing around was “I’ll just Google it”. To “Google” something became a verb. Today, that phrase has changed to “I’ll just ask chat” or “I’ll ask the AI”. This marks a profound cultural shift, both in how the average consumer perceives AI (Artificial Intelligence) and in how they talk about it.
We came to trust Google to deliver the links we would go and explore, to check the age of a celebrity or to find the nearest Thai restaurant. But we never thought of Google as a “thing” or an “entity” in and of itself. We are, however, starting to do this with LLMs (Large Language Models) like ChatGPT, Claude and Gemini.
This is a fundamental shift in the adoption of an emergent technology, whether or not that technology can deliver on what’s been promised, and regardless of what LLMs actually are. In a way, this isn’t even about the technology itself. We’re unconsciously reshaping how we think about intelligence, relationships, and what it means to seek answers and find meaning in an uncertain world, with something most people don’t quite understand.
To be clear, there is no singular Artificial Intelligence (AI). It’s an umbrella marketing term, coined in the 1950s, for a suite of different tools: machine learning, computer vision, neural networks, and generative AI (LLMs and video/image creation tools). To say “I’ll ask the AI” is meaningless, but to non-technical people, that’s splitting hairs.
In this article, I’m specifically focusing on one class of AI tools, Large Language Models (LLMs), as they are the tools having the most direct impact on human societies and cultures around the world. It’s also important to note that the most popular tools are made in the West and are imbued with Western views of culture and technology. Not all cultures view LLMs the way the West does, so keep that in mind.
People say “I’ll ask the AI” because they’re applying a form of cognitive shorthand, what anthropologist Claude Lévi-Strauss would call mythical thinking, where “the AI” becomes a totemic figure, a sort of entity that transcends its technical reality as multiple, competing technologies. We create a sort of liminal space where the boundaries between human and non-human agency blur.
We are applying an animistic ontology: a worldview in which non-human entities, from plants to inanimate objects, are seen as having some form of sentience or spiritual essence. This is a fairly common cultural behaviour around the world, such as the Japanese concept of tsukumogami, the view that tools and other inanimate objects can possess some form of spirit.
To some, and a growing number at that, LLMs are becoming our digital oracles, occupying the same mental space once held by shamans, priests and wise elders. Some imbue LLMs with an implicit belief that “the AI” possesses knowledge beyond ordinary human access. It does not.
As anthropologist Margaret Mead pointed out, part of the way humans make sense of new technologies we don’t quite understand yet is through anthropomorphisation. We give technologies human-like aspects, like calling a ship “she” or naming our cars, for example.
Up until the arrival of LLMs, or Generative AI (GAI), we may have anthropomorphised technologies, but we understood them to be tools. They helped us perform tasks. LLMs, however, exhibit what anthropologist Tim Ingold calls “skilled practice”: even though they don’t and can’t actually think, they appear to reason and respond in ways that mirror human cognition.
This has created what I call the “cognitive uncanny valley”, where LLMs are sophisticated enough to trigger our social cognition systems (the parts of our brain that evolved to make us social animals) while at the same time being just alien enough to seem superhuman. Or, in the case of some folks, to seem like the actual voice of a god or spiritual entity, hence the Church of ChatGPT.
When we have knowledge gaps, we like to fill them with familiar patterns. In anthropology, we describe this using a term coined by Clifford Geertz, “thick description”: we create rich, meaningful narratives and stories to explain phenomena we don’t understand. Look up the “cargo cults” of the Pacific islands around World War II for a great example of this in action.
This is a significant reason why people say “the AI”: they are trying to fill the cognitive gaps around a technology they don’t understand. This isn’t wrong, and it shouldn’t be a reason to denigrate people who don’t understand a technology. It’s something we all do, across all human cultures, and it goes back hundreds of thousands of years.
We are even starting to see forms of tribalisation building around various LLM brands. When people say “I’ll ask Claude” or “I’ll ask chat”, they’re taking social positions and signalling allegiance to a preferred LLM they believe to be superior, similar to the years-long Windows vs macOS or Android vs iOS debates. By using these terms, we signal group membership and shared cultural knowledge. This is critical in the emergence of new technologies. Marketing would call this “word of mouth”, or WOM.
I think we are in the next stage of “cyborg anthropology”, where humans and machines are becoming so deeply intertwined that the boundaries between them become culturally meaningless, even though they’re technically distinct. Machines will never “think” like humans because they are not, and cannot become, human.
Another consideration is that people who don’t understand what “AI”, and especially LLMs, are may struggle with emotional intelligence over time. If they become ever more reliant on LLMs that seem more reliable and less demanding than people, the economy of human care and connection becomes disrupted by artificial alternatives.
All of this has profound cultural implications. We’re not just creating new tools; we are evolving new forms of cognition and meaning-making. These shifts, the way people are talking about “AI” in the singular, are likely to define the next phase of human cultural evolution and what it means to be human.