As companies rush to adopt AI, they are discovering an unexpected truth: even the most rational business buyers do not make purely rational decisions. Their subconscious requirements go far beyond the evaluation standards applied to conventional software.
Let me share an anecdote: it is November 2024, and I am sitting in a New York skyscraper, working with a fashion brand on its first AI assistant. The avatar, Nora, is a 25-year-old digital assistant displayed on a six-foot-tall kiosk. She has elegant brown hair, a chic black suit and a charming smile. She says "hi" when she recognizes a customer's face, nods along as they speak, and answers questions about the company's history and latest technology. I came prepared with a standard technical checklist: response accuracy, conversation latency, facial recognition precision…
But my client did not even glance at the checklist. Instead, they asked: "Why doesn't she have her own personality? I asked her about her favorite handbag, and she didn't name one!"
Changing how we evaluate technology
It is striking how quickly we forget that these avatars are not human. While many worry about AI blurring the boundaries between humans and machines, I see a more immediate challenge for businesses: a fundamental change in how we evaluate technology.
When software begins to look and act human, users stop evaluating it as a tool and start judging it as a human being. This phenomenon of judging non-human entities by human standards is anthropomorphism, which has been well studied in human-to-human relationships and is now emerging in the human-AI relationship.
When it comes to purchasing AI products, corporate decisions are not as rational as you might think, because the decision-makers are still human. Research has shown that unconscious perceptions shape most human interactions, and business buyers are no exception.
Thus, companies signing an AI contract are not just entering a "utility contract" for cost reduction or revenue growth; they are also entering an implicit "emotional contract." Often, they do not even realize it themselves.
Creating the perfect "AI baby"?
Although every software product has always had an emotional element, when the product becomes nearly indistinguishable from a real human being, this aspect becomes far more prominent and far more unconscious.
These unconscious reactions shape how your employees and customers engage with AI, and my experience tells me how widespread these responses are: they are deeply human. Consider these four examples and their underlying psychological insights:
When my client in New York asked about Nora's favorite handbag, wanting her to have a personality, he was drawing on social presence theory: treating AI as a social being that must feel present and real.
Another client fixated on his avatar's smile: "The mouth shows too many teeth; it's unsettling." This reaction reflects the uncanny valley effect, where almost-human features provoke discomfort.
Conversely, a visually attractive but less functional AI agent drew praise, thanks to the aesthetic-usability effect: the idea that attractiveness can outweigh performance problems.
Another client, a meticulous business owner, kept delaying the project launch. "We have to make our baby perfect," he said at every meeting. "It must be impeccable before we can show it to the world." This obsession with creating an idealized AI entity suggests the projection of an ideal self onto our AI creations, as if we are building a digital entity that embodies our highest aspirations and standards.
What matters most to your business?
So how can you lead the market by accounting for these hidden emotional contracts, and beat competitors who simply pile one flashy solution on top of another?
The key is to determine what matters most to your business's unique needs. Set up a testing process. This will not only help you identify top priorities but, more importantly, help you deprioritize minor details, no matter how emotionally charged they are. Because the sector is so new, there are almost no ready-made playbooks. But you can be a first mover by establishing your own way of discovering what suits your business best.
For example, the client's question about the avatar's personality was validated through testing with internal users. By contrast, most people could not tell the difference between the versions that the business owner had agonized over for his "perfect baby," which meant we could stop at a "good enough" point.
To recognize these patterns more easily, consider hiring team members or consultants with a background in psychology. The four examples above are not one-offs; they are well-documented psychological effects that also occur in human-to-human interactions.
Your relationship with your technology vendor must also change. They must become a partner who navigates this territory with you. You can hold weekly meetings with them after signing a contract and share your testing takeaways so they can build better products for you. If you do not have the budget, at least set aside extra time to compare products and test with users, allowing these hidden "emotional contracts" to surface.
We are at the forefront of defining how humans and AI interact. The business leaders who thrive will be those who embrace the emotional contract and put processes in place to navigate its ambiguity, helping them win the market.
Joy Liu has led enterprise products at AI startups and cloud and AI initiatives at Microsoft.