
“How Did They Know That?” The Value of Explainability in Search

3 minute read

By Angel Maldonado, CEO of Empathy.co

To scale the benefits of explainability and fully harness its power, transparency must remain at the core of online interactions. Responsible AI is driving advances in search and discovery, but consumers increasingly want to understand the mechanisms behind the content and answers they receive.

This requirement extends far beyond eCommerce search and permeates many digital landscapes, such as suggested content on social media platforms. The enigmatic nature of algorithms and their reliance on opaque data often lead consumers to make negative assumptions about privacy, a reaction rooted in the persistent lack of trust consumers feel in digital spaces.

As AI, large language models (LLMs), and search capabilities continue to evolve at a rapid pace, the importance of explainability in informing users about the processes behind each received result has become paramount.

While notable players in the tech industry like TikTok and Meta have started offering explanations for content recommendations, a substantial shift in mentality across the entire industry regarding transparency and data privacy is overdue. Only then can we fully realise the advantages that explainability carries.

HOW A LACK OF TRANSPARENCY IMPACTS TRUST

The pursuit of improved search and customer experiences is nothing new, with personalisation a key cog in so many of our online experiences. So, whilst AI and the use of LLMs by the likes of Bing and Google outwardly offer something new to the end user, they are really an attempt to answer the same question businesses have been grappling with for some time: “How can we anticipate the needs of our customers?”

In a digital context, we’ve seen customer data used to drive personalisation and enhance customer experiences. However, this reliance on data has meant that far too many businesses have neglected to protect and safeguard customer data and privacy. The resulting lack of transparency around data privacy practices has ultimately undermined consumer trust and confidence, along with any attempts to improve the customer experience. What’s more, this data obsession has meant retailers have overlooked alternative ways to enhance the customer experience that put privacy and consent integrity at the centre.

The use of AI in search hasn’t yet shown that it offers much of an alternative to these previous practices. ChatGPT was temporarily banned in Italy over privacy concerns, and although OpenAI has since updated its privacy practices, significant questions remain around the data used to train these AI algorithms and LLMs. Consequently, there is still a marked lack of transparency around the results that AI-powered search produces, which leads users to question whether they can trust those results, however rich and varied the answers an AI-powered search engine can provide.

This continued reliance on a black box of data also means businesses are unable to correct or combat the biases and blind spots those data points may create. Even if the data used to train AI and LLMs is obtained ethically, without transparency there is significant potential for these systems to misinterpret and pigeonhole users, leading to results that aren’t relevant.

HOW EXPLAINABILITY CAN BUILD CUSTOMER LOYALTY

Businesses that explain the what, why, and how of each recommendation involved in the search process can mitigate the risks of AI implementation. One key consideration in safely implementing AI into the search process is data anonymity. If consumers’ personal information is collected with consent and anonymised within separate data silos, businesses greatly reduce the risk of running into GDPR issues.
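As a rough illustration of what consent-gated, anonymised data collection can look like, the sketch below (written in Python, with hypothetical names rather than any vendor’s actual implementation) records a search event only when consent has been given, and stores a keyed hash of the user identifier instead of the raw identity, keeping behavioural data separate from anything personal.

import hashlib
import hmac
import os
import uuid

# Hypothetical salt kept server-side; rotating it breaks linkability to past events.
ANON_SALT = os.environ.get("ANON_SALT", uuid.uuid4().hex)

def anonymise_user_id(user_id: str) -> str:
    """Return a keyed hash of the user identifier so search events can be
    aggregated without storing the raw identity alongside behavioural data."""
    return hmac.new(ANON_SALT.encode(), user_id.encode(), hashlib.sha256).hexdigest()

def record_search_event(user_id: str, query: str, consent_given: bool) -> dict | None:
    """Capture a search event only when consent has been given, and keep the
    anonymised identifier in a store ('silo') that holds no personal data."""
    if not consent_given:
        return None  # no consent, no data collection
    return {
        "anon_id": anonymise_user_id(user_id),
        "query": query,
    }

print(record_search_event("user-42", "running shoes", consent_given=True))
print(record_search_event("user-43", "running shoes", consent_given=False))

The design choice here is simply that identity and behaviour never sit in the same record, and that nothing is stored at all without consent.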

In recent years, there has been a growing concern about the use of artificial intelligence in the customer experience industry. However, by shifting to ethical AI tools that are human-centred or responsible, businesses can successfully create experiences that eliminate blind spots and ensure that the customer experience is transparent, seamless, and engaging.

Understanding how an AI model reaches its decisions matters to customers: it creates an environment of transparency in which they can trust that the technology is acting in their best interest. Explainability also helps businesses understand their customers better, offering insights into how they interact with products and services.
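To make the “what, why, and how” of each recommendation more concrete, here is a minimal, hypothetical sketch (the names and structure are illustrative only) in which every search result carries its own explanation, so the interface can answer “How did they know that?” for each item shown.

from dataclasses import dataclass, field

@dataclass
class Explanation:
    # The "what, why and how" behind a single result, surfaced to the shopper.
    signal: str        # what drove the result, e.g. "query match" or "trending"
    reason: str        # plain-language explanation shown to the user
    data_used: list[str] = field(default_factory=list)  # which (anonymised) data fed it

@dataclass
class SearchResult:
    product_id: str
    title: str
    explanation: Explanation

result = SearchResult(
    product_id="sku-1234",
    title="Trail running shoes",
    explanation=Explanation(
        signal="query match + popularity",
        reason="Matches your search for 'running shoes' and is popular with shoppers this week",
        data_used=["current query", "aggregated, anonymised sales trends"],
    ),
)

# The explanation travels with the result, so the UI can show shoppers
# why an item was recommended rather than leaving the logic opaque.
print(result.explanation.reason)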

Tailored customer experiences will continue to carry importance in the years to come. However, the pursuit of personalisation and improved customer experiences should never come at the expense of the customer’s data privacy. By being transparent about the reasoning behind the recommendations and results that customers receive, businesses can build trust and create more engaging customer experiences. This in turn will lead to increased customer satisfaction and loyalty, with a positive impact on sales.
