May 2, 2023

The Evolution of AI and Chatbots in Customer Service

Discover how chatbots have evolved from basic programmes to the advanced AI models of today.

Over time, chatbots have redefined our society, never more so than in recent years. But how did they evolve from basic programmes into the advanced AI models of today?

Early History

The story of chatbots begins in 1966, when Joseph Weizenbaum at MIT introduced ELIZA, the first chatbot. It mimicked a psychotherapist, using pattern matching to engage users in conversation. Though ELIZA's understanding of language was superficial, it established the concept that machines could participate in human-like dialogue and gave birth to the term “the ELIZA effect” (the tendency to project human-like traits onto a computer).
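To give a flavour of the technique, here is a minimal sketch of ELIZA-style pattern matching in Python. This is an illustration of the general idea, not Weizenbaum's original script: a handful of hypothetical regex rules reflect the user's own words back as a question, with a generic fallback when nothing matches.

```python
import re

# Each rule pairs a pattern with a response template; the captured text
# from the user's message is substituted into the template. These rules
# are invented for illustration, not taken from the original ELIZA.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"i feel (.*)", re.I), "What makes you feel {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def respond(user_input: str) -> str:
    """Return the first matching reflection, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(match.group(1))
    return "Please, go on."
```

The superficiality is plain to see: the programme has no model of meaning at all, only surface patterns, yet the reflected questions are often enough to keep a conversation going.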

The 1980s and 1990s saw significant strides in natural language processing (NLP) and artificial intelligence. ALICE (Artificial Linguistic Internet Computer Entity), developed in 1995 by Richard Wallace, was one such stride. ALICE utilized an NLP interpretation method known as AIML (Artificial Intelligence Markup Language) to converse more naturally with users.

The Birth of Commercial Chatbots

The early 2000s marked the birth of commercial chatbots. SmarterChild, introduced in 2001 on AIM and MSN Messenger, offered users the ability to access a wide range of information. Though simple, SmarterChild could still help with everyday queries like weather forecasts and movie times, showcasing the potential for chatbots in everyday applications.

2006 demonstrated that AI could do more. IBM began developing Watson, a system that could process and understand natural language, proving chatbot capabilities could go beyond simple predefined scripts. Watson would later prove itself on the TV quiz show “Jeopardy!”, beating two of the show's greatest champions and showcasing its natural language processing capabilities to the world.

Initial Creation of Modern Chatbots

By the early 2010s, messaging platforms like Facebook Messenger, WhatsApp, and WeChat had become popular. This opened new avenues for chatbot integration, and companies began deploying chatbots for a variety of purposes, from customer support to marketing and sales.

Soon after, voice assistants were introduced: Siri was the first (2011), followed by Amazon's Alexa (2014) and Google Assistant (2016). These voice assistants made AI chatbots more accessible through their integration into everyday life. Their development also paved the way for further advancements in AI technology, as the functionality of chatbots extended beyond text to include voice commands and auditory responses.

They weren't the only developments in chatbots at the time, however. Over the same period, Facebook opened an AI research lab in Paris, Google acquired DeepMind and began developing BERT, and the now-famous OpenAI was founded by Sam Altman and Elon Musk, amongst others. These ventures greatly improved chatbots' understanding of context, subtlety, and nuance. Chatbots were now capable of answering complex queries, personalizing responses, and learning from interactions.

In the late 2010s, platforms like Zendesk and Salesforce made it easier to integrate chatbots across channels. This enabled consistent experiences across various platforms and devices, facilitated 24/7 customer support, and personalized customer interactions using data. By automating routine inquiries, chatbots significantly enhanced operational efficiency and reduced costs, all while scaling to meet fluctuating demands without compromising service quality.

Perhaps more notably, these platforms enabled chatbots to proactively engage with customers and smoothly hand off more complex issues to human agents, improving the overall customer experience.

Impact of Covid-19

The COVID-19 pandemic changed the customer experience landscape: as more and more people relied on remote communication, the pace of chatbot development increased drastically.

Startups such as Memora Health launched chatbots to answer basic patient questions and connect patients to the right specialist, all without a hospital visit.

Analytics were better leveraged, enabling chatbots to anticipate customer needs and reply with greater accuracy. Additionally, chatbots evolved to automatically create support tickets for more complex issues, streamlining the resolution process and boosting efficiency.

Start of Ethical Legislative Improvements

The EU's AI Act, first proposed in 2021, marked the start of a new wave of AI legislation. This move aimed to instill robust standards for transparency, safety, and accountability in AI applications, ensuring they align with European values and fundamental rights. The act sought to nurture a digital environment where chatbots augment customer service without compromising individual autonomy or security.

LLMs and a New Standard of AI

In late 2022 ChatGPT launched, built on GPT-3.5. The following year OpenAI released its API, enabling chatbots more advanced than ever before, with human-like text generation and far greater sophistication. Later in 2023, GPT-4 was released, providing an even more powerful AI model as a base for customer service chatbots.

Despite the release of Google Gemini, Claude, and other AIs based on LLMs (large language models), ChatGPT remains the industry leader.

Now companies such as Algomo are using GPT-4 technology as a base to develop chatbots specialized for customer service. In an effort to improve faster than competitors, some of these new chatbots also use their own LLMs and proprietary technology. Some even have research relationships with institutions to improve their chatbots yet further and become the next chapter in the chatbot history books. Algomo, for example, has research partnerships with NatWest Bank and the universities of Edinburgh and Essex, two leading UK universities in AI development.

With these technological advances chatbots are now capable of customer service feats that even last year most thought impossible.

Interested in finding out more, or getting hands-on access to Algomo?

Book a demo here

Paul Silcock

About Paul Silcock

Paul started his career at Sony in 2010 as a research engineer, designing video compression algorithms for the HEVC standard, before side-stepping into software engineering and consultancy, where he has spent the past decade working for clients in the government and national security sectors.

He has worked for over a dozen clients across several organisations, building large-scale systems, leading multiple engineering teams and developing long-term technical strategy.

Paul holds 17 patents.