The Next Stages of Generative AI

A Guest Blog by IDC Analyst Mary Wardley

August 18, 2023 - 5 min read

 
 
Mary Wardley
Program Vice President in IDC’s Customer Experience group of services
 

First, let me say that I am a big fan of ChatGPT. In a ridiculously short amount of time, this technology has created a community of users across age groups, employment types, industries, and use cases. Everyone can experiment with ChatGPT. It is open, available, accessible, and usable, and that is its allure. Together, the community of ChatGPT users is gaining experience with large language models (LLMs), sharing those experiences, and drawing conclusions about the technology from firsthand experimentation. This is the promise of a technology that has the potential to change everything.

In fact, according to IDC's Consumer Pulse: Money & Shopping Survey conducted in April 2023, end users are, overall, very positive about ChatGPT. In the multiple-response survey, respondents were asked to describe ChatGPT. The leading words were helpful (43%), useful (43%), productive (33%), and cool (21%). Words with negative connotations such as creepy (13%), weird (10%), detrimental (8%), and irresponsible (7%) were cited at a much lower rate.

ChatGPT has given the marketplace permission to engage with artificial intelligence (AI) in a real way. AI is clearly not new. However, workable use cases are needed for it to gain traction in the middle of the bell curve, not just at its bleeding edge. ChatGPT has done that. This technology takes large amounts of data, both structured and unstructured, and produces a reasonable output of human-like prose on a given topic. It is a workable demonstration of a use case with universal appeal. ChatGPT and products like it are the interrogative mechanism that will bring usability to data retrieval. As such, they have the potential to change many environments that handle large amounts of mostly unstructured information that needs to be retrieved and reconstructed in a usable format. It is no wonder that the customer service environment is especially excited by the potential of ChatGPT and LLMs.

ChatGPT and LLMs in general are the gift for which customer service environments have been waiting. Although knowledge bases and FAQs provide an important foundational component for customer service, they are inherently challenged. From a process perspective, knowledge management systems suffer from the sheer size of content preparation requirements. The process of collecting and assimilating the content, drafting appropriate responses, and searching, finding, reviewing, and keeping answers up to date is arduous and time consuming. Customers and agents alike are often frustrated with the information’s quality and timeliness, the speed of receiving it, the lack of personalization, and output that is difficult to read. LLMs can address these issues. Generative AI grounded in trusted knowledge and an organization’s trusted data will be a game changer.
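To make the idea of grounding concrete, here is a minimal sketch of retrieval-grounded answering over a small service knowledge base. The Article records, the keyword-overlap retrieval, and the prompt wording are illustrative assumptions, not any vendor's actual implementation; a production system would typically use vector search and a hosted LLM.

```python
# Minimal sketch: ground a generated answer in an organization's own
# knowledge base by retrieving trusted content and constraining the prompt.
from dataclasses import dataclass


@dataclass
class Article:
    title: str
    body: str


# Hypothetical, hand-written knowledge records for illustration only.
KNOWLEDGE_BASE = [
    Article("Reset your password", "Go to Settings > Security and choose Reset password."),
    Article("Update billing details", "Billing admins can edit cards under Account > Billing."),
]


def retrieve(question: str, k: int = 2) -> list:
    """Naive keyword-overlap retrieval; real systems typically use vector search."""
    q_terms = set(question.lower().split())
    scored = []
    for article in KNOWLEDGE_BASE:
        doc_terms = set((article.title + " " + article.body).lower().split())
        scored.append((len(q_terms & doc_terms), article))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [article for score, article in scored[:k] if score > 0]


def build_grounded_prompt(question: str) -> str:
    """Constrain the model to answer only from retrieved, trusted content."""
    sources = retrieve(question)
    context = "\n".join(f"- {a.title}: {a.body}" for a in sources)
    return (
        "Answer the customer using ONLY the sources below. "
        "If the sources do not contain the answer, say you do not know.\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )


if __name__ == "__main__":
    # The resulting prompt would then be sent to whichever LLM the organization uses.
    print(build_grounded_prompt("How do I reset my password?"))
```

The design choice that matters here is not the retrieval method but the constraint: the model is asked to answer only from the organization's trusted content and to decline when that content is missing, which is what "grounding" means in practice.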

However, we are just at the beginning of this chapter. We now must understand what it is that we have, its implications, and how to apply it safely and with the best outcomes. Already, multiple news stories have highlighted ChatGPT's bad marriage advice, accounts of its “hallucinations,” and output that sounds right but isn't. One of the most significant developments was the stop-processing order issued in Italy over questions about data protection in that country. ChatGPT only resumed operating in Italy at the end of April, after OpenAI (ChatGPT’s creator) announced a set of privacy controls.

 
 

Takeaways for companies

None of this means that LLMs aren't ready or that customer service organizations should stop looking at the technology. Rather, it's an indicator of the work to be done, from a technological standpoint, to ready organizations for its use and to prepare the employees who will use it. Much like preparing to sell a house, there is a lot to do to get ready. Now is the time for organizations to sort out what they need to do in preparation for generative AI.

Companies will need to figure out:

  • What kind of data do we have? When talking about customer data, it becomes extremely important to understand what is going to be made available to the LLM. One example is deciding how to separate personally identifiable data from anonymized data that can be used in a generalized fashion.
  • Who has the data? The power of the LLM is its ability to draw across silos in a way that would be prohibitively time-consuming for a human being. However, the existing content team is not going away. Their job just got bigger and more important.
  • What condition is the data in? Format, completeness, and age are among the elements that need to be considered. Old data isn't necessarily bad, as the content may still be relevant, but it needs to be tagged.
  • Do we need all the data, and which data is relevant? The Achilles' heel of any system is overburdening it with data that is not relevant. LLMs need lots of data, and the market will see a burgeoning supply of training data. The opposite of hallucination in LLMs is grounding: hallucinations are inaccurate outputs, while grounding ties the model's answers back to verified source content. Accurate, relevant data improves grounding, so eliminating data that is no longer relevant is important (see the data-audit sketch after this list).
  • At what cadence does the data change? The updating of LLMs is a new field of study. ChatGPT, for example, is trained on public data only through September 2021. There is also the question of how best to update a company's own models that contain customer data and how to incorporate new data. To do this, organizations will need to work closely with their generative AI supplier.
  • How do we assure customers that we are using their data responsibly and earn their trust? Security and governance will be the watchwords, in addition to ensuring that a human remains involved and active in the review process.
  • Who will work with us to ensure that customers and agents get the correct information and not hallucinations? This point is further down the line, but thinking about the working partnership early should be part of the data considerations.
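To make the data-condition and relevance points above concrete, here is a minimal sketch of the kind of audit step the checklist describes: dropping retired content and tagging aging records before they are handed to an LLM pipeline. The record fields, the two-year staleness threshold, and the audit() helper are illustrative assumptions, not a prescribed method.

```python
# Minimal sketch: audit knowledge records before they feed an LLM pipeline,
# keeping relevant content and tagging older records instead of discarding them.
from datetime import datetime, timedelta

# Illustrative records; field names and values are made up for this example.
RECORDS = [
    {"id": 1, "topic": "returns policy", "updated": "2023-05-01", "status": "active"},
    {"id": 2, "topic": "holiday hours 2019", "updated": "2019-12-01", "status": "retired"},
    {"id": 3, "topic": "warranty terms", "updated": "2021-03-15", "status": "active"},
]

STALE_AFTER = timedelta(days=730)  # flag anything untouched for roughly two years


def audit(records, today):
    """Keep only active records and tag aging ones so they can be reviewed."""
    kept = []
    for rec in records:
        age = today - datetime.fromisoformat(rec["updated"])
        tagged = {**rec, "stale": age > STALE_AFTER}
        # Retired content is dropped; old-but-active content is kept and tagged.
        if tagged["status"] == "active":
            kept.append(tagged)
    return kept


if __name__ == "__main__":
    for rec in audit(RECORDS, datetime(2023, 8, 18)):
        print(rec)
```

In practice, the output of a filtering step like this would feed whatever indexing or model-update process the organization's generative AI supplier supports, which is why the cadence and partnership questions above belong in the same conversation.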

Generative AI is going to be incorporated into customer handling environments and into the broader organization's software architecture in three ways:

  1. Net-new generative AI-based applications, meaning interesting things that we've never seen before
  2. Infrastructure, meaning enhancements that we won't see but will benefit from
  3. Improvements to existing applications, meaning current applications get better, which in turn helps human agents get better

This is only the beginning. For most organizations, there will be a lot of house cleaning that needs to occur before they are truly ready to take on generative AI.

 
 

Message from the Sponsor

Salesforce Service GPT brings together the power of AI + Data + CRM. This means your customer service teams can harness real-time data and AI they can trust to deliver experiences that help you stay ahead of the curve and scale highly personalized service to every customer. From self-service and the contact center to field service, Salesforce brings all of your customer service needs onto one platform. Learn more about the benefits of Service GPT here.
 

About the Author

Mary Wardley
Program Vice President in IDC’s Customer Experience group of services
 

Mary Wardley is a Program Vice President in IDC’s Customer Experience group of services, where she leads the Customer Care and CRM service. Mary currently researches the fields of customer service and contact center applications. As part of the Customer Experience practice, Ms. Wardley's research views the customer interaction environment from both the perspective of the ultimate customer experience and the enterprise applications architecture and services that support it, focusing on 3rd-platform technologies such as AI, IoT, conversational intelligence, and cloud. Her research looks at the evolution that organizations are experiencing as they transform to meet new customer demands and the technology and services they are implementing to achieve those goals. This includes the end customer as well as those delivering the service, specifically the contact center agents and customer service representatives.

In conjunction with research services, Ms. Wardley also contributes to primary technology research projects with software and services providers. The results of this work have appeared as white papers and case studies and have been used for product positioning and planning. Ms. Wardley is frequently asked to speak at domestic and international industry events, including trade, vendor, user, and developer groups, on topics ranging from global market trends such as customer experience to specific market segments such as contact center-as-a-service.
