#66: Evaluating AI’s Impact on the Customer Service Industry
Google will redefine consumer-facing customer service
It seems that every day there’s a new AI headline. Whether it’s the release of a new version of ChatGPT or NVIDIA signing yet another deal to provide semiconductor chips for the AI arms race, there’s always something to keep up with.
And because of all the headlines, it can be difficult to separate noise from signal. But sometimes, a light bulb goes off when you read through a news update.
Last week, Google released a new AI feature called “Let Google call”. The technology will roll out gradually; as Google describes it, “AI will call local stores and make inquiries about the product, then come back to you with a summary of its findings.” Wondering if your local toy store has that KPop Demon Hunters toy that your nephew wants for the holidays? Let Google make the call to find out.
Think about how annoying it is to call a store, be placed on hold, and wait minutes, if not an hour, for a response. Or, if you do get through right away, it’s often because you’re connected with an AI customer service agent that can’t answer with the specificity you need. As a consumer, I find getting in touch with a company, whether by chatbot or phone, an incredibly tedious experience.
Google already has a wealth of knowledge on local businesses and what people think of each establishment through Google reviews. But these reviews get outdated quickly. Thus, “Let Google call” is a major win for the shopper as they’ll receive relevant information without having to do the work to obtain it.
So, why did Google introduce “Let Google call”? On the surface, it lets them acquire hyperlocal, offline data. But really, they’re laying the groundwork to redefine the customer service industry.
More on the hyperlocal, offline data play first. Right now, Reddit is the most cited domain across major foundational models, per Profound’s analysis. That makes sense: Reddit threads contain some of the most niche information on the web, which makes them useful for answering highly specific user questions. Perhaps Google would like to keep adding more niche information to their already vast data sets. Offline phone call transcripts with local businesses could be a differentiated data source at scale.
Think about it: for all the data that exists online, countless interactions occur each day that go uncaptured by large language models. Offline interactions could bring enormous value to training foundational models, and Google is well-positioned to capitalize here.
My hunch is that only Google can pull this off, given the sheer scale of their distribution. In turn, this may allow the Big Tech player to rely less on non-Google properties (like Reddit) when fine-tuning their foundational model, Gemini.
Billions of dollars have flowed into companies developing AI customer service agents. But this investment has primarily gone to B2B use cases, or in other words, building AI customer service teams for companies. Instead of scaling headcount, a business can rely on technology to adopt its brand’s style and voice and respond to customer inquiries. By training an AI agent on customer information and past inquiries to think and respond like the brand, companies get a reasonable facsimile of a person in a call center.
Companies like Decagon, Elise AI, and Sierra AI are some of the major players disrupting the customer service industry. They work primarily with enterprise clients to develop AI agents that handle a company’s voice (phone calls), chat, and email interaction points with their customers.
So, there’s been plenty of investment going into making AI agents better. But making customer service agents more “effective” for businesses only solves part of the problem. If consumers still must pick up the phone, write an email, or fire off a message to a chatbot about a problem they’re experiencing, there’s a huge opportunity sitting on the table for someone to think outside the realm of B2B.
I believe investment has flowed more quickly into automating customer service for businesses versus consumers because you can assume there’s a finite amount of relevant information needed to answer a customer’s questions. Although the space of possible inquiries may seem open-ended, when you view a single company in a silo, what people realistically ask falls into a bounded set: returns, refunds, account access requests, and the like.
Tackling the consumer side (B2C) of customer service automation is tougher, as there is a theoretically infinite number of inquiries across businesses of all types and sizes. It’s difficult for a company without a swath of data and global distribution to start training an AI agent that can represent consumers. And that’s not even to mention the sheer capital needed to turn all this data into insights. In summary, B2C AI customer service is best supported by the foundational layer, which possesses the resources and expertise, while B2B AI customer service, built on niche, proprietary data sets, is a better fit for the application layer.
Yet, you know who has a tremendous amount of data, expertise in AI, and global distribution? Google. In the grand scheme of things, the acquisition of hyperlocal data is an ancillary benefit. I believe Google introduced “Let Google call” as a first step at redefining the consumer side of customer service.
What makes the “Let Google call” product launch even more impressive is how it bridges AI agents across the online and offline worlds. Plenty of businesses barely have a website, so why not have AI do the legwork and reach out to them through their preferred communication channel: the phone?
Google’s new feature is an AI agent, and it’s worth calling out that the idea of an AI agent executing a task on behalf of a consumer isn’t new. OpenAI (Atlas) and Perplexity (Comet) have introduced AI-native web browsers that can execute tasks on a user’s behalf. But their ability to handle customer service tasks is limited: neither can make a phone call, and Atlas can’t interact with a chatbot (yet).
As a consumer, I’m excited by what Google is working on in the customer service arena. Even if Google can’t retrieve relevant information from the business I’m curious about, I’m glad technology wasted its time and not mine. Let AI chat with the inexperienced sales associate who doesn’t know how to look up shoe sizes.
What will be fascinating is when we’ve moved to a world where AI agents act on behalf of both the consumer and the business. Imagine this scenario: you tell Google’s AI Mode that you’re looking to buy a pair of Timberland shoes in size X. Google calls Timberland’s AI agent, which checks inventory counts for that size and style by location, confirms availability, and places a hold on the pair so you can come try the shoes on in store. Or you instruct Google to engage in agentic commerce, and AI Mode makes the purchase on your behalf.
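To make that agent-to-agent hand-off concrete, here’s a minimal sketch of the scenario in Python. Everything in it is hypothetical: the `BusinessAgent` and `ConsumerAgent` classes, their method names, and the toy inventory model are illustrative stand-ins, not any real Google or Timberland API.

```python
from dataclasses import dataclass, field

@dataclass
class BusinessAgent:
    """Hypothetical store-side agent: answers stock queries and places holds."""
    inventory: dict                         # (style, size) -> units in stock
    holds: list = field(default_factory=list)

    def check_stock(self, style: str, size: str) -> bool:
        return self.inventory.get((style, size), 0) > 0

    def place_hold(self, style: str, size: str, customer: str) -> bool:
        # Decrement stock and record the hold only if the item is available.
        if self.check_stock(style, size):
            self.inventory[(style, size)] -= 1
            self.holds.append((style, size, customer))
            return True
        return False

@dataclass
class ConsumerAgent:
    """Hypothetical consumer-side agent acting on a shopper's instruction."""
    customer: str

    def request_hold(self, store: BusinessAgent, style: str, size: str) -> str:
        # The "phone call": ask the store's agent to hold the item.
        if store.place_hold(style, size, self.customer):
            return f"Held {style} size {size} for {self.customer}"
        return f"{style} size {size} is out of stock"

# Shopper asks their agent to hold a pair of boots in size 10.
store = BusinessAgent(inventory={("6-inch boot", "10"): 2})
shopper = ConsumerAgent(customer="Alex")
print(shopper.request_hold(store, "6-inch boot", "10"))
```

The interesting design question this surfaces is negotiation: once both sides are agents, the exchange stops being a one-shot query and becomes a structured conversation over inventory, price, and fulfillment.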
Wrapping this up. Thanks to Google’s new feature release, businesses of all sizes will have to consider how they interact with both humans and AI agents. I find it an especially exciting time to be a consumer, as the ability to instruct an AI agent to deal with customer service represents a clear productivity win. I recommend staying close to what Google rolls out adjacent to “Let Google call”, whether Google Maps/Shopping updates or new API integrations for businesses, as it’ll serve as the signal amongst the noise in the AI news cycle.

