7/22/2023

GPT-3 AI Chat

"Could we be using GPT-3 to automatically respond to people instead of hiring those new customer service agents?"

If that's a question you heard around budget time this year, this article is for you. If not, then perhaps you have a broader interest in AI and machine learning and the ways they could assist in delivering better customer service. GPT-3 has been making news recently, so it's worth taking a look to understand what it is and how it might help.

GPT-3 is a language model - a way for machines to understand what human languages look like. That model can then be used to generate prose (or even code) that seems like it was written by a real person. In simple terms, language models help computers estimate the probability of word sequences. You have a language model, too: what is the missing word in this sentence?

"Why did the _____ cross the road?"

One of the most important differences between GPT-3's generation of tools and earlier machine learning models is that you don't need to train it with high-quality, carefully labeled and structured information. Instead, GPT-3 imbibed an enormous (and broad) amount of public online text and used that to develop its model. That model produces some impressive results across a variety of use cases.

We asked GPT-3 to answer real customer service questions

The potential for customer service usage is clear - could this software read your incoming customer questions and generate accurate, helpful answers? Our own data scientist, Matt Mazur, decided to find out.

The team at OpenAI made GPT-3 available via an API, so Matt signed up. He fed GPT-3 only six examples of real responses from our entirely human (and highly skilled) Help Scout customer service team. From those six responses, GPT-3 did not learn anything about Help Scout or its products; it only looked at the voice, tone, and structure our team used in providing those answers. That's why some of the answers sound real, but don't make much sense.

Next, Matt took some genuine customer questions (different from the six examples) and had GPT-3 generate responses. Please note we have no plans to actually implement GPT-3 at Help Scout - this was a purely experimental exercise.

Here is an example from one of Matt's tests:

How should we understand the answers that GPT-3 is providing? Let's start by being clear on what it is not doing:

- Searching a knowledge base or reading help documents to find the "right" answer.
- Understanding anything about Help Scout the company or its products.
- Judging whether an answer is correct or helpful.

What GPT-3 is doing is predicting, based on what it knows about how the English language works, what the response text is statistically most likely to be.
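The idea that a language model estimates the probability of word sequences can be illustrated with a toy bigram model. This is a minimal sketch for intuition only - the corpus below is made up, and GPT-3 itself is a vastly larger neural network, not a word-counting table:

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for the web-scale text GPT-3 trained on.
corpus = (
    "why did the chicken cross the road . "
    "the chicken saw the road . "
    "why did the dog bark ."
).split()

# Count how often each word follows each preceding word (bigram counts).
following = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    following[prev][word] += 1

def predict_next(prev):
    """Return the most frequent next word and its estimated probability."""
    counts = following[prev]
    word, count = counts.most_common(1)[0]
    return word, count / sum(counts.values())

word, prob = predict_next("the")
print(word, prob)  # prints: chicken 0.4
```

Even this toy model can "answer" the riddle prompt: given "the", it predicts "chicken" because that continuation was most common in its training text - the same statistical principle, at a microscopic scale, that lets GPT-3 produce fluent-sounding replies.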
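The few-shot setup described above - pasting a handful of real responses into a prompt so the model imitates their voice and tone - can be sketched as follows. Matt's actual prompt and the six real responses are not shown in this article, so the example pairs and formatting below are hypothetical stand-ins:

```python
# Hypothetical question/answer pairs standing in for the six real
# Help Scout responses Matt used; the actual examples are not public.
examples = [
    ("How do I reset my password?",
     "Happy to help! You can reset it right from the login page."),
    ("Can I export my data?",
     "Absolutely - head to Settings and choose Export to download a CSV."),
]

def build_few_shot_prompt(examples, new_question):
    """Concatenate example Q/A pairs, then the new question, so the
    model continues the pattern with an answer in the same voice."""
    parts = []
    for question, answer in examples:
        parts.append(f"Question: {question}\nAnswer: {answer}\n")
    parts.append(f"Question: {new_question}\nAnswer:")
    return "\n".join(parts)

prompt = build_few_shot_prompt(examples, "Do you offer a free trial?")
print(prompt)
```

The resulting string would be sent to the GPT-3 completion API, which continues the text after the final "Answer:". Note that nothing in this prompt teaches the model facts about the product - only the pattern and tone of the replies - which is exactly why the generated answers can sound real without being correct.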