ChatGPT is amazing. It’s a revolutionary AI product that enables anyone to ask questions and receive believable, high-quality responses. The release of GPT-3.5 and ChatGPT, and the reaction to them, have been so impressive that they spurred Microsoft to increase its investment in OpenAI by 10 billion dollars. Even Google is rumored to be scrambling to find an alternative, or risk its main product, Google Search, falling behind the times. In fact, Google's DeepMind has a similar product called Sparrow that it may release soon. Unlike ChatGPT, Sparrow provides citations to back up its claims, partly addressing one of the main concerns with ChatGPT today: its tendency to hallucinate facts. With all these groundbreaking advancements in artificial intelligence, it's an especially exciting time for chatbot-building platforms, as the generational leap in AI promises better bots.
In this article we’re going to talk about some of the ways that generative AI will improve chatbots, including making business chatbots more like ChatGPT. ChatGPT, in its current form, is an incredibly useful tool. However, there are many ways that its power, and the power of large language models in general, can and will be utilized for new use cases in the near future.
Botpress will soon have some significant announcements about what we are building that will allow you to create ChatGPT-like bots for your business. One great use case that we'll be providing access to shortly is Knowledge Bases, our chatbot-generating feature that can create a chatbot in a couple of minutes from a short description and a link to your website. This auto-generated chatbot will be immediately available to use and share with others.
Simply put, ChatGPT is a general-purpose chat application offered to the public as a beta. It can answer anything you throw at it by producing the most probable response.
GPT, on the other hand, is a text completion engine, and GPT-3 is its latest version. GPT-3 offers several models: some are faster, others reason better but are slower and more costly. The latest and greatest model is text-davinci-003, and it's the one used in ChatGPT. When using GPT-3 directly instead of ChatGPT, you can adjust variables such as the response length, how likely the model is to repeat verbatim what you gave it, and how often it changes subjects versus staying on track. For GPT beginners, ChatGPT is easier to use because it ships with reasonable defaults for human conversation. GPT-3, on the other hand, can be used to write code or follow patterns, and it has an API that developers can use with ease. For example, it can generate lists of names that can easily be fed into a baby name application.
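To make that concrete, here is a minimal sketch of what a direct GPT-3 call might look like with OpenAI's Python library; the prompt and the specific parameter values are illustrative choices, not recommendations from this article.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder; use your own key

# A minimal text-completion request against text-davinci-003.
# The parameter values below are illustrative only.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Generate a list of ten short, friendly baby names:",
    max_tokens=100,         # caps the response length
    temperature=0.7,        # higher values produce more varied output
    frequency_penalty=0.5,  # discourages repeating the same tokens verbatim
    presence_penalty=0.3,   # nudges the model toward new topics
)

print(response.choices[0].text.strip())
```

With ChatGPT, sensible values for these knobs are chosen for you; with the API, you get to tune them per use case.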
ChatGPT combines a user interface with a highly tuned GPT-3 engine for responding to text. For instance, ChatGPT, unless told otherwise, will favor long and complete answers, such as the ones given in a school assignment. Since the answers are longer, you can often see what looks like a thought process. This is often unnecessary for end-user applications, but great for a demo.
In theory, you only really need GPT-3 to produce amazing content, but in practice ChatGPT makes it easier for everyone, except industry professionals, to use the technology. When researching ChatGPT-style features for building bots, we use GPT-3 directly to have greater control over the process.
It’s easy to see why everyone is so interested in this kind of AI product: it works seamlessly, without any additional effort on the user's part. For most use cases (barring logic-intensive situations), you will more likely than not get a good answer.
Unfortunately, while ChatGPT may “just work” in some cases, there is still some risk involved in creating a bot with unreliable output. Take, for example, the case where someone asked a hotel bot what time breakfast is served and was told that breakfast starts at 7 AM near the reception, when in fact the hotel doesn't offer breakfast at all! Wrong responses like these can hurt a business and quickly discourage business owners and users alike.
Traditionally, bot projects require listing use cases, gathering the information you want your bot to know, and then testing everything to make sure it works. That information-gathering process can be radically simplified and shortened using GPT, removing the greatest pain of bot building; a rough sketch of the idea follows below.
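Here is a hypothetical sketch of what that simplification could look like: GPT-3 drafts candidate question-and-answer pairs from a page of website copy. The prompt wording and the `page_text` excerpt are made up for illustration, and any generated pairs would still need human review.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Hypothetical excerpt of website copy the bot should learn from.
page_text = (
    "Our boutique hotel offers free Wi-Fi, a 24-hour front desk, "
    "and late checkout until 1 PM on request."
)

prompt = (
    "Read the following website text and write three question-and-answer "
    "pairs a customer-support chatbot could use. Only use facts stated in "
    "the text.\n\n"
    f"Text:\n{page_text}\n\nQ&A pairs:"
)

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=300,
    temperature=0.2,  # low temperature keeps the output close to the source text
)

print(response.choices[0].text.strip())
```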
ChatGPT, simply put, has two capabilities that combine to give it awesome predictive power. The first is that it can search or recall any information you provide, as well as anything it learned from data before 2022. The second is that it can generate text that makes sense of that information.
Let's illustrate search with a simple query:
Generation is a little bit more complicated to show. You could argue that the above example is simply searching and not generating anything new. Let's try another recall question:
It's a bit more text, but still nothing explicitly generative. Let's combine the two to force both capabilities to come to light:
Great! Now we've illustrated recall and generation. Notice how human-like the generation is; this is the quality needed for a bot representing a company. Finally, we've also demonstrated something new: the model's failure to do basic geometry and its utter shamelessness in providing an answer anyway.
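The same two capabilities can be probed programmatically. Below is a small, hypothetical sketch that sends a pure recall question and a recall-plus-generation question to GPT-3; the questions are made up for illustration and are not the exact ones shown above.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def ask(prompt: str) -> str:
    """Send a single prompt to text-davinci-003 and return the completion."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=150,
        temperature=0.7,
    )
    return response.choices[0].text.strip()

# Pure recall: the answer already exists in the training data.
print(ask("In what year did construction of the Eiffel Tower finish?"))

# Recall plus generation: the model must combine a known fact with new text.
print(ask(
    "Using the year construction of the Eiffel Tower finished, "
    "write a two-sentence welcome message for visitors to the tower."
))
```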
Our team had been working with GPT-3 long before ChatGPT came out late last year, even releasing a product that leveraged it, among other techniques, to make bots answer accurately from a list of facts. The accuracy of the bot's responses was greatly improved over the current generation of chatbot engines, but it still required users to submit a list of facts for the bot to work.
Since ChatGPT emerged and GPT-3 has been updated to its latest version, we have been working on some new approaches to chatbot building that we will announce shortly. One use case we will demo soon is the ability to reduce bot-building time by leveraging the generative aspect so that the building experience may be as magical as using ChatGPT (it just works!).
In an ideal world, building a bot should be as simple as answering a few questions about what you want, then reviewing the more uncertain facts for accuracy and hitting a go-live button.
There are several issues with leveraging GPT for businesses. Let's explore a few of the common pitfalls with actual GPT-3 outputs below.
Answers that are false positives: this is a common category of answers and unavoidable to a certain extent.
Answers that should be public but weren't made public at all: this could be an omission from marketing to keep things simple, too specific a question, or some fine-print disclaimer that should have been included from the start.
Things that have to be inferred: either through a parent-child relationship, like a product within a store, or by figuring out the best way to help the user when something isn't clearly written.
Things that shouldn't be publicly available (e.g., Walmart Groceries' refund approval process): some things should simply not be exposed, in which case a bot should not guess or try to help the user.
Some things are private, circumstantial, or linked to individuals: yes or no answers are dangerous either way, and saying "I don't know" is not helpful.
All of the above hinge on deciding whether or not a bot should reply at all. This in itself is tricky, as it typically takes a fair amount of planning and decision making to settle on the scope of a chatbot. Commit to too little and your chatbot will feel useless. Try to do everything and it will fail. Chatbot projects require a balancing act to succeed.
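One common mitigation, sketched below, is to constrain the model to a vetted list of facts and instruct it to decline anything outside that scope. The facts, question examples, and refusal wording here are hypothetical, and this approach reduces, but does not eliminate, wrong answers.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Hypothetical vetted facts supplied by the business, not scraped from the web.
facts = [
    "The hotel does not serve breakfast.",
    "Check-in starts at 3 PM and checkout is at 11 AM.",
    "Pets are not allowed.",
]

def answer(question: str) -> str:
    """Answer only from the vetted facts; otherwise decline."""
    prompt = (
        "You are a hotel support bot. Answer the question using ONLY the "
        "facts below. If the facts do not contain the answer, reply exactly: "
        "'I'm not sure, let me connect you with our staff.'\n\n"
        "Facts:\n- " + "\n- ".join(facts) + "\n\n"
        f"Question: {question}\nAnswer:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=100,
        temperature=0.0,  # deterministic output for support answers
    )
    return response.choices[0].text.strip()

print(answer("What time is breakfast served?"))
print(answer("Can I get a discount on my stay?"))
```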
Our goal at Botpress is to create tools that overcome these challenges, so that you can harness most of the power of GPT while avoiding these pitfalls.
Large language models like GPT-3 will feed into services such as Google Search and Bing; the technology is an existential threat to these services, which is why their owners are investing heavily in making search more like ChatGPT. The challenges are not purely technological. Google needs to rethink its business model, since interacting with a chatbot may not provide the same opportunities for advertising revenue. Right now, it’s also too expensive to use large language models in this way, but given the speed of innovation, that cost problem is unlikely to last.
These search engines have no choice but to improve their ability to understand and answer natural-language questions, or risk being displaced by new players in the search engine space. The huge investments being made in this technology are great news for bot-building platforms, as they will be able to leverage these search engines (which index everything public) to respond to queries conversationally.
Generating answers becomes tricky when there isn't any information out there about a specific query, and this happens more often than one might think! When developing bots with our research tool, we realized that websites don't present information in a conversational way, which makes extracting it tricky. They often lack important information altogether (e.g., omitting a careers page in favor of a contact email). This means the bot builder has to do the hard work of finding standard ways of solving problems (e.g., how should a bot respond when someone asks for a discount on an e-commerce site?) and reading between the lines (e.g., how should you answer an employment question when there is no careers page?). When a user is told an item they ordered will arrive in two weeks, but the shop has been closed for the winter holidays and hasn't shipped the item, how should the bot respond?
There is an endless list of things bots need to handle that search engines simply cannot predict or do not have access to; this is the direction research will take in order to produce truly fantastic, easy-to-use bot builders that can compete with established products like ChatGPT!
Help us build the future of bots by trying out Knowledge Bases and giving us your feedback. Once we realized how sparse webpages are on so many questions, we saw that our feature had the potential not only to find and surface content from a website, but also to go beyond it and provide details that would otherwise have no place on the site.
To finish off, here's Oxford Languages' definition of the word magic: the power of apparently influencing the course of events by using mysterious or supernatural forces. ChatGPT is certainly mysterious, and while bot builders like Botpress are making new technologies easier to use, we'll leave it to the reader to use the applications to influence the course of events!