Chatbots and Fast Food Ordering
In this post, we return to the future of chatbots, this time looking at how a chatbot would handle fast food ordering.
Bots are all the rage, so let’s have another post about them. I still wonder how quickly users will start using bots. I could see early users becoming frustrated if the chatbots don’t understand what they’re asking.
For example, let’s say McDonald’s allowed you to order via SMS. That would be very cool. What if you connected with their SMS bot and said “2 fries, and 3 big macs, boom”? I’m not sure why you would say “boom”, but you did. How would the chatbot respond to this? It would have to be sophisticated enough to parse the text and work out an appropriate reply.
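To make that concrete, here is a minimal sketch of how a bot might split that raw message into (quantity, item) parts. The menu set and the comma-separated message format are assumptions for illustration, not how any real ordering bot works:

```python
import re

# Tiny assumed menu; a real bot would have a full catalog.
MENU_ITEMS = {"fries", "big mac", "big macs"}

def parse_order(message):
    """Split an order like '2 fries, and 3 big macs, boom' into parts."""
    parts = re.split(r",\s*(?:and\s+)?", message.lower())
    recognized, unrecognized = [], []
    for part in parts:
        match = re.match(r"(\d+)\s+(.+)", part.strip())
        if match and match.group(2) in MENU_ITEMS:
            recognized.append((int(match.group(1)), match.group(2)))
        else:
            unrecognized.append(part.strip())
    return recognized, unrecognized

items, unknown = parse_order("2 fries, and 3 big macs, boom")
# items → [(2, 'fries'), (3, 'big macs')], unknown → ['boom']
```

Even this toy version already surfaces the core problem: anything that doesn’t fit the expected shape, like “boom”, falls out as unrecognized and the bot has to decide what to do with it.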
Let’s break down this super simple order from the chatbot’s perspective. Call it the bot breakdown.
“2 fries”. The bot should know you want two orders of fries. That’s pretty straightforward. But it should immediately respond, “What size?” If you respond “Large”, then the bot needs to hold the state from the first SMS to the second one. It should now know that you want 2 Large Fries.
Most bot frameworks support holding state, which is great, but it still has to be built into this McDonald’s bot.
“3 big macs.” This is also pretty straightforward. The bot should recognize that the customer wants 3 Big Macs. But now that the bot knows the user also wants 2 Large Fries, the bot should ask if they want the value meal, which would include a drink. So let’s say the bot responds with, “Do you want 1 fry and 1 Big Mac to include a drink, as a value meal?” Now the bot has to understand many different variations that could lead to a potential value meal. Then if the customer says “Sure”, the bot must add the extra cost to the order, but not the full amount, since the drink is part of the value meal.
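A sketch of that upgrade logic: pair up fries and Big Macs, and charge the meal price (which includes the drink) in place of the two items separately. All prices here are made up for illustration:

```python
# Hypothetical prices; a value meal costs a bit more than fries + Big Mac
# alone because it adds a drink, but less than buying the drink separately.
PRICES = {"big mac": 5.00, "fries": 3.00, "value meal": 9.00}

def price_order(num_big_macs, num_fries, accept_meals):
    """Return (number of value meals, total price) for the order."""
    meals = min(num_big_macs, num_fries) if accept_meals else 0
    total = (meals * PRICES["value meal"]
             + (num_big_macs - meals) * PRICES["big mac"]
             + (num_fries - meals) * PRICES["fries"])
    return meals, round(total, 2)

# 3 Big Macs + 2 fries, customer says "Sure" to the upgrade:
# 2 value meals plus 1 extra Big Mac.
price_order(3, 2, accept_meals=True)   # → (2, 23.0)
price_order(3, 2, accept_meals=False)  # → (0, 21.0)
```

Note the totals: saying “Sure” adds only $2.00 here, the incremental cost of the meals, not the full price of two standalone drinks.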
“Boom.” This is nonsense of course, but what should the chatbot do with it? It could do many things, whatever you want. What could make sense is to address the first two items with appropriate responses, either confirming or asking clarifying questions. Then the chatbot could say “Boom, I’m sorry, I’m not sure what you’re asking”. Then if the user says “Boom shake the room”, the bot could say the same thing. This would keep it pretty simple but not get the chatbot into trouble.
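That fallback strategy is easy to sketch: confirm each part the bot understood, and give one generic reply for each part it didn’t. The reply wording here is just an assumption:

```python
def respond(recognized, unrecognized):
    """Confirm understood items; use one generic reply for the rest."""
    replies = [f"{qty} {item} — got it." for qty, item in recognized]
    replies += [f'"{text}" — I\'m sorry, I\'m not sure what you\'re asking.'
                for text in unrecognized]
    return replies

replies = respond([(2, "fries"), (3, "big macs")], ["boom"])
# → ['2 fries — got it.',
#    '3 big macs — got it.',
#    '"boom" — I\'m sorry, I\'m not sure what you\'re asking.']
```

One generic fallback is less impressive than a clever comeback, but it never embarrasses the brand, which is probably the right trade-off for an early bot.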
Isn’t language complex? It is, which makes it so interesting.
The key to a lot of these bots will be a complex and dynamic language and context engine. This engine will need to understand all the potential questions, in all the different formats that a customer could ask.
For example, with ordering, it could be “Can I have”, “I’d like”, “Can I order”, or just “2 big macs”. And that’s just a start. Over time I imagine there will be bot libraries, say an Ordering library, that take care of some of these issues. But there will always be custom tweaking and ongoing maintenance. And when you launch your chatbot, you will need to carefully analyze the language used by your users and make constant updates to the language/context engine. Over time your engine will be able to answer most questions, but initially it’ll be fairly dumb.
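Here’s a sketch of matching those phrasing variations with a single pattern. Real language/context engines go far beyond regexes, and the list of lead-in phrases is just an assumption, but it shows the shape of the problem:

```python
import re

# Optional lead-in phrase, then a quantity and an item, with optional
# trailing punctuation. Every alternative has to be enumerated by hand.
ORDER_PATTERN = re.compile(
    r"^(?:can i (?:have|order|get)|i'?d like|give me)?\s*(\d+)\s+(.+?)[.?!]?$",
    re.IGNORECASE,
)

def extract_order(text):
    match = ORDER_PATTERN.match(text.strip())
    return (int(match.group(1)), match.group(2).lower()) if match else None

extract_order("Can I have 2 big macs")  # → (2, 'big macs')
extract_order("I'd like 2 big macs?")   # → (2, 'big macs')
extract_order("2 big macs")             # → (2, 'big macs')
```

Every new phrasing a customer invents means another alternative in the pattern, which is exactly why this maintenance never really ends.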
One question is how much time to spend anticipating all the questions your users will ask, and all the ways they might ask them or provide answers. One route would be to spend a ton of time perfecting it. I think there is some merit to this, because you want your chatbot to sound very intelligent. The other route is to do something bare bones and then learn how your users are interacting with it. The chatbot should also learn over time, but you’ll still have to update the engine manually.
That’s all for now. I hope this post made you think a bit.