
From time to time, I build chatbots for my clients based on ChatGPT. I've used Chatfuel, where you can give the model up to 12,000 tokens of information on which the chatbot bases its responses. Meanwhile, Zapier Interfaces has a chatbot that can use a data file of up to 1 MB! Wow!

As we know, the ChatGPT API has no memory. To make it respond based on context or provided knowledge, that knowledge has to be supplied in the prompt before every message.
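For illustration, a minimal sketch of what that means in practice, using the openai Python package (pre-1.0 style); the KNOWLEDGE string and the ask helper are made up for the example, not part of any of the tools mentioned here:

```python
import openai

# Hypothetical client data that the bot should answer from.
KNOWLEDGE = "Opening hours: Mon-Fri 9-17. Returns accepted within 30 days."

history = []  # previous turns, as {"role": ..., "content": ...} dicts

def ask(question: str) -> str:
    history.append({"role": "user", "content": question})
    # The API keeps no state between calls, so the knowledge base and the
    # whole conversation history must be resent with every request.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": f"Answer only from this data:\n{KNOWLEDGE}"},
            *history,
        ],
    )
    answer = response["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    return answer
```

All of that resent text counts against the model's context window, which is exactly where the limits below come from.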

GPT-3.5 Turbo can handle a context of 4K tokens in the cheaper version and 16K in the more expensive one. This means that in the cheaper version the prompt, the knowledge, and the answer together cannot exceed 4,096 tokens.

For simplicity: one token corresponds to about 4 characters of English text, so 4,096 tokens is roughly 16,384 characters - and that covers the prompt with the question plus the model's answer, all together. 16,384 characters weigh about 16 KB in a UTF-8 encoded CSV file.
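If you want to check whether a given knowledge file fits, you can count its tokens directly; a rough sketch assuming the tiktoken package and a hypothetical knowledge.csv file:

```python
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

with open("knowledge.csv", encoding="utf-8") as f:
    text = f.read()

n_tokens = len(enc.encode(text))
n_bytes = len(text.encode("utf-8"))

print(f"{n_bytes} bytes, {n_tokens} tokens, ~{n_bytes / n_tokens:.1f} bytes/token")
# For a 4,096-token context the file, the question, and the answer must all
# fit, so the usable knowledge is in practice well under ~16 KB.
```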

And Zapier allows you to add a data file of up to 1 MB for its chatbot to use - 64 times more than what GPT-3.5 Turbo allows!

I would like to have that option when creating bots for clients, but Chatfuel doesn't allow it, though it has other valuable features, like flow building. And Zapier, for now, is quite limited. What are your experiences? What works for you when building chatbots?