
Hi,

As the other thread is closed, I’m posting here in need of a solution to this:

“If you’re using the ChatGPT integration (as opposed to the OpenAI one) and are seeing prompt responses cut off due to length, it could be due to the time limit Zapier has for returning a response. If it takes longer than 150 seconds for ChatGPT to provide a response, Zapier will post what there is at that point and cut off the rest. If the response is cut off, the field was_cancelled will say true.“
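For anyone running into this, here is a rough sketch of how a follow-up Code by Zapier (Python) step could check that field to detect a truncated response. The key names mapped into input_data are just an example, not the exact output names of the ChatGPT action:

```python
# Rough sketch of a Code by Zapier (Python) step placed after the ChatGPT
# action. It assumes the response text and the was_cancelled field have been
# mapped into input_data; the key names are illustrative.
response_text = input_data.get("response", "")
was_cancelled = input_data.get("was_cancelled", "false").strip().lower() == "true"

# Flag truncated responses so a later Filter or Paths step can branch on them,
# e.g. to retry with a shorter prompt or send a notification.
output = {
    "response": response_text,
    "truncated": "yes" if was_cancelled else "no",
}
```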

Sorry for the double post, but could you add me to the feature request to remove the cutoff for the conversation action of the ChatGPT integration?

Thx in advance!

Hi @cookie 

Good question.

Try using this Zap action: Formatter > Text > Split Text in Chunks for AI Prompt
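
If you’d rather do the splitting yourself in a Code by Zapier (Python) step instead of the Formatter, here’s a rough sketch of the same idea. The 4000-character chunk size is just an example, not a Zapier or OpenAI limit:

```python
# Minimal chunking sketch: split a long text into pieces small enough for the
# ChatGPT action to answer within the time limit. The chunk size is an
# arbitrary example value.
def split_into_chunks(text: str, chunk_size: int = 4000) -> list[str]:
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

long_prompt = "…your long prompt text here…"
for chunk in split_into_chunks(long_prompt):
    # In a Zap, each chunk would be passed to its own ChatGPT action run
    # (for example via looping), rather than printed.
    print(len(chunk))
```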

 


I should have read more carefully: while testing the ChatGPT action, it cuts off the response after 50 seconds, but only while testing. Once the Zap is published, this extends to 150 seconds, which seems to be more than enough. Thx though!!


Glad to hear it, @cookie! 🙂

I’ve added you to the feature request you referenced previously, which will help to increase its priority. This means you’ll get an email notification from us once the cutoff for the conversation (ChatGPT) action is removed. In the meantime, it sounds like you’re all set for now! 🙂

Since we’ve already got another topic in Community discussing that feature request, which isn’t closed, I’m going to close this one out now so we can continue to keep track of interested folks in a single topic thread.

For anyone that would also like to be added to that feature request please reach out on the main topic for it and we’ll get your vote added: