ChatGPT responses are getting trimmed more and more as I test it out with the same prompt.
Is there a way to fix this?
This is the message that shows up at the end of the response:
history_context:
was_trimmed: true
Best answer by Danvers
Hi
I did some digging with our developers and learned more about the history_context.was_trimmed field.
When you use ChatGPT, Zapier stores the responses so they can be used in future prompts/Zaps. Think of the whole conversation as a Google Doc, and there is a fixed-size window that captures what we can send to ChatGPT, like a limited field of view. That window has to include the most recent message, so it's pinned to the "bottom" of the doc. If the window excludes some of the messages at the "top" of the doc (the earliest messages), then was_trimmed is true.
In short: history_context.was_trimmed is true when a Memory Key is in use but we had to drop some of the oldest memorized messages to stay under the token limit.
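To make the windowing idea concrete, here's a minimal sketch of how that kind of trimming could work. This is an illustrative assumption, not Zapier's actual implementation: the function name, the character-based token counter, and the budget are all made up for the example.

```python
# Hypothetical sketch of sliding-window history trimming.
# count_tokens=len counts characters as a stand-in for a real tokenizer.

def trim_history(messages, max_tokens, count_tokens=len):
    """Keep the newest messages whose combined token count fits within
    max_tokens; report whether any older messages were dropped."""
    kept = []
    total = 0
    # Walk newest-to-oldest so the most recent message is always kept,
    # mirroring the window being "pinned to the bottom" of the doc.
    for msg in reversed(messages):
        tokens = count_tokens(msg)
        if kept and total + tokens > max_tokens:
            break  # window is full; everything older gets trimmed
        kept.append(msg)
        total += tokens
    kept.reverse()  # restore oldest-to-newest order
    was_trimmed = len(kept) < len(messages)
    return kept, was_trimmed
```

With a small budget, the oldest messages fall out of the window and the flag flips to true; with a generous budget, nothing is trimmed and it stays false. As the conversation grows under the same Memory Key, more old messages fall outside the window, which is why the flag appears more often over time.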
I hope that's clear, please let us know if you have any questions!