
I'm trying to use the Conversation action in the ChatGPT app with the GPT-4o model, which has a max token limit of 128,000 according to the OpenAI API documentation.

For the max token length in the ChatGPT step of my Zap, I set it to just over 32,000.

However, when I try to run the Zap, I get an error that says:

“This ChatGPT step hit an error.

Error message: [400] max_tokens is too large: 32768. This model supports at most 4096 completion tokens, whereas you provided 32768.”

Curious if anyone knows why:

  1. It’s saying my max_tokens is too large?
  2. It says that “this model supports at most 4096,” which, from my understanding, is not the case, considering the API docs say it can handle up to 128,000.

Am I missing something?

Any insight would be greatly appreciated!! Thanks!

Also might be worth noting that I had this set up and working perfectly fine using the GPT-4 Turbo model and a 16k token limit.

However, once I switched to the GPT-4o model, I started getting the max_tokens error.
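
For reference, here's a rough sketch of what I believe the equivalent request looks like when made directly against the API with the OpenAI Python SDK (the prompt is just a placeholder); it fails with the same 400 error:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Requesting 32,768 completion tokens, the same value I set in my Zap.
# The API rejects this with a 400 "max_tokens is too large" error,
# because max_tokens caps the generated completion, which the model
# limits separately from its 128,000-token context window.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=32768,
)
```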


Hi there @Zyler,

Welcome to the Community! 🎉

I did some digging into this, and it seems like the max token limit in Zapier is 4096. However, we do have an open feature request to increase the max token limit for ChatGPT. 
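
As the error message itself notes, the 4096 figure refers to completion tokens: the 128,000 in the API docs is the model's context window (prompt plus completion combined), while max_tokens only caps the generated completion. Just as an illustration (not an official Zapier example; the model name and prompt are placeholders), a direct API call that keeps max_tokens at or below 4096 would look roughly like this:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Keeping max_tokens at or below 4096 stays within the model's
# completion-token limit, so the request is accepted. The 128,000
# context window still applies to the prompt + completion combined.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize this ticket for me."}],
    max_tokens=4096,
)
print(response.choices[0].message.content)
```

So until that feature request is implemented, keeping the token length in your Zap at 4096 or below should avoid the error.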

I have added your vote to the open feature request. That does a few things:

  • Brings this to the attention of the integration developers
  • Helps track interest in this feature being implemented
  • Allows us to notify you via email if this feature becomes available in the future

While I don't have an ETA on when this feature might be implemented, we will notify you via email if it is!

Hopefully, this helps.


@ken.a 


Thank you for the update, much appreciated!

Just out of curiosity, do you know why the limit for the new model would be set lower than for the previous models?

I’m currently able to use over 16,000 tokens with the GPT-4 Turbo model, so I find it odd that the limit would be lower for the new one.

