
Hi, 

I’m taking a transcription output and using the OpenAI (not ChatGPT) integration to turn the text into an outlined blog article. However, the transcription text exceeds the 4,000-token limit. Is there a workaround? 

For example, could someone show me how to do this via OpenAI’s API (Beta) call action or webhooks? I’m stumped and would really appreciate images, videos, or a very detailed explanation of what you did and how you did it. 

I’ve looked at OpenAI’s API documentation, and it’s difficult to work through and figure out what maps where within the OpenAI API (Beta) call action. 

Any and all help would be appreciated! 

Thanks!

Hi @Tina Lopez 

Good question.

Maybe try using this Zap action: Formatter > Text > Split Text into Chunks for AI Prompts

https://help.zapier.com/hc/en-us/articles/15406374106765-Modify-large-data-for-your-AI-prompts

Then you can follow that with the Looping app: https://zapier.com/apps/looping/help
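If you’d rather handle the chunking in code (for example in a Code step or outside Zapier entirely), here’s a minimal sketch of the same idea in Python: split the transcript into pieces that fit under the token limit, outline each piece, then join the results. The model name, chunk size, file name, and prompts below are assumptions you’d adjust for your own setup, not something dictated by the Zapier action.

```python
# Sketch of the chunk-and-loop approach using the official openai Python SDK (v1.x).
# Assumptions: model name, chunk size, prompts, and "transcription.txt" are placeholders.
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY")  # assumption: API key supplied directly


def split_into_chunks(text: str, max_chars: int = 8000) -> list[str]:
    """Split the transcript on paragraph breaks so each piece stays
    comfortably under the model's token limit (roughly 4 characters per token)."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) > max_chars:
            chunks.append(current)
            current = ""
        current += para + "\n\n"
    if current:
        chunks.append(current)
    return chunks


def outline_chunk(chunk: str) -> str:
    """Ask the model to turn one chunk of the transcript into an outlined blog section."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumption: any chat model your account can access
        messages=[
            {"role": "system", "content": "You turn transcript excerpts into outlined blog sections."},
            {"role": "user", "content": chunk},
        ],
    )
    return response.choices[0].message.content


transcript = open("transcription.txt").read()
sections = [outline_chunk(c) for c in split_into_chunks(transcript)]
blog_outline = "\n\n".join(sections)
print(blog_outline)
```

The Formatter + Looping steps do essentially the same thing inside the Zap: the Formatter splits the text, and Looping runs the OpenAI action once per chunk. You can then add a final step (or one more OpenAI call) to stitch the outlined sections back together.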

 

If you need help with more advanced approaches to configuring Zaps, consider hiring a Certified Zapier Expert: https://zapier.com/experts/automation-ace