
Hey All.

A problem many people have been running into as AI becomes part of more workflows is the token limit of the different LLM models. To help work around this, we’ve released a new beta Text transform for Formatter: Split Text into Chunks for AI Prompt (beta). “Chunk” is AI parlance for a segment of text that fits under your LLM’s token limit.
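For the curious, here’s a rough sketch in Python of the general idea behind chunking. This isn’t Zapier’s actual implementation, and the ~4-characters-per-token heuristic is just a common approximation, not a real tokenizer:

```python
def split_into_chunks(text: str, max_tokens: int = 1000,
                      chars_per_token: int = 4) -> list[str]:
    """Split text on sentence boundaries so each chunk stays under a
    rough token budget (approximated as max_tokens * chars_per_token)."""
    max_chars = max_tokens * chars_per_token
    chunks, current = [], ""
    for sentence in text.replace("\n", " ").split(". "):
        piece = sentence if sentence.endswith(".") else sentence + "."
        # Start a new chunk when adding this sentence would exceed the budget.
        if current and len(current) + 1 + len(piece) > max_chars:
            chunks.append(current)
            current = piece
        else:
            current = (current + " " + piece).strip()
    if current:
        chunks.append(current)
    return chunks

long_text = "This is a filler sentence for the demo. " * 300
chunks = split_into_chunks(long_text, max_tokens=100)
# Every chunk fits the ~400-character budget.
print(all(len(c) <= 400 for c in chunks))
```

Each chunk can then be sent as its own prompt (or prompt segment), which is what the Zap Template wires up for you.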

We have some initial help for this here, and have included a Zap Template that can help get you started. It’s a rather complex workflow that we’d like to make simpler over time.

If you build out a Zap using this transform, we’d love feedback on how it goes.

Kirk

 

The template for this article doesn’t work?

 


@Kirk any chance y’all can resurrect the shared template? I’m getting the same as the last post: https://zapier.com/app/editor/template/1288787 doesn’t work any more.

 


Can you give https://zapier.com/app/editor/template/1288787 another try? This is a bit of a janky setup, as this template isn’t fully published. Sorry about that.

