Can I create a workflow to auto-summarize large articles with ChatGPT?
Howdy folks,
Wondering if anyone has suggestions or a guide for setting up an article-summary workflow. I’ve successfully done this with smaller articles using ChatGPT, but everything I Zap to it results in errors due to the character count. Does anyone have a workflow that can automatically summarize a larger article?
Thanks in advance.
Hi @ZapMorris
Good question.
Perhaps try this Zap action: Formatter > Text > Split Text into Chunks for AI Prompts
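If it helps to picture what that step does, here’s a rough Python sketch of the same idea (the chunk size, function name, and paragraph-based splitting are just illustrative, not how the Formatter step works internally):

```python
# Illustrative only: split a long article into prompt-sized chunks,
# breaking on paragraph boundaries so sentences stay intact.
CHUNK_SIZE = 8000  # characters; keep each chunk well under the model's limit

def split_into_chunks(text: str, chunk_size: int = CHUNK_SIZE) -> list[str]:
    chunks = []
    current = ""
    for paragraph in text.split("\n\n"):
        # Start a new chunk once the current one would exceed the budget.
        # (A single paragraph longer than chunk_size becomes its own oversized chunk.)
        if current and len(current) + len(paragraph) > chunk_size:
            chunks.append(current.strip())
            current = ""
        current += paragraph + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks
```

Each chunk can then go to its own ChatGPT step, and the partial summaries get combined at the end.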
Hi, you have to use the new OpenAI model called gpt-3.5-turbo-16k, which has a 16k-token context window so a much longer article fits in a single prompt.
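Outside of Zapier, a call with that model looks roughly like this (a minimal sketch using the pre-1.0 OpenAI Python SDK; the API key and prompts are placeholders):

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def summarize(article_text: str) -> str:
    # gpt-3.5-turbo-16k accepts a much larger prompt than the default
    # 4k-context gpt-3.5-turbo, so a longer article fits in one request.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-16k",
        messages=[
            {"role": "system",
             "content": "Summarize the article the user provides in a few bullet points."},
            {"role": "user", "content": article_text},
        ],
    )
    return response["choices"][0]["message"]["content"]
```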
@ZapMorris
Also, when using the ChatGPT Conversation action, make use of the Memory Key so each step keeps the context from the previous ones.
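As I understand it, the Memory Key lets the Conversation action remember earlier messages between steps, so each chunk is summarized with the previous chunks in context. Conceptually it’s like carrying the message history forward yourself; here’s a rough sketch of that idea (again using the pre-1.0 OpenAI SDK, with placeholder prompts):

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def summarize_in_chunks(chunks: list[str]) -> str:
    # Carry the full message history forward so each new chunk is seen in the
    # context of everything summarized so far -- roughly what the Memory Key
    # does for the ChatGPT Conversation action between steps.
    messages = [{"role": "system",
                 "content": "You are summarizing a long article one chunk at a time. "
                            "Keep a running summary and update it with each chunk."}]
    summary = ""
    for i, chunk in enumerate(chunks, start=1):
        messages.append({"role": "user", "content": f"Chunk {i}:\n{chunk}"})
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-16k",
            messages=messages,
        )
        summary = response["choices"][0]["message"]["content"]
        messages.append({"role": "assistant", "content": summary})
    return summary  # the final running summary
```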
Has anyone got the 16k token models working? I was advised yesterday by tech support that anything over 4k tokens doesn't work and that it's a known bug. The screenshot above shows that only 10 tokens were used and 3,990 are remaining, i.e. 4k tokens.
Hey there, @luked! Thanks for reaching out in the Community and mentioning this!
I’ve tagged this thread so we’ll also be sure to keep it updated with any news.
In the meantime, I wonder if using GPT-4, the chunking method, and the memory key (as Troy mentioned) could serve as a workaround?
Keep us posted!
@luked Thanks for reporting that the token counter was not working properly for these 16k models.
Our team has made an update so these should show the correct token count and use up the full 16k tokens these models allow.
Thanks for putting in the fix. I can confirm that we've updated the plugin in our workflows and all is now in order. Appreciate the quick turnaround and the update on this thread.
Hi @luked,
Awesome! I’m glad everything is now sorted.
If you have any other questions, please don’t hesitate to reach out in the Community. We’re always happy to help!