Can I create a workflow to auto-summarize large articles with ChatGPT?

  • 15 June 2023
  • 8 replies
  • 318 views

Howdy folks,

Wondering if anyone has suggestions or a guide for setting up an article-summary workflow. I've successfully done this with smaller articles using ChatGPT, but everything I Zap to it now results in errors due to the character count. Does anyone have a workflow that can automatically summarize a larger article?

 

Thanks in advance. 


Best answer by hoon 27 June 2023, 18:42


8 replies

Userlevel 7
Badge +14

Hi @ZapMorris 

Good question.

Perhaps try this Zap action: Formatter > Text > Split Text into Chunks for AI Prompts

https://help.zapier.com/hc/en-us/articles/15406374106765-Modify-large-data-for-your-AI-prompts
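For anyone who wants to prototype the same chunk-and-summarize idea outside Zapier, here's a minimal Python sketch. It's only an illustration: the naive character-based splitter stands in for the Formatter step, the prompt wording is made up, and it assumes the pre-1.0 OpenAI Python SDK with OPENAI_API_KEY set in the environment.

```python
# Minimal chunk-and-summarize sketch. Assumptions: the naive character-based
# splitter (Zapier's Formatter step is smarter about boundaries), the prompt
# wording, and the pre-1.0 OpenAI Python SDK (pip install "openai<1").
import openai  # reads OPENAI_API_KEY from the environment

def split_into_chunks(text, max_chars=8000):
    """Break the article into roughly equal character-sized chunks."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarize(text, model="gpt-3.5-turbo"):
    """Ask the model for a short summary of one piece of text."""
    response = openai.ChatCompletion.create(
        model=model,
        messages=[{"role": "user", "content": f"Summarize this text:\n\n{text}"}],
    )
    return response["choices"][0]["message"]["content"]

def summarize_article(article):
    """Map-reduce: summarize each chunk, then summarize the summaries."""
    chunk_summaries = [summarize(chunk) for chunk in split_into_chunks(article)]
    return summarize("\n\n".join(chunk_summaries))
```

This is the usual map-reduce pattern: each chunk stays under the model's context limit, and a final pass condenses the per-chunk summaries into one.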

Userlevel 1
Badge +1

Hi, you have to use the new OpenAI model called gpt-3.5-turbo-16k.

 

This is the action block where ChatGPT is used. This image shows the token size of the article.

Note that the model selected here is gpt-3.5-turbo-16k.

The result was a summary of the whole article. You can, of course, customize the prompt in your own style to get the results you want.
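For reference, here's roughly what that action does if you call the API directly; a minimal sketch assuming the pre-1.0 OpenAI Python SDK, with the file name and prompt wording as placeholder assumptions.

```python
# One-shot summary of a long article using the 16k-context model.
# Assumptions: pre-1.0 OpenAI Python SDK, OPENAI_API_KEY in the environment,
# and "article.txt" as a placeholder file name.
import openai

article = open("article.txt").read()

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",  # 16,384-token context window
    messages=[
        {"role": "user", "content": f"Summarize the following article:\n\n{article}"},
    ],
)
print(response["choices"][0]["message"]["content"])
```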

 

Userlevel 7
Badge +14

@ZapMorris 

Also, when using the ChatGPT Conversation action, make use of the Memory Key.
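Conceptually, the Memory Key ties separate Zap runs to one ongoing conversation, so earlier messages ride along with each new request. A rough sketch of that idea, where the plain dict is a hypothetical stand-in for whatever Zapier actually persists:

```python
# Rough illustration of what a Memory Key buys you: conversation history
# stored under a key, so each run sends the prior context along.
# The in-memory dict is a hypothetical stand-in for Zapier's own storage.
import openai

conversations = {}  # memory_key -> list of chat messages

def chat_with_memory(memory_key, user_message, model="gpt-3.5-turbo-16k"):
    history = conversations.setdefault(memory_key, [])
    history.append({"role": "user", "content": user_message})
    response = openai.ChatCompletion.create(model=model, messages=history)
    reply = response["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

# Two calls sharing the same key continue one conversation:
chat_with_memory("article-123", "Summarize part 1: ...")
chat_with_memory("article-123", "Now combine that with part 2: ...")
```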

 

Userlevel 1

Has anyone got the 16k-token models working? I was advised yesterday by tech support that anything over 4K tokens doesn't work and that it's a known bug. The screenshot above shows that only 10 tokens were used and 3,990 remain, i.e. a 4K-token limit.
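If you want to check the token math on your own articles before Zapping them in, tiktoken can count tokens; a small sketch (the file name is a placeholder, and 4,096 / 16,384 are the documented context windows for these models):

```python
# Count how many tokens an article uses before sending it to a model.
# Assumptions: "article.txt" is a placeholder; gpt-3.5-turbo and its -16k
# variant share the same tokenizer (cl100k_base).
import tiktoken  # pip install tiktoken

article = open("article.txt").read()
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
n_tokens = len(enc.encode(article))

print(f"{n_tokens} tokens")
print("fits gpt-3.5-turbo (4,096-token context)?    ", n_tokens < 4096)
print("fits gpt-3.5-turbo-16k (16,384-token context)?", n_tokens < 16384)
```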

 

Userlevel 7
Badge +9

Hey there, @luked! Thanks for reaching out in the Community and mentioning this!

I’ve tagged this thread so we’ll also be sure to keep it updated with any news.

In the meantime, I wonder if using GPT-4, the chunking method, and the Memory Key that Troy mentioned could serve as a workaround?

Keep us posted!


@luked Thanks for reporting that the token counter wasn't working properly for these 16k models.

Our team has made an update, so these should now show the correct token count and use the full 16k tokens these models allow.

Userlevel 1

Thanks for putting in the fix. I can confirm that we've updated the plugin in our workflows and all is now in order. Appreciate the quick turnaround and the update on this thread.

Userlevel 7
Badge +6

Hi @luked,

Awesome! I’m glad everything is now sorted.

If you have any other questions, please don’t hesitate to reach out in the Community. We’re always happy to help! 😊
