Question

Improving news aggregation for the life insurance industry using a research assistant

  • February 17, 2026
  • 3 replies
  • 15 views

I created a research assistant to gather news in the life insurance industry. My user message is below. My custom instructions are separate and not included here. Problem: the weekly data seems largely duplicated.  "🟢 Market / Industry (3–4) LIMRA (press releases only) AM Best (Life/Annuity news only) McKinsey Insurance page Deloitte Insurance page  🔴 Medical / Mortality (3–4) SOA research updates Swiss Re Life & Health blog Munich Re Life updates RGA insights  🔵 AI / Tech (2) NIST AI updates Stanford HAI news only  Focus on developments from the past week.  Published Date is in the last 7 days. Provide a comprehensive summary organized by data category with key insights, trends, and actionable findings. Include source links where possible." QUESTION: I use Zapier, ChatGPT, and Gmail. Should I be using an RSS reader? I don't know how to make this work well and don't have a lot of time to spend experimenting.

3 replies

SamB
Community Manager
  • February 18, 2026

Hi there ​@PAH, welcome to the Community! 🎉

Not sure what URLs you're pointing the ChatGPT action towards for its research, but if the data is being duplicated, it could be that the ChatGPT action is reviewing the exact same information each time it runs.

I think you're right: using an RSS reader to pull in the content might work better for what you're trying to do here. You could still have ChatGPT review the RSS items, though; you'd just split the workflow across two Zaps. For example, one Zap would collect the data from multiple RSS feeds and add it to a digest. Then another Zap would trigger on a weekly basis, release the digest, pass the collected information over to ChatGPT to analyze, and email you the summary. That would look something like this:

Zap 1: RSS feed item collection

  • Trigger: New Item in Feed (RSS by Zapier) - fires whenever one of your sources publishes a new post.
  • Action: Append Entry and Schedule Digest (Digest by Zapier) - adds the item's title and link to a running digest.

Zap 2: Weekly digest release, analysis and email

  • Trigger: Every Week (Schedule by Zapier) - set to run at whatever time you prefer the email to be sent e.g. Friday 8 AM.
  • Action: Release Existing Digest (Digest by Zapier) - releases the accumulated items in the digest.
  • Action: Analyze Text (ChatGPT (OpenAI)) - summarizes/analyzes the information in the digest.
  • Action: Send Email (Gmail) - sends you an email with a summary of news from that week.
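To make the "Analyze Text" step concrete, here's a sketch (in Python, purely illustrative, since Zapier handles this without code) of how the digest items could be grouped by category and turned into a summary prompt. The field names (`category`, `title`, `link`) and the function name are assumptions, not Zapier's actual data shape.

```python
def build_prompt(items):
    """Group digest items by category and format a summary prompt for ChatGPT."""
    by_cat = {}
    for item in items:
        by_cat.setdefault(item["category"], []).append(item)

    lines = [
        "Summarize this week's life insurance news by category.",
        "Include key insights, trends, and source links.",
        "",
    ]
    for cat, entries in by_cat.items():
        lines.append(f"## {cat}")
        for e in entries:
            lines.append(f"- {e['title']} ({e['link']})")
        lines.append("")
    return "\n".join(lines)

items = [
    {"category": "Market / Industry", "title": "LIMRA sales update",
     "link": "https://example.com/limra"},
    {"category": "AI / Tech", "title": "NIST AI guidance",
     "link": "https://example.com/nist"},
]
print(build_prompt(items))
```

The point of structuring the prompt this way is that ChatGPT only ever sees the deduplicated items from that week's digest, so the summary can't repeat last week's stories.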


Hope that helps. If you give that a try and get stuck at all, just let me know! 🙂


drtanvisachar
  • Zapier Solution Partner
  • February 20, 2026

Hello ​@PAH 
Yes, RSS will make this way more reliable.

Right now “search + summarize” tends to re-pull the same stories weekly. RSS gives Zapier a clean list of new posts so you can dedupe before ChatGPT.

Simple setup:

  • Pull RSS per source
  • Deduplicate by URL/title using Storage by Zapier
  • Send only new links to ChatGPT
  • Email the digest via Gmail
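The dedupe step above can be sketched in Python to show the logic. In Zapier this would be Storage by Zapier's get/set values plus a Filter step; the dict below is a stand-in for that key-value store, and the item fields and function name are illustrative.

```python
def filter_new_items(items, seen):
    """Return only items whose URL hasn't been seen before, and record them.

    `seen` stands in for Storage by Zapier: a persistent key-value store
    keyed on the item URL (the title would also work as a key).
    """
    new_items = []
    for item in items:
        key = item["link"]
        if key not in seen:
            seen[key] = item["title"]
            new_items.append(item)
    return new_items

seen = {}  # persists across runs in the real workflow
week1 = [{"link": "https://example.com/a", "title": "Story A"},
         {"link": "https://example.com/b", "title": "Story B"}]
week2 = [{"link": "https://example.com/a", "title": "Story A"},  # repeat
         {"link": "https://example.com/c", "title": "Story C"}]

print(len(filter_new_items(week1, seen)))  # 2 new items
print(len(filter_new_items(week2, seen)))  # only 1 is new
```

Because the store persists between runs, a press release that shows up in multiple feeds (or lingers on a page for weeks) only ever reaches ChatGPT once.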

Dr. Tanvi Sachar
Monday Certified Partner, Tuesday Wizard



@PAH Hi! The issue you’re seeing (lots of duplicated content) comes from the way you’re currently pulling data. Right now, if your research assistant is just scanning sites via ChatGPT prompts or a Zap from Gmail, it will often fetch the same press releases or updates multiple times, because there’s no structured feed or “seen before” logic.

Using an RSS reader would help a lot. Most of the sources you listed (LIMRA, AM Best, Deloitte, McKinsey, SOA, Swiss Re, Munich Re, RGA, NIST, Stanford HAI) offer RSS or Atom feeds. With RSS:

  • You get only new content, avoiding duplicates.

  • It’s easier to track multiple sources in one place.

  • Zapier can trigger on new RSS items, which you can then send to ChatGPT or Gmail automatically.

A simple workflow could be: new RSS item → Zapier → send to ChatGPT → summarize → email you the weekly digest. You can also filter by date in Zapier to make sure you only get items from the last 7 days.
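The date filter mentioned above amounts to a simple cutoff check. In Zapier you'd do this with a Filter step comparing the feed item's publish date; here's a minimal Python sketch of the same logic, with illustrative field and function names.

```python
from datetime import datetime, timedelta, timezone

def last_7_days(items, now=None):
    """Keep only items published within the past 7 days."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=7)
    return [item for item in items if item["published"] >= cutoff]

now = datetime(2026, 2, 20, tzinfo=timezone.utc)
items = [
    {"title": "fresh", "published": datetime(2026, 2, 18, tzinfo=timezone.utc)},
    {"title": "stale", "published": datetime(2026, 1, 5, tzinfo=timezone.utc)},
]
print([i["title"] for i in last_7_days(items, now)])  # ['fresh']
```

Combined with URL-based deduplication, this guarantees the weekly email only ever contains new stories from the current week.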