I have been struggling to build a zap for the following use case: 

I have a Notion database that gets updated by marketing stakeholders with campaign information. I want to bring the information from certain columns in this Notion database into Snowflake. 

 

So far, I have been able to create new rows in the Snowflake table from the “New Database Item” trigger in Notion; however, this only brings in one record at a time. I need to retrieve all the existing rows from the Notion database into Snowflake first. Then I want the Zap to check whether there are any new or updated rows in the Notion database: new rows should be added to the Snowflake table, and if existing rows are updated in Notion, the corresponding rows in Snowflake should be updated. 

Is there a way to actually make a zap that can do this? 

 

You could do this in theory, but it would require at least two Zaps - one that triggers when a new record in the Notion DB is created, and another that triggers when an existing record is updated.

For the latter, if you only care about certain properties in the Notion DB being updated, I would suggest using Notion’s database automations feature to send to a webhook only when one of those properties is edited, so that you’re not unnecessarily running your Zap and consuming tasks when other properties are changed.

It would look something like this, where you can specify which properties you want to monitor. Just set up a webhook trigger for your Zap and paste the URL into Notion’s “Send webhook” field. (If you just use the Updated Database Item trigger from Notion, it doesn’t let you pre-filter on specific properties, so it could run a lot more often than you need/want it to.)

Hope this helps, and let me know if you have any questions.

 


Thanks @DennisWF - I’ll try that out. Do you know how I can get ALL the existing records in the Notion database into Snowflake first? That would be the basis before setting up triggers for new or updated database items.


@jcdk25 Annoyingly, if you just export the database records to CSV from the UI, it leaves out some critical data, such as the pageIds of the records. So what I do is run a quick API script to export the records and write them to a CSV.
 

Here’s a sample script in R.
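
This is a minimal sketch rather than the full script, assuming the httr package, a Notion integration token in the NOTION_TOKEN environment variable, and placeholder property names (“Campaign” as a title property, “Status” as a select) that you would swap for your own columns:

```r
library(httr)

notion_token <- Sys.getenv("NOTION_TOKEN")
database_id  <- "your-database-id"   # placeholder: your Notion database ID

# Helper: return NA instead of NULL for missing/empty properties
get_text <- function(x) if (is.null(x)) NA_character_ else x

all_pages <- list()
cursor    <- NULL

# Page through the database query endpoint (max 100 records per request)
repeat {
  body <- list(page_size = 100)
  if (!is.null(cursor)) body$start_cursor <- cursor

  resp <- POST(
    paste0("https://api.notion.com/v1/databases/", database_id, "/query"),
    add_headers(
      Authorization    = paste("Bearer", notion_token),
      `Notion-Version` = "2022-06-28"
    ),
    body   = body,
    encode = "json"
  )
  stop_for_status(resp)
  parsed <- content(resp, as = "parsed")

  all_pages <- c(all_pages, parsed$results)
  if (!isTRUE(parsed$has_more)) break
  cursor <- parsed$next_cursor
}

# Keep the pageId plus whichever properties you need, then write to CSV
rows <- lapply(all_pages, function(p) {
  data.frame(
    pageId   = p$id,
    campaign = get_text(tryCatch(p$properties$Campaign$title[[1]]$plain_text,
                                 error = function(e) NULL)),
    status   = get_text(p$properties$Status$select$name),
    stringsAsFactors = FALSE
  )
})

write.csv(do.call(rbind, rows), "notion_export.csv", row.names = FALSE)
```

From there you can load the CSV into Snowflake however you prefer (for example, stage the file and use COPY INTO). Keeping the pageId column is what lets the update Zap match rows later.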


Hi @jcdk25

 

To get all the existing records into Snowflake first, you can try doing a one-time Transfer. This runs the Zap on all the existing data. Note that it still runs as a Zap, though, so it will consume tasks.