Question

BigQuery Select: Save to a New Google Sheet Daily, Don't Overwrite Data

  • 18 January 2024
  • 9 replies
  • 52 views

I want to run a query on a schedule and save the results to a new Google Sheet in a specified folder every day. I would want the sheet to be named ABC_{DATE_TIME}, where DATE_TIME is the date and time it was created.

When using Google's Connected Sheets, it overwrites the data and is limited to 50K rows, so I am looking for a no- or low-code solution using Zapier.

 

Thanks

 


This post has been closed for comments. Please create a new post if you need help or have a question about this topic.

9 replies

Userlevel 7
Badge +14

Hi @joecohn 

Good question.

Make sure to review the help articles for using GSheets in Zaps to be aware of the record limitations (10K).

https://zapier.com/apps/google-sheets/help

 

You may be able to get around this using the GSheets API via these GSheet actions (see the sketch after this list):

  • Custom Action
  • API Request
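If you go the API Request route, the call you would likely want is the Sheets API values.append endpoint, which adds rows after the existing data instead of overwriting it. A rough sketch of that request, shown here in Python only to make the URL and payload shape concrete (the access token, spreadsheet ID, and row values are placeholders):

import requests

ACCESS_TOKEN = "ya29...."          # placeholder OAuth 2.0 access token
SPREADSHEET_ID = "your-sheet-id"   # placeholder spreadsheet ID

# Append rows after the existing data on Sheet1 (Sheets API v4 values.append).
url = (
    f"https://sheets.googleapis.com/v4/spreadsheets/{SPREADSHEET_ID}"
    "/values/Sheet1!A1:append?valueInputOption=USER_ENTERED"
)
body = {"values": [["order_id", "qty"], ["123", "4"]]}  # rows to append

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
)
resp.raise_for_status()

Appending this way avoids the overwrite behavior, though very large result sets would still need to be sent in batches.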

I may have misspoken a bit. I am OK with saving it as a CSV in Google Drive. The main thing I need is for somewhat larger datasets to be in files accessible outside the database. I want to make sure a new file gets created with each export and that it runs daily.

Any help would be appreciated

 

Userlevel 7
Badge +14

Hi @joecohn 

Can you post screenshots with how your Zap steps are outlined and configured to give us more context?

 

Zap 1

  1. Trigger: Schedule
  2. Action: Run a Query

Zap 2

  1. Trigger: BigQuery - Query Job Completed with Row Data
  2. Action: GSheets

 

 

 

I keep trying to get step 2 to work, but it does not seem to work for me. It doesn't seem to be running my query correctly, and it is a simple SELECT statement.

 

Userlevel 7
Badge +14

@joecohn 

We would need to see screenshots with how your Zap steps are configured to have full context.

Thanks so much for offering to help. Let me try to provide context and whatever else I can.

 

At 10:00 am daily 

Run a query against the BigQuery database (example):

 

SELECT * FROM `my-project.my-data-set.mytable` WHERE my_date_field = CURRENT_DATE('America/New_York')-2

Take the results and save them in a folder called Inbound FBA Details (which I can create in my Drive) in a file called inbound_fba_details{-DATE-TIME}.csv, where DATE-TIME is replaced with the date and time the file was created.

It can be either a CSV file or a Google Sheet.

But sometimes I may have over 100K results. 
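For reference, expressed as a script rather than a Zap, what I am after is roughly the following. This is only a sketch: it assumes the google-cloud-bigquery and google-api-python-client libraries, application-default credentials with BigQuery and Drive scopes, and a placeholder Drive folder ID.

import csv
import datetime

from google.cloud import bigquery
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

FOLDER_ID = "your-drive-folder-id"  # placeholder: ID of the Inbound FBA Details folder

# Run the daily query.
bq = bigquery.Client(project="my-project")
rows = bq.query(
    "SELECT * FROM `my-project.my-data-set.mytable` "
    "WHERE my_date_field = CURRENT_DATE('America/New_York') - 2"
).result()

# Build a timestamped filename so every run creates a new file.
stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M")
filename = f"inbound_fba_details-{stamp}.csv"

# Write the results to a local CSV.
with open(filename, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([field.name for field in rows.schema])
    writer.writerows([list(row.values()) for row in rows])

# Upload the CSV into the target Drive folder (Drive API v3 files.create).
drive = build("drive", "v3")  # uses application-default credentials
drive.files().create(
    body={"name": filename, "parents": [FOLDER_ID]},
    media_body=MediaFileUpload(filename, mimetype="text/csv"),
).execute()

The daily scheduling is the part I was hoping Zapier's Schedule trigger would cover, which is why I would prefer a low-code setup over running a script like this on cron.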

 

 

 

Userlevel 7
Badge +14

@joecohn 

Looks like this will create a Job.

Can you show screenshots of the DATA OUT from testing the step?

 

You may have to add a Delay, then this Zap action: BigQuery - Get Rows for Job Completed

 

Or use a different Zap with this trigger: BigQuery - Query Job Completed (with Row Data)

 

NOTE: Results with 100k rows may exceed the data limits for Zap steps.
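To make the Job point concrete, here is roughly what is happening under the hood. (A sketch using the google-cloud-bigquery Python client, with the project and table names from your example.)

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# "Run a Query" submits an asynchronous job -- it does not return rows directly.
job = client.query(
    "SELECT * FROM `my-project.my-data-set.mytable` "
    "WHERE my_date_field = CURRENT_DATE('America/New_York') - 2"
)

# The rows only exist once the job has finished; job.result() waits for that.
# In Zap terms, that wait is what the Delay + Get Rows for Job Completed step
# (or the Query Job Completed trigger) is standing in for.
rows = list(job.result())
print(f"Job {job.job_id} finished with {len(rows)} rows")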

I am not that familiar with this, as it is my first real foray into Zapier, and I just can't seem to get it working. (This query returns about 4K rows.) I will look into what you suggested.

This keeps happening when I get to test. @Troy Tessalone Thanks for whatever insight you can offer.

 

Userlevel 7
Badge +14

@joecohn 

Guesses (a quick way to check is sketched below):

  • Request is taking too long and timing out.
  • Response is too large for Zapier to handle.
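One quick way to narrow down which of these it is: run the same query outside the Zap and check how long it takes and how much data comes back. A rough sketch (google-cloud-bigquery client, placeholder project name):

import time

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

start = time.time()
job = client.query(
    "SELECT * FROM `my-project.my-data-set.mytable` "
    "WHERE my_date_field = CURRENT_DATE('America/New_York') - 2"
)
result = job.result()

print("elapsed seconds:", round(time.time() - start, 1))
print("rows returned:", result.total_rows)
print("bytes processed:", job.total_bytes_processed)

If the row count or byte count is large, selecting only the columns you need (or adding a LIMIT while testing) is an easy first thing to try.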

 

Perhaps try opening a ticket with Zapier Support for possible guidance in troubleshooting: https://zapier.com/app/history/