Hello Zapier Community,
I’ve been a Zapier user for several years, using automations to connect multiple sources like Unbounce, MyEmma, SurveyMonkey, etc.
I have a new challenge that I’m not sure Zapier is up to, but I thought I would ask the community.
We are working with a third party to create a daily email list that is based on a triggered action within their proprietary platform. At this time they cannot expose their output endpoint API to us and can only auto-generate a daily flat file and place it on a site (Exavault hosted SFTP or Box being our two preferred locations).
My problem is that this flat file will contain hundreds to thousands of emails. One per row.
I would like to perform the following workflow within Zapier if possible.
- Watch for a new file in a consistent location in either SFTP server or Box. Same file name each day.
- When the new file exists, parse each row as a unique zap action.
- Hand off that email to our ESP (MyEmma) via their API (already know how to do this and am currently using.)
- Use MyEmma automation to determine the prospect journey (new - send invite, existing record - do nothing unless a certain number of days have elapsed.)
- When last row has been parsed, delete or rename flat file.
- Go back to listening for next new file
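For context, here’s roughly what steps 1–3 amount to if you strip away the Zapier pieces - a minimal Python sketch, assuming the flat file is plain text with one email address per row. The payload builder is just a stand-in for the Emma /members/add call we already make; the field names are illustrative, not Emma’s actual schema:

```python
# Minimal sketch of the parse-and-hand-off steps, assuming one email per row.
# build_member_payload is a stand-in for the Emma /members/add call we
# already use; check Emma's docs for the exact fields your account expects.

def parse_flat_file(text):
    """Return one email per non-empty row, whitespace stripped."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def build_member_payload(email):
    # Illustrative single-member body (hypothetical field names).
    return {"email": email}

sample = "alice@example.com\nbob@example.com\n\ncarol@example.com\n"
payloads = [build_member_payload(e) for e in parse_flat_file(sample)]
print(len(payloads))  # 3
```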
Steps 3 & 4 are trivial, and we are already doing this with other platforms.
The challenge is parsing that flat file. Since this is a many-to-one action, I’m not sure a Zap workflow can handle it - or whether I need another intermediate task to do the parsing, which would look like an incoming API call to Step 3 and beyond.
Anyone doing something similar?
PS - I’ve already asked the third party if we could just use outbound API calls to send the info that they aggregate and transmit each night - but due to some licensing and security constraints, they are only willing to do the flat file transmission at this time...
Best answer by william
Well - as soon as I posted, I had a minor inspiration. I probably don’t have to parse the list if I can get it output in a particular format.
If I can get the file to be a proper JSON payload, I could just send the entire file contents to the Emma API and bulk add members.
Here is the Emma API entry for this action:
Add new members or update existing members in bulk. If you are doing actions for a single member please see the /members/add call below.
Returns: An import id
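In case it helps anyone following along, this is the transformation I’m picturing - a rough Python sketch. The {"members": [{"email": ...}]} shape is my reading of the bulk members call quoted above, so verify the field names against Emma’s API docs before relying on it:

```python
import json

def flat_file_to_bulk_payload(text):
    """Turn a one-email-per-row flat file into a bulk JSON body.
    The {"members": [{"email": ...}]} shape is an assumption based on
    Emma's bulk members call -- confirm field names against their docs."""
    members = [{"email": line.strip()}
               for line in text.splitlines() if line.strip()]
    return json.dumps({"members": members})

print(flat_file_to_bulk_payload("alice@example.com\nbob@example.com\n"))
```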
Does anyone think I’m headed in the right direction? This still may not be a good Zapier solution - or I may need to hire someone to create that custom API call to Emma, as the off-the-shelf Emma integrations are one-to-one actions.
@flynmoose, hope you’re well!
Zapier isn’t really designed to handle bulk import/export jobs: https://zapier.com/help/create/basics/bulk-import-data-into-zaps
The main concern would be running into our rate limits for triggering zaps: https://zapier.com/help/troubleshoot/behavior/rate-limits-and-throttling-in-zapier
That said, we do have an integration with Box that can trigger when a new file is received. If the file is a CSV, we could use our Formatter’s Import CSV function to import it as line items. From there, we could send those line items to Google Sheets using the Create Spreadsheet Row(s) action.
This would create one row for each item in the CSV.
Finally, we would use a second zap that triggers using the Google Sheets - New Spreadsheet Row trigger. We would have this zap watch the above import sheet. This would allow the zap to trigger once for each new row added to that sheet by the CSV. From there, we can take the row data and send it wherever you need using an action.
This would allow you to use the one-to-one actions in the public Zapier integration for Emma if the actions available fit your needs.
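Conceptually, the Formatter step in that pattern is doing something like this - a Python sketch of CSV text becoming per-column line items (not Zapier’s actual implementation, just the shape of the data that Google Sheets then writes out one row per item):

```python
import csv
import io

def import_csv_as_line_items(csv_text):
    """Approximate what Formatter's Import CSV produces: one list of
    values per column; each index across the lists becomes one sheet row."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = list(reader)
    return {field: [row[field] for row in rows] for field in reader.fieldnames}

items = import_csv_as_line_items("email\nalice@example.com\nbob@example.com\n")
print(items)  # {'email': ['alice@example.com', 'bob@example.com']}
```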
As for deleting the flat file from Box, there currently isn’t a Delete File action available for the Box integration on Zapier.
For more info on importing CSVs into a zap from a file, check out this article: https://zapier.com/help/create/format/import-csv-files-into-zaps
The Import CSV File utility should work for files with hundreds of rows, but thousands of rows are likely to run into our size limit: it only supports importing files that are 150 KB or less in size (around 1,000 rows of a 10-column CSV file). You’ll need to split the CSV file into multiple files if it’s too large.
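If the file does exceed that limit, splitting it ahead of time (e.g. with a small script on whatever system generates or relays the file) is straightforward - a hedged sketch, assuming a plain UTF-8 CSV with a single header row:

```python
def split_csv(csv_text, max_bytes=150_000):
    """Split CSV text into chunks of at most max_bytes each, repeating the
    header row in every chunk. Assumes UTF-8 text with a single header row;
    any single row larger than max_bytes still forms one oversized chunk."""
    lines = csv_text.splitlines(keepends=True)
    header, rows = lines[0], lines[1:]
    chunks, current, size = [], [header], len(header.encode())
    for row in rows:
        row_size = len(row.encode())
        # Flush the current chunk before this row would push it over the cap.
        if size + row_size > max_bytes and len(current) > 1:
            chunks.append("".join(current))
            current, size = [header], len(header.encode())
        current.append(row)
        size += row_size
    chunks.append("".join(current))
    return chunks

text = "email\n" + "".join(f"user{i}@example.com\n" for i in range(5000))
parts = split_csv(text)
print(len(parts))  # number of sub-files, each within the size cap
```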