Question

Failed to create a request in Scrape-It.Cloud Response payload size exceeded maximum allowed payload size (6291556 bytes).

  • 2 January 2024
  • 3 replies
  • 90 views

Hey,

So I was using the Pro free trial version of Zapier from Nov 27 to Dec 27 and had no errors like the one in the title. I activated my Starter membership today at 750 Zaps. As soon as I started running my Zaps, the exact same ones as a month earlier, I started receiving this error.

 

“Failed to create a request in Scrape-It.Cloud Response payload size exceeded maximum allowed payload size (6291556 bytes).” 

 

I am not an expert when it comes to computers, so I wanted to ask whether this is because my membership is now Starter while I had Pro during the trial period, or whether the problem is on Scrape-It.Cloud's side.

 

Finally, does anyone have a solution?

Thank you. 


3 replies


@DarkSyed 

I did some searching on that error and came up with many hits about web scraping with AWS Lambda. AWS Lambda has a 6 MB limit on response payloads, and funnily enough the number in your error is just over that figure. I am wondering if Scrape-It.Cloud is running on AWS servers. Maybe someone will chime in with more info.

See this: https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html

Look under "Function configuration, deployment, and execution."
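If you want to sanity-check whether a page you are scraping actually approaches that limit, something like the rough Python sketch below would do it. The URL is just a placeholder, and it only measures the raw HTML, not whatever extra data Scrape-It.Cloud adds to its API response:

```python
import requests

# Limit quoted in the error message (roughly the documented 6 MB Lambda response cap).
PAYLOAD_LIMIT = 6291556  # bytes

url = "https://example.com/page-you-are-scraping"  # placeholder URL
response = requests.get(url, timeout=30)
size = len(response.content)

print(f"Raw page size: {size} bytes ({size / 1048576:.2f} MiB)")
if size > PAYLOAD_LIMIT:
    print("The raw HTML alone already exceeds the limit in the error.")
else:
    print("The raw HTML fits; any overage likely comes from extra data in the API response.")
```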

Kind regards,

Zap Support

GetUWired

@GetUWired - Thanks for your swift and informative response; it does make sense. 

 

The only issue is that I had previously scraped pages much larger than 6291556 bytes. The page I am currently scraping has 50% less text than pages I scraped before, and there are no images on it either, which I have also scraped without problems.

Could it be a sudden change in Scrape-It.Cloud's internal limits? I was not able to find any updates on their website, and their support team is always online but never responds.

I am wondering whether I should switch scrapers. Based on my experience with them, aside from this error, I would rate them 5 stars for functionality but 0 stars for support.

 

Thanks


@DarkSyed 

Switching scrapers is a good option.  Writing the API call yourself in the platform UI is another, because you could trap the error and send an appropriate message instead of letting the step fail.

It is possible that something changed with Scrape-It.Cloud.  Applications get planned updates to their APIs to improve things and provide requested features, and sometimes the Zapier public integration needs an update to stay in sync.
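If you do go the manual-request route, a Code step (or any script outside Zapier) could call the scraping API directly and trap an oversized response before it breaks the Zap. The sketch below is only an illustration: the endpoint, header name, and request body are assumptions modelled on a typical REST scraping API, not confirmed Scrape-It.Cloud details, so check their documentation before relying on it.

```python
import requests

API_KEY = "YOUR_API_KEY"                        # placeholder
API_URL = "https://api.scrape-it.cloud/scrape"  # assumed endpoint; confirm in their docs
MAX_BYTES = 6_000_000                           # stay safely under the ~6 MB limit from the error

def scrape(url: str) -> dict:
    """Call the scraping API and trap oversized or failed responses."""
    try:
        resp = requests.post(
            API_URL,
            headers={"x-api-key": API_KEY},  # assumed header name
            json={"url": url},
            timeout=60,
        )
        resp.raise_for_status()
    except requests.RequestException as exc:
        return {"ok": False, "message": f"Request failed: {exc}"}

    if len(resp.content) > MAX_BYTES:
        # Trap the condition here instead of letting a later step blow up on it.
        return {"ok": False, "message": f"Response too large ({len(resp.content)} bytes), skipped."}

    return {"ok": True, "message": "Scrape succeeded.", "data": resp.text}

result = scrape("https://example.com/page-to-scrape")  # placeholder URL
print(result["message"])
```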

Kind regards,

Zap Support

GetUWired
