This is correct. The steps provided by your integration get about 30 seconds to complete all requests and processing.
https://platform.zapier.com/docs/constraints
Zaps are built around workflows with fairly fine-grained, single-entity events and messages, rather than large batch or ETL processing.
You’re right to look closely at the runtime profile of your transaction and make sure it fits reliably and comfortably within the processing window.
If you could share a bit more about your use case, I’ll bet we could reorganize that transaction in a way that fits how users define and use Zaps. What’s a typical problem a user would be building a Zap to solve? What app and event would they use as a trigger? If your action returned thousands of records, what would the user want to do with those records in their Zap?
For instance, what do the 10,000 items represent? Would your users want to perform an action on each of those items? Might a user instead want to trigger on those items as they are created or updated, rather than processing them in a batch?
You mention an action returning bulk data. Create actions are generally used to mutate data, creating or updating a resource. Search actions are expected to return a single, unambiguous result. Triggers are the elements expected to yield large sets of objects, and each of those objects is then handled by a separate Zap execution.
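To make that concrete, here’s a minimal sketch of a polling trigger in the Platform CLI (TypeScript). The endpoint, query params, and field names are placeholders for illustration, not your API:

```typescript
// Hypothetical polling trigger. Zapier dedupes on `id` and runs the Zap
// once per new object, so each poll only needs a recent page of results,
// not the whole data set.
import { Bundle, ZObject } from 'zapier-platform-core';

const perform = async (z: ZObject, bundle: Bundle) => {
  const response = await z.request({
    url: 'https://api.example.com/items', // assumed endpoint
    params: { per_page: 100, order: 'created desc' }, // assumed API params
  });
  // Return an array of objects, each with a unique `id`.
  return response.data;
};

export default {
  key: 'new_item',
  noun: 'Item',
  display: {
    label: 'New Item',
    description: 'Triggers when a new item is created.',
  },
  operation: { perform },
};
```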
A not-uncommon exception is “line items” in orders, where child objects must be handled together with the parent, and maybe your use case is similar. Note, though, that an action has to have been written by the app’s developer to expect a line item array, and it still has to consume and handle that whole set of line items within the 30-second window.
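As a sketch of what that looks like on the developer’s side (again with placeholder endpoint and field keys), a create action can declare a line item group with `children` on an input field:

```typescript
// Hypothetical create action that accepts line items.
import { Bundle, ZObject } from 'zapier-platform-core';

const perform = async (z: ZObject, bundle: Bundle) => {
  // The parent and all of its line items arrive in one bundle and must be
  // sent and processed within the ~30 second window.
  const response = await z.request({
    method: 'POST',
    url: 'https://api.example.com/orders', // assumed endpoint
    body: {
      customer_id: bundle.inputData.customer_id,
      line_items: bundle.inputData.line_items, // array of {sku, quantity}
    },
  });
  return response.data;
};

export default {
  key: 'create_order',
  noun: 'Order',
  display: {
    label: 'Create Order',
    description: 'Creates an order together with its line items.',
  },
  operation: {
    inputFields: [
      { key: 'customer_id', label: 'Customer ID', required: true },
      {
        key: 'line_items',
        label: 'Line Items',
        // `children` declares a line item group: the user maps an array of
        // child objects that are delivered to the action with the parent.
        children: [
          { key: 'sku', label: 'SKU', required: true },
          { key: 'quantity', label: 'Quantity', type: 'integer' },
        ],
      },
    ],
    perform,
  },
};
```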
You might also have a look at Transfer, which allows a user to retrieve data in bulk (from a specially implemented trigger) and process each returned object through a Zap workflow.
https://zapier.com/transfer
https://platform.zapier.com/docs/transfer
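For Transfer, the underlying trigger needs to support pagination so the full data set can be walked page by page. Here’s a rough sketch, assuming the trigger pages with `bundle.meta.page` and advertises `canPaginate`; the endpoint and page size are placeholders:

```typescript
// Sketch of a pagination-capable trigger perform for bulk retrieval.
import { Bundle, ZObject } from 'zapier-platform-core';

const performWithPaging = async (z: ZObject, bundle: Bundle) => {
  const response = await z.request({
    url: 'https://api.example.com/items', // assumed endpoint
    params: {
      page: bundle.meta.page, // Zapier supplies the page number (0-based)
      per_page: 250,          // assumed page size the API accepts
    },
  });
  return response.data;
};

export default {
  key: 'item',
  noun: 'Item',
  display: {
    label: 'New Item',
    description: 'Triggers when an item is created; supports paging for bulk use.',
  },
  operation: {
    canPaginate: true, // advertise paging support
    perform: performWithPaging,
  },
};
```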