I am currently designing a solution that will likely involve uploading (insert/upsert) thousands of records per upload. I am doing some calculations to evaluate how close the solution might come to governor limits, since the upload will fire some triggers directly and indirectly (via roll-up summary fields). Your input will be appreciated.
Use case: uploading (insert/upsert) thousands of records via Data Loader. Not using the Bulk API. Batch size in Data Loader set to 100.
Is each batch a discrete operation/transaction? If so, I believe governor limits apply to each batch/operation rather than to the whole upload. Is that right?
Attribution to: TingoDTS
Possible Suggestion/Solution #1
Yes, that is correct: governor limits apply to each batch, and the batch size is determined by the setting in the Data Loader.
The Data Loader chunks the records from your CSV file into batches of the size you've specified, so the governor limits are applied per batch rather than across the whole upload.
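To make the chunking concrete, here is a minimal Python sketch (hypothetical, not Data Loader's actual source) of how a client-side loader splits a record list into discrete batches, each of which Salesforce processes as its own transaction with its own governor limits:

```python
# Hypothetical illustration of client-side batching, assuming a simple
# list of record dicts; not Data Loader's real implementation.

def chunk(records, batch_size=100):
    """Yield successive batches of at most batch_size records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

records = [{"Name": f"Account {i}"} for i in range(250)]
batches = list(chunk(records, batch_size=100))
# 250 records at batch size 100 -> 3 batches (100, 100, 50),
# i.e. 3 API calls and 3 separate transactions, each with its own limits.
```

Any trigger work (including roll-up recalculation) therefore sees at most one batch's worth of records per transaction.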
Attribution to: techtrekker
Possible Suggestion/Solution #2
Yes, each batch has its own limits and each batch consumes an API call; the batches are independent of one another.
If you hit governor limits, reduce the batch size; this will, of course, increase the total processing time.
You might find this useful for understanding the impact of that: http://limitexception.herod.net/2011/12/15/talend-vs-apex-dataloader-bulk-uploaddownload-benchmarks/
Attribution to: Steven Herod
Possible Suggestion/Solution #3
There is a limit to how far you can increase the batch size, though. The Data Loader uses the Salesforce SOAP API under the hood, and the maximum batch size for a single call is 200 (this is not the Bulk API). If the user sets a batch size greater than 200, the Data Loader internally splits the records into smaller chunks.
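The arithmetic above can be sketched as follows. This is illustrative, not Data Loader internals; the 200-record cap is the documented per-call limit for the (non-Bulk) SOAP API:

```python
# Illustrative calculation: effective batch size and API call count,
# assuming the SOAP API's 200-record-per-call cap.
import math

SOAP_API_MAX_BATCH = 200  # per-call record cap for the SOAP API

def effective_batch_size(user_setting):
    """A user setting above 200 is effectively clamped to 200."""
    return min(user_setting, SOAP_API_MAX_BATCH)

def api_calls_needed(total_records, user_setting):
    """Each chunk is one API call and one transaction."""
    return math.ceil(total_records / effective_batch_size(user_setting))

# 5000 records with a user-set batch size of 500 still go up 200 at a
# time: 25 API calls, each a separate transaction with its own limits.
```

This also shows the trade-off from Suggestion #2: halving the batch size doubles the number of API calls consumed.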
Attribution to: Avidev9
This content is remixed from stackoverflow or stackexchange. Please visit https://salesforce.stackexchange.com/questions/4324