Create CSV file bigger than 15 MB from within Force application (no dataloader, etc.)

I need to export data to XML, Excel, or CSV from Salesforce.com. Checking the current limits, I see only these options and limitations:

  1. Render the file via a Visualforce page with a special MIME type <- LIMIT: 15 MB maximum page size and a cap on how many records can be iterated through in read-only mode.
  2. Render the file via the Blob class in an Apex batch job <- LIMIT: 12 MB heap size in batch mode.

That's far too little for my purpose. Has anyone come up with a better, perhaps undocumented, solution? Please post it here!

I thought of appending to an existing file during an Apex batch job, but the Blob class doesn't have any append-like methods.
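For context, here is a minimal sketch of the batch approach from option 2 above, against a placeholder Account export (the object, fields, and Document folder are my own assumptions). It shows why the heap limit is the ceiling: the CSV has to be held as one growing string and turned into a single Blob at the end.

    // Hypothetical sketch: accumulate CSV text across execute() calls via
    // Database.Stateful. The growing string counts against the 12 MB batch
    // heap limit, and Blob has no append-style method, so the finished file
    // can never exceed that size.
    public class CsvExportBatch implements Database.Batchable<SObject>, Database.Stateful {
        private String csv = 'Id,Name\n';

        public Database.QueryLocator start(Database.BatchableContext bc) {
            return Database.getQueryLocator([SELECT Id, Name FROM Account ORDER BY Id]);
        }

        public void execute(Database.BatchableContext bc, List<SObject> scope) {
            for (Account a : (List<Account>) scope) {
                csv += a.Id + ',' + a.Name.escapeCsv() + '\n';
            }
        }

        public void finish(Database.BatchableContext bc) {
            // Everything must be materialised as one Blob here, which is
            // exactly where the heap limit stops a larger export.
            insert new Document(
                Name = 'Export.csv',
                FolderId = UserInfo.getUserId(), // personal documents folder
                Body = Blob.valueOf(csv),
                ContentType = 'text/csv');
        }
    }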


Attribution to: Robert Sösemann

Possible Suggestion/Solution #1

I think you can do it by using JavaScript Remoting and the HTML5 File API, but that will only work in Chrome. You can also try https://github.com/eligrey/FileSaver.js
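As a rough sketch of what the Apex side could look like (the controller name, object, and chunking scheme are my own assumptions, not a documented pattern), a remote action can hand the browser CSV text in chunks, which client-side JavaScript then concatenates and saves with FileSaver.js's saveAs():

    public with sharing class CsvRemotingController {
        // Hypothetical remote action: returns one chunk of CSV text keyed by
        // the last record Id the browser has already received, so the page
        // can call it repeatedly via JavaScript Remoting and assemble the
        // whole file client-side before saving it with FileSaver.js.
        @RemoteAction
        public static String getCsvChunk(Id lastId, Integer chunkSize) {
            List<Account> page = (lastId == null)
                ? [SELECT Id, Name FROM Account ORDER BY Id LIMIT :chunkSize]
                : [SELECT Id, Name FROM Account WHERE Id > :lastId
                   ORDER BY Id LIMIT :chunkSize];
            List<String> rows = new List<String>();
            for (Account a : page) {
                rows.add(a.Id + ',' + a.Name.escapeCsv());
            }
            return String.join(rows, '\n');
        }
    }

Paging by the last seen Id keeps each request small without relying on OFFSET, so the only per-request cost is the chunk itself.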


Attribution to: SFBlogForce

Possible Suggestion/Solution #2

If your sole objective is to get the Salesforce data out into a CSV, then there are several options outside of Apex. I see you tagged the question as apex, so these options may not suit your requirements.

  1. You could use the Data Loader to export the data. As @eyescream points out in the comments, this can be further automated if required - Using Data Loader from the command line
  2. You could create a Data Export (Setup | Data Management | Data Export). This can be scheduled to run on an interval (weekly or monthly) and creates zip files that you can download with the required CSV data.
  3. You could use the SOAP API to run a SOQL query and then save the results to CSV yourself. There are existing tools that can do this for you if required.

If you need a purely native Apex solution, you could try to implement some form of server-side pagination combined with the @ReadOnly annotation (see Working with Large Sets of Data).

For pagination, you can use a combination of LIMIT and OFFSET. That said, OFFSET still appears to be in developer preview (according to the documentation), and there are notes about the efficiency of offsets into large result sets. The maximum offset of 2,000 rows will limit the usefulness of this approach.
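To make that concrete, here is a hedged sketch of a @ReadOnly remote action that pages with LIMIT and OFFSET (the controller name, object, and fields are my own assumptions); the 2,000-row OFFSET cap is what ultimately bounds it:

    public with sharing class CsvOffsetPagingController {
        // Hypothetical example of @ReadOnly combined with LIMIT/OFFSET paging.
        // @ReadOnly raises the query-row limit for the request but forbids
        // DML; the OFFSET clause itself is capped at 2,000 rows.
        @ReadOnly
        @RemoteAction
        public static String getCsvPage(Integer pageSize, Integer pageNumber) {
            Integer skipRows = pageSize * pageNumber;
            List<String> rows = new List<String>();
            for (Account a : [SELECT Id, Name FROM Account
                              ORDER BY Id
                              LIMIT :pageSize OFFSET :skipRows]) {
                rows.add(a.Id + ',' + a.Name.escapeCsv());
            }
            return String.join(rows, '\n');
        }
    }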

Using a StandardSetController, as covered in How to query more than 50000 records in apex and bind it in vf page in jqgrid and paging?, seems like a good option for paging through large amounts of data, but I'm not sure how you would utilise it to create a CSV.
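For what it's worth, here is a rough anonymous-Apex sketch of paging with a StandardSetController and collecting CSV rows (Account and a 2,000-record page size are my own assumptions; the controller itself is documented to handle at most 10,000 records):

    // Hypothetical sketch: walk every page of a StandardSetController and
    // accumulate CSV rows; how you then deliver the resulting string is the
    // open question noted above.
    ApexPages.StandardSetController ssc = new ApexPages.StandardSetController(
        Database.getQueryLocator([SELECT Id, Name FROM Account ORDER BY Id]));
    ssc.setPageSize(2000);

    List<String> rows = new List<String>{ 'Id,Name' };
    while (true) {
        for (Account a : (List<Account>) ssc.getRecords()) {
            rows.add(a.Id + ',' + a.Name.escapeCsv());
        }
        if (!ssc.getHasNext()) {
            break;
        }
        ssc.next();
    }
    String csv = String.join(rows, '\n');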

If neither of those options works for pagination, you will most likely need to devise your own method to create pages based on the data being exported.

Beyond that I don't think it will be possible to create a CSV of arbitrary size. One way or another the limits will stop you.


Attribution to: Daniel Ballinger

Possible Suggestion/Solution #3

You can use the Bulk API and do it all from the command line with curl (well, almost all):

http://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_code_curl_walkthrough.htm

This is recommended over using the SOAP API, as it handles much larger volumes of data more efficiently.

It'll also export as CSV for you.


Attribution to: Steven Herod

Possible Suggestion/Solution #4

Try out email services:

  1. Create an email service with an inbound email handler class and send an email to its unique service address.

  2. In the email handler class, build the CSV as a Blob, store the document as a ContentVersion (a library file), and send a link to that document as an email to the specified email address.

The heap size limit in Salesforce for email services is 36 MB, and up to 2,000,000 inbound emails can be processed per day.
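A hedged sketch of such a handler (the class name, query, and addressing are my own assumptions, not the author's code):

    // Hypothetical inbound email handler: build the CSV, store it as a
    // ContentVersion (a file in a library), and email a link back to the sender.
    global class CsvExportEmailHandler implements Messaging.InboundEmailHandler {
        global Messaging.InboundEmailResult handleInboundEmail(
                Messaging.InboundEmail email, Messaging.InboundEnvelope envelope) {

            // Build the CSV in memory (placeholder object and fields)
            List<String> rows = new List<String>{ 'Id,Name' };
            for (Account a : [SELECT Id, Name FROM Account ORDER BY Id]) {
                rows.add(a.Id + ',' + a.Name.escapeCsv());
            }

            // Store the CSV as a file
            ContentVersion cv = new ContentVersion(
                Title = 'Export',
                PathOnClient = 'Export.csv',
                VersionData = Blob.valueOf(String.join(rows, '\n')));
            insert cv;
            cv = [SELECT ContentDocumentId FROM ContentVersion WHERE Id = :cv.Id];

            // Email a link to the stored file back to the sender
            Messaging.SingleEmailMessage msg = new Messaging.SingleEmailMessage();
            msg.setToAddresses(new String[] { email.fromAddress });
            msg.setSubject('Your CSV export');
            msg.setPlainTextBody(URL.getSalesforceBaseUrl().toExternalForm()
                + '/' + cv.ContentDocumentId);
            Messaging.sendEmail(new Messaging.SingleEmailMessage[] { msg });

            Messaging.InboundEmailResult result = new Messaging.InboundEmailResult();
            result.success = true;
            return result;
        }
    }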


Attribution to: Sathya
This content is remixed from Stack Overflow or Stack Exchange. Please visit https://salesforce.stackexchange.com/questions/3894
