How can I avoid Sites Limits?


There are limitations on bandwidth and processing time. Has anyone run up against these limits? How do people reduce the likelihood of hitting them?

For the bandwidth limits, I suppose you could host images and other static files (non-JS, of course), especially larger ones, on a non-Salesforce server. While possible, it isn't ideal, because then you have to pay for and maintain something else, and Salesforce is no longer the one-stop shop. (If it isn't in Salesforce, it doesn't exist, right?)

For the processing limits, caching can be used in some circumstances, but what if the pages are highly dynamic and involve a lot of user interaction? I don't want to optimize prematurely (it's the root of all evil, after all).

While I do my best to avoid hitting them, these types of Salesforce limits that affect public-facing websites keep me up at night. ;)

Attribution to: Peter Knolle

Possible Suggestion/Solution #1

There are many best practices to avoid hitting these limits.

For example, static content and images can be cached and delivered by the CDN. Anything that is cached does NOT count against these limits.
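As a minimal sketch of that approach, a Visualforce page exposed through Sites can opt into CDN caching with the standard `cache` and `expires` attributes on `<apex:page>` (the page content here is a placeholder, not from the original answer):

```
<apex:page cache="true" expires="600">
    <!-- cache="true" marks the page as cacheable by the Salesforce CDN;
         expires is the cache duration in seconds (here, 10 minutes).
         Requests served from the cache do not count against Sites limits. -->
    <h1>Cached public page</h1>
</apex:page>
```

Note that cached pages are shared across users, so this only suits content that is not user-specific.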

If you estimate you will come close, be sure to contact support, as they can help with guidance and even adjustments where appropriate.

Attribution to: Scott Jorgensen

Possible Suggestion/Solution #2

You can also set up workflows to alert yourself when you approach the limits. Take a look at the Salesforce documentation: Using Workflow for Sites.

Attribution to: Fitz

Possible Suggestion/Solution #3

I have hit those limits in the past, by the nature of the things we were trying to implement. Luckily they are soft limits: a rep will get in touch to talk about increasing bandwidth for a fee. If you think you're going to get close, hedge your bets by letting the client know they might need to purchase extra capacity if their usage grows over time.

Attribution to: ebt

Possible Suggestion/Solution #4

Also, be wary of file upload features on Sites - larger files will quietly chew up limits.

Attribution to: tompatros

Possible Suggestion/Solution #5

It's pretty easy to cache static resources using Salesforce's CDN, and when those resources are served up by the CDN, they don't count against the Sites limits.

As for dynamic resources, you can buy more page views (IIRC it's $1K/month for each extra 1M page views), but I don't think there's a way to increase the bandwidth and service request time limits.

Attribution to: Rob Cheng

Possible Suggestion/Solution #6

Depending on the amount of traffic being served, Heroku is a viable option for hosting sites that connect to Salesforce data, via either the SOAP or REST APIs.

Heroku supports Java apps, so any investment in Apex controllers on Sites is somewhat portable.
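A minimal sketch of what such a Heroku-hosted Java app might look like, using the JDK's built-in `java.net.http.HttpClient` to query Salesforce data over the REST API. The instance URL, API version, SOQL query, and `SF_ACCESS_TOKEN` environment variable are illustrative assumptions, not details from the original answer:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class SalesforceQuery {

    // Builds the REST query URL for a given instance and SOQL string.
    // The API version is a placeholder; use whatever your org supports.
    static String buildQueryUrl(String instanceUrl, String soql) {
        return instanceUrl + "/services/data/v57.0/query?q="
                + URLEncoder.encode(soql, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical setup: token obtained separately via OAuth.
        String token = System.getenv("SF_ACCESS_TOKEN");
        String url = buildQueryUrl("https://example.my.salesforce.com",
                                   "SELECT Id, Name FROM Account");
        System.out.println(url);

        if (token != null) { // only call out when credentials are present
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Authorization", "Bearer " + token)
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body()); // JSON query result
        }
    }
}
```

Because the app talks to the APIs directly, its page views never pass through Sites, so they don't count against the Sites limits (though API request limits still apply).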

Jeff Douglas has a great example here:

Attribution to: dlog
This content is remixed from stackoverflow or stackexchange.
