In reading this interesting title (via MEAP) on the trendy topic of Big Data, I was reminded of the many times I’ve had to explain the general principles of the Salesforce data storage model and specifically how many records can be stored per user.
In short, each user license gets an allowance, and the edition of Salesforce dictates how big that allowance is. A minimum threshold of 1GB applies regardless of the number of users and the edition. For illustration, an Enterprise Edition (EE) customer with 100 standard user licenses gets 100*20MB = 2GB (the per-user allowance is 20MB for EE and 120MB for Unlimited Edition). If the same customer had only 10 users, they would have 1GB, as 10*20MB = 200MB falls below the minimum.
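The allowance calculation above can be sketched as follows. This is a simplification covering only the two editions mentioned in the text; the function name and structure are my own illustration, not a Salesforce API:

```python
MB = 1024 * 1024
GB = 1024 * MB

# Per-user data storage allowances noted in the text:
# 20MB for Enterprise Edition (EE), 120MB for Unlimited Edition (UE).
PER_USER_MB = {"EE": 20, "UE": 120}

# The 1GB floor applies regardless of user count and edition.
MINIMUM_BYTES = 1 * GB

def storage_allowance_bytes(edition: str, users: int) -> int:
    """Return the org's data storage allowance in bytes."""
    per_user = PER_USER_MB[edition] * MB
    return max(users * per_user, MINIMUM_BYTES)
```

So `storage_allowance_bytes("EE", 100)` yields 2,000MB (approximately 2GB), while `storage_allowance_bytes("EE", 10)` falls back to the 1GB minimum.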
So – how does this equate to physical record storage? The answer is straightforward: in general terms, each record held consumes 2KB regardless of the number of custom fields, so simply divide the allocation by 2KB, as below.
2GB = 2 * 1,073,741,824 bytes = 2,147,483,648 bytes; 2,147,483,648 / 2,048 = 1,048,576, i.e. approximately 1 million records
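The same division can be expressed as a one-line helper (a hypothetical function for illustration, using the general 2KB-per-record figure):

```python
KB = 1024

def record_capacity(allocation_bytes: int, record_kb: int = 2) -> int:
    """Approximate number of records a storage allocation can hold."""
    return allocation_bytes // (record_kb * KB)

two_gb = 2 * 1024 * 1024 * 1024
print(record_capacity(two_gb))  # 1048576, i.e. approximately 1 million records
```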
Note – exceptions to the 2KB per-record rule are Article (4KB), Campaign (8KB), CampaignMember (1KB), PersonAccount (4KB) and EmailMessage (variable). Article Types and Tags each consume 2KB per record. Certain records, such as Products, PriceLists, Chatter Posts and Comments, do not count. *Update* – Contact Roles and Partner Roles also do not count toward data storage.
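These exceptions can be folded into a rough storage estimate with a simple lookup table. This is an illustrative sketch only: the object names are hypothetical keys of my choosing, EmailMessage is omitted because its per-record size is variable, and zero-cost objects are modelled as 0KB:

```python
KB = 1024

# Per-record sizes in KB. 2KB is the general rule; the entries below
# are the exceptions noted in the text. Objects that do not count
# toward data storage (Products, Chatter Posts, Contact Roles, etc.)
# are modelled as 0.
RECORD_KB = {
    "Article": 4,
    "Campaign": 8,
    "CampaignMember": 1,
    "PersonAccount": 4,
    "Product": 0,
    "ContactRole": 0,
    "PartnerRole": 0,
}

def estimated_storage_bytes(record_counts: dict) -> int:
    """Estimate data storage consumed by a mix of record types,
    defaulting unknown objects to the general 2KB rule."""
    return sum(count * RECORD_KB.get(obj, 2) * KB
               for obj, count in record_counts.items())

# e.g. 10,000 Accounts (2KB each) plus 500 Campaigns (8KB each)
print(estimated_storage_bytes({"Account": 10_000, "Campaign": 500}))
```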
This post is not exhaustive, but the maths and noted exceptions should be sufficient to gain an approximation of required versus actual storage. In a world where cloud storage is perceived to be almost a free commodity, the Salesforce model looks restrictive when considered simply in GB terms, particularly in comparison to other cloud services such as iCloud, where 5GB is free.
It always makes sense to implement a fit-for-purpose archiving strategy (off-platform perhaps), balancing business need for data versus storage cost – a further consideration being the performance implications of larger data sets in areas such as reporting and search.