Salesforce to Salesforce – A Short Case Study

First of all, let me be clear on one thing: I’m a big advocate of Salesforce to Salesforce (S2S); for many org-to-org data convergence/integration use cases it is an efficient, cost-effective solution.

Over the last couple of years I’ve had the pleasure of working with a non-profit organisation, via the Salesforce Foundation, on an interesting use case for data integration with Salesforce to Salesforce. I won’t disclose the organisation name or the nature of the data in play; this isn’t directly relevant to the purpose of this post, which is to concisely outline the integration pattern. Hopefully, in considering this short case study, the potential of S2S for multi-org architectures and other record sharing use cases will be clear. S2S isn’t just for sharing Leads and Opportunities!

Case Study Context
The organisation provides a variety of support services, both directly to individuals and also to other charitable organisations. In respect to individuals, external partners/providers are utilised in service delivery.

In this context Salesforce is implemented as a data hub tracking individuals and the services they receive from external providers by location. This aggregation of data enables a 360 degree (or holistic) view to be taken on the support individuals are receiving. The primary challenge in delivering this view has been the implementation of a consistent and controlled aggregation of data across external providers. To address the consistency aspect the organisation developed a managed package containing the required custom objects for the service provider to populate, and advises on Salesforce implementation. To address the data integration challenge, the initial implementation approach employed middleware technology to extract, transform and load data from the multiple provider orgs into the central hub org. For a number of reasons (including cost, complexity, requisite expertise) the middleware approach to data aggregation didn’t provide a sustainable solution. Having in-house Salesforce expertise, the organisation identified Salesforce to Salesforce as a potential option to deliver a simplified data aggregation solution built on native Salesforce functionality.

Solution Outline
[Solution outline diagram]

Technical challenges
To understand the requisite implementation steps for S2S, refer to the help documentation. In summary, objects and fields are published from one org and subscribed to in another in the context of an established partner connection. This configuration takes place within standard functionality using the Connections tab. The partner connection is established through a connection request (via an email verification link) initiated from one of the orgs. Once the partner connection, publications and subscriptions are configured, records can be shared manually, with various UI elements added in both orgs to indicate the external sharing status. Note, the relationship between the two orgs within a partner connection is bi-directional; both orgs can define publications and subscriptions.

Whilst S2S fully automates the synchronisation of records between publications and subscriptions, there are a number of areas where complementary technical customisation can be required.

1. Automated record sharing.
S2S requires records to be shared manually; in many cases it is preferable to automate this via an Apex Trigger. The basic example below demonstrates how record Ids can be inserted into the PartnerNetworkRecordConnection standard object to initiate sharing. Note, the PartnerNetworkConnection standard object holds the connection details.

trigger AccountAfterInsert on Account (after insert) {
  if (!S2S_Configuration_Setting__c.getInstance().External_Sharing_Active__c) return;
  if (S2S_Configuration_Setting__c.getInstance().Org_Connection_Name__c==null) return;

  String connectionId = S2SConnectionHelper.getConnectionId(S2S_Configuration_Setting__c.getInstance().Org_Connection_Name__c);
  if (connectionId==null) return;

  Map<Id,SObject> idToAccount = new Map<Id,SObject>();
  for (Account a : Trigger.new){
    // only share records that did not themselves arrive via a connection
    if (a.ConnectionReceivedId==null){
      idToAccount.put(a.Id, a);
    }
  }
  S2SExternalSharingHelper.shareRecords(idToAccount, connectionId, null);
}

public with sharing class S2SExternalSharingHelper {
  public static void shareRecords(Map<Id,SObject> idToSObject, Id connectionId, String parentFieldName){
    try {
      List<PartnerNetworkRecordConnection> shareRecords = new List<PartnerNetworkRecordConnection>();
      for (Id i : idToSObject.keySet()){
        String parentRecordId;
        if (parentFieldName!=null) {
          SObject o = idToSObject.get(i);
          parentRecordId = (String)o.get(parentFieldName);
        }
        PartnerNetworkRecordConnection s = new PartnerNetworkRecordConnection(ConnectionId = connectionId,
                                                                             LocalRecordId = i,
                                                                             ParentRecordId = parentRecordId,
                                                                             SendClosedTasks = false,
                                                                             SendOpenTasks = false,
                                                                             SendEmails = false);
        shareRecords.add(s);
      }
      if (shareRecords.size()>0) insert shareRecords;
    } catch (Exception e){
      //& always add exception handling logic - no silent failures..
    }
  }
}

2. Re-parenting records in the hub org.

Parent-child record relationships must be re-established in the target org; this does not happen automatically. To do this, a custom formula field can be added to the shared record which contains the parent record identifier as known in the source org – lookup fields can’t be published. This custom formula field (ParentIdAtSource__c, for example) is published to the connection. In the target org this field is mapped in the subscription to a custom field. An Apex Trigger can then be used to look up the Id of the parent record as known in the target org and set the relationship field accordingly. Specifically, the logic should query the PartnerNetworkRecordConnection object for the LocalRecordId which matches the PartnerRecordId value held in the custom field.
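As a rough sketch only (assuming Contact records shared under Account, with the source-org Account Id arriving in the mapped ParentIdAtSource__c custom field), the target-org trigger logic could look like this:

```apex
trigger ContactBeforeInsertS2S on Contact (before insert) {
  // collect the source-org parent Ids received via the subscription mapping
  Set<String> sourceParentIds = new Set<String>();
  for (Contact c : Trigger.new){
    if (c.ConnectionReceivedId!=null && c.ParentIdAtSource__c!=null)
      sourceParentIds.add(c.ParentIdAtSource__c);
  }
  if (sourceParentIds.isEmpty()) return;

  // map the source-org record Id (PartnerRecordId) to the local Id (LocalRecordId)
  Map<String,Id> sourceIdToLocalId = new Map<String,Id>();
  for (PartnerNetworkRecordConnection pnrc : [select LocalRecordId, PartnerRecordId
                                              from PartnerNetworkRecordConnection
                                              where PartnerRecordId in :sourceParentIds]){
    sourceIdToLocalId.put(pnrc.PartnerRecordId, pnrc.LocalRecordId);
  }

  // re-parent using the local Account Id
  for (Contact c : Trigger.new){
    Id localParentId = sourceIdToLocalId.get(c.ParentIdAtSource__c);
    if (localParentId!=null) c.AccountId = localParentId;
  }
}
```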

3. Record merges in the source org.

In the record merge case, the update to the surviving parent record flows via S2S without issue; re-parented child records, however, do not. To address this, an Apex Trigger (on delete of the parent record) can be used to “touch” the child records, as shown in the basic example below.

trigger AccountBeforeDelete on Account (before delete) {
  try {
    if (Trigger.isBefore && Trigger.isDelete) {
      // call @future method to do a pseudo update on the contacts so that the reparenting flows via S2S
      Map<Id,Contact> idToContact = new Map<Id,Contact>([select Id from Contact where AccountId in :Trigger.old]);
      if (idToContact.keySet().size() > 0) {
        S2SMergeHelper.touchContactsForAccountMerge(idToContact.keySet());
      }
    }
  } catch (System.Exception ex) {
    //& always add exception handling logic - no silent failures..
  }
}

// in a helper class (S2SMergeHelper, for example):
@future
public static void touchContactsForAccountMerge(Set<Id> contactIds) {
  List<Contact> contactList = [SELECT Id FROM Contact WHERE Id in :contactIds];
  update contactList;
}

4. Data clean-up.

Record deletions are not propagated via S2S; instead the Status field on the PartnerNetworkRecordConnection record is set to ‘Deleted’, with no further action taken. A batch process in the target org may add value in flagging such records for attention or automating their deletion.
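A minimal batch sketch along these lines, assuming Account is the shared object and using a hypothetical Deleted_At_Source__c checkbox field to flag records for review:

```apex
public class S2SDeletedRecordCleanup implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc){
        // connection records whose source-org record has been deleted
        return Database.getQueryLocator(
            'select LocalRecordId from PartnerNetworkRecordConnection where Status=\'Deleted\'');
    }
    public void execute(Database.BatchableContext bc, List<SObject> scope){
        Set<Id> localIds = new Set<Id>();
        for (SObject s : scope) localIds.add((Id)s.get('LocalRecordId'));
        // flag the local records for attention (automated deletion is also an option)
        List<Account> toFlag = [select Id from Account where Id in :localIds];
        for (Account a : toFlag) a.Deleted_At_Source__c = true;
        update toFlag;
    }
    public void finish(Database.BatchableContext bc){}
}
```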

5. Unit test coverage.

Unfortunately, the PartnerNetworkConnection object is not creatable (insertable); unit test code is therefore reliant on the existence of an active connection in the org. The ConnectionReceivedId field on standard and custom objects is also not creatable or updateable, requiring shared records to be in place (and SeeAllData=true) in order to test custom functionality in the target org. Not ideal.
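In practice this means unit tests must locate an existing active connection rather than create one; a minimal sketch:

```apex
// sketch: find an active connection to use in test context
// (relies on an accepted connection existing in the org)
static Id getActiveConnectionId(){
    List<PartnerNetworkConnection> conns =
        [select Id from PartnerNetworkConnection
         where ConnectionStatus='Accepted' limit 1];
    return conns.isEmpty() ? null : conns[0].Id;
}
```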

Note, S2S record sharing interactions count against standard API limits and speed of update is not guaranteed. In my experience updates typically arrive with sub-5-second latency. My understanding is that the underlying synchronisation process runs on a polling schedule; the speed of update will therefore vary with the distance to the next poll cycle.

Useful Links
An Introduction to Salesforce to Salesforce
Best Practices for Salesforce to Salesforce

Any-org Design Considerations

The concept of any-org development is an interesting one. The strict definition, to my mind, is the development of a set of components (perhaps packaged) that are designed and coded specifically to install and function in any Salesforce org. This is typically an ISV concern, where testing and maintaining a single code base can be highly desirable over managing a base package plus multiple extension packages, or in the worst case multiple independent packages. Either way, an ISV needs to maximise the addressable market for a product whilst minimising the ongoing effort to do so.

The same drivers do not apply in the single-org case, where a consultancy and/or in-house team are delivering technical components to be installed into a known Salesforce org (or multi-org estate). In the single-org case it is common practice to see technical components designed and coded for the current state of the target org(s), with no consideration of how the org(s) may evolve over time. This can often result in situations where costly technical work is required simply to activate an optional product feature, or to provide user access in another locale. In such cases the situation is often compounded by the fact that the original development team are no longer available.

In short, in my view some degree of future-proofing should be considered in designing for the single-org model, using the techniques applied by ISVs in the any-org model.


  1. Optional Features (examples: Person Accounts, Quotes)

    There are a multitude of optional product features which can be enabled directly in the Salesforce web application UI or via Salesforce support. In the majority of cases such feature activations irreversibly add new objects and fields to the Salesforce instance. From the perspective of keeping simple orgs uncluttered by objects related to unused features this makes perfect sense. From the perspective of designing for the any-org model, this approach poses a few challenges. The main challenge is that Apex code won’t compile where a static reference exists to an object (or field) that doesn’t exist in the org. There is no simple answer to this; instead a selective approach should be taken, accommodating optional features that may already be active (or could be activated in the future) and that have some impact on your code. The approach to achieving this for any-org Apex code basically involves replacing static references with Dynamic SOQL and Dynamic Apex (see coding techniques below).
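For example, the presence of an optional feature can often be detected dynamically at runtime. A sketch of a hypothetical AppHelper check for Person Accounts, relying on the fact that activation adds an IsPersonAccount field to Account:

```apex
public class AppHelper {
    private static Boolean personAccountsEnabled;

    public static Boolean isPersonAccountsEnabled(){
        if (personAccountsEnabled==null){
            // Person Accounts add an IsPersonAccount field to Account;
            // detect it via the field describe map (keys are lowercase)
            personAccountsEnabled =
                Schema.SObjectType.Account.fields.getMap().containsKey('ispersonaccount');
        }
        return personAccountsEnabled;
    }
}
```

Caching the describe result in a static variable keeps repeated calls cheap within a transaction.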

  2. Multi-currency

    The default currency mode of a Salesforce org is single currency, and the majority stay this way. It is, however, common to have multi-currency (and perhaps advanced currency management, ACM) activated in orgs where business operations are international. Activation of multi-currency often occurs once the Salesforce org has become established, perhaps in a single region. This can be problematic where technical customisations have been added that aren’t currency aware.

    In the any-org case, all Apex code should be multi-currency aware and use Dynamic SOQL to add the CurrencyIsoCode field to all object queries involving currency fields. Additionally, currency-aware logic should include checks to ensure that related transactions are in the same currency, and that custom analytics present data in the corporate currency (the default, and therefore expected, behaviour of the native reporting functions). Note, the behaviour of aggregate functions involving currency fields must also be handled; an aggregate can otherwise silently combine values held in different currencies.

  3. Editions Support

    A key design decision for ISVs is the set of Salesforce editions to be supported by their managed package. This has less relevance to the single-org model, unless the multi-org estate includes different editions.

    It is possible to group editions into two distinct groups:
    1. Group (or Team) Edition and Professional Edition
    2. Enterprise Edition and Unlimited Edition

    In the case of group 1, assume that standard objects such as Product2, Pricebook2, PricebookEntry and RecordType do not exist, and ensure no static references to them exist in the code. The OrganizationType field on the Organization object tells us which edition the code is executing within.

    private static Boolean isTeamOrProEdition;

    public static Boolean isTeamOrProEdition(){
        if (isTeamOrProEdition==null){
            List<Organization> orgs = [select OrganizationType from Organization where Id=:UserInfo.getOrganizationId() limit 1];
            if (orgs.size()>0)
                isTeamOrProEdition=(orgs[0].OrganizationType=='Team Edition' || orgs[0].OrganizationType=='Professional Edition');
        }
        return isTeamOrProEdition;
    }
  4. Internationalisation

    Whether an international user base is anticipated or not, it is general software development best practice to externalise string literals into resource files. In the Salesforce context this means Custom Labels. A best practice here is to apply strict categorisation and a meaningful naming convention. Also ensure all literals are externalised, not just labels in the UI – for example, trigger error messages.

    Another consideration for i18n is the use of currency and date formatting helpers. Where UI components do not apply default formatting for an SObject field, you need to handle this in code. An i18nHelper class which translates ISO locale and currency codes into date format strings and currency format strings (plus symbols) respectively can be very helpful.

    Useful abbreviations:
    i18n – internationalisation; development practice enabling support for localisation.
    l10n – localisation; the act of localising an internationalised software product for a specific locale.
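A minimal sketch of such a helper; the class name and the (deliberately tiny) lookup tables are illustrative only:

```apex
public class I18nHelper {
    // illustrative subset only - extend per supported locale and currency
    private static Map<String,String> localeToDateFormat = new Map<String,String>{
        'en_US' => 'MM/dd/yyyy', 'en_GB' => 'dd/MM/yyyy', 'de_DE' => 'dd.MM.yyyy' };
    private static Map<String,String> isoCodeToSymbol = new Map<String,String>{
        'USD' => '$', 'GBP' => '£', 'EUR' => '€' };

    public static String getDateFormatString(){
        String fmt = localeToDateFormat.get(UserInfo.getLocale());
        return fmt!=null ? fmt : 'yyyy-MM-dd';
    }

    public static String getCurrencySymbol(String isoCode){
        String s = isoCodeToSymbol.get(isoCode);
        return s!=null ? s : isoCode;
    }
}
```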

  5. Profile Permissions

    Visualforce pages are preprocessed where components are directly bound to SObject fields for which the user profile lacks CRUD or FLS permissions. In such cases the fields are not displayed, or are made read-only, depending on the visibility state. This comes as a surprise to many developers, who assume that user profile permissions are entirely ignored on Visualforce pages.

    reference: Enforcing_CRUD_and_FLS

    In the any-org model, where direct SObject field binding is being used in a Visualforce page, this may require a manual check during initialisation to protect the functional integrity of the page. For example, a custom page with no fields displayed and no explanation is not a great user experience; instead the page should simply inform the user they don’t have sufficient permissions, which they can then take up with their administrators.

    private Boolean hasRequiredFLS(){
        // rule 1: all custom fields must be accessible.
        // rule 2: check isUpdateable on all fields where inline editing offered.
        Schema.DescribeFieldResult d;
        Map<String, Schema.SObjectField> siFieldNameToToken=Schema.SObjectType.SalesInvoice__c.fields.getMap();
        for (Schema.SObjectField f : siFieldNameToToken.values()){
            d = f.getDescribe();
            if (!d.isCustom()) continue;
            if (!d.isAccessible()) return false;
        }
        // note, describe map keys are lowercase field names.
        d = siFieldNameToToken.get('invoicedate__c').getDescribe();
        if (!d.isUpdateable())
            this.isInlineEditable=false;
        else {
            d = siFieldNameToToken.get('duedate__c').getDescribe();
            if (!d.isUpdateable())
                this.isInlineEditable=false;
            else this.isInlineEditable=true;
        }
        return true;
    }
    Coding Techniques

  1. Dynamic SOQL

    Do not statically reference objects or fields that may not exist in the org. Instead compose Dynamic SOQL queries and execute them via Database.query(). With this approach, you can build the required query using flags which indicate the presence of optional feature fields such as RecordTypeId, CurrencyIsoCode etc. The Apex Language Reference provides good coverage of Dynamic SOQL. Be very careful to ensure that your composed string does not include user-supplied text input – this would open up a vulnerability to SOQL injection attacks.

    private static Id standardPricebookId;
    public static Id getStandardPricebookId(){
        if (standardPricebookId==null){
            String q='select Id, IsActive from Pricebook2 where IsStandard=true';
            SObject p = Database.query(q);
            standardPricebookId = (Id)p.get('Id');
            if (!(Boolean)p.get('IsActive')){
                p.put('IsActive', true);
                update p;
            }
        }
        return standardPricebookId;
    }
    public SalesInvoice__c retrieveSalesInvoice(String siId){
        try {
            //& Using dynamic Apex to retrieve fields from the fieldset to create a soql query that returns all fields required by the view.
            String q='select Id,Name,OwnerId';
            for (Schema.FieldSetMember f : SObjectType.SalesInvoice__c.FieldSets.invoices__Additional_Information.getFields()){
                if (!q.contains(f.getFieldPath())) q+=','+f.getFieldPath();
            }
            if (UserInfo.isMultiCurrencyOrganization()) q+=',CurrencyIsoCode';
            if (AppHelper.isPersonAccountsEnabled()) q+=',PersonEmail,PersonContactId';
            q+=',(select Id,Description__c,Quantity__c from SalesInvoiceLineItems__r order by CreatedDate asc)';
            q+=' from SalesInvoice__c';
            q+=' where Id=\''+String.escapeSingleQuotes(siId)+'\'';
            return Database.query(q);
        } catch (Exception e){
            throw e;
        }
    }
  2. Dynamic Apex

    Do not statically reference objects or fields that may not exist in the org. Instead use Dynamic Apex techniques such as global describes and field describes. Where a new SObject is required, use the newSObject() method as shown below; this is particularly useful for unit test data creation. The Apex Language Reference provides good coverage of Dynamic Apex – every developer should be familiar with this topic.

    public static List<SObject> createPBE(Id pricebookId, List<SObject> products){
        SObject pbe;
        List<SObject> entries = new List<SObject>();
        Schema.SObjectType targetType = Schema.getGlobalDescribe().get('PricebookEntry');
        if (targetType==null) return null;
        for (SObject p : products){
            pbe = targetType.newSObject();
            pbe.put('Pricebook2Id', pricebookId);
            pbe.put('Product2Id', p.Id);
            pbe.put('UnitPrice', 0.0);
            pbe.put('IsActive', true);
            entries.add(pbe);
        }
        if (entries.size()>0) insert entries;
        return entries;
    }
  3. UserInfo Methods

    The UserInfo standard class provides some highly useful methods for any-org coding, such as isMultiCurrencyOrganization(), getDefaultCurrency(), getLocale() and getTimeZone(). The isMultiCurrencyOrganization() method will frequently be used to branch code specific to multi-currency orgs.

    private static String corporateCurrencyIsoCode;
    public static String getCorporateCurrency(){
        if (corporateCurrencyIsoCode==null){
            corporateCurrencyIsoCode = UserInfo.getDefaultCurrency();
            if (UserInfo.isMultiCurrencyOrganization()){
                String q='select IsoCode, ConversionRate from CurrencyType where IsActive=true and IsCorporate=true';
                List<SObject> currencies = Database.query(q);
                if (currencies.size()>0)
                    corporateCurrencyIsoCode=(String)currencies[0].get('IsoCode');
            }
        }
        return corporateCurrencyIsoCode;
    }

  1. Unit Test Data

    In the any-org model the creation of unit test data can be a challenge due to the potential existence of mandatory custom fields and/or validation rules. To mitigate the former, Dynamic Apex can be used to identify mandatory fields and their data types such that test data can be added (via a factory pattern of some sort). In the latter case there is no way to reliably detect a validation rule condition, and as such for ISVs it is a blessing that unit tests do not actually have to pass in a subscriber org (wrong as this may be in principle). In the single-org case we can improve on this (and we have to) by adding a global Validation Rule switch-off flag in an Org Behaviour Custom Setting (see previous post) – this approach is helpful in many areas, but for unit test data creation it can isolate test code from Validation Rules added post-deployment. There’s a tradeoff here between protecting unit tests and the risk of using test data that may not adhere to the current set of Validation Rules.
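A rough sketch of the mandatory-field detection aspect of such a factory; the helper name is hypothetical and only a couple of data types are covered:

```apex
// sketch: use field describes to populate required fields on a test record
public static SObject buildTestRecord(String objectName){
    Schema.SObjectType t = Schema.getGlobalDescribe().get(objectName);
    SObject record = t.newSObject();
    for (Schema.SObjectField f : t.getDescribe().fields.getMap().values()){
        Schema.DescribeFieldResult d = f.getDescribe();
        // mandatory = createable, not nillable and not defaulted on create
        if (d.isCreateable() && !d.isNillable() && !d.isDefaultedOnCreate()){
            if (d.getType()==Schema.DisplayType.String)
                record.put(d.getName(), 'test');
            else if (d.getType()==Schema.DisplayType.Date)
                record.put(d.getName(), Date.today());
            // extend per data type as required..
        }
    }
    return record;
}
```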

  2. Unit Test Code Coverage

    The addition of multiple conditional code paths, i.e. branching, for the any-org case makes it challenging to achieve a high code coverage percentage in orgs which do not have the accommodated features activated. For example, unit tests executing in a single-currency org will not run code specific to multi-currency, and the code coverage drops accordingly. To mitigate this, consider adding OR conditions to IF branches which include unit test flags, and perhaps Test.isRunningTest(), to cover as much code as possible before leaving the branch. During coding, always strive to absolutely minimise the feature-specific code – this approach will help greatly in respect of unit test coverage.

  3. QA

    In the any-org model, it is imperative to test your code in an org with the accommodated features activated. This will require multiple QA orgs and can increase the overall testing overhead considerably. Also factor in the lead time required to have features activated by Salesforce support, such as multi-currency and Person Accounts.

  4. Security

    Dynamic SOQL queries open up the possibility of SOQL injection attacks where user-supplied text values are concatenated into an executed SOQL query string. Always sanitise and escape data values where such code behaviour is necessary.

  5. Governor Limits

    The any-org model is highly contingent upon the use of limited resources such as Apex describes. As a best practice, employ a helper class pattern with cached values.

    One Approach – Future Proofing Single-org Developments

    Optional Features – selective
    Multi-currency – yes
    Editions Support – no
    i18n – yes
    Unit Test Data – yes
    Profile Permissions – yes

    The list above is my default position on the approach to take on single-org developments; this can change significantly depending on the current state of the org in terms of configuration and customisation, plus the client’s perspective on the evolution of their Salesforce org and their attitude toward investing in future-proofing/extensibility. In the single-org, consultancy project case it’s advisable to be completely open and let the client decide whether the additional X% cost is worth the value. The real point here is that the conversation takes place and the client has the opportunity to make an informed decision.


Firstly, welcome to my blog!

The key themes for the blog will be technical architecture and agile development practices, delivered in the form of how-to posts and theoretical musings. The intent is to provide interesting, actionable knowledge for cloud architect practitioners specialising in Salesforce technologies. Along the way, secondary topics such as Scrum, usability methods and AppExchange ISV considerations will be covered.