UI Tips and Tricks – Country List

This tip introduces a simple pattern used by many AppExchange solutions to manipulate standard layouts. In short, a custom sidebar component is added containing JavaScript that manipulates the page at the DOM level. The following simple example shows how the country fields on the Account layout can be changed to lists – a common customer request. A Country__c custom object, with the Name field populated with country names, is required. The Dojo library is also used. Please note, the code below is old and hasn’t been tested recently; I provide it to illustrate the pattern, and it certainly won’t be winning any prizes.

So, in the example, Dojo runs the replaceCountryInputs() function when the DOM is ready. This finds the country fields by their IDs (Firebug is a good way to browse the page markup); we then remove each original input element and replace it with a select element with the same ID. The select element is then populated in JavaScript with the result of a SOQL query, using the Ajax API. Finally, we add the inline editing handlers to the new select element – and we’re done.

On the configuration side, the sidebar component must be assigned to the home page layout for relevant users and the setting to enforce display of custom sidebar components must be enabled.

As I’ve said, the use case here isn’t significant; it’s the possibility that this technique enables in terms of standard layout manipulation. Be aware that the field IDs may be subject to change.

[sourcecode language="javascript"]

<script src="/js/dojo/0.4.1/dojo.js"></script>
<script src="/soap/ajax/19.0/connection.js" type="text/javascript"></script>
<script type="text/javascript">
// Country names are fetched once via the Ajax API; Dojo invokes
// replaceCountryInputs() when the DOM is ready (see dojo.addOnLoad below).
var arCountries = getCountries();

function replaceCountryInputs()
{
    // Edit mode: replace the country inputs (IDs acc17country/acc18country)
    // with select lists. Note the element IDs may be subject to change.
    if (document.getElementById('acc17country') != null) {
        var defaultVal = document.getElementById('acc17country').value;
        var cInput = swapFieldType('acc17country');
        setCountryListWithDefault(cInput, defaultVal);
    }
    if (document.getElementById('acc18country') != null) {
        var defaultVal = document.getElementById('acc18country').value;
        var cInput = swapFieldType('acc18country');
        setCountryListWithDefault(cInput, defaultVal);
    }
    // Detail mode: attach inline editing handlers to the address cells.
    if (document.getElementById('acc17_ilecell') != null) {
        SetupInline('acc17');
    }
    if (document.getElementById('acc18_ilecell') != null) {
        SetupInline('acc18');
    }
}

// Remove the original input element and replace it with an empty select
// element carrying the same id and name.
function swapFieldType(i)
{
    var cInput = document.getElementById(i);
    var cInputParent = cInput.parentNode;
    cInputParent.removeChild(cInput);
    cInput = document.createElement('select');
    cInput.size = 1;
    cInput.id = i;
    cInput.name = i;
    cInputParent.appendChild(cInput);
    return cInput;
}

// Populate select element i with the country list, defaulting to value d.
function setCountryListWithDefault(i, d)
{
    if (i != null) {
        if (arCountries.length > 0) {
            for (var x = 0; x < arCountries.length; x++) {
                // Pre-select the option matching the field's original value.
                i.options[x] = new Option(arCountries[x], arCountries[x], false, arCountries[x] == d);
            }
        } else {
            i.options[0] = new Option('No countries found', 'No countries found', false, true);
        }
    }
}

// Inline edit support: on double-click, open the field for editing then swap
// the rendered input for a populated select list.
function SetupInline(prefix) {
    var _element = document.getElementById(prefix + '_ilecell');
    if (_element) {
        _element.ondblclick = function() {
            if (!sfdcPage.editMode)
                sfdcPage.activateInlineEditMode();

            if (!sfdcPage.inlineEditData.isCurrentField(sfdcPage.getFieldById(_element.id)))
                sfdcPage.inlineEditData.openField(sfdcPage.getFieldById(_element.id));

            var idInput = prefix + 'country';

            if (document.getElementById(idInput) != null) {
                var defaultVal = document.getElementById(idInput).value;
                var cInput = swapFieldType(idInput);
                setCountryListWithDefault(cInput, defaultVal);
            }
        }
    }
}

// Query country names from the Country__c custom object via the Ajax API,
// reusing the current user's session id.
function getCountries()
{
    sforce.sessionId = getCookie('sid');
    sforce.connection.sessionId = sforce.sessionId;
    var out = [];
    try {
        var queryCountries = sforce.connection.query('SELECT Id, Name FROM Country__c ORDER BY Name');
        var countries = queryCountries.getArray('records');
        for (var x = 0; x < countries.length; x++) {
            out[out.length] = countries[x].Name;
        }
    } catch (error) {
        alert(error);
    }
    return out;
}
dojo.addOnLoad(replaceCountryInputs);
</script>
[/sourcecode]

UI Tips and Tricks – Picture Upload

In the very early 90s I was employed as a professional Visual Basic programmer (and no, that isn’t a contradiction in terms), greatly enjoying the development of client-server accounting systems. Good times. In those days, tips-and-tricks sites were a big part of how the community shared knowledge. Following some recent reminiscing, through this series of posts I’ll share some UI tips and tricks that aren’t necessarily an architect’s concern, but which I hope will prove helpful to some.

Ok, so down to business. Today’s tip concerns the upload and display of record images in the context of standard functionality – for example a Product image or Contact photo. I’ll use the latter as the example.

1. Add a field to Contact named [Photo Document Id], of the text type, 18 char length.

2. Add a field to Contact named [Photo], of the text formula type, formula set as below.
[sourcecode language="xml"]
<formula>IMAGE('/servlet/servlet.FileDownload?file=' & Photo_Document_Id__c, 'Photo')</formula>
[/sourcecode]

3. Add a Visualforce page named ContactPictureUploader, with no markup between the page tags.
4. Add an Apex class ContactPictureUploaderController (code as below), set it as the VF page controller and ensure the page’s action attribute is set to the initialise method.
[sourcecode language="java"]
public with sharing class ContactPictureUploaderController {
    private Id contactId;

    public ContactPictureUploaderController(){
        // Contact id passed by the custom button via the cid parameter.
        contactId = ApexPages.currentPage().getParameters().get('cid');
    }

    public PageReference initialise(){
        // Find the most recent attachment added to the contact by the current user.
        List<Attachment> listA = [select a.Id from Attachment a
            where a.createdById = :UserInfo.getUserId()
            and a.parentId = :contactId order by a.createdDate desc limit 1];
        if (listA.size() > 0){
            Id attachmentId = listA[0].Id;

            // Store the attachment id in the field referenced by the Photo formula.
            Contact c = [select Id from Contact where Id = :contactId];
            c.Photo_Document_Id__c = attachmentId;
            update c;
        }
        // Redirect back to the contact record.
        return new PageReference('/' + contactId);
    }
}
[/sourcecode]
VF Page markup.
[sourcecode language="html"]
<apex:page controller="ContactPictureUploaderController" action="{!initialise}">
<!-- Controller context page - no markup -->
</apex:page>
[/sourcecode]

5. Add a custom button to the Contact object of the JavaScript type, named Upload Photo, script as below.
[sourcecode language="javascript"]
parent.window.location.href = '/p/attach/NoteAttach?pid={!Contact.Id}&retURL=/apex/ContactPictureUploader?cid={!Contact.Id}';
[/sourcecode]

In short, the solution works by invoking the standard attachment page, which on submit redirects to the VF page; the VF page copies the uploaded document id to the Contact [Photo Document Id] field, then redirects back to the contact record. The [Photo] formula field on the Contact object then loads the image using the IMAGE() function – easy. The image field can then be used on the contact layout, related lists etc.

Summer ’12 Lookup Relationships

The Summer ’12 release introduces some fundamental changes to the functionality of lookup relationships. In summary:

1. Optionality. Lookup relationships can now be set as mandatory (the Required attribute). This is great news in that the usual validation rule workaround can now be forgotten.

2. Referential integrity. Prior to Summer ’12, the parent record could be deleted without regard to related child records, which would have their lookup fields nulled. This remains the default behaviour; for optional lookups, however, you can now specify one of the following behaviours (for mandatory lookups, only the second and third are possible). I’m using the terms parent and child here in the loosest sense for convenience – the nature of the relationship is associative. A sketch after the list illustrates the default behaviour.

2.1 Clear the child field value – default
2.2 Prevent the parent being deleted (Don’t allow deletion of the lookup record that’s part of a lookup relationship)
2.3 Cascade delete (Delete this record also) – this option requires activation via Salesforce support, and ignores the sharing model: if a user has record access to the parent, they can delete it and the related children without requiring permissions at the child level. This option is restricted to cases where the child is a custom object.
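
To make the default behaviour concrete, below is a minimal anonymous Apex sketch; the Project__c and Task__c objects (the latter with an optional Project__c lookup) are hypothetical.
[sourcecode language="java"]
// Hypothetical objects: Project__c (parent), Task__c (child with an optional Project__c lookup).
Project__c p = new Project__c(Name = 'Project 1');
insert p;

Task__c t = new Task__c(Name = 'Task 1', Project__c = p.Id);
insert t;

// Default behaviour (2.1): deleting the parent clears the child's lookup field.
delete p;
Task__c orphan = [select Project__c from Task__c where Id = :t.Id];
System.assertEquals(null, orphan.Project__c);
[/sourcecode]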

The above enhancements to the lookup relationship type start to blur the lines between lookup and master-detail, however the key differentiation remains in terms of ownership versus association.

Salesforce Large Data Volumes

My own simplistic definition of an LDV scenario in the Salesforce context is anything in excess of 2 million records in a single object. This is a rule of thumb rather than a prescriptive threshold.

The following Salesforce functional areas are most affected by LDV, in terms of performance, timeouts etc.:

Reports
Search
Listviews
SOQL

Working with LDV:
In LDV cases, first and foremost you should consider options for Data Reduction, such as archiving, UI mashups etc. Where LDV is unavoidable, the concept of Selectivity should be applied to the functional areas impacted most.

Data Reduction Considerations:
Archiving – consider off-platform archiving solutions
Data Warehouse – consider a data warehouse for analytics
Mashups – leave data in-situ in external systems and integrate at the UI level

Selectivity Considerations:
Selective queries – reduce the number of objects and fields used in a query.
Narrow Range Reporting – apply selective report filtering.
Filtering – apply restrictive filtering to Listviews and SOQL where clauses.
Custom Indexes – can be effective where query values in the indexed field represent less than 10% (or <300K) of the overall record count; see the query sketch after this list.
Skinny Tables – can be effective as a report accelerator at 5M records plus.
SOQL – avoid NULL values (they can’t be indexed).
Horizontal Partitioning – split objects by geography, for example.
Denormalisation – use Apex Triggers to resolve foreign keys, removing expensive joins.
Divisions – act like a DB partition (by geography, ownership etc.).
Avoid Over-parenting – 10K limit for child records per parent record; for example, avoid one parent account having more than 10,000 contacts.
Data Skew – look for even distribution.
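
As a simple illustration of selectivity, the sketch below contrasts a selective SOQL filter with a non-selective one; the Order__c object and indexed Region__c field are hypothetical.
[sourcecode language="java"]
// Selective: filters on an indexed field, avoids nulls and returns a narrow row set.
List<Order__c> recent = [select Id, Name from Order__c
                         where Region__c = 'EMEA'
                         and CreatedDate = LAST_N_DAYS:30
                         limit 200];

// Non-selective: a null comparison can't use an index and may force a full scan.
List<Order__c> unfiltered = [select Id, Name from Order__c where Region__c = null];
[/sourcecode]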

Loading Data:
Use the Bulk API to load large data volumes – via the Apex Data Loader perhaps (the Use Bulk API setting must be explicitly enabled). The Bulk API, in simple terms, uploads the data into temporary tables and then processes it (the actual load into target objects) using parallel asynchronous processes. This offers potential performance improvements over the serial, synchronous combined upload-and-process model employed by all other loading techniques.

Data Loading Considerations:
Defer complex sharing rules.
Disable Triggers and Workflow, then post-process via Batch Apex (see the sketch after this list).
Speed of operation: Insert is faster than Update, which is faster than Upsert (Upsert involves an implicit query).
Group and sequence data to avoid parent record locking.
Remember database statistics are calculated overnight – wait before doing performance testing.
Tune the batch size (HTTP keepalives, GZIP compression).
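
One common way to implement the trigger deferral is a bypass flag that each trigger checks before running. A minimal sketch, assuming a hypothetical Load_Control__c hierarchy custom setting with a Bypass_Triggers__c checkbox, set for the data loading user before the load and cleared afterwards:
[sourcecode language="java"]
trigger AccountTrigger on Account (before insert, before update) {
    // Hypothetical hierarchy custom setting - when the flag is set for the
    // data loading user, skip trigger logic and post-process via Batch Apex.
    Load_Control__c control = Load_Control__c.getInstance();
    if (control != null && control.Bypass_Triggers__c) return;

    // ... normal trigger logic ...
}
[/sourcecode]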

Salesforce Org Architecture

The figure above shows a complex multiple-org architecture (hub-and-spoke model). I’ll return to the drivers for multiple-org versus single-org in a future post. For now, let’s consider some interesting aspects of the above scenario.

SSO: users log in via their Active Directory credentials, the CORPORATE org being a SAML 2.0 Service Provider to the AD Identity Provider. The CORPORATE org is also an Identity Provider, enabling SSO across all child orgs (which are Service Providers).

Managed Packages: versioned baseline functionality. It’s often the case that certain configuration elements are common across orgs in a multi-org architecture. A best practice is to distribute this metadata as a managed package, thereby preventing local modification. The business owners of the client org are free to innovate in their org, but the baseline configuration is locked (possibly to ensure compatibility with data sharing processes). Managed packages are not just for ISVs.

Salesforce-to-Salesforce: data sharing (automated or manual). S2S is a very underrated technology, enabling bi-directional, selective sharing of data between orgs. It’s a great fit for multi-org architectures where common data can be shared across all orgs, or partitioned (geographically, by business type etc.) and perhaps consolidated at the CORPORATE level.

External Execution Environment: complex, off-platform processing (perhaps legacy components written in Java). Salesforce orgs are subject to execution limits (governor limits etc.); whilst these become less restrictive with each release, there are times when an external execution environment can be helpful. For example, a payroll calculation engine (written in Java and used within the enterprise) could be deployed to Heroku and called via Apex, as sketched below. Personally, I look to repurpose or buy technology before coding anything – the ability to assemble a solution should not be overlooked.
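
A minimal sketch of such a callout; the endpoint URL and JSON contract are assumptions, and a matching Remote Site Setting would be required.
[sourcecode language="java"]
public class PayrollEngineClient {
    // Hypothetical Heroku-hosted endpoint - must be registered as a Remote Site Setting.
    private static final String ENDPOINT = 'https://payroll-engine.example.herokuapp.com/calculate';

    public static String calculate(String employeeId) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint(ENDPOINT);
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody('{"employeeId":"' + employeeId + '"}');

        // Synchronous callout - subject to callout governor limits.
        HttpResponse res = new Http().send(req);
        return res.getBody();
    }
}
[/sourcecode]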

Force.com Flow

With Spring ’12 the native Cloud Flow Designer went GA, and flows are now also Metadata API enabled (an important point for ISVs). Force.com Flow, or Visual Workflow as it’s also referred to, introduces a compelling approach to business process automation for less technical projects, enabling business analysts to define complex flow logic involving data manipulation and user-interface interactions. Technical input is only really required when complex calculations are involved, and perhaps also when a host Visualforce page is necessary to run the flow (think portal or Sites deployments). The ability to drop in Apex code plug-ins provides great flexibility; for example, a complex discount calculation could be added to a configure-price-quote type flow (a sketch of such a plug-in follows the notes below).

Some general notes:

1. Flows are defined visually by the native Cloud Flow Designer, this tooling is targeted at the Business Analyst role.
2. Good fit for staged processes with simple input interactions.
3. Typical use cases; Insurance quote, Mortgage application, Call scripting, Diagnostics.
4. Flows contain Steps, Screens, Multi-state decision elements, CRUD operations, Variables, Formulae, Apex plug-ins (Apex class implementing the Process.Plugin interface).
5. Invoke a Flow using its URL (access/integrate via custom link in the native UI).
6. Flows can be hosted in a Visualforce page – for Portals and Site entry points, plus custom styling can be applied.
7. Beyond Spring ’12 – headless flows, asynchronous executions, mobile runtime.
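
A minimal sketch of an Apex plug-in; the DiscountPlugin class, parameter names and discount rule are illustrative assumptions.
[sourcecode language="java"]
global class DiscountPlugin implements Process.Plugin {

    // Called by the flow at runtime with the declared input parameters.
    global Process.PluginResult invoke(Process.PluginRequest request) {
        Decimal amount = (Decimal) request.inputParameters.get('Amount');
        // Hypothetical rule: 10% discount on amounts over 1000.
        Decimal discount = (amount != null && amount > 1000) ? amount * 0.10 : 0.0;
        return new Process.PluginResult(new Map<String, Object>{ 'Discount' => discount });
    }

    // Describes the plug-in's inputs and outputs to the Cloud Flow Designer.
    global Process.PluginDescribeResult describe() {
        Process.PluginDescribeResult result = new Process.PluginDescribeResult();
        result.inputParameters = new List<Process.PluginDescribeResult.InputParameter>{
            new Process.PluginDescribeResult.InputParameter('Amount',
                Process.PluginDescribeResult.ParameterType.DECIMAL, true)
        };
        result.outputParameters = new List<Process.PluginDescribeResult.OutputParameter>{
            new Process.PluginDescribeResult.OutputParameter('Discount',
                Process.PluginDescribeResult.ParameterType.DECIMAL)
        };
        return result;
    }
}
[/sourcecode]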

Salesforce Certified Technical Architect

I passed the Salesforce Certified Technical Architect programme in January of this year (2012); here I’ll share general thoughts on the process and some key areas to study.

Ok, so the CTA programme has 3 elements:

1. Self-certification
A simple record of who you are, plus a self-assessment of domain-relevant skills. Note that you are required to provide referees who can testify to your expertise and experience.

2. Multi-choice assessment
Typical salesforce.com certification multiple choice exam – 2 hours, 60+ questions. I found that the questions were longer than the advanced developer, administrator and cloud consultant equivalents. I also took the beta version which was 120+ questions over 4 hours – the duration of which, via online proctoring, was a challenge in itself (I couldn’t leave my desk!).

3. Review board
This is by far the most serious undertaking in the realm of salesforce.com certification and, given the expense and preparation involved, should only be considered by practitioners with broad and deep platform knowledge and relevant experience as a TA on multiple, diverse salesforce.com technical projects.

The review board session is in 2 parts: hypothetical scenario and customer case study. Note: the exact composition and timings reflect my session in January; it’s likely these will be refined over time.

Hypothetical scenario
75 minutes prep – this was easily the quickest 75 minutes of my life; there are a lot of requirements, context and solution design considerations to tackle. A clear strategy for how you intend to synthesise all the information, define solution options (with trade-offs) and articulate this back to the board will be key. You really will need to know the salesforce.com platform intimately, specific examples being the sharing model and where declarative functionality can be employed to avoid the necessity for custom technical components.
30 minutes presentation – present findings back to the board. Clear and confident delivery is key – practice makes perfect if your job doesn’t involve frequent, formal presentations to customers/clients etc.
30 minutes Q&A – you will be quizzed on the rationale behind your proposed solution options – being able to stand-up your design and discuss the alternatives is imperative.

Customer case study
30 minutes presentation – In my view this isn’t an occasion for 50 slides full of text – instead a clear, interesting story covering the topic areas highlighted in the study guide is required. The time is fixed, so practice to ensure you can deliver your pitch at a comfortable pace within 25-30 minutes. I ran out of time on the last slide – meaning I missed some key content, and I’d practiced to an audience 3 times in advance. I took this part of the review board as a time to shine by presenting a recent project I was comfortable to answer absolutely any question on. I also chose to stick to the facts rather than strive to cover 100% of the stated objectives – not every project includes a Change Management Board for example.
45 minutes Q&A – detailed questioning on your case study.

I really enjoyed taking the CTA programme. For technical, solution, enterprise or even cloud architects working with the salesforce.com platform, this prestigious accreditation requires significant personal commitment, but the feeling of reward is considerable. As with any architect-level accreditation, the CTA programme is extremely challenging; however, this is exactly how it should be in my view.

Study areas:
The CTA programme has a very different focus from the salesforce.com Advanced Developer certification (DEV501), which primarily examines proficiency with Apex and Visualforce – it’s unlikely you’ll encounter such questions in the CTA multi-choice exam or review board sessions.

The list below is by no means exhaustive but covers the high-level areas I focused my study upon:
SSO (Delegated and Federated via SAML)
OAuth Flows (User Agent, Web Server, SAML Assertion etc..)
Large Data Volumes (big data impact areas, strategies)
Platform Security
Portals and Record Sharing Strategies
API and Integration Use Cases (inbound and outbound)
Data Migration Considerations
Platform Governance (COE, Change and Risk Management)
Development Methodology (Agile, Waterfall)
Build Automation and Source Code Control
Data Modelling (ERD, Normalisation)
Declarative versus Programmatic Considerations
Org Strategy (multi versus single org)
Force.com Flow

Excellent references:
Dreamforce 2011 sessions on YouTube
developer.force.com
wiki.developerforce.com
certification.salesforce.com/architects