
Salesforce Selective Queries

A basic understanding of the selective query concept is fundamental to scalability on the Salesforce platform. Non-selective queries have a significant performance impact on List Views, Reports and SOQL queries, and as data volumes grow over time can result in Apex Trigger runtime exceptions such as the one below.

[code language="java"]
System.QueryException Cause: null Message: Non-selective query against large object type (more than 200000 rows)
[/code]

SOQL queries executed from Apex Triggers have to be selective where the record count in the queried object is 200,000 or more. The determination of a selective query is a function of volume plus field-level selectivity. The selectivity state of a given query is therefore volatile; in practical terms the initiating Apex Trigger may work one day and not the next.
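
As an illustrative sketch only (hypothetical trigger and logic), the pattern below keeps a trigger-scope query selective by filtering on an indexed field bound to the trigger record set:

[code language="java"]
// Illustrative sketch only - keep trigger-scope queries selective by filtering
// on an indexed field bound to the trigger record set.
trigger AccountAfterUpdate on Account (after update) {
    Set<Id> accountIds = Trigger.newMap.keySet();

    // AccountId is a standard indexed (lookup) field; binding to the trigger record Ids
    // bounds the result set regardless of the total Opportunity record count.
    List<Opportunity> openOpps = [SELECT Id, StageName, Amount
                                  FROM Opportunity
                                  WHERE AccountId IN :accountIds
                                  AND IsClosed = false];

    // ... apply custom logic to openOpps
}
[/code]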

Selective Query Definition

Selectivity is determined by the index state of the query filter conditions and the number of records the filter returns relative to the total record count for the object. The thresholds below show the difference between the selectivity calculation for a standard index versus a custom index.

Selectivity Thresholds:
Standard Index – 30% (first 1M) then 15%
Custom Index – 10% (first 1M) then 5%
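
As a worked example (illustrative figures, assuming the thresholds above are applied per band), the thresholds for an object containing 2 million records would be:

[code language="java"]
// Worked example for an object with 2,000,000 records (illustrative only):
// Standard index: (30% x 1,000,000) + (15% x 1,000,000) = 300,000 + 150,000 = 450,000 records
// Custom index:   (10% x 1,000,000) + ( 5% x 1,000,000) = 100,000 +  50,000 = 150,000 records
// A filter returning fewer records than the applicable threshold is considered selective.
[/code]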

Unary filter:

e.g. select Name from Account where IndexedField__c='ABC'

With a custom index on IndexedField__c, the filter must return less than 10% of the first 1 million records in the object to be considered selective; beyond the first million records the threshold drops to 5%.

Multiple filters AND (exclusively):

e.g. select Name from Account where IndexedField__c='ABC' and SecondIndexedField__c='123'

The Query Optimiser will set the leading operation on the basis of lowest cost. If no filters are selective a table scan is performed.

If all filters have an index then a Composite Index Join optimisation can be applied.

In this case each filter must return fewer records than twice (2x) the selectivity threshold.
All filters combined must return fewer records than the selectivity threshold.

If all filter fields are standard then use the standard index selectivity threshold; otherwise use the custom index selectivity threshold.
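
As an illustrative sketch (hypothetical record counts), consider the two-filter AND query above against an object containing 1 million records, with custom indexes on both fields:

[code language="java"]
// Illustrative figures only - custom index threshold is 10% of 1,000,000 = 100,000 records.
// select Name from Account where IndexedField__c='ABC' and SecondIndexedField__c='123'
//
// IndexedField__c='ABC'        returns 150,000 rows -> under 2x the threshold (200,000) : OK
// SecondIndexedField__c='123'  returns  80,000 rows -> under 2x the threshold (200,000) : OK
// Both filters combined        return   60,000 rows -> under the threshold (100,000)    : OK
//
// All conditions hold, so the Composite Index Join optimisation can be applied.
[/code]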

Multiple filters OR (at least one):

e.g. select Name from Account where IndexedField__c='ABC' or SecondIndexedField__c='123'

A selective AND filter index can be set as the leading operation. If none exists a table scan occurs, unless all filters have an index, in which case a Composite Index Union optimisation becomes possible.

In this case each filter must return fewer records than the selectivity threshold.
All filters combined must return fewer records than the selectivity threshold.

If all filter fields are standard then use the standard index selectivity threshold; otherwise use the custom index selectivity threshold.

Parent Field Filter:

e.g. select Name from Contact where IndexedField__c='ABC' and Account.IndexedField__c='ABC'

Where parent object fields are referenced in a filter, each filter is evaluated individually and the lowest cost option is selected as the leading operation.

Note, a parent field is not indexed on the queried object, so filtering Opportunity on Account.Id can incur a table scan, whereas filtering on AccountId may allow the standard index to become the leading operation.
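
A hypothetical illustration of this point against the Opportunity object:

[code language="java"]
// Hypothetical illustration - filtering on the relationship field versus the foreign key field.
// select Id from Opportunity where Account.Id = '001...'  // parent field reference; can incur a table scan
// select Id from Opportunity where AccountId = '001...'   // standard indexed lookup; can lead the query plan
[/code]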

The notes above provide only a basic outline, but should be sufficient to convey the key concepts.

 

Implementation Approach

As data volumes grow, query behaviour can change dramatically. To mitigate this, database queries originating in Apex code, list views and report definitions must consider the future peak data volume and field-level data characteristics (primarily selectivity). This considered approach can help identify an appropriate indexing strategy that maintains query performance by ensuring query selectivity. Forward planning is therefore absolutely key; queries should be designed to be selective up to the projected peak data volumes. In my experience, thinking ahead to this extent is seldom applied, particularly where the Salesforce implementation evolves project-by-project and the longer-term picture is not a priority.

To evaluate the selectivity of a given query, the following two approaches can be applied.

REST API Query Resource Feedback Parameter

The Force.com REST API exposes a Query resource that accepts an explain parameter, which can be set to a SOQL query, List View Id or Report Id. The results show the options considered by the Query Optimiser and the lowest cost option (leading operation) taken. A relative cost value of less than 1 indicates a selective query; anything higher indicates a non-selective query. The example below shows the output for a report where a report filter hits an indexed field.

[Image: selective-queries-api]
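
A minimal Apex sketch of the API approach is shown below; it assumes the org's own REST endpoint is reachable from Apex (e.g. via a Remote Site Setting or Named Credential) and an indicative API version. The explain parameter can equally be set to a List View Id or Report Id in place of the SOQL string.

[code language="java"]
// Minimal sketch - submit a SOQL query to the REST API Query resource with the explain parameter.
String soql = 'SELECT Name FROM Account WHERE IndexedField__c = \'ABC\'';

HttpRequest req = new HttpRequest();
req.setMethod('GET');
req.setEndpoint(Url.getSalesforceBaseUrl().toExternalForm()
    + '/services/data/v32.0/query/?explain=' + EncodingUtil.urlEncode(soql, 'UTF-8'));
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());

HttpResponse res = new Http().send(req);
// The JSON response lists the plans considered by the Query Optimiser;
// a plan with a relativeCost below 1 indicates a selective query.
System.debug(res.getBody());
[/code]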

Developer Console Query Plan

From Summer '14 the Developer Console can be configured [Help > Preferences > Enable Query Plan] to display a Query Plan option on the Query Editor tab. The structure of the output is the same as the API approach. Note, this approach is limited to SOQL queries.

[Image: selective-queries-developer-console]

The references section below provides links to the technical detail of the two approaches introduced above.

 

References

Query & Search Optimisation Cheat Sheet

Developing Selective Force.com Queries through the Query Resource Feedback Parameter Beta

Developer Console Query Editor

Improve performance with Custom indexes using Selective SOQL Queries

Custom Index Request Checklist

Salesforce Analytics Cloud Overview

[Image: analytics-cloud]

This post provides an overview of the features and functionality of the Salesforce Analytics Cloud.

The Analytics Cloud, or Wave Analytics as it is also referred to, is a cloud-based business intelligence platform that enables the connection of disparate data sources to form interactive data views (or visualisations) that can be distributed via dashboards. A key concept here is the empowerment of business users to deliver their own insights; in fact usability is often described as the defining feature, perhaps alongside mobile access.

From a history perspective, during 2013 Salesforce acquired a cutting edge BI platform via the EdgeSpring acquisition. The EdgeSpring platform included the EdgeMart data storage service plus the Lens dynamic visualisation engine; both of which (one could safely assume) feature strongly in the Analytics Cloud architecture.

The first release of the Salesforce branded Wave Analytics took place at Dreamforce 2014, where the dynamic visualisation capabilities drew significant attention. The offer at this stage was expensive and complex in terms of licensing (Explorer and Builder user licenses plus platform setup fee/license), platform-led and focused on enterprise deployments. The perceived complexity and cost aspects particularly impacted negatively upon adoption. A second generation of Wave Analytics was launched at Dreamforce 2015, this time with a simplified license model and a focus on prebuilt analytics apps for Sales, Service and ultimately Marketing. The introduction of prebuilt Wave Apps offers two clear benefits: template-based, simplified deployment and tighter, cross-cloud integration (with Sales and Service Cloud predominantly). This latter point is key; end-users shouldn’t have to navigate consciously between distinct analytic and transactional views, the two services should be seamlessly blended. This is the key differentiator Salesforce will be targeting to drive adoption.

The current set of prebuilt Wave Apps is listed below. There is also a growing number of 3rd party Wave Apps being developed on the Wave Analytics Platform; FinancialForce (ERP Wave Apps; Accounting and Supply Chain) and Apttus (Quote-to-Cash Intelligence App) are notable examples.

Sales Wave Analytics App: A 9-step wizard captures parameters relating to segmentation, lead funnel and opportunity pipeline fields plus additional Sales Cloud related dimensions and measures; on completion an App is created with datasets populated via a number of auto-launched dataflows.

Service Wave Analytics App: As above with Service Cloud related dimensions and measures (7-step wizard).

Event Monitoring Wave App: Event log and setup audit trail datasets enable analysis of org and user behaviour.

Wave for B2B Marketing App: Consolidation of Sales Cloud and Pardot data to enable analysis of marketing impact on sales etc.

Key Concepts

Wave Assets – App: Analogous to a folder, an App contains a logical grouping of dashboards, lenses and datasets. Apps can be Shared or Private.

[Image: wave-assets-app]

Wave Assets – Dashboard: A dashboard is a composition of charts, metrics, and tables based on the data provided by one or many lenses.

[Image: wave-assets-dashboard]

Wave Assets – Lens: A lens is a view on to the data provided by a dataset. The definition of a lens encapsulates both a query and a visualisation for a specific analysis. A Lens can be clipped; the effect of this is to copy the Lens query to a Step within the most recently used dashboard.

[Image: wave-assets-lens]

Wave Assets – Dataset: A dataset provides the source analytical data in a representation optimised for interactive visualisation. Data Sources can be Salesforce objects, uploaded files or partner connectors (Informatica, Jitterbit etc.). Fields added to a dataset are defined as date, dimension (qualitative) or measure (quantitative) type. Predicates can be added, with filter logic, that define record-level permissions which reflect Salesforce record ownership, management visibility or team/account collaboration.

[Image: wave-assets-dataset]

Dataflow: A dataflow is a set of instructions (in JSON format) that specifies the data to extract and transform from Salesforce objects or datasets.

[Image: wave-dataflow]
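
A minimal sketch of a dataflow definition is shown below, using hypothetical node names, object and fields; sfdcDigest extracts the object data and sfdcRegister registers the resulting dataset.

[sourcecode language="text"]
{
  "Extract_Opportunities": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Opportunity",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Amount" },
        { "name": "StageName" }
      ]
    }
  },
  "Register_Opportunities": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "OpportunityDataset",
      "name": "Opportunity Dataset",
      "source": "Extract_Opportunities"
    }
  }
}
[/sourcecode]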

Visualisation: A single analytical representation of data (chart, tabular etc.) underpinned by a query.

Dimension: A dimension is a qualitative value such as Product or Region. Data analytics are primarily comprised of measures projected over multiple dimensions.

Measure: A measure is a quantitative value such as Price or Quantity. Mathematical operations can be applied to measures to calculate aggregates.

Architecture

The diagram below shows the main building blocks of a Wave Analytics App and the flow of data.

[Image: wave-analytics-architecture]

Key Features

Cross-Dataset Faceting: Faceting enables steps to auto-filter in response to filters applied to a related step. Steps that share the same dataset facet by default; cross-dataset faceting can be defined directly in the dashboard designer.

Trend Wave Dashboards: Trended Datasets can be created from Salesforce reports. A snapshot of the report data is created each time the dataset is updated by the trend schedule. Each snapshot is limited to 100K rows.

Bulk Actions (Spring ’17): Table widgets can invoke a custom bulk action where the underlying SAQL query is passed to a designated Visualforce page. Apex page controller code can then execute the SAQL query (via the Wave Analytics API) and apply custom logic to results. This provides a flexible integration point, where analytics can be used to drive action such as campaign creation, email sending and so on – powerful stuff.
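
A hedged sketch of the Apex side of such a bulk action is shown below; it assumes the SAQL string is handed to the Visualforce page controller and then executed against the Analytics REST API query resource (class name, endpoint reachability from Apex and API version are assumptions).

[code language="java"]
// Illustrative sketch only - execute a SAQL query passed from a Wave table widget bulk action.
public with sharing class WaveBulkActionController {

    public String saql { get; set; } // assumed to be populated from the bulk action page parameter

    public String executeSaql() {
        HttpRequest req = new HttpRequest();
        req.setMethod('POST');
        req.setEndpoint(Url.getSalesforceBaseUrl().toExternalForm() + '/services/data/v39.0/wave/query');
        req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, String>{ 'query' => saql }));

        HttpResponse res = new Http().send(req);
        // Apply custom logic to the query results, e.g. campaign creation, email sending etc.
        return res.getBody();
    }
}
[/code]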

Smart Data Discovery (Spring ’17 tbc): This feature relates to the integration of BeyondCore (another Salesforce acquisition) into the Analytics Cloud UI. BeyondCore adds statistically significant insights such as unbiased answers, explanations and recommendations.

Analytics Home (Spring ’17): The new Analytics home page allows commonly used apps to be pinned and displays notification tracking with convenient links directly to the related dashboards. In new Wave orgs the Analytics tab can be accessed directly within the Salesforce UI; older orgs require a custom tab plus a Visualforce container page.

Dashboard Annotations: Widgets within a dashboard can be enabled for collaboration via Annotations. This feature is natively integrated with Chatter; annotations will appear as Chatter posts and vice-versa. @Mentions are also supported.

Smart Notifications: Number type dashboard widgets can be configured for smart notifications; criteria are added to define the notification logic and a query schedule is defined. Notifications appear in the app and the mobile app, and are also sent via email.

Salesforce Integration Points

Lightning Experience: The Wave Dashboard component enables dashboards to be added to Lightning Home Pages, Record Pages and App Home Pages defined within the Lightning App Builder.

Salesforce Classic: Wave Analytics Assets appear listed in the palette of the Enhanced Page Layout editor and can be added directly to page layouts with context supplied via field mapping. The Visualforce component enables dashboards to be embedded and integrated at any entry point available to Visualforce (Custom tabs, links/buttons etc.). The component supports filtering to enable context to be set.

Communities users (Customer Community Plus and Partner Community licenses) can view Wave Analytics dashboards embedded in a Visualforce page.

References

Analyze Your Data – 700+ page PDF
Wave Analytics Platform Setup Guide
Wave Analytics Data Integration

Salesforce Standard Reporting (3 of 3) – Best Practices

This final post in the Salesforce Standard Reporting series outlines a selection of best practice concepts and techniques to be considered in the delivery of on-platform reporting solutions that maximise the value of the standard reporting tools.

The key message of this series is simple; a solid understanding of how the Salesforce standard reporting tools work in practice, and the reporting patterns supported, can avoid the necessity to employ additional, expensive off-platform reporting tools. The earlier this thinking is applied, the greater the likelihood of success.

Best Practice Considerations

In no particular order, the following points provide a non-exhaustive selection for consideration.

  • Solution Design
  • Perhaps the most obvious point, but equally a common oversight, is the consideration of key performance indicators during the solution design phase. This approach shouldn’t entail a full coverage of all reporting requirements; instead a selection of exemplar reporting use cases that represent a broad set of required outputs should be documented in clear concise terms and factored into the overarching solution design. This early consideration mitigates the inherent risk of the classic “reporting workshop” held once the solution design is no longer emergent; the impact of which can often be the introduction of off-platform reporting solutions with their cost and security implications.

  • Data Model
  • A Salesforce data model, in physical terms, will not necessarily conform to standard relational database normalisation principles; other factors must be considered. The sharing model is one such consideration, where record access or indeed object access permissions mandate a deviation from standard third normal form. Reporting is another significant area of consideration; constraints in respect of the depth of object relational hierarchies and the ability to traverse parent-to-child relationships must be accommodated within the data model design to maximise the potential of the standard reporting tools. Techniques in this regard include reporting convenience lookups, which are added to bridge parent-to-child and sibling relationships, and restructuring hierarchical data to limit the depth of hierarchy to four levels.

  • Sharing Model
  • The standard reporting tools fully respect the implemented sharing model and as such a complex, over-specified sharing model design can inhibit their potential use. The sharing model design often reflects the transactional processing requirements for record visibility, but not necessarily the reporting need. To avoid creative workarounds during report production it is imperative that the sharing model design reflects both sets of requirements.

  • Report Types
  • Standard Report Types are maintained by the platform and require zero administration; new fields are exposed automatically. As such, wherever possible build reports using standard report types. Custom Report Types require ongoing maintenance, but can be highly useful in providing a clear data-set for a focused purpose or where the standard report types are insufficient. The implementation of a limited number of complementary custom report types with clear descriptions that adhere to a strict naming convention is the best practice approach. A standardised approach here promotes re-use. A typical implementation for reporting on the Salesforce platform involves end-users developing their own reports; for this to work efficiently they should be provided with a clear set of intuitive report types upon which to work.

  • Conventions
  • Given that reports are often developed by business users and not administrators or developers, it can be challenging to maintain an ordered state. To mitigate this risk, a strict naming convention and structure should be adopted for report folders and exemplar reports should be provided that exhibit a standardised approach to report naming (and description). It is a best practice to conduct periodic reviews of the reporting environment with the business users to ensure standards are applied, inefficiencies are avoided (such as duplication), platform features are being exploited optimally and ongoing training requirements are identified and addressed.

    [Image: Folders]

  • Art of the Possible
  • Clear communication in regard to the capabilities and constraints of the standard reporting tools is imperative to the successful implementation of a reporting solution. A demonstration environment configured with contextualised reports and dashboards which showcase the art-of-the-possible can be an effective communication tool. Business users and the project team should have access to the demo org to explore the possibilities and to establish their own frame of reference.

    This approach can aid understanding (and therefore increase usage) of less obvious concepts such as historical trend reporting, reporting snapshots, dynamic dashboards, dashboard filters and joined reports.

  • Visibility
  • A common oversight in the implementation of an effective on-platform reporting solution is the visibility of reports. Reports should not be hidden away on the Reports tab; instead all possible entry points and display options should be considered as part of an overarching report visibility model. Examples in regard to entry points include custom links and buttons on detail pages, perhaps with some level of parameterisation. Examples in regard to display options include report charts added to detail pages (Embedded Analytics) and console sidebars (Summer ’15 feature). A further consideration is the inclusion of report charts in Visualforce pages (via the reportChart component); this approach avoids the requirement to address the underlying data directly in Apex code, as the reporting engine can be applied instead.

  • Collaboration
  • Reports and analytics can provide important data visualisations and business insight that should serve as the basis for employee collaboration. An effective Chatter implementation model should therefore encourage communication and sharing around reports and dashboards such that the internal conversation is captured.

    [Image: Dashboard Collaboration 1]

    [Image: Dashboard Collaboration 2]

  • Active not Passive
  • Reports and analytics are typically implemented to deliver outputs in a passive state, i.e. the report runs on-demand or by schedule and the output is provided. A final best practice to consider is the active state where report outputs are evaluated against defined conditions and action (email, post, Apex script etc.) is taken automatically. Reporting Notifications provide active state options that can be a powerful tool in reducing report-noise and driving actions proactively from significant data conditions.

    [Image: Notifications]

Salesforce Standard Reporting (2 of 3) – Report Builder

This post is the second in the Salesforce Standard Reporting series and serves to outline the capabilities and constraints of the standard Report Builder. I’m using the term Report Builder loosely here as a term that groups the majority of the on-platform standard reporting capabilities.

The key message of this series is simple; a solid understanding of how the Salesforce standard reporting tools work in practice, and the reporting patterns supported, can avoid the necessity to employ additional, expensive off-platform reporting tools. The earlier this thinking is applied, the greater the likelihood of success.

Report Builder Capabilities
The following sections outline the key capabilities of the Report Builder with a view to establishing the context within which the supported reporting patterns can be described.

  • Fields
  • The right-hand-side of the Report Builder UI displays the sections and fields defined within the selected Report Type. All reports are based on one principal Report Type, whether Standard or Custom.

    Note, at a conceptual level it can be useful to mentally picture the data presented by the report type as single-big-table of denormalised data (just rows and columns like a spreadsheet) with the maximum number of rows equating to the number of child records at the lowest level.

  • Report Formats
  • Report Builder supports 4 formats:

    1. Tabular
    A simple view comprised of an ordered set of columns, as per a spreadsheet; with no summarisation of data, the lowest-level input records are presented.

    2. Summary
    Extends the tabular view to enable specified field values to be used to group input records with subtotals per grouping.

    3. Matrix
    Extends the summary view to enable both row and column groupings, as per Pivot tables in the Excel context.

    4. Joined
    Enables multiple sub-reports (blocks) to be added to a single report, with each block being linked to a specific report type and configured with its own fields, filters and sorting.

    [Image: 4. Joined Report]

    With a joined report input records can be grouped across the blocks using fields from common parent objects; such fields are listed under Common Fields. The common parent object must be applicable across all report types added.

    [Image: 5. Joined Report Run]

  • Filters
  • Each report type added to a report has a Dynamic Filter (e.g. “My Accounts”, “My Team’s Accounts”), a date field filter (which requires a date field to be specified, plus a range), and custom filters where any field in the report type can be filtered against static value or relative date value criteria. If a historical trending report type is used then a Historical Date filter is added which allows selection of up to 5 snapshot dates.

  • Bucket Fields
  • Bucket fields can be defined which map a list of input field values to a bucket field output value. The bucket field can then be added as a column to provide a summarised view of the input data. Bucketing can be useful for use cases such as grouping strategic accounts or collapsing extensive ranges of data into a smaller, distinct set of High, Medium and Low range values. Note, bucket fields aren’t available for Joined Reports.

  • Formulas
  • Report Builder supports Custom Summary Formulas and Cross Block Custom Summary Formulas. In the former case, the min/max/sum/average value of numeric fields can be used as inputs to a formula expression, the output of which displays at the selected summary level. Each report can have 5 such formulas. Cross Block Formulas extend the same approach, but enable block specific inputs from multiple blocks to be assembled into a single expression.

  • Report Generation
  • Once defined a report can be run to generate the output. In view mode Report Builder supports filter manipulation, data export, a printable view (html) and the ability to schedule future runs with email distribution to named users or public groups.

  • Report Charts
  • Report Builder enables summarised report data to be presented on a chart located directly within the report. Summary groups can be added to the chart as rows or columns (depending on the type of report vertical/horizontal bar etc.), with a selected aggregated value available on the opposing axis. Up to 2 levels of grouping can be selected per chart, with combination charts enabling additional aggregated values to be plotted; this potential ranges from 1 to 3 values depending on the type of chart. Cumulative Line Charts enable the aggregate values to be plotted as a cumulative figure.

    Report Builder has other capabilities not mentioned in the outline above; multiple currency handling, conditional highlighting and the ability to invoke a parameterised report via URL are notable examples. However for the purpose of setting context, the above provides a sufficient coverage.

Report Builder – Benefits
The Report Builder tool is a classic transactional reporting tool, with a deliberate focus on usability over complexity. In my view this is definitely the right approach; report production should be an end-user concern whenever possible. A technically-oriented, complex report builder would clearly detract from that possibility. Self-sufficient end-users, confident that they can report and track accurately can be the best adoption advocates possible. This best-case scenario has the added benefit of removing any potential resource bottleneck associated with the implementation project/IT team delivering all the reporting outputs. The usability of the report builder correlates to its functional simplicity, which in turn imposes constraints on the type of reporting possible. It is the case however that the Report Builder does provide a good level of coverage for the majority of transactional reporting patterns, some examples are listed below.

  • Transactional Reports
  • Basic tabular reports are easy to produce; the WYSIWYG drag-and-drop editor provides a rich, intuitive design environment for basic transactional reports.

  • Summary Reports
  • Basic summary reports that can be satisfied within the grouping limits are straightforward to produce. For summary reports, 3 levels of grouping can be applied; for matrix reports it is possible to have 2 column groupings and 2 row groupings, so 4 in total. Excel allows an unlimited number of row or column fields in a PivotTable report (limited by system memory); in this light 4 appears restrictive.

  • Exception Reports
  • An often overlooked capability of Report Builder is the Cross Filter (the Row Limit filter is another example). Cross Filters allow explicit control over the join type (with = inner, without = outer) and enable the production of exception reports such as “Accounts without Cases”. Within a Cross Filter, up to 5 sub-filters can be added that increase the selectivity applied to the related object. This type of filter is typically used in conjunction with rollup summary fields to cover scenarios such as “Closed Opportunities over time, without a particular product”.

  • Embedded Analytics
  • Embedded data visualisations within report outputs, or page layouts provide a convenient method for combining the headline statistics and underlying detail in a single view. In the former case, if you then consider the native ability to schedule the generation and distribution of reports, the potential business utility is considerable.

  • Dashboards
  • Report Builder produced reports underpin standard dashboard components, enabling a single report to deliver both the transactional view and data visualisations drawn from the same data. The dashboard features are extensive but key features beyond the obvious are Dashboard filters and Dynamic dashboards. In regard to the former, up to 3 filters can be applied to drill into the data, each filter may have up to 10 (50 hard limit) filter options. With the latter it’s possible to define a single dashboard that inherits the record visibility of the viewing user. This relatively new capability removes the historic requirement to maintain duplicate dashboards set to run with different specified users. Dashboard filters can be applied to Dynamic dashboards providing a high degree of flexibility and through consolidation a significant reduction in the number of dashboards required.

  • Scheduling and Distribution
  • Scheduling reports (and dashboards) with automated distribution to Salesforce users is easy to do and a very effective communication tool.

  • Report Notifications
  • Report Notifications is a recent feature addition and a very interesting one. In short, users subscribe to reports by providing a time or recurrence pattern to run the report, user-defined conditions to evaluate against the generated report data and selected actions (email, post, Salesforce1 Notification, Apex code) to invoke if the condition logic evaluates to true. This powerful capability enables reports, typically passive in nature, to be proactive drivers of action. The notification model also avoids the common issue that scheduled reports, regardless of content, tend to lose their effectiveness over time.

  • Report Snapshots
  • Report Snapshots enable a tabular or summary report output to be mapped to fields on a custom object; scheduled generation of the report populates the custom object with the report data. This feature is limited to the population of up to 2,000 records in the target object per execution.

A final benefit associated with Report Builder is the wealth of packaged reports provided as standard and also available on the AppExchange. It always makes sense to utilise the available reports as far as possible and build any exceptions. It’s surprising how many implementations ignore the packaged reports and expend time (and money) re-creating similar outputs.

Report Builder – Limitations
As introduced in the preceding section, Report Builder is implemented to deliver a robust, simple set of reporting functions focused on end-user usability over complexity. The following set of limitations highlight some of the reporting patterns not directly supported by the Report Builder.

  • Object Relationships
  • Report Builder consumes report types in the construction of reports; each report type is limited to 4 object relationships (meaning parent-to-child relationships), and a further 56 objects can be referenced by lookup. Note, if a single report contains columns from more than 20 objects an error is thrown. The parent-to-child relationship limit is key to understand when deciding upon the primary object for a custom report type.

  • Comparative Analysis
  • Report types do not support the addition of sibling objects; the parent-to-child object relationships are strictly linear. One common case where this constraint can manifest is the production of comparative analysis style reports which contrast records in one object against another. For example, consider a scenario where date-based budget values and actual values are held in separate objects, both linked to a parent cost centre object. With a joined report it’s possible to bring the data together and group by cost centre – but there’s no direct means to group the data by date to deliver a side-by-side analysis of actual versus budget per date period. There are techniques to mitigate this issue; the next post in the series will cover such concepts.

  • Date Ranges
  • Where input records represent a date range between two boundary date fields (Start Date and End Date for example), the Report Builder requires that one of the date fields is used in date-based grouping. Individual dates within the range can’t be referenced. Reporting tools often provide a pivot function with dynamic columns that enables ranges to be exploded within the reporting layer, avoiding the data storage overhead of storing the transactional data at the lowest level.

  • What-if Analysis
  • Report Builder does not enable the input data to be adjusted, thereby precluding scenario-based, what-if analysis type reports. There is of course the option to export the data to Excel, which does cater for this requirement.

  • Formulas
  • Report Builder supports sub-total, grand total and cross-group summary formula fields, but not row-level formula fields. For such requirements, a custom field can be added to the object, but the means to isolate the formula field to a report would help maintain a lean configuration.

  • Sub Reports
  • Joined reports, dashboards, embedded analytics and report charts provide effective, but simple, capabilities for the delivery of composite report views. Many reporting tools support the concept of sub-reports, or nested data, where multiple transaction level views, with or without common relationships, can be combined.

  • Report Distribution
  • A common complaint raised in relation to the standard reporting tools is the inability to schedule a report for distribution to individuals who are not Salesforce users.

  • Historical Trend Reporting
  • The Historical Trending Custom Report Types expose data for the last 3 months only, and Reporting Snapshots are limited to 2,000 (summarised) records per execution. One or both of these capabilities will provide an effective historical data analysis solution for many customers; at larger data volumes and for larger organisations the limits may be prohibitive.

The next and final post in the “Salesforce Standard Reporting” series outlines the key data model techniques that can help maximise the potential of the standard reporting tools.

Salesforce Standard Reporting (1 of 3) – Report Types

A key contributor to the successful implementation of Salesforce is an informed approach to reporting. I’ve said this many times and have been meaning to complete this short set of posts for a long time. The basic idea is that the delivery of key reports and analytics on-platform using the standard reporting tools should be a primary objective for all implementation projects. This approach may not always be feasible where requirements are complex or atypical, however the standard reporting features should be exhausted before an off-platform business intelligence tool or analytics cloud is employed. Such complementary services come at a significant price point.

In order to maximise the potential of the standard reporting features, both expertise and experience must be applied early in the project. Early consideration of reporting requirements enables a data model design that supports the production of required reports via the standard tools; this should never be an afterthought but often is.

This short series of three posts serves to outline the key concepts, the reporting patterns supported by the standard tools and critically the implications on the data model to be considered. This latter point is imperative as an effective Salesforce physical data model strongly reflects the reporting of data; a key differentiator from a pure physical data model applied to a traditional RDBMS.

The first post in the 3 part series focuses on the foundational concept of Report Types.

Report Types
In the context of outlining the capabilities of the standard reporting tools, the following concepts are significant.

  • Standard Report Types
  • Report types abstract the data model into logical reporting entities on top of which reports are built. This model can’t be circumvented.

    [Image: 1. Select Report Type]

    Standard Report Types are automatically created for all Standard Objects and for Custom Objects when the “Enable Reports” or “Allow Reports” checkbox is checked. Standard report types can’t be edited; new fields are automatically added.

    For unrelated objects, or parent objects in lookup relationships only, a single Standard Report Type is created, named as per the object plural name. E.g. “Rubric Scores”. Note there are exceptions to this where enabled standard features (e.g. Salesforce to Salesforce) add object relationships which result in additional report types (e.g. “Rubric Entries with Connections”).

    For parent objects in master-detail relationships multiple Standard Report Types can be created; one per master-detail relationship. E.g. “Rubric Entries with Questions”. Additional report types are added for each lookup relationship on the child object (e.g.”Rubric Entries with Questions and Score”, “Rubric Entries with Questions and Criterion”), where Score and Criterion are parent object relationships of the lookup type. Additional report types are also added for each additional master detail relationship (e.g. “Accounts with Assessments and Courses”).

    Standard report types are not created for grand-child relationships, or for child objects in master detail relationships.

    Note, the applied naming convention uses the child field label for lookup relationships and the parent object plural label for master detail relationships.

    A key feature of Standard Report Types is the ability to navigate up a lookup relationship from child to parent. The example above, “Rubric Entries with Questions and Criterion”, is logically a parent-to-child (master-detail), then child-to-parent (lookup) traversal.

  • Custom Report Types
  • Custom report types (CRT) enable the definition of a report type that includes up to 4 levels of parent-to-child relationship, regardless of whether each relationship is lookup or master-detail type.

    [Image: 2. CRT Definition]

    Each CRT has a layout which can be configured with custom sections and selective field inclusion across the objects. Objects and fields can also have CRT-specific labels added and a flag set to include selected fields in new reports by default.

    Note, an often overlooked feature of Custom Report Types is the ability to add a parent object field regardless of whether the relationship is master-detail or lookup (although the UI indicates lookup only, both work). This de-normalisation technique is incredibly powerful; up to 5 levels of the parent object hierarchy can be traversed.

    [Image: 3. CRT Add Lookup Field]

    CRTs therefore enable the production of specialised, convenient report types providing a focused set of denormalised fields for a specific purpose. Many successful Salesforce implementations adopt a model where Salesforce Administrators or Developers produce the required CRTs, and the end users build the reports.

  • Historical Trending Custom Report Types
  • Historical Trending is a reporting feature, added in the Winter ’14 release, which can be enabled and configured on a per-object basis. For each enabled object a special CRT is added (e.g. “Rubric Entries with Historical Trending”), containing up to 8 selected fields (numeric, date, picklist, currency). Up to 3 months of data is available via the CRT.

    The next post in the “Salesforce Standard Reporting” series will cover the capabilities of Report Builder to consume the data exposed by report types.

Winter 14 Embedded Analytics

Super quick post on Embedded Analytics just in case this has passed anyone by and because it’s probably my favourite non-technical new feature for some time.

[Image: embedded_analytics_screenshot]

In short, it is now possible to embed two charts (not reports) into a standard page layout (custom objects and standard objects), blurring the lines slightly between layouts and dashboards. The source report must be of the summary or matrix type and include a chart. Note, the displayed analytics support manual refresh, but this is a limited resource (a user can refresh up to 100 report charts every 60 minutes; at the org level the limit is 3,000 report charts every 60 minutes).

Hopefully the embedded analytics capability will be enhanced over future releases to remove the 2 chart limit and to support non-chart report outputs. A blended approach to the composition of layouts, where dashboard components and standard page sections could be intermingled, would be great – particularly if this supported dynamically resizing Visualforce page components. I can dream for now at least. In the meantime the Winter ’14 functionality covers a key functional area typically addressed through technical customisation or insecure off-platform solutions using image formula fields (Google Charts anyone?). Standard functionality that directly removes the need to build such workarounds is a big positive.

Salesforce Analytics API

With Winter ’14 the new Analytics API becomes generally available (GA). This RESTful API consumes standard API call limits, uses OAuth authentication and JSON request/response message formats (representations).

The key functionality of the Analytics API is the execution of reports (in synchronous or asynchronous modes), the application of dynamic filtering and the inspection of report metadata.

Note, when experimenting with or testing any of the RESTful APIs (Force.com REST API, Chatter REST API, Analytics API etc.) the Apigee Salesforce API Consoles are an extremely convenient and useful tool – screenshot below.

[Image: apigee_screenshot]

Core API Resources

– Report metadata (GET /services/data/v29.0/analytics/reports/report_ID/describe)

– List recently viewed reports (GET /services/data/v29.0/analytics/reports/list)
Note, to obtain a full list of reports the standard REST API should be used to query the Report resource (via SOQL).

– Synchronous report execution (/services/data/v29.0/analytics/reports/report_ID)
GET for a simple execution (add includeDetails=true to the querystring if detail rows are required)
POST to apply filtering.

– Asynchronous report execution (/services/data/v29.0/analytics/reports/report_ID/instances) – POST to request an execution.
GET to retrieve a list of the execution instances.
GET to the ../instances/instance_ID to request the results of a single execution.

– Fact Maps
The JSON message format for a report execution, synchronous or asynchronous, provides the detail and summary data in a Fact Map structure. In short, the grouping keys provided in the groupingsDown and groupingsAcross structures of the response are concatenated to form a composite lookup key into the fact map, which holds the detail rows (the hasDetailRows response flag indicates their existence) and aggregate data.

For more on Fact Maps refer to the “Decode the Fact Map” section in the Analytics API Reference.
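
As a hedged illustration of the synchronous execution pattern and fact map decoding (the report Id is taken from the example response below; the callout assumes the org’s own REST endpoint is reachable from Apex):

[code language="java"]
// Illustrative sketch - run a report synchronously and read the grand total from the fact map.
HttpRequest req = new HttpRequest();
req.setMethod('GET');
req.setEndpoint(Url.getSalesforceBaseUrl().toExternalForm()
    + '/services/data/v29.0/analytics/reports/00O300000041iFQEAX?includeDetails=true');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);

// Fact map keys take the form "<down grouping key>!<across grouping key>"; "T" denotes the total.
Map<String, Object> response = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
Map<String, Object> factMap = (Map<String, Object>) response.get('factMap');
Map<String, Object> grandTotal = (Map<String, Object>) factMap.get('T!T');
List<Object> aggregates = (List<Object>) grandTotal.get('aggregates');
System.debug(((Map<String, Object>) aggregates[0]).get('value')); // 8 in the example response below
[/code]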

Use Cases
The Analytics API is a long awaited key enabler for mobile/tablet analytics use cases, allowing on-platform reports to be executed and the results accessed in a lightweight, constrained-device-friendly format. Without this API mobile app developers have been limited to the execution of SOQL queries via the standard REST API. With the API such developers have access to the recently viewed reports for a specific user, the report metadata, the ability to execute reports and then to pass the report results up to the graphics library of choice using a standard data format.

Outside of the mobile context, the API offers the same patterns to composite applications such as complementary web applications and 3rd party tools where datasets can be defined on-platform using reports (by end-users perhaps) and the data picked up for services such as mass email marketing etc.

Key Limitations
– 500 synchronous report executions per hour
– 1,200 asynchronous report execution requests per hour
– Asynchronous report results are available for 24 hours from execution

Report Response Example – Matrix Report
[sourcecode language="text"]
HTTP/1.1 200 OK
Sforce-Limit-Info:api-usage=21/5000000
Date:Sat, 28 Sep 2013 18:30:16 GMT
Content-Length:8705
Content-Type:application/json;charset=UTF-8

{
"hasDetailRows": false,
"attributes": {
"describeUrl": "/services/data/v29.0/analytics/reports/00O300000041iFQEAX/describe",
"instancesUrl": "/services/data/v29.0/analytics/reports/00O300000041iFQEAX/instances",
"type": "Report",
"reportId": "00O300000041iFQEAX",
"reportName": "Example Report"
},
"groupingsDown": {
"groupings": [
{
"value": "001Q000000PUcCtIAX",
"key": "0",
"groupings": [
{
"value": "2013-08-28",
"key": "0_0",
"groupings": [],
"label": "2013/08/28 – 2013/09/03"
}
],
"label": "New Business"
},
{
"value": "001Q000000Q0wLMIAX",
"key": "1",
"groupings": [
{
"value": "2013-09-18",
"key": "1_0",
"groupings": [],
"label": "2013/09/18 – 2013/09/24"
}
],
"label": "Existing Business"
},
{
"value": "001Q000000OwobqIAX",
"key": "2",
"groupings": [
{
"value": "2013-07-31",
"key": "2_0",
"groupings": [],
"label": "2013/07/31 – 2013/08/06"
}
],
"label": "Prospects"
}
]
},
"groupingsAcross": {
"groupings": [
{
"value": "Product Sales",
"key": "0",
"groupings": [
{
"value": "a0AQ00000054Q0uMAX",
"key": "0_0",
"groupings": [],
"label": "Product A"
},
{
"value": "a0AQ0000004YV8xMAX",
"key": "0_1",
"groupings": [],
"label": "Product B"
}
],
"label": "Product Sales"
},
{
"value": "Services",
"key": "1",
"groupings": [
{
"value": "a0AQ0000005G0OpMAX",
"key": "1_0",
"groupings": [],
"label": "Service A"
}
],
"label": "Services"
}
]
},
"reportMetadata": {
"name": "Example Report",
"id": "00O300000041iFQEAX",
"aggregates": [
"RowCount"
],
"groupingsDown": [
{
"name": "Client__c.Name",
"sortOrder": "Asc",
"dateGranularity": "None"
},
{
"name": "Projection__c.Date__c",
"sortOrder": "Asc",
"dateGranularity": "Week"
}
],
"groupingsAcross": [
{
"name": "MyProduct__c.Classification__c",
"sortOrder": "Asc",
"dateGranularity": "None"
},
{
"name": "MyProduct__c.Name",
"sortOrder": "Asc",
"dateGranularity": "None"
}
],
"reportType": {
"type": "MyProduct_CRT__c",
"label": "My Product CRT"
},
"reportBooleanFilter": null,
"reportFilters": [],
"detailColumns": [],
"reportFormat": "MATRIX",
"currency": null,
"developerName": "Example_Report"
},
"factMap": {
"1_0!T": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"0!1_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"1!0_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"1!0_1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"2!T": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"0!0_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"0!0_1": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"1!1": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"T!1": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"1!0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"2_0!1_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"2_0!0_1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"0_0!1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"0_0!0": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"2_0!0_0": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"T!0": {
"aggregates": [
{
"value": 7,
"label": "7"
}
]
},
"1_0!1_0": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"2_0!0": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"2_0!1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"1_0!0_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"1_0!0_1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"0!1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"T!1_0": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"0!0": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"1_0!1": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"1_0!0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"2!1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"2!0_1": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"2!0_0": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"0_0!0_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"0_0!0_1": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"2!0": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"T!T": {
"aggregates": [
{
"value": 8,
"label": "8"
}
]
},
"0!T": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"1!T": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"2!1_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
},
"0_0!T": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"T!0_0": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"1!1_0": {
"aggregates": [
{
"value": 1,
"label": "1"
}
]
},
"T!0_1": {
"aggregates": [
{
"value": 2,
"label": "2"
}
]
},
"2_0!T": {
"aggregates": [
{
"value": 5,
"label": "5"
}
]
},
"0_0!1_0": {
"aggregates": [
{
"value": 0,
"label": "0"
}
]
}
},
"allData": true
}
[/sourcecode]

Salesforce Exception Reports

I think it’s fair to say that consideration of reporting and analytics is not traditionally a technical architect’s direct concern. In the Salesforce context I always challenge this presumption and promote the idea that a project architect should absolutely be looking at the high-level analytical requirements during the foundation stage of the project. Why you may well ask? The answer to this is simple, in order to design a fit-for-purpose data model you must consider data coming out (via the native reporting tools) as well as data going in efficiently. I make this point in a prior post that outlines my view on the Salesforce development process. The number of projects I see where this thinking hasn’t been applied and 3rd party tools are wheeled in to address the gap continues to surprise me. Key to success here is understanding what can be achieved with the Salesforce reporting functionality. Which brings me on to the topic for this post – exception reports, a type of reporting not obviously catered for by the standard reporting tools. For instance – show me a list of Accounts without an Opportunity in the last year. Such reports are invaluable in identifying omissions, focusing sales activities, generating call lists, tracking inactivity etc. etc.

I’ll use 2 examples below to clarify how to approach such a scenario.

Simple Example.
Accounts without a closed Opportunity in the last 30 days.

For this report we use a “without” Cross Filter as per the screenshots. Cross Filters were added in Spring ’12 and enable records to be filtered based on the existence (or non-existence) of related records which meet defined criteria. Within a Cross Filter up to 5 sub-filters can be defined. Note, the report’s top level filters do not apply at the sub-filter level. Cross filters can be added for any related object, custom or standard; the limit is 3 per report (each with up to 5 sub-filters). Powerful functionality indeed, and a great convenience that avoids a Custom Report Type for the outer join. Note, the sub-filter is limited to fields on the related object; no cross-object sub-filter logic is possible.

[Image: simple exception report]

Complex Example.
Accounts without a Won Opportunity involving a certain Product Family in the last 60 days.

For this report we start with a Cross Filter report to satisfy the “without Won Opportunity” requirement. From there we need an indicator (of the existence of an Opportunity Product linked to a product of the specified family) at the Opportunity level which can be used in a sub-filter. This is actually straightforward to do and relies on the old school technique for exception reports, namely the use of rollup summary fields and simple workflow.

Solution components :
Opportunity Product – Custom Field – ProductFamilyHardwareFlag
Number (18,0 default 0)

Opportunity Product – Workflow Rule – “Opp Product – New or Changed Product”
Evaluate – Created and Edited – Formula (OR(ISNEW(), ISCHANGED( Product2Id )))

Opportunity Product – Workflow Action – “Opp Product – Set ProdFamilyHardwareFlag”
Field to update [ProductFamilyHardwareFlag] with Formula = IF( TEXT(Product2.Family)='Hardware', 1, 0)

Opportunity – Custom Field – ProductFamilyHardwareCount
Rollup Summary Field (Opportunity Product – SUM – Field to aggregate = ProductFamilyHardwareFlag). All records included.

How does this all work? In short, a workflow rule on Opportunity Product sets a numeric field equal to 1 or 0 based on the product family meeting the defined criteria. The numeric field is then rolled up to the Opportunity level via the rollup summary field, which is then applied in a sub-filter as below.

[Image: complex exception report]

Note, this is just an example to illustrate the concept. An actual implementation would typically require a more generic approach.