Integration and Data Management is the last section of the Salesforce Sales Cloud Consultant exam and constitutes 8% of the total score. This section covers topics such as common integrations, data migration, large data and transaction volumes, and data exchange between organizations. Without further ado, let's get into it!

Guidelines for Integration and Data Management

  • Explain the use cases and considerations for integrations common to Sales Cloud implementations.
  • Explain the use cases and considerations for data migration in Sales Cloud.
  • Given a scenario, analyze the implications and design considerations of large data and transaction volumes.
  • Explain how integration facilitates the exchange of data between Salesforce organizations.

Integration Patterns

  • Integration Patterns (strategies) are used to integrate Salesforce functionality with 3rd party applications.
  • List of Integration Patterns:
    • Batch Data Synchronization
      • Data is imported into Salesforce or exported from Salesforce reflecting changes on a regular basis (nightly or weekly).
      • A 3rd party ETL (Extract, Transform and Load) tool can be used for data export and import.
      • The benefit is that imports and exports can happen during non-business hours, such as overnight (particularly suitable for processing large amounts of data).
      • Example: import leads to Salesforce, export customer info to external system regularly
    • Remote Process Invocation
      • Request and Reply
        • Salesforce calls a remote system synchronously and waits to process the response from the remote system as part of the same transaction as the initial call.
        • A custom Visualforce page or button, a trigger or an Apex job can be used to initiate the callout to the remote system in a synchronous manner.
        • Example: when opportunity is Closed Won, a button is clicked to create sales order on remote system.
      • Fire and Forget
        • Salesforce calls a remote system asynchronously and does not wait for a response from the remote system. Once the process is completed, the remote system updates Salesforce in a separate transaction.
        • A Workflow Outbound Message can be used to send information to the remote system asynchronously.
        • A custom Visualforce page or button, a trigger or an Apex job can also be used to initiate the callout to the remote system in an asynchronous manner.
        • Example: Outbound Message can be sent to a remote system when opportunity is set to Closed Won
    • Remote Call-In
      • A remote system can create, retrieve, update or delete data stored in Salesforce using the SOAP or REST API.
      • Example: order status in Salesforce is updated when the order is finished processing in remote system.
    • UI Update based on Data Changes
      • Streaming API can be used to update the Salesforce UI in real time when triggered by a Salesforce event.
      • A PushTopic can be created to send notifications to certain users based on data changes in Salesforce, which allows the Salesforce UI to be updated without refreshing the screen.
      • Example: send notification to manager when an opportunity is Closed Won by sales reps
    • Mashup
      • A remote system website might look like part of Salesforce by adding it to a web tab or page layout to access information from the external system when required.
      • NOTE: Data from external system is not stored in Salesforce.
      • Salesforce Connect can be used to create external data source and external objects for the records in external system.
      • External Objects can be set writable to allow direct modification on external records in Salesforce.
      • Example: Sales orders from remote system can be viewed and modified in Salesforce by creating a web tab for the external system portal or integrating external system using Salesforce Connect.
  • External Services:
    • An External Service can be used to connect to an external service provider and bring its logic into Salesforce.
      external-services
    • Process of using External Service in Salesforce:
      1. A developer creates a schema definition that describes the API.
      2. An administrator creates a named credential to authenticate to the external service provider's endpoint using its URL.
      3. The administrator creates a new remote site setting to authorize endpoint access.
      4. The administrator registers an external service using the named credential and schema definition.
      5. The administrator creates a flow using the Apex actions generated by the external service.
    • Example of registering an external service:
      1. Prepare schema definition:
        • supports the following actions:
          • GET getAccount
          • POST addAccount
          • PUT updateAccount
          • DELETE deleteAccount
            external-service-support-apex-actions
      2. Create Named Credential in Setup > Named Credentials:
        external-service-create-named-credential
      3. Create Remote Site Setting in Setup > Remote Site Settings:
        external-service-create-remote-site-settings
      4. Register External Service in Setup > External Services:
        external-service-register-external-service
      5. Use Apex Actions in Flow:
        external-service-apex-action-in-flow
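The Remote Call-In pattern above can be sketched with a plain REST call. The snippet below is a minimal sketch of an external system updating a Salesforce record over the REST API; the instance URL, API version, access token and record Id are all hypothetical placeholders, and the access token would come from a separate OAuth flow.

```python
import json
import urllib.request

# Hypothetical values: replace with your org's instance URL and an OAuth
# access token obtained separately (e.g. via the OAuth 2.0 web server flow).
INSTANCE_URL = "https://example.my.salesforce.com"
API_VERSION = "v58.0"
ACCESS_TOKEN = "REPLACE_WITH_ACCESS_TOKEN"

def build_update_request(sobject, record_id, fields):
    """Build the REST API PATCH request that updates one Salesforce record."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/sobjects/{sobject}/{record_id}"
    return urllib.request.Request(
        url,
        data=json.dumps(fields).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

# A remote order system marking an order as activated once processing finishes
# (the record Id here is made up for illustration):
req = build_update_request("Order", "801000000000001AAA", {"Status": "Activated"})
# urllib.request.urlopen(req)  # uncomment to send; a successful update returns 204 No Content
```

The same request shape covers the "order status in Salesforce is updated when the order is finished processing in the remote system" example from the Remote Call-In bullet.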

Data Migration

  • Data Migration means transferring data from an external source into the Salesforce system.

  • Data Migration Process:

    • Identification of Stakeholders
      • Identify who the stakeholders are and communicate the migration plan with end users
    • Data analysis
      • Understand what data needs to be migrated
      • Identify the business process
      • Determine whether data needs cleanup
      • Determine how objects are mapped in Salesforce
      • Identify the effect of data migration to minimize the negative impact
    • Data and system preparation
      • Cleanup data as needed
      • The order of migration should be taken into consideration based on the object relationships in Salesforce.
      • Basic setup is done in this phase, such as creating profiles, adding validation rules, workflows, etc.
    • Data migration testing
      • A sandbox can be used to test data migration after configuration is set up properly.
      • A full test migration to the sandbox helps identify issues before migrating to production.
    • Final data migration and validation
      • Data migration to production can be performed after a successful full data migration to the sandbox.
      • Log in as different users to ensure permissions are set properly and end users do not encounter unexpected behavior.
  • Data Loader:

    • Data Loader supports the following operations:
      • Insert
      • Update
      • Upsert
      • Delete
      • Export
    • Data Loader can be used via its user interface (Windows and Mac) or the CLI (Windows only).
    • CLI can be used to automate batch operations.
    • Data Loader supports up to 5 million records of all objects.
    • Field history objects such as AccountHistory, ContactHistory, etc. are available for export in Data Loader.
    • Bulk API can be used for better performance when dealing with millions of records (default is SOAP API).
      • Bulk API speeds up the process as it runs asynchronously and uses parallel processing, though serial mode can be enabled to process batches one at a time.
  • Data Import Wizard:

    • Data Import Wizard supports up to 50,000 records at a time.
    • Data Import Wizard supports the following operations:
      • Add new records
      • Update existing records
      • Add new and update existing records
    • Triggering workflow rules and processes is optional.
    • Data Import Wizard can perform duplicate checking on Contact/Lead/Account (match Contact/Lead by Name or Email, match Account by Name & Site/Customer Id (External Id)).
    • Contacts or Leads can be assigned to campaigns automatically.
      data-import-wizard-details
  • Data Migration Order (reference only):

    1. Users
    2. Accounts
    3. Contacts
    4. Opportunities
    5. Products
    6. Product Line Items
    7. Cases
    8. Leads
    9. Campaigns
    10. Campaign Members
    11. Contracts
    12. Assets
    13. Solutions
    14. Custom Objects
    15. Activities
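The migration order above boils down to loading parent objects before their children. A minimal sketch of deriving such an order from object relationships, using Python's standard-library topological sorter (the dependency map below is illustrative, not an exhaustive model of Salesforce relationships):

```python
from graphlib import TopologicalSorter

# Each key can only be loaded after the objects it maps to already exist in the
# org, e.g. a Contact needs its parent Account Id on insert.
depends_on = {
    "Accounts": ["Users"],
    "Contacts": ["Accounts"],
    "Opportunities": ["Accounts"],
    "Product Line Items": ["Opportunities", "Products"],
    "Cases": ["Accounts", "Contacts"],
    "Campaign Members": ["Campaigns", "Contacts", "Leads"],
}

# static_order() yields every object with all of its parents listed first.
load_order = list(TopologicalSorter(depends_on).static_order())
print(load_order)
```

Any ordering the sorter produces is valid as long as every parent precedes its children, which is why the list above is labeled "reference only".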

Large Data and Transaction Volumes

  • Impacts of large data and transaction volumes:
    • Pages load slowly across the application
    • Reports and dashboards run slowly or time out
    • Sharing rules take a long time to recalculate
  • Performance optimization:
    • Store large data volumes externally
    • Segment/partition data with divisions
    • Use custom indexes to improve query performance:
      • Submit a request to Salesforce Support to have a custom index created on certain field types
      • Indexes cannot be created on the following:
        • multi-select picklists
        • textarea (long)
        • textarea (rich)
        • non-deterministic formula fields
        • encrypted fields
      • Indexes are created on External Id fields
      • Certain standard fields are indexed automatically:
        • Name
        • SystemModstamp
        • CreatedDate
        • Division
        • Email
        • Id
        • RecordTypeId
        • Foreign key relationships, ex: AccountId, ContactId
      • NOTE: a record is searchable only after it has been indexed for search; search indexing happens automatically when records are created or updated in Salesforce.
      • NOTE: the search index is a separate concept from 'indexed fields' (query indexes); a field does not need a query index to be searchable.
    • Use of skinny tables:
      • Skinny tables allow combining fields from the table that contains standard fields and the table that contains custom fields into a new table to avoid joins.
      • Skinny tables contain frequently used fields that remain in sync with the fields in the source tables.
      • Skinny tables can be created on Account, Contact, Opportunity, Lead, Case and custom objects (fields from other objects are not supported).
      • Skinny tables can be enabled by contacting Salesforce Support.
    • Archive or remove obsolete data:
      • Data not being used should be archived to reduce data volumes.
    • Optimize sharing rules
      • Recalculation takes a significantly long time if there is a large data volume to process.
      • Recalculation can be deferred while processing a large number of data changes, which should be considered.
    • Use of Bulk API:
      • When processing large amount of data asynchronously, Bulk API should be used.
      • Data records are loaded in parallel mode which allows running multiple batches at the same time.
      • When deleting large data volumes, Bulk API is preferred along with the hard delete option.
    • Optimize workflow and process builder:
      • Prevent or remove unnecessary or repeating/duplicated actions or operations.
    • Data import considerations:
      • Triggers, workflow rules and validation rules run on data import, so their impact should be considered.
      • Be careful when sharing rules are in place, recalculation might cause overhead.
    • Use filters:
      • Use filters on list view or reports and dashboards whenever possible to avoid loading tons of data.
      • Report filters should be created on indexed fields to improve performance.
      • Only include fields that need to be included in the report.
    • Report considerations:
      • If there are millions of records, it is suggested to extract the data and report in a data warehouse (a Salesforce report might time out when loading high data volumes).
      • Data from multiple objects can be aggregated into a custom object using Triggers (improve performance).
      • Roll-up summary fields might impact query performance; using triggers to store summarized information on the parent object is a better solution in terms of query performance.
    • Enable separate loading of related lists
    • Prevent data skew
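The Bulk API advice above can be sketched concretely. Bulk API 2.0 exposes asynchronous query jobs over REST; the snippet below builds the job-creation request, filtering on an indexed field (CreatedDate) to keep the query selective. The instance URL and token are hypothetical placeholders.

```python
import json
import urllib.request

# Hypothetical org values; the access token comes from a separate OAuth flow.
INSTANCE_URL = "https://example.my.salesforce.com"
API_VERSION = "v58.0"
ACCESS_TOKEN = "REPLACE_WITH_ACCESS_TOKEN"

def build_bulk_query_job(soql):
    """Build the Bulk API 2.0 request that creates an asynchronous query job."""
    return urllib.request.Request(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/jobs/query",
        data=json.dumps({"operation": "query", "query": soql}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Filtering on an indexed field such as CreatedDate keeps the query selective:
job_req = build_bulk_query_job(
    "SELECT Id, Name FROM Account WHERE CreatedDate = LAST_N_DAYS:30"
)
# urllib.request.urlopen(job_req)  # uncomment to submit, then poll the job for result sets
```

Because the job runs asynchronously on the server, the client does not hold a connection open while millions of rows are processed, which is exactly why Bulk API is preferred for large data volumes.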

Data Exchange Between Salesforce Organizations

  • Data Virtualization:

    • Data from another Salesforce org or system can be accessed and managed by users without actually transferring the data.
  • Salesforce Connect:

    • Salesforce Connect can be used to surface data from one Salesforce org in another.
    • Users can view and manage data that is stored in a different Salesforce org or an external system.
    • Example of Data Exchange using Salesforce Connect (trailhead):
      1. Enable Lightning Connect:
        lightning-connect-set-customer-id
      2. Create External Data Sources in Setup > External Data Sources:
        create-new-external-data-source
      3. Validate that you can connect to external system, then sync its schema with Salesforce org:
        validate-and-sync-schema
      4. Once it is synced, you will see External Objects created in Salesforce.
        external-objects-order
        • NOTE: sync does not store any data in Salesforce, it only maps to external tables that contain the data.
        • NOTE: external objects have API names with the suffix __x rather than __c
    • Other use cases of Salesforce Connect:
      • external data source setup for another salesforce org that contains products, so Sales users can gain on-demand access to products related to opportunities
      • view and update leads from the primary Salesforce organization
      • contacts may be stored in a different organization, but they can be made visible on the account page using an indirect lookup.
    • NOTE: Salesforce Connect does not allow users to share data with another org manually (no manual sharing; that is the keyword!).
  • Data Management:

    • Integration via the REST and SOAP APIs and other pre-built integration solutions can be used to exchange and transfer data between Salesforce organizations.
  • Salesforce to Salesforce:

    • Salesforce to Salesforce is a native feature that can be enabled to let you share and update records with business partners that use Salesforce as well:
      salesforce-to-salesforce-settings
    • Main features:
      • Objects and fields can be selected to publish with connected org.
      • A connected org has a record that is connected to the record in the source org.
      • If record is updated on one org, it will reflect on another org.
      • Records can be forwarded to the connected org or assigned to a queue (can be accepted by connected org).
      • Related records or child records sent with it are automatically accepted as long as certain criteria are met.
    • Examples:
      • Share Lead records with partners who have access to a connected Salesforce org
      • Share Account/Opportunity records with Salesforce orgs in different regions
      • Share records with users who have access to different Salesforce org manually
    • Only certain standard objects, such as Account, Contact, Lead, Opportunity and Case, are available; all custom objects are available as well.
    • NOTE: connected org must first subscribe to an object to accept records of that object, it doesn't have direct access to the records that are shared.
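The REST-based transfer mentioned under Data Management above can be sketched end to end: query records in a source org and recreate them in a target org. Everything here is a minimal sketch with hypothetical credentials; the `to_insert_payload` helper is my own illustration of stripping the system fields (`attributes`, `Id`) that a query response includes but an insert must not.

```python
import json
import urllib.request

API_VERSION = "v58.0"

def to_insert_payload(record):
    """Strip system fields so a queried record can be inserted into another org."""
    return {k: v for k, v in record.items() if k not in ("attributes", "Id")}

def sf_call(org, method, path, payload=None):
    """Minimal REST helper; `org` is a dict with hypothetical 'url' and 'token' keys."""
    req = urllib.request.Request(
        f"{org['url']}/services/data/{API_VERSION}{path}",
        data=json.dumps(payload).encode("utf-8") if payload is not None else None,
        headers={"Authorization": f"Bearer {org['token']}",
                 "Content-Type": "application/json"},
        method=method,
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read() or b"{}")

def copy_leads(source_org, target_org):
    """Query Leads in the source org and recreate them in the target org."""
    result = sf_call(source_org, "GET",
                     "/query/?q=SELECT+Id,LastName,Company+FROM+Lead")
    for record in result["records"]:
        sf_call(target_org, "POST", "/sobjects/Lead", to_insert_payload(record))
```

Unlike Salesforce to Salesforce, this approach physically copies the records, so the two orgs do not stay in sync afterwards unless the integration runs again.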

Well, that's the end of Salesforce Sales Cloud Consultant certification exam preparation! I will see you in the next Salesforce certification exam preparation series, stay tuned!


Like the content? Support the author by paypal.me!