
Monday, July 6, 2015

Integration Design patterns in Salesforce

Is this your reaction when trying to navigate your way around design patterns, especially patterns governing integration? To be honest, technical jargon always gets on my nerves, but after the initial jerky reaction I always take it upon myself to demystify and simplify it.

So what are "Design Patterns"? 
As per Wikipedia, a design pattern is a general, reusable solution to a commonly occurring problem within a given context in software design. A design pattern is not a finished design that can be transformed directly into source or machine code; it is a description or template for how to solve a problem that can be used in many different situations.

In layman's language, it is a solution approach one should adopt for a certain category of recurring problem. For example, suppose one has to come up with a "Contingency plan for a natural disaster". What would be the main steps to formulate a strategy?
1. Evaluate the vulnerability of the area under inspection.
2. Find the steps to reduce damage and vulnerability.
3. Identify the precautions each person can take.
4. Set up an advance warning system, if possible.
5. Formulate a relief and rescue strategy.

So be it an earthquake, a flood or any other natural calamity, the basic steps will follow the above pattern.

What are the main types of design patterns?

  • Algorithm strategy patterns
  • Computational design patterns
  • Execution patterns 
  • Implementation strategy patterns
Another school of thought goes by the following classification:-
  • Creational Design Pattern
  • Structural Design Pattern
  • Behavioral Design Pattern 


Let's leave general design patterns and dive into the design patterns associated with SFDC integration.
Broadly, all integration design patterns are classified into:-
1. Data - These service the need for moving or synchronizing data from one system to another. Mostly the purpose of these integrations is to ensure that both systems have meaningful data. These integrations are supposed to be the simplest.
2. Process - Here two different systems, a source and a target, communicate to service a business-process-related requirement. These might require complex design and implementation and involve multiple systems, where the system invoking the others acts as the controller while the rest play like members of an orchestra.
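A process-style integration from Salesforce usually boils down to a callout. The sketch below is purely illustrative (the class name, endpoint URL and payload are hypothetical, and a real org would also need a Remote Site Setting or Named Credential for the endpoint):

```apex
// Hypothetical client for a process integration: Salesforce (acting as the
// controller) invokes an external order service over HTTP. Endpoint and
// payload shape are made up for illustration.
public with sharing class OrderServiceClient {
    public static HttpResponse createOrder(String orderJson) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://example.com/api/orders'); // hypothetical endpoint
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(orderJson);
        return new Http().send(req); // synchronous request/response
    }
}
```

A data-style integration, by contrast, would typically be a bulk load or scheduled sync rather than a per-transaction callout like this.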

There are certain things which need to be considered while settling on a type of design pattern. The main considerations are:-

1. Source/target systems

2. Type of integration (Data/Process)

3. Timing (Synchronous/Asynchronous)

We will deep-dive into those aspects in a follow-up blog.



Saturday, February 1, 2014

Batch Apex – What you should be careful about?


I have been writing or reviewing code in Salesforce for over six years now. Whenever one comes across a tricky situation which involves processing huge volumes of data, "Batch Apex" is the way to go. But every now and then you may be alarmed by someone telling you - "Hey, I hit the 10k limit while executing a Batch Job!"

You would wonder - how is that possible? Soon you would realize that it is the result of one of the common slips made while designing a "Batch Job". Most commonly, we fetch records in excess of 10,000 within the execute() method. Treat all code written within execute() as if you were writing normal Apex.
Batch Apex surely has more flexible governor limits, for example:-
• 200 SOQL queries per cycle (please check the latest limit as per the latest release).
• Up to 50,000,000 records retrieved by a SOQL query, as against 50k in normal Apex.
• A much larger heap size compared to normal Apex.
Why is a Batch Job capable of handling huge data sets?
One has to understand what makes Batch Apex successful in processing huge data sets:-
• The large data set is processed in batches - which implies that the "query" passed in the start() method has to be the query which returns the bulk of the data. Please refer to the following code snippet for clarity:-

global Database.QueryLocator start(Database.BatchableContext BC) {
    // "query" holds the SOQL string that fetches the large data set
    return Database.getQueryLocator(query);
}
Common Pitfalls
• You might say, "Well, what's new in that? We all know that." True, but just reiterating - the largest dataset has to be fetched with the SOQL in "query" and then processed in batches.
• Always remember: within the execute() method, never run an insert or update which will process more than 10k records in a cycle.
• The enhanced governor limits also enable Batch Apex to process huge datasets, but one needs to be cautious: though relaxed, governor limits still exist for Batch Apex too. The following Batch Apex governor limits may be important to consider while finalizing your design and organization-wide setup:-
o Up to five queued or active batch jobs are allowed for Apex.
o A user can have up to 50 query cursors open at a time. For example, if 50 cursors are open and a client application, still logged in as the same user, attempts to open a new one, the oldest of the 50 cursors is released. Note that this limit is different for the batch Apex start method, which can have up to five query cursors open at a time per user. The other batch Apex methods have the higher limit of 50 cursors. Cursor limits for different Force.com features are tracked separately. For example, you can have 50 Apex query cursors, 50 batch cursors, and 50 Visualforce cursors open at the same time.
o A maximum of 50 million records can be returned in the Database.QueryLocator object. If more than 50 million records are returned, the batch job is immediately terminated and marked as Failed. So if you have a requirement to process in excess of that value, factor the necessary checks and balances into your design.
o If the start method returns a QueryLocator, the optional scope parameter of Database.executeBatch can have a maximum value of 2,000. If you set it to a higher value, Salesforce breaks the records into smaller chunks anyway, which is a blessing in a sense.
o If no size is specified with the optional scope parameter of Database.executeBatch, Salesforce chunks the records returned by the start method into batches of 200, and then passes each batch to the execute method. Apex governor limits are reset for each execution of execute.
o Batch executions are limited to 10 callouts per method execution, i.e. the start, execute, and finish methods can each make up to 10 callouts.
o The maximum number of batch executions is 250,000 per 24 hours.
o Only one batch Apex job's start method can run at a time in an organization. Batch jobs that haven't started yet remain in the queue until they're started. Note that this limit doesn't cause any batch job to fail, and execute methods of batch Apex jobs still run in parallel if more than one job is running.
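The points above can be tied together in one minimal end-to-end sketch. Object and field names are illustrative only; the key idea is that the one big query lives in start(), while execute() only ever touches one scope-sized chunk, so the per-transaction DML limits are never approached:

```apex
// Illustrative Batch Apex class: trims whitespace from Account names.
global class AccountCleanupBatch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // The large data set goes in this one query; it may return millions of rows.
        return Database.getQueryLocator('SELECT Id, Name FROM Account WHERE Name != null');
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Governor limits reset for each chunk; treat this like normal Apex.
        List<Account> toUpdate = new List<Account>();
        for (sObject s : scope) {
            Account a = (Account) s;
            a.Name = a.Name.trim();
            toUpdate.add(a);
        }
        update toUpdate; // at most one scope-sized chunk per transaction
    }

    global void finish(Database.BatchableContext bc) {
        // Post-processing, e.g. a notification email or chaining another job.
    }
}
```

Kicked off with, for example, `Database.executeBatch(new AccountCleanupBatch(), 2000);` - the second argument being the optional scope size discussed above.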

Friday, August 9, 2013

Data Migration (Legacy system to Salesforce) a dummy’s guide


Data is distinct pieces of information, usually formatted in a special way. Data is the most critical asset of any organization, and that criticality makes any activity involving it crucial and delicate. The same logic makes "Data Migration" projects sensitive and mandates handling them with utmost care.


As the title suggests, we are going to focus on data migration involving Salesforce as a platform, and in particular we are going to reflect on:-

• What are the "n" questions one should ask before planning a data migration project?

• How should legacy data be moved over to Salesforce?

• What are the "n" number of things to be careful about?

• What are the "n" things to be checked in Salesforce before embarking on a data migration project?

Questions one should ask oneself while kicking off a Data Migration Project


1. What is the nature of the system from which the data is being brought in?
a. A legacy system like an Oracle database, etc.?
b. Some CRM like SugarCRM, NetSuite, etc.?
The answer to the above question gives you the following vital inputs:-
• There is legacy data coming from another system.
• This means that we might need to bring in information like:-
1. Created Date and Created By information.
2. In SFDC, this information is referred to as "Audit fields".
3. To be able to set audit fields, create a case with Salesforce.

Audit Fields
System fields which are read-only and store information like "created date", "created by id" (a reference to a user), "last modified date" and "last modified by id" are referred to as audit fields.
Ordinarily Salesforce treats these fields as read-only; they are set by the system by default and cannot be modified by the user.
However, under special conditions these audit fields can be set, with special permission obtained from Salesforce by creating a case to that effect. The permission to set audit fields is not unlimited and is revoked after some time; if you need to set audit fields for longer, do create another request with Salesforce.

Also note that objects like:-
• AccountFeed
• AccountShare
• AccountTag
don't have audit fields.
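To illustrate what the permission enables: once Salesforce grants the audit-field request for your org, fields like CreatedDate become settable at insert time through the API. The snippet below is a sketch only, with made-up values; in practice such loads are usually done through an API tool like Data Loader rather than Apex:

```apex
// Sketch only: assumes the audit-field permission has been granted by
// Salesforce for this org. Without it, CreatedDate is read-only and a
// value supplied here would be rejected. Values are illustrative.
Account legacyAcc = new Account(
    Name        = 'Legacy Co',
    CreatedDate = Datetime.newInstance(2010, 1, 15, 0, 0, 0)
);
insert legacyAcc;
```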


2. What is the nature of the data?
a. Is there currency-related data? If yes, is that data in multiple currencies? If so, we need to get "Multi-currency" enabled in our Salesforce organization/instance.
3. How many "users" need to be created? Say "X" users have to be created - are there enough seats in the "Salesforce Instance"?
a. The answer will decide how you frame the strategy for carrying out the data load vis-à-vis assigning information like:
• "Created By Id"
• "Last Modified By Id"
4. Are there users from different time zones? If yes, then one has to handle time-related data accordingly.
5. In what format will the data be provided to you?
6. What is the minimum information you need for each object to successfully load data? In technical terms, what are the "Required Fields"?
7. What "business rules" have to be taken care of? For example, if certain business rules are implemented in the system by way of validation rules or triggers, how will such situations be handled? Say you have a validation that "Date of Sale" cannot be earlier than today - how will you handle loading legacy data, which will be in the past? The answer will perhaps be deactivating such validation rules, or commenting out a portion of the trigger, during the data load.
8. When you get the data extracts (say CSV files) from the legacy system, check whether the columns holding dates are in the correct format and carry the correct values.
a. Many times you will see a date represented as a numerical value; if so, appropriate date conversions need to be done in Excel using formulas.
b. Check whether the date values in the legacy system and in the CSV files are the same.
9. W.r.t. text fields, check the data length, because if you are using Excel or CSV to massage and manipulate data, text fields can get truncated. In such instances it is advisable to use:-
a. A database for data massaging and manipulation.
b. Informatica for extracting data and loading it into the database and into Salesforce.
10. Also, w.r.t. Salesforce, always remember that in the User object there are certain fields like "Community Nickname" and "Username" which are unique across the Salesforce platform, so if the legacy data is coming from Salesforce, data massaging will involve making these values unique.
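One common way to handle point 7 without commenting out code is a bypass flag the data-load user can flip. The sketch below assumes a hypothetical hierarchy custom setting `Migration_Settings__c` with a checkbox field `Bypass_Triggers__c`; both names are made up for illustration:

```apex
// Hypothetical trigger guard: business rules are skipped when the running
// user's Migration_Settings__c record has Bypass_Triggers__c checked.
trigger OpportunityBusinessRules on Opportunity (before insert, before update) {
    if (Migration_Settings__c.getInstance().Bypass_Triggers__c) {
        return; // legacy data load in progress - skip business rules
    }
    for (Opportunity opp : Trigger.new) {
        if (opp.CloseDate < Date.today()) {
            opp.CloseDate.addError('Date of Sale cannot be in the past');
        }
    }
}
```

Checking the flag for the migration user only (via the hierarchy setting) avoids switching validations off for the whole organization during the load.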
There are scores of questions which come to my mind, but let's park them for the next blog…

Keep watching this space for more on “Data Migration”

Saturday, January 19, 2013

Custom Settings revisited !!!


I nostalgically remember the first major technical document I penned. It was for a product/project I had worked on - and the reviewer's comment was, "Ispita, why do you use such flowery language?" I guess the person was being generous; what was implied was - Ispita, keep it short and simple.
Such comments are always priceless - they act as sharpening stones.
Apart from the two S's mentioned above, I now add a third dimension - spicy. So anything I write essentially has the three S's: Simple, Short, Spicy (na na, don't run your horses of imagination - spice essentially means peppering your textual creation with anecdotes and a personal touch; don't you look for yellow journalism - you will get none).

That reminds me, this is not about "How to write in the Queen's language?" This blog will talk briefly about "Custom Settings", and in particular about those which fall in the category of List custom settings.

So what are Custom Settings?

Custom settings are just like custom objects; they act as placeholders to hold information which may be tied to:-
1) an organization
2) a profile
3) a specific user

This leads us to the obvious question: if they are similar to custom objects, why create another data structure with similar traits? What purpose does it serve?
In pure technical terms you can say - "All custom settings data is exposed in the application cache, which enables efficient access without the cost of repeated queries to the database." In layman's language: fewer select statements and faster code execution.
They also act as custom, application-wide global variables (these are the List custom settings).
Now coming to the types of custom settings. There are two types:
List Custom Settings
This is a set of data which is available to the whole organization and remains the same irrespective of the profile or user accessing it. It is the socialistic type of custom setting - the same for everyone across the organization. Also, since it is cached and does not require issuing costly SOQL queries, it doesn't add to your overall tally of SOQL queries issued and thus helps you avoid breaching the governor limits.
Hierarchy Custom Settings
But what if you need a set of data which should morph or return different information based on which category of people, or which user, is accessing it? Now the custom setting is not socialistic; it returns information based upon the rights of the user accessing it - keeping the hierarchy in view. These are called "Hierarchy Custom Settings".
(The original post included a screenshot of a List custom setting named AppSetting with a record "URL" holding two fields, compartment1__c and compartment2__c.)

Now, how do we access this via code in Apex?
General syntax (note that custom setting API names carry the __c suffix in Apex):-

Map<String, CustomSettingName__c> variableName = CustomSettingName__c.getAll();

So for the AppSetting custom setting described above, the syntax will be:-

Map<String, AppSetting__c> CST = AppSetting__c.getAll();
String strURL1 = CST.get('URL').compartment1__c;
String strURL2 = CST.get('URL').compartment2__c;

So strURL1 contains http://login.salesforce.com and strURL2 contains http://test.salesforce.com.
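For completeness, a Hierarchy custom setting is read differently: getInstance() resolves the value for the current user, falling back to profile-level and then organization-level records. A small sketch, assuming a hypothetical hierarchy custom setting `AppConfig__c` with a checkbox field `Feature_Enabled__c`:

```apex
// getInstance() returns the record for the running user, falling back to
// profile and then organization defaults; getOrgDefaults() returns the
// org-wide record directly.
AppConfig__c userCfg = AppConfig__c.getInstance();
AppConfig__c orgCfg  = AppConfig__c.getOrgDefaults();
Boolean featureOn = (userCfg != null && userCfg.Feature_Enabled__c == true);
```

Like List settings, these reads come from the application cache, so they don't count against your SOQL query limits.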

I know you will say custom settings are so easy - but believe me, whenever I start using them I always tend to forget some of the steps and go about re-inventing the wheel - so this time I thought of writing the steps down for future reference.