Performing Outer Join in Advanced Find

It is currently not possible to perform a left outer join using Advanced Find, even though that capability exists in CRM2015. For example, if you want to get a list of contacts that don’t have an invoice, you cannot do it using an ad-hoc Advanced Find.

This limitation is usually overcome by creating a system view that does a regular inner join, and then updating the view’s fetchxml using a tool like FXB, or even by editing the customisation file directly. There is also the rollup field approach. But if you are going to run ad-hoc Advanced Find queries, you really don’t want to create one system view for every entity you want to outer join.

I present to you a solution for this problem: AdvancedQueryOuterJoin

This is a plugin that runs on the pre-stage of RetrieveMultiple and alters the fetchxml with the correct left outer join condition. This means you can do an Advanced Find query like the one below.


The above query is for accounts that have child contacts but don’t have any opportunities. When the plugin sees a link-entity in the query with a null primary key condition, it replaces that condition with a left outer join on the parent entity. The code itself is very simple, and you can have a look at the GitHub repo to understand how it works.
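To give a rough idea of the rewrite (this is only a sketch; the exact fetchxml the plugin emits may differ), a link-entity with a null primary key condition like this:

```xml
<!-- what Advanced Find sends: an inner-joined link-entity with a
     null condition on its primary key -->
<link-entity name='opportunity' from='parentaccountid' to='accountid'>
  <filter>
    <condition attribute='opportunityid' operator='null' />
  </filter>
</link-entity>
```

becomes an outer join, with the null check hoisted into the parent entity's filter:

```xml
<!-- after the plugin's rewrite -->
<link-entity name='opportunity' from='parentaccountid' to='accountid'
             link-type='outer' alias='opp' />
<filter>
  <condition entityname='opp' attribute='opportunityid' operator='null' />
</filter>
```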

Download the solution file that works best for your CRM instance, i.e. 7.0 or 7.1. Please also use the GitHub issues area to report details about any problems that you encounter.

EDIT (01/06/15): Using this plugin will not change the fetchxml you download by clicking Download Fetch XML in the Advanced Find window. If you are going to run your downloaded query in FXB, make sure you change the Output type to Grid.


If you use FetchResult, you won’t see any results, as the plugin would not have executed in the background.

EDIT (02/10/15): Refer to the newer post for doing this using a bookmarklet.

New XrmToolBox Plugin: Export to Excel

Export to Excel is a great feature in Dynamics CRM, but it has always come with two limitations:

  1. Number of records that can be exported
  2. Format of the Excel file

Even though CRM says that it is exporting to Excel, what it exports is not actually xlsx format. This limitation has been addressed in CRM2015 Update 1. The number of records you can export has also been increased to 100,000. But if you still want to export the whole dataset, the easiest way I have found is to export it from the filtered view in the MSCRM database. This is obviously not possible if you are using CRM Online. Hence, I developed this tool to do just that.
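For the on-premise case, the filtered-view export is a plain SQL query against the organisation database (a sketch; the column names are illustrative and vary per entity):

```sql
-- Filtered views (Filtered<EntityName>) respect CRM security and return
-- formatted values; results can be exported straight from SSMS.
SELECT fullname, emailaddress1, createdon
FROM FilteredContact
WHERE statecode = 0;
```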


  1. Download the latest version of XrmToolBox.
  2. Install the tool from the XrmToolBox store.


Export to Excel Tool

Using the Tool

  1. Open Export to Excel tool in XrmToolBox
  2. Click on Load Entities. Select an entity you would like to export
  3. The “Batch size” field dictates the default page size. This is initially set to 5,000 records per page
  4. The “Max rows per file” field dictates the maximum number of rows per Excel file. This is used to split the Excel file. For example, if your view/fetchxml produces 500,000 records and you set the “Max rows per file” value to 100,000, your result will be exported to 5 Excel files.
  5. Click on any view that you would like to export. The view’s fetchxml is also displayed
  6. Click Export to Excel. Enter the filename that you would like to save the export as, or leave it at the default value
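The interaction between the two settings can be sketched like this (a hypothetical helper, not part of the tool's code):

```javascript
// "Batch size" controls how many records each RetrieveMultiple page fetches;
// "Max rows per file" controls when the output rolls over to a new Excel file.
function exportPlan(totalRecords, batchSize, maxRowsPerFile) {
  return {
    pages: Math.ceil(totalRecords / batchSize),     // CRM requests issued
    files: Math.ceil(totalRecords / maxRowsPerFile) // Excel files produced
  };
}

var plan = exportPlan(500000, 5000, 100000);
console.log(plan.pages); // 100
console.log(plan.files); // 5
```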

The tool displays the progress of the file export while retrieving records from CRM. The total number of records retrieved is also displayed upon successfully writing the data to the output file.

Credits

This tool uses code from Tanguy’s View Layout Replicator for most of the UI and event handling logic.

Source code for this tool can be found at

Leverage Actions to bundle and minify form scripts

EDIT (26/05/15): I forgot to disable the plugin on RetrieveMultiple of webresource, described in the previous post. I also forgot to minify the script that runs on the Form Experiment entity. Strangely, however, this has reduced the performance. I tested this in CRMOnline, and so I am not sure if this is due to the load on the server at this time of the day. I have modified the performance screenshot to reflect this.

I tried another approach to dynamically minify and load CRM form scripts. This time I am utilising Actions. I have tried different ideas with Actions in the past and love how extensible and powerful they are. Of all the approaches I have tried so far, I like this one a lot. As usual, if you would rather read code, head to the GitHub repo and fork the code.

This is the basic flow of this design:

  1. The form script just has code to invoke the action, using the current entity name as the input parameter
  2. The action uses the Form Load Sequence entity to collate the list of scripts that have to be minified and concatenated. The minified and concatenated script is stored in the action’s output parameter
  3. The calling form script evals the returned minified and concatenated script

Before you get defensive about using evals, please read up on the subject; IMHO, it is perfectly alright to use eval in this scenario. Here are the steps:

Create the Action

The action doesn’t contain much config, other than the input parameter and the output parameter. The maximum length of the string is 255 if you set it from the UI; however, if you use a plugin to set it, there is no limit. We are going to use this behaviour to set minifiedscripts to the bundled/concatenated scripts.

Create the Plugin that is triggered by the Action

The action’s input argument is passed on to the plugin’s InputParameters, and this is how the plugin knows which entity it is dealing with. It then proceeds to concatenate and minify the scripts.

Plugin Code

using DouglasCrockford.JsMin;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using System;
using System.Linq;
using System.Text;
using Contract = System.Diagnostics.Contracts.Contract;

namespace RYR.Experiments.Plugins
{
    public class PostGetScriptsForEntity : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            Contract.Assert(serviceProvider != null, "serviceProvider is null");
            var pluginExecutionContext =
                (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            var tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
            Contract.Assert(tracingService != null, "TracingService is null");
            try
            {
                var factory =
                    (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
                var organizationService = factory.CreateOrganizationService(pluginExecutionContext.UserId);
                var entityName = pluginExecutionContext.InputParameters["entityname"];

                // grab the form load sequence record for this entity
                var scriptLoadOrderFetchXml = string.Format(@"
                    <fetch count='1' >
                      <entity name='ryr_formloadsequence' >
                        <attribute name='ryr_name' />
                        <attribute name='ryr_scripts' />
                        <filter>
                          <condition attribute='ryr_name' operator='eq' value='{0}' />
                        </filter>
                      </entity>
                    </fetch>", entityName);
                var concatenatedScripts = new StringBuilder();
                var loadSequenceResults =
                    organizationService.RetrieveMultiple(new FetchExpression(scriptLoadOrderFetchXml)).Entities;

                if (!loadSequenceResults.Any()) return;

                // comma-separated list of webresource names, in load order
                var scriptsToMerge = loadSequenceResults[0].GetAttributeValue<string>("ryr_scripts").Split(',');
                var webresourceFetchXml = string.Format(@"
                        <fetch>
                          <entity name='webresource' >
                            <attribute name='content' />
                            <attribute name='name' />
                            <filter>
                              <condition attribute='webresourcetype' operator='eq' value='3' />
                              <condition attribute='name' operator='in' >
                                {0}
                              </condition>
                            </filter>
                          </entity>
                        </fetch>", string.Join(string.Empty,
                         scriptsToMerge.Select(x => string.Format("<value>{0}</value>", x))));
                var toBeMergedWebResources = organizationService.RetrieveMultiple(
                    new FetchExpression(webresourceFetchXml)).Entities;

                if (!toBeMergedWebResources.Any()) return;

                foreach (var s in scriptsToMerge)
                {
                    var matchedWebresource = toBeMergedWebResources
                        .FirstOrDefault(x => x.GetAttributeValue<string>("name") == s);
                    if (matchedWebresource != null)
                    {
                        // webresource content comes through base64 encoded
                        concatenatedScripts.AppendLine(Encoding.UTF8.GetString(
                            Convert.FromBase64String(matchedWebresource.GetAttributeValue<string>("content"))));
                    }
                    else
                    {
                        // a script named in the load sequence doesn't exist: surface a form notification
                        concatenatedScripts.AppendLine(string.Format(
                            "Xrm.Page.ui.setFormNotification('Unable to load {0}', 'ERROR', '{1}');", s,
                            Guid.NewGuid()));
                    }
                }
                pluginExecutionContext.OutputParameters["minifiedscripts"] =
                    new JsMinifier().Minify(concatenatedScripts.ToString());
            }
            catch (Exception e)
            {
                tracingService.Trace("PostGetScriptsForEntity failed: {0}", e);
                throw;
            }
        }
    }
}

Use Sdk.Soap.js to generate Javascript code for Action

Download Sdk.Soap.js and run the Sdk.SoapActionMessageGenerator.exe executable. Once you choose the organisation details and enter the credentials, the Javascript code for each individual action will be generated.

Modify the generated code to auto-populate entity name

The code generated by the Sdk.SoapActionMessageGenerator is going to be used in the form. We want to simply add this script to any entity without having to change a single line of code. However, the action accepts an input parameter called entityname, which will obviously differ from entity to entity. Hence we have to make a slight modification to ensure that entityname is set automatically using the Client API.


Generated code:

(function() {
    this.ryr_GetScriptsForEntityRequest = function (entityname) {
        /// [Add Description]
        if (!(this instanceof Sdk.ryr_GetScriptsForEntityRequest)) {
            return new Sdk.ryr_GetScriptsForEntityRequest(entityname);
        }
        // ... rest of the generated request class
    };
}).call(Sdk);

Modified code (entityname populated via the Client API instead of being passed in):

(function() {
    this.ryr_GetScriptsForEntityRequest = function () {
        var entityname = Xrm.Page.data.entity.getEntityName();
        /// [Add Description]
        if (!(this instanceof Sdk.ryr_GetScriptsForEntityRequest)) {
            return new Sdk.ryr_GetScriptsForEntityRequest(entityname);
        }
        // ... rest of the generated request class
    };
}).call(Sdk);
I also had problems with strict mode, so I removed it from both Sdk.Soap.js and Sdk.ryr_GetScriptsForEntity.vsdoc.js. Now we merge both these scripts into one and add that as a Javascript webresource. This webresource is then added to the form.

Here is the script coming through in the Action response, before getting evaled.


 291 ms (minified and added to the form)
 440 ms (dynamically minified using Action)

Hope this gives you an idea of how you can use Actions to solve the script load sequence issue in CRM2015.

Form Script Bundling and Minification

Ever since form script loading was made async, I have tried different ways to ensure that form scripts are loaded in the sequence I want, and not how CRM wants it to be, a.k.a. random. CRM2015 Update 1 has introduced Turbo Forms, which are supposed to drastically improve form render time. The problem of managing script dependencies, however, is still left to the developer. You basically have two options:

  1. In build stage, bundle and minify the scripts in the order of dependencies and use this in the form.
  2. Write self contained scripts, without any dependencies.

I tried out an approach using a plugin and a couple of entities to try to solve this. If you would rather read the code, you can download the source from the GitHub repo. I have also included the managed and unmanaged solutions in the repo.


  1. A Form Load Sequence entity, that stores the entity name and the scripts that are to be loaded
  2. A blank Javascript webresource, whose name follows this pattern: [entityname].crmform.min.js
  3. A plugin that runs on post-RetrieveMultiple of the webresource entity

How it is wired up

[entityname].crmform.min.js is added to the form that requires bundling and minification. This script just contains a comment, and nothing else.

Create a Form Load Sequence record that specifies the entity name and the scripts that should be loaded and minified for this entity.

Register the plugin on post-RetrieveMultiple of the webresource entity.

How it works

If you have read the plugin code already, you already know how it works. But if you haven’t, here is how:
The plugin retrieves the Query key from the PluginExecutionContext’s InputParameters, which contains the QueryExpression object. It then checks whether there is a condition on this QueryExpression with a name like crmform.min.js. If so, it retrieves the correct Form Load Sequence record for the current entity.

Using the script sequence specified, it then retrieves the script webresources, and concatenates and minifies them. The concatenated script is then used to update the OutputParameters’ BusinessEntityCollection. The plugin also checks whether the Javascript webresources specified in the Form Load Sequence entity actually exist, and displays an error if they don’t.
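The detection step can be sketched like this (transliterated to JavaScript for brevity; the actual plugin does this in C# against the QueryExpression's conditions):

```javascript
// A RetrieveMultiple for a webresource whose name matches
// [entityname].crmform.min.js is the trigger for dynamic bundling.
function isCrmFormScriptRequest(conditionValues) {
  return conditionValues.some(function (v) {
    return /\.crmform\.min\.js$/.test(v);
  });
}

console.log(isCrmFormScriptRequest(['ryr_formexperiment.crmform.min.js'])); // true
console.log(isCrmFormScriptRequest(['new_account.js']));                    // false
```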

ExecutionContext for RetrieveMultiple on webresource – Post Stage

Here is a trimmed version of the serialised context (namespace noise removed):

  <PluginExecutionContext>
    <InputParameters>
      <!-- Query: the QueryExpression CRM built for the webresource retrieve -->
      <value type="QueryExpression">
        ...
        <Values>
          <anyType type="string">ryr_formexperiment.crmform.min.js</anyType>
        </Values>
        ...
      </value>
    </InputParameters>
    <OutputParameters>
      <!-- BusinessEntityCollection: the single blank webresource record,
           whose content the plugin replaces with the minified bundle -->
      <value type="EntityCollection">
        <value type="guid">02fe0b47-8800-e511-80ef-c4346bada558</value>
        <value type="string">ryr_formexperiment.crmform.min.js</value>
        <!-- content decodes to "//ryr_formexperiment.crmform.min.js" - just a comment -->
        <value type="string">Ly9yeXJfZm9ybWV4cGVyaW1lbnQuY3JtZm9ybS5taW4uanM=</value>
        <value>Script (JScript)</value>
      </value>
    </OutputParameters>
    <OwningExtension>
      <Name>RYR.Experiments.PreRetrieveMultipleWebResource: RetrieveMultiple of webresource (Profiler)</Name>
    </OwningExtension>
  </PluginExecutionContext>

There is a bit of overhead, as the plugin has to retrieve and minify the webresources.

Scripts minified at build stage and added to the form

It takes 291ms.

Script dynamically bundled and minified by plugin

It takes 481ms.

Take these numbers with a pinch of salt, as I found the performance can vary quite a bit depending on the time of the day, since I tested this in CRMOnline. I also tested this with the cache disabled.

Failures along the way
I tried these approaches, which didn’t quite work out, but I want to document them for future reference.

  1. Adding a non-existent form script into the Form Experiment‘s FormXml doesn’t work, even though there is a plugin to take care of the RetrieveMultiple request. CRM doesn’t allow this to happen: the Javascript webresource has to exist if you want to add it to a form. Here is what you’ll have to add to the root node of the formxml, to hook up the webresource and the associated onsave and onload event handlers for the form:
 <formLibraries>
   <Library name='[JSWEBRESNAME]' libraryUniqueId='[NEWGUID]' />
 </formLibraries>
 <events>
   <event name='onload' application='false' active='false'>
     <Handlers>
       <Handler functionName='[FUNCNAME]' libraryName='[JSWEBRESNAME]' handlerUniqueId='[NEWGUID]' enabled='true' parameters='' passExecutionContext='false' />
     </Handlers>
   </event>
   <event name='onsave' application='false' active='false'>
     <Handlers>
       <Handler functionName='[FUNCNAME]' libraryName='[JSWEBRESNAME]' handlerUniqueId='[NEWGUID]' enabled='true' parameters='' passExecutionContext='false' />
     </Handlers>
   </event>
 </events>


My previous posts on the same subject



  1. Scott Durow – Ghost Webresources
  2. Ben Hosk – Turbo Forms
  3. MSDN – Write Code for CRM Forms

Redux: RetrieveMultiple on large datasets

In trying to optimise RetrieveMultiple performance, I thought I had hit a sweet spot using TPL, with RetrieveMultiple requests spanning across pages. But after talking to @maustinjones, I have now come to the conclusion that it is probably not the best approach. The datasets I tested on, contact and account, are large, but not significantly large. After looking at the SQL trace logs, I have decided to stick with the pagingcookie approach while dealing with large datasets.

In order to understand why the pagingcookie approach is more efficient, it is important to understand how your FetchXml is translated into SQL.

Retrieve all records from account. The total number of records is 10,177.

Approach 1 – TPL with RetrieveMultiple parallelised across pages

As you can see, the SQL query that is generated is not exactly optimal. There are only 10,177 records in total, but 45,533 records (5001 + 10001 + 10177 + 10177 + 10177) are retrieved across 5 requests. The correct number of records (10,177) is returned only after further in-memory processing.

Approach 2 – PagingCookie

10,179 records (5001 + 5001 + 177) are retrieved across three requests, and further processing cuts the result set down to the correct count of 10,177.
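Tallying the rows fetched per request in the two traces above makes the difference plain:

```javascript
var sum = function (xs) { return xs.reduce(function (a, b) { return a + b; }, 0); };

// Approach 1 - TPL: five requests, massive over-fetch
var tplRows = [5001, 10001, 10177, 10177, 10177];
// Approach 2 - paging cookie: three requests, minimal over-fetch
var pagingRows = [5001, 5001, 177];

console.log(sum(tplRows));    // 45533 rows fetched to return 10177
console.log(sum(pagingRows)); // 10179 rows fetched to return 10177
```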

Random Notes:
Looking at how the paging cookie is translated to SQL reminds me of this post from the PFE Dynamics guys: Dangers of Guid.NewGuid. I believe that the entity primary key is generated using NEWSEQUENTIALID, and hence it is not advisable to insert records directly into the MSCRM database. This is also probably how the “greater than uniqueidentifier” query is optimised: as the Guids are in sequence, the clustered index on the primary key can be used efficiently in this query.

So, compared to the TPL approach, the paging cookie approach:

  1. Reduces the number of queries sent to the database
  2. Respects the count size specified in the fetchxml
  3. Does less in-memory processing
  4. Reduces the chances of encountering a database lock

In spite of all these observations, and understanding how efficient the pagingcookie queries are, I still cannot comprehend how TPL is faster, even though it queries the database inefficiently.

Credit: A big thank you to @maustinjones for helping me understand the performance pitfalls of using TPL in this scenario.

RetrieveMultiple performance on large datasets

EDIT (21/05/15): After DMing with @maustinjones, I have added stats for paged RetrieveMultiple as well. I have updated the post to include this.

EDIT (22/05/15): Updated the post to reflect the correct behaviour of no-lock when parallelising. Thanks @maustinjones. Please also refer to the follow-up post on why the paging cookie is the recommended approach.

EDIT (18/07/16): Please watch the video from the Dynamics CRM product group, which goes into much more detail about performance techniques to optimise RetrieveMultiple queries.

EDIT (25/01/17): Eric Hagen has written a series of posts covering RetrieveMultiple performance and internals. Reading these posts is time well spent.

MSCRM limits the maximum result size to 5000 records when you use the RetrieveMultiple request. In order to overcome this and retrieve all the records, you’ll have to use paging. The PFE Dynamics team has also released an open-source library, the PFE Xrm Core Library, that utilises the Task Parallel Library.

There is also ExecuteMultipleRequest, which you can use to send a bunch of requests in one go and process the responses. This post just documents my findings about the performance of these options:

  1. Just using Parallel.ForEach
  2. ExecuteMultipleRequest
  3. PFE Xrm Core
  4. Paged RetrieveMultiple

Run 1 (Batch size 1000):
I have not included the paged RetrieveMultiple in this scenario, as it is too slow, and I am too impatient to wait for it to complete.

Run 2 (Batch size 5000):

Run 3 (Batch size 5000):


  1. Reducing the page size causes a drop in performance. I got better performance with 5000 records in one page than with 1000 records
  2. Just using Parallel.ForEach is faster than PFE Xrm Core (some overhead in instantiating the service perhaps?)
  3. ExecuteMultipleRequest is significantly slower than Parallel.ForEach

Best Practice tip from PFE Dynamics:

If you read the source code for the ParallelServiceProxy class, the help text for the RetrieveMultiple method actually has this note:

IMPORTANT!! This approach should only be used if multiple queries for varying entity types are required or the result set can’t be expressed in a single query. In the latter case, leverage NoLock=true where possible to reduce database contention.

So ideally, you should be using the RetrieveMultiple method in ParallelProxy to retrieve records from multiple entities. But in my case I am retrieving from the same entity, hence I am using the no-lock option in the fetchxml.

I DMed with @maustinjones, and here is an important point to consider before choosing a technique.

Query pages shouldn’t be retrieved in parallel. only separate queries altogether

When you use TPL to submit a bunch of RetrieveMultiple requests in one go, there is a chance of encountering database lock issues, even though you use no-lock. So, it is OK to parallelise queries that fetch from different entities, but using PagingCookie is the recommended approach.

To be honest, I did not experience any lock issues during the test, but standard CRM load was not simulated during the test. I am not sure what would have happened if users were viewing contacts or running reports on contacts while I was running these tests at the same time.
Database contention and locking are applicable only in scenarios where no-lock is not feasible.

TPL just blitzes the other techniques in these tests, but I’ll have to test this further to see whether lock contention issues arise in much larger datasets. Do consider PFE Xrm Core if you are looking to use RetrieveMultiple across multiple entities. It is written by the PFE Dynamics guys and is open-source.

Best Practice tips from msdn regarding TPL:

This article on msdn is a must read if you are thinking of using TPL.

I had performance issues with TPL because I had a Console.WriteLine inside the lambda for ForEach, and it basically killed the performance. The article told me why, and so it is quite a useful read.

Note about the code:
I wrote this quickly just to test the performance, not with production quality in mind. If you read the code, you can see I am sending 20 requests at a time inside an infinite loop, to retrieve all the pages. I exit the loop when any one of the pages returns no result. I haven’t tried ExecuteMultipleRequest inside the Parallel.ForEach; it would be interesting to see what the performance would be in that case.

Code: Here is the code I tested with, for reference.

using Microsoft.Pfe.Xrm;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Threading.Tasks;
using System.Xml.Linq;

namespace Playground
{
    class Program
    {
        static void Main(string[] args)
        {
            RetrieveAllPages(@"<fetch count='5000' no-lock='true' page='{0}' >
                                  <entity name='account' >
                                    <attribute name='name' />
                                    <order attribute='accountid' />
                                    <filter>
                                      <condition attribute='statecode' operator='eq' value='0' />
                                    </filter>
                                  </entity>
                                </fetch>");

            RetrieveAllPages(@"<fetch count='5000' no-lock='true' page='{0}' >
                                  <entity name='contact' >
                                    <attribute name='fullname' />
                                    <order attribute='contactid' />
                                    <filter>
                                      <condition attribute='statecode' operator='eq' value='0' />
                                    </filter>
                                  </entity>
                                </fetch>");
        }

        static void RetrieveAllPages(string fetchXml)
        {
            Console.WriteLine("Entity: {0}\n", XElement.Parse(fetchXml).Element("entity").Attribute("name").Value);
            var organisationSvcManager = new OrganizationServiceManager(new Uri("CRMURL"),
                "[username]", "[password]", "[domain]");
            var crmSvc = new CrmServiceClient(new NetworkCredential("[username]", "[password]", "[domain]"),
                Microsoft.Xrm.Tooling.Connector.AuthenticationType.AD, "CRMURL", "80", "[Org]");
            var isDone = false;
            int pageStart = 1, pageEnd = 20;
            if (crmSvc.IsReady)
            {
                var stopWatch = new Stopwatch();
                var results = new List<Entity>();

                // 1. Parallel.ForEach: 20 pages per batch until an empty page comes back
                stopWatch.Start();
                while (!isDone)
                {
                    IDictionary<string, QueryBase> queries = new Dictionary<string, QueryBase>();
                    //Console.WriteLine("Page {0} to {1}", pageStart, pageEnd);
                    for (int i = pageStart; i <= pageStart + 19; i++)
                    {
                        queries.Add(i.ToString(), new FetchExpression(string.Format(fetchXml, i)));
                    }
                    pageStart = pageEnd + 1;
                    pageEnd = pageStart + 19;
                    Parallel.ForEach(queries,
                        (query) =>
                        {
                            if (!isDone)
                            {
                                var pageResults = crmSvc.OrganizationServiceProxy.RetrieveMultiple(query.Value).Entities;
                                if (!pageResults.Any())
                                {
                                    isDone = true;
                                    return;
                                }
                                lock (results) results.AddRange(pageResults);
                            }
                        });
                }
                stopWatch.Stop();
                Console.WriteLine("Parallel: Executed in {0} sec", stopWatch.ElapsedMilliseconds / 1000);
                Console.WriteLine(results.Count / (stopWatch.ElapsedMilliseconds / 1000) + " per sec\n");

                // 2. ExecuteMultipleRequest: 20 RetrieveMultiple requests per batch
                var resultCount = 0;
                pageStart = 1;
                pageEnd = 20;
                isDone = false;
                stopWatch.Restart();
                while (!isDone)
                {
                    //Console.WriteLine("Page {0} to {1}", pageStart, pageEnd);
                    var executeMultipleRequest = new ExecuteMultipleRequest
                    {
                        Requests = new OrganizationRequestCollection(),
                        Settings = new ExecuteMultipleSettings() { ContinueOnError = true, ReturnResponses = true }
                    };
                    for (var i = pageStart; i <= pageStart + 19; i++)
                    {
                        executeMultipleRequest.Requests.Add(new RetrieveMultipleRequest()
                        {
                            Query = new FetchExpression(string.Format(fetchXml, i))
                        });
                    }
                    var executeMultipleResponses = ((ExecuteMultipleResponse)crmSvc.OrganizationServiceProxy.Execute(executeMultipleRequest)).Responses.ToList();
                    executeMultipleResponses.ForEach(x =>
                    {
                        var executeMultipleResultEntities = ((RetrieveMultipleResponse)x.Response).EntityCollection.Entities;
                        if (!executeMultipleResultEntities.Any())
                        {
                            isDone = true;
                        }
                        resultCount = resultCount + executeMultipleResultEntities.Count;
                    });
                    pageStart = pageEnd + 1;
                    pageEnd = pageStart + 19;
                }
                stopWatch.Stop();
                Console.WriteLine("ExecuteMultiple: Executed in {0} sec", stopWatch.ElapsedMilliseconds / 1000);
                Console.WriteLine(resultCount / (stopWatch.ElapsedMilliseconds / 1000) + " per sec\n");

                // 3. PFE Xrm Core ParallelProxy: a single query, paged by the library
                IDictionary<string, QueryBase> entityQuery = new Dictionary<string, QueryBase>();
                entityQuery.Add("result", new FetchExpression(string.Format(fetchXml, 1)));
                stopWatch.Restart();
                var queryResult = organisationSvcManager.ParallelProxy.RetrieveMultiple(entityQuery, true,
                    (pair, exception) => Console.WriteLine("{0} threw {1}", pair.Key, exception.Message));
                stopWatch.Stop();
                Console.WriteLine("PFE.Xrm: Executed in {0} sec", stopWatch.ElapsedMilliseconds / 1000);
                Console.WriteLine(queryResult.Values.First().Entities.Count / (stopWatch.ElapsedMilliseconds / 1000) + " per sec\n");

                // 4. Paged RetrieveMultiple with paging cookie
                resultCount = 0;
                var pagedResults = new EntityCollection();
                var fetchToQuery = new FetchXmlToQueryExpressionRequest { FetchXml = string.Format(fetchXml, 1) };
                var retrieveQuery = ((FetchXmlToQueryExpressionResponse)crmSvc.OrganizationServiceProxy.Execute(fetchToQuery)).Query;
                retrieveQuery.PageInfo = new PagingInfo { PageNumber = 1 };
                stopWatch.Restart();
                do
                {
                    pagedResults = crmSvc.OrganizationServiceProxy.RetrieveMultiple(retrieveQuery);
                    resultCount += pagedResults.Entities.Count;
                    retrieveQuery.PageInfo.PageNumber++;
                    retrieveQuery.PageInfo.PagingCookie = pagedResults.PagingCookie;
                } while (pagedResults.MoreRecords);
                stopWatch.Stop();
                Console.WriteLine("Paged RetrieveMultiple: Executed in {0} sec", stopWatch.ElapsedMilliseconds / 1000);
                Console.WriteLine(resultCount / (stopWatch.ElapsedMilliseconds / 1000) + " per sec\n");
            }
        }
    }
}

New calculated field functions in CRMOnline Update 1

I logged a suggestion in Connect a long time back about the lack of a DATEDIFF function option in calculated fields.

Quite to my surprise, CRMOnline Update 1 (Carina) now has a bunch of new DATEDIFF functions: DIFFINDAYS, DIFFINHOURS, DIFFINMINUTES, DIFFINMONTHS, DIFFINWEEKS and DIFFINYEARS.


There is also an additional function, “NOW”, that gives you the current datetime. According to the documentation, “NOW” returns SQL Server UTC time and not the user’s local time. This is true only if the datetime field is Time Zone Independent. If the datetime field is created as User Local, “NOW” returns the user’s local time.

If you try to use the NOW function in a datetime field with Time Zone Independent behaviour, you’ll get a “You can only use a Time-Zone Independent Date Time type field” error message.

The trick to getting UTC time in this case is to first create the datetime field as User Local, fill in the calculated field action, and only then change the datetime behaviour to Time Zone Independent. Also note that it is possible to change the datetime behaviour from User Local to Time Zone Independent, but not the other way around (from the UI).

I created two calculated fields using NOW function, one is User Local and one is Time Zone Independent. After doing an Advanced Find with the attributes, here is the result.

DIFFINYEARS and NOW in combination can be used in scenarios like calculating age from date of birth, days since record creation, or days to hit a certain deadline. These calculations, which were once accomplished by running a periodic workflow or service, are trivial to implement with calculated fields. For example, age calculation:
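The age calculation can be sketched as a calculated field action like this (the field names are illustrative):

    ryr_age = DIFFINYEARS(ryr_dateofbirth, NOW())

The age is then recomputed every time the record is retrieved, with no workflow involved.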

This is the Advanced Find result

These great additions in the Spring Update, make calculated fields more powerful than ever.