Improving entity forms using embedded PowerApps

I have been looking into scenarios with PowerApps and Flow that can benefit the Dynamics 365 Customer Engagement user experience. One of the scenarios that can add value right away is on the entity forms. PowerApps can be embedded as an IFrame on the normal entity forms, and can be used similarly to Dialogs to offload some processing to PowerApps and Flow.

Here is the finished product.

Embed PowerApps in UCI

This works without any JavaScript at all in UCI, using the normal IFrame control on the form. Make sure to tick the option that passes the record id and object type code, and untick the cross-site scripting restriction.

IFrame Properties.png

No scripts are needed on the form to embed the PowerApp.

Form Scripts

Once the PowerApp is in place, the current form context can be inferred using the “id” parameter that is passed on to the form.

PowerApps Initial

I use a known Guid during the design phase to assist with the app design process, as the PowerApp calls the Flow during the OnStart event and sets the ProblemDetails variable.

A Flow can be associated with an event, from the Action -> Flows area.

Associate Flow.png

When the PowerApp loads, it calls the Flow with the Guid to retrieve the case details. The Flow responds to PowerApps with these details from the case: Title, Customer Name, and Type of Customer (Account or Contact).

Case Details Flow.png

In this Flow I just use the “Respond to PowerApps” action and return the three outputs.

Return to PowerApps.png

I used variables to store the Client, which could be the Account’s name or the Contact’s FullName, depending on what is on the case. The Client Type could be either Account or Contact. Account details and Contact details are retrieved based on the result of the Client Type branch.
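In JavaScript terms, the shape of what the Flow hands back to PowerApps is roughly the following sketch. The function name and property names here are my own illustration, not the actual Flow output names.

```javascript
// Sketch: the three outputs the "Respond to PowerApps" action returns.
// Property names are illustrative, not the exact Flow output names.
function buildCaseDetailsResponse(caseTitle, clientName, clientType) {
  // The Client Type branch only ever produces one of these two values.
  if (clientType !== 'Account' && clientType !== 'Contact') {
    throw new Error(`Unexpected client type: ${clientType}`);
  }
  return {
    title: caseTitle,        // case Title
    client: clientName,      // Account name or Contact FullName
    clienttype: clientType   // "Account" or "Contact"
  };
}
```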

For the second Flow, the user presses the “Check” button, which performs some additional checks based on business criteria. For this Flow, I used the “Response” action, which allows me to return JSON results. I stored the cases I am interested in within an array variable.

For each case.png

From the variable, I used the Select action to grab only the properties I am interested in.


I can then use the “Response” action to return these to PowerApps.


One weird thing I encountered in the PowerApps/Flow integration is that I would simply see the result as “True” from Flow when I tried to return the response straight from the variable.

True Response.png

When I used Parse JSON and then Select to reduce the properties, it started working. This can happen when there is something wrong with the schema validation, but I am not sure how that can be the case here, as I copy-pasted the JSON response from the previous steps to auto-generate the schema.
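The Select step is essentially a projection. In JavaScript terms it does something like the sketch below; the case properties are made up for illustration.

```javascript
// Sketch of what Flow's Select action does: project each item in the
// array variable down to only the properties of interest.
// The property names below are made up for illustration.
const cases = [
  { title: 'Server down', severity: 1, internalId: 'a1' },
  { title: 'Slow reports', severity: 2, internalId: 'b2' }
];

// Keep only title and severity, dropping everything else.
const projected = cases.map(({ title, severity }) => ({ title, severity }));
```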

One more thing: when the Flow that is associated with the PowerApp changes, make sure to disassociate and reassociate the Flow. I had issues when I did not do this, due to PowerApps caching the Flow definition.






Using Virtual Entities to query metadata

After my previous post, I continued to explore virtual entities to see what real-life problems I could solve using them. One problem I could think of was metadata. How awesome would it be if I could use Advanced Find to query entity and attribute metadata, or visualise them as a normal entity! It is not a dream anymore. I have developed an open-source solution to do this.

Here are some sample queries:

Query all attributes of type customer

Query by Attribute Type

Query by Attribute Type Results

Query all mandatory attributes

Query All Required.png

Query All Required Results.png

Query by Attribute Type and Entity

Query by Attribute Type and Entity.png

Query by Attribute Type and Entity Results.png

Query all Many to Many intersect entities

Query MM Entity.png

Query MM Entity Results

Query entities that have quick create enabled

Query Quick Create

Quick Create Results

You can open the entity and see more details.

Entity Form

You can open the attribute and view more details as well.

Attribute Form

All this awesomeness is possible using the power of virtual entities. There are two virtual entities that you can query: Entity and Attribute.

VE Solution

You can download the source code and managed solution from

This step is important

After importing the managed solution, change the data source for the Attribute entity from “Entity Datasource” to “Attribute Datasource”. You have to do this from the Customization area, not from the managed solution.


Change Datasource.png

This is because, by default, the system does not allow relationships between two virtual entities that have different datasources. The exception below is thrown when you try to do this.

Solution Exception

Stack Trace.png

To work around this exception, keep the data source the same for the “Entity” (parent) and “Attribute” (child) virtual entities, create the relationship, and then change to the right datasource. Hence, the managed solution has the datasource set to “Entity Datasource” for the “Attribute” virtual entity, which has to be changed manually after importing the solution.

I hope this solution will be really useful for administrators. Please let me know any feedback on the post or via GitHub issues.


Managed Solution:



Virtual Entities for tracking recently used items

Virtual entities are a powerful feature that can be used not only to bring in data from external sources, but also data from inside Dynamics CRM/Dynamics 365 Customer Engagement.

Jason Lattimer already has a post that goes through how to set up the custom datasource/data provider. Go and read that first, as it has all the screenshots; I would just be duplicating the content if I went through the steps again.

Gotcha 1:

There is an exception when you create the datasource.

Datasource Creation Error

You can simply ignore this and refresh the Plugin Registration tool.

Gotcha 2: If you don’t want the user to open up an individual record, you don’t have to implement the Retrieve message. It is optional. Since I just want a collated entity, I did not register any plugin for Retrieve.

Data Provider

Gotcha 3: You have to open up the newly created Data Provider entity and enter the external name. If you don’t enter this, you will be unable to create the data source, as it will always error out.

Data Source Primary Key Attribute

Create New Data Source

Data Source List

Objective: MRU items should be accessible from Advanced Find. As an Administrator, I would like to query this data and see metrics around user participation, entity usage, activity by week/month/year, etc.

This is the Most Recently Used area.

Recent Items

This is the Advanced Find on the virtual entity, which is driven by the same data.

Advanced Find Results

As you can see, the data matches up. All the heavy lifting is done by the plugin, which retrieves the records from the “UserEntityUISettings” entity, parses the XML, sorts by user and accessed-on date, and then populates the virtual entity “Recent Items”.
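The plugin’s core transformation can be sketched like this in JavaScript. The XML element names below are illustrative only, not the exact UserEntityUISettings schema.

```javascript
// Sketch of the plugin's heavy lifting: parse a recently-viewed XML blob
// into flat items and sort by last-accessed date, newest first.
// The element names are illustrative, not the exact schema.
function parseRecentItems(xml) {
  const items = [];
  const re = /<Item>\s*<Title>([^<]*)<\/Title>\s*<LastAccessed>([^<]*)<\/LastAccessed>\s*<\/Item>/g;
  let match;
  while ((match = re.exec(xml)) !== null) {
    items.push({ title: match[1], lastAccessed: match[2] });
  }
  // ISO-8601 timestamps sort correctly as plain strings.
  return items.sort((a, b) => b.lastAccessed.localeCompare(a.lastAccessed));
}
```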

You can query by “Type Equals”, “User Equals” and “User Equals Current User”.

Advanced Find

I can also build a PowerBI report that is driven by the same virtual entity.

PowerBI Dashboard

Source code ->

Managed Solution ->

I hope this helps people use virtual entities to retrieve data from inside CRM as well – a sort of collation mechanism for reporting.


Integrating Slack to Dynamics 365 Customer Engagement

In the previous post I described how easy it is to use Microsoft Flow to interact with Dynamics 365 Customer Engagement, by letting Azure Functions handle the core logic. In this post, I will show how to integrate Slack to Dynamics 365 Customer Engagement using Flow and Functions.

This is the objective: in my Slack channel, I want to quickly query the record count using a slash command, without having to jump into XrmToolBox or the Dynamics 365 Customer Engagement application itself. I took the record count as a simple use case. You can create multiple slash commands, with each one doing a different targeted action in Dynamics 365.

The first step is to create the new app in Slack. Navigate to

New Slack App.png

Since this is an internal app that I won’t be distributing, I am choosing a simple name. If you plan to distribute this app, choose a more appropriate name.

Now you will be taken to the app’s initial config screen.

New App Initial Screen.png

We will be creating a new slash command that will return the record count of the entity from Dynamics 365 Customer Engagement. Click on “Create a new command”

Slash Commands.png

Choose the name for the slash command. I am just going with “/count”.

Add new slash command.png


The critical part here is the Request URL. This is the URL that Slack will POST to with some information. What is that information and what does it look like? I used RequestBin* (see footnote) to find out.

Request Bin.png


Note the two relevant parameters:

  • command – this is the actual slash command the user executed
  • text – this is the text that comes after the slash command

For example, if I typed “/count account” into the Slack chat window, the command parameter’s value will be “/count” and the text parameter’s value will be “account”. During the development phase, I put RequestBin’s URL in the Request URL. We will come back later, once the Flow is complete, and replace this placeholder URL with the actual Flow URL.
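Slack sends the slash command as a form-urlencoded POST body. A small sketch of pulling out the fields this post uses (the helper name is mine):

```javascript
// Sketch: Slack POSTs the slash command as an application/x-www-form-urlencoded
// body. This hypothetical helper extracts the fields used in this post.
function parseSlackCommand(body) {
  const params = new URLSearchParams(body);
  return {
    command: params.get('command'),          // e.g. "/count"
    text: params.get('text'),                // e.g. "account"
    channelName: params.get('channel_name')  // used later to post the reply back
  };
}
```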

Now you can see the list of slash commands in this app.

List of slash commands.png

Now click “Basic Information” on the left, and then on “Install your app to the workspace”. This should expand the section, and you can now install the app into your workspace by clicking on “Install App to Workspace”.

Slack App Information.png

Grant the required permissions for the app.

Authorise App.png

Now it is time to develop the Flow, which looks very similar to my previous post about Flow and Functions. The difference here is that the Flow is triggered by an HTTP POST, not manually using a Flow button. Flow will receive the slash command from Slack. Here is what the Flow looks like.

Flow Execution Log

Here is what the Flow does:

  1. When HTTP POST request is received from Slack, it posts a message back to Slack asking the user to wait while the record count is retrieved.
  2. Checks if the slash command is “count”
  3. If the slash command is “count”, call the Azure Function using the custom connection (refer to the previous post on how to create a custom connection to the Azure Function that you can use in Flow)
  4. Parse the response received from Azure Function, which queries Dynamics 365 Customer Engagement for the entity’s record count
  5. Send a mobile notification that shows up if the user has Flow app installed
  6. Send a message back to the channel that the slash command was executed on, with the record count

There are three important bits in the Flow:

The first is getting the slash command from the POST message.

Parse command.png

The second is posting into the right Slack channel i.e. the channel that was the source of the slash command. You can get the channel from the “channel_name” parameter.

Post message step.png

The third is parsing the JSON returned by the Azure Function. This is the schema of the JSON returned.

    {
        "type": "object",
        "properties": {
            "entityName": {
                "type": "string"
            },
            "count": {
                "type": "number"
            }
        }
    }
You can get the Flow URL by clicking on the HTTP step that is the first step of the Flow.

Flow URL.png

Grab the whole HTTP URL and plug it in on the slash command’s request URL.

Now, you can use the slash command on your workspace to get the record count.

Slack WorkspaceSlack Workspace result

Note: When I worked on this post last month, RequestBin had the capability to create private bins. But when I looked into this again this week, it looks like they have taken away this capability due to abuse ->

Request Bin message.png

You would have to self-host RequestBin to inspect the POST message from Slack. The other option is to create the Flow with just the HTTP request step and look into the execution log to see what was posted, like below.

HTTP Post.png


Introduction to integrating Azure Functions & Flow with Dynamics 365

I haven’t paid much attention to what is happening in the Azure space (Functions, Flow, Logic Apps, etc.), because I was under the impression that it is a daunting task to set up the integration, i.e. Azure AD registration, getting tokens, the Auth header and the whole shebang.

As a beginner trying to understand the stack and how to integrate the various applications, I had been postponing exploring this due to the boilerplate involved in setting it up. But then I read this post from Yaniv Arditi: Execute a Recurring Job in Microsoft Dynamics 365 with Azure Scheduler. Things started clicking, and I decided to spend a few days exploring Functions & Flow.

I started with a simple use case: as a Dynamics 365 Customer Engagement administrator, I need the ability to do some simple tasks from my mobile during my commute. A Flow button fits this requirement perfectly. The scenario I looked into solving is how to manage the Dynamics 365 Customer Engagement trace log settings from the Flow app on my mobile, in case I get a call about a plugin error on my way to work and need the logs waiting for me when I get to work.

As I wanted to get a working application as fast as possible, I did not start writing the Functions code from Visual Studio. Instead, I tested my code from LINQPad, as it is easier to import Nuget packages and also get Intellisense (Premium version). If you want to execute Azure Functions locally, read Azure Functions Tools for Visual Studio on the docs site. I did install and play with it once I completed the Flow+Function integration. When you install the Azure Functions Tools for Visual Studio, you also get the capability to run and debug the functions locally. How awesome is that ❤️!

There are two minor annoyances that I encountered with Visual Studio development locally:

  1. There is no Intellisense for csx files. Hopefully this will be fixed soon. The suggested approach in the meantime appears to be “Pre-compiled Azure Functions”, but I did not try it in this exploration phase. It also improves the Function execution time from a cold start.
  2. I had to install the Nuget packages locally using Install-Package, even though they were specified in project.json. I could not debug the Azure Functions locally without this, as the Nuget restore did not seem to happen automatically on build.

I will now go through the steps involved in creating the Flow button to update the Trace Log setting in Dynamics 365 Customer Engagement.

Step 1: Head to the Azure Portal.

Azure Portal.png

Step 2: Search for “Function App”, select the row that says “Function App” and click on Create in the right-most pane.

Functions App.png

Step 3: Specify the Functions App name and click “Create”.

Function App Settings

Step 4: Navigate to the newly created Function App from the notification area. It is also good to “Pin to dashboard” for easier access next time you log in to the portal.

Azure Notifications

Step 5: Click on the “Application Settings” link from the initial Functions App screen.

Functions Initial Screen.png

Step 6: Choose the Platform as 64-bit. I got compilation errors with the CrmSdk Nuget packages when this was set to 32-bit. You will also have to add the connection string to your CRM instance. The connection string name that I have specified is “CRM”. You may want to make this a bit more descriptive.

Functions General settings

Connection String.png

Step 7: Now is the exciting part. Click on the “+” button and then click on the “Custom Function” link.

Custom Function.png

Step 8: This new function will execute on an HTTP trigger and is coded in C#.

Functions Http Trigger

Step 9: After this, I sporadically experienced a blank right-hand pane with nothing in it. If this happens, simply do a page refresh and repeat steps 6–8. If everything goes well, you should see this screen. I left the Authorization level as “Function”, which means that the auth key needs to be in the URL for invocation.

New Function creation screen

Step 10: You are now presented with some quick start code. Click on the “View Files” pane, which is collapsed on the right hand side.

Default Functions Code.png

Step 11: Click on “Add” and enter the file name as “project.json”

Add Project Json

Step 12: Paste the below JSON into the “project.json” file, for retrieving the CRM SDK assemblies from Nuget, and press “Save”. The Nuget packages should begin to download.

  {
    "frameworks": {
      "net46": {
        "dependencies": {
          "Microsoft.CrmSdk.CoreAssemblies": "",
          "Microsoft.CrmSdk.XrmTooling.CoreAssembly": ""
        }
      }
    }
  }
Project Json Updated.png

Step 13: Now open the “run.csx” file, paste in the following code and save.

using System.Net;
using System.Configuration;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Tooling.Connector;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Get request body
    dynamic data = await req.Content.ReadAsAsync<object>();
    string traceLogLevel = data?.traceloglevel;

    if (traceLogLevel == null)
    {
        return req.CreateResponse(HttpStatusCode.BadRequest, "TraceLog Level not found on the query string or in the request body");
    }

    var client = new CrmServiceClient(ConfigurationManager.ConnectionStrings["CRM"].ConnectionString);
    var organizationSetting = client.RetrieveMultiple(new FetchExpression("<fetch><entity name='organization'><attribute name='plugintracelogsetting' /></entity></fetch>")).Entities.First();
    var oldTraceLogValue = (TraceLog)organizationSetting.GetAttributeValue<OptionSetValue>("plugintracelogsetting").Value;

    if (oldTraceLogValue.ToString() == traceLogLevel)
    {
        return req.CreateResponse(HttpStatusCode.OK, $"TraceLog Level has not changed from {traceLogLevel}. No update.");
    }

    var newTraceLogValue = (int)Enum.Parse(typeof(TraceLog), traceLogLevel);
    organizationSetting["plugintracelogsetting"] = new OptionSetValue(newTraceLogValue);
    client.Update(organizationSetting);

    return req.CreateResponse(HttpStatusCode.OK, $"Trace Log updated from {oldTraceLogValue} to {traceLogLevel}");
}

enum TraceLog
{
    Off = 0,
    Exception = 1,
    All = 2
}
Step 14: You can now execute this function by clicking “Run” and using the JSON in the screenshot as the POST body. The “traceloglevel” can be one of three values: Off, Exception and All.
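As a quick sketch of the mapping the function performs, the three accepted values correspond to the plugintracelogsetting option-set values (0 = Off, 1 = Exception, 2 = All); the helper below is illustrative JavaScript, not part of the function itself:

```javascript
// Sketch of the TraceLog mapping used by the C# function:
// plugintracelogsetting option-set values are 0 = Off, 1 = Exception, 2 = All.
const traceLogLevels = { Off: 0, Exception: 1, All: 2 };

function toOptionSetValue(level) {
  if (!(level in traceLogLevels)) {
    throw new Error(`Unknown trace log level: ${level}`);
  }
  return traceLogLevels[level];
}
```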

Execute Function.png

As you can see, the function:

  1. Connected to the organization specified in the Application Settings using the connection string
  2. Retrieved the current trace setting and updated it, if there is a change, using the SDK
  3. Returned the response as text/plain.

If you want to execute the same using Postman or Fiddler, you can grab the Function URL as well. Note that the auth key is in the URL.

Function Url

Step 15: Since I am going to do an update, I don’t want a GET call to trigger the change. So, just turn off “GET” and save. This means that the “traceloglevel” will only be updated on a “POST”, and not on a “GET” with a query string.

Functions Integrate.png

Step 16: Now it is time to export the API definition JSON for consumption by Flow.

API Definition.png

Step 17: Choose “Function Preview” as the API Definition Key and then click “Generate API Definition Template” button to generate the Swagger JSON.

API Definition Generate.png

Step 18: Now click on the “Authenticate” button and enter the function auth key (see Step 14) in the API Key textbox and click on “Authenticate” button in the dialog box.


You should see a green tick next to the apiKeyQuery. This means that the key has been accepted.


Step 19: Now it is time to add the post body structure to the Swagger JSON. I used the Swagger editor to play around with the schema and understand how this works. Thank you Nishant Rana for this tip.

Swagger JSON

You should now be able to POST to this function easily and inspect the responses.

Swagger Response.png

Step 20: Now click on the “Export to PowerApps+Flow” button and then on the “Download” button. You should now be prompted to save ApiDef.json into your file system.

Export to PowerApps Flow.png

Step 21: Now it is time to navigate to Flow

Microsoft Flow

Step 22: You can now create a custom connector to hookup Function and Flow.

Custom Connector.png

Step 23: It is now time to import the Swagger JSON file from Step 20. Choose “Create custom connector” and then “Import an OpenAPI file”. In the dialog box, choose the downloaded Swagger JSON file.

Step 24: Specify the details about the custom connector. These will be used later to search for the connector when you build the Flow.

Connector Information.png

Step 25: Just click next, as the API key will be specified on the connection, not on the connector. The URL query string parameter is “code”.

Connector Api Key

Step 26: Since I have only the “ModifyTraceLogSetting” action, this is the only one that shows up. If you have multiple functions in the Function App, multiple operations should be displayed on this screen.

Connector Action Definitions

Step 27: If you scroll down, you can see that the connector has picked up the message body that is to be sent with the POST.

Connector Message Body.png

Step 28: If you click on the “traceloglevel” parameter, you can see details about the POST body.

Connector Post Message Param.png

Step 29: This is the time to create the connection that will be used by the connector.

Connector Test.png

Step 30: Enter the Function API key that you got from Step 14. This will be used to invoke the Function.

Connections Api Key

Step 31: The connection does not show up straight away. You will have to click on the little refresh icon to the right of the Connections section. You can then test the connection by clicking the “Test Operation” button and choosing the parameter value for “traceloglevel” that will be sent with the POST body. You can also see the live response from the Function on this screen.

Connections Test with body.png

Connections Result

Step 32: Once you have saved your connector, you will see something like the below in the list of custom connectors.

Custom Connector View

Step 33: Now is the time to create the Flow. Choose My Flows -> Create from blank -> Search hundreds of connectors and triggers

Create FlowCreate blank flow

Step 34: Enter the Flow name, and since this will be invoked from the Flow app on mobile, choose “Flow button for mobile” as the connector.

Flow Button.png

Step 35: The Flow button will be obviously triggered manually.

Manually trigger flow.png

Step 36: When the user clicks the Flow button, it is time to grab the input, which in this case will be the Trace Log Level setting. Choose “Add a list of options” and also a name for the input.

Trigger Flow Input

Step 37: You don’t want the user to enter free text or numbers, so you present a list of options from which the user will choose one.

Trace Level Options.png

Step 38: After clicking “Add an action”, you can now choose the custom connector that you created. Search and locate your custom connector.

Flow Custom Connector.png

Step 39: Flow has magically populated the actions that are exposed by this connector. In this case there is only one action to modify the Trace Log setting.

Flow Custom Connector Action.png

Step 40: In this step you don’t want to choose a value at design time, but rather map the user-entered value to the custom connector. So, choose “Enter custom value”.

Trace Log Level Custom Connector.png

Step 41: The name of the input in Step 37 is “Trace Level”, so choose this as the value that will be bound to the custom connector parameter.

Trace Log Level Custom Connector Bind

Step 42: In this case, I have a simple action: I just want to receive a mobile notification.

Trace Log Notification.png

Step 43: I just want to receive a notification on my mobile, since I have the Flow app installed. When my custom connector calls the function that updates the trace log level, the response text returned by the function comes through in the body in the Flow app.

This text is displayed as a notification. If the Function returns JSON, you have to use the Parse JSON action to grab the right property. In this case, that is not required, as the response is plain text.

Send Mobile Notification.png

Send Mobile Notification Body.png

Step 44: When the Flow design is complete it should look like this.

Flow Design Complete.png

Step 45: You can run the Flow from either the Flow app on mobile or right from here. I click “Run Now” to check if everything is OK. You can also specify the “Trace Level” here that will be passed to the Function.

Run Flow.png

Run Flow Trace Level Parameter.png

Step 46: I can check the status of the Flow easily. The cool thing about this screen is that it logs so much information that is useful when troubleshooting what went wrong.

Flow Execution Log.png

I can also invoke this Flow on my mobile, using the Flow App. I get a native notification when the Flow completes.

What’s next

While I was experimenting with Flow and Functions, I wanted to test integration between Slack and Dynamics 365. As a proof of concept, I am running a custom command (“/recordcount”) on a Slack channel to retrieve record counts from Dynamics 365.

Slack Channel.png

I will blog about this next.

Conclusion: I am really excited about the future of Flow & Functions and what they bring to the table, both for developers who want to get their hands dirty and for power users who want something they can hook up easily without writing any code.

If you have any feedback, suggestions or errors in this post, please comment below, so that I can learn and improve.

Export all attachments using LINQPad

I was playing around with LINQPad today and wrote this C# code to export all attachments from CRM. You can customise the query to export only certain attachments if required. You could also modify the code to gather the output location from the user, instead of asking them to choose between “My Documents” and “Desktop”. This could also potentially be written as an XrmToolBox tool.

I executed the code in LINQPad v5.26 against a Dynamics CRM 2016 OnPremise 8.1 environment. I tried to retrieve the attachments using LINQ, but decided to use a normal QueryByAttribute with paging for performance reasons.

Choose an output path

var folders = new List<Environment.SpecialFolder> { Environment.SpecialFolder.Desktop, Environment.SpecialFolder.MyDocuments };
folders.ForEach(x => new Hyperlinq(() => DumpFiles(Environment.GetFolderPath(x)), x.ToString()).Dump());

void DumpFiles(string selectedFolder)
{
	new Hyperlinq(selectedFolder).Dump("Chosen output path");
	var progress = new Util.ProgressBar("Writing files: ").Dump();
	progress.HideWhenCompleted = true;
	var retrieveQuery = new QueryByAttribute("annotation")
	{
		ColumnSet = new ColumnSet("documentbody", "filename"),
		PageInfo = new PagingInfo { Count = 500, PageNumber = 1 }
	};
	retrieveQuery.AddAttributeValue("isdocument", true);
	var resultsDc = new DumpContainer().Dump("Results");
	EntityCollection results;
	int totalRecordCount = 0;
	do
	{
		results = ((RetrieveMultipleResponse)this.Execute(new RetrieveMultipleRequest { Query = retrieveQuery })).EntityCollection;
		var files = results.Entities.Cast<Annotation>();
		totalRecordCount += results.Entities.Count;
		resultsDc.Content = $"Completed Page {retrieveQuery.PageInfo.PageNumber}, Files: {totalRecordCount}";
		int fileNumber = 0;
		foreach (var f in files)
		{
			fileNumber++;
			var fileContent = Convert.FromBase64String(f.DocumentBody);
			File.WriteAllBytes(Path.Combine(selectedFolder, f.FileName), fileContent);
			progress.Caption = $"Page {retrieveQuery.PageInfo.PageNumber} - Writing files: {fileNumber}/{retrieveQuery.PageInfo.Count}";
			progress.Percent = fileNumber * 100 / retrieveQuery.PageInfo.Count;
		}
		retrieveQuery.PageInfo.PageNumber++;
		retrieveQuery.PageInfo.PagingCookie = results.PagingCookie;
	} while (results.MoreRecords);
	resultsDc.Content = $"{totalRecordCount} files saved.";
}
LINQPad Annotation Export User Input.png

LINQPad Annotation Export.png

Basic CRUD using Xrm.WebApi

UPDATE (30/10): Official documentation has been published -> Andrii got it right. IMHO this feels a little clunky and incomplete, as you need to know the message parameters along with the types. Luckily, it appears Jason is already working on this issue -> and once this is done, it will make life easy again.

UPDATE (23/10): Part 2 and Part 3 have been published. I am not sure if this is how MS intends this to be used. I will wait for official MS documentation for confirmation.

UPDATE (18/10): It appears Andrii got to this topic first -> I should probably have subscribed to his RSS feed – it could have saved me some time. Anyway, there is also a Part 2 that he has not posted yet, so I am looking forward to seeing what I missed.

Dynamics 365 Customer Engagement v9 has added CRUD functionality to query the WebAPI endpoint using Client API.

Xrm Web Api.png

Based on my initial analysis, this seems to be a work in progress, and more functions will be added over time. Here is some sample code showing how you can do basic CRUD using this new feature. This is not exhaustive documentation, but considering that there is nothing about this in the official documentation, it is a starting point.

Create : Method signature is ƒ (entityType, data)

Sample code to create 3 contact records

[...Array(3).keys()].forEach(x => Xrm.WebApi.createRecord('contact', {
    firstname: 'Test',
    lastname: `Contact${x}`
}).then(c => console.log(`${x}: Contact with id ${c.id} created`))
  .fail(e => console.log(e.message)))

WebApi Create.png

Retrieve: Method signature is ƒ (entityName, entityId, options)

Sample code to retrieve contact record based on the primary key

Xrm.WebApi.retrieveRecord('contact', 'cadf8ac6-17b1-e711-a842-000d3ad11148', '$select=telephone1')
  .then(x => console.log(`Telephone: ${x.telephone1}`))
  .fail(e => console.log(e.message))

WebApi Retrieve

RetrieveMultiple: Method signature is ƒ (entityType, options, maxPageSize)

Sample code to retrieve 10 contact records without any conditions.

Xrm.WebApi.retrieveMultipleRecords('contact', '$select=fullname,telephone1', 10)
  .then(x => x.entities.forEach(c => console.log(`Contact id: ${c.contactid}, fullname: ${c.fullname}, telephone1: ${c.telephone1}`)))
  .fail(e => console.log(e.message))

WebApi RetrieveMultiple.png

Update: Method signature is ƒ (entityName, entityId, data)

Sample code to update field on contact record

Xrm.WebApi.updateRecord('contact', 'cadf8ac6-17b1-e711-a842-000d3ad11148', {
    telephone1: '12345'
}).then(x => console.log(`Contact with id ${x.id} updated`))
  .fail(x => console.log(x.message))

WebApi Update.png

Delete: Method signature is ƒ (entityName, entityId)

Xrm.WebApi.deleteRecord('contact', '88E682D8-18B1-E711-A842-000D3AD11148')
  .then(c => console.log('Contact deleted'))
  .fail(x => console.log(x.message))

WebApi Delete.png
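One thing to note from the samples above: these Xrm.WebApi calls resolve via .then and report errors via .fail, rather than behaving as native Promises. A thin wrapper, which is my own sketch and not part of the Client API, makes them compose with async/await:

```javascript
// Sketch: adapt a .then/.fail style result (as returned by Xrm.WebApi in the
// samples above) into a native Promise. This wrapper is my own, not part of
// the Client API.
function asPromise(xrmPromise) {
  return new Promise((resolve, reject) => {
    xrmPromise.then(resolve).fail(reject);
  });
}
```

With this in place, something like `await asPromise(Xrm.WebApi.retrieveRecord('contact', id, '$select=fullname'))` reads naturally in async code.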

What is not yet done/appears to be in progress

  1. Xrm.WebApi.offline is not yet implemented
  2. Ability to construct custom OData requests to pass into Xrm.WebApi.execute (Refer Andrii’s post)
  3. Batching multiple requests (Refer Andrii’s post)

You can use this in your client-side code on v9. It is quite basic at the moment, but you don’t need to include any external libraries. For more advanced scenarios, you can always use the Xrm WebAPI Client till these features are made available in the Client API.