Getting entity record counts

There are a couple of ways to get entity record counts: you could use XrmToolBox’s Record Counter tool, or write FetchXML that counts the records as you page through them. But if you just want to use the SDK, there is a simple option: a new message called RetrieveTotalRecordCountRequest. Here is how to use it in C#:

// Dump() is a LINQPad helper and "this" is the connected organization service context;
// outside LINQPad, call Execute on your IOrganizationService and enumerate the collection.
var request = new RetrieveTotalRecordCountRequest
{
	EntityNames = new[] { "contact", "account" }
};
var counts = ((RetrieveTotalRecordCountResponse)this.Execute(request)).EntityRecordCountCollection;
counts.Dump();

Retrieve Record Count

Here is the same call in JavaScript:

var entities = ['contact', 'account'];
fetch(`${Xrm.Utility.getGlobalContext().getClientUrl()}/api/data/v9.1/RetrieveTotalRecordCount(EntityNames=@EntityNames)?@EntityNames=${encodeURIComponent(JSON.stringify(entities))}`, {
	credentials: 'include',
	headers: {
		'accept': 'application/json',
		'content-type': 'application/json; charset=utf-8',
		'odata-maxversion': '4.0',
		'odata-version': '4.0',
	},
	method: 'GET',
	mode: 'cors'
})
.then(x => x.json())
// Keys and Values are parallel arrays in the response, so log them pairwise.
.then(r => r.EntityRecordCountCollection.Keys.forEach((k, i) => console.log(`${k}: ${r.EntityRecordCountCollection.Values[i]}`)));

Retrieve Record Count JS

Hope this is useful.

Reference:

https://docs.microsoft.com/en-us/dotnet/api/microsoft.crm.sdk.messages.retrievetotalrecordcountrequest?view=dynamics-general-ce-9


CDS, Microsoft Flow and DateTime formats

EDIT (7/12): You can download the Flow from https://1drv.ms/u/s!AvzjERKFC6gOwCqKnnAvMZA6mDsW if you just want to jump straight into exploring the actual Flow, instead of reading the article.

DateTime and timezones seem to be the flavour of the month, so I thought I would take a crack at the problem as well. Coming from a CRM background and having spent a lot of time fiddling with the internal schema to understand its plumbing, I came up with the following approach, which I believe is decoupled and flexible.

In CDS, there is an entity called “UserSettings”, which stores a whole bunch of information about the user’s preferences. If you have used the awesome XrmToolBox tool called “UserSettings Utility”, you would remember this screen.

Timezone.png

Formats

The “UserSettings” entity is the source of this information. So, why not use it for managing timezones and datetime formats?
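If you ever need the same information outside of Flow, these settings can also be read directly with the SDK. Here is a minimal C# sketch (not part of the Flow) that pulls the relevant columns from the usersettings entity for the current user; “service” is assumed to be an existing IOrganizationService, and the attribute names are the same ones the Flow uses later in this post.

// Minimal SDK sketch: read the current user's format and timezone settings.
// Namespaces: System, System.Linq, Microsoft.Xrm.Sdk, Microsoft.Xrm.Sdk.Query.
var query = new QueryExpression("usersettings")
{
    ColumnSet = new ColumnSet("timezonecode", "dateformatstring", "timeformatstring",
        "dateseparator", "timeseparator")
};
query.Criteria.AddCondition("systemuserid", ConditionOperator.EqualUserId);
var settings = service.RetrieveMultiple(query).Entities.First();
Console.WriteLine(settings.GetAttributeValue<string>("dateformatstring"));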

For re-usability, I want to create a Flow that just returns me the timezone information based on the executing user. Here is how it looks.

Get Timezone and Formats.png

Let us understand this step by step.

Step 1:

This is the HTTP trigger, which can take in a parameter for activedirectoryguid, the unique identifier of the Office 365 user.

Step 1.png
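For clarity, the trigger body is just a small JSON payload along these lines (the Guid below is a placeholder):

{
  "activedirectoryguid": "00000000-0000-0000-0000-000000000000"
}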

Step 2:

Get the Office 365 profile of the current user. If the activedirectoryguid is not passed in step 1, the intention is to use the current user’s Office 365 information.

Step 2.png

Step 3 & 4:

Initialise the activedirectoryguid variable, then query the CDS “Users” entity to retrieve the user that matches the activedirectoryguid.

Step 3 and 4.png

Since activedirectoryguid can also be passed on the trigger, I use the following coalesce expression to set the variable to either the value passed on the trigger or the id from the Office 365 profile retrieved in step 2:

coalesce(triggerBody()?['activedirectoryguid'],body('Get_my_profile_(V2)')?['id'])

Step 5 & 6:

Retrieve the “User Settings” record and the associated “Time Zone Definitions” record based on the user’s timezone code.

Step 5 and 6.png

The filter for User Settings is

first(body('Get_System_Users')?['value'])['systemuserid']

The filter for Time Zone Definitions is

first(body('Get_User_Settings')?['value'])['timezonecode']

Step 7:

This is the last step, where we return all the format and timezone information.

Step 7

Below are the expressions for the returned properties:

timezone

first(body('Get_User_Timezone')?['value'])['standardname']

dateformat

replace(first(body('Get_User_Settings')?['value'])['dateformatstring'],'/',first(body('Get_User_Settings')?['value'])['dateseparator'])

timeformat

replace(first(body('Get_User_Settings')?['value'])['timeformatstring'],':',first(body('Get_User_Settings')?['value'])['timeseparator'])

dateseparator

first(body('Get_User_Settings')?['value'])['dateseparator']

timeseparator

first(body('Get_User_Settings')?['value'])['timeseparator']

Now let us look at some sample output. For a user located in the US, here is what the output of the Flow looks like.

US Output

Compare this with someone who is in an Australian timezone.

Australia Output

Since this Flow now contains the logic for getting the formats and timezones, it can be reused by any other Flow that needs this information. For example, look at the sample Flow below.

Calling Flow

Calling Flow1

Below is the expression I use to convert the “createdon” value returned by the CDS “Get record” step, which returns the datetime in UTC.

convertFromUtc(body('Get_record')?['createdon'],body('Parse_JSON')?['timezone'],concat(body('Parse_JSON')?['dateformat'],' ',body('Parse_JSON')?['timeformat']))
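If you were doing the same conversion in C# instead of Flow, a rough equivalent would look like the sketch below. This is illustrative only: “AUS Eastern Standard Time” is a Windows timezone id, which is the kind of value the standardname column on Time Zone Definitions holds, and the format string is just an example of a combined dateformat and timeformat.

// Rough C# equivalent of the convertFromUtc expression above (illustrative only).
var utcCreatedOn = DateTime.UtcNow; // stand-in for the UTC createdon value
var tz = TimeZoneInfo.FindSystemTimeZoneById("AUS Eastern Standard Time");
var local = TimeZoneInfo.ConvertTimeFromUtc(utcCreatedOn, tz);
Console.WriteLine(local.ToString("d/MM/yyyy h:mm tt")); // dateformat + ' ' + timeformat from user settings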

If you compare this Flow’s output with the record properties from Dynamics 365, it becomes obvious that the datetime and format are displayed correctly. Note that the CDS connector returns the datetime in UTC.

Output

CRM Record Properties

Initially I wanted to call the Flow that returns the formats and timezone using the “Start Flow” action on the Flow connector, but it does not seem to pick up the response, so I had to resort to the HTTP action, which is not ideal.

It seems “Start Flow” simply enables the Flow and does not actually run it. I am not sure why it is named in such a misleading way.

Hope this is helpful.

Using Flow to notify solution imports

EDIT (03/04/2019): I made further changes to display the solution URL, so that you can click and find out the details about the solution that was imported.

EDIT (20/12/2018): I updated my Flow and made some improvements. You simply have to set the Timezone on the triggering action and you are all set. The Flow will email the solution list to the user running the Flow. The download link has been updated to point to the updated solution.

I would not call it sneaky, but sometimes when I notice that the Dynamics 365 CE UI or behaviour has changed slightly, I can attribute it to some update that was applied to the environment. There are email notifications for major updates, but none for the minor updates or patches that can happen frequently. So, I decided to solve this problem using Flow.

Every solution import into the system creates an Import Job record. If a Flow can be scheduled to run every day and query the Import Job records that were started the previous day, we can easily keep track of what is happening in the environment.

The first step is to trigger the Flow on a preset schedule, and read the Import Job records.

Import Job Flow.png

Below are the expressions that I use for the boundary dates:

  • addDays(convertFromUtc(utcNow(),'AUS Eastern Standard Time'), -1, 'yyyy-MM-dd')
  • convertFromUtc(utcNow(),'AUS Eastern Standard Time','yyyy-MM-dd')
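For anyone who prefers code over the connector, here is a rough SDK sketch of the same “previous day” Import Job query. It is an illustration under assumptions (an existing IOrganizationService called “service”, simplified date handling), not the Flow itself.

// Import jobs started on the previous local day (rough SDK equivalent of the Flow's filter).
// Namespaces: System, Microsoft.Xrm.Sdk.Query.
var tz = TimeZoneInfo.FindSystemTimeZoneById("AUS Eastern Standard Time");
var today = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, tz).Date;
var query = new QueryExpression("importjob")
{
    ColumnSet = new ColumnSet("solutionname", "startedon", "data")
};
query.Criteria.AddCondition("startedon", ConditionOperator.GreaterEqual, today.AddDays(-1));
query.Criteria.AddCondition("startedon", ConditionOperator.LessThan, today);
var jobs = service.RetrieveMultiple(query).Entities;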

The next step is to pick up only the information we need from the returned result, and project it to a form that is conducive for email.

Import Job Compose

With the exception of Solution Name, all the other properties are retrieved from the XML in the Data property. Below are the formulas for those:

  1. Publisher: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//Publisher/Descriptions/Description[1]/@description)')
  2. Started On: convertFromUtc(items('Parse_Import_Job_XML_and_re-map')?['startedon'],'AUS Eastern Standard Time','dd-MM-yyyy HH:mm:ss')
  3. Old Version: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/fileVersion)')
  4. New Version: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/currentVersion)')
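If you wanted to do the same extraction in C# rather than in Flow, the XPath carries over directly. A small sketch, assuming the import job’s Data XML is already in a string variable called dataXml (a hypothetical name):

// Namespaces: System.Xml.Linq, System.Xml.XPath.
var doc = XDocument.Parse(dataXml);
var publisher = (string)doc.XPathEvaluate("string(//Publisher/Descriptions/Description[1]/@description)");
var oldVersion = (string)doc.XPathEvaluate("string(//upgradeSolutionPackageInformation/fileVersion)");
var newVersion = (string)doc.XPathEvaluate("string(//upgradeSolutionPackageInformation/currentVersion)");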

The last few steps are to email out the results, if any solutions were imported the previous day.

Email Solution.png

Email Step.png

The result is a barely formatted table, with the list of solutions that were imported the previous day.

Solution Import Email

With this approach, no one can sneak a solution import past you. You have visibility over what is going on in the system.

You can download and install the Flow solution using these links:

Unmanaged: https://1drv.ms/u/s!AvzjERKFC6gOwlMIAzgRSXWhtXth

Managed: https://1drv.ms/u/s!AvzjERKFC6gOwlKvIWhW-Pb_vHiT

Hope this is helpful to stay on top of imports.

Improving entity forms using embedded PowerApps

I have been looking into scenarios where PowerApps and Flow can benefit the Dynamics 365 Customer Engagement user experience. One scenario that can add value right away is on entity forms. PowerApps can be embedded as an IFrame on normal entity forms and used, much like Dialogs, to offload some processing to PowerApps and Flow.

Here is the finished product.

Embed PowerApps in UCI

This works without any JavaScript at all in UCI, using the normal IFrame control on the form. Make sure to tick the option that passes the record id and object type code, and untick the cross-site scripting restriction.

IFrame Properties.png

No scripts are needed on the form to embed the PowerApp.

Form Scripts

Once the PowerApp is in place, the current form context can be inferred from the “id” parameter that is passed to the app.

PowerApps Initial

I use a known Guid during the design phase to assist with the app design process, as the PowerApp calls the Flow during the OnStart event and sets the ProblemDetails variable.

A Flow can be associated with an event from the Action -> Flows area.

Associate Flow.png

When the PowerApp loads, it calls the Flow with the Guid to retrieve the case details. The Flow responds to PowerApps with these details from the case: Title, Customer Name, and Type of Customer (Account or Contact).

Case Details Flow.png

In this Flow I just use the “Respond to PowerApps” action and return the three outputs.

Return to PowerApps.png

I used variables to store the Client, which could be the Account’s Name or the Contact’s FullName, depending on what is on the case. The Client Type could be either Account or Contact. Account details and Contact details are retrieved based on the result of the Client Type branch.
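To make the branching clearer, here is a small C# sketch of the equivalent logic. It is only an illustration of what the Flow branches do, not the Flow itself; “caseId” and “service” are assumed to already exist.

// Work out the client name and client type for a case, the way the Flow branches do.
// Namespaces: Microsoft.Xrm.Sdk, Microsoft.Xrm.Sdk.Query.
var incident = service.Retrieve("incident", caseId, new ColumnSet("title", "customerid"));
var customer = incident.GetAttributeValue<EntityReference>("customerid");
var clientType = customer.LogicalName == "account" ? "Account" : "Contact";
var client = clientType == "Account"
    ? service.Retrieve("account", customer.Id, new ColumnSet("name")).GetAttributeValue<string>("name")
    : service.Retrieve("contact", customer.Id, new ColumnSet("fullname")).GetAttributeValue<string>("fullname");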

For the second Flow, the user presses the “Check” button, which performs some additional checks based on business criteria. For this Flow, I used the “Response” action, which allows me to return JSON results. I stored the cases of interest in an array variable.

For each case.png

From the variable, I used the Select action to grab only the properties I am interested in.

Select.png

I can then use the “Response” action to return these to PowerApps.

Response.png

One weird thing that I encountered in the PowerApps/Flow integration is that I would simply see the result as “True” from Flow when I tried to return the response straight from the variable.

True Response.png

When I used Parse JSON and then Select to reduce the properties, it started working. This can happen when there is something wrong with the schema validation, but I am not sure how that is possible here, since I copy-pasted the JSON response from the previous steps to auto-generate the schema.

One more thing: when the Flow that is associated with the PowerApp changes, make sure to disassociate and reassociate the Flow. I had issues when I did not do this, due to PowerApps caching the Flow definition.

References:

https://preview.flow.microsoft.com/en-us/blog/howto-upload-return-file/

https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/get-sessionid

Using Virtual Entities to query metadata

After my previous post, I continued to explore virtual entities to see what real-life problems I could solve with them. One problem I could think of was metadata. How awesome would it be if I could use Advanced Find to query entity and attribute metadata, or visualise them like a normal entity! It is not a dream anymore: I have developed an open-source solution to do this.

Here are some of the sample queries:

Query all attributes of type customer

Query by Attribute Type

Query by Attribute Type Results

Query all mandatory attributes

Query All Required.png

Query All Required Results.png

Query by Attribute Type and Entity

Query by Attribute Type and Entity.png

Query by Attribute Type and Entity Results.png

Query all Many to Many intersect entities

Query MM Entity.png

Query MM Entity Results

Query entities that have quick create enabled

Query Quick Create

Quick Create Results

You can open the entity and see more details.

Entity Form

You can open the attribute and view more details as well.

Attribute Form

All this awesomeness is possible using the power of virtual entities. There are two virtual entities that you can query: Entity and Attribute.
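I won’t reproduce the data provider code here (it is all in the repo below), but to give a sense of what feeds these two virtual entities, here is a hedged sketch of the kind of SDK metadata calls such a provider can be built on; it is not the actual plugin from the solution.

// Illustrative only - the real provider is in the GitHub repo.
// Namespaces: System, System.Linq, Microsoft.Xrm.Sdk.Messages, Microsoft.Xrm.Sdk.Metadata.
var response = (RetrieveAllEntitiesResponse)service.Execute(new RetrieveAllEntitiesRequest
{
    EntityFilters = EntityFilters.Entity | EntityFilters.Attributes
});
// e.g. entities with quick create enabled, and their customer-type attributes
foreach (var entity in response.EntityMetadata.Where(e => e.IsQuickCreateEnabled == true))
{
    Console.WriteLine(entity.LogicalName);
    foreach (var attribute in entity.Attributes.Where(a => a.AttributeType == AttributeTypeCode.Customer))
    {
        Console.WriteLine($"  {attribute.LogicalName}");
    }
}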

VE Solution

You can download the source code and managed solution from https://github.com/rajyraman/Metadata-Virtual-Entity/releases.

This step is important

After importing the managed solution, change the data source for the Attribute entity from “Entity Datasource” to “Attribute Datasource”. You have to do this from the Customization area, not from inside the managed solution.

Customise.png

Change Datasource.png

This is because, by default, the system does not allow relationships between two virtual entities that have different datasources. The exception below is shown when you try to do this.

Solution Exception

Stack Trace.png

To work around this exception, you keep the data source the same for the “Entity” (parent) and “Attribute” (child) virtual entities, create the relationship, and then change to the right datasource. Hence, the managed solution ships with the datasource set to “Entity Datasource” for the “Attribute” virtual entity, which has to be changed manually after importing the solution.

I hope this solution will be really useful for administrators. Please let me know any feedback on the post or via GitHub issues.

Sourcecode: https://github.com/rajyraman/Metadata-Virtual-Entity

Managed Solution: https://github.com/rajyraman/Metadata-Virtual-Entity/releases/latest

Virtual Entities for tracking recently used items

Virtual entities are a powerful feature that can be used to bring in data not only from external sources, but also from inside Dynamics CRM/Dynamics 365 Customer Engagement.

Jason Lattimer already has a post (https://jlattimer.blogspot.com.au/2017/12/creating-custom-virtual-entity-data.html) that goes through how to set up the custom datasource/data provider. Go and read that first, as it has all the screenshots and I would only be duplicating the content if I went through the steps again.

Gotcha 1:

There is an exception when you create the datasource.

Datasource Creation Error

You can simply ignore this and refresh the Plugin Registration tool.

Gotcha 2: If you don’t want the user to open an individual record, you don’t have to implement the Retrieve message; it is optional. Since I just want a collated entity, I did not register a plugin for Retrieve.

Data Provider

Gotcha 3: You have to open the newly created Data Provider entity and enter the external name. If you don’t enter this, you will be unable to create the data source, as it will always error out.

Data Source Primary Key Attribute

Create New Data Source

Data Source List

Objective: MRU (most recently used) items should be accessible from Advanced Find. As an administrator, I would like to query this data and see metrics around user participation, entity usage, activity by week/month/year, etc.

This is the Most Recently Used area.

Recent Items

This is the Advanced Find on the virtual entity, which is driven by the same data.

Advanced Find Results

As you can see, the data matches up. All the heavy lifting is done by the plugin, which retrieves the records from the “UserEntityUISettings” entity, parses the XML, sorts by user and accessed-on date, and then populates the virtual entity “Recent Items”.
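The plugin itself is in the repo linked below, but the overall shape of a RetrieveMultiple data provider plugin is worth showing. The sketch below is heavily trimmed and uses placeholder names (“new_recentitem” for the virtual entity, and “recentlyviewedxml” as my assumption for the UserEntityUISettings column); the key point is the BusinessEntityCollection output parameter, which is how the provider hands rows back to the platform.

// Trimmed-down sketch of a RetrieveMultiple data provider plugin (not the actual plugin).
// Namespaces: System, Microsoft.Xrm.Sdk, Microsoft.Xrm.Sdk.Query.
public class RecentItemsRetrieveMultiplePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Read the recently viewed XML stored against each user/entity combination.
        // ("recentlyviewedxml" is my assumption for the column name - check the repo for the real one.)
        var settings = service.RetrieveMultiple(new QueryExpression("userentityuisettings")
        {
            ColumnSet = new ColumnSet("recentlyviewedxml", "ownerid")
        });

        var results = new EntityCollection { EntityName = "new_recentitem" }; // placeholder logical name
        foreach (var setting in settings.Entities)
        {
            var xml = setting.GetAttributeValue<string>("recentlyviewedxml");
            if (string.IsNullOrEmpty(xml)) continue;

            // Parse the XML, sort by user and accessed-on date, and map each item
            // to a row of the virtual entity (mapping omitted in this sketch).
            results.Entities.Add(new Entity("new_recentitem", Guid.NewGuid()));
        }

        // A RetrieveMultiple data provider returns its rows through this output parameter.
        context.OutputParameters["BusinessEntityCollection"] = results;
    }
}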

You can query by “Type Equals”, “User Equals” and “User Equals Current User”.

Advanced Find

I can also build a Power BI report that is driven by the same virtual entity.

PowerBI Dashboard

Source code -> https://github.com/rajyraman/Recent-Items-Virtual-Entity

Managed Solution -> https://github.com/rajyraman/Recent-Items-Virtual-Entity/releases

I hope this helps people to use virtual entities to retrieve data from inside CRM as well – a sort of collation mechanism for reporting.

Reference:

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/virtual-entities/sample-generic-ve-plugin

https://jlattimer.blogspot.com.au/2017/12/creating-custom-virtual-entity-data.html

Integrating Slack to Dynamics 365 Customer Engagement

In the previous post, I described how easy it is to use Microsoft Flow to interact with Dynamics 365 Customer Engagement by letting Azure Functions handle the core logic. In this post, I will show how to integrate Slack with Dynamics 365 Customer Engagement using Flow and Functions.

This is the objective: from my Slack channel, I want to quickly query the record count using a slash command, without having to jump into XrmToolBox or the Dynamics 365 Customer Engagement application itself. I took the record count as a simple use case; you can create multiple slash commands, with each one doing a different targeted action in Dynamics 365.

The first step is to create the new app in Slack. Navigate to https://api.slack.com/apps/new

New Slack App.png

Since this is an internal app that I won’t be distributing, I am choosing a simple name. If you plan to distribute this app, choose a more appropriate name.

Now you will be taken to the app’s initial config screen.

New App Initial Screen.png

We will be creating a new slash command that returns the record count of an entity from Dynamics 365 Customer Engagement. Click on “Create a new command”.

Slash Commands.png

Choose the name for the slash command. I am just going with “/count”.

Add new slash command.png

The critical part here is the Request URL. This is the URL that Slack will POST to with some information. What is that information and what does it look like? I used RequestBin* (see the note at the end of this post) to find out.

Request Bin.png

Note the two relevant parameters:

  • command – the actual slash command the user executed
  • text – the text that comes after the slash command

For example, if I type “/count account” into the Slack chat window, the command parameter’s value will be “/count” and the text parameter’s value will be “account”. During the development phase, I put the RequestBin URL in the Request URL field. We will come back later, once the Flow is complete, and replace this placeholder URL with the actual Flow URL.

Now you can see the list of slash commands in this app.

List of slash commands.png

Now click on “Basic Information” on the left, and then on “Install your app to the workspace”. This should expand the section, and you can then install the app into your workspace by clicking on “Install App to Workspace”.

Slack App Information.png

Grant the required permissions for the app.

Authorise App.png

Now it is time to develop the Flow, which looks very similar to the one in my previous post about Flow and Functions. The difference here is that the Flow is triggered by an HTTP POST rather than manually using a Flow button. The Flow will receive the slash command from Slack. Here is what the Flow looks like.

Flow Execution Log

Here is what the Flow does:

  1. When the HTTP POST request is received from Slack, post a message back to Slack asking the user to wait while the record count is retrieved
  2. Check whether the slash command is “count”
  3. If the slash command is “count”, call the Azure Function using the custom connection (refer to the previous post on how to create a custom connection to an Azure Function that you can use in Flow)
  4. Parse the response received from the Azure Function, which queries Dynamics 365 Customer Engagement for the entity’s record count
  5. Send a mobile notification, which shows up if the user has the Flow app installed
  6. Send a message back to the channel the slash command was executed in, with the record count

There are three important bits in the Flow:

The first is getting the slash command from the POST message.

Parse command.png

The second is posting into the right Slack channel i.e. the channel that was the source of the slash command. You can get the channel from the “channel_name” parameter.

Post message step.png

The third is parsing the JSON returned by the Azure Function. This is the schema of the returned JSON:

{
    "type": "object",
    "properties": {
        "entityName": {
            "type": "string"
        },
        "count": {
            "type": "number"
        }
    }
}
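For context, the Azure Function behind the custom connection only needs to produce that shape. Here is a minimal, hedged sketch of its core, reusing the RetrieveTotalRecordCountRequest message from the first section; the method name and the way the IOrganizationService is obtained are assumptions, not the exact function used here.

// Given an entity logical name, return an object matching the { entityName, count } schema above.
// Namespaces: Microsoft.Xrm.Sdk, Microsoft.Crm.Sdk.Messages.
public static object GetRecordCount(IOrganizationService service, string entityName)
{
    var request = new RetrieveTotalRecordCountRequest { EntityNames = new[] { entityName } };
    var response = (RetrieveTotalRecordCountResponse)service.Execute(request);
    return new { entityName, count = response.EntityRecordCountCollection[entityName] };
}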

You can get the Flow URL by clicking on the HTTP step that is the first step of the Flow.

Flow URL.png

Grab the whole HTTP URL and plug it into the slash command’s Request URL.

Now, you can use the slash command on your workspace to get the record count.

Slack WorkspaceSlack Workspace result

Note: When I worked on this post last month, RequestBin had the capability to create private bins. But when I looked into this again this week, it looks like they have taken that capability away due to abuse -> https://github.com/Runscope/requestbin.

Request Bin message.png

You would have to self-host RequestBin to inspect the POST message from Slack. The other option is to create the Flow with just the HTTP request step and look at the execution log to see what was posted, like below.

HTTP Post.png