CDS, Microsoft Flow and DateTime formats

EDIT (7/12): You can download the Flow from https://1drv.ms/u/s!AvzjERKFC6gOwCqKnnAvMZA6mDsW if you just want to jump straight into exploring the actual Flow, instead of reading the article.

DateTime and timezones seem to be the flavor of the month, so I thought I would take a crack at the problem as well. Coming from a CRM background, and having spent a lot of time fiddling around with the internal schema to understand its plumbing, I came up with the following approach, which I believe is decoupled and flexible.

In CDS, there is an entity called “UserSettings”, which stores a whole bunch of information regarding the user preferences. If you have used the awesome XrmToolBox tool called “UserSettings Utility”, you would remember this screen.

Timezone.png

Formats

The “UserSettings” entity is the source of this information. So, why not use the same for managing the timezones and datetime formats?

For re-usability, I want to create a Flow that just returns me the timezone information based on the executing user. Here is how it looks.

Get Timezone and Formats.png

Let us understand this step by step.

Step 1:

This is the HTTP trigger, which can take in a parameter for activedirectoryguid. This is the unique identifier for the Office 365 user.

Step 1.png
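If you want to rebuild this trigger yourself, the request body JSON schema only needs a single optional property. Here is a minimal sketch (the property name matches the expression used later in the Flow):

{
  "type": "object",
  "properties": {
    "activedirectoryguid": {
      "type": "string"
    }
  }
}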

Step 2:

Get the Office 365 profile of the current user. If the activedirectoryguid is not passed in step 1, the intention is to use the current user’s Office 365 information.

Step 2.png

Step 3 & 4:

Initialise the activedirectoryguid variable, then query the CDS “Users” entity to retrieve the user that matches the activedirectoryguid.

Step 3 and 4.png

Since activedirectoryguid can also be passed on the trigger, I use the following coalesce expression to set the variable to either the value passed on the trigger or the id from the Office 365 profile retrieved in step 2.

coalesce(triggerBody()?['activedirectoryguid'],body('Get_my_profile_(V2)')?['id'])

Step 5 & 6:

Retrieve the “User Settings” entity and the associated “Time Zone Definitions” record based on the user’s timezone.

Step 5 and 6.png

The filter for User Settings is

first(body('Get_System_Users')?['value'])['systemuserid']

The filter for Time Zone Definitions is

first(body('Get_User_Settings')?['value'])['timezonecode']
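In the screenshots, these expressions are plugged into the Filter Query of the corresponding List records steps. Assuming the standard logical names, the filters would look roughly like this (a sketch, not copied verbatim from the actual Flow):

systemuserid eq '@{first(body('Get_System_Users')?['value'])['systemuserid']}'

timezonecode eq @{first(body('Get_User_Settings')?['value'])['timezonecode']}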

Step 7:

This is the last step, where we return all the format and timezone information.

Step 7

Below are the expressions for the returned properties:

timezone

first(body('Get_User_Timezone')?['value'])['standardname']

dateformat

replace(first(body('Get_User_Settings')?['value'])['dateformatstring'],'/',first(body('Get_User_Settings')?['value'])['dateseparator'])

timeformat

replace(first(body('Get_User_Settings')?['value'])['timeformatstring'],':',first(body('Get_User_Settings')?['value'])['timeseparator'])

dateseparator

first(body('Get_User_Settings')?['value'])['dateseparator']

timeseparator

first(body('Get_User_Settings')?['value'])['timeseparator']
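Putting these together, the body returned by the final Response step is a small JSON payload along these lines (the values below are purely illustrative):

{
  "timezone": "AUS Eastern Standard Time",
  "dateformat": "d/MM/yyyy",
  "timeformat": "h:mm tt",
  "dateseparator": "/",
  "timeseparator": ":"
}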

Now let us look at some sample output. For a user located in the US, here is what the output of the Flow looks like.

US Output

Compare this with someone who is in an Australian timezone.

Australia Output

Since this Flow now contains the logic for getting the formats and timezones, it can be reused by any other Flow that needs this information. For example, look at the sample Flow below.

Calling Flow

Calling Flow1

Below is the expression I use to convert the “createdon” value returned by the CDS “Get record” step, which returns the datetime in UTC.

convertFromUtc(body('Get_record')?['createdon'],body('Parse_JSON')?['timezone'],concat(body('Parse_JSON')?['dateformat'],' ',body('Parse_JSON')?['timeformat']))
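As an illustrative example: if “Get record” returns a createdon of 2019-07-01T01:30:00Z, and the parsed response contains “AUS Eastern Standard Time”, “d/MM/yyyy” and “h:mm tt”, the expression above produces 1/07/2019 11:30 AM.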

If you compare this Flow output with the record properties from Dynamics 365, it becomes obvious that the datetime and format are displayed correctly. Note that the CDS connector returns the datetime in UTC.

Output

CRM Record Properties

Initially I wanted to call the Flow that returns the formats and timezone using the “Start Flow” action on the Flow connector, but it does not seem to pick up the response, so I had to resort to the full HTTP action, which is not ideal.

It seems “Start Flow” simply enables the Flow; it does not actually run it. I am not sure why it is named in such a misleading way.

Hope this is helpful.


Using Flow to notify solution imports

I would not call it sneaky, but sometimes when I find the Dynamics 365 CE UI or behaviour has changed slightly, I can attribute it to some update that was applied to the environment. There are email notifications for major updates, but none for minor updates or patches that can happen frequently. So, I decided to solve this problem using Flow.

Every solution import into the system causes an Import Job record to be created. If a Flow can be scheduled to run every day and query the Import Job records that were started the previous day, we can easily keep track of what is happening in the environment.

The first step is to trigger the Flow on a preset schedule, and read the Import Job records.

Import Job Flow.png

Below are the expressions that I use for the boundary dates:

  • addDays(convertFromUtc(utcNow(),'AUS Eastern Standard Time'), -1, 'yyyy-MM-dd')
  • convertFromUtc(utcNow(),'AUS Eastern Standard Time','yyyy-MM-dd')
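These boundary dates then drive the filter on the Import Job list step. Combining them, the OData filter would look roughly like this (a sketch of the idea, not copied from the downloadable solution):

startedon ge '@{addDays(convertFromUtc(utcNow(),'AUS Eastern Standard Time'), -1, 'yyyy-MM-dd')}' and startedon lt '@{convertFromUtc(utcNow(),'AUS Eastern Standard Time','yyyy-MM-dd')}'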

The next step is to pick up only the information we need from the returned result, and project it to a form that is conducive for email.

Import Job Compose

With the exception of Solution Name, all the other properties are retrieved from the XML in the Data property. Below are the formulas for those:

  1. Publisher: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//Publisher/Descriptions/Description[1]/@description)')
  2. Started On: convertFromUtc(items('Parse_Import_Job_XML_and_re-map')?['startedon'],'AUS Eastern Standard Time','dd-MM-yyyy HH:mm:ss')
  3. Old Version: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/fileVersion)')
  4. New Version: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/currentVersion)')
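For reference, those xpath expressions assume the Data property contains XML fragments roughly like the following (heavily truncated and purely illustrative):

<Publisher>
  <Descriptions>
    <Description description="Contoso Publisher" />
  </Descriptions>
</Publisher>
...
<upgradeSolutionPackageInformation>
  <fileVersion>1.0.0.0</fileVersion>
  <currentVersion>1.0.0.1</currentVersion>
</upgradeSolutionPackageInformation>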

The last few steps are to email out the results, if any solutions were imported the previous day.

Email Solution.png

Email Step.png

The result is a barely formatted table, with the list of solutions that were imported the previous day.

Email Solution Result

With this approach, no one can sneak a solution import past you. You have visibility over what is going on in the system.

You can download and install the Flow solution from https://onedrive.live.com/?id=0EA80B851211E3FC%21105&cid=0EA80B851211E3FC#cid=0EA80B851211E3FC&id=0EA80B851211E3FC%218012&parId=0EA80B851211E3FC%21105&action=defaultclick

Hope this is helpful to stay on top of imports.

Using Virtual Entities to query metadata

After my previous post, I continued to explore virtual entities to see what real-life problems I could solve with them. One problem I could think of was metadata. How awesome would it be if I could use Advanced Find to query entity and attribute metadata, or visualise it like any normal entity! It is not a dream anymore. I have developed an open-source solution to do this.

Here are some sample queries:

Query all attributes of type customer

Query by Attribute Type

Query by Attribute Type Results

Query all mandatory attributes

Query All Required.png

Query All Required Results.png

Query by Attribute Type and Entity

Query by Attribute Type and Entity.png

Query by Attribute Type and Entity Results.png

Query all Many to Many intersect entities

Query MM Entity.png

Query MM Entity Results

Query entities that have quick create enabled

Query Quick Create

Quick Create Results

You can open the entity and see more details.

Entity Form

You can open the attribute and view more details as well.

Attribute Form

All this awesomeness is possible using the power of virtual entities. There are two virtual entities that you can query: they are called Entity and Attribute.

VE Solution

You can download the source code and managed solution from https://github.com/rajyraman/Metadata-Virtual-Entity/releases.

This step is important

After importing the managed solution, change the data source for the Attribute entity from “Entity Datasource” to “Attribute Datasource”. You have to do this from the Customization area and not from within the managed solution.

Customise.png

Change Datasource.png

This is because, by default, the system does not allow relationships between two virtual entities that have different data sources. The exception below is thrown when you try to do this.

Solution Exception

Stack Trace.png

To work around this exception, you keep the data source the same for the “Entity” (parent) and “Attribute” (child) virtual entities, create the relationship, and then change to the correct data source. Hence, the managed solution has the data source set to “Entity Datasource” for the “Attribute” virtual entity, and it has to be changed manually after importing the solution.

I hope this solution will be really useful for administrators. Please let me know any feedback on the post or via GitHub issues.

Sourcecode: https://github.com/rajyraman/Metadata-Virtual-Entity

Managed Solution: https://github.com/rajyraman/Metadata-Virtual-Entity/releases/latest


Virtual Entities for tracking recently used items

Virtual entities are a powerful feature that can be used not only to bring in data from external sources, but also to surface data from inside Dynamics CRM/Dynamics 365 Customer Engagement.

Jason Lattimer already has a post (https://jlattimer.blogspot.com.au/2017/12/creating-custom-virtual-entity-data.html) that goes through how to set up the custom datasource/data provider. Go and read that first, as it has all the screenshots and I would just be duplicating the content if I went through the steps again.

Gotcha 1:

There is an exception when you create the datasource.

Datasource Creation Error

You can simply ignore this and refresh the Plugin Registration tool.

Gotcha 2: If you don’t want the user to open an individual record, you don’t have to implement the Retrieve message; it is optional. Since I just wanted a collated entity, I did not register any plugin step for Retrieve.

Data Provider

Gotcha 3: You have to open the newly created Data Provider entity and enter the external name. If you don’t enter this, you will be unable to create the data source, as it will always error out.

Data Source Primary Key Attribute

Create New Data Source

Data Source List

Objective: MRU items should be accessible from Advanced Find. As an Administrator, I would like to query this data, and see metrics around user participation, entity usage, activity by week/month/year etc.

This is the Most Recently Used area.

Recent Items

This is the Advanced Find on the virtual entity, which is driven by the same data.

Advanced Find Results

As you can see, the data matches up. All the heavy lifting is done by the plugin, which retrieves the records from the “UserEntityUISettings” entity, parses the XML, sorts by user and accessed-on date, and then populates the virtual entity “Recent Items”.
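For those curious, here is a minimal sketch of what a RetrieveMultiple data provider plugin of this kind can look like. The virtual entity and attribute names below are illustrative assumptions, not the actual names used in the GitHub solution, and the XML parsing is elided; refer to the source code linked below for the real implementation.

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Sketch of a RetrieveMultiple data provider plugin backing a virtual entity.
public class RecentItemsRetrieveMultiplePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Read the source records that back the virtual entity.
        var query = new QueryExpression("userentityuisettings")
        {
            ColumnSet = new ColumnSet("ownerid", "objecttypecode", "recentlyviewedxml")
        };
        var sourceRecords = service.RetrieveMultiple(query);

        var results = new EntityCollection();
        foreach (var record in sourceRecords.Entities)
        {
            // Parse RecentlyViewedXml here and project each item into the
            // virtual entity (one output record per recently viewed item).
            // "new_recentitem" and "new_name" are placeholder names.
            var recentItem = new Entity("new_recentitem")
            {
                Id = Guid.NewGuid()
            };
            recentItem["new_name"] = record.GetAttributeValue<string>("objecttypecode");
            results.Entities.Add(recentItem);
        }

        // A data provider returns its results through this output parameter.
        context.OutputParameters["BusinessEntityCollection"] = results;
    }
}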

You can query by “Type Equals”, “User Equals” and “User Equals Current User”.

Advanced Find

I can also do a PowerBI report that is driven by the same virtual entity.

PowerBI Dashboard

Source code -> https://github.com/rajyraman/Recent-Items-Virtual-Entity

Managed Solution -> https://github.com/rajyraman/Recent-Items-Virtual-Entity/releases

I hope this helps people to use virtual entities to retrieve data from inside CRM as well – a sort of collation mechanism for reporting.

Reference:

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/virtual-entities/sample-generic-ve-plugin

https://jlattimer.blogspot.com.au/2017/12/creating-custom-virtual-entity-data.html

Export all attachments using LINQPad

I was playing around with LINQPad today and wrote this C# code to export all attachments from CRM. You can customise the query to export only certain attachments if required. You could also modify the code to gather the output location from the user, instead of asking them to choose between “My Documents” or “Desktop”. This could also be potentially written as an XrmToolBox tool.

I executed the code in LINQPad v5.26 against a Dynamics CRM 2016 on-premise (8.1) environment. I tried to retrieve the attachments using LINQ, but decided to use a plain QueryByAttribute with paging for performance reasons.

// Let the user pick where the attachments will be written.
Util.RawHtml("<h4>Choose an output path</h4>").Dump();
var folders = new List<Environment.SpecialFolder> { Environment.SpecialFolder.Desktop, Environment.SpecialFolder.MyDocuments };
folders.ForEach(x => new Hyperlinq(() => DumpFiles(Environment.GetFolderPath(x)), x.ToString()).Dump());

void DumpFiles(string selectedFolder)
{
	Util.ClearResults();
	new Hyperlinq(selectedFolder).Dump("Chosen output path");
	var progress = new Util.ProgressBar("Writing files: ").Dump();
	progress.HideWhenCompleted = true;

	// Retrieve only attachments that are actual documents, 500 records per page.
	var retrieveQuery = new QueryByAttribute("annotation")
	{
		ColumnSet = new ColumnSet("documentbody", "filename"),
		PageInfo = new PagingInfo { Count = 500, PageNumber = 1 }
	};
	retrieveQuery.AddAttributeValue("isdocument", true);

	var resultsDc = new DumpContainer().Dump("Results");
	EntityCollection results;
	int totalRecordCount = 0;
	do
	{
		// Page through the annotations until there are no more records.
		results = ((RetrieveMultipleResponse)this.Execute(new RetrieveMultipleRequest { Query = retrieveQuery })).EntityCollection;
		var files = results.Entities.Cast<Annotation>();
		totalRecordCount += results.Entities.Count;
		resultsDc.Content = $"Completed Page {retrieveQuery.PageInfo.PageNumber}, Files: {totalRecordCount}";
		int fileNumber = 0;
		foreach (var f in files)
		{
			fileNumber++;
			// The document body is stored as a base64 encoded string.
			var fileContent = Convert.FromBase64String(f.DocumentBody);
			File.WriteAllBytes(Path.Combine(selectedFolder, f.FileName), fileContent);
			progress.Caption = $"Page {retrieveQuery.PageInfo.PageNumber} - Writing files: {fileNumber}/{retrieveQuery.PageInfo.Count}";
			progress.Percent = fileNumber * 100 / retrieveQuery.PageInfo.Count;
		}
		retrieveQuery.PageInfo.PageNumber++;
		retrieveQuery.PageInfo.PagingCookie = results.PagingCookie;
	} while (results.MoreRecords);
	resultsDc.Content = $"{totalRecordCount} files saved.";
}

LINQPad Annotation Export User Input.png

LINQPad Annotation Export.png

Using Chrome DevTools to debug Bot Framework

Debugging the Bot Framework is trivial if you are developing in C#. It was a bit of a pain for me to do the same in node, as JavaScript is not my everyday language. But after some googling, I can say that it is really easy in node as well.

Below are the version details that I tried this on:

  • node: 8.1.2
  • npm: 5.0.3
  • OS: Windows 10
  • Chrome: Canary 61.0.3137.0
  • Bot Framework Emulator: 3.5.29

Below is my package.json for a simple “Hello World” bot.

{
  "name": "echo-bot",
  "version": "0.1.0",
  "license": "MIT",
  "description": "Echo bot example",
  "scripts": {
    "start": "node --inspect app.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "dependencies": {
    "botbuilder": "^3.8.4",
    "dotenv": "^4.0.0",
    "restify": "^4.3.0"
  }
}

Since I am developing this in Visual Studio Code, this is my launch.json:

{
    // Use IntelliSense to learn about possible Node.js debug attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "launch",
            "cwd": "${workspaceRoot}",
            "runtimeExecutable": "npm",
            "runtimeArgs": [
                "start"
            ],
            "name": "Launch Program"
        }
    ]
}

Now, run the bot application in VSCode using Ctrl+F5, or by typing “npm start” at the command prompt.

To debug the bot in Chrome DevTools, navigate to “chrome://inspect/#devices”. Then click the “Open dedicated DevTools for Node” link. The page should look like this once you do.

Chrome DevTools to debug Bot Framework.png

Chrome will now display the node applications that it can detect in the Remote Target area. Click the inspect link to open the DevTools. Now, set your breakpoint and use the emulator to send a message.

Debug botframework source

In the screenshot above you can see that I am already debugging using Chrome DevTools, and the debugger is paused at the breakpoint that I set.

Reference:

https://nodejs.org/en/docs/inspector/


User not found error

Word of warning: This change is unsupported as it involves editing the CRM database directly. Use it at your own risk.

I encountered a weird error in CRM that basically said the user could not be found. When I checked the event log, this appeared to be the underlying exception:

Exception type: CrmException
Exception message: No Microsoft Dynamics CRM user exists with the specified domain name and user ID
at Microsoft.Crm.Authentication.Claims.AuthenticationProvider.GetOrganizationId(ClaimsPrincipal principal)
at Microsoft.Crm.Authentication.Claims.AuthenticationProvider.Authenticate(HttpApplication application)
at Microsoft.Crm.Authentication.AuthenticationStep.Authenticate(HttpApplication application)
at Microsoft.Crm.Authentication.AuthenticationPipeline.Authenticate(HttpApplication application)
at Microsoft.Crm.Authentication.AuthenticationEngine.Execute(Object sender, EventArgs e)
at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

But the user was indeed there from before, and all I did was reactivate a long-dormant user. My initial suspicion was that this had something to do with AD. I ran the following query against the MSCRM_CONFIG database to check the SID of the user:

select c.FriendlyName,b.FullName,d.AuthInfo,b.DomainName
from SystemUserOrganizations a
inner join [ORG_MSCRM].dbo.SystemUser b on a.CrmUserId=b.SystemUserId
inner join Organization c on c.Id=a.OrganizationId
inner join SystemUserAuthentication d on d.UserId=a.UserId
where b.DomainName='CONTOSO\powerm'

This gave me the SID the user had when they were first created in CRM. I next ran this command to get the current SID of the user:

wmic useraccount where name='powerm' get sid
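The output is just a SID column with a single value, something along these lines (value illustrative):

SID
S-1-5-21-1004336348-1177238915-682003330-1013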

The SID did not match the one already stored in CRM, so I had to update it in the MSCRM_CONFIG database:

update d
set d.AuthInfo='W:[SID OUTPUT FROM THE DOS COMMAND PROMPT]'
from SystemUserOrganizations a
inner join [ORG_MSCRM].dbo.SystemUser b on a.CrmUserId=b.SystemUserId
inner join Organization c on c.Id=a.OrganizationId
inner join SystemUserAuthentication d on d.UserId=a.UserId
where b.DomainName='CONTOSO\powerm'

Once I updated this, the user was able to log in with the same AD account.