Triggering Flow from NFC cards

I recently came back from Sydney, and I now have four new Opal cards. The Opal card is used for public transport across Sydney, and these cards use NFC technology. Since I no longer have any use for them in Melbourne, I wanted to do something productive with these cards. I also wanted to try a low-cost alternative for triggering Flow from hardware that is not Flic (Opal cards are free).

Reading NFC cards is not native functionality in Flow, so I decided to use something that is capable of reading an NFC card and can also call an HTTP endpoint. Since I am using Android, there is an app that meets this need perfectly: Automate. This app has been around for a while, and you can develop Automate Flows that use the native hardware capabilities of Android.

Here is how my Flow looks in Automate.

Automate Flow

There are three important blocks to get this Automate Flow working:

  1. Read NFC Tag
  2. HTTP Request
  3. Flow Start

The Read NFC Tag block is used to read the NFC card. I map the NFC tag ID to a variable called “TagId”. You can use the “Read tag” button in this block to identify the NFC tag ID, and then use it in the switch/case statements in Microsoft Flow.

Read NFC

 

The next step is the one where I call Microsoft Flow, which is triggered by an HTTP request.

HTTP Request

The same Automate Flow has to be started again after Microsoft Flow is called via the HTTP Request block, so that a new fiber is spawned to continue reading NFC cards.

Start Flow

This is the Microsoft Flow that is triggered by HTTP request.

NFC HTTP Flow

The HTTP trigger accepts the tagId as a URL parameter of the HTTP request.

HTTP Trigger

Based on which tag has been scanned, I can then perform the appropriate action. I use a Switch statement for this purpose.

Flow Switch.png
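The Switch needs an input to switch on. Assuming the tag ID arrives as a query string parameter called tagId (as in my trigger above; the parameter name is just the one I use here), the Switch input can be an expression along these lines:

triggerOutputs()['queries']['tagId']

Each case branch then matches one of the known tag IDs captured earlier with the “Read tag” button.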

Here is a quick demo of Automate and Flow working together in harmony.

Automate Flow.gif

Here are some things that are now possible with the NFC capability:

  1. Deploy a solution from DEV to TEST, using an Azure Function, PowerShell and the Xrm.Data.PowerShell module. I experimented with this and it works nicely, even though PowerShell support in Azure Functions is only experimental.
  2. Call a runbook in Azure Automation using an HTTP webhook.

I hope this is useful in scenarios where you need alternate ways to trigger Microsoft Flow.


Quick Tip: CDS Base URL in Flow

I have a Flow that sends out an email at 8 a.m. every day listing the solutions that were imported in the past 24 hours. In the initial version of the Flow, the email was missing a link to the actual solution. After making some changes, the email now includes a clickable link to the solution that was imported.

Flow Email.png

As you can see, there is a link in the last column. This link is not hardcoded; the base URL changes based on the CDS environment the Flow is deployed to. The trick is to grab this URL from the “List records” action, which includes the full URL to each record in the result. You just need the first record in the result set to use in the expression in the next step.

RetrieveMultiple.png

As you can see, the @odata.id key contains the full URL to the record, from which you just need the base URL. Once you grab the base URL, you can easily compose the full URLs for the areas you want the link to navigate to, e.g. open a record, open a solution, create a new record, open a list etc.

Below is the expression that I use to get the base URL only.

first(split(first(body('[ACTION_NAME]')?['value'])?['@odata.id'],'/api/'))

If you assign the result of this expression to a variable, you can then use that variable anywhere you need the base URL.

Variable.png
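To make this concrete, suppose the first record's @odata.id is the hypothetical URL below. Splitting the string on '/api/' produces a two-element array, and first() takes the part before it:

@odata.id:        https://contoso.crm6.dynamics.com/api/data/v9.1/solutions(00000000-0000-0000-0000-000000000000)
split on '/api/': ["https://contoso.crm6.dynamics.com", "data/v9.1/solutions(00000000-0000-0000-0000-000000000000)"]
first(...):       https://contoso.crm6.dynamics.com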
This technique is quite useful if you are sending emails with clickable links that navigate to a CDS record or area.

 

Flowception: Creating a solution-enabled Flow with Flow

Solutions are a feature that has been used in the CRM space for a long time. Support for solutions in Flow was announced last year -> https://flow.microsoft.com/en-us/blog/solutions-in-microsoft-flow/. One thing the post does not mention is that Flows have to be created from the solution record. If you have an existing Flow that you want to package up into a solution, you cannot do that. To work around this limitation, I have created a Flow that clones an existing Flow and makes it solution-enabled.

The Flow itself is not that long. Here is how it looks.

The Flow definition fits on a readable screenshot!

The first step is to select the existing Flow that you want to clone into a solution-enabled Flow. This can be done using the Flow Management connector’s “Get Flow” action.

Solution-enabled Flows, like solution-enabled canvas apps, are also stored in the CDS database. The entity used to store the Flow is called Process (logical name: workflow). It stores both the Flow definition and the connection references.

LINQ output

However, the connection references are stored differently between CDS and Flow. Here is how the connections are stored in Flow.

[{
	"connectionName": "shared-flowmanagemen-f22a175e-d99e-4e41-8404-f6823b2d4d5e",
	"displayName": "Flow management",
	"id": "/providers/Microsoft.PowerApps/apis/shared_flowmanagement"
}, {
	"connectionName": "46c0ebf24ba6458f9a582abde1185b12",
	"displayName": "Common Data Service",
	"id": "/providers/Microsoft.PowerApps/apis/shared_commondataservice"
}]

Here is how the connections are stored inside CDS workflow.

{
	"shared_flowmanagement": {
		"connectionName": "shared-flowmanagemen-f22a175e-d99e-4e41-8404-f6823b2d4d5e",
		"source": "Invoker",
		"id": "/providers/Microsoft.PowerApps/apis/shared_flowmanagement",
		"tier": "NotSpecified"
	},
	"shared_commondataservice": {
		"connectionName": "46c0ebf24ba6458f9a582abde1185b12",
		"source": "Invoker",
		"id": "/providers/Microsoft.PowerApps/apis/shared_commondataservice",
		"tier": "NotSpecified"
	}
}

As you can see, one is an array and the other is an object. This means the connection JSON has to be reshaped when we create the Modern Flow Process record directly in CDS. We will use a Select action to reshape the data, and then do a replace to clean up the JSON.

Flow Connection JSON.png

Connection References
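I won't walk through every property of these two actions, but conceptually the reshaping works like this (a sketch; the action name 'Select' and the exact expressions are illustrative, not necessarily what my Flow uses verbatim). The Select maps each connection in the array to a single-key object, keyed by the last segment of the connector id:

json(concat('{"', last(split(item()?['id'], '/')), '": {"connectionName": "', item()?['connectionName'], '", "source": "Invoker", "id": "', item()?['id'], '", "tier": "NotSpecified"}}'))

The cleanup step then strips the array brackets and merges the single-key objects into one object:

replace(substring(string(body('Select')), 1, sub(length(string(body('Select'))), 2)), '}},{"', '},"')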

The action below is the one that creates the solution-enabled Flow. You create the “Process” record using the CDS connection, and populate the Flow JSON in the “Client Data” field.

Create Workflow.png

This is the formula I use in the concat.

concat
(
	'{"schemaVersion":"1.0.0.0","properties": { "definition": ', 
	body('Get_Flow')['properties']['definition'],
	', "connectionReferences": ', 
	variables('connectionReference'), '}}'
)

This creates the JSON that is accepted for the “Modern Flow” process record.
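In other words, the “Client Data” value ends up with this overall shape (abridged):

{
	"schemaVersion": "1.0.0.0",
	"properties": {
		"definition": { ...the definition object from Get Flow... },
		"connectionReferences": { ...the reshaped object from the previous step... }
	}
}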

In the last step we activate (start) the newly created Flow.

Enable Flow.png

The Flow’s GUID is stored in a field called “workflowidunique” on the Process entity, so we can use it to retrieve the Flow and activate it.
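The Flow does this with the CDS connector’s List/Update actions, but expressed as raw Web API calls the idea is roughly this (a sketch; the version segment and placeholders are illustrative). Activation is just a state change on the workflow record, where statecode 1 means Activated and statuscode 2 is the corresponding Activated status:

GET /api/data/v9.1/workflows?$filter=workflowidunique eq {flow guid from Get Flow}

PATCH /api/data/v9.1/workflows({workflowid from the query above})
{
	"statecode": 1,
	"statuscode": 2
}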

The crazy part of this Flow is that I was able to run the Flow on itself and add it to the solution, hence the title of the post.

Flow run.png

You can now add the Flow to the solution from web.powerapps.com.

Solution.png

The newly created Flow will have the same name as the Flow you specified in the first Get Flow step, prefixed with “Solution: ”.

Add Flow.png

The solution with the Flow can now be exported and imported into a new CDS environment. I hope this helps you package some of your old Flows into a solution. This Flow could be further improved by listing all the Flows in the environment and cloning each of them, rather than specifying a single Flow at design time.

You can download the Flow from https://1drv.ms/u/s!AvzjERKFC6gOwWC7Ywi5fgguPW1s

If you have any comments/feedback, please share them on the post or tweet me @rajyraman.

CDS, Microsoft Flow and DateTime formats

EDIT (7/12): You can download the Flow from https://1drv.ms/u/s!AvzjERKFC6gOwCqKnnAvMZA6mDsW if you just want to jump straight into exploring the actual Flow, instead of reading the article.

DateTime and timezones seem to be the flavour of the month, so I thought I would take a crack at the problem as well. Coming from a CRM background, and having spent a lot of time fiddling with the internal schema to understand the plumbing, I came up with the following approach, which I believe is decoupled and flexible.

In CDS, there is an entity called “UserSettings”, which stores a whole bunch of information about the user’s preferences. If you have used the awesome XrmToolBox tool called “UserSettings Utility”, you would remember this screen.

Timezone.png

Formats

The “UserSettings” entity is the source of this information. So, why not use the same for managing the timezones and datetime formats?

For re-usability, I want to create a Flow that just returns the timezone information for the executing user. Here is how it looks.

Get Timezone and Formats.png

Let us understand this step by step.

Step 1:

This is the HTTP trigger, which can take in a parameter for activedirectoryguid. This is the unique identifier for the Office 365 user.

Step 1.png
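The request body schema on this trigger is minimal; something like the following is enough (a sketch):

{
	"type": "object",
	"properties": {
		"activedirectoryguid": {
			"type": "string"
		}
	}
}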

Step 2:

Get the Office 365 profile of the current user. If the activedirectoryguid is not passed in step 1, the intention is to use the current user’s Office 365 information.

Step 2.png

Step 3 & 4:

Initialise the activedirectoryguid variable, then query the CDS entity “Users” and retrieve the user that matches the activedirectoryguid.

Step 3 and 4.png

Since activedirectoryguid can also be passed on the trigger, I use the following coalesce expression to set the variable to either the value passed on the trigger or the id from the Office 365 profile in step 2. This is the expression.

coalesce(triggerBody()?['activedirectoryguid'],body('Get_my_profile_(V2)')?['id'])

Step 5 & 6:

Retrieve the “User Settings” record and the associated “Time Zone Definitions” record based on the user’s timezone.

Step 5 and 6.png

The filter for User Settings is

first(body('Get_System_Users')?['value'])['systemuserid']

The filter for Time Zone Definitions is

first(body('Get_User_Settings')?['value'])['timezonecode']
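Plugged into the “Filter Query” of the respective List records actions, those two expressions look roughly like this (assuming the action names shown above):

Get User Settings:  systemuserid eq @{first(body('Get_System_Users')?['value'])['systemuserid']}
Get User Timezone:  timezonecode eq @{first(body('Get_User_Settings')?['value'])['timezonecode']}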

Step 7:

This is the last step, where we return all the format and timezone information.

Step 7

Below are the expressions for the returned properties:

timezone

first(body('Get_User_Timezone')?['value'])['standardname']

dateformat

replace(first(body('Get_User_Settings')?['value'])['dateformatstring'],'/',first(body('Get_User_Settings')?['value'])['dateseparator'])

timeformat

replace(first(body('Get_User_Settings')?['value'])['timeformatstring'],':',first(body('Get_User_Settings')?['value'])['timeseparator'])

dateseparator

first(body('Get_User_Settings')?['value'])['dateseparator']

timeseparator

first(body('Get_User_Settings')?['value'])['timeseparator']
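Putting it all together, the Response action returns a small JSON object along these lines (the values are illustrative):

{
	"timezone": "AUS Eastern Standard Time",
	"dateformat": "d/MM/yyyy",
	"timeformat": "h:mm tt",
	"dateseparator": "/",
	"timeseparator": ":"
}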

Now let us look at some sample output. For a user located in the US, here is what the output of the Flow looks like.

US Output

Compare this with someone who is in an Australian timezone.

Australia Output

Since this Flow now contains the logic for getting the formats and timezones, it can be utilised by any other Flow that needs this information. For example, look at the sample Flow below.

Calling Flow

Calling Flow1

Below is the expression I use to convert the “createdon” returned by the CDS “Get record” step, which returns the datetime in UTC.

convertFromUtc(body('Get_record')?['createdon'],body('Parse_JSON')?['timezone'],concat(body('Parse_JSON')?['dateformat'],' ',body('Parse_JSON')?['timeformat']))

If you compare this Flow’s output with the record properties from Dynamics 365, it becomes obvious that the datetime and format are correctly displayed. Note that the CDS connector returns the datetime in UTC.

Output

CRM Record Properties

Initially I wanted to call the Flow that returns the formats and timezone using the “Start Flow” action of the Flow connector, but it does not seem to pick up the response, so I had to resort to the whole HTTP action, which is not ideal.

It seems “Start Flow” simply enables the Flow; it does not actually run it. I am not sure why it is named in such a misleading way.

Hope this is helpful.


Using Flow to notify solution imports

EDIT (03/04/2019): I made further changes to display the solution URL, so that you can click and find out the details about the solution that was imported.

EDIT (20/12/2018): I updated my Flow and made some improvements. You simply have to set the Timezone on the triggering action and you are all set. The Flow will email the solution list, to the user running the Flow. The download link has been updated to point to the updated solution.

I would not call it sneaky, but sometimes when I find the Dynamics 365 CE UI or behaviour has changed slightly, I can attribute it to some update that was applied to the environment. There are email notifications for major updates, but none for minor updates or patches that can happen frequently. So, I decided to solve this problem using Flow.

Every solution import into the system causes an Import Job record to be created. If a Flow can be scheduled to run every day and query the Import Job records that were started the previous day, we can easily keep track of what is happening in the environment.

The first step is to trigger the Flow on a preset schedule, and read the Import Job records.

Import Job Flow.png

Below are the expressions that I use for the boundary dates (the filter sketch after the list shows where they go):

  • addDays(convertFromUtc(utcNow(),'AUS Eastern Standard Time'), -1, 'yyyy-MM-dd')
  • convertFromUtc(utcNow(),'AUS Eastern Standard Time','yyyy-MM-dd')
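Plugged into the List records action, the resulting OData filter looks something like the line below. This is a sketch: it assumes the two expressions above live in Compose actions named 'Yesterday' and 'Today', and that the filter targets the Import Job's startedon field:

startedon ge @{outputs('Yesterday')} and startedon lt @{outputs('Today')}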

The next step is to pick up only the information we need from the returned result, and project it into a form that works well in an email.

Import Job Compose

With the exception of Solution Name, all the other properties are retrieved from the XML in the Data property (an abridged sample of this XML follows the list). Below are the formulas for those:

  1. Publisher: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//Publisher/Descriptions/Description[1]/@description)')
  2. Started On: convertFromUtc(items('Parse_Import_Job_XML_and_re-map')?['startedon'],'AUS Eastern Standard Time','dd-MM-yyyy HH:mm:ss')
  3. Old Version: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/fileVersion)')
  4. New Version: xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/currentVersion)')
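For context, the Data property holds the import manifest XML that those xpath expressions run over. Abridged and illustrative (the exact nesting may differ, which is why the expressions use //), it looks something like this:

<importexportxml>
	<solutionManifests>
		<solutionManifest>
			<Publisher>
				<Descriptions>
					<Description description="Default Publisher" languagecode="1033" />
				</Descriptions>
			</Publisher>
			<upgradeSolutionPackageInformation>
				<fileVersion>1.0.0.0</fileVersion>
				<currentVersion>1.0.1.0</currentVersion>
			</upgradeSolutionPackageInformation>
		</solutionManifest>
	</solutionManifests>
</importexportxml>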

The last few steps are to email out the results, if any solutions were imported the previous day.

Email Solution.png

Email Step.png

The result is a barely formatted table with the list of solutions that were imported the previous day.

Solution Import Email

With this approach, no one can sneak a solution import past you. You have visibility over what is going on in the system.

You can download and install the Flow solution using these links:

Unmanaged: https://1drv.ms/u/s!AvzjERKFC6gOwlMIAzgRSXWhtXth

Managed: https://1drv.ms/u/s!AvzjERKFC6gOwlKvIWhW-Pb_vHiT

Hope this is helpful to stay on top of imports.

Using Virtual Entities to query metadata

After my previous post, I continued to explore virtual entities to see what real-life problems I could solve with them. One problem I could think of was metadata. How awesome would it be if I could use Advanced Find to query entity and attribute metadata, or visualise them like a normal entity? It is not a dream anymore. I have developed an open-source solution that does exactly this.

Here are some of the sample queries:

Query all attributes of type customer

Query by Attribute Type

Query by Attribute Type Results

Query all mandatory attributes

Query All Required.png

Query All Required Results.png

Query by Attribute Type and Entity

Query by Attribute Type and Entity.png

Query by Attribute Type and Entity Results.png

Query all Many to Many intersect entities

Query MM Entity.png

Query MM Entity Results

Query entities that have quick create enabled

Query Quick Create

Quick Create Results

You can open the entity and see more details.

Entity Form

You can open the attribute and view more details as well.

Attribute Form

All this awesomeness is possible using the power of virtual entities. There are two virtual entities that you can query: Entity and Attribute.

VE Solution

You can download the source code and managed solution from https://github.com/rajyraman/Metadata-Virtual-Entity/releases.

This step is important

After importing the managed solution, change the data source for the Attribute entity from “Entity Datasource” to “Attribute Datasource”. You have to do this from the Customization area, not from inside the managed solution.

Customise.png

Change Datasource.png

This is because, by default, the system does not allow relationships between two virtual entities that have different data sources. The exception below is thrown when you try to do this.

Solution Exception

Stack Trace.png

To work around this exception, you keep the data source the same for the “Entity” (parent) and “Attribute” (child) virtual entities, create the relationship, and then change to the correct data source. This is why the managed solution ships with the data source set to “Entity Datasource” for the “Attribute” virtual entity, which has to be changed manually after importing the solution.

I hope this solution will be really useful for administrators. Please share any feedback in the comments or on GitHub issues.

Sourcecode: https://github.com/rajyraman/Metadata-Virtual-Entity

Managed Solution: https://github.com/rajyraman/Metadata-Virtual-Entity/releases/latest


Virtual Entities for tracking recently used items

Virtual entities are a powerful feature that can be used to bring in data not only from external sources, but also from inside Dynamics CRM/Dynamics 365 Customer Engagement.

Jason Lattimer already has a post (https://jlattimer.blogspot.com.au/2017/12/creating-custom-virtual-entity-data.html) that goes through how to set up the custom data source/data provider. Go and read that first, as it has all the screenshots; I would just be duplicating the content if I went through the steps again.

Gotcha 1:

There is an exception when you create the data source.

Datasource Creation Error

You can simply ignore this and refresh the Plugin Registration tool.

Gotcha 2: If you don’t want the user to open up an individual record, you don’t have to implement the Retrieve message; it is optional. Since I just wanted a collated entity, I did not register any plugin for Retrieve.

Data Provider

Gotcha 3: You have to open up the newly created Data Provider entity and enter the external name. If you don’t enter this, you will be unable to create the data source, as it will always error out.

Data Source Primary Key Attribute

Create New Data Source

Data Source List

Objective: Most Recently Used (MRU) items should be accessible from Advanced Find. As an administrator, I would like to query this data and see metrics around user participation, entity usage, activity by week/month/year etc.

This is the Most Recently Used area.

Recent Items

This is the Advanced Find on the virtual entity, which is driven by the same data.

Advanced Find Results

As you can see, the data matches up. All the heavy lifting is done by the plugin, which retrieves the records from the “UserEntityUISettings” entity, parses the XML, sorts them by user and accessed-on date, and then populates the virtual entity “Recent Items”.

You can query by “Type Equals”, “User Equals” and “User Equals Current User”.

Advanced Find

I can also build a Power BI report that is driven by the same virtual entity.

PowerBI Dashboard

Source code -> https://github.com/rajyraman/Recent-Items-Virtual-Entity

Managed Solution -> https://github.com/rajyraman/Recent-Items-Virtual-Entity/releases

I hope this helps people to use virtual entities to retrieve data from inside CRM as well – a sort of collation mechanism for reporting.

Reference:

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/virtual-entities/sample-generic-ve-plugin

https://jlattimer.blogspot.com.au/2017/12/creating-custom-virtual-entity-data.html