Building Exposure Bot using Google Actions, Azure Functions and Power Automate

Almost everyone now has a smartphone. The two dominant platforms are Google Android and Apple iOS. Both platforms have a virtual assistant built into the OS: Android's is called Google Assistant, and Apple's is called Siri. In this post, I will show you how you can build a custom action for Google Assistant. The equivalent of a custom action in Bot Framework is a Skill.

Here is the high level architecture diagram of the solution.

Architecture of virtual assistant solution

The starting point for building the virtual assistant solution is to create a new Actions project. Head over to https://console.actions.google.com/ to create one.

Google Actions projects screen

The first thing you need to do is name your bot. I have named mine "Exposure Bot". This means that the user can trigger the bot by saying "Talk to exposure bot" to the Google Assistant.

Settings screen

We then need to create one or more scenes to handle the user’s input. After the user has triggered the bot using the phrase “Talk to exposure bot”, she is then transferred to a scene called triggerExposure_scene.

Transitioning from main scene to triggerExposure_scene

We can now design the main scene that handles multiple inputs. This scene is called triggerExposure_scene.

trigger exposure scene

In this scene we use three intents to decide what to do based on the user's input. These are the three intents in the Actions project:

  1. Suburb_intent – When the user says something like "Show me the numbers for Melbourne", it is mapped to this intent. A Flow (webhook) is called to get the response for this intent.
  2. Notifications_intent – When the user says something like "Send me daily notifications", it is mapped to this intent. The user is transitioned to a scene called subscription_scene so that we can get accurate GPS information for the subscription and the user's permission to send push notifications.
  3. IncomingPushNotifications_Intent – A scheduled Flow sends exposure notifications three times a day to the user. When the user clicks on that notification, this intent is invoked. But, as you can see, when that happens the conversation immediately ends. The user should never hit this intent in this scene.

Now let us look at intents. An intent captures what the user intends to do, based on training phrases. For example, below are the training phrases for Suburb_intent. Notice that you can map the types (Entity in PVA) right inside the training phrase itself.

List of intents

For figuring out whether the user wants to receive push notifications, we have a different set of training phrases.

Push Notifications Intent

The suburb type itself is just free text.

Suburb type

All the above scenes and intents are used when the user is chatting with the bot. But, in the case of push notifications, the IncomingPushNotifications_Intent will be invoked only when the user clicks on the notification in Android. Here is how that scene looks. Notice that it is a global intent, which means it can be invoked even in the middle of a conversation.

Push Notifications scene

When the user clicks on the push notification, she is transferred to the pushNotification_scene, which grabs the current location using Android's native location API and calls the Flow to get the current exposure data.

Push Notification scene

This is how suburb_intent also gets the current exposure data.

Suburb intent webhook

We have a similar webhook call for setting up the push notification subscription. Notice how we are grabbing the location and notification permissions and then calling the Flow with the message name.

Push notification subscription scene

We are calling webhooks in a couple of places. But where do we define the Flow that is called? It is in the webhook area. The key to call the Flow is on the URL itself. I am not aware of any alternative method to transparently use OAuth in Google Actions. If you know a better way to handle authentication in this context, please add a comment on the post.

Webhook setup
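
For context, the URL of an HTTP-triggered Flow already embeds its access key in the sig query parameter, so pasting the full URL into the webhook configuration is what authorises the call. It looks roughly like this (host, workflow id and signature are placeholders):

https://prod-00.australiasoutheast.logic.azure.com/workflows/{workflow-id}/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig={signature}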

There are two more things you need to make sure of:

  • Actions API is enabled for the project
  • Service Account is created for the project
Service Account

Both enabling the Actions API and creating a service account for the project are documented at https://codelabs.developers.google.com/codelabs/actions-user-engagement/#3, which you can refer to for additional instructions. After you download the JSON for the service account, copy only the key and keep it aside. We will need it for adding to the Key Vault. Don't include the -----BEGIN PRIVATE KEY----- and -----END PRIVATE KEY----- parts. Copy just the key, as highlighted.

Service Account Key

The next part is setting up the Azure resources: Function App, Storage Account, App Insights and Key Vault. Head over to https://github.com/rajyraman/Google-Actions-Push-Notifications, which has a Deploy to Azure button. Click the button and fill in the Google Account Email, Google Secret and your Azure User Id.

Deploy Azure Resources

You can click on the visualise button to see what resources will be deployed.

Both the Google Account Email and Google Secret are in the Service Account JSON file which you would have downloaded. In order to get the User Id, run the following command in the Az CLI.

 az ad signed-in-user show --query objectId --output tsv
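
For reference, both Google values come from the downloaded service account JSON: presumably client_email for the Google Account Email and private_key for the Google Secret. The relevant part of the file looks roughly like this (values truncated, project name is a placeholder):

{
  "client_email": "exposure-bot@your-project.iam.gserviceaccount.com",
  "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEv...\n-----END PRIVATE KEY-----\n"
}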

After the resources are provisioned, you need to deploy the Function code to Azure by running this command.

func azure functionapp publish FUNCTIONAPPNAME

If you want to deploy from the command line and have Bicep and the Az CLI installed, I have also included a PowerShell script that you can use to deploy from your local machine.

Deployment Script

Since the Function App has OpenAPI enabled, you can also access the Swagger UI from the browser to quickly understand the Function App's endpoints.

Swagger UI

In production it is not a good idea to expose this to everyone, as the documentation might contain confidential data, so you can lock the Swagger UI down further or remove it altogether. Refer to https://github.com/Azure/azure-functions-openapi-extension/blob/main/docs/openapi.md#configuration for additional information.

The Function App uses a Durable Entity that is persisted in Azure Storage. Since we just need to store the Google User Id and the location for which exposure notifications are required, it is a good fit for a small application like this one.

I have also uploaded all the Google Actions code to the repo, so in order to deploy this to Google Actions, you can use the gactions CLI.

Google Actions source code

The commands to run are:

gactions login
gactions push

Before deploying, you need to update the webhook URL to the URL of the Get Exposure Data Flow. This Flow gets the COVID-19 exposure data provided by the Data.VIC Victorian Exposure Sites API.

Update Flow URL for webhook

You can download the Managed Solution with Custom Connectors and Flows from https://1drv.ms/u/s!AvzjERKFC6gOyAZf2Mw1f2QkgYft?e=HL1bnV. Here is how the solution looks.

Flow and Custom Connector Solution

After importing, you also have to change the base URL of the Google Assistant custom connector to point to the correct base URL of your Function App. This is the Function App that manages the subscriptions for locations, Google API authentication and sending push notifications to the user.

After all of this is set up, you will be able to chat with the bot from your Google Assistant.

Push notifications from exposure bot
Google Assistant – Exposure Bot

All Actions messages are captured in the logs, so you can always refer to them if you are stuck.

Google Cloud Logs

You can refer to the source code at https://github.com/rajyraman/Google-Actions-Push-Notifications

Credit: flaticon.com for the icons used in custom connector

References:

  1. Push Notifications on Google Actions
  2. gactions CLI
  3. Discover Azure Functions: OpenAPI and Power Apps
  4. Conversational Actions

Resubmitting failed Logic Apps using Power Automate

Even though each Logic App action has a retry policy, after a certain number of retries, the Logic App engine gives up and fails the whole execution. In these scenarios the Logic App needs to be re-run. You can use Power Automate to handle this scenario. You can download the Power Automate solution from https://1drv.ms/u/s!AvzjERKFC6gOx3-NptgmlSyD4FVu?e=vwxyAE

After you import the solution, you need to set the value for the environment variable: Azure Subscription Id to your Azure Subscription where the Logic Apps are.

Subscription Id environment variable

You need to have an HTTP with Azure AD connection that can be mapped to the HTTP with Azure AD Connection Reference in the solution. Both the Base Resource URL and Azure AD Resource URL should be: https://management.azure.com/

HTTP with Azure AD connection to Azure RM API

You also need to have an Azure Resource Manager connection that can be mapped to the Azure Resource Manager Connection Reference.

The Flow uses an HTTP trigger, so it is manual at the moment. But you can easily modify it to use a Schedule trigger.

This is the key part where the Flow resubmits failed executions.

Re-running all failed Logic App executions
Segment to resubmit failed executions
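
Under the hood, the HTTP with Azure AD actions call the Azure Resource Manager REST API. A minimal sketch of the two calls involved, assuming the standard 2016-06-01 Logic Apps API version (segments in braces are placeholders): the first lists the failed runs of a Logic App, and the second resubmits a specific run via its trigger history (the history name matches the run name).

GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.Logic/workflows/{workflowName}/runs?api-version=2016-06-01&$filter=status eq 'Failed'

POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.Logic/workflows/{workflowName}/triggers/{triggerName}/histories/{runName}/resubmit?api-version=2016-06-01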

The Flow returns JSON with both the old runId and the new runId of the resubmitted Logic App.

Flow response with Logic App execution detail
Output from Flow

This Flow can be made even smarter by persisting this JSON into Cosmos DB or Table Storage, so that you can stop retrying Logic Apps that have already failed more than two times. This way you don't keep retrying executions that will fail due to data validation errors.

Acknowledgements:

  1. Pieter Veenstra – Pieter’s method – https://sharepains.com/2019/07/09/compose-apply-to-each-power-automate/
  2. Pieter Veenstra – Unnest nested arrays – https://sharepains.com/2021/02/10/unnest-nested-arrays-in-power-automate/

Paging while using FetchXML in Dataverse Connector

If you want to retrieve more than 5,000 records in a Flow using the List Rows action from Dataverse, you need to page through the records. Flow does not automatically do this for you. This is not a new topic. It has already been explored by Linn and Debajit. You can read their posts below:

  1. Linn – https://linnzawwin.blogspot.com/2021/01/retrieve-more-than-100000-dataverse.html
  2. Debajit – https://debajmecrm.com/how-to-query-more-than-5k-cds-records-using-fetchxml-in-powerautomate-microsoft-flow/

When I was looking into this same problem, I did two things differently:

  1. Using iterationIndexes for paging
  2. Using the xml function for encoding the paging cookie, instead of manually encoding "<" to "&lt;", ">" to "&gt;" etc.

Here is the Flow

I use iterationIndexes like this for the page for each iteration.
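
Since iterationIndexes is zero-based, the page number for the current iteration can be derived with an expression along these lines (assuming the Do until loop is named Do_until, as in the expression further below) and plugged into the page attribute of the FetchXML:

add(iterationIndexes('Do_until'), 1)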

The paging cookie returned in the first page looks like this

<cookie page="1"><systemuserid last="{7A44238F-9894-EB11-B1AC-002248153EDE}" first="{5C4D8EE8-62FF-E911-A811-000D3A799417}" /></cookie>

I then use the expression below to sanitise the characters so that I can use the cookie in the subsequent page. It looks a bit clunky and verbose, but the key thing here is that when the xml function is used to encode the XML, it also sanitises it. encodingJSON is a temporary object used to store the paging cookie XML. This JSON is what gets converted to XML and then cleaned up with split and substring.

if(equals(iterationIndexes('Do_until'),0),
	'',
	concat(
		'paging-cookie=''', 
		substring(
			first(
				skip(
					split(
						string(xml(setProperty(variables('encodingJSON'),'x',variables('pagingCookieCleansed'))))
					,'<')
				,1)
			)
		,2),
	'''')
)

Here is the Flow running through all pages.

You can download this sample Flow from https://1drv.ms/u/s!AvzjERKFC6gOx3ZbdK8YOBYat0CV?e=M41ONd

Triggering Flow from NFC cards

I recently came back from Sydney and I now have 4 new Opal cards. The Opal card is used in public transport across Sydney and these cards use NFC technology. Since I no longer have any use for these cards in Melbourne, I wanted to do something productive with them. I also wanted to try a low-cost alternative for triggering Flow from hardware that is not flic (Opal cards are free).

Reading an NFC card is not a native capability of Flow, so I decided to use something that can read an NFC card and also call an HTTP endpoint. Since I am using Android, there is an app that meets this need perfectly: Automate. This app has been around for a while, and you can develop Automate Flows that use the native hardware capabilities of Android.

Here is how my Flow looks in Automate.

Automate Flow

There are three important blocks needed to get this Automate Flow working:

  1. Read NFC Tag
  2. HTTP Request
  3. Flow Start

The Read NFC Tag block is used to read the NFC card. I map the NFC tag id to a variable called "TagId". You can use the "Read tag" button in this block to identify the NFC tag id and then use it in the switch/case statements in Microsoft Flow.

Read NFC

 

The next block is the one where I call Microsoft Flow, which is triggered by an HTTP request.

HTTP Request

The same Automate Flow has to be triggered again after Microsoft Flow is called using HTTP Request, so that a new fiber is started to continue reading the NFC card.

Start Flow

This is the Microsoft Flow that is triggered by HTTP request.

NFC HTTP Flow

The HTTP trigger accepts the tagId as a URL parameter of the HTTP request.

HTTP Trigger
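
Inside the Flow, the tag id can then be read from the trigger's query parameters with an expression roughly like this (assuming the parameter is named tagId):

triggerOutputs()['queries']['tagId']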

Based on what tag has been scanned, I can then perform the appropriate action. I use the switch statement for this purpose.

Flow Switch.png

Here is a quick demo of Automate and Flow working together in harmony.

Automate Flow.gif

Here are some things that are now possible with the NFC capability:

  1. Deploy solutions from DEV to TEST, using Azure Functions, PowerShell and the Xrm.Data.PowerShell module. I experimented with this and it works nicely, even though PowerShell support in Azure Functions is only experimental
  2. Call a Runbook in Azure Automation using an HTTP webhook

I hope this is useful in scenarios where you need alternate ways to trigger Microsoft Flow.

 

 

Quick Tip: CDS Base URL in Flow

I have a Flow that sends out an email at 8 a.m. every day listing the solutions that were imported in the past 24 hours. In the initial version of the Flow, the email was missing a link to the actual solution. But, after making some changes, the email now includes a clickable link to the solution that was imported.

Flow Email.png

As you can see, there is a link in the last column. This link is not hardcoded. The base URL changes based on the CDS environment the Flow is deployed to. The trick is to grab this URL from the "List records" action, which includes the full URL to each record in the result. You just need the first record in the result set to use in the expression in the next step.

RetrieveMultiple.png

As you can see the @odata.id key contains the full URL to the record, from which you just need the base URL. Once you grab the base URL, you can easily compose the full URLs to the areas that you want the link to navigate to e.g. open the record, open solution, new record, open list etc.

Below is the expression that I use to get the base URL only.

first(split(first(body('[ACTION_NAME]')?['value'])?['@odata.id'],'/api/'))

If you assign this expression to a variable, the variable will hold the base URL.

Variable.png
This technique is quite useful if you are sending emails with clickable links that navigate to a CDS record or area.
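
For example, a clickable link that opens a specific record can then be composed along these lines (the entity and the action/field names here are only illustrative):

concat(variables('BaseUrl'), '/main.aspx?pagetype=entityrecord&etn=solution&id=', items('Apply_to_each')?['solutionid'])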

 

Excel to JSON using Flow

I was reading through Scott's recent post -> https://www.hanselman.com/blog/ConvertingAnExcelWorksheetIntoAJSONDocumentWithCAndNETCoreAndExcelDataReader.aspx and I was thinking that this could be done in a codeless way using Microsoft Flow. So, I tried it in Flow, and it was really easy – under 15 minutes, as all the plumbing is already there. This is my Flow.

Excel to JSON Flow.png

The Flow is triggered by an HTTP GET and reads from an Excel file stored in OneDrive.

Trigger and Source.png

The output is already JSON, but probably not an ideal one, as it has spaces and special characters in the key names, which map to the column names of the table.

Excel JSON.png

So, we can simply re-shape the data using the Select action, and return the JSON response.

Select and Response.png
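
The Select mapping itself just renames the Excel columns into JSON-friendly keys, roughly like this (the action and column names are only examples):

From: body('List_rows_present_in_a_table')?['value']
Map:
  productName : item()?['Product Name']
  unitPrice   : item()?['Unit Price']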

Now, when I call my trigger URL, the Flow executes and returns me the JSON.

Response JSON.png

With Flow add-on for Excel, you could run the same Flow from Excel itself, if you want to.

Hope you find this useful.

 

Flowception: Creating solution enabled Flow with Flow

Solutions are a feature that has been used in the CRM space for a long time. Solution support for Flow was announced last year -> https://flow.microsoft.com/en-us/blog/solutions-in-microsoft-flow/. One thing the post does not mention is that Flows have to be created from the solution record. If you have an existing Flow that you want to package up into a solution, you cannot do that. To work around this limitation, I have created a Flow to clone an existing Flow and make it solution enabled.

The Flow itself is not that long. Here is how it looks.

The Flow definition fits on a readable screenshot!

The first step is to select the existing Flow that you want to clone into a solution enabled Flow. This can be done using the Flow Management connector's "Get Flow" action.

Solution enabled Flows, like solution enabled canvas apps, are stored in the CDS database. The entity used to store the Flow is called Process (logical name: workflow). It stores both the Flow definition and the connection references.

LINQ output

However, the connection references are stored differently between CDS and Flow. Here is how the connections are stored in Flow.

[{
	"connectionName": "shared-flowmanagemen-f22a175e-d99e-4e41-8404-f6823b2d4d5e",
	"displayName": "Flow management",
	"id": "/providers/Microsoft.PowerApps/apis/shared_flowmanagement"
}, {
	"connectionName": "46c0ebf24ba6458f9a582abde1185b12",
	"displayName": "Common Data Service",
	"id": "/providers/Microsoft.PowerApps/apis/shared_commondataservice"
}]

Here is how the connections are stored inside the CDS workflow record.

{
	"shared_flowmanagement": {
		"connectionName": "shared-flowmanagemen-f22a175e-d99e-4e41-8404-f6823b2d4d5e",
		"source": "Invoker",
		"id": "/providers/Microsoft.PowerApps/apis/shared_flowmanagement",
		"tier": "NotSpecified"
	},
	"shared_commondataservice": {
		"connectionName": "46c0ebf24ba6458f9a582abde1185b12",
		"source": "Invoker",
		"id": "/providers/Microsoft.PowerApps/apis/shared_commondataservice",
		"tier": "NotSpecified"
	}
}

As you can see, one is an array and the other is an object. This means the connection JSON has to be reshaped when we create the Modern Flow Process record directly in CDS. We will use the Select action to reshape the data, and then do a replace to clean up the JSON.

Flow Connection JSON.png

Connection References

The action below is the one that creates the solution enabled Flow. You create the "Process" record using the CDS connection, and populate the Flow JSON in the "Client Data" field.

Create Workflow.png

This is the formula I use in the concat.

concat
(
	'{"schemaVersion":"1.0.0.0","properties": { "definition": ', 
	body('Get_Flow')['properties']['definition'],
	', "connectionReferences": ', 
	variables('connectionReference'), '}}'
)

This creates the JSON that is accepted for the “Modern Flow” process record.

In the last step we activate (start) the newly created Flow.

Enable Flow.png

The Flow's GUID is stored in a field called "workflowidunique" on the Process entity. So, we can use this to retrieve the Flow and activate it.
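
A sketch of that last step, assuming a List records call filtered on the unique id, followed by an update of the state columns (statecode 1 / statuscode 2 is the Activated state on the workflow entity):

Filter Query: workflowidunique eq '<GUID of the newly created Flow>'
Update:       statecode = 1, statuscode = 2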

The crazy part of this Flow is that I was able to run the Flow on itself and add it to the solution, hence the title of the post.

Flow run.png

You can now add the Flow into the Solution, from web.powerapps.com

Solution.png

The newly created Flow will have the same name as the Flow you specified in the first Get Flow step, prefixed with "Solution: ".

Add Flow.png

The solution with the Flow can now be exported and imported into a new CDS environment. I hope this helps you to package some of your old Flows into a solution. This Flow could be further improved by listing all the Flows in the environment and cloning each of them, rather than specifying a specific Flow at design time.

You can download the Flow from https://1drv.ms/u/s!AvzjERKFC6gOwWC7Ywi5fgguPW1s

If you have any comments/feedback, please share them on the post or tweet me @rajyraman.

CDS, Microsoft Flow and DateTime formats

EDIT (7/12): You can download the Flow from https://1drv.ms/u/s!AvzjERKFC6gOwCqKnnAvMZA6mDsW if you just want to jump straight into exploring the actual Flow, instead of reading the article.

DateTime and timezones seem to be the flavor of the month, so I thought I would take a crack at the problem as well. Coming from a CRM background and having spent a lot of time fiddling around with the internal schema to understand the plumbing, I came up with the following approach, which I believe is decoupled and flexible.

In CDS, there is an entity called "UserSettings", which stores a whole bunch of information regarding user preferences. If you have used the awesome XrmToolBox tool called "UserSettings Utility", you would remember this screen.

Timezone.png

Formats

The “UserSettings” entity is the source of this information. So, why not use the same for managing the timezones and datetime formats?

For re-usability, I want to create a Flow that just returns me the timezone information based on the executing user. Here is how it looks.

Get Timezone and Formats.png

Let us understand this step by step.

Step 1:

This is the HTTP trigger, which can take in a parameter for activedirectoryguid. This is the unique identifier for the Office 365 user.

Step 1.png

Step 2:

Get the Office 365 profile of the current user. If the activedirectoryguid is not passed in step 1, the intention is to use the current user’s Office 365 information.

Step 2.png

Step 3 & 4:

Initialise the activedirectoryguid variable, query the CDS entity "Users", and retrieve the user that matches the activedirectoryguid.

Step 3 and 4.png

Since activedirectoryguid can also be passed on the trigger, I use the following coalesce expression to set the variable to either the value passed on the trigger or the id from the Office 365 profile in step 2.

coalesce(triggerBody()?['activedirectoryguid'],body('Get_my_profile_(V2)')?['id'])

Step 5 & 6:

Retrieve “User Settings” entity and the associated “Time Zone Definitions” record based on the user’s timezone.

Step 5 and 6.png

The filter for User Settings is

first(body('Get_System_Users')?['value'])['systemuserid']

The filter for Time Zone Definitions is

first(body('Get_User_Settings')?['value'])['timezonecode']

Step 7:

This is the last step, where we return all the format and timezone information.

Step 7

Below are the expressions for the returned properties:

timezone

first(body('Get_User_Timezone')?['value'])['standardname']

dateformat

replace(first(body('Get_User_Settings')?['value'])['dateformatstring'],'/',first(body('Get_User_Settings')?['value'])['dateseparator'])

timeformat

replace(first(body('Get_User_Settings')?['value'])['timeformatstring'],':',first(body('Get_User_Settings')?['value'])['timeseparator'])

dateseparator

first(body('Get_User_Settings')?['value'])['dateseparator']

timeseparator

first(body('Get_User_Settings')?['value'])['timeseparator']

Now let us look at some sample output. For a user located in the US, here is how the output of the Flow looks.

US Output

Compare this with someone who is in an Australian timezone.

Australia Output

Since this Flow now contains the logic for getting the formats and timezones, it can be utilised by any other Flow that needs this information. For example, look at the sample Flow below.

Calling Flow

Calling Flow1

Below is the expression I use to convert the "createdon" returned by the CDS Get Record step, which returns the datetime in UTC.

convertFromUtc(body('Get_record')?['createdon'],body('Parse_JSON')?['timezone'],concat(body('Parse_JSON')?['dateformat'],' ',body('Parse_JSON')?['timeformat']))

If you compare this Flow output with the record properties from Dynamics 365, it becomes obvious that the datetime and format are correctly displayed. Note that the CDS connector returns the datetime in UTC.

Output

CRM Record Properties

Initially I wanted to call the Flow that returns the formats and timezone using the "Start Flow" action on the Flow connector, but it doesn't seem to pick up the response, so I had to resort to the whole HTTP action, which is not ideal.

It seems "Start Flow" simply enables the Flow; it does not actually run it. I am not sure why it is named in such a misleading way.

Hope this is helpful.

 

 

Using Flow to notify solution imports

EDIT (19/08/2021): I have rewritten the Flow using the latest Dataverse connector. I have also slightly refactored the Flow, so it no longer looks like the Flow in this post's screenshots.

EDIT (03/04/2019): I made further changes to display the solution URL, so that you can click and find out the details about the solution that was imported.

EDIT (20/12/2018): I updated my Flow and made some improvements. You simply have to set the timezone on the triggering action and you are all set. The Flow will email the solution list to the user running the Flow. The download link has been updated to point to the updated solution.

I would not call it sneaky, but sometimes when I find the Dynamics 365 CE UI or behaviour has changed slightly, I can attribute it to some update that was applied to the environment. There are email notifications for major updates, but none for minor updates or patches that can happen frequently. So, I decided to solve this problem using Flow.

Every solution import into the system causes an Import Job record to be created. If a Flow can be scheduled to run every day and query the Import Job records that were started the previous day, we can easily keep track of what is happening in the environment.

The first step is to trigger the Flow on a preset schedule, and read the Import Job records.

Import Job Flow.png

Below are the expressions that I use for the boundary dates:

  • addDays(convertFromUtc(utcNow(),'AUS Eastern Standard Time'), -1, 'yyyy-MM-dd')
  • convertFromUtc(utcNow(),'AUS Eastern Standard Time','yyyy-MM-dd')
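
These boundary dates then feed into the filter of the query that reads the Import Job records, which ends up looking something like this (the Compose action names are only illustrative):

startedon ge @{outputs('Yesterday')} and startedon lt @{outputs('Today')}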

The next step is to pick up only the information we need from the returned result, and project it to a form that is conducive for email.

Import Job Compose

With the exception of Solution Name, all the other properties are retrieved from the XML in the Data property. Below are the formulas for those:

  1. Publisher – xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//Publisher/Descriptions/Description[1]/@description)')
  2. Started On – convertFromUtc(items('Parse_Import_Job_XML_and_re-map')?['startedon'],'AUS Eastern Standard Time','dd-MM-yyyy HH:mm:ss')
  3. Old Version – xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/fileVersion)')
  4. New Version – xpath(xml(items('Parse_Import_Job_XML_and_re-map')?['data']),'string(//upgradeSolutionPackageInformation/currentVersion)')

The last few steps are to email out the results, if any solutions were imported the previous day.

Email Solution.png

Email Step.png

The result is a barely formatted table, with the list of solutions that were imported the previous day.

Solution Import Email

With this approach, no one can sneak a solution import past you. You have visibility over what is going on in the system.

You can download and install the Flow using this link:

Package: https://1drv.ms/u/s!AvzjERKFC6gOx3mkKpCciN-KpmC8?e=vM3DIw

Hope this is helpful to stay on top of imports.

Building a MVP Notifier using Flow and Azure Functions

Since MVPs are now announced every month, I have found it hard to track down the new awardees in the Business Applications area before the 3rd of every month. So, I thought I would build a notifier using Flow and Functions. This is the logic:

  1. Seed the initial MVP data from mvp.microsoft.com into Azure Tables, so that the new MVPs can be figured out every month
  2. Schedule the Flow to run on the 1st of every month at 5 p.m. PDT. Hopefully by this time, everyone has filled out at least their name in their MVP profile.
  3. Retrieve the MVP data again from mvp.microsoft.com
  4. Figure out the new MVPs and post a message to Slack. Add the new MVP details to Azure Tables

The logic that retrieves the details from mvp.microsoft.com uses Azure Functions. Below is the project.json for that Function App.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "AngleSharp": "0.9.10",
        "Newtonsoft.Json": "11.0.0.2"
      }
    }
   }
}

Next is the actual logic that returns the MVP list for the category passed in the URL.

using System.Net;
using AngleSharp;
using AngleSharp.Dom;
using AngleSharp.Dom.Html;
using System;
using System.Runtime.InteropServices;
using System.Text;
using Newtonsoft.Json;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // parse query parameter
    string mvpCategory = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "category", true) == 0)
        .Value;

    if (mvpCategory == null)
    {
        // Get request body
        dynamic data = await req.Content.ReadAsAsync<object>();
        mvpCategory = data?.category;
    }

	var pageNumber = 1;
	var webClient = new WebClient();
	IHtmlDocument doc;
	IHtmlCollection<IElement> mvpProfiles = null;
	List<MVPInfo> mvps = new List<MVPInfo>();
	var htmlParser = new AngleSharp.Parser.Html.HtmlParser(new Configuration().WithDefaultLoader());
	do
	{
		var data = webClient.DownloadData(new Uri($"https://mvp.microsoft.com/en-us/MvpSearch?ex={WebUtility.UrlEncode(mvpCategory)}&sc=s&pn={pageNumber++}&ps=48"));
		doc = htmlParser.Parse(Encoding.UTF8.GetString(data));

		mvpProfiles = doc.QuerySelectorAll(".profileListItem");
		var currentPage = mvpProfiles
		.Select(d =>
		{
			var nameAndLocation = d.QuerySelector(".profileListItemFullName a");
            var mvpUrl = nameAndLocation.GetAttribute("href");
			return new MVPInfo
			{
				Name = nameAndLocation.TextContent,
				Url = $"https://mvp.microsoft.com{mvpUrl}",
				Country = d.QuerySelector(".profileListItemLocation .subItemContent").TextContent,
                Id = mvpUrl.Substring(mvpUrl.LastIndexOf("-")+1)
			};
		})
		.ToList();
		mvps.AddRange(currentPage);
	} while (mvpProfiles.Any());

    var json = JsonConvert.SerializeObject(mvps, Formatting.Indented);
    return mvpCategory == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a MVP category on the query string")
        : new HttpResponseMessage(HttpStatusCode.OK)
          {
                Content = new StringContent(json, Encoding.UTF8, "application/json")
          };
}

public class MVPInfo
{
	public string Name { get; set; }
	public string Url { get; set; }
	public string Country { get; set; }
    public string Id { get; set; }
}

It uses the AngleSharp library to parse the response returned by mvp.microsoft.com search.

Below is the Flow that uses this Function, to populate the Azure Table.

Flow Overview.png

As you can see below, the Flow runs at 5 p.m. on the 1st of every month, and gets the current MVP list from the Azure Table, MVPAwards.

Azure Tables.png

The partition key for the table is the month and year, and the row key is the actual MVP Id.

Azure Tables Definition.png

Since this Flow and the associated Function were built for my personal use, I call the Function directly without using a custom connector. I pass the award category on the URL of the Function itself. If you want to use a custom connector, refer to one of my earlier posts.

Azure Function Call.png

After parsing the JSON returned by the Azure Function, I can use the "select" function to map the returned data, so that I can compare the data based on the MVP Id and insert the new MVPs into the Azure Table, if needed. The partition key will be the current year and month (yyyyMM).

Flow Select.png

Next is the crucial bit, where I do the actual comparison.

Compare MVPs.png

The "filter" function can be used to see whether there are any matches in the MVP list that was retrieved from the Azure Table, i.e. the previous month's MVPs. If no results are returned, that means the person was not an MVP last month. So, they are a new MVP, and this can be stored in Azure Tables and also posted to Slack.

IF Then Logic.png
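
A sketch of that comparison, with illustrative action names: the Filter array runs last month's rows against the current MVP's Id, and an empty result marks them as new.

Filter array From: body('Get_entities')?['value']
Condition:         item()?['RowKey'] is equal to items('Apply_to_each')?['Id']
New MVP check:     empty(body('Filter_array'))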

Posting to Slack uses the webhook registered on the channel.

Slack Web Hooks.png

After the Flow has finished running, the output gets posted to Slack.

MVP awards.png

References:

  1. https://powerusers.microsoft.com/t5/Building-Flows/Comparing-File-Lists/td-p/64178
  2. https://api.slack.com/incoming-webhooks