Building Exposure Bot using Google Actions, Azure Functions and Power Automate

Almost everyone now has a smartphone. The two dominant platforms are Google Android and Apple iOS, and both have a virtual assistant built into the OS: Android’s is Google Assistant, and Apple’s is Siri. In this post, I will show you how you can build a custom Action for Google Assistant. The equivalent of a custom Action in Bot Framework is a Skill.

Here is the high level architecture diagram of the solution.

Architecture diagram of the virtual assistant solution
Architecture of virtual assistant solution

The starting point for building the virtual assistant solution is creating a new Actions project. Head over to https://console.actions.google.com/ to create one.

Google Actions projects screen
Google Actions projects

The first thing you need to do is name your bot. I have named mine “Exposure Bot”, which means the user can trigger the bot by saying “Talk to exposure bot” to Google Assistant.

Settings screen

We then need to create one or more scenes to handle the user’s input. After the user has triggered the bot using the phrase “Talk to exposure bot”, she is then transferred to a scene called triggerExposure_scene.

Transitioning from main scene to triggerExposure scene
Transitioning from main scene to triggerExposure_scene

We can now design the main scene that handles multiple inputs. This scene is called triggerExposure_scene.

trigger exposure scene

In this scene, three intents decide what to do based on the user’s input. We have three intents in the Actions project (a sketch of the scene definition follows the list):

  1. Suburb_intent – When the user says something like “Show me the numbers for Melbourne”, it is mapped to this intent. A Flow (webhook) is called to get the response for this intent.
  2. Notifications_intent – When the user says something like “Send me daily notifications”, it is mapped to this intent. The user is transitioned to a scene called subscription_scene so that we can get accurate GPS information for the subscription and the user’s permission to send push notifications.
  3. IncomingPushNotifications_Intent – A scheduled Flow sends exposure notifications to the user three times a day. When the user taps on such a notification, this intent is invoked. But, as you can see, when that happens the conversation immediately ends; the user should never hit this intent in this scene.
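To make the routing concrete, here is a minimal sketch of what the triggerExposure_scene definition looks like in the Actions SDK YAML that gactions works with. This is an illustrative reconstruction, not the actual file from the project; in particular, the webhook handler name is a hypothetical placeholder.

# custom/scenes/triggerExposure_scene.yaml (illustrative sketch only)
intentEvents:
  - intent: Suburb_intent
    handler:
      webhookHandler: getExposureData   # hypothetical handler name; calls the Flow webhook
  - intent: Notifications_intent
    transitionToScene: subscription_scene
  - intent: IncomingPushNotifications_Intent
    transitionToScene: actions.scene.END_CONVERSATION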

Now let us look at the intents themselves. An intent captures what the user intends to do, based on training phrases. For example, below are the training phrases for Suburb_intent. Notice that you can map the types (the equivalent of Entities in PVA) right inside the training phrase itself.

List of intents
Intents
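In the Actions SDK source that gactions deploys, an intent like this boils down to a small YAML file. The sketch below is illustrative only; the real training phrases and annotations in the project may differ.

# custom/intents/Suburb_intent.yaml (illustrative sketch only)
parameters:
  - name: suburb
    type:
      name: suburb
trainingPhrases:
  - "Show me the numbers for ($suburb 'Melbourne' auto=true)"
  - "What are the exposure sites in ($suburb 'Richmond' auto=true)"

The ($suburb '...' auto=true) annotation is what maps the suburb type inside the training phrase.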

To figure out whether the user wants to receive push notifications, we have a different set of training phrases.

Push notifications intent

The suburb type itself is just free text.

Suburb type

All the above scenes and intents are used when the user is chatting with the bot. But, in the case of push notifications, the IncomingPushNotifications_Intent is invoked only when the user clicks on the notification in Android. Here is how that scene looks. Notice that it is a global intent, which means it can be invoked even in the middle of a conversation.

Push Notifications scene

When the user clicks on the push notification, she is transferred to the pushNotification_scene, which grabs the current location using Android’s native location API and calls a Flow to get the current exposure data.

Push Notification scene

This is also how Suburb_intent gets the current exposure data.

Suburb intent webhook

We have a similar webhook call for setting up the push notification subscription. Notice how we grab the location and notification permissions, and then call the Flow with the message name.

Push notification subscription scene

We are calling webhooks in a couple of places. But where is the Flow that is called defined? It is in the webhook area. The key to call the Flow is in the URL itself. I am not aware of any alternative method to transparently use OAuth in Google Actions. If you know a better way to handle authentication in this context, please add a comment on this post.

Webhook setup

There are two more things you need to make sure of:

  • The Actions API is enabled for the project
  • A Service Account is created for the project
Service Account

Both enabling the Actions API and creating a service account for the project are documented in https://codelabs.developers.google.com/codelabs/actions-user-engagement/#3, which you can refer to for additional instructions. After you download the JSON file for the service account, copy only the key and keep it aside; we will need it later when adding it to Key Vault. Don’t include the “-----BEGIN PRIVATE KEY-----” and “-----END PRIVATE KEY-----” markers. Copy just the key, as highlighted.

Service Account key

The next part is setting up the Azure resources: a Function App, Storage Account, Application Insights and Key Vault. Head over to https://github.com/rajyraman/Google-Actions-Push-Notifications where there is a Deploy to Azure button. Click the button and fill in the Google Account Email, Google Secret and your Azure User Id.

Deploy Azure Resources

You can click on the visualise button to see what resources will be deployed.

Both the Google Account Email and Google Secret are in the Service Account JSON file that you downloaded earlier. To get your User Id, run the following command in the Azure CLI.

 az ad signed-in-user show --query objectId --output tsv

After the resources are provisioned, you need to deploy the Function code to Azure by running this command.

func azure functionapp publish FUNCTIONAPPNAME

If you want to deploy from the command line and have Bicep and the Azure CLI installed, I have also included a PowerShell script that you can use to deploy from your local machine.

Deployment Script
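If you would rather not use the script, the deployment itself boils down to a resource group deployment of the included Bicep template with the Azure CLI. This is only a rough sketch and not the author’s script; the resource group name, location, template file name and parameter names below are placeholders that need to match the template in the repo.

az group create --name rg-exposure-bot --location australiasoutheast
az deployment group create --resource-group rg-exposure-bot --template-file main.bicep --parameters googleAccountEmail=<email> googleSecret=<key> userId=<object-id>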

Since the Function App has OpenAPI enabled, you can also access it from the browser to quickly explore the Function App’s endpoints.

Swagger UI

In production it is not a good idea to expose this to everyone, as the documentation might contain confidential data, so you can lock the Swagger UI down further or remove it altogether. Refer to https://github.com/Azure/azure-functions-openapi-extension/blob/main/docs/openapi.md#configuration for additional information.

The Function App uses a Durable Entity that is persisted in Azure Storage. Since we just need to store the Google User Id and the location for which exposure notifications are required, this is a good fit for a small application like this one.

I have also uploaded all the Google Actions code to the repo, so you can deploy it to Google Actions using the gactions CLI.

Google Actions source code

The commands to run are

gactions login
gactions push

Before deploying, you need to update the Flow URL to the URL of the Get Exposure Data Flow. This Flow gets the COVID-19 exposure data provided by the Data.VIC Victorian Exposure Sites API.

Update Flow URL for webhook

You can download the Managed Solution with Custom Connectors and Flows from https://1drv.ms/u/s!AvzjERKFC6gOyAZf2Mw1f2QkgYft?e=HL1bnV. Here is how the solution looks.

Flow and Custom Connector Solution

After importing, you also have to change the base URL of the Google Assistant custom connector to point to the correct base URL of your Function App. This is the Function App that manages location subscriptions, Google API authentication and sending push notifications to the user.

After all of this is set up, you will be able to chat with the bot from your Google Assistant.

Push notifications from exposure bot
Google Assistant – Exposure Bot

All Actions messages are captured in the logs, so you can always refer to them if you are stuck.

Google Cloud Logs

You can refer to the source code at https://github.com/rajyraman/Google-Actions-Push-Notifications

Credit: flaticon.com for the icons used in custom connector

References:

  1. Push Notifications on Google Actions
  2. gactions CLI
  3. Discover Azure Functions: OpenAPI and Power Apps
  4. Conversational Actions

Resubmitting failed Logic Apps using Power Automate

Even though each Logic App action has a retry policy, after a certain number of retries the Logic Apps engine gives up and fails the whole run. In these scenarios the Logic App needs to be re-run, and you can use Power Automate to handle this. You can download the Power Automate solution from https://1drv.ms/u/s!AvzjERKFC6gOx3-NptgmlSyD4FVu?e=vwxyAE

After you import the solution, you need to set the value of the environment variable Azure Subscription Id to the Azure subscription where your Logic Apps are.

Environment variable to store subscriptionid
Subscription Id environment variable

You need to have an HTTP with Azure AD connection that can be mapped to the HTTP with Azure AD connection reference in the solution. Both the Base Resource URL and Azure AD Resource URL should be https://management.azure.com/

HTTP with Azure AD connection
HTTP with Azure AD connection to Azure RM API

You also need an Azure Resource Manager connection that can be mapped to the Azure Resource Manager connection reference.

The Flow runs on an HTTP trigger, so it is manual at the moment, but you can easily change it to a schedule (Recurrence) trigger.

This is the key part where the Flow resubmits failed executions.

Re-running all failed Logic App executions
Segment to resubmit failed executions
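Under the hood, this segment calls the Azure Resource Manager REST API through the HTTP with Azure AD connection. As a rough sketch, the two calls involved look like the ones below, assuming the standard Microsoft.Logic endpoints; the placeholders come from the environment variable and from the earlier actions that list the Logic Apps and their runs. The first lists a Logic App’s failed runs, and the second resubmits a specific run via its trigger history.

GET  https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.Logic/workflows/{workflowName}/runs?api-version=2016-06-01&$filter=status eq 'Failed'

POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.Logic/workflows/{workflowName}/triggers/{triggerName}/histories/{runId}/resubmit?api-version=2016-06-01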

The Flow returns JSON with both the old runId and the new runId of the resubmitted Logic App.

Flow response with Logic App execution detail
Output from Flow

This Flow can be made even smarter by persisting this JSON into Cosmos DB or Table Storage, so that you can stop retrying Logic Apps that have already failed more than twice. That way you don’t keep retrying runs that will always fail due to data validation errors.

Acknowledgements:

  1. Pieter Veenstra – Pieter’s method – https://sharepains.com/2019/07/09/compose-apply-to-each-power-automate/
  2. Pieter Veenstra – Unnest nested arrays – https://sharepains.com/2021/02/10/unnest-nested-arrays-in-power-automate/

Paging while using FetchXML in Dataverse Connector

If you want to retrieve more than 5,000 records in a Flow using the List Rows action from the Dataverse connector, you need to page through the records; Flow does not do this for you automatically. This is not a new topic. It has already been explored by Linn and Debajit. You can read their posts below:

  1. Linn – https://linnzawwin.blogspot.com/2021/01/retrieve-more-than-100000-dataverse.html
  2. Debajit – https://debajmecrm.com/how-to-query-more-than-5k-cds-records-using-fetchxml-in-powerautomate-microsoft-flow/

When I was looking into this same problem, I did two things differently:

  1. Using iterationIndexes for paging
  2. Using the xml function to encode the paging cookie, instead of manually replacing “<” with “&lt;”, “>” with “&gt;” and so on

Here is the Flow.

I use iterationIndexes like this to work out the page number for each iteration.
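As a rough sketch (assuming the Do until loop is named Do_until, as in the expression further down), the page attribute for each iteration can be built like this. iterationIndexes is zero-based, while FetchXML pages start at 1, hence the add.

concat('page=''', string(add(iterationIndexes('Do_until'), 1)), '''')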

The paging cookie returned with the first page looks like this:

<cookie page="1"><systemuserid last="{7A44238F-9894-EB11-B1AC-002248153EDE}" first="{5C4D8EE8-62FF-E911-A811-000D3A799417}" /></cookie>

I then use the expression below to sanitise the characters so that the cookie can be used to request the subsequent page. It looks a bit clunky and verbose, but the key thing here is that when the xml function encodes the XML, it also sanitises it. encodingJSON is a temporary object used to store the paging cookie XML; this JSON is what gets converted to XML and cleaned up with split and substring.

if(equals(iterationIndexes('Do_until'),0),
	'',
	concat(
		'paging-cookie=''', 
		substring(
			first(
				skip(
					split(
						string(xml(setProperty(variables('encodingJSON'),'x',variables('pagingCookieCleansed'))))
					,'<')
				,1)
			)
		,2),
	'''')
)
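To show where these two pieces end up, here is a rough sketch of the FetchXML sent to List Rows for the second page, with the page attribute and the encoded paging cookie stitched into the fetch element. The entity and attribute are only examples inferred from the cookie above.

<fetch count='5000' page='2' paging-cookie='&lt;cookie page="1"&gt;&lt;systemuserid last="{7A44238F-9894-EB11-B1AC-002248153EDE}" first="{5C4D8EE8-62FF-E911-A811-000D3A799417}" /&gt;&lt;/cookie&gt;'>
  <entity name='systemuser'>
    <attribute name='fullname' />
  </entity>
</fetch>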

Here is the Flow running through all pages.

You can download this sample Flow from https://1drv.ms/u/s!AvzjERKFC6gOx3ZbdK8YOBYat0CV?e=M41ONd

Using Custom API as a trigger for Flow

Dropping new goodies straight into Microsoft Docs, without any formal announcement, has now been normalised. A couple of Virtual Table features have been “announced” without much fanfare this way. The ability to trigger Flows from a Custom API is one such unannounced feature.

Custom API is a feature in Dataverse that is very similar to Custom Process Actions (formerly known as Actions). You can refer to the Compare Custom Process Action and Custom API doc to understand the differences. There are many useful XrmToolBox tools to help you work with Custom APIs. They are:

  1. Custom API Manager by David Rivard
  2. Custom API Tester by Jonas Rapp
  3. Custom Action to Custom API Convertor by Mark Carrington

The UI to create a new Custom API takes a few too many clicks, so we will use Custom API Manager to create our API and Custom API Tester to trigger it. You can easily create a new Custom API using Custom API Manager.

XrmToolBox Custom API Manager Tool

The important points to note when you create a Custom API that will be used as a trigger are:

  1. You cannot have IsFunction set to true
  2. You cannot have IsPrivate set to true
  3. You cannot have Allowed Custom Processing Step Type set to None

After you have created your Custom API, you need to create:

  1. A root Catalog
  2. At least one child Catalog
  3. One Catalog Assignment record for each Custom API or Table

Note that you cannot add Catalog Assignment records straight to the root catalog.

You can do all of this right inside the solution. Here is how my root catalog record looks.

Root Catalog

The unique name of the catalog needs to have a publisher prefix, otherwise you will get this error.

Root Catalog no prefix exception

After creating the root catalog, you can create the child catalog from the related records area in the form.

Child Catalog list

This is how my child catalog looks.

Child Catalog Record

I am going to add all my Custom APIs to this Custom APIs sub-catalog as Catalog Assignments. You can add both Custom APIs and Tables/Entities (if the Custom API is bound to a Table) from this screen.

Catalog Assignment

Here is how my solution looks after creating the Catalog, Custom API and Catalog Assignment records.

Solution with Custom API and Catalog records

The next step is to create the Flow with the “When an action is performed” trigger. In this trigger you need to choose the root catalog, the sub-catalog and the Custom API in that sub-catalog.

If the Custom API is bound to a Table, you need to choose the Table first and then the Custom API, as the trigger filters the Custom APIs by Table.

Custom API bound to Table trigger

Here is how my trigger looks. Since my Custom API is not bound to any entity, I chose none for the Table name.

Unbound Custom API Trigger

My Flow will now run when I trigger the Custom API. I can do this using the Custom API Tester.

Custom API Tester tool

I can also trigger the Custom API from Flow itself.

Trigger Custom API from Flow

This will trigger my Flow, which was waiting for that Custom API call.

Flow executed on Custom API call

One thing to note: if you created the Custom API with Is Private set to true, you will still see that Custom API in the Flow trigger, but when you save the Flow, you will get this exception.

Private Custom API error

One word of warning: this is a preview feature, so don’t use it in production yet. It is a welcome feature though, and it opens up Flow to more integration scenarios.

References:

  1. Trigger flows when a Microsoft Dataverse action is called
  2. Catalog and Catalog Assignment Tables (Thanks to Jim Daly for sharing this docs link)