Resubmitting failed Logic Apps using Power Automate

Even though each Logic App action has a retry policy, after a certain number of retries the Logic App engine gives up and fails the whole execution. In these scenarios the failed run needs to be resubmitted, and you can use Power Automate to handle this. You can download the Power Automate solution from https://1drv.ms/u/s!AvzjERKFC6gOx3-NptgmlSyD4FVu?e=vwxyAE

After you import the solution, you need to set the value of the Azure Subscription Id environment variable to the Azure subscription that hosts the Logic Apps.

Subscription Id environment variable

You need to have an HTTP with Azure AD connection that can be mapped to the HTTP with Azure AD Connection Reference in the solution. Both the Base Resource URL and the Azure AD Resource URL should be https://management.azure.com/

HTTP with Azure AD connection to the Azure Resource Manager API

You also need an Azure Resource Manager connection that can be mapped to the Azure Resource Manager Connection Reference.

The Flow runs on an HTTP trigger, so it is manual at the moment, but you can easily change it to a Recurrence (schedule) trigger.

This is the key part where the Flow resubmits failed executions.

Segment that resubmits failed Logic App executions
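
Under the hood, the HTTP with Azure AD connection presumably calls the Azure Logic Apps management REST API: one call to list the runs, and one call per failed run to resubmit its trigger history. Outside of Power Automate, a minimal PowerShell sketch of the equivalent calls using the Azure CLI would look roughly like this (the subscription, resource group, Logic App and trigger names are placeholders, and the trigger name "manual" is only an assumption based on the default Request trigger name):

# Placeholder values - replace with your own subscription, resource group, Logic App and trigger names
$subscriptionId = '<subscription-id>'
$resourceGroup  = '<resource-group>'
$logicApp       = '<logic-app-name>'
$trigger        = 'manual'   # assumption: the default name of a Request trigger
$baseUrl        = "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Logic/workflows/$logicApp"

# List the runs of the Logic App and keep only the failed ones (ignoring nextLink paging for brevity)
$runs = az rest --method get --url "$baseUrl/runs?api-version=2016-06-01" | Out-String | ConvertFrom-Json
$failedRuns = $runs.value | Where-Object { $_.properties.status -eq 'Failed' }

# Resubmit each failed run; the run name doubles as the trigger history name
foreach ($run in $failedRuns) {
    az rest --method post --url "$baseUrl/triggers/$trigger/histories/$($run.name)/resubmit?api-version=2016-06-01"
}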

The Flow returns JSON with both the old runId and the new runId of the resubmitted Logic App.

Flow response with Logic App execution details

This Flow can be made even smarter by persisting this JSON into Cosmos DB or Table Storage, so that you can stop retrying Logic Apps that have already failed more than twice. That way you don’t keep resubmitting runs that will always fail due to data validation errors.

Acknowledgements:

  1. Pieter Veenstra – Pieter’s method – https://sharepains.com/2019/07/09/compose-apply-to-each-power-automate/
  2. Pieter Veenstra – Unnest nested arrays – https://sharepains.com/2021/02/10/unnest-nested-arrays-in-power-automate/

Paging while using FetchXML in Dataverse Connector

If you want to retrieve more than 5,000 records in a Flow using the Dataverse connector’s List Rows action, you need to page through the records; Flow does not automatically do this for you. This is not a new topic. It has already been explored by Linn and Debajit. You can read their posts below:

  1. Linn – https://linnzawwin.blogspot.com/2021/01/retrieve-more-than-100000-dataverse.html
  2. Debajit – https://debajmecrm.com/how-to-query-more-than-5k-cds-records-using-fetchxml-in-powerautomate-microsoft-flow/

When I was looking into this same problem, I did two things differently:

  1. Using iterationIndexes for paging
  2. Using the xml function to encode the paging cookie, instead of manually replacing “<” with “&lt;”, “>” with “&gt;” etc.

Here is the Flow

I use iterationIndexes like this to set the page number for each iteration.
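
As a rough sketch, the page attribute of the fetch element is built from the Do until loop’s zero-based iteration index (plus one, since FetchXML pages start at 1), with the escaped paging cookie appended from the second page onwards. The action name Compose_paging_cookie_attribute and the entity/attribute names below are illustrative only:

<fetch count="5000" page="@{add(iterationIndexes('Do_until'), 1)}" @{outputs('Compose_paging_cookie_attribute')}>
  <entity name="systemuser">
    <attribute name="fullname" />
    <order attribute="systemuserid" />
  </entity>
</fetch>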

The paging cookie returned in the first page looks like this:

<cookie page="1"><systemuserid last="{7A44238F-9894-EB11-B1AC-002248153EDE}" first="{5C4D8EE8-62FF-E911-A811-000D3A799417}" /></cookie>

I then use the expression below to escape the special characters so that the cookie can be used for the subsequent page. It looks a bit clunky and verbose, but the key point is that when the xml function converts the value to XML, it also escapes it. encodingJSON is a temporary object used to store the paging cookie XML; this JSON is what gets converted to XML and then cleaned up with split and substring.

if(equals(iterationIndexes('Do_until'),0),
	'',
	concat(
		'paging-cookie=''', 
		substring(
			first(
				skip(
					split(
						string(xml(setProperty(variables('encodingJSON'),'x',variables('pagingCookieCleansed'))))
					,'<')
				,1)
			)
		,2),
	'''')
)

Here is the Flow running through all pages.

You can download this sample Flow from https://1drv.ms/u/s!AvzjERKFC6gOx3ZbdK8YOBYat0CV?e=M41ONd

Using Custom API as a trigger for Flow

Dropping new goodies straight into Microsoft Docs, without any formal announcement, has now been normalised. A couple of Virtual Table features have been “announced” without much fanfare this way. The ability to trigger Flows from a Custom API is one such unannounced feature.

Custom API is a feature in Dataverse that is very similar to Custom Process Actions (formerly known as Actions). You can refer to the Compare Custom Process Action and Custom API doc to understand the differences. There are many useful XrmToolBox tools to help you work with Custom APIs:

  1. Custom API Manager by David Rivard
  2. Custom API Tester by Jonas Rapp
  3. Custom Action to Custom API Convertor by Mark Carrington

The UI to create a new Custom API takes a few too many clicks, so we will use Custom API Manager to create our API and Custom API Tester to trigger it. You can easily create a new Custom API using Custom API Manager.

XrmToolBox Custom API Manager Tool

The important points to note when you create a Custom API that can be used as a trigger are:

  1. You cannot have IsFunction set to true
  2. You cannot have IsPrivate set to true
  3. You cannot have Allowed Custom Processing Step Type set to None

After you have created your Custom API, you need to create:

  1. A root Catalog
  2. At least one child Catalog
  3. One Catalog Assignment record for each Custom API or Table

Note that you cannot add Catalog Assignment records straight to the root catalog.

You can do all of this right inside the solution. Here is how my root catalog record looks.

Root Catalog

The unique name of the catalog needs to have a publisher prefix, otherwise you will get this error.

Root Catalog no prefix exception

After creating the root catalog, you can create the child catalog from the related records area in the form.

Child Catalog list

This is how my child catalog looks:

Child Catalog Record

I am going to add all my Custom APIs to this Custom APIs sub-catalog as Catalog Assignments. You can add both Custom APIs and Tables/Entities (if the Custom API is bound to a Table) from this screen.

Catalog Assignment

Here is how my solution looks after creating the Catalog, Custom API and Catalog Assignment records.

Solution with Custom API and Catalog records

The next step is to create the Flow with the “When an action is performed” trigger. In this trigger you need to choose the root catalog, the sub-catalog and the Custom API in that sub-catalog.

If the Custom API is bound to a Table, you need to choose the Table first and then the Custom API, since the Custom API list is filtered by Table.

Custom API bound to Table trigger

Here is how my trigger looks. Since my Custom API is not bound to any table, I choose none for the Table name.

Unbound Custom API Trigger

My Flow will now run when I trigger the Custom API. I can do this using the Custom API Tester tool.

Custom API Tester tool
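
If you want to test the trigger outside of XrmToolBox, you can also invoke an unbound Custom API directly through the Dataverse Web API. Here is a minimal sketch; the org URL, Custom API unique name and request parameter are hypothetical, and I am assuming an access token for the environment has already been acquired:

# Hypothetical values - use your own org URL, Custom API unique name and request parameters
$orgUrl  = 'https://yourorg.crm.dynamics.com'
$apiName = 'ryr_NotifyFlow'          # unique name of the unbound Custom API
$token   = '<access token for the Dataverse environment>'

# Unbound Custom APIs are invoked as actions at the Web API root, using their unique name
Invoke-RestMethod -Method Post `
    -Uri "$orgUrl/api/data/v9.2/$apiName" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType 'application/json' `
    -Body (@{ ryr_Message = 'Hello from the Web API' } | ConvertTo-Json)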

I can also trigger the Custom API from Flow itself.

Trigger Custom API from Flow

This will trigger my Flow that was waiting for that Custom API call.

Flow executed on Custom API call

One thing to note: if you created the Custom API with IsPrivate set to true, you will still see that Custom API in the Flow trigger, but when you save the Flow you will get this exception.

Private Custom API error

One word of warning: this is a preview feature, so don’t use it in production yet. It is a welcome feature, though, and it opens up Flow to more integration scenarios.

References:

  1. Trigger flows when a Microsoft Dataverse action is called
  2. Catalog and Catalog Assignment Tables (Thanks to Jim Daly for sharing this docs link)

Future of developers in Power Platform

Generally, all my posts are problem-solution type posts. I never write anything that is purely my opinion. But, considering the recent developments, I thought I should sit down and write down my thoughts. Before we even consider whether developers have a future in Power Platform, we first need to delve into the historical context of the developer’s role and tasks in Dynamics CRM.

In the beginning of time

I come from a technical background, starting with ASP.NET WebForms, C# and Dynamics CRM OnPremise. Back in those days my role mostly revolved around plugins, custom workflow assemblies, and a bit of JavaScript to do data validation, hide/unhide or enable/disable fields. I also occasionally had to use Deployment Manager to do backups/restores or UR installations.

A CRM developer just had to know C#, the CRM SDK and enough JavaScript to write crmForm.all scripts. You never had to be highly skilled in these languages, since you needed to know just enough to write a plugin, a custom workflow and some JavaScript. This might also be because CRM developers were previously C# or SharePoint developers. Since your code ran in a managed environment, you did not need to worry about a whole lot of things a .NET developer had to worry about, like caching, multi-threading/parallelisation, memory leaks, unit tests (💣).

ALM was non-existent because there was no concept of Solutions. Even when Solutions were introduced, there was not a whole lot of enthusiasm for Managed Solutions, which, I would say, is still the case. Most people just used unmanaged because it was the default. A developer’s role on the ALM (?!) front was to export the solution from DEV, import it into TEST/PROD, and keep a copy of the solution zip file in a shared network folder.

Then came Solution Packager (2012), AdxStudio ALM Toolkit (2013) and XRM CI Framework (2014). These, along with TFS, helped folks unpack, check in, repack and deploy solutions. Even with this new tooling, there were only a few people who were passionate about ALM back then, since it was just as easy to take a database backup (in the OnPrem world), export and import the unmanaged solution, and restore the database from backup if there was a solution issue.

Once the size of projects grew, along with the number of other developers in the team, it became necessary to write unit tests and integration tests. Everyone simply adopted what was available in the .NET world; I was using NUnit, Moq and Fakes. Then came FakeXrmEasy (2015), a testing framework specifically for Dynamics CRM. It became easy to do unit testing by setting up test records in-memory and using the fake execution context to validate plugin/workflow behaviour. Jasmine and Sinon.JS were used for testing client-side code, but these were front-end developer frameworks, not Dynamics CRM specific.

So, by the end of 2016, Dynamics CRM developers were mostly doing plugin, custom workflow and JavaScript development, deployment, unit testing and DevOps using TFS. Since this is not based on feedback from other developers across the world in 2016, you could also interpret this as simply what I was doing back in those days.

Project Siena – The one that got away

My only encounter with the precursor to canvas apps, known as Project Siena (2013), was very brief. As someone doing full-time dev work by this point, I downloaded the app purely to play with it, dropped a few controls to understand what it could do, and lost interest because I compared it to what I could do in a WinForms or ASP.NET application. As someone who could already build apps, it felt like a downgrade rather than a productivity add-on. So, my life continued to revolve mostly around C#, JavaScript and Dynamics CRM.

CRM Online vs OnPremise – EV vs Petrol

My first encounter with CRM Online was around 2013. The biggest shock for me was the lack of access to the underlying database or IIS logs; it took away two things critical to my debugging process. But it was merely a precursor of things to come, as things transitioned into the SaaS world. In a way it was good, as you didn’t have to spoil your weekend looking through IIS logs, trace logs and database queries trying to come up with a solution. All you needed to do was raise a support ticket.

So when things moved to CRM Online, developers just had to schedule the deployment window, rather than doing database backups, installing the Update Rollup and so on. This was a good change for developers, as they could focus on dev work and ALM, and not be the Windows admin or database admin.

A lot of the time developers had to write SSRS reports as well, which meant you needed to understand complex T-SQL, create indexes/statistics etc. With CRM Online you could only use FetchXML, which meant that a lot of those reports could not be authored. When more SQL reports moved to Power BI, due to the limitations of SSRS reports, this gave CRM developers more time to work on their areas of interest.

Power Platform – Shock and Awe

Prior to Power Platform, there were two worlds: the world of the CRM developer who might occasionally do Azure things, and the SharePoint + canvas apps world. I think even now, when people say Power Apps they mean canvas apps, while CRM developers use the term Model-Driven Apps instead of Power Apps. Even though Microsoft is persisting with its unification efforts, these two terms will most likely continue to exist.

Power Apps entered public preview in April 2016 and became generally available in October 2016. With GA, we got the Common Data Service and Common Data Model. Even after this, I still continued to do CRM things, only because I was stuck in the CRM 2015 OnPremise world while Power Platform was beginning to take shape. People from the Office 365 E5 world eagerly boarded the Power Platform rocket ship, which started flying a million miles an hour.

  • XRM became Common Data Model
  • People got confused between Common Data Model and Common Data Service
  • People were surprised to find that there can be multiple “Common” Data Service per tenant
  • Cool kids started using Flow rather than Workflows
  • It became even cooler to use Logic Apps over Flow
  • Flow/Workflow parity, aka “in the fullness of time”, became a fun topic to discuss
  • Choosing the right CDS connector became the new “have you tried restarting your machine”
  • SharePoint/Canvas apps people still did not care for Solutions, which remained a CRM thing
  • It looked like canvas apps were a gaming platform
  • SharePoint List was the most popular “database” for canvas apps

No Code/Low Code – Reality Distortion Field

The origins of no code/low code probably lie in the whole SQL vs NoSQL debate. While NoSQL is technically feasible, and there are umpteen NoSQL databases that are alternatives to SQL Server or Oracle, can you really do everything you can do with code using low code? Microsoft’s stance these days seems to be that it is not meant to be a replacement for code, but more of a power tool.

Also, with code you commit it into source control, version it and write tests to mitigate bugs, but you still write Power Fx code inside Power Apps without any of these additional “overheads”. You can of course do this now with Power Apps Language Tools, and test to a degree with Test Studio, but back then it was just the appx-based deployment and manual testing.

Initially the marketing for Power Apps was around how it is the great enabler, how it transfers power to the people, how you can quickly build something without writing any code, and how you can transition into Power Apps from any non-tech role if you just learn Power Apps. It was all about striking an emotional chord. There were “Happy Gilmore” and “Good Will Hunting” moments, because it is all about an unexpected outsider making an impact. But the campaign inadvertently turned code, IT admins and developers into bogeymen and gatekeepers. “If you don’t use Power Apps/Flow, you might have to write code” became the scary proposition. Is writing code that scary?

Screenshot of Bane's speech

It might be inefficient to write code when you could use low code tools to save dev time. But that is not how I remember the initial marketing efforts. They were focused on how a regular business user can have the power (#LessCodeMorePower) using low code tools and create apps to make their life better. It was not pitched as a productivity add-on for a CRM/Power Apps developer. But, considering that it was predominantly marketed to the “no code” crowd, that is not hugely surprising.

For example, consider this tweet from a developer’s perspective.

If you are a developer, it is hard not to feel disillusioned about the future. If one person could achieve something in a short time that a whole team of developers could not, what does that do to your self-esteem and craftsmanship? Does this mean that you should give up on coding and switch to low code tools? The short answer is no.

Developers are not going to end up like Zune*

According to the State of the Octoverse – 2020, there were 60M+ new repositories created between Oct 2019 and Sep 2020. So, code and developers are not going away any time soon. Even if low code tools get incrementally more powerful year after year, there will always be a gap that needs to be bridged with code. Also, sometimes it might be much simpler to do something in code rather than build a 40-step multi-branch Flow or a canvas app with duct-taped Power Fx expressions. If you have a strong and inflexible opinion about either low-code or pro-code, the end result might not be optimal. So, it is important to keep an open mind.

There was a period of time where developers could focus only on the CRM components, i.e. form scripts, plugins, custom workflow steps etc., and nothing else. As Power Platform + Azure convergence seems to be the emerging pattern in a lot of upcoming projects, developers also need to diversify, rather than rely only on model-driven apps.

Since Power Fx is now a language, everyone who uses it could technically call themselves a developer. It might start showing up inside other products in the Power Platform.

Power Fx started with Power Apps canvas apps and that is where you can experience it now. We are in the process of extracting the language from that product so that we can use it in more Microsoft Power Platform products and make it available here for you to use. That’s going to take some time and we will report on our progress here and on the Power Apps blog.

https://github.com/microsoft/Power-Fx

Power Fx might take off, or it might not, but pro-developers need to at least keep a watch on how it shapes up. Is it all hype/marketing, or is it something that can improve your productivity? Things can change quite dramatically like Nokia vs Apple or Chrome vs the rest in a short period of time.

https://zdnet1.cbsistatic.com/hub/i/2018/11/26/ba7e61c7-73c6-442d-b93a-c8f1c892ac32/4b4d5930ea733c2cb9f2a299e0b7ee94/apple-nokia-revenue.jpg
Apple beats Nokia in under 2 years
https://upload.wikimedia.org/wikipedia/commons/7/71/StatCounter-browser-ww-yearly-2009-2020_%28updated_until_November%29.png
Chrome took 2 years to become the dominant browser

Here are some areas developers can focus on:

  1. Power Apps Component Framework + React + TypeScript – PCF components can be used in Power Apps portals, canvas apps and model-driven apps, so they save a lot of dev time since you don’t need to target individual products
  2. Virtual Tables – This is even more exciting now that you get CRUD support
  3. Azure DevOps – Citizen developers are going to rely on developers to manage the pipelines and releases, review/approve pull requests, and work through solution layering, environment variable and connection reference issues. Since you might also be using this for other Azure components like Azure Functions, Logic Apps etc., this is a good one to learn
  4. Azure Static Web Apps – If you need to build a custom portal entirely in code, or a single page app to embed inside an existing portal and interact with APIs, this is a good one to consider
  5. Azure Functions – While you might be able to implement what a single Function does using Logic Apps or Flow, it is hard to replicate what a Durable Function or a Function with a SignalR binding does. Functions give you greater control, since it is entirely code. As a developer in a Fusion team, you might be in charge of creating the API using Functions and exposing it using Azure API Management, so that citizen developers can consume it using a custom connector
  6. Canvas Apps/Flow/Logic Apps – Even though these are low-code, having coding skills can be quite beneficial when you write the expressions or Power Fx. It is also a good opportunity to learn about UX, app design principles, accessibility etc.
  7. Power Virtual Agents Custom Skills – While you could call a Flow and use the response in a PVA bot, using custom skills is a code-first way to do it
  8. Docker – Containerisation and Dev Containers are so useful for getting a consistent dev environment set up. They also give you the ability to switch between your local machine and Codespaces. So, it is good to learn and experiment with this

At the end of the day, if you love to code and would like to keep coding, learning new things and playing with emerging tech, you still have choices, but most of them seem to be in Azure these days. That might not be a bad thing, as you could potentially switch over to full stack or Azure dev if low code takes away a lot of the opportunities for a developer. It might also end up being better career-wise, as having Azure/PCF/React/TypeScript skills might open up a whole new world of opportunities outside of the Microsoft partner ecosystem.

GitHub Self-Hosted Runner inside a container

EDIT (02/03/2021): You can use tmate action inside a Windows runner.

If you are using GitHub Actions, you have four choices for the runner:

  1. Windows Runner
  2. Ubuntu Runner
  3. MacOS Runner (Preview)
  4. Self-Hosted Runner

GitHub Actions for Microsoft Power Platform supports only the Windows runner (as of Dec 2020). If you are building workflows using the GitHub-hosted runner, the only way you can troubleshoot them is through the logs, which can be turned on using the ACTIONS_RUNNER_DEBUG and ACTIONS_STEP_DEBUG environment variables.
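
These are typically configured as repository secrets. For example, one way to set them is with the GitHub CLI, assuming gh is installed and authenticated against the repo:

# Enable verbose runner and step logs for the repository
gh secret set ACTIONS_RUNNER_DEBUG --body "true"
gh secret set ACTIONS_STEP_DEBUG --body "true"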

You cannot use act or tmate because you are using the Windows runner. One way to work around this is to use a self-hosted runner while you are troubleshooting the Action. Rather than configuring the runner on your local machine, what if you could run it inside a container? This way you can run multiple runners on the same machine, all using the same underlying Windows Server Core image.

You need to install the following on your local machine:

  1. Docker Desktop
  2. VSCode
  3. Docker Extension for VSCode

During or after the installation of Docker, switch to Windows containers, rather than Linux containers, as we need to use the Windows Server Core image. Now, clone the repo https://github.com/rajyraman/docker-github-self-hosted-runner to your local machine. Next, go to https://github.com/settings/tokens and create a Personal Access Token with the repo scope.

Personal Access Token

Once you have the token, you can run the runner-setup PowerShell script to create the environment file for the container. This file holds the URL of the repo where the Actions with the self-hosted runner will run, and the personal access token required to connect to the GitHub API.

PowerShell to configure env file

Now run docker-compose up --build and you should see the container being built and started with the GitHub Runner.

Building the container
Actions Runner

You should now be able to see the runner on the repo that you set up the env file for.

Now let us create a new workflow that utilises the self-hosted runner. The key thing here is to specify the self-hosted runner in “runs-on”.

Sample Workflow

Since this workflow uses the workflow_dispatch trigger, you can run it manually. If you now run the workflow, you should start to see the logs appear on the container’s console.
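
For reference, a workflow that targets the self-hosted runner would look roughly like this (the job content is illustrative only):

name: test-self-hosted-runner
on:
  workflow_dispatch:        # allows the workflow to be run manually

jobs:
  build:
    runs-on: self-hosted    # run on the containerised runner instead of a GitHub-hosted one
    steps:
      - uses: actions/checkout@v2
      - name: Show runner name
        run: echo $env:COMPUTERNAME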

Action Logs

With the latest version of the Docker extension for VSCode, you can also explore the container’s file system, which is quite handy, in case you want to inspect the artifacts produced by the workflow.

Exploring container’s file system

I hope this tip will help you when you are stuck building workflows with Power Platform Actions.

Quickly scaffolding out your new PCF Component in GitHub

tl;dr; You can use PCF Field Template and PCF Dataset Template to quickly create your new PCF Component in GitHub.

It is not an overstatement to say that there is still a lot of buzz around PCF components, even though there are no monthly announcements like there are for canvas apps or Power Automate. Just look at the sheer number of components on pcf.gallery and you will be amazed by the creativity of the community.

To help people create new PCF components, you have the Yeoman PCF Generator to do the scaffolding. The Power Apps community is significantly lagging behind in using Dev Containers, and believe me, by 2022 Dev Containers and GitHub Codespaces will be so commonplace that we won’t blink an eyelid.

With that in mind, I thought it would be good if I could move the scaffolding process entirely to GitHub rather than doing it locally. The PCF CLI is not cross-platform; this is the blocker for Codespaces at the moment, but it is being worked on.

Chat Transcript from Ignite

There are two template repos: PCF Field Template and PCF Dataset Template.

In order to create a new PCF Repo, just click the “Use this Template” button.

When you create a new repo from the template, use the convention “PCF-ComponentName-Component“. The important thing is to have the hyphens in the repo name, otherwise you will have issues when scaffolding out the repo and the PCF CLI will throw an exception.

Your new repo will also have a GitHub workflow to build the component and release the solution. It can be run manually. It will also run automatically when you push a tag that begins with “v”, e.g. v1.5, to the repo. You can refer to https://git-scm.com/book/en/v2/Git-Basics-Tagging for information on Git tags.

I hope this will help people create new PCF components.

Developing PCF Components inside a container

Before anyone can start developing PCF components, they have to prepare their local machine and get it ready. This usually means that a whole bunch of tools and frameworks have to be installed. As per the Microsoft documentation, you need these prerequisites:

  1. Node JS
  2. .NET 4.6.2 Developer Pack
  3. Power Apps CLI
  4. Visual Studio Build Tools (for Solution packaging)

What if you could start with just VSCode without wasting time on any of this, with a bit of templating thrown in? It is possible, with one secret ingredient: Docker.

When I started this exploration around April, I tried using Linux containers, the reason being that both VS Dev Containers and GitHub Codespaces only support Linux containers. But I faced an issue straight away during the npm build.

npm build error

I tried to edit the files in node_modules to make it work, but I encountered more issues, due to the fact that the PCF CLI specifically targets Windows only. So, I gave up on that idea until about a couple of weeks ago.

Then I thought, why not use Windows containers in Docker rather than Linux containers? My first choice of image was nanoserver, because of the image size, but the nanoserver image did not play nicely when I wanted to install VS Build Tools and Chocolatey, so I ended up moving to Windows Server Core.

Below are the relevant links for the Dockerfile and the image:

  1. Repo – https://github.com/rajyraman/pcf-docker
  2. Image – https://github.com/users/rajyraman/packages/container/package/pcf

Now let us see how you can use this image to start your PCF development.

Step 1: Install Docker Desktop from https://www.docker.com/get-started

Step 2: Once Docker is up and running, right click on the icon and choose “Switch to Windows containers”. The screenshot below shows “Switch to Linux containers” because I am already using Windows containers.

Docker Right Click Menu

Step 3: Choose the folder where you want your source code, and create two items: a folder called src and a file called pcf.env

PCF Folder Structure

Step 4: Add the details about your PCF component to the pcf.env file. These correspond to the command line arguments of the PCF CLI. Below is a sample:

namespace=RYR
name=TestComponent
template=field
publishername=natraj
publisherprefix=ryr
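
These values roughly map to the PCF CLI commands that the container runs for you, something like:

pac pcf init --namespace RYR --name TestComponent --template field
pac solution init --publisher-name natraj --publisher-prefix ryr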

Step 5: Install the Docker extension for VSCode if you want to use the GUI, rather than commands, for working with containers.

Step 6: Run the command below in VSCode’s PowerShell terminal window to create the container from the image in the GitHub Container Registry. I have specified the name of the container as “TestComponent” in the example below, but pick a different name based on your component’s name.

docker run -it -v $pwd\src:c:\src --name TestComponent -p 8181:8181 --env-file pcf.env ghcr.io/rajyraman/pcf

Once you run this command, Docker will automatically pull the image from the registry and start building the local container.

It will also start running the PCF CLI commands inside the container.

Now if you check your local machine, you would see all the files generated by PCF CLI.

You can also see the container up and running.

You can also build the repo inside the container. Confirm that you are in the container’s terminal and not on your local machine’s terminal.

You can also build the solution using MSBuild inside the container.

Since the container’s port 8181 is mapped to the local machine’s port 8181, you can also run npm start and open the test harness on your local machine.

If you want to exit out of the container’s prompt into your host machine’s prompt just run “exit” command.

To see a list of all the containers, including the ones that are not running, run the command below or use the Docker extension.

docker ps -a

If the container is not running, run the command below to start the container and open the terminal. In the example below, the name of my container is “TestComponent”.

docker start -ai (docker ps --filter "name=TestComponent" -aq)

Now, you can just focus on writing your PCF Component without installing all the required pre-requisites on your local machine.

Automating PCF Build using GitHub Actions

Guido runs a great site called pcf.gallery, which lists over 200 PCF components. The components on the site are hosted on GitHub and have solutions that can be installed on CDS. Some components have a managed solution, some have unmanaged, and some have both. Since a lot of the components are open source and hosted on GitHub, I thought it would be good to automate the builds using GitHub Actions and ensure every component has both managed and unmanaged solutions.

GitHub Actions is similar to Azure DevOps. If you want to know more about GitHub Actions and how it compares to Azure DevOps, listen to DevOps and GitHub Actions with Edward Thomson on Hanselminutes. I created a starter repo with a workflow yml that anyone can use in their PCF repo hosted on GitHub. You can access this repo on https://github.com/rajyraman/pcf-actions-starter

To use this workflow in your repo, follow these steps:

  1. Clone this repo and copy the .github folder from the cloned repo into the root of your repo.
  2. Update the msbuildtarget environment variable in the yml to point to the folder containing the “Other” folder (see the snippet after this list). This folder would have been created when you ran the “pac solution init” command.
     PCF YAML
  3. The “Other” folder will have the xml files that will be packaged up into the solution.
     PCF Other
  4. Make sure that you update the value of “SolutionPackageType” in your cdsproj file to “Both”, so that the Solution Packager can build both Managed and Unmanaged Solutions.
     cdsproj
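
As a rough illustration of step 2 (the folder name is a placeholder for your own solution folder), the environment variable in the workflow yml looks something like this:

env:
  msbuildtarget: MyComponentSolution    # the folder that contains the "Other" folder created by "pac solution init"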

Once you make these changes, commit them to your repo. The build will run every time you commit something to the repo. You can download the managed and unmanaged solutions from the artifact area of the build.

PCF Build

Once you are ready to release the solutions, tag the commit using “git tag”. If the tag is in the format v*, e.g. v1.0, v1.0.0 etc., the workflow will run the release steps, which will create a new release and add both the managed and unmanaged solutions to it.
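
For example, to tag the current commit and push the tag so that the release steps run:

git tag v1.0.0
git push origin v1.0.0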

PCF Release

PCF Release Tag
PCF Releases

Both the Solution Name and the Solution Version will be picked up from the Solution.xml file, so as you release new versions, make sure you update this information in the file.

I hope this makes it a little easier to automate your PCF builds and releases on GitHub.

Opening Lookups in Main Form Dialog

Main Form Dialog (MFD) is a new feature in model-driven Power Apps. It was officially released this month (Feb 2020). Along with this, I noticed today that Microsoft has also quietly added a new event that plays nicely with MFD.

The new event is called OnLookupTagClick. This event handler can be used in scenarios where the user clicks on a lookup. Normally this would redirect the page to the lookup record, but you can block the navigation in OnLookupTagClick and instead show the lookup record in a modal popup using navigateTo. Below is sample code showing how to do this. Since I ran the code from the DevTools console, I am using the deprecated Xrm.Page; if you are running this from a form, you should use formContext instead.

Xrm.Page.getControl('parentcustomerid').addOnLookupTagClick(context => {
    // Stop the default behaviour of navigating away to the lookup record
    context.getEventArgs().preventDefault();
    const tagValue = context.getEventArgs().getTagValue();

    // Open the clicked lookup record in a Main Form Dialog instead
    Xrm.Navigation.navigateTo({
        pageType: "entityrecord",
        entityName: tagValue.entityType,
        formType: 2,
        entityId: tagValue.id
    }, {
        target: 2,      // 2 = open as a dialog
        position: 1,    // 1 = centered dialog
        width: {
            value: 80,
            unit: "%"
        }
    });
});

Have a look at the GIF below to understand the UX of this approach.

GIF: opening a lookup record in a Main Form Dialog

Using PowerShell to generate Earlybound classes

Early Bound Generator is a great XrmToolBox tool developed by fellow MVP Daryl LaBar. It helps you generate the strongly-typed classes that can be used in your custom code projects, e.g. plugins, workflows, console apps etc. If you are committing the early bound classes into source control, you should also commit the configuration that is used to generate the classes, and make it easy for the next developer joining the team to generate them without jumping into XrmToolBox and clicking through it manually. In this post, I will present my approach to streamlining this process.

  1. Generate the files once from XrmToolBox with the settings that are optimal for your requirements.
     EBG Settings
  2. Locate the path where all XrmToolBox tools are installed. You can see this by clicking Configuration -> Settings. On the settings page, click on the “Paths” tab and then the “Open XrmToolBox storage folder” link.
     XTB Path
  3. Navigate to Plugins -> DLaB.EarlyBoundGenerator
     EBG Folder
  4. Copy the crmsvcutil.exe.config file into a folder. You will need this for checking into source control. This file won’t have any passwords, so it is safe to check in.
  5. Below is the PowerShell script to run. You should also commit this script into source control. The PowerShell script should be run from the same location where you copied the crmsvcutil.exe.config file in the previous step. The generated early bound classes will be in the folder specified in the “$outputPath” variable, which is “EarlyBoundClasses” in this script. You can change this to match your solution folder structure.
    Write-Output "Start"
    
    $sourceNugetExe = "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
    $targetNugetExe = ".\nuget.exe"
    Remove-Item .\Tools -Force -Recurse -ErrorAction Ignore
    Invoke-WebRequest $sourceNugetExe -OutFile $targetNugetExe
    Set-Alias nuget $targetNugetExe -Scope Global -Verbose
    $connString = "AuthType=OAuth;Username=[LOGINNAME];Integrated Security=true;Url=https://[INSTANCENAME].[INSTANCEREGION].dynamics.com;AppId=51f81489-12ee-4a9e-aaae-a2591f45987d;RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97;TokenCacheStorePath=.\MyTokenCache;LoginPrompt=Auto"
    $outputPath = ".\EarlyBoundClasses"
    $namespace = "Xrm.Entities"
    
    ##
    ##Download EBG
    ##
    ./nuget install DLaB.Xrm.EarlyBoundGenerator.Api
    move .\DLaB.Xrm.*\content\bin\DLaB.*\* .\EBG -Force
    copy .\crmsvcutil.exe.config .\EBG -Force
    Remove-Item .\DLaB.Xrm.EarlyBoundGenerator.Api* -Force -Recurse
    
    .\EBG\CrmSvcUtil.exe /connstr:$connString /generateActions /out:"$outputPath\Actions.cs" /namespace:"$namespace" /codecustomization:"DLaB.CrmSvcUtilExtensions.Action.CustomizeCodeDomService,DLaB.CrmSvcUtilExtensions" /codegenerationservice:"DLaB.CrmSvcUtilExtensions.Action.CustomCodeGenerationService,DLaB.CrmSvcUtilExtensions" /codewriterfilter:"DLaB.CrmSvcUtilExtensions.Action.CodeWriterFilterService,DLaB.CrmSvcUtilExtensions" /metadataproviderservice:"DLaB.CrmSvcUtilExtensions.BaseMetadataProviderService,DLaB.CrmSvcUtilExtensions"
    .\EBG\CrmSvcUtil.exe /connstr:$connString /out:"$outputPath\CrmServiceContext.cs" /namespace:"$namespace" /servicecontextname:"CrmServiceContext" /codecustomization:"DLaB.CrmSvcUtilExtensions.Entity.CustomizeCodeDomService,DLaB.CrmSvcUtilExtensions" /codegenerationservice:"DLaB.CrmSvcUtilExtensions.Entity.CustomCodeGenerationService,DLaB.CrmSvcUtilExtensions" /codewriterfilter:"DLaB.CrmSvcUtilExtensions.Entity.CodeWriterFilterService,DLaB.CrmSvcUtilExtensions" /namingservice:"DLaB.CrmSvcUtilExtensions.NamingService,DLaB.CrmSvcUtilExtensions" /metadataproviderservice:"DLaB.CrmSvcUtilExtensions.Entity.MetadataProviderService,DLaB.CrmSvcUtilExtensions"
    .\EBG\CrmSvcUtil.exe /connstr:$connString /out:"$outputPath\OptionSets.cs" /namespace:"$namespace" /codecustomization:"DLaB.CrmSvcUtilExtensions.OptionSet.CustomizeCodeDomService,DLaB.CrmSvcUtilExtensions" /codegenerationservice:"DLaB.CrmSvcUtilExtensions.OptionSet.CustomCodeGenerationService,DLaB.CrmSvcUtilExtensions" /codewriterfilter:"DLaB.CrmSvcUtilExtensions.OptionSet.CodeWriterFilterService,DLaB.CrmSvcUtilExtensions" /namingservice:"DLaB.CrmSvcUtilExtensions.NamingService,DLaB.CrmSvcUtilExtensions" /metadataproviderservice:"DLaB.CrmSvcUtilExtensions.BaseMetadataProviderService,DLaB.CrmSvcUtilExtensions"
    
    ##
    #Cleanup
    ##
    Remove-Item nuget.exe
    Remove-Item .\EBG -Force -Recurse
    
    Write-Output "Complete"
    

You can see that I use OAuth for authentication. OAuth is one of the mechanisms you can use to authenticate in the connection string. You can refer to https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/xrm-tooling/use-connection-strings-xrm-tooling-connect for additional information. If you don’t like entering your credentials during script execution, you can also use “ClientSecret”, but this needs to be configured in Azure AD first. You can refer to Nishant’s post about this: https://nishantrana.me/2019/08/24/connect-to-dynamics-365-web-api-using-oauth-2-0-client-credentials/
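
For reference, a ClientSecret connection string looks roughly like this (the URL, client id and secret are placeholders for your own app registration):

AuthType=ClientSecret;Url=https://[INSTANCENAME].[INSTANCEREGION].dynamics.com;ClientId=[APP-REGISTRATION-CLIENT-ID];ClientSecret=[CLIENT-SECRET]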

Microsoft has also created an AppId and RedirectUri that you can use on your development or test instances. Below is the disclaimer on the docs page about these.

When using the OAuth AuthType\AuthenticationType
For development and prototyping purposes we have provided the following AppId or ClientId and Redirect URI for use in OAuth Flows.

For production use, you should create an AppId or ClientId that is specific to your tenant in the Azure Management portal.

Sample AppId or ClientId = 51f81489-12ee-4a9e-aaae-a2591f45987d
Sample RedirectUri = app://58145B91-0C36-4500-8554-080854F2AC97

Hope this helps.