In this blog, you'll learn how to get started with Power Platform and Azure DevOps to build an Application Lifecycle Management (ALM) strategy that fits your development team.
When working with Microsoft Power Platform to develop apps and automate processes, we quickly run into the topics of distributing solutions within the company and developing solutions in teams. These are exactly the topics that DevOps and Application Lifecycle Management (ALM for short) address.
There are several tutorials that describe how to implement ALM with Power Platform and Azure DevOps. They explain how to register an app in the Azure portal, configure application users in the target environment, create service principals in Azure, and build pipelines that export Power Platform solutions from an unmanaged (dev) environment, commit them to the source code repository of Azure DevOps, and import them from there into another dev environment (unmanaged) or into a QA or production environment (managed). Sources [1] and [2] cover this well and make life much easier.
Illustration 1: ALM with Azure DevOps (Image taken from source [1])
In the following, I would like to take a look at the advantages and disadvantages of these solutions in different application scenarios and focus on citizen developers, professional developers, professional and fusion development teams. Based on the prerequisites of each group, I discuss pitfalls of the solutions described and offer options on how to avoid them.
First, I will turn to the scenario of citizen developers, which is the one most likely covered by the solutions in the sources mentioned above. What these approaches have in common is that they are set up within a single Azure DevOps, Power Platform, or M365 organization. In a real-world setting, this means your company's productively used organization.
Of course, access to and functionality of the Power Platform can be restricted via governance mechanisms, protecting your company from data loss or high licensing costs caused by employees' lack of awareness. Secured in this way, a group of citizen developers can certainly be offered the opportunity to automate their own processes with the Power Platform and to build apps that are useful to their department and produce value. However, this protection is limited: it essentially consists of protecting data in Dataverse through appropriate role and permission assignments, and of defining DLP policies that restrict the use of connectors.
To decide whether a connector can be used by all employees without concern, you should know what these connectors actually allow. An example:
"An employee of a department with an M365 license uses the included Office 365 Outlook connector to automatically clean up their own mailbox and a shared mailbox. To do this, the employee creates a Power Automate flow that moves emails to different folders and deletes irrelevant ones."
The problem: people make mistakes, and developing a cloud flow for automation always carries the risk of data loss due to human error. We know from experience that mistakes happen in every development project. In this scenario, a mistake made while developing the Power Automate cloud flow would directly impact the company's productive M365 systems.
To prevent such a scenario, using an isolated Power Platform or M365 organization for development is a sensible and proven approach. Just because we are now doing low-code/no-code development with Microsoft Power Platform, and because virtually every technically savvy person can do it on a small scale, we do not have to throw all the paradigms of professional software development overboard. Even with Power Platform's built-in export and import mechanisms for solutions, a solution can be exported from an environment in one organization and imported into an environment of another organization. And with Azure DevOps, this setup can also be integrated into an ALM process.
For example, if we extend the scenario from [1] to include an isolated Power Platform organization for development purposes, we get the following scenario:
Illustration 2: ALM with isolated dev organization
However, to realize this, we need to make a few adjustments.
First, we need an additional app registration in the dev organization. This app registration must be created with the "Accounts in any organizational directory" (multi-tenant) option.
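Using the Azure CLI, this registration could be created as follows. This is a minimal sketch: the display name is a placeholder, and `AzureADMultipleOrgs` is the CLI value that corresponds to the "Accounts in any organizational directory" option in the portal.

```shell
# Sign in to the isolated dev organization's tenant (placeholder tenant ID).
az login --tenant <dev-tenant-id>

# Create the multi-tenant app registration; the display name is illustrative.
az ad app create \
  --display-name "PowerPlatform-ALM-Dev" \
  --sign-in-audience AzureADMultipleOrgs
```

The app registration can of course also be created in the Azure portal; the CLI variant just makes the setup reproducible.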
A service principal, as described in [1], is then added as an application user to the respective Power Platform dev environment of the isolated dev organization, using the app registration you just created.
Next, a service connection to the dev environment is added in the Azure DevOps project of the production organization. This is done as described in [1], with the difference that the service connection uses the Tenant ID, Application ID, and Client Secret of the app registration in the isolated dev organization.
With this change, a solution can now be exported unmanaged from the dev organization and deployed into a UAT environment of the productive organization; the ALM setup thus spans multiple organizations. Using the appropriate pipelines, both citizen developers and professional developers can develop their solutions without risk in isolated environments and transfer the stable result into the productive organization with simple means.
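An export pipeline along these lines could be sketched as follows. The task names are those of the Power Platform Build Tools; the service connection name and the `SolutionName` variable are placeholders you would adapt to your project.

```yaml
# Sketch of an export pipeline against the isolated dev organization.
trigger: none

steps:
  - task: PowerPlatformToolInstaller@2
    displayName: Install Power Platform Build Tools

  - task: PowerPlatformExportSolution@2
    displayName: Export unmanaged solution from the isolated dev organization
    inputs:
      authenticationType: PowerPlatformSPN
      # Service connection configured with the dev tenant's
      # Tenant ID, Application ID, and Client Secret:
      PowerPlatformSPN: 'DevOrg-ServiceConnection'
      SolutionName: '$(SolutionName)'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/$(SolutionName).zip'
```

A corresponding import pipeline in the productive organization would use its own service connection pointing at the UAT environment.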
One capability that has been added since article [1] was written: when exporting a Power Platform solution, the msapp files of canvas apps can now be unpacked automatically, and repacked accordingly on import.
I recommend doing this, because the msapp files of canvas apps are binary files that cannot be meaningfully diffed or merged in source control. This leads to unnecessary conflicts in team collaboration. Furthermore, I recommend adding the msapp files to the .gitignore file so that they never find their way into the repository in the first place.
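Assuming the unpacked canvas app sources are committed alongside the rest of the solution, the corresponding .gitignore entry is a one-liner:

```text
# Keep binary canvas app packages out of the repository;
# only the unpacked YAML/JSON sources are versioned.
*.msapp
```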
Changes to canvas apps can then be compared via the unpacked YAML source files. One caveat remains, however: at the time of writing, Microsoft still marks this capability as a preview feature.
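With the Power Platform CLI, the same unpacking can be done locally. A minimal sketch, assuming `pac` is installed and the solution zip has already been exported (file and folder names are placeholders):

```shell
# Unpack the exported solution and additionally unpack the contained
# canvas apps into YAML sources (--processCanvasApps is a preview feature).
pac solution unpack \
  --zipfile ./MySolution.zip \
  --folder ./src/MySolution \
  --processCanvasApps
```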
Sharing canvas apps in the Power Platform target environment is another challenge that the Power Platform Build Tools do not cover out of the box. While permissions for Power Platform environments can be managed through corresponding Power Platform Build Tools tasks, this does not apply to canvas apps. The first time a canvas app is imported into a target environment via a service principal, the service principal's application user is registered as the owner of the canvas app in that environment. For the app to be used by other users, it must be shared with them. This can be achieved in two ways.
A user who has the System Administrator role in the target environment can add the appropriate roles and users to the canvas app as users or co-owners. This is done manually via the sharing dialog that Power Platform provides.
Alternatively, the admin module of the Power Apps PowerShell cmdlets for app creators, administrators, and developers can be used. The required calls can be added as a PowerShell script task to the corresponding Azure DevOps import pipeline. The Add-PowerAppsAccount cmdlet signs the service principal in to the Power Platform; as an application user with the System Administrator role in the environment, it already has the necessary permissions to share the canvas app. The Set-AdminPowerAppRoleAssignment cmdlet is then used to add the role or user to the canvas app. See also [4] and [5].
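Such a script task could look roughly like this. This is a sketch, assuming the Microsoft.PowerApps.Administration.PowerShell module is installed on the agent; the environment variables, app ID, environment ID, and security group are placeholders.

```powershell
# Sign the service principal in (credentials supplied as pipeline secrets).
Add-PowerAppsAccount -TenantID $env:TENANT_ID `
    -ApplicationId $env:APP_ID `
    -ClientSecret $env:CLIENT_SECRET

# Share the imported canvas app with a security group as viewers.
# $appId, $environmentId, and $securityGroupId are placeholders.
Set-AdminPowerAppRoleAssignment -AppName $appId `
    -EnvironmentName $environmentId `
    -RoleName CanView `
    -PrincipalType Group `
    -PrincipalObjectId $securityGroupId
```

Using a security group rather than individual users keeps the pipeline stable when the audience of the app changes.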
Source code management is probably not an issue for citizen developers, but it is a bigger one for professional developers. Article [1] describes how to export a solution from the Power Platform development environment, commit it to the source code repository of the Azure DevOps project, and use that repository as the source for importing into the target environment.
This works well when a single developer works alone on the respective branch (main or another) of a Power Platform solution. However, if a team works on the same Power Platform solution, and possibly on the same branch, you will quickly find that this approach is unsuitable. The reason lies in the Azure Pipelines.
The import of the solution from the source code repository is done via a build pipeline that is tied to a specific commit. The export pipeline, however, always compares the changes against the current state (the last commit) of the repository. For a team member, this means that the state against which their own changes are compared does not necessarily correspond to the state from which they started. Consequently, all changes committed by other team members between these two states are not recognized as changes and are overwritten when the changes are committed.
This problem can either be counteracted by a suitable branching strategy or the comparison with the source code repository is carried out via a local clone of the repository, as in conventional software development processes. A hybrid form of both approaches is also conceivable.
If you choose the branching strategy, you do not solve the fundamental problem of concurrent comparisons against the source code. It is more of a workaround in which every developer, whether citizen or professional, works on their own dedicated branch, and changes are merged back into the main branch via pull requests (PRs). Because each developer uses their own branch, the code base seen by the import and export pipelines does not change underneath them, and the branch preserves its relationship to the main branch, so changes can be synchronized at any time without loss. Further information on branching strategies can be found in [3].
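Locally, this workflow boils down to a few ordinary git operations. The following minimal simulation (repository, branch, and file names are purely illustrative) shows the per-developer branch plus merge-back pattern:

```shell
set -e
# Illustrative repository standing in for the Azure DevOps repo.
git init -q alm-demo
git -C alm-demo config user.email "dev@example.com"
git -C alm-demo config user.name "Dev"
git -C alm-demo checkout -q -b main

# Initial state committed by the export pipeline.
echo "exported solution state" > alm-demo/Solution.xml
git -C alm-demo add Solution.xml
git -C alm-demo commit -qm "Initial solution export"

# Each developer works on an isolated branch...
git -C alm-demo checkout -q -b feature/contact-form
echo "contact form added" >> alm-demo/Solution.xml
git -C alm-demo commit -qam "Add contact form"

# ...and changes flow back to main via a pull request
# (simulated locally as a non-fast-forward merge).
git -C alm-demo checkout -q main
git -C alm-demo merge -q --no-ff feature/contact-form -m "PR: merge contact form"
```

In Azure DevOps the final merge would of course happen through a reviewed pull request rather than a local merge.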
If you choose the local clone of the source code repository as an option to avoid data loss in team collaboration, you will get a different scenario, which is shown in the following figure.
Illustration 3: ALM with isolated dev organization and local source code repository clones
The main difference from the previous scenarios is that the import and export of Power Platform solutions for the development environments is no longer handled via Azure DevOps pipelines. Instead, the Power Platform CLI (pac) is used locally; it offers the same functionality as the Power Platform Build Tools in Azure DevOps. Thanks to the local clone of the repository, the code base underlying changes to a solution is always available, and a lossless comparison with the source code repository in Azure DevOps is possible at any time. If conflicts occur, they can be resolved locally.
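The local developer loop could look roughly like this. This is a sketch, assuming the Power Platform CLI is installed; the connection name, environment URL, solution name, and credential variables are placeholders.

```shell
# Authenticate against the isolated dev environment once.
pac auth create --name DevOrg \
    --url "https://yourorg-dev.crm.dynamics.com" \
    --applicationId "$APP_ID" \
    --clientSecret "$CLIENT_SECRET" \
    --tenant "$TENANT_ID"

# Export the unmanaged solution from the dev environment...
pac solution export --name MySolution --path ./MySolution.zip

# ...unpack it into the local clone of the repository...
pac solution unpack --zipfile ./MySolution.zip --folder ./src/MySolution

# ...then diff, commit, and push with ordinary git commands.
git add src/MySolution
git commit -m "Update MySolution"
git push
```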
This approach is better suited to professional developer teams, since working with source code repositories is an everyday tool for them.
Basically, a suitable ALM strategy can be found and implemented with Azure DevOps for every Power Platform project and every organization, for citizen developers as well as professional developers. My favorite, depending on the customer project and its complexity, is one of the variants with an isolated dev organization. The source code strategy depends largely on the complexity of the implementation project and on whether development takes place in a purely professional team or in a fusion team. It is inevitable, however, that citizen developers will need support when working with source code repositories: both binding their development environment to its own branch and merging changes into the main branch via pull requests (PRs) are tasks from the professional software development process that can overwhelm many a citizen developer.
This article describes some technical possibilities and options on how to design a successful ALM strategy in a software development project using the Power Platform. However, the key to a successful ALM strategy lies in close cooperation and regular communication within the team, regardless of whether it is a professional development team or a fusion development team.
[2] https://learn.microsoft.com/de-de/power-platform/alm/devops-build-tools