suparna-banerje

Detail Step-By-Step - Power Platform ALM with Azure DevOps

Introduction

 

  • Azure DevOps Repositories can be used as Source Control for Power Platform Solutions
  • CI/CD can be implemented using Azure DevOps Pipelines
  • We can use Microsoft Power Platform Build Tools to automate common build and deployment tasks for apps built on Microsoft Power Platform. There are two versions of the Build Tools: the older version based on PowerShell, and the latest version based on the Power Platform CLI

 

Power Platform ALM with Azure DevOps Process Flow

 

To implement a CI/CD process with Azure DevOps Pipelines, we store the Power Platform solution in source control. There are two main paths:

  1. Export the unmanaged solution and place it unpacked in the source control system. The build process imports the packed solution as unmanaged into a temporary build environment, then exports the solution as managed and stores it as a build artifact in your source control system.
  2. Export the solution as unmanaged, export it again as managed, and place both in the source control system.

[Fig 1: process flow diagram]

 

In this article, we will show the steps to achieve Option 2 above; Fig 1 depicts the process.

 

1. Set Up Source and Target Environments

 

Create the source and target environments. Both should have a Dataverse database enabled. Create an unmanaged solution in the source environment.

 

2. Set Up Azure DevOps

 

  1. Create an Azure DevOps organization
  2. Create a Project within it
  3. Create a Repository to hold the source code
  4. Install Microsoft Power Platform Build Tools into your Azure DevOps organization from the Azure Marketplace
  5. If you are using Azure DevOps Pipelines for the first time, request parallelism using the link https://aka.ms/azpipelines-parallelism-request
  6. Within the Project, navigate to Project Settings > Repositories > Security tab. Under User permissions, make sure the Contribute permission is set to Allow for Project Collection Service Accounts (under Azure DevOps Groups) and for <ProjectName> Build Service (<OrgName>) (under Users)


 

 

3a. Create Azure DevOps Pipelines with Application ID and Client Secret

 

Create Azure AD App Registration

 

  1. Go to https://portal.azure.com
  2. Search for App Registration, click New Registration
  3. Provide the name, keep other fields with default value and click Register
  4. Once the App is created, go to API Permissions, click Add a Permission > select Dynamics CRM > Add Permission > Grant Admin Consent for <tenant>
  5. Go to Overview > Client credentials > New client secret. Copy the value into a notepad, as you will need it later and won't be able to retrieve it once you navigate away from this page.
  6. Come back to Overview and copy the Application (client) ID and Directory (tenant) ID into the same notepad. You will need these 3 values while creating the service connection.

 

Add the service principal as an App user in the Power Platform source and destination environments.

 

  1. Go to the Power Platform Admin Center > Environments
  2. Select your source environment
  3. From the right navigation, Users > See all > App users list
  4. Click New app user > search for the App created in the previous step > add it and assign the System Customizer or System Administrator role
  5. Repeat all the steps above for the destination environment

 

Create Service Connection with Application ID and Client Secret

 

  1. Go to your Azure DevOps Project and click Project Settings.
  2. Under Pipelines, click Service Connections > New Service Connection > select Power Platform
  3. Select the authentication method Application ID and client secret
  4. Go to make.powerapps.com > select your source environment > go to Settings > Session details > copy the Instance URL and paste it under Server URL
  5. Paste the Tenant ID, Application ID, and Client Secret saved earlier
  6. Save the service connection with the name “Dev Service Principal”
  7. Follow steps 2 to 6 above; this time get the destination environment URL, create the service connection, and save it as “Prod Service Principal”

 

Create Pipeline – Export from Source

 

i.  From the left navigation within the Project, click on Pipelines >New Pipeline>Use the Classic Editor

ii.  Select the Source as Azure Repos Git, select your Project, Repository and Branch and click continue


 

iii.  Under select template, start with Empty job


 

 

iv.  Click Agent Job 1 and make sure Allow Scripts to access OAuth token is checked


 

 

v.  Add the task Power Platform Tool Installer with task version 2


 

 

vi.  Add the task Power Platform Export Solution. Here we add the task to export the solution as unmanaged


 

 

For Service Connection, select Service Principal > select Dev Service Principal from the dropdown

Provide your solution name (not the display name)

Solution output file name: $(Build.ArtifactStagingDirectory)\<SolutionName>.zip

Uncheck Export as Managed Solution

 

vii.  Copy the above task. This time we are exporting the managed solution. Keep all settings the same; only check the box Export as Managed Solution and change the Solution Output file name to $(Build.ArtifactStagingDirectory)\<SolutionName>_managed.zip
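For readers using YAML pipelines instead of the classic editor, the export steps above can be sketched roughly as follows. This is a sketch, not the article's exact configuration: the task and input names follow version 2 of the Power Platform Build Tools, and "Dev Service Principal" and MySolution are placeholders for your own service connection and solution name.

```yaml
# Install the Power Platform CLI-based tooling on the agent
- task: PowerPlatformToolInstaller@2

# Export the solution as unmanaged
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Dev Service Principal'   # service connection name (placeholder)
    SolutionName: 'MySolution'                  # logical name, not the display name
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
    Managed: false

# Export the same solution as managed
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Dev Service Principal'
    SolutionName: 'MySolution'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\MySolution_managed.zip'
    Managed: true
```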

 

viii.  Add the task Power Platform Unpack Solution


 

 

Solution Input File -$(Build.ArtifactStagingDirectory)\<SolutionName>.zip

Target Folder to Unpack Solution - $(Build.SourcesDirectory)\<SolutionName>

Type of Solution – Both
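In YAML form, the unpack step above would look roughly like this (a sketch; input names per Power Platform Build Tools v2, with MySolution as a placeholder for your solution name):

```yaml
- task: PowerPlatformUnpackSolution@2
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\MySolution'
    SolutionType: 'Both'   # unpack both the managed and unmanaged representations
```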

 

ix.  Add a Command Line Script task and paste the script below

 

echo commit all changes

git config user.email "<email>"

git config user.name "<user name>"

git checkout -B main

git add --all

git commit -m "code commit"

git push --set-upstream origin main
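Note that if you run this same commit step from a YAML pipeline rather than the classic editor, the push typically needs explicit authorization via the pipeline's access token; a hedged sketch of the equivalent task (this System.AccessToken approach is also reported to work by readers in the comments):

```yaml
- task: CmdLine@2
  inputs:
    script: |
      echo commit all changes
      git config user.email "<email>"
      git config user.name "<user name>"
      git checkout -B main
      git add --all
      git commit -m "code commit"
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --set-upstream origin main
```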

 

x.  Save and queue the pipeline and wait for it to finish

xi.  Check the repository for the unpacked source code

 


 

 

Create Deployment Settings File

 

  1. Open Visual Studio Code
  2. Install PAC CLI
  3. Run the command below to export the solution to your local machine

 

pac solution export --name <solutionname> --path .\ --managed false

 

     iv.  Run the command below to create the Deployment Settings file

 

pac solution create-settings --solution-zip .\<SolutionName>.zip --settings-file <SolutionName>.json

    v.  Update values in the Deployment Settings file for the target environment

    vi.  In the repository, create a folder named Settings, create a file <SolutionName>.json within it, and copy the text from the Deployment Settings file into it
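For reference, a generated deployment settings file has roughly this shape (the schema names and values below are placeholders; fill in the values that belong to your target environment):

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "prefix_MyVariable",
      "Value": "<value for the target environment>"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "prefix_sharedsharepointonline_xxxxx",
      "ConnectionId": "<id of a connection that exists in the target environment>",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]
}
```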


 

 

Create Build Pipeline

 

  1. Create a new Pipeline with Classic Editor > Empty Job
  2. Add the task Power Platform Tool Installer
  3. Add the task Power Platform Pack Solution


 

Source Folder of Solution to Pack -Select Folder by clicking 3 dots

Solution Output File -<SolutionName>.zip

Type of Solution -Both
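A YAML sketch of the pack step above (input names per Power Platform Build Tools v2; the folder and file names are placeholders):

```yaml
- task: PowerPlatformPackSolution@2
  inputs:
    SolutionSourceFolder: '$(Build.SourcesDirectory)\MySolution'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
    SolutionType: 'Both'   # produces both the unmanaged and the managed zip
```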

  iv.  Add the task Copy Files


 

Source Folder -Settings

Contents -**

Target Folder - $(Build.ArtifactStagingDirectory)

 

  v.   Add a task Publish Artifact

 


 

 

Path to publish - $(Build.ArtifactStagingDirectory)

Artifact Name – drop
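The copy and publish steps above can be sketched in YAML as follows (standard Azure DevOps tasks; the Settings folder path assumes the repository layout created earlier):

```yaml
# Copy the deployment settings into the artifact staging folder
- task: CopyFiles@2
  inputs:
    SourceFolder: 'Settings'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

# Publish everything as the 'drop' artifact
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```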

 

  vi.   Save and Queue the Build Pipeline

 

Create Release Pipeline

 

  i. From the left navigation, click Releases > New Release Pipeline > start with an Empty Job
  ii. Add the artifact created in the Build pipeline: select your project and build pipeline, and set the Source Alias to drop
  iii. Under Stages > Stage 1, click the job
  iv. Add the task Power Platform Tool Installer
  v. Add the task Power Platform Import Solution
  vi. Service Connection: select the service connection you created for Prod
  vii. Solution Input File: click the 3 dots and select the managed zip file for Prod
  viii. Check Use Deployment Settings File and select the Deployment Settings file by clicking the 3 dots
  ix. For Prod, under Advanced, check Import as a Managed Solution
  x. Save the pipeline and create a Release
  xi. Check that the solution has been deployed properly to Production
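The import stage can be sketched in YAML as follows. This is a hedged sketch: input names follow Power Platform Build Tools v2, while "Prod Service Principal" and the artifact paths are placeholders you should adjust to your release's artifact alias and folder structure.

```yaml
- task: PowerPlatformToolInstaller@2

- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Prod Service Principal'
    # adjust these paths to match your artifact alias/folder layout
    SolutionInputFile: '$(System.DefaultWorkingDirectory)/drop/MySolution_managed.zip'
    UseDeploymentSettingsFile: true
    DeploymentSettingsFile: '$(System.DefaultWorkingDirectory)/drop/MySolution.json'
```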

 

3b. Create Azure DevOps Pipelines with Managed Identity

 

    1. Create a VM Scale Set and assign a Managed Identity
    2. Create a self-hosted agent pool and point it to the VM Scale Set
    3. Add the managed identity as an App user in the Power Platform source and destination environments
    4. Create a Service Connection with the managed identity
    5. Create Build Pipelines (Export from Source and Build Solution) and a Release Pipeline (Deploy to Destination), using the previously created self-hosted agent pool

 

Please follow this blog for detailed steps.

 

 

 

Comments

Hi Suparna,

 

Great article. We are trying to set up DevOps process for the power platform in our Agency. We are using our on-prem Azure DevOps server. I was able to export the solution from our dev power apps environment and create a build. But when trying to deploy the build artifact to the test environment I am getting the error below.

 

Error: RetrievePrivilegeForUser: The user with id {GUID} has not been assigned any roles. They need a role with the prvReadAsyncOperation privilege.

 

We do not see any user with the id {GUID} in our AAD. Not sure how to resolve this issue. Any help or suggestion will be appreciated. 

 

Thank you,

Subodh

I could export the solution to target environment, but not able to play the app. It's happening only when I'm exporting solution through Pipelines and working well when exporting manually. Please look into this issue.

@sxm0275  Did you add the Service Principal as an App user to the destination environment with at least the System Customizer role? Please go to portal.azure.com > App Registrations, search for the Service Principal you created for this pipeline, and check that its Application (client) ID matches the GUID shown in the error.

@angaravlgs When you export the App using pipelines with a Service Principal, the Service Principal (App user) becomes the owner of the App; any other user will get access denied because the App is not shared with them. You need to log in as an Admin and share the App with the users, including the Admin. When you import manually, the App owner is the user doing the import, so he/she can access the App.

@suparna-banerje The service principal was missing the System Administrator/Customizer role in the destination. Adding the roles got it working. Thank you. 

Hi @suparna-banerje 

 

Your method worked and the pipeline went through without any errors. However, in the target environment, I could not see the connection established from the deployment settings file. I had to manually edit the app and configure the connection. Is there anything I am missing?

Anonymous

This is giving us issues between the unmanaged and managed environment. I am able to package, export, import and deploy but it is not behaving the same way in both environments which is beyond frustrating when trying to move your application through the ALM and out to production.

 

Also, the Dataflows are not easily migrated and the connections between services that are dev/test/prod are not easily switched. It makes for further headaches when it should be seamless.

@Anonymous Can you please provide more details on what is behaving differently in 2 environments and any other specific issue you are facing?

@Kavya1 Are you having this issue for all connections or only for some? If only some, can you please let me know which connections are having this problem?

@suparna-banerje I tried for SQL connection and got to know there is no option to create an environment variable for SQL database. 

Anonymous

@suparna-banerje It seems that the issue is around the Navigate variables that are being created between screens. The lead product engineer shows in the development environment, but not in test. It is being looked up using a nested gallery (a gallery within a gallery item). So it looks up the order information in a gallery and then brings in the lead product engineer information in a nested gallery within that gallery. Going to try and make it a global variable with the Set command and see if that fixes the issue.

Helper III

Thanks a lot for the post, @suparna-banerje .

 

However, I have a question: How do I handle plugin assemblies when packing my solution?

When I unpack, the .dll files are exported as ".dll.data.xml", which will not work when using the "Pack Solution" task.

 

I understand I must somehow "map" the .dll file using this option in the Pack Solution task. But how is this done? Do I need to pack both the solution and the plugin component separately and use the "Add Solution Component" afterward for the plugin part?

 

Currently my "Pack Solution" task is failing because it is trying to find the .dll file for my plugin, but the repository only contains the ".dll.data.xml" exported from the "Unpack Solution" task.

 

So my question is, how do I map this .dll file correctly so my "Pack Solution" task works? 🙂

Unfortunately, the documentation I have found isn't really helping me understand this step.



 


Cheers

Is it possible to perform the step Create Deployment Settings File as part of the DevOps task to fully automate the process? What changes need to be made to achieve this? Thanks

@JLKM 

Yes, you can perform the step "Create Deployment Settings File" as part of the DevOps pipeline. You can use the PowerShell task to achieve it.
For example:

  - powershell: |
      Write-Output "Creating SolutionSettings"
      $env:PATH = $env:PATH + ";" + "$(pacPath)"
      pac solution create-settings --solution-folder $(unpackaged_folder) --settings-file "SolutionSettings/settings$(solutionname)Dev.json"
    displayName: 'Generate SolutionSettings.json'

I also used PowerShell to check if the solution settings files also exist for UAT and Prod. If they exist, they can be updated via PowerShell. Sadly, you still need to set the properties yourself once. Maybe a Canvas App could help here to make it more user friendly. 

I hope this helps. 





I am using Patches in MS Dynamics. Did any of you experience the CI/CD process including patches? 

I would like to know if the opportunities for this use case are limited or not by Microsoft.

Thanks for this article @suparna-banerje 

 

When committing the unmanaged solution to the repo using a yaml pipeline, I found that you also need to include a git fetch command to bring the branches into the environment and update the branch after the initial commit.

Hi @suparna-banerje , thank you for the article; it's helpful for a lot of us moving forward in automating the deployment. However, I am facing an issue wherein the environment variables and connection references are not getting updated with the new values in the destination. They are still retaining the old values. Any inputs on where I need to check, as I have followed all the steps in the article?

Below are my deployment settings entries. All these variables and connections are still pointing to Dev, even though the solution is imported successfully without any errors.

{
  "EnvironmentVariables": [
    {
      "SchemaName": "cr5f7_varListName",
      "Value": "NewEmployeeOnboardingUAT"
    },
    {
      "SchemaName": "cr5f7_varSiteURL",
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "cr5f7_sharedsharepointonline_8d82e",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]
}

Hi @Akash_S 

 

Environment Variables:

Did you remove current value before deployment?

 

Doing so ensures that the default value is moved over to the destination/target environment. That is, the default value should hold the value meant for the destination/target environment.

 

Connection Reference:

Did you ensure there is an equivalent connection reference in the destination/target environment before deployment?

 

Create one if there isn't, or deployment will fail.

 

Just my two cents.

@JLKM - Yes current value is cleared in the source environment for all variables.

For connections as well, there is a connection already created in the destination and I updated the same in the deployment settings file. 


I've imported the solution into the target environment with empty values in the ConnectionID field in the DeploymentSettings.json. However, when I try to pre-populate or update values in the Deployment Settings file for the target environment, I encounter a generic error. 



I attempted to manually add the connection in the target environment and even created a canvas app with a gallery to verify the connection references, but the connection ID is empty.


 

How can I resolve this? TIA

 

@eudaimonia_ If you are deploying through service principal, can you please try sharing the connection in the target environment with the service principal, then add the connection id in the deployment settings file and try to deploy?

@Akash_S Are you still facing the issue regarding no update of environment variables in target env? There are some recent Product updates on this, can you please try again?

@badovinacs  This worked for me while committing using yaml pipeline

 

- task: CmdLine@2
  inputs:
    script: |
      echo commit all changes
      git config user.email "<email>"
      git config user.name "<user name>"
      git checkout -B main
      git add --all
      git commit -m "solution init"
      echo push code to new repo
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --set-upstream origin main

Hello @suparna-banerje 

 

I have followed the steps outlined here, but in the Build Solution process-Pack Solution, I am getting this error: "Error: Cannot find required file 'D:\\a\\1\\s\\mysolution\\Other\\Customizations.xml'."

 


 

My pipeline tasks are as follows: 


 

My 1st pipeline completed successfully and added to my repo as shown below.  The Customizations.xml file exists; just not sure what I am doing wrong.  Can you please advise?

 


 

 

@talyrobin1  Why is it searching inside the folder mysolution? In your screenshot I can see that your folder name is TaskTrackerSolution.zip. You may click the 3 dots and select the source folder. Can you please check the SolutionName variable in your Export and Build pipelines? Ideally it would be the name of the solution (not the display name), without .zip.

@suparna-banerje , thank you for the reply.

I checked my variable and it is correct.  My issue was that when I unpacked the solution in the PP Unpack Solution tasks, I used $(SolutionName).zip.  However, I was trying to unpack only $(SolutionName) (no .zip).  I have fixed this and it works.

Thanks so much for pointing me in the right direction!  Much appreciated.

Great article, extremely helpful. Everything runs fine, but I have been trying to achieve something that is not working:

The idea is that multiple developers can push their code to their own branches in Azure DevOps, and then the branches will be merged for deployment.

I have this set of commands in my YAML. The YAML is on my main branch, and I am trying to push the solution to a custom FeatureBranch. It works if I run the pipeline from FeatureBranch but not from the main branch. 

      echo commit all changes
      git config user.email "<email>"
      git config user.name "<user name>"
      git checkout -B FeatureBranch
      git add --all
      git commit -m "solution init"
      echo push code to new repo
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --set-upstream origin FeatureBranch

@fannazephyr You may refer to the ALM accelerator yaml for export solution and push to git, it has the code for pushing to a remote branch https://github.com/microsoft/coe-alm-accelerator-templates/blob/main/Pipelines/Templates/export-Solu... 

@suparna-banerje Thanks for your response, yes indeed i have been trying to get around it using ALM code but no success yet. But i will keep trying i guess 🙂 

Hello, do we need a paid Azure DevOps subscription for ALM? Our customer is a data-sensitive organization, so I am not confident with the free version of Azure DevOps.

Hello @suparna-banerje 

 

Greetings! 
Great article, and it will definitely be helpful to everyone who works in Power Platform.


Is there a possibility to integrate the same with GitHub, as with Azure Repos?

 

Please advise! 

Thank you again

Hello Suparna

 

Great article. I followed it and the related discussion as well, and was able to implement it without any issue.

I have one basic question: suppose after creating the new release pipeline, I make some changes in my existing Power App whose solution already exists in DevOps. How does the solution get updated in DevOps automatically? Do we need to set up some kind of schedule, or is there another process to follow?

 

Thanks

Avian

@AvuanDecosta You can either manually trigger the Export pipeline after each change, or schedule the Export pipeline from the Triggers section, each export will bring the latest changes. 

Hello Suparna,

 

Thanks for the reply. As you explained about manually triggering or scheduling the export pipeline, it would be great if you could share an article or one of your blogs where you explain the schedule/manual process.

 

How will a scheduled job check that the solution has some changes? Because a scheduled job will always run and always update the pipeline. I think it is not possible to manage this action through a scheduled job; this can be done through a manual job.

 

Thanks

Avian

Hi @suparna-banerje ,

 

Thanks for the article. Could you please let me know if this works for multi-tenant deployment?

I have separate dev and QA tenants.

 

Thanks

@vinayaknair2802 Yes, it works for multi-tenant deployment; provide your tenant ID and app registration details while creating the service connection.

@suparna-banerje,

 

Any plans to provide detailed steps for Managed Identity just like you provided for the Service Connection?

I have created the VM Scale Set, created and assigned a managed identity to it, added the managed identity as an app user to my Dynamics instance, and created the Power Platform service connection in Azure DevOps (it does not have many options other than providing the Dynamics org URL).

But when I create pipelines, the queue never runs. It sits in the queued state forever.

The pipeline runs without error, but does not seem to create any folders/files in my repo. Any help appreciated. 

Make sure about your subscriptions.

Hi @suparna-banerje

 

I am not seeing the same steps to create a new pipeline - here is what I see:


 

Show More doesn't have anything relevant to Power Platform, so I clicked Starter Pipeline (there was no option to select the classic editor or to start with an empty job), and this is what I see:

 


 

After saving and running as is, I am in a detached HEAD state


 

 


 

Here is my question:

 

After following your steps to add the Tool installer, Exporting the solution, and Unpacking it, how do I commit?

 

I tried this:

 

- task: CmdLine@2
  inputs:
    script: |
      echo commit all changes
      git config user.email Joe@blahblahblah.onmicrosoft.com
      git config user.name 'Automatic Build'
      git checkout -B main
      git add --all
      git commit -m 'code commit'
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --set-upstream origin main
    workingDirectory: '$(System.DefaultWorkingDirectory)'

 

And I'm getting this:

 

"Nothing to commit, working tree clean"

 

Thoughts?

 

Joe

 

Why do we need to unpack and then pack the solution again? I really don't understand the business or technical use of it. We can export as managed and then deploy to the target environment. 

@joecognizant Please check in your Azure DevOps Project Settings whether the creation of classic pipelines is disabled; you need it enabled to use the classic editor.