

This blog post will assume that readers have a basic understanding of environments and how they are set up for an enterprise.

For the post and the examples provided we will assume the following:







More environment details -


  • Unmanaged solutions - are used in development environments while you make changes to your application. Unmanaged solutions can be exported either as unmanaged or managed. Exported unmanaged versions of your solutions should be checked into your source control system. Unmanaged solutions should be considered your source for Microsoft Power Platform assets. When an unmanaged solution is deleted, only the solution container of any customizations included in it is deleted. All the unmanaged customizations remain in effect and belong to the default solution.

  • Managed solutions - are used to deploy to any environment that isn't a development environment for that solution. This includes test, UAT, SIT, and production environments. Managed solutions can be serviced independently from other managed solutions in an environment. As an ALM best practice, managed solutions should be generated by exporting an unmanaged solution as managed and considered a build artifact. Additionally:

    • You can't edit components directly within a managed solution. To edit managed components, first add them to an unmanaged solution.

    • When you do this, you create a dependency between your unmanaged customizations and the managed solution. When a dependency exists, the managed solution can't be uninstalled until you remove the dependency.

    • Some managed components can’t be edited. To verify whether a component can be edited, view the Managed properties.

    • You can't export a managed solution.

    • When a managed solution is deleted (uninstalled), all the customizations and extensions included with it are removed.

More on solutions - Solution concepts - Power Platform | Microsoft Docs

Additional assumptions:

  • All environments are Dataverse environments

  • All development is done within solutions

Pre-requisites for DevOps with Power Platform

Create the application user
Start by creating an application registration in Azure. This allows you to follow good security practice and separate duties should you or your client require it, and it detaches automated deployments from user access entirely.
Note –

  • Be sure to save your application ID and secret in Azure Key Vault

  • Repeat this process for each environment

Details on how to create an application user for a Power Platform environment can be found here: Use single-tenant server-to-server authentication (Microsoft Dataverse) - Power Apps | Microsoft Docs
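
If you prefer the command line over the portal, the app registration and Key Vault steps can be sketched with the Azure CLI. This is a sketch under assumptions: the display name, vault name, and secret names below are placeholders, and you need an authenticated session (az login) with rights to create app registrations.

```shell
# 1. Create the app registration and a service principal for it
APP_ID=$(az ad app create --display-name "pp-deploy-dev" --query appId -o tsv)
az ad sp create --id "$APP_ID"

# 2. Generate a client secret for the registration
SECRET=$(az ad app credential reset --id "$APP_ID" --query password -o tsv)

# 3. Store the application ID and secret in Azure Key Vault, per the note above
az keyvault secret set --vault-name "my-keyvault" --name "pp-deploy-dev-appid"  --value "$APP_ID"
az keyvault secret set --vault-name "my-keyvault" --name "pp-deploy-dev-secret" --value "$SECRET"
```

Repeat with a different display name per environment, as noted above.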


Add the new application user to your environment

An admin can add the application ID to the environment.

Simply enter the app ID, and Azure Active Directory should pick up the application registration created in the first step.


Repeat this for each environment you created an application user for.
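
This step can also be scripted with recent versions of the Power Platform CLI. A hypothetical sketch: the environment URL, GUID, and role below are placeholders, and flag names have varied across pac releases, so check your version's help output.

```shell
# Authenticate pac against your tenant (placeholder URL)
pac auth create --name admin --url https://yourorg.crm.dynamics.com

# Add the app registration as an application user with a security role
# (environment GUID and application ID are placeholders)
pac admin assign-user \
  --environment 00000000-0000-0000-0000-000000000000 \
  --user <application-id-from-the-app-registration> \
  --application-user \
  --role "System Administrator"
```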

Next, you will need to create Azure DevOps pipelines.


Create the service principal and connection
In Azure DevOps, an admin or build admin can add the service connection to the organization. This connection uses the application user that was created in Azure and added to your environment.










Build your first pipeline

Note – this example is a simple backup of a solution you might have in DEV, a good first step on the way to automated deployments. As good practice, we recommend that a deployment pipeline export both an unmanaged and a managed version of the solution, and that the managed version be the one deployed.

To start, you will need to be a ‘Build Admin’ in Azure DevOps to create the pipeline. Once it’s created, you can then specify contributors who do not have to be Build Admins.


There is good information online about the Power Platform Build Tools and how to create these pipelines through hands-on labs, but I will walk through a simple example below. This example is meant just to get you started with automated deployments.

Note - PowerApps-Samples/build-tools at master · microsoft/PowerApps-Samples · GitHub

1. The pipeline should be built using the classic editor: when selecting 'New Pipeline', click 'Use the classic editor'




2. You will be asked to provide some initial inputs – make sure that you have a repo created, and I suggest having a dedicated branch that you want to commit to




3. On the next screen, create using an empty job. This gives you a clean slate to add the Power Platform Build Tools tasks



4. Add the Build Tools tasks to the job (the Power Platform Tool Installer task must run first)




5. Details of the Export Solution task




6. Details of the Unpack Solution task






Creating this first job is simple if the prerequisites are in place. They can be a difficult barrier to entry, so make sure they are cleared before getting started.
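
For reference, the Export Solution and Unpack Solution tasks used above have rough local equivalents in the Power Platform CLI. This is a sketch under assumptions: the solution name and paths are placeholders, and flag spellings can vary by pac version.

```shell
SOLUTION=MySolution

# Export the solution from DEV – both unmanaged and managed, per the note above
pac solution export --name "$SOLUTION" --path "./$SOLUTION.zip" --managed false
pac solution export --name "$SOLUTION" --path "./${SOLUTION}_managed.zip" --managed true

# Unpack the unmanaged zip into source-controllable files, then commit them
pac solution unpack --zipfile "./$SOLUTION.zip" --folder "./src/$SOLUTION"
git add "./src/$SOLUTION" && git commit -m "Backup of $SOLUTION"
```

This is also a handy way to sanity-check what the pipeline tasks will do before wiring them into Azure DevOps.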


Automate it
At a high level, the automated deployments solution looks something like this:






  • Start with data – create a Dataverse table to store requests, approvals etc.

  • Create a ‘Model-driven app’ in Power Apps to give developers a mechanism to request a deployment by adding a new row and submitting a form

  • Create a robust cloud flow to put it all together

  • Use ‘Approvals’ in Teams to get client approval

  1. The cloud flow that orchestrates this starts after a developer makes a request and submits it; the flow is triggered by the creation of the row in Dataverse

  2. For this example, I’ll walk through how an automated backup of a solution can work, but as the screenshot below shows, other scenarios can be created to cover all automated deployment needs and, eventually, continuous deployment




In this case, depending on the developer's request, there are three options available; each follows different steps, pipelines, and procedures.

For this example, we assume a developer has selected ‘DEV backup’ – see the screenshot below

  1. The pipeline can be triggered through the Azure DevOps connector – the action is called ‘Queue a new build’

  2. Just queueing a new build isn’t enough; we must wait for the build to complete, and this can only be done by using an HTTP call.
    a. When you queue the build, it responds with a build ID; this ID can be used to retrieve the status

  3. If you check the status immediately, it will most likely not be complete, as the job can take a few minutes. The cloud flow will need to delay and then check the status again using the same HTTP call to Azure DevOps
    a. Add a ‘Do Until’ action to the flow and wait for the result to be ‘completed’ – this tells the flow when the pipeline has finished the job
    b. Note – this does not mean that the pipeline was successful; there can be situations where an issue causes it to fail
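
The queue-and-wait pattern in steps 1–3 can be sketched in shell. The commented curl call shows the Azure DevOps REST endpoint the ‘Queue a new build’ action uses (ORG, PROJECT, the PAT, and the definition ID are placeholders); extract_build_id and wait_for_status are helper names invented for this sketch.

```shell
#!/bin/sh
# Queue a build, then poll its status until it reports "completed".

# Queue a new build (what the 'Queue a new build' action does under the hood):
#   RESPONSE=$(curl -s -u ":$PAT" -H "Content-Type: application/json" \
#     -d '{"definition": {"id": 12}}' \
#     "https://dev.azure.com/ORG/PROJECT/_apis/build/builds?api-version=6.0")

# Pull the numeric build id out of the JSON response on stdin
extract_build_id() {
  sed -n 's/.*"id":[[:space:]]*\([0-9][0-9]*\).*/\1/p' | head -n 1
}

# 'Do Until' equivalent: run a status command until it prints "completed",
# waiting between tries, or give up after a retry budget. In the real flow
# the status command would be a curl GET to
#   https://dev.azure.com/ORG/PROJECT/_apis/build/builds/$BUILD_ID?api-version=6.0
# piped through a JSON status extractor.
wait_for_status() {
  cmd=$1; max_tries=$2; delay=$3
  tries=0
  while [ "$tries" -lt "$max_tries" ]; do
    if [ "$("$cmd")" = "completed" ]; then
      echo completed
      return 0
    fi
    sleep "$delay"
    tries=$((tries + 1))
  done
  # Remember: "completed" is not the same as "succeeded" - check the build
  # result separately. A build that never completes ends up here.
  echo timeout
  return 1
}
```

The same delay-and-retry budget should be mirrored in the cloud flow's ‘Do Until’ limits so a stuck build does not leave the flow waiting forever.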

