This article provides a technical walkthrough for setting up version control for North52 business rules and Dataverse Power Platform solutions using Azure DevOps. It covers repository structure, YAML pipeline configuration, service connections, and the complete workflow for exporting solutions from a development environment to source control, and deploying them to test and production environments.
Overview
The solution uses two Azure DevOps pipelines to manage the application lifecycle:
- Export Pipeline — Exports solutions from your dev environment, unpacks them for Git-based change tracking, versions the zip files, and commits everything to source control.
- Deploy Pipeline — Deploys a managed solution from the repository to your test and/or production environments, with version selection and built-in safety checks.
Repository Structure
The Git repository is organized into three main folders at the top level, plus a readme file.
pipelines-yaml
This folder contains the two pipeline definition files: the export pipeline YAML and the deploy pipeline YAML. Keeping pipeline definitions in their own folder separates them from the actual solution files.
solutions-archive
This is where the exported zip files are stored. These are the actual packaged solutions that get deployed to test and production. Inside this folder there is a subfolder for each solution, and within each solution folder there are two subfolders: managed and unmanaged.
Every time the export pipeline runs, it drops a versioned zip file into each of these subfolders. For example, you might see files such as:
- `BusinessRulesCreditCard_managed_1.0.5.0.zip`
- `BusinessRulesCreditCard_managed_1.0.6.0.zip`
This is useful for rollback scenarios — you can point the deploy pipeline at an older version and it will grab that exact zip file from this folder.
solutions-unpacked
This is where source control tracking happens. When the export pipeline runs, it doesn't just save the zip file — it also unpacks the unmanaged solution into individual XML files. You will see folders for entities, workflows, North52 formulas, and all other components that make up your solution.
This matters because Git can track changes at the file level. When someone updates a business rule in your dev environment and runs the export pipeline, you can go into the commit history and see exactly which files changed, what the old values were, and what the new values are. You get a proper diff, not just a before-and-after zip file.
Important: The archive and unpacked folders are kept completely separate. The archive contains your deployable packages (what actually gets imported into an environment). The unpacked folder is purely for visibility and traceability — it is the human-readable version of what is inside those zip files.
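As a sketch of the structure described above (the solution name, file names, and component folders are illustrative, not taken from a real repository):

```
├── pipelines-yaml/
│   ├── export-pipeline.yml
│   └── deploy-pipeline.yml
├── solutions-archive/
│   └── BusinessRulesCreditCard/
│       ├── managed/
│       │   └── BusinessRulesCreditCard_managed_1.0.5.0.zip
│       └── unmanaged/
│           └── BusinessRulesCreditCard_unmanaged_1.0.5.0.zip
├── solutions-unpacked/
│   └── BusinessRulesCreditCard/
│       ├── Entities/
│       ├── Workflows/
│       └── ...
└── README.md
```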
Export Pipeline YAML
The export pipeline is a manual pipeline (trigger is set to none) that handles exporting solutions from the dev environment, versioning them, unpacking for source control, and committing to Git.
Pipeline Name & Parameters
The pipeline name includes the target solution variable and today's date, so each run shows up in your history as something like Export - BusinessRulesCreditCard - 2025-01-15. This makes it easy to locate specific runs later.
When you run the pipeline, you are prompted for two input fields:
- Commit Message — Defaults to "automated solution export", but you should replace this with something meaningful (e.g., "Lowered max credit score from 850 to 845").
- Work Item ID — Links the Azure Boards work item to the Git commit.
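A minimal sketch of the name and runtime parameters, assuming hypothetical variable and parameter names (`targetSolution`, `commitMessage`, `workItemId`); the `$(Date:...)` format specifier is standard Azure Pipelines build-number syntax:

```yaml
name: Export - $(targetSolution) - $(Date:yyyy-MM-dd)

trigger: none

parameters:
  - name: commitMessage
    displayName: Commit Message
    type: string
    default: automated solution export
  - name: workItemId
    displayName: Work Item ID
    type: string
    default: ''
```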
Versioning Variables
The target solution, major, and minor version numbers are set as pipeline variables in the UI rather than in the YAML, and they are locked so they cannot be changed at queue time.
The patch variable is defined in the YAML using a counter function that takes the major and minor variables as a key. Every time the pipeline runs, the counter increments by one. If you change the major or minor value in the UI, the key changes and the counter resets to zero. This means bumping the minor version automatically resets the patch number.
The solution version variable combines all of them into a clean semantic version: major.minor.patch.0.
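The counter behavior can be sketched with the built-in `counter` expression function; the variable names here are assumptions:

```yaml
variables:
  # major, minor, and targetSolution are locked UI variables, not defined here.
  # The counter key is "major.minor"; changing either resets the counter to the
  # seed value (0). Otherwise it increments by one on every run.
  patch: $[counter(format('{0}.{1}', variables['major'], variables['minor']), 0)]
  solutionVersion: $(major).$(minor).$(patch).0
```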
Pipeline Steps
The export pipeline runs a single stage called "Export from Dev" containing the following steps:
- Checkout — Checks out the repository with `persistCredentials` set to `true`. Without this, the pipeline can check out the repo but cannot push commits back to it.
- Install Power Platform Build Tools — Standard step to make Power Platform tasks available.
- Publish Customizations — Connects to the dev environment using the Dataverse Dev service connection and publishes all customizations. This ensures that any unpublished changes made by a developer are included in the export. Without this step, you could export a stale solution.
- Set Solution Version — Stamps the computed version number onto the solution in Dataverse before the export happens, so the zip files carry the correct version internally.
- Export Solution (Unmanaged) — Exports the unmanaged solution. This is what gets unpacked for source control. Async operations are enabled with a 60-minute timeout to handle large solutions.
- Export Solution (Managed) — Exports the managed solution. This is what gets deployed to other environments.
- Unpack Solution — Takes the unmanaged zip and extracts it into the `solutions-unpacked` folder, inside a subfolder named after the solution. It sets `overrideFiles` to `true`, so each export replaces the previous unpacked files. This is what gives you proper diffs in Git — individual XML files for entities, workflows, and business rules are all tracked separately.
- Create Archive Directories (PowerShell) — Ensures the solution archive directory structure exists and creates the managed and unmanaged subfolders if they don't already exist.
- Copy Versioned Zip Files (PowerShell) — Takes the exported zip files from the staging directory and copies them into the solution archive folder with versioned file names. For example, `BusinessRulesCreditCard_managed.zip` becomes `BusinessRulesCreditCard_managed_1.0.3.0.zip`. If either source file is missing, it throws an error and stops.
- Git Commit (PowerShell) — Sets the Git identity to whoever triggered the pipeline (so commits show the actual person's name, not a generic service account), checks out main, stages all changes with `git add --all`, and builds the commit message by combining your input message, any work item IDs prefixed with a hash, and the solution version. For example: `Lowered max credit score from 850 to 845 #15 - v1.0.3.0`. There is a safety check — if there are no actual changes, it skips the commit entirely rather than pushing an empty one. This step also has a condition so it does not run during a pull request build.
- Publish Pipeline Artifacts — Publishes the solution zips as pipeline artifacts as a backup. Even though the zips are committed to the repo, having them as pipeline artifacts means you can download them directly from the pipeline run if needed.
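A condensed sketch of the first half of the stage, using task names from Microsoft's Power Platform Build Tools extension (exact input names can differ between task versions, and the service connection and variable names are assumptions):

```yaml
steps:
  - checkout: self
    persistCredentials: true   # required so the later Git step can push

  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformPublishCustomizations@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: Dataverse Dev

  - task: PowerPlatformSetSolutionVersion@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: Dataverse Dev
      SolutionName: $(targetSolution)
      SolutionVersionNumber: $(solutionVersion)

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: Dataverse Dev
      SolutionName: $(targetSolution)
      SolutionOutputFile: $(Build.StagingDirectory)/$(targetSolution)_managed.zip
      Managed: true
      AsyncOperation: true
      MaxAsyncWaitTime: 60
```

The unmanaged export, the unpack step, and the PowerShell archive/commit steps follow the same pattern and are omitted here for brevity.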
Deploy Pipeline YAML
The deploy pipeline handles importing managed solutions into your test and/or production environments.
Pipeline Name & Parameters
Like the export pipeline, the name includes the target solution variable along with the date and time. When you run the pipeline, you are prompted with three parameters:
- Deploy to Test — A boolean checkbox.
- Deploy to Production — A boolean checkbox.
- Solution Version — A text field that defaults to `latest`. You can enter a specific version number (e.g., `1.0.2.0`) to deploy an older version.
The target solution is again set as a UI pipeline variable.
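An illustrative parameters block for the deploy pipeline (parameter names are assumptions):

```yaml
parameters:
  - name: deployToTest
    displayName: Deploy to Test
    type: boolean
    default: false
  - name: deployToProd
    displayName: Deploy to Production
    type: boolean
    default: false
  - name: solutionVersion
    displayName: Solution Version
    type: string
    default: latest
```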
Deploy Stages
There are two stages — Deploy to Test and Deploy to Production — that are essentially mirrors of each other, pointing at different service connections.
Test Stage
The test stage has a condition that only runs if the "Deploy to Test" checkbox is ticked. Each stage runs the following steps:
- Checkout — Checks out the Git repo. Unlike deploy pipelines that download artifacts from another pipeline, this one reads directly from the solutions archive folder in the repo. This keeps things simple and decoupled.
- Resolve Solution Version (PowerShell) — If you entered a specific version (e.g., `1.0.2.0`), it looks for an exact match in the solution archive managed folder. If that file doesn't exist, it lists all available versions in the log output so you can see what's there. If you left the version as `latest`, it scans the folder, sorts the files, and grabs the most recent one. Either way, it sets the zip path as a pipeline variable for the next steps to use.
- Install Power Platform Build Tools — Standard step.
- Import Managed Solution — Imports the managed solution into the test environment using the Dataverse Test service connection.
- Publish Customizations — Publishes all customizations so everything is live.
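The version-resolution step could look roughly like this sketch, with assumed folder and variable names, and a simple name sort shown for brevity (a production script would parse the version numbers rather than sort lexically):

```yaml
- task: PowerShell@2
  displayName: Resolve Solution Version
  inputs:
    targetType: inline
    script: |
      $dir = "solutions-archive/$(targetSolution)/managed"
      if ("${{ parameters.solutionVersion }}" -eq "latest") {
        # Take the newest versioned zip in the managed archive folder.
        $zip = Get-ChildItem $dir -Filter "*.zip" | Sort-Object Name | Select-Object -Last 1
      } else {
        # Exact-match lookup for the requested version.
        $zip = Get-ChildItem $dir -Filter "*_${{ parameters.solutionVersion }}.zip"
      }
      if (-not $zip) {
        # Show what versions exist so the failure is easy to diagnose.
        Get-ChildItem $dir | ForEach-Object { Write-Host $_.Name }
        throw "Requested solution version not found in $dir"
      }
      # Expose the resolved path to the later import step.
      Write-Host "##vso[task.setvariable variable=solutionZipPath]$($zip.FullName)"
```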
Production Stage
The production stage is almost identical to the test stage but uses the Dataverse Prod service connection. It also has two important differences:
- It depends on the test stage.
- The condition is set up so that if both checkboxes are ticked, production waits for test to succeed before it runs. If test fails, production is skipped — your production environment is protected. However, if only the production checkbox is ticked and test is not, the dependency is satisfied and production runs on its own.
This gives you safety when deploying to both environments, with flexibility when you only need one.
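The gating described above can be expressed with a stage condition over the dependency's result; the stage and parameter names here are assumptions:

```yaml
- stage: DeployProd
  dependsOn: DeployTest
  # Run only when the Production box is ticked, and only if the test stage
  # either succeeded or was skipped (skipped means test wasn't selected).
  condition: >-
    and(
      eq('${{ parameters.deployToProd }}', 'true'),
      in(dependencies.DeployTest.result, 'Succeeded', 'Skipped')
    )
```

Treating `Skipped` as acceptable is what lets production deploy on its own, while a failed test stage still blocks it.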
Service Connections
Three service connections are configured in the Azure DevOps project settings:
- Dataverse Dev
- Dataverse Test
- Dataverse Prod
Each service connection uses the same configuration layout:
- Server URL
- Tenant ID
- Application (Client) ID
- Client Secret
Walkthrough: End-to-End Example
The following walkthrough demonstrates the complete process from requirement to deployment.
1. Review the Work Item
Open Azure Boards and locate the work item. In this example, work item #15 states that the maximum required credit score needs to be lowered from 850 to 845.
2. Verify the Dev Change
Open the dev environment and confirm the change has been made to the business rule.
3. Run the Export Pipeline
Navigate to Pipelines and open the dev export/Git sync pipeline. Before running, you can edit the pipeline to review the variables (major version, minor version, and target solution). Run the pipeline and provide:
- Commit message: "Lowered max credit score from 850 to 845"
- Work item ID: 15
4. Review the Export Results
When the pipeline finishes, verify that it completed all steps: checkout, install tools, publish all in dev, compute version, export unmanaged and managed solutions, unpack the solution, copy to archive, commit to Git, and publish artifacts.
Open the repository and review the commit. You should see:
- The commit named with your message, work item reference, and version number.
- Managed and unmanaged solution zips added to the solution archive.
- Changes in the unpacked XML files — for example, the old value (850) shown in red and the new value (845) shown in green.
- The commit details showing who ran the pipeline and the linked work item.
- On the work item itself, the commit is automatically connected.
5. Run the Deploy Pipeline
Return to Pipelines and open the deploy pipeline. Review the variables (target solution), then run it. Select which environment to deploy to — in this case, test only — and leave the version as latest.
6. Review the Deployment Results
When the pipeline finishes, verify that only the test environment was deployed to and the production stage was skipped. The pipeline should have checked out the repo, identified the solution file, installed the build tools, imported the solution to test, and published all customizations.
Summary
This setup provides a complete version control and deployment workflow for North52 business rules:
- Semantic versioning with automatic patch incrementing.
- Full Git diff tracking at the individual XML file level.
- Work item traceability linking commits to Azure Boards items.
- Versioned solution archives for easy rollback.
- Flexible deployment with environment selection and version control.
- Production safety with conditional stage dependencies.
