This article explains how to implement a complete Application Lifecycle Management (ALM) solution for your Dynamics 365 / Power Platform business rules using North52 Decision Suite and Azure DevOps. This approach gives you full traceability, automated deployments, and complete auditability — whether you're managing credit card eligibility rules, approval workflows, or any other business logic in Dataverse.

Why Version Control for Business Rules?

Business rules change constantly. Market conditions shift, regulations update, and business requirements evolve. When these changes are tracked manually — through spreadsheets, emails, or undocumented deployments — you end up with "black-box deployments": changes reaching production with nobody knowing who changed what, or why.

The goal is complete traceability. You need to be able to trace every change from the original business requirement all the way through to production deployment: who requested it, what was changed, when it was deployed, and why.

The Four-Step ALM Workflow

The workflow is a four-step process that is completely automated once configured:

Step 1: Request

Everything starts with a work item in Azure DevOps — a user story, bug fix, or change request. Every change has a trackable origin.

Step 2: Build

The developer or business analyst makes changes in the North52 Formula Editor in your development environment. This is where the actual business logic gets updated.

Step 3: Commit

An Azure DevOps pipeline automatically exports the solution from dev, unpacks it, and commits it to your Git repository. The commit message includes a link back to the original work item.

Step 4: Deploy

Another pipeline takes the committed code and deploys it as a managed solution to your test environment and optionally to production.

The key benefits are zero touch (no manual solution exports or imports) and full integration (every code change is linked back to the work item that requested it).

Manual vs. Automated Approach

The old way relied on spreadsheets for tracking changes, manual solution exports and imports, moving zip files around, and hoping someone remembers to document what they did. Accountability gaps are everywhere.

The modern way gives you an automated audit trail where every change is recorded automatically. One-click pipelines handle all exports and imports, and you have proof of every change — who made it, when, and what the actual code difference was. In regulated industries, this kind of audit trail can be the difference between passing and failing a compliance audit.

Solution Flow Across Environments

The solution flows through three stages:

Development Environment — Developers and business analysts implement changes using the North52 Formula Editor. Once changes are validated and tested locally, they are ready for export.

Azure DevOps — The export pipeline connects to your dev environment, exports the solution, and commits it to your Git repo. This creates a permanent audit trail. You can see exactly what changed in each commit, compare versions, and roll back if needed.

Test & Production — The import pipeline takes the managed solution from the repository and deploys it first to test for validation, then to production when ready. The same managed solution that was tested is exactly what gets deployed to production — no manual steps, no opportunity for human error.

Walkthrough: Updating a Business Rule

As a concrete example, consider updating a platinum card eligibility rule. The work item (e.g. Work Item #16) specifies: "Change the minimum total assets required for customers with a credit rating between 760 and 770. Increase the applicant's required total assets from $200,000 to $300,000."

A functional consultant opens the North52 Decision Suite and updates the logic directly — no complex coding, just a simple configuration change. Behind the scenes, this rule is saved as a web resource inside the Business Rules Credit Card solution.

Triggering the Export Pipeline

Instead of emailing zip files, you simply trigger the export pipeline and tag it with the work item number (e.g. Work Item 16). This tag links the business request to the technical solution. You provide a meaningful commit message such as "Updated platinum card eligibility min total assets 300,000".

Viewing the Audit Trail

The automation exports the solution, unpacks the files, and commits them to source control. The result is a complete audit trail. You can see exactly what changed — the system highlights that the value moved from 200,000 to 300,000 — providing proof that the system matches the requirement.

In the committed files you can see the change made to the formula, the managed and unmanaged solutions that were exported, who exported it, and the linked work item. The commit is also added to the work item's development section, ensuring traceability for each and every change.

Deploying to Test & Production

Deploying to test or production uses the exact same managed solution that was just verified, ensuring consistency across all environments with zero manual steps.

Repository Structure

Your Azure DevOps repository should have the following folder structure:

pipelines-yaml/ — Contains your YAML pipeline definitions: dev-export-sync.yml for exporting from dev, and deploy-solution.yml for deploying to test and production.

solutions-archive/ — Where the pipeline stores versioned zip files of your solutions. Contains subfolders for each solution with managed and unmanaged subdirectories. Each export creates a new zip file with a version number in the filename, e.g. BusinessRulesCreditCard_managed_1.0.5.0.zip. This gives you an archive of every version ever deployed.

solutions-unpacked/ — Contains the unpacked source code, broken down by solution. This is what enables the diff view. When the pipeline exports your solution, it unpacks the files into this folder so Git can track individual file changes.

This structure gives you the best of both worlds: archive zip files for easy deployment and unpacked source code for detailed version tracking.
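Putting the three folders together, the layout looks roughly like this (using the solution name from the walkthrough; the unmanaged filename is assumed to mirror the managed one):

```
pipelines-yaml/
    dev-export-sync.yml
    deploy-solution.yml
solutions-archive/
    BusinessRulesCreditCard/
        managed/
            BusinessRulesCreditCard_managed_1.0.5.0.zip
        unmanaged/
            BusinessRulesCreditCard_unmanaged_1.0.5.0.zip
solutions-unpacked/
    BusinessRulesCreditCard/
        (unpacked solution XML files)
```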

Export Pipeline: Step by Step

1. Publish All Customizations

Before exporting, the pipeline publishes all customizations in your dev environment, so the export captures the latest state rather than stale, unpublished changes.

2. Set the Solution Version

The pipeline automatically sets the solution version using semantic versioning in the format major.minor.patch.0. The major and minor numbers are controlled by pipeline variables you set in the UI. The patch number auto-increments every time the pipeline runs and resets to zero whenever you bump the major or minor version. For example: 1.0.0.0, 1.0.1.0, 1.0.2.0, etc.
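This scheme maps neatly onto the built-in Azure DevOps counter() expression. A minimal sketch, assuming major and minor are defined as UI pipeline variables and the Microsoft Power Platform Build Tools tasks are available (input names may differ slightly between tool versions):

```yaml
variables:
  # counter() increments once per run and resets to 0 whenever its prefix
  # (the major.minor pair) changes -- exactly the behavior described above.
  patch: $[counter(format('{0}.{1}', variables['major'], variables['minor']), 0)]
  solutionVersion: $(major).$(minor).$(patch).0

steps:
  # Stamp the computed version onto the solution in dev before exporting.
  - task: PowerPlatformSetSolutionVersion@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: dataverse-dev
      SolutionName: $(target_solution)
      SolutionVersionNumber: $(solutionVersion)
```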

3. Export Solutions

The pipeline exports both unmanaged and managed versions. The unmanaged version is for source control (it gets unpacked and can be diffed). The managed version is what gets deployed to downstream environments.
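As a sketch, assuming the Power Platform Build Tools tasks and the dataverse-dev service connection described later, the two exports are the same task run twice, once with Managed enabled:

```yaml
steps:
  # Unmanaged export: unpacked into source control for diffing.
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: dataverse-dev
      SolutionName: $(target_solution)
      SolutionOutputFile: $(Build.ArtifactStagingDirectory)/$(target_solution)_unmanaged_$(solutionVersion).zip

  # Managed export: the artifact that gets deployed downstream.
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: dataverse-dev
      SolutionName: $(target_solution)
      SolutionOutputFile: $(Build.ArtifactStagingDirectory)/$(target_solution)_managed_$(solutionVersion).zip
      Managed: true
```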

4. Unpack and Archive

The unmanaged solution gets unpacked into the solutions-unpacked folder, breaking it into individual XML files that Git can track at the component level. Both managed and unmanaged zips are copied into the solutions-archive folder with versioned filenames. Every version is kept, so you can always roll back.
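One possible shape for this step, assuming the folder layout above and the zip names used in the export sketch ($(solutionVersion) being the stamped version):

```yaml
steps:
  # Unpack the unmanaged zip so Git can diff individual XML components.
  - task: PowerPlatformUnpackSolution@2
    inputs:
      SolutionInputFile: $(Build.ArtifactStagingDirectory)/$(target_solution)_unmanaged_$(solutionVersion).zip
      SolutionTargetFolder: solutions-unpacked/$(target_solution)
      SolutionType: Unmanaged

  # Copy both zips into the versioned archive; nothing is overwritten,
  # so every version stays available for rollback.
  - script: |
      mkdir -p solutions-archive/$(target_solution)/managed \
               solutions-archive/$(target_solution)/unmanaged
      cp "$(Build.ArtifactStagingDirectory)/$(target_solution)_managed_$(solutionVersion).zip" \
         solutions-archive/$(target_solution)/managed/
      cp "$(Build.ArtifactStagingDirectory)/$(target_solution)_unmanaged_$(solutionVersion).zip" \
         solutions-archive/$(target_solution)/unmanaged/
    displayName: Archive versioned zips
```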

5. Commit to Git

Everything gets committed to the repo — the unpacked files, the archive zips, all of it. The commit message includes your custom message, a link to the work item, and the version number, e.g. "Updated platinum credit card threshold – #16 v1.0.3.0".
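The commit step can be a plain script. This sketch assumes commitMessage and workItemId are runtime parameters and that the checkout step ran with persistCredentials enabled so the push is authorized:

```yaml
steps:
  - script: |
      git config user.email "pipeline@example.com"   # placeholder identity
      git config user.name "Export Pipeline"
      git add solutions-unpacked solutions-archive
      # "#<id>" in an Azure Repos commit message links the work item.
      git commit -m "${{ parameters.commitMessage }} - #${{ parameters.workItemId }} v$(solutionVersion)"
      git push origin HEAD:main
    displayName: Commit export to Git
```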

Running the Export Pipeline

When you run the pipeline, you will see two input fields: a commit message and a work item ID. The solution name is locked down as a pipeline variable called target_solution, so there is no risk of accidentally exporting the wrong solution. The pipeline uses a service connection (e.g. dataverse-dev) with service principal authentication for secure access.
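In YAML terms, the two inputs are runtime parameters while the solution name is a fixed variable. A sketch of the top of a hypothetical dev-export-sync.yml:

```yaml
# Runtime inputs shown when the pipeline is queued.
parameters:
  - name: commitMessage
    displayName: Commit message
    type: string
  - name: workItemId
    displayName: Work item ID
    type: string

# Locked down in the definition, so the wrong solution can't be exported.
variables:
  target_solution: BusinessRulesCreditCard
```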

Deploy Pipeline: Test & Production

The deploy pipeline has two stages — deploy to test and deploy to production — both controlled by checkboxes when you run the pipeline. You tick which environments you want to deploy to: test, production, or both. You can also specify a solution version (defaults to "latest") or enter a specific version number for rollbacks.

Stage 1: Deploy to Test

Only runs if you tick the "deploy to test" checkbox. It checks out the Git repo, runs a PowerShell script to find the correct managed version from the solutions-archive folder, installs the Power Platform Build Tools, imports the managed solution to your test environment, and publishes customizations. Uses the dataverse-test service connection.

Stage 2: Deploy to Production

Only runs if you tick the "deploy to production" checkbox. If both checkboxes are ticked, production waits for test to succeed first — if test fails, production is skipped and your production environment is protected. Follows the same steps using the dataverse-prod service connection.
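A skeleton of how the two stages might be wired, assuming boolean parameters for the checkboxes and a hypothetical resolvedSolutionZip variable produced by the version-resolution script (publish-customizations steps omitted for brevity):

```yaml
parameters:
  - name: deployToTest
    displayName: Deploy to test
    type: boolean
    default: true
  - name: deployToProd
    displayName: Deploy to production
    type: boolean
    default: false
  - name: solutionVersion
    displayName: Solution version ('latest' or an exact number)
    type: string
    default: latest

stages:
  - stage: DeployTest
    condition: eq('${{ parameters.deployToTest }}', 'true')
    jobs:
      - deployment: ImportToTest
        environment: test
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: PowerPlatformToolInstaller@2
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: PowerPlatformSPN
                    PowerPlatformSPN: dataverse-test
                    SolutionInputFile: $(resolvedSolutionZip)

  - stage: DeployProd
    dependsOn: DeployTest
    # Runs only when ticked AND test succeeded (or was deliberately skipped).
    condition: and(eq('${{ parameters.deployToProd }}', 'true'), in(dependencies.DeployTest.result, 'Succeeded', 'Skipped'))
    jobs:
      - deployment: ImportToProd
        environment: production
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: PowerPlatformToolInstaller@2
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: PowerPlatformSPN
                    PowerPlatformSPN: dataverse-prod
                    SolutionInputFile: $(resolvedSolutionZip)
```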

Version Resolution

If you leave the version as "latest", the pipeline scans the solution's managed subfolder under solutions-archive, sorts the versioned zip files, and grabs the most recent one. If you enter a specific version (e.g. 1.0.2.0), it looks for an exact match. If that version does not exist, it lists all available versions in the log output so you can pick the correct one.
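The resolution script could be as small as the following PowerShell step. This is a sketch under stated assumptions: target_solution and a solutionVersion parameter as described above, and resolvedSolutionZip as a hypothetical output variable consumed by the import tasks:

```yaml
steps:
  - pwsh: |
      $folder = "solutions-archive/$(target_solution)/managed"
      $requested = '${{ parameters.solutionVersion }}'
      if ($requested -eq 'latest') {
        # Sort by the semantic version embedded in the filename, take newest.
        $zip = Get-ChildItem $folder -Filter '*_managed_*.zip' |
               Sort-Object { [version]($_.BaseName -replace '^.*_managed_', '') } |
               Select-Object -Last 1
      } else {
        $zip = Get-ChildItem $folder -Filter "*_managed_$requested.zip"
      }
      if (-not $zip) {
        Write-Host "Version '$requested' not found. Available versions:"
        Get-ChildItem $folder -Filter '*.zip' | ForEach-Object { Write-Host "  $($_.Name)" }
        exit 1
      }
      # Hand the resolved path to later tasks via a pipeline variable.
      Write-Host "##vso[task.setvariable variable=resolvedSolutionZip]$($zip.FullName)"
    displayName: Resolve solution version
```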

Azure DevOps Environments

Azure DevOps environments provide deployment tracking and history. Each environment shows the current deployment status, the build number deployed, and when it was last updated. Benefits include:

Complete deployment history — See every deployment that has ever been made to each environment.

Approval gates — Require approval before production deployments.

Easy rollback — Redeploy a previous build with a few clicks.

Accountability — Know exactly who deployed what and when.

Service Connections & Authentication

Service connections are how your pipelines authenticate to your Dataverse environments. You need three connections:

dataverse-dev — Used by the export pipeline to connect to your development environment and export solutions.

dataverse-test — Used by the deploy pipeline to deploy solutions to your test environment.

dataverse-prod — Used by the deploy pipeline to deploy solutions to your production environment.

All three use service principal authentication (SPN), which is more secure than using a user account because it is not tied to a specific person and does not expire when someone leaves the organization.

Setting Up Service Connections

To set these up manually:

  1. Create an App Registration in Entra ID (formerly Azure AD) to obtain an Application ID and Client Secret.
  2. Add an Application User in the Power Platform Admin Center with the System Administrator security role in each environment.
  3. Create the service connection in Azure DevOps under Project Settings > Service Connections, using the Application ID and Client Secret.

Quick Setup with Power Platform CLI

The easiest way is to run the Power Platform CLI command against your environment:

pac admin create-service-principal

This single command registers the app in Entra ID, creates the application user in Dataverse, and assigns the System Administrator role automatically. The Client ID and Client Secret are printed directly in the terminal.

How the Two Pipelines Work Together

The export pipeline handles the development side: publishing customizations, stamping semantic versions, exporting managed and unmanaged zips, unpacking for source control, archiving versioned files, and committing everything to Git with work item links.

The deploy pipeline handles deployment: checking out the repo, finding the correct managed zip (latest or a specific version), and importing it into test, production, or both.

Importantly, the two pipelines are completely decoupled. There is no pipeline resource link or artifact dependency between them. The export pipeline writes to the repo and the deploy pipeline reads from the repo. This means you can export without deploying, deploy an older version without re-exporting, or run them independently at different times. The Git repo is the single source of truth that ties everything together.

Summary of Benefits

Linked requirements — Every technical change is linked back to a business requirement. Work items in Azure DevOps create a clear chain from request to implementation.

Single source of truth — Your Git repository becomes the definitive record for all business rules. It is automated, version-controlled, and always up to date.

Consistent deployment — The same pipeline deploys to test and production with no manual steps, no variation, and no human error.

Reduced risk and increased speed — Full traceability for compliance, zero manual errors because everything is automated, and faster deployments because you are not waiting for someone to manually export and import solutions.

This is enterprise-grade ALM for your Dynamics 365 / Power Platform business rules.