Overview
This article shows you how to trigger North52 Decision Table rules as part of your Azure Data Factory pipeline using a Custom Action. The Decision Table rules in this scenario have been purposely kept simple, as the focus of this article is how to trigger the rules from Azure Data Factory rather than the rules themselves (which of course could be far more complex).

North52 Decision Suite Solution
The North52 Decision Suite solution works like this:
- An Azure Data Factory Pipeline is configured to import data into Dataverse and, post-import, execute a Custom Action
- A Formula is set up on the Custom Action which updates information in the Account records when it is called from the Data Factory Pipeline
Prerequisites
Set up an Azure App Registration for the Dataverse environment where your Custom Action is configured. Follow the steps in the Microsoft documentation: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/walkthrough-register-app-azure-active-directory.
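The App Registration's client credentials are what Data Factory (and any other client) uses to authenticate against Dataverse. As a minimal sketch, assuming placeholder tenant/client IDs and secret, the OAuth 2.0 client-credentials token request that sits behind the scenes looks roughly like this:

```python
import urllib.parse

def build_token_request(tenant_id: str, client_id: str, client_secret: str, resource_url: str):
    """Build the OAuth 2.0 client-credentials token request for Dataverse.

    All IDs below are placeholders -- substitute the Directory (tenant) ID,
    Application (client) ID and client secret from your own App Registration.
    """
    token_endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Dataverse expects the environment URL plus /.default as the scope
        "scope": f"{resource_url}/.default",
    })
    return token_endpoint, body

endpoint, body = build_token_request(
    "00000000-0000-0000-0000-000000000000",  # placeholder tenant ID
    "11111111-1111-1111-1111-111111111111",  # placeholder client ID
    "placeholder-secret",
    "https://xxx.crm.dynamics.com",
)
# POSTing `body` (form-encoded) to `endpoint` returns JSON whose access_token
# is then sent as a Bearer token on Dataverse Web API calls.
print(endpoint)
```

Data Factory performs this token exchange for you when you configure Service principal authentication later in this article; the sketch just shows what those App Registration values are used for.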
Set up Global Action, Formula and Data Factory Pipeline
Global Action
Create a Global Action with a single string input parameter: InputString. This parameter will be populated by Azure Data Factory when it calls the Global Action.

North52 Formula
This Formula, kept simple for demonstration purposes, updates the Account description by appending the InputString value provided by Azure Data Factory when calling the Custom Action.

Azure Data Factory Pipeline
The Azure Data Factory Pipeline has three steps:
- Copy Data - extracts data from a CSV file uploaded to Azure Blob Storage and creates new accounts
- Lookup - queries Dataverse to find the records created by the Copy Data step, using the Import Sequence Number field populated as part of that step
- For Each - for each Account found in the Lookup step, a Web activity is executed which calls the Custom Action via the Dataverse Web API
Linked Services
Before creating the Pipeline Activities, two Linked Services need to be created:
- Link to Dataverse - allows records to be created, updated and queried
- Link to Azure Blob Storage - allows the CSV file to be retrieved

Dataverse Linked Service
Create a new linked service and search for Dataverse:

Key fields when setting up the Dataverse Linked Service:
- Service Uri - this is your Dataverse url e.g. https://xxx.crm.dynamics.com
- Service Principal ID - obtain this from the App Registration overview and copy the Application (client) ID
- Service Principal Key - obtain this from the Client Secrets and copy the key value
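Behind the designer, a linked service is stored as JSON. A rough sketch of the resulting definition, assuming the service name and placeholder credential values, looks like this:

```json
{
    "name": "DataverseLinkedService",
    "properties": {
        "type": "CommonDataServiceForApps",
        "typeProperties": {
            "deploymentType": "Online",
            "serviceUri": "https://xxx.crm.dynamics.com",
            "authenticationType": "AADServicePrincipal",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalId": "<application-client-id>",
            "servicePrincipalCredential": {
                "type": "SecureString",
                "value": "<client-secret>"
            }
        }
    }
}
```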

Azure Blob Storage
Create a new linked service and search for Azure Blob:

Setting up the Azure Blob Storage Linked Service is straightforward: select the Subscription and then the Storage Account, and the credentials are set up for you:

Create Datasets
CreateAccountsFromCSV
Create a new Dataset and link to the Azure Blob Storage Linked Service. Update the file path to your file and configure any other relevant settings for your source file.

DataverseAccounts
Create a new Dataset and link to the Dataverse Linked Service. Configure the Entity name relevant for your requirements - for this demo it is account.

Create Pipeline
The first pipeline activity for our demo scenario is Copy Data. This copies the data from our CSV file and imports it into Dataverse. The various set-up tabs are configured as shown below.
Copy Data - General Tab

Copy Data - Source Tab

Copy Data - Sink Tab

Copy Data - Mapping Tab
Click Import Schemas to get an automatic mapping and then delete the unmapped fields.

Lookup - General Tab

Lookup - Settings Tab
Select the Dataverse Linked Service for the Source dataset and choose Query. After selecting Query you can click the edit button to enter your FetchXML. We are using the importsequencenumber field to identify the records created in the Copy Data activity.
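As a sketch, a FetchXML query filtering on the Import Sequence Number might look like the following (the attribute list and the sequence number value of 1 are assumptions for this demo):

```xml
<fetch>
  <entity name="account">
    <attribute name="accountid" />
    <attribute name="name" />
    <filter>
      <condition attribute="importsequencenumber" operator="eq" value="1" />
    </filter>
  </entity>
</fetch>
```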

For Each - General Tab

For Each - Settings Tab
Click the Items field to configure the items that will be used for the For Each activity. From the Activity Outputs tab, you can click the Accounts by Import Sequence Number option to populate the expression builder. The expression needs to begin with the @ symbol, and you will also need to append .value to it.
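Assuming the Lookup activity is named Accounts by Import Sequence Number, the finished Items expression looks like this (the .value suffix selects the array of rows returned by the Lookup):

```
@activity('Accounts by Import Sequence Number').output.value
```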

For Each - Web Activity General Tab
Add a Web activity and enter a name.

For Each - Web Activity Settings Tab
Key fields when setting up the Web activity:
- URL - see additional screenshot below
- Method - set to POST
- Body - see additional screenshot below
- Authentication - set to Service principal
- Tenant - obtain this from the App Registration overview and copy the Directory (tenant) ID
- Service Principal ID - obtain this from the App Registration overview and copy the Application (client) ID
- Service Principal Key - obtain this from the Client Secrets and copy the key value
- Resource - this is your Dataverse url e.g. https://xxx.crm.dynamics.com

For Each - Web Activity Settings Tab - Service Url
This URL needs to be changed for each Account within the For Each loop, so we use item().accountid to do this:
@concat('https://north52demo19.crm4.dynamics.com/api/data/v9.1/accounts(', item().accountid, ')/Microsoft.Dynamics.CRM.n52demo_DataFactoryDemo')
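The same concatenation can be sketched outside Data Factory; this hypothetical Python helper mirrors what the @concat expression produces for each account in the loop:

```python
def build_action_url(org_url: str, account_id: str, action_name: str) -> str:
    """Mirror the ADF @concat expression: a Dataverse Web API URL that
    invokes a Custom Action bound to a specific account record."""
    return f"{org_url}/api/data/v9.1/accounts({account_id})/Microsoft.Dynamics.CRM.{action_name}"

url = build_action_url(
    "https://north52demo19.crm4.dynamics.com",
    "00000000-0000-0000-0000-000000000000",  # placeholder accountid from the Lookup output
    "n52demo_DataFactoryDemo",
)
print(url)
```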

For Each - Web Activity Settings Tab - Body
The action has one input parameter InputString, and for demo purposes we are using a System variable to populate this value. If you had multiple input parameters this is where you would populate the values.
{
"InputString": "@{pipeline().TriggerTime}"
}
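At runtime Data Factory interpolates the @{pipeline().TriggerTime} expression, so the body actually posted to the Custom Action looks roughly like this (the timestamp shown is an example value):

```json
{
    "InputString": "2023-05-01T09:30:00.0000000Z"
}
```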

Testing
To test, we trigger the pipeline.
The CSV file gets read from Azure Blob Storage:

The newly created account records from the Copy Data activity, before the Custom Action and North52 rules are applied:

The Pipeline run log:

The updated Accounts after the For Each and Web activities have run:
