Azure Data Factory Deployment Through PowerShell

This document and video illustrate how to build and deploy Azure Data Factory ARM template artifacts using PowerShell.

For information on how to deploy through the Azure Portal, see the Azure Portal Deployment Guide.

Prerequisites

Using a BimlFlex metadata project configured for Azure Data Factory, such as one of the many sample metadata projects, build the project in BimlStudio to create the ADF Artifacts.

Ensure that the arm_template.json and arm_template_parameters.json files are in the project's output folder. The path will look like this:

...\output\DataFactories\<Setting.AzureDataFactoryName>\arm_template.json
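
As a quick check that the build produced both artifacts, a minimal PowerShell sketch such as the one below can be used. The output path and Data Factory name here are placeholders, not values generated by BimlFlex; substitute your own.

# Placeholder values; replace with your project's output path and Data Factory name
$outputBasePath  = "C:\BimlFlex\MyProject\output"
$dataFactoryName = "ADF-MyDataFactory"

$armTemplate       = Join-Path $outputBasePath "DataFactories\$dataFactoryName\arm_template.json"
$armTemplateParams = Join-Path $outputBasePath "DataFactories\$dataFactoryName\arm_template_parameters.json"

# Confirm both files exist before attempting a deployment
foreach ($file in @($armTemplate, $armTemplateParams)) {
    if (Test-Path $file) { Write-Host "Found: $file" }
    else { Write-Warning "Missing: $file - rebuild the project in BimlStudio" }
}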

The Data Factory name used is derived from the BimlFlex settings, and it is recommended to define a proper Data Factory name for the deployment. Because Data Factory names are globally unique, make sure the name is available if the Data Factory has not already been created. If no Data Factory name is specified in the settings, BimlFlex names the Data Factory BimlFlex in the logical view and generates a random name for the deployment.

ARM Template Parameters File

An ARM template can include parameters that allow for customized deployments. For example, they can provide values tailored to a specific environment (e.g. dev, test, or production). In PowerShell, the parameters can be specified as arguments to the deployment command, or they can be included in a parameters file. BimlFlex generates the parameters file, populated with parameter values from the project settings.
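
For illustration, both approaches are sketched below using the standard Az cmdlet New-AzResourceGroupDeployment; the resource group name, paths, and factory name are placeholders and not part of the BimlFlex output.

# Deploy using the generated parameters file
New-AzResourceGroupDeployment -ResourceGroupName "MyResourceGroup" `
    -TemplateFile ".\arm_template.json" `
    -TemplateParameterFile ".\arm_template_parameters.json"

# Or supply parameter values directly as arguments
New-AzResourceGroupDeployment -ResourceGroupName "MyResourceGroup" `
    -TemplateFile ".\arm_template.json" `
    -TemplateParameterObject @{ factoryName = "ADF-MyTargetFactory" }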

Example parameters, including a Snowflake Azure Function Bridge

Parameter Name                                               | BimlFlex Setting        | Description
AzureFunctionBridgeName_functionKey.value                    | AzureFunctionBridgeKey  |
BimlFlexAutogeneratedKeyVaultSettings.keyVaultName           | AzureKeyVault           |
BimlFlexAzureFunctionBridgeSettings.value.AppInsightsName    | AzureFunctionBridgeName | [VALUE]-AI
BimlFlexAzureFunctionBridgeSettings.value.AppName            | AzureFunctionBridgeName |
BimlFlexAzureFunctionBridgeSettings.value.FunctionKey        | AzureFunctionBridgeKey  |
BimlFlexAzureFunctionBridgeSettings.value.StorageAccountName | AzureFunctionBridgeName | [VALUE]sa
BimlFlexAzureFunctionBridgeSettings.value.WebFarmName        | AzureFunctionBridgeName | [VALUE]-WF
factoryName.value                                            | AzureDataFactoryName    |

ARM Template Size Limitations

Microsoft provides a detailed account of ARM template best practices in the Azure Resource Manager documentation.

The most significant ARM template limitation for BimlFlex is that the final template must be smaller than 4 MB.

Additional documentation on how BimlFlex handles ARM template size restrictions, and on the generation of a linked template folder for individual data sets, is available in the BimlFlex documentation.
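
As a rough pre-deployment check, the size of the generated template can be compared against that limit with a short PowerShell snippet; the path below is a placeholder for your own output location.

# Placeholder path; point this at the generated arm_template.json
$armTemplate = "C:\BimlFlex\MyProject\output\DataFactories\ADF-MyDataFactory\arm_template.json"

$templateSizeMB = (Get-Item $armTemplate).Length / 1MB
if ($templateSizeMB -ge 4) {
    Write-Warning ("arm_template.json is {0:N2} MB; ARM templates must stay below 4 MB" -f $templateSizeMB)
}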

Example: SQL ADF arm_template_parameters.json

This example file illustrates a dynamically generated ADF name and a corresponding Key Vault name.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "BimlFlexAutogeneratedKeyVaultSettings": {
      "value": {
        "keyVaultName": "AKV-281a7eeb2eb3d",
        "skuName": "Standard"
      }
    },
    "factoryName": {
      "value": "ADF-281a7eeb2eb3d"
    }
  }
}
Note

Populate the AzureDataFactoryName and AzureKeyVault settings with the expected target names instead of relying on the autogenerated values from BimlFlex.

Example: Snowflake ADF arm_template_parameters.json

This example file illustrates a Snowflake deployment file with parameters defined.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "AzureFunctionBridgeName_functionKey": {
      "value": "AzureFunctionBridgeKey"
    },
    "BimlFlexAutogeneratedKeyVaultSettings": {
      "value": {
        "keyVaultName": "AzureKeyVault",
        "skuName": "Standard"
      }
    },
    "BimlFlexAzureFunctionBridgeSettings": {
      "value": {
        "AppInsightsName": "AzureFunctionBridgeName-AI",
        "AppName": "AzureFunctionBridgeName",
        "DeployAppInsights": true,
        "FunctionKey": "AzureFunctionBridgeKey",
        "StorageAccountName": "azurefunctionbridgenamesa",
        "StorageAccountType": "Standard_LRS",
        "WebFarmName": "AzureFunctionBridgeName-WF"
      }
    },
    "factoryName": {
      "value": "AzureDataFactoryName"
    }
  }
}

Generated SSDT Deployment Scripts

BimlFlex creates PowerShell deployment scripts for SQL Server Data Tools projects. These are used to deploy SSDT database project definitions to compatible targets.

A deploy all script is created with the name:

_ssdt-deploy-all.ps1

This script calls the individual build and deploy scripts for each database project.

These scripts are located in the ...\output\Deploy\ folder in the defined output location.

Running the _ssdt-deploy-all.ps1 script will deploy the schema to the relevant database for supported SSDT targets.
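
For example, assuming the output location shown later in this document (the path is a placeholder), the script can be run as follows:

# Placeholder path; change to the project's output Deploy folder
Set-Location "C:\Varigence-Test\ADF-ELT-SQLDW\output\Deploy"
.\_ssdt-deploy-all.ps1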

Generated Deployment Scripts

BimlFlex creates PowerShell deployment scripts for the Data Factory ARM template.

These scripts are located in the ...\output\Deploy\ folder in the defined output location.

A deploy script is created with a name following this pattern:

adf-deploy.<Setting.AzureDataFactoryName>.ps1

Note

Execution only requires running the *.ps1 file and not manually running the PowerShell commands.

Important

If no value is specified for the AzureDataFactoryName setting, the deployed Data Factory will be named according to this convention: ADF-<RandomHashValue>.

The file contents show the commands that are used. At the top of the file are commented-out commands for installing the Azure cmdlets and connecting to the specified environment.

# If required run the following command to install the Azure cmdlets
# Install-Module -Name Az -AllowClobber -Scope CurrentUser

# If required run the following command to connect to your Azure account
# Connect-AzAccount
Note

If further details on the generated commands are needed, or to create these commands manually, refer to the Microsoft Docs articles linked below:
Get started with Azure PowerShell
Connect-AzAccount

Similar to the arm_template_parameters file above, BimlFlex generates variables that store setting values from the project and passes them into the deployment commands.

# Provide your Subscription and ResourceGroupName below 
$azureSubscriptionId = "00000000-0000-0000-0000-000000"
$azureResourceGroup = "BFX_Test"

$outputBasePath = "C:\Varigence-Test\ADF-ELT-SQLDW\output\";
$armTemplatePath = "$($outputBasePath)\DataFactories\BimlFlex\arm_template.json"
$armTemplateParamsPath = "$($outputBasePath)\DataFactories\BimlFlex\arm_template_parameters.json"
Variable Name           | BimlFlex Setting     | Description
$azureSubscriptionId    | AzureSubscriptionId  | The Subscription Id to deploy to, defined in the settings
$azureResourceGroup     | AzureResourceGroup   | The Resource Group to deploy to, defined in the settings
$outputBasePath         | OutputPath           | The BimlFlex project's output path
$armTemplatePath        | AzureDataFactoryName | If not supplied, BimlFlex is used as the Data Factory folder name
$armTemplateParamsPath  | AzureDataFactoryName | If not supplied, BimlFlex is used as the Data Factory folder name

Run this file to deploy the ADF ARM Template assets to Azure.
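
For reference, the core of such a deployment, sketched here with the standard Az cmdlets rather than the exact commands in the generated script, looks like this:

# Select the subscription defined in the BimlFlex settings
Set-AzContext -SubscriptionId $azureSubscriptionId

# Deploy the ARM template and its parameters file to the resource group
New-AzResourceGroupDeployment -ResourceGroupName $azureResourceGroup `
    -TemplateFile $armTemplatePath `
    -TemplateParameterFile $armTemplateParamsPath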

Post Deployment

Pipelines are now available for verification or execution inside the ADF Authoring Tool.

Pipelines can either be run manually or started using ADF triggers.
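
To run a pipeline manually from PowerShell instead of the Authoring Tool, a call along the following lines can be used; the resource group, Data Factory, and pipeline names are placeholders based on the examples in this document.

# Placeholder names; the pipeline name follows the 0_<BatchName>_Batch pattern used by BimlFlex
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "BFX_Test" `
    -DataFactoryName "ADF-MyDataFactory" `
    -PipelineName "0_MyBatch_Batch" `
    -Parameter @{ "IsInitialLoad" = "false" }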

Create Triggers

The following options are available for creating Triggers to run the Pipelines:

  1. Create the Trigger in the ADF Authoring Tool or through PowerShell (see the sketch after this list).
  2. Create the Trigger in BimlFlex through an Extension Point.
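
For option 1, a trigger can be created from PowerShell by pointing Set-AzDataFactoryV2Trigger at a JSON trigger definition file; the names and file path below are placeholders.

# Create or update a trigger from a JSON definition file (placeholder names)
Set-AzDataFactoryV2Trigger -ResourceGroupName "BFX_Test" `
    -DataFactoryName "ADF-MyDataFactory" `
    -Name "ScheduleTriggerName" `
    -DefinitionFile ".\ScheduleTriggerName.json"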

An example Extension Point that creates an ADF trigger is listed below. Adding the trigger as an Extension Point will include it in the ARM template, enabling it to be deployed with the rest of the ADF assets.

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="AdfBatchTrigger" Target="<AddBatchNameAsTargetHere>" #>
<#@ property name="batch" type="BimlFlexModelWrapper.BatchesWrapper" #>
<Schedule Name="ScheduleTriggerName" Frequency="Hour" Interval="1" Start="2001-01-01" End="2020-12-31">
    <Pipelines>
        <Pipeline PipelineName="0_<#=batch.Name #>_Batch">
            <Parameters>
                <Parameter Name="IsInitialLoad">false</Parameter>
            </Parameters>
        </Pipeline>
    </Pipelines>
</Schedule>
Note

This will create a trigger that runs the pipeline once every hour, from January 1st, 2001, to December 31st, 2020.

Once a trigger is deployed, it must be started.

Start it manually using the Authoring Tool, or by running the following command in PowerShell with the variables populated with the target values:

Start-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name "ScheduleTriggerName"
