Using .NET Control Structures to Fully Automate BimlScript

Transform a single BimlScript template into hundreds of pipelines automatically. This lesson teaches you to use loops, conditionals, and live metadata queries to eliminate manual package creation entirely.

What You'll Learn

  • Use foreach loops to iterate over table collections
  • Apply if/else conditionals for platform-specific logic
  • Query live database schemas with ImportDB() and GetDatabaseSchema()
  • Access Biml objects through RootNode API
  • Organize multi-file solutions with tiers
  • Generate pipelines for SSIS, ADF, Databricks, Snowflake, and Fabric

Prerequisites: Using .NET Variables (understand text/control nuggets)


Why Full Automation Matters

Compare the effort required to create 100 staging packages:

| Approach | Time per Package | Total Time | Maintenance Cost |
| --- | --- | --- | --- |
| Manual SSDT/ADF UI | 15 minutes | 25 hours | Edit each package |
| Variables (previous lesson) | 15 seconds | 25 minutes | Edit and rebuild each |
| Control structures | 0 seconds | Single build | Rebuild all instantly |

Business Benefits

  • Consistency: Every package follows the same pattern with no human variation
  • Speed: Add 50 tables to scope? Rebuild takes seconds
  • Maintainability: Change the pattern once, regenerate everything
  • Auditability: Template is version-controlled; changes are traceable

Creating Table Collections

Step 1: Define a Static Collection

Start with a hardcoded list to understand the mechanics:

<#
var tableNames = new[] {
    "DimAccount",
    "DimCurrency",
    "DimCustomer",
    "DimDate",
    "DimProduct"
};
#>

This creates a C# string array. The next step iterates over it.

Step 2: Loop with Foreach

Wrap your package definition in a foreach loop:

<#
var tableNames = new[] {
    "DimAccount",
    "DimCurrency",
    "DimCustomer",
    "DimDate",
    "DimProduct"
};
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="Source" ConnectionString="Provider=SQLNCLI11;Server=SOURCE_SERVER;Initial Catalog=AdventureWorksDW2012;Integrated Security=SSPI;" />
        <OleDbConnection Name="Target" ConnectionString="Provider=SQLNCLI11;Server=localhost;Initial Catalog=Staging;Integrated Security=SSPI;" />
    </Connections>
    <Packages>
        <# foreach (var tableName in tableNames) { #>
        <Package Name="Load_<#=tableName#>" ConstraintMode="Linear">
            <Tasks>
                <Dataflow Name="DFT_Load">
                    <Transformations>
                        <OleDbSource Name="Source" ConnectionName="Source">
                            <DirectInput>SELECT * FROM [dbo].[<#=tableName#>]</DirectInput>
                        </OleDbSource>
                        <OleDbDestination Name="Destination" ConnectionName="Target">
                            <ExternalTableOutput Table="[stg].[<#=tableName#>]" />
                        </OleDbDestination>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
        <# } #>
    </Packages>
</Biml>

Output: Five packages named Load_DimAccount.dtsx, Load_DimCurrency.dtsx, etc.

Key Syntax Points

| Element | Purpose |
| --- | --- |
| <# foreach (...) { #> | Opens loop (control nugget) |
| <#=tableName#> | Outputs current value (text nugget) |
| <# } #> | Closes loop (control nugget) |

The opening brace { must be inside the control nugget. The closing brace } needs its own control nugget.
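
For example, nesting a conditional inside the loop takes four nuggets, one per brace boundary. A minimal sketch using the tableNames array from Step 1:

<# foreach (var tableName in tableNames) { #>
<# if (tableName.StartsWith("Dim")) { #>
<!-- <#=tableName#> is a dimension table -->
<# } #>
<# } #>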


Querying Live Database Metadata

Hardcoded lists require manual updates when tables change. Query the database directly instead.

Using ImportDB (Simple)

ImportDB() connects to a database and retrieves all schema information:

<#
var sourceConn = RootNode.DbConnections["Source"];
var importResult = sourceConn.ImportDB();
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <# foreach (var table in importResult.TableNodes) { #>
        <Package Name="Load_<#=table.Schema.Name#>_<#=table.Name#>" ConstraintMode="Linear">
            <Tasks>
                <Dataflow Name="DFT_Load">
                    <Transformations>
                        <OleDbSource Name="Source" ConnectionName="Source">
                            <DirectInput>SELECT * FROM [<#=table.Schema.Name#>].[<#=table.Name#>]</DirectInput>
                        </OleDbSource>
                        <OleDbDestination Name="Destination" ConnectionName="Target">
                            <ExternalTableOutput Table="[stg].[<#=table.Name#>]" />
                        </OleDbDestination>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
        <# } #>
    </Packages>
</Biml>

The importResult.TableNodes collection contains full table metadata including columns, data types, and schema information.
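
As a quick sketch of drilling into that metadata, the nested loop below emits each table's columns as XML comments (the same properties are used throughout this lesson):

<# foreach (var table in importResult.TableNodes) { #>
<!-- <#=table.SchemaQualifiedName#> -->
<# foreach (var col in table.Columns) { #>
<!-- Column: <#=col.Name#>, DataType: <#=col.DataType#> -->
<# } #>
<# } #>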

Using GetDatabaseSchema (Advanced)

For more control, use GetDatabaseSchema() with filters:

<#@ import namespace="Varigence.Biml.CoreLowerer.SchemaManagement" #>
<#
var sourceConn = RootNode.OleDbConnections["Source"];
var includedSchemas = new List<string> { "Sales", "Person", "Production" };

var tables = sourceConn.GetDatabaseSchema(
    includedSchemas,                     // Include only these schemas
    null,                                // No table name filter
    ImportOptions.ExcludeForeignKey |    // Skip FK metadata
    ImportOptions.ExcludeViews |         // Skip views
    ImportOptions.ExcludeColumnDefault   // Skip default constraints
).TableNodes;
#>

ImportOptions Reference

| Option | Effect |
| --- | --- |
| ExcludeForeignKey | Faster import, no FK metadata |
| ExcludeViews | Tables only, skip views |
| ExcludeColumnDefault | Skip default constraint metadata |
| ExcludeIdentity | Skip identity column metadata |
| ExcludeIndex | Skip index metadata |

Combining options with | (bitwise OR) applies multiple filters.
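
Since ImportOptions is a flag enum, you can also compose the filters once in a variable and reuse the combination. A minimal sketch, assuming the sourceConn variable from the example above:

<#
// Combine filters once; | merges the flags into a single value
var options = ImportOptions.ExcludeForeignKey
            | ImportOptions.ExcludeViews
            | ImportOptions.ExcludeIndex;
var filteredTables = sourceConn.GetDatabaseSchema(null, null, options).TableNodes;
#>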


Accessing Biml Objects with RootNode

RootNode provides access to all objects defined across your Biml files. This enables one file to reference objects defined in another.

Common Collections

| Property | Returns | Example |
| --- | --- | --- |
| RootNode.Connections | All connections | RootNode.Connections["Source"] |
| RootNode.DbConnections | Database connections only | RootNode.DbConnections["Source"] |
| RootNode.OleDbConnections | OLE DB connections | RootNode.OleDbConnections["Source"] |
| RootNode.Databases | Database definitions | RootNode.Databases["Staging"] |
| RootNode.Schemas | Schema definitions | RootNode.Schemas["Staging.dbo"] |
| RootNode.Tables | Table definitions | RootNode.Tables["Customer"] |
| RootNode.Packages | Package definitions | RootNode.Packages["Load_Customer"] |

Filtering with LINQ

Apply standard LINQ methods to filter and sort collections:

<#
// Tables in Sales schema only
var salesTables = RootNode.Tables.Where(t => t.Schema.Name == "Sales");

// Tables with specific annotation
var fullLoadTables = RootNode.Tables.Where(t => t.GetTag("LoadType") == "Full");

// First 10 tables alphabetically
var firstTen = RootNode.Tables.OrderBy(t => t.Name).Take(10);

// Tables containing "Dim" in name
var dimTables = RootNode.Tables.Where(t => t.Name.Contains("Dim"));

// Count tables per schema
var schemaGroups = RootNode.Tables.GroupBy(t => t.Schema.Name);
#>
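
The GroupBy result is itself enumerable; for example, this sketch emits one summary comment per schema (schemaGroup.Key holds the schema name):

<# foreach (var schemaGroup in RootNode.Tables.GroupBy(t => t.Schema.Name)) { #>
<!-- Schema <#=schemaGroup.Key#>: <#=schemaGroup.Count()#> table(s) -->
<# } #>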

Accessing Table Properties

Each table object exposes rich metadata:

<# foreach (var table in RootNode.Tables) { #>
<!--
    Table:        <#=table.Name#>
    Schema:       <#=table.Schema.Name#>
    Qualified:    <#=table.SchemaQualifiedName#>
    Column Count: <#=table.Columns.Count#>
-->
<# } #>

Applying Conditional Logic

Use if/else statements to customize output based on metadata.

Basic Conditional

Add optional steps based on table annotations:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <# foreach (var table in RootNode.Tables) { #>
        <Package Name="Load_<#=table.Name#>" ConstraintMode="Linear">
            <Tasks>
                <# if (table.GetTag("TruncateFirst") == "true") { #>
                <ExecuteSQL Name="Truncate" ConnectionName="Target">
                    <DirectInput>TRUNCATE TABLE [stg].[<#=table.Name#>]</DirectInput>
                </ExecuteSQL>
                <# } #>
                <Dataflow Name="LoadData">
                    <Transformations>
                        <OleDbSource Name="Source" ConnectionName="Source">
                            <DirectInput>SELECT * FROM [<#=table.Schema.Name#>].[<#=table.Name#>]</DirectInput>
                        </OleDbSource>
                        <OleDbDestination Name="Destination" ConnectionName="Target">
                            <ExternalTableOutput Table="[stg].[<#=table.Name#>]" />
                        </OleDbDestination>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
        <# } #>
    </Packages>
</Biml>

Load Type Logic

Handle different load patterns based on metadata:

<#
// Inside the foreach loop
var loadType = table.GetTag("LoadType") ?? "Full";
string preSql = "";
string whereSql = "";

if (loadType == "Full") {
    preSql = $"TRUNCATE TABLE [stg].[{table.Name}]";
} else if (loadType == "Incremental") {
    whereSql = "WHERE ModifiedDate > @LastLoadDate";
} else if (loadType == "Append") {
    // Keep both defaults: no pre-SQL, no filter; rows are simply appended
}
#>
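
Downstream, the two variables drop straight into the package body. A sketch of how they might be consumed inside the Tasks element, with the empty-string check skipping the Execute SQL task when no pre-SQL applies:

<# if (!string.IsNullOrEmpty(preSql)) { #>
<ExecuteSQL Name="PreLoad" ConnectionName="Target">
    <DirectInput><#=preSql#></DirectInput>
</ExecuteSQL>
<# } #>
<Dataflow Name="LoadData">
    <Transformations>
        <OleDbSource Name="Source" ConnectionName="Source">
            <DirectInput>SELECT * FROM [<#=table.Schema.Name#>].[<#=table.Name#>] <#=whereSql#></DirectInput>
        </OleDbSource>
        <OleDbDestination Name="Destination" ConnectionName="Target">
            <ExternalTableOutput Table="[stg].[<#=table.Name#>]" />
        </OleDbDestination>
    </Transformations>
</Dataflow>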

Conditional Column Processing

Apply transformations based on column data types:

<# foreach (var table in RootNode.Tables) { #>
<Dataflow Name="Load_<#=table.Name#>">
    <Transformations>
        <OleDbSource Name="Source" ConnectionName="Source">
            <DirectInput>SELECT * FROM [<#=table.Schema.Name#>].[<#=table.Name#>]</DirectInput>
        </OleDbSource>
        <#
        var ansiColumns = table.Columns.Where(c => c.DataType == System.Data.DbType.AnsiString);
        if (ansiColumns.Any()) {
        #>
        <DataConversion Name="ConvertToUnicode">
            <Columns>
                <# foreach (var col in ansiColumns) { #>
                <Column SourceColumn="<#=col.Name#>" TargetColumn="<#=col.Name#>_Unicode" DataType="String" Length="<#=col.Length#>" />
                <# } #>
            </Columns>
        </DataConversion>
        <# } #>
        <OleDbDestination Name="Destination" ConnectionName="Target">
            <ExternalTableOutput Table="[stg].[<#=table.Name#>]" />
        </OleDbDestination>
    </Transformations>
</Dataflow>
<# } #>

Organizing Multi-File Solutions

Complex solutions require multiple Biml files with controlled compilation order.

Tier System

Use the tier directive to control compilation order:

<#@ template tier="10" #>  <!-- Compiles first: connections, databases -->
<#@ template tier="20" #> <!-- Compiles second: tables referencing connections -->
<#@ template tier="30" #> <!-- Compiles third: packages referencing tables -->

Lower numbers compile first. Objects must exist before they can be referenced.

File Naming Convention

Use numbered prefixes to indicate build steps:

1-2-Environment.biml          # Steps 1 and 2: connections, databases, schemas
1-2-CreateTables.biml         # Steps 1 and 2: table definitions
1-x-DeployTables.biml         # Step 1 only: DDL deployment package
x-2-CreateLoadPackages.biml   # Step 2 only: load packages

  • Build Step 1: Select the files starting with 1- (the first three above)
  • Build Step 2: Select the files with 2 in the second position (the 1-2- and x-2- files)

Example: Complete Staging Solution

1-2-Environment.biml (tier 10)

<#@ template tier="10" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="Source" ConnectionString="Provider=SQLNCLI11;Server=SOURCE_SERVER;Initial Catalog=AdventureWorksDW2012;Integrated Security=SSPI;" />
        <OleDbConnection Name="Target" ConnectionString="Provider=SQLNCLI11;Server=localhost;Initial Catalog=Staging;Integrated Security=SSPI;" />
    </Connections>
    <Databases>
        <Database Name="Staging" ConnectionName="Target" />
    </Databases>
    <Schemas>
        <Schema Name="stg" DatabaseName="Staging" />
    </Schemas>
</Biml>

1-2-CreateTables.biml (tier 20)

<#@ template tier="20" #>
<#@ import namespace="Varigence.Biml.CoreLowerer.SchemaManagement" #>
<#
var source = RootNode.OleDbConnections["Source"];
var tables = source.GetDatabaseSchema(
    null, null,
    ImportOptions.ExcludeForeignKey | ImportOptions.ExcludeViews
).TableNodes;
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Tables>
        <# foreach (var table in tables) { #>
        <Table Name="<#=table.Name#>" SchemaName="Staging.stg">
            <Columns>
                <# foreach (var col in table.Columns) { #>
                <#=col.GetBiml()#>
                <# } #>
                <Column Name="LoadDateTime" DataType="DateTime2" />
            </Columns>
            <Annotations>
                <Annotation Tag="SourceTable"><#=table.SchemaQualifiedName#></Annotation>
            </Annotations>
        </Table>
        <# } #>
    </Tables>
</Biml>

1-x-DeployTables.biml (tier 30)

<#@ template tier="30" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <Package Name="DeployTables" ConstraintMode="Parallel">
            <Tasks>
                <# foreach (var table in RootNode.Tables) { #>
                <ExecuteSQL Name="Create_<#=table.Name#>" ConnectionName="Target">
                    <DirectInput><#=table.GetDropAndCreateDdl()#></DirectInput>
                </ExecuteSQL>
                <# } #>
            </Tasks>
        </Package>
    </Packages>
</Biml>

x-2-CreateLoadPackages.biml (tier 30)

<#@ template tier="30" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <# foreach (var table in RootNode.Tables) { #>
        <Package Name="Load_<#=table.Schema.Name#>_<#=table.Name#>" ConstraintMode="Linear">
            <Tasks>
                <ExecuteSQL Name="Truncate" ConnectionName="Target">
                    <DirectInput>TRUNCATE TABLE [<#=table.Schema.Name#>].[<#=table.Name#>]</DirectInput>
                </ExecuteSQL>
                <Dataflow Name="LoadData">
                    <Transformations>
                        <OleDbSource Name="Src" ConnectionName="Source">
                            <DirectInput>SELECT * FROM <#=table.GetTag("SourceTable")#></DirectInput>
                        </OleDbSource>
                        <DerivedColumns Name="AddAudit">
                            <Columns>
                                <Column Name="LoadDateTime" DataType="DateTime2" Scale="7">(DT_DBTIMESTAMP2,7)GETDATE()</Column>
                            </Columns>
                        </DerivedColumns>
                        <OleDbDestination Name="Dst" ConnectionName="Target">
                            <ExternalTableOutput Table="[<#=table.Schema.Name#>].[<#=table.Name#>]" />
                        </OleDbDestination>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
        <# } #>

        <!-- Master workflow package -->
        <Package Name="Workflow_LoadAll" ConstraintMode="Parallel">
            <Tasks>
                <# foreach (var table in RootNode.Tables) { #>
                <ExecutePackage Name="Run_<#=table.Name#>">
                    <ExternalProjectPackage Package="Load_<#=table.Schema.Name#>_<#=table.Name#>.dtsx" />
                </ExecutePackage>
                <# } #>
            </Tasks>
        </Package>
    </Packages>
</Biml>

Platform-Specific Examples

The same automation principles apply across all Biml-supported platforms.

Azure Data Factory

Generate Copy Activities for each table:

<# foreach (var table in RootNode.Tables) { #>
<Pipeline Name="Copy_<#=table.Name#>">
    <Activities>
        <Copy Name="CopyData" EnableStaging="false">
            <Source xsi:type="AzureSqlSource">
                <SqlReaderQuery>SELECT * FROM [<#=table.Schema.Name#>].[<#=table.Name#>]</SqlReaderQuery>
            </Source>
            <Sink xsi:type="AzureSqlSink">
                <TableName>[stg].[<#=table.Name#>]</TableName>
                <PreCopyScript>TRUNCATE TABLE [stg].[<#=table.Name#>]</PreCopyScript>
            </Sink>
        </Copy>
    </Activities>
</Pipeline>
<# } #>

Databricks

Generate MERGE statements for each table:

<# foreach (var table in RootNode.Tables) { #>
-- Load <#=table.Name#>
MERGE INTO staging.<#=table.Name#> AS target
USING source.<#=table.Name#> AS source
ON target.id = source.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

<# } #>
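
The hardcoded ON target.id = source.id join only works if every table keys on an id column. A hedged variant reads the key from metadata instead, here via a hypothetical KeyColumn annotation tag; adjust to however your solution records keys:

<#
// Hypothetical "KeyColumn" tag; falls back to "id" when the tag is absent
var keyColumn = table.GetTag("KeyColumn");
if (string.IsNullOrEmpty(keyColumn)) { keyColumn = "id"; }
#>
MERGE INTO staging.<#=table.Name#> AS target
USING source.<#=table.Name#> AS source
ON target.<#=keyColumn#> = source.<#=keyColumn#>
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;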

Snowflake

Generate COPY INTO statements:

<# foreach (var table in RootNode.Tables) { #>
-- Load <#=table.Name#> from stage
COPY INTO staging.<#=table.Name#>
FROM @source_stage/<#=table.Name#>/
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

<# } #>

Microsoft Fabric

Generate Lakehouse table definitions:

<# foreach (var table in RootNode.Tables) { #>
-- Create staging table for <#=table.Name#>
CREATE TABLE IF NOT EXISTS lakehouse.staging.<#=table.Name#> (
<# foreach (var col in table.Columns) { #>
    <#=col.Name#> <#=GetSparkDataType(col.DataType)#>,
<# } #>
    LoadDateTime TIMESTAMP
)
USING DELTA;

<# } #>
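
Note that GetSparkDataType is not a built-in Biml method; the example assumes you supply it yourself in a class nugget. A minimal sketch covering a few common DbType values (extend the switch for your sources):

<#+
// Hypothetical helper assumed by the Fabric example above: maps
// System.Data.DbType values to Spark SQL type names.
public string GetSparkDataType(System.Data.DbType dbType)
{
    switch (dbType)
    {
        case System.Data.DbType.Int32:     return "INT";
        case System.Data.DbType.Int64:     return "BIGINT";
        case System.Data.DbType.Decimal:   return "DECIMAL(18, 2)";
        case System.Data.DbType.DateTime2: return "TIMESTAMP";
        case System.Data.DbType.Boolean:   return "BOOLEAN";
        default:                           return "STRING";  // safe fallback
    }
}
#>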

For comprehensive platform-specific guidance, see BimlFlex Documentation.


CI/CD Integration Benefits

Automated Biml generation integrates naturally with CI/CD pipelines.

Workflow

  1. Developer commits BimlScript template changes
  2. CI pipeline triggers Biml build
  3. Build generates all packages/pipelines
  4. Automated tests validate generated artifacts
  5. CD pipeline deploys to target environment

Key Advantages

| Traditional Approach | Biml Automation |
| --- | --- |
| Commit 100 package files | Commit 1 template file |
| Review 100 package diffs | Review 1 template diff |
| Merge conflicts in packages | Merge conflicts in template only |
| Manual deployment of changed packages | Regenerate and deploy all |

Command-Line Build

# BimlStudio command-line build
bimlstudio.exe build MyProject.mst /output:./output

# Integrate with Azure DevOps, GitHub Actions, Jenkins

Common Issues and Fixes

Loop Not Generating Output

Symptom: Package element appears once or not at all

Checks:

  1. Verify collection isn't empty: <# WriteLine($"Table count: {tables.Count()}"); #>
  2. Ensure closing brace has its own nugget: <# } #>
  3. Check tier order: referenced objects must compile first

"Object reference not set" Error

Symptom: Null reference exception during build

Cause: Accessing property on null object

Fix:

<#
// Wrong - crashes if GetTag returns null
var loadType = table.GetTag("LoadType").ToUpper();

// Right - use null-conditional operator
var loadType = table.GetTag("LoadType")?.ToUpper() ?? "FULL";
#>

Connection Not Found

Symptom: RootNode.DbConnections["Source"] returns null

Cause: Connection file has higher tier than current file

Fix: Ensure connection file has lower tier number:

<!-- Environment.biml -->
<#@ template tier="10" #>

<!-- LoadPackages.biml -->
<#@ template tier="20" #>

ImportDB Returns No Tables

Symptom: importResult.TableNodes is empty

Checks:

  1. Connection string is valid and server accessible
  2. User has permission to read metadata
  3. Database contains tables (not just views)
  4. Schema filter isn't excluding all tables

Multi-Select Build Issues (BimlExpress)

Symptom: Objects not found when building

Cause: Not all required files selected

Fix: In BimlExpress, Ctrl+Click to multi-select all dependent Biml files before right-clicking to build.


Quick Reference

Loop Syntax

| Pattern | Syntax |
| --- | --- |
| Open loop | <# foreach (var item in collection) { #> |
| Output value | <#=item.Property#> |
| Close loop | <# } #> |
| Conditional | <# if (condition) { #> ... <# } #> |
| Else | <# } else { #> |
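
Putting the last two rows together, a minimal sketch of an if/else split across nuggets:

<# if (table.Name.StartsWith("Dim")) { #>
<!-- dimension-specific tasks here -->
<# } else { #>
<!-- fact-specific tasks here -->
<# } #>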

Metadata Methods

| Method | Purpose |
| --- | --- |
| ImportDB() | Quick schema import |
| GetDatabaseSchema() | Filtered schema import |
| GetBiml() | Convert object to Biml XML |
| GetDropAndCreateDdl() | Generate DDL script |
| GetColumnList() | Comma-separated column names |
| GetTag("name") | Read annotation value |
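
For example, GetColumnList() can replace SELECT * with an explicit column list; a sketch (the exact formatting of the generated list may vary by Biml version):

<OleDbSource Name="Source" ConnectionName="Source">
    <DirectInput>SELECT <#=table.GetColumnList()#> FROM <#=table.SchemaQualifiedName#></DirectInput>
</OleDbSource>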

Common LINQ Operations

| Operation | Example |
| --- | --- |
| Filter | .Where(t => t.Schema.Name == "Sales") |
| Sort | .OrderBy(t => t.Name) |
| Limit | .Take(10) |
| Check exists | .Any() |
| Count | .Count() |

Next Steps