
Microsoft Fabric Integration

Microsoft Fabric is a unified analytics platform that combines data engineering, data warehousing, data science, and real-time analytics. This guide covers how BimlStudio can be used to develop and deploy data solutions targeting Microsoft Fabric.

Overview

BimlStudio supports Microsoft Fabric through:

  • Connection definitions for Fabric data sources
  • Table definitions that can be deployed to Fabric Data Warehouse
  • SQL script generation compatible with Fabric T-SQL
  • Integration with Azure Data Factory pipelines that work with Fabric

Prerequisites

Before integrating BimlStudio with Microsoft Fabric:

  1. Access to a Microsoft Fabric workspace with appropriate permissions
  2. Familiarity with Fabric concepts (Lakehouse, Warehouse, OneLake)
  3. BimlStudio installed with a valid license

Connecting to Fabric Data Warehouse

Connection String Configuration

Create a connection in BimlStudio for your Fabric Data Warehouse:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="FabricWarehouse"
                     ConnectionString="Provider=MSOLEDBSQL;
                                       Data Source=your-workspace.datawarehouse.fabric.microsoft.com;
                                       Initial Catalog=YourWarehouse;
                                       Authentication=ActiveDirectoryInteractive;" />
  </Connections>
</Biml>

Authentication Options

Fabric supports several authentication methods:

Method                        Use Case
------                        --------
Azure AD Interactive          Development and testing
Azure AD Service Principal    CI/CD pipelines
Azure AD Managed Identity     Azure-hosted solutions

Service Principal Example:

<OleDbConnection Name="FabricWarehouse"
                 ConnectionString="Provider=MSOLEDBSQL;
                                   Data Source=your-workspace.datawarehouse.fabric.microsoft.com;
                                   Initial Catalog=YourWarehouse;
                                   Authentication=ActiveDirectoryServicePrincipal;
                                   User ID=your-app-id;
                                   Password=your-client-secret;" />
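
Embedding the client secret directly in Biml source is risky; in practice the secret is usually injected at build or deploy time. A minimal Python sketch (hypothetical helper, not part of BimlStudio) that assembles the same connection string with the secret read from an environment variable, assuming a variable named FABRIC_CLIENT_SECRET:

```python
import os

def fabric_sp_connection_string(workspace_host: str, warehouse: str, app_id: str) -> str:
    """Build a Fabric Warehouse OLE DB connection string for a service principal.

    The client secret is read from the environment so it never lands in
    source control; FABRIC_CLIENT_SECRET is an assumed variable name.
    """
    secret = os.environ["FABRIC_CLIENT_SECRET"]
    parts = [
        "Provider=MSOLEDBSQL",
        f"Data Source={workspace_host}",
        f"Initial Catalog={warehouse}",
        "Authentication=ActiveDirectoryServicePrincipal",
        f"User ID={app_id}",
        f"Password={secret}",
    ]
    return ";".join(parts) + ";"
```

In a CI/CD pipeline the environment variable would be populated from a secret store (Azure Key Vault, pipeline variables) rather than typed by hand.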

Working with Fabric Lakehouse

Connecting to Lakehouse SQL Endpoint

Fabric Lakehouse provides a SQL analytics endpoint for querying Delta tables:

<Connections>
  <OleDbConnection Name="FabricLakehouse"
                   ConnectionString="Provider=MSOLEDBSQL;
                                     Data Source=your-workspace.sql.fabric.microsoft.com;
                                     Initial Catalog=YourLakehouse;
                                     Authentication=ActiveDirectoryInteractive;" />
</Connections>

Defining Tables for Lakehouse

When targeting Lakehouse, consider these patterns:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Tables>
    <Table Name="FactSales" SchemaName="dbo">
      <Columns>
        <Column Name="SalesKey" DataType="Int64" IsNullable="false" />
        <Column Name="CustomerKey" DataType="Int64" IsNullable="false" />
        <Column Name="ProductKey" DataType="Int64" IsNullable="false" />
        <Column Name="OrderDate" DataType="Date" IsNullable="false" />
        <Column Name="SalesAmount" DataType="Decimal" Precision="18" Scale="2" />
      </Columns>
    </Table>
  </Tables>
</Biml>

Data Warehouse Patterns

Creating Fabric-Compatible DDL

Fabric Data Warehouse supports standard T-SQL with some limitations. Use BimlStudio to generate compatible DDL:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Databases>
    <Database Name="Analytics" />
  </Databases>
  <Schemas>
    <Schema Name="dim" DatabaseName="Analytics" />
    <Schema Name="fact" DatabaseName="Analytics" />
  </Schemas>
  <Tables>
    <Table Name="DimCustomer" SchemaName="dim" DatabaseName="Analytics">
      <Columns>
        <Column Name="CustomerKey" DataType="Int64" IsNullable="false" />
        <Column Name="CustomerID" DataType="String" Length="20" IsNullable="false" />
        <Column Name="CustomerName" DataType="String" Length="100" />
        <Column Name="Email" DataType="String" Length="256" />
        <Column Name="CreatedDate" DataType="DateTime" />
      </Columns>
    </Table>
  </Tables>
</Biml>

Considerations for Fabric

  1. No clustered indexes - Fabric Warehouse uses automatic distribution
  2. No primary key constraints - Define logical keys in metadata only
  3. Use appropriate data types - Fabric has specific type mappings
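
The data-type point can be made concrete. Below is a rough Python sketch of emitting Fabric-compatible CREATE TABLE DDL from column metadata; the type map is illustrative (not exhaustive, and not BimlStudio's actual emitter), and it deliberately omits the PRIMARY KEY and clustered-index clauses that Fabric Warehouse does not use:

```python
# Illustrative mapping from Biml-style logical types to Fabric Warehouse
# T-SQL types. This is a sketch, not BimlStudio's actual type mapping;
# check Fabric's supported types per release.
TYPE_MAP = {
    "Int64": "bigint",
    "Int32": "int",
    "String": "varchar({length})",
    "Decimal": "decimal({precision},{scale})",
    "Date": "date",
    "DateTime": "datetime2(6)",
}

def column_ddl(name, data_type, length=None, precision=None, scale=None, nullable=True):
    """Render a single column definition in Fabric-compatible T-SQL."""
    sql_type = TYPE_MAP[data_type].format(length=length, precision=precision, scale=scale)
    null_clause = "NULL" if nullable else "NOT NULL"
    return f"[{name}] {sql_type} {null_clause}"

def create_table_ddl(schema, table, columns):
    # No PRIMARY KEY or clustered-index clauses: Fabric Warehouse manages
    # distribution automatically, so only the column list is emitted.
    cols = ",\n    ".join(column_ddl(**c) for c in columns)
    return f"CREATE TABLE [{schema}].[{table}] (\n    {cols}\n);"
```

For example, feeding it the DimCustomer columns defined earlier would yield a bare CREATE TABLE with `bigint`, `varchar(20)`, and so on, with logical keys tracked only in metadata.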

Integration with Azure Data Factory

BimlStudio-generated assets can be orchestrated using Azure Data Factory, which integrates natively with Fabric:

Pipeline Pattern

  1. Use BimlStudio to define source and target metadata
  2. Generate T-SQL scripts for transformations
  3. Deploy scripts to Fabric Warehouse
  4. Create ADF pipelines to orchestrate execution
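
Step 4 can be sketched in code. Assuming the T-SQL scripts have already been generated and deployed, the following Python sketch builds an ADF pipeline definition containing a Script activity that runs T-SQL through a linked service; the field names follow ADF's pipeline JSON shape, and the linked service and procedure names are placeholders:

```python
import json

def script_activity(name: str, sql_text: str, linked_service: str) -> dict:
    """Build an ADF Script-activity definition that runs T-SQL against a
    Fabric Warehouse through the named linked service."""
    return {
        "name": name,
        "type": "Script",
        "linkedServiceName": {
            "referenceName": linked_service,
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "scripts": [{"type": "Query", "text": sql_text}],
        },
    }

# Assemble a one-activity pipeline; names here are illustrative.
pipeline = {
    "name": "LoadFactSales",
    "properties": {
        "activities": [
            script_activity("RunTransform", "EXEC dbo.LoadFactSales;", "FabricLinkedService"),
        ],
    },
}
print(json.dumps(pipeline, indent=2))
```

The resulting JSON can be deployed with the ADF REST API or checked into a Git-integrated factory; in a metadata-driven project the activity list would be generated from the same BimlStudio metadata as the T-SQL.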

Connecting ADF to Fabric

In your BimlStudio project, define ADF linked services that connect to Fabric:

<!-- An ADF linked service definition can reference Fabric endpoints -->
<Connections>
  <AdoNetConnection Name="FabricLinkedService"
                    ConnectionString="Data Source=your-workspace.datawarehouse.fabric.microsoft.com;
                                      Initial Catalog=YourWarehouse;
                                      Authentication=ActiveDirectoryManagedIdentity;" />
</Connections>

Deployment Workflow

Manual Deployment

  1. Build your BimlStudio project to generate DDL scripts
  2. Connect to Fabric using Azure Data Studio or SSMS
  3. Execute the generated scripts against your Fabric Warehouse

Automated Deployment

For CI/CD pipelines:

  1. Generate scripts during build
  2. Use Azure CLI or REST API to deploy to Fabric
  3. Integrate with Azure DevOps or GitHub Actions

Example PowerShell for script execution:

# Connect to Fabric Warehouse
$connectionString = "Server=your-workspace.datawarehouse.fabric.microsoft.com;Database=YourWarehouse;Authentication=Active Directory Default"

# Execute generated DDL
Invoke-Sqlcmd -ConnectionString $connectionString -InputFile ".\output\ddl\CreateTables.sql"

Best Practices

Data Modeling

  1. Use star schema patterns - Fabric is optimized for dimensional models
  2. Leverage partitioning - For large fact tables
  3. Consider data types carefully - Some types perform better than others in Fabric

Performance

  1. Minimize data movement - Use Fabric's native copy capabilities where possible
  2. Use appropriate batch sizes - For large data loads
  3. Monitor with Fabric Capacity Metrics - Track resource usage

Security

  1. Use managed identities - For production deployments
  2. Implement row-level security - Where needed
  3. Follow least-privilege principles - For service accounts

Known Limitations

Current limitations when working with Fabric from BimlStudio:

  • Direct SSIS package deployment to Fabric is not supported (use ADF instead)
  • Some advanced T-SQL features may not be available in Fabric
  • Schema changes may require table recreation