Microsoft Fabric Integration
Microsoft Fabric is a unified analytics platform that combines data engineering, data warehousing, data science, and real-time analytics. This guide covers how BimlStudio can be used to develop and deploy data solutions targeting Microsoft Fabric.
Overview
BimlStudio supports Microsoft Fabric through:
- Connection definitions for Fabric data sources
- Table definitions that can be deployed to Fabric Data Warehouse
- SQL script generation compatible with Fabric T-SQL
- Integration with Azure Data Factory pipelines that work with Fabric
Prerequisites
Before integrating BimlStudio with Microsoft Fabric, ensure you have:
- Access to a Microsoft Fabric workspace with appropriate permissions
- Familiarity with Fabric concepts (Lakehouse, Warehouse, OneLake)
- BimlStudio installed with a valid license
Connecting to Fabric Data Warehouse
Connection String Configuration
Create a connection in BimlStudio for your Fabric Data Warehouse:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection Name="FabricWarehouse"
ConnectionString="Provider=MSOLEDBSQL;
Data Source=your-workspace.datawarehouse.fabric.microsoft.com;
Initial Catalog=YourWarehouse;
Authentication=ActiveDirectoryInteractive;" />
</Connections>
</Biml>
Authentication Options
Fabric supports several authentication methods:
| Method | Use Case |
|---|---|
| Azure AD Interactive | Development and testing |
| Azure AD Service Principal | CI/CD pipelines |
| Azure AD Managed Identity | Azure-hosted solutions |
Service Principal Example:
<OleDbConnection Name="FabricWarehouse"
ConnectionString="Provider=MSOLEDBSQL;
Data Source=your-workspace.datawarehouse.fabric.microsoft.com;
Initial Catalog=YourWarehouse;
Authentication=ActiveDirectoryServicePrincipal;
User ID=your-app-id;
Password=your-client-secret;" />
Working with Fabric Lakehouse
Connecting to Lakehouse SQL Endpoint
Fabric Lakehouse provides a SQL analytics endpoint for querying Delta tables:
<Connections>
<OleDbConnection Name="FabricLakehouse"
ConnectionString="Provider=MSOLEDBSQL;
Data Source=your-workspace.sql.fabric.microsoft.com;
Initial Catalog=YourLakehouse;
Authentication=ActiveDirectoryInteractive;" />
</Connections>
Defining Tables for Lakehouse
When targeting a Lakehouse, define table metadata with the same Biml patterns used elsewhere in your project, for example:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Tables>
<Table Name="FactSales" SchemaName="dbo">
<Columns>
<Column Name="SalesKey" DataType="Int64" IsNullable="false" />
<Column Name="CustomerKey" DataType="Int64" IsNullable="false" />
<Column Name="ProductKey" DataType="Int64" IsNullable="false" />
<Column Name="OrderDate" DataType="Date" IsNullable="false" />
<Column Name="SalesAmount" DataType="Decimal" Precision="18" Scale="2" />
</Columns>
</Table>
</Tables>
</Biml>
Data Warehouse Patterns
Creating Fabric-Compatible DDL
Fabric Data Warehouse supports standard T-SQL with some limitations. Use BimlStudio to generate compatible DDL:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Databases>
<Database Name="Analytics" />
</Databases>
<Schemas>
<Schema Name="dim" DatabaseName="Analytics" />
<Schema Name="fact" DatabaseName="Analytics" />
</Schemas>
<Tables>
<Table Name="DimCustomer" SchemaName="dim" DatabaseName="Analytics">
<Columns>
<Column Name="CustomerKey" DataType="Int64" IsNullable="false" />
<Column Name="CustomerID" DataType="String" Length="20" IsNullable="false" />
<Column Name="CustomerName" DataType="String" Length="100" />
<Column Name="Email" DataType="String" Length="256" />
<Column Name="CreatedDate" DataType="DateTime" />
</Columns>
</Table>
</Tables>
</Biml>
Considerations for Fabric
- No clustered indexes - Fabric Warehouse manages data distribution automatically
- No enforced key constraints - primary keys are accepted only as NOT ENFORCED, so treat keys as logical metadata
- Use appropriate data types - Fabric Warehouse supports a restricted set of T-SQL types (for example, varchar and datetime2 rather than nvarchar and datetime); see the sketch below
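As a reference, the generated Fabric-compatible DDL for the DimCustomer table above might look like the following sketch. It is wrapped in PowerShell (matching the deployment examples later in this guide) so it can be run directly; the connection string and exact type mappings are assumptions, since the script BimlStudio emits depends on your project settings.
# Sketch of Fabric-compatible DDL for the DimCustomer table above.
# Note: Fabric Warehouse uses varchar and datetime2 rather than nvarchar and datetime,
# and omits clustered indexes and enforced key constraints.
$ddl = @"
CREATE TABLE dim.DimCustomer
(
    CustomerKey  bigint       NOT NULL,
    CustomerID   varchar(20)  NOT NULL,
    CustomerName varchar(100) NULL,
    Email        varchar(256) NULL,
    CreatedDate  datetime2(6) NULL
);
"@

# Assumes the SqlServer PowerShell module and an Azure AD identity with access to the warehouse.
$connectionString = "Server=your-workspace.datawarehouse.fabric.microsoft.com;Database=Analytics;Authentication=Active Directory Default"
Invoke-Sqlcmd -ConnectionString $connectionString -Query $ddl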
Integration with Azure Data Factory
BimlStudio-generated assets can be orchestrated with Azure Data Factory, which integrates natively with Fabric.
Pipeline Pattern
- Use BimlStudio to define source and target metadata
- Generate T-SQL scripts for transformations
- Deploy scripts to Fabric Warehouse
- Create ADF pipelines to orchestrate execution (see the sketch following this list)
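For the orchestration step, a hypothetical PowerShell sketch using the Az.DataFactory module to trigger and monitor a pipeline run is shown below; the resource group, factory, and pipeline names are placeholders, not part of the BimlStudio project.
# Hypothetical: trigger the ADF pipeline that loads the Fabric Warehouse and wait for it to finish.
# Assumes Connect-AzAccount has already been run.
$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "rg-analytics" -DataFactoryName "adf-analytics" -PipelineName "LoadFabricWarehouse"

do {
    Start-Sleep -Seconds 30
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName "rg-analytics" -DataFactoryName "adf-analytics" -PipelineRunId $runId
} while ($run.Status -in "InProgress", "Queued")

Write-Output "Pipeline finished with status: $($run.Status)"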
Connecting ADF to Fabric
In your BimlStudio project, define ADF linked services that connect to Fabric:
<!-- ADF Linked Service definition can reference Fabric endpoints -->
<Connections>
<AdoNetConnection Name="FabricLinkedService"
ConnectionString="Data Source=your-workspace.datawarehouse.fabric.microsoft.com;
Initial Catalog=YourWarehouse;
Authentication=ActiveDirectoryManagedIdentity;" />
</Connections>
Deployment Workflow
Manual Deployment
- Build your BimlStudio project to generate DDL scripts
- Connect to Fabric using Azure Data Studio or SSMS
- Execute the generated scripts against your Fabric Warehouse
Automated Deployment
For CI/CD pipelines:
- Generate scripts during build
- Use Azure CLI or REST API to deploy to Fabric
- Integrate with Azure DevOps or GitHub Actions
Example PowerShell for script execution:
# Requires the SqlServer PowerShell module (Install-Module SqlServer)
# Connect to the Fabric Warehouse using Azure AD default credentials
$connectionString = "Server=your-workspace.datawarehouse.fabric.microsoft.com;Database=YourWarehouse;Authentication=Active Directory Default"
# Execute the DDL script generated by the BimlStudio build
Invoke-Sqlcmd -ConnectionString $connectionString -InputFile ".\output\ddl\CreateTables.sql"
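In a CI/CD pipeline, interactive sign-in is not available. A hedged variant using a service principal is sketched below; it assumes a recent SqlServer module (which uses Microsoft.Data.SqlClient) and pipeline-supplied FABRIC_CLIENT_ID and FABRIC_CLIENT_SECRET variables, both placeholders.
# Hypothetical CI/CD variant: authenticate as a service principal instead of interactively.
# The FABRIC_* environment variables are placeholders supplied by Azure DevOps or GitHub Actions secrets.
$connectionString = "Server=your-workspace.datawarehouse.fabric.microsoft.com;" +
                    "Database=YourWarehouse;" +
                    "Authentication=Active Directory Service Principal;" +
                    "User ID=$($env:FABRIC_CLIENT_ID);" +
                    "Password=$($env:FABRIC_CLIENT_SECRET);"
Invoke-Sqlcmd -ConnectionString $connectionString -InputFile ".\output\ddl\CreateTables.sql"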
Best Practices
Data Modeling
- Use star schema patterns - Fabric is optimized for dimensional models
- Leverage partitioning - For large fact tables
- Consider data types carefully - Some types perform better than others in Fabric
Performance
- Minimize data movement - Use Fabric's native copy capabilities where possible
- Use appropriate batch sizes - For large data loads
- Monitor with Fabric Capacity Metrics - Track resource usage
Security
- Use managed identities - For production deployments
- Implement row-level security - Where needed
- Follow least-privilege principles - For service accounts
Known Limitations
Current limitations when working with Fabric from BimlStudio:
- Direct SSIS package deployment to Fabric is not supported (use ADF instead)
- Some advanced T-SQL features may not be available in Fabric
- Schema changes may require table recreation
Related Topics
- Cloud Connections - Configuring cloud data source connections
- Creating a Basic Package - Package development fundamentals
- Configuring Project Settings - Project configuration