Implementing Fabric Lakehouse Using Data Factory

BimlFlex provides an intuitive process to implement Microsoft Fabric Lakehouse using Fabric Data Factory (FDF) for cloud-based data warehousing solutions. This integration extends metadata-driven automation to the Fabric ecosystem, enabling customers to design and generate Lakehouse solutions directly within the platform.

Architecture Overview

BimlFlex uses Data Factory copy commands to ingest and land (stage) source data in OneLake, Azure Blob Storage, or Data Lake Storage Gen2. BimlFlex provides logic to map the resulting files so that the generated code can load the data into Fabric Lakehouse tables.
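
The load pattern can be illustrated with a short PySpark sketch. This is not the literal code BimlFlex generates; the path and table names (Files/landing/source_a/customer/, stg_customer) are hypothetical placeholders for a file landed by a copy activity.

```python
# Illustrative only: load a file landed by a Data Factory copy activity
# into a Lakehouse staging Delta table. Paths and table names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created in Fabric notebooks

# Read the file the copy activity landed in the lakehouse Files area.
landed = spark.read.parquet("Files/landing/source_a/customer/")

# Stamp each row with a load timestamp before staging the batch.
staged = landed.withColumn("LoadDateTime", F.current_timestamp())

# Overwrite the transient staging table with the current batch.
staged.write.mode("overwrite").format("delta").saveAsTable("stg_customer")
```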

Fabric Lakehouse Features

Microsoft Fabric Lakehouse support in BimlFlex provides capabilities similar to the Databricks and Snowflake integrations:

Metadata Import Support

Schema and object metadata can be imported directly from Fabric Lakehouse to drive automation patterns. This reduces manual mapping and accelerates the creation of Data Vault and Data Mart solutions based on existing Lakehouse structures.

To import metadata from a source:

  1. Navigate to the source connection in the Connections editor
  2. Click the Preview button to view available objects
  3. Select the tables you want to import (you can import thousands of tables at once)
  4. BimlFlex imports the metadata and makes it available for modeling

Data Vault Templates

New templates generate hubs, links, satellites, and PIT/Bridge tables within Fabric Lakehouse. These accelerate Data Vault implementation on Fabric and ensure consistent, metadata-driven automation across the platform.

Supported Data Vault constructs include:

  • Hubs: Core business entities
  • Links: Relationships between hubs
  • Satellites: Descriptive attributes with history
  • PIT Tables: Point-in-time snapshot structures that simplify querying across multiple satellites
  • Bridge Tables: Denormalized query acceleration structures
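
As an illustration of the structures these templates produce, the following sketch shows a plausible hub table definition. The actual DDL is generated from your metadata; hub_customer and its columns are assumed names.

```python
# Hypothetical shape of a generated Data Vault hub on Fabric Lakehouse;
# the real DDL is driven by your metadata.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS hub_customer (
        CustomerHashKey STRING,     -- hash of the business key
        CustomerCode    STRING,     -- source business key
        LoadDateTime    TIMESTAMP,  -- first time the key was seen
        RecordSource    STRING      -- originating source system
    ) USING DELTA
""")
```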

Data Mart Templates

Specialized templates for Fabric Lakehouse automate the creation of dimensional models and reporting structures. This provides optimized Data Mart solutions that align with Fabric's analytics and performance capabilities.

Delete Detection Templates

Templates for identifying and processing deletions in source systems are available within Fabric Lakehouse. These simplify data management, ensure data integrity, and reduce manual intervention in handling deletes.
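
Conceptually, delete detection for a full-extract source compares the keys delivered in the current batch against the keys held in persistent staging. The sketch below shows this pattern in PySpark; the tables (stg_customer, psa_customer, stg_customer_deletes) and the IsDeleted flag are assumptions, not the template's actual objects.

```python
# Sketch of full-extract delete detection: active keys in persistent
# staging that are absent from the current batch are flagged as deletes.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

current_keys = spark.table("stg_customer").select("CustomerCode")
active_keys = (spark.table("psa_customer")
               .where(F.col("IsDeleted") == F.lit(False))
               .select("CustomerCode"))

# Keys the source no longer delivers are treated as deleted.
deletes = (active_keys.join(current_keys, "CustomerCode", "left_anti")
           .withColumn("IsDeleted", F.lit(True))
           .withColumn("DeletedDateTime", F.current_timestamp()))

deletes.write.mode("append").format("delta").saveAsTable("stg_customer_deletes")
```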

Lakehouse Medallion Architecture Support

BimlFlex supports the medallion architecture pattern (Bronze/Silver/Gold) for Fabric Lakehouse implementations:

| Layer  | BimlFlex Implementation      | Fabric Components                                   |
| ------ | ---------------------------- | --------------------------------------------------- |
| Bronze | Staging + Persistent Staging | Landing files in OneLake, Delta tables for raw data |
| Silver | Data Vault or Normal Form    | Cleansed and integrated Delta tables                |
| Gold   | Data Mart / Dimensional      | Optimized analytics tables and views                |

Bronze Layer (Raw Data)

The Bronze layer captures raw data as received from source systems. In BimlFlex, this maps to:

  • Landing Area: Initial data ingestion via Data Factory copy commands
  • Staging Area: Transient storage for current batch processing
  • Persistent Staging Area: Historical retention of all received data
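
The persistent staging pattern is append-only: each batch is written with load metadata so the full history of received data is retained. A minimal sketch, assuming hypothetical stg_customer and psa_customer tables:

```python
# Append-only persistent staging: every batch is kept with load metadata
# so the full history of received data is retained. Names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

batch = (spark.table("stg_customer")
         .withColumn("LoadDateTime", F.current_timestamp())
         .withColumn("RecordSource", F.lit("source_a")))

# Historical rows are never updated or deleted; each batch is appended.
batch.write.mode("append").format("delta").saveAsTable("psa_customer")
```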

Silver Layer (Curated Data)

BimlFlex supports two approaches for the Silver layer:

  • Data Vault (recommended): Provides flexibility, auditability, and scalability through Hub, Link, and Satellite patterns
  • Normal Form: Traditional relational modeling for simpler use cases
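
For context, a typical Data Vault satellite load inserts only rows whose descriptive attributes have changed, detected by comparing hash diffs. The following PySpark sketch illustrates the idea with assumed tables and columns (stg_customer, sat_customer, Name, City); it is not BimlFlex's generated code.

```python
# Sketch of a satellite delta load: insert only rows whose descriptive
# attributes changed, detected by comparing hash diffs. Names assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

staged = (spark.table("stg_customer")
          .withColumn("CustomerHashKey", F.sha2(F.col("CustomerCode"), 256))
          .withColumn("HashDiff", F.sha2(F.concat_ws("||", "Name", "City"), 256))
          .withColumn("LoadDateTime", F.current_timestamp()))

# Most recent satellite row per hub key.
w = Window.partitionBy("CustomerHashKey").orderBy(F.col("LoadDateTime").desc())
latest = (spark.table("sat_customer")
          .withColumn("rn", F.row_number().over(w))
          .where("rn = 1")
          .select("CustomerHashKey", "HashDiff"))

# Keep only new keys or changed attribute rows, then append.
changed = staged.join(latest, ["CustomerHashKey", "HashDiff"], "left_anti")
(changed.select("CustomerHashKey", "HashDiff", "LoadDateTime", "Name", "City")
 .write.mode("append").format("delta").saveAsTable("sat_customer"))
```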

Gold Layer (Business-Ready Data)

The Gold layer delivers business-ready data through:

  • Dimensional Models: Star schema with Fact and Dimension tables
  • Data Marts: Purpose-built analytics structures
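
A Gold-layer object often flattens Silver Data Vault structures into a dimension. A minimal sketch, assuming the hypothetical hub_customer and sat_customer tables from the earlier examples:

```python
# Sketch of a Gold-layer dimension view that flattens a hub and the
# current row of its satellite. Object names are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE OR REPLACE VIEW dim_customer AS
    SELECT CustomerKey, CustomerCode, Name, City
    FROM (
        SELECT h.CustomerHashKey AS CustomerKey,
               h.CustomerCode,
               s.Name,
               s.City,
               ROW_NUMBER() OVER (PARTITION BY s.CustomerHashKey
                                  ORDER BY s.LoadDateTime DESC) AS rn
        FROM hub_customer h
        JOIN sat_customer s
          ON s.CustomerHashKey = h.CustomerHashKey
    ) ranked
    WHERE rn = 1
""")
```
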
Tip: For detailed guidance on implementing each layer, see the Delivering Lakehouse documentation.

Prerequisites

Before implementing Fabric Lakehouse with Data Factory, ensure you have completed the following:

  1. Fabric Configuration: Complete the setup outlined in the Microsoft Fabric Configuration Overview
  2. Storage: Configure OneLake, blob storage, or Data Lake Storage Gen2 for landing, staging, archive, and error containers
  3. Connections: Create and configure the Fabric Lakehouse connection in BimlFlex
Note: Detailed prerequisites and configuration steps are provided in the Microsoft Fabric Configuration Overview section.

Configuring Fabric Lakehouse in BimlFlex

Loading Sample Metadata

BimlFlex provides two pre-configured sample metadata sets for Fabric Lakehouse:

| Sample            | Description                                  | Use Case                                                       |
| ----------------- | -------------------------------------------- | -------------------------------------------------------------- |
| Fabric Data Vault | Pre-configured for Data Vault implementation | Building a silver layer with Hub, Link, and Satellite patterns |
| Fabric Datamart   | Pre-configured for dimensional modeling      | Building bronze-to-gold layer data marts                       |

To load a sample:

  1. Navigate to the BimlFlex Dashboard
  2. Select from the Load Sample Metadata dropdown
  3. Choose either Fabric Data Vault or Fabric Datamart

The sample metadata includes pre-configured projects, connections, and objects that demonstrate best practices for Fabric Lakehouse implementations.

[SCREENSHOT NEEDED: fabric-lakehouse-data-vault-fdf-sample.png - Sample metadata selection showing Fabric Data Vault and Fabric Datamart options]

Note: For more information, see the related lakehouse and data modeling implementation documentation.

Connection Configuration

Configure your Fabric Lakehouse connections from within the BimlFlex Connections editor:

Source System Connection:

  • Enable Cloud option for the source system
  • Configure Staging / Landing Environment for OneLake, Blob Storage, or Data Lake Storage Gen2 with Data Factory connections

[SCREENSHOT NEEDED: fabric-lakehouse-connection-source.png - Connection settings for source system with cloud enabled]

Fabric Lakehouse Connection:

  • Set System Type to Fabric Lakehouse
  • Configure the Connection String appropriately for Fabric Lakehouse
  • Configure Integration Template to Data Factory Source -> Target
  • Set External Location to the OneLake path (e.g., abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>/Files/)
  • Set External Reference to the Fabric connection ID (every connection in Fabric has an internal ID that BimlFlex uses to reference it)

[SCREENSHOT NEEDED: fabric-lakehouse-connection-settings.png - Connection settings showing Fabric Lakehouse configuration]

Note: The External Reference is required for all Fabric Lakehouse connections. This ID can be found in the Fabric portal and enables BimlFlex to properly reference the connection when generating and deploying Data Factory pipelines.

Batch Configuration

Prior to building your solution, configure batches from the BimlFlex Batches editor to:

  • Assign batches to different compute resources
  • Configure scaling parameters
  • Set execution priorities

[SCREENSHOT NEEDED: fabric-lakehouse-batch-configuration.png - Batch configuration screen]

Generated Output

BimlFlex generates all necessary Fabric Lakehouse artifacts automatically—you do not need to write notebooks, stored procedures, or pipeline code manually:

| Artifact Type          | Description                                                                                        |
| ---------------------- | -------------------------------------------------------------------------------------------------- |
| Lakehouse Tables       | DDL scripts for creating all Lakehouse table structures                                             |
| Notebooks              | Spark notebooks for data processing (all code to load data is generated automatically)              |
| Stored Procedures      | SQL procedures for transformation logic where applicable                                            |
| Data Factory Pipelines | Complete pipeline orchestration including copy activities, notebook execution, and error handling   |

Pipeline Features

Generated pipelines include sophisticated data movement logic:

  • High watermark lookups for incremental loading
  • Copy activities with proper connection settings
  • Notebook execution for staging layer processing
  • Automatic file handling (archive/error movement)
  • Error handling and retry logic
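
The high watermark pattern can be sketched as follows: look up the last stored watermark, extract only newer rows, and record the new watermark for the next run. The ctl_watermark control table and its columns are assumptions for illustration; in the generated solution this logic lives in the pipeline's lookup and copy activities.

```python
# Conceptual high-watermark pattern: read the last watermark, extract
# only newer rows, then record the new watermark. ctl_watermark and its
# columns are assumed; the real pipelines use lookup and copy activities.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Last successfully loaded watermark for this object (stored as text).
last_mark = (spark.table("ctl_watermark")
             .where(F.col("ObjectName") == "customer")
             .agg(F.max("WatermarkValue"))
             .first()[0])

# Pull only rows changed since the last run into staging.
incremental = spark.table("src_customer").where(F.col("ModifiedDate") > F.lit(last_mark))
incremental.write.mode("append").format("delta").saveAsTable("stg_customer")

# Record the new high watermark for the next run.
new_mark = incremental.agg(F.max("ModifiedDate")).first()[0]
if new_mark is not None:
    spark.sql(f"INSERT INTO ctl_watermark VALUES ('customer', '{new_mark}')")
```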

[SCREENSHOT NEEDED: fabric-lakehouse-generated-code.png - Generated Fabric Lakehouse code output in BimlStudio]

Deployed Solution

Once deployed to Data Factory, the solution provides:

  • Visual pipeline representation
  • Monitoring and logging capabilities
  • Error handling with automatic file archiving

[SCREENSHOT NEEDED: fabric-lakehouse-fdf-pipeline.png - Data Factory pipeline visualization for Fabric Lakehouse solution]

Monitoring and Management

After deployment, you can:

  • Scale compute resources up or down
  • View copy command completions and errors
  • Suspend or resume solution execution
  • Monitor execution status and performance

[SCREENSHOT NEEDED: fabric-lakehouse-solution-monitoring.png - Data Factory monitoring showing completions and errors]

Note: Files that encounter errors are automatically moved to an error folder, while successfully processed files are archived, so subsequent runs do not reprocess them.

Video Resources

Refer to the video in the Microsoft Fabric Configuration Overview for a walkthrough of configuring BimlFlex for Microsoft Fabric, including Lakehouse implementations.

Fabric as a Source System

BimlFlex supports using Fabric Lakehouse as both a source and a target. This enables scenarios such as:

  • Processing data from one Lakehouse to another Lakehouse
  • Moving data between Bronze, Silver, and Gold layers within Fabric
  • Using naming patterns and schemas for layer separation

To configure Fabric Lakehouse as a source:

  1. Create a connection with Integration Stage set to Source System
  2. Set System Type to Fabric Lakehouse
  3. Configure the appropriate connection string and external reference
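
A lakehouse-to-lakehouse movement can be sketched in a notebook as below, assuming both lakehouses are attached to the notebook; the names Bronze_LH and Silver_LH are hypothetical.

```python
# Sketch of lakehouse-to-lakehouse movement, assuming both lakehouses are
# attached to the notebook. Bronze_LH and Silver_LH are assumed names.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read from the Bronze lakehouse and write to the Silver lakehouse.
bronze = spark.table("Bronze_LH.psa_customer")
bronze.write.mode("overwrite").format("delta").saveAsTable("Silver_LH.stg_customer")
```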

Benefits of Using BimlFlex with Fabric Lakehouse

BimlFlex provides significant advantages when building Fabric Lakehouse solutions:

  • No Code Required: Your team only needs to understand data modeling—BimlFlex generates all notebooks, stored procedures, and pipelines automatically
  • Focus on Design: Concentrate on source-to-target mappings and transformations, not implementation details
  • Automatic Updates: As Microsoft Fabric evolves, BimlFlex templates are updated to ensure optimal implementations
  • Data Vault Accelerator: Full access to the Data Vault accelerator for modeling hubs, links, and satellites
  • Transformation Support: Apply transformations directly in BimlFlex, including macros for reusable patterns
  • Data Lineage: Complete data lineage visualization for any object in your solution
  • Schema Documentation: Automatic schema diagrams and documentation generation