Choosing an Integration Template

An Integration Template determines how BimlFlex generates your data pipelines. It is set at the Project level, and every data pipeline in that project uses the selected template. Choosing the right template is the first architectural decision in any BimlFlex implementation.

Decision Matrix

| Factor | SSIS | ADF | Databricks (DBR) | Fabric (FDF) |
|---|---|---|---|---|
| Runtime | SQL Server Integration Services | Azure Data Factory Execute Pipelines | ADF orchestration + Databricks Spark notebooks | Microsoft Fabric Data Pipelines |
| Best for | On-premises SQL Server, existing SSIS investments | Cloud-first Azure SQL / Synapse workloads | Large-scale data lake / lakehouse with Spark processing | Microsoft Fabric-native environments |
| Compute | SSIS server (on-prem or Azure-SSIS IR) | ADF managed compute + SQL stored procedures | Databricks clusters (job or serverless) | Fabric capacity units |
| Pushdown processing | Optional: SSIS Data Flows when pushdown is disabled, stored procedures when enabled | Always: Copy Activity + stored procedures | Always: Spark notebooks handle all transformation | Always: Fabric pipelines + stored procedures or Spark |
| Source support | Broadest: any SSIS-compatible source via ODBC/OLEDB | Azure-native sources + on-prem via Self-Hosted IR | Sources accessible via ADF Copy Activity (ADF orchestrates ingestion) | Fabric-connected sources + on-prem via gateway |
| Target platforms | SQL Server, Azure SQL, Synapse | Azure SQL, Synapse, Snowflake (via landing area) | Databricks / Delta Lake / Unity Catalog | Fabric Lakehouse, Fabric Warehouse |
| CI/CD | SSIS Catalog deployment via PowerShell or SSDT | ARM template deployment via PowerShell or Azure Portal | ADF ARM templates + Databricks notebook deployment | Fabric deployment pipelines |
| Licensing | SQL Server license required | ADF consumption-based pricing | Databricks + ADF pricing | Fabric capacity licensing |
| Team skills needed | SSIS, T-SQL, Visual Studio | ADF, T-SQL, Azure Portal | Spark/Python/SQL, ADF, Databricks workspace | Fabric, T-SQL or Spark, Power BI ecosystem |
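The selection logic behind the matrix can be sketched as a small lookup function. This is only an illustration of the decision flow: the target names and requirement flags below are hypothetical labels, not BimlFlex settings or APIs.

```python
# Illustrative sketch of the decision matrix above. The target strings
# and keyword flags are made up for this example; they are not BimlFlex
# metadata or configuration values.

def recommend_template(target: str, needs_spark: bool = False,
                       on_prem_only: bool = False) -> str:
    """Return an integration template code: SSIS, ADF, DBR, or FDF."""
    if on_prem_only or target == "sql-server":
        return "SSIS"   # on-premises SQL Server warehouse, no cloud migration
    if target in ("fabric-lakehouse", "fabric-warehouse"):
        return "FDF"    # Fabric-native pipelines
    if needs_spark or target == "delta-lake":
        return "DBR"    # Spark-based processing on Databricks
    if target in ("azure-sql", "synapse", "snowflake"):
        return "ADF"    # managed orchestration + stored-procedure pushdown
    raise ValueError(f"no template recommendation for target {target!r}")

print(recommend_template("synapse"))     # ADF
print(recommend_template("delta-lake"))  # DBR
```

Note the ordering: platform constraints (on-premises, Fabric) are checked before workload characteristics, mirroring how the matrix prioritizes runtime environment over transformation style.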

When to Choose Each Template

SSIS — SQL Server Integration Services

Choose SSIS when:

  • Your data warehouse runs on on-premises SQL Server and there is no near-term cloud migration plan
  • Your team has existing SSIS expertise and a library of SSIS packages
  • You need the broadest source connectivity (mainframe, legacy ODBC drivers, flat files with complex parsing)
  • You require fine-grained data flow control (row-level transformations in the SSIS pipeline)

Avoid SSIS when:

  • Your targets are cloud-native (Snowflake, Databricks, Fabric) — SSIS cannot directly target these without an intermediate landing zone
  • You want serverless, pay-per-use compute — SSIS requires provisioned servers

Related docs: SSIS Technology Overview, SSIS Deployment Guide

ADF — Azure Data Factory

Choose ADF when:

  • Your target is Azure SQL Database, Azure Synapse Analytics, or Snowflake (via landing area)
  • You want managed, serverless orchestration without maintaining SSIS servers
  • Your transformations are primarily SQL-based (stored procedures) rather than row-level data flows
  • You are moving from on-prem to Azure and want a cloud-native orchestration layer

Avoid ADF when:

  • You need Spark-based transformations (use Databricks template instead)
  • You are building on Microsoft Fabric (use the Fabric template for native integration)

Related docs: ADF Technology Overview, ADF Deployment, Landing Area Configuration

Databricks (DBR)

Choose Databricks when:

  • You are building a lakehouse architecture on Delta Lake / Unity Catalog
  • Your data volumes require Spark-based distributed processing
  • You want to leverage Databricks serverless compute for cost optimization
  • Your team has Python/Spark skills alongside SQL

Avoid Databricks when:

  • Your target is a traditional SQL Server data warehouse — the Databricks template targets Delta Lake, not SQL Server
  • You need simplicity — Databricks adds operational complexity (cluster management, notebook versioning)

Related docs: Databricks Technology Overview, Databricks Configuration, Implementing Databricks in ADF

Fabric (FDF) — Data Factory for Fabric

Choose Fabric when:

  • You are building on Microsoft Fabric as your unified analytics platform
  • You want to leverage Fabric Lakehouse (Spark + Delta) or Fabric Warehouse (T-SQL)
  • Your organization has Fabric capacity licensing already in place
  • You want tight integration with the Power BI / OneLake ecosystem

Avoid Fabric when:

  • Fabric is not available in your region or tenant
  • You need mature CI/CD pipelines — Fabric deployment tooling is still evolving

Related docs: Fabric Technology Overview, Fabric Configuration, Fabric Lakehouse, Fabric Warehouse

Utility Templates: S2FIL and S2ZIP

These are special-purpose SSIS templates for file extraction scenarios:

  • S2FIL — Extract data to flat files using SSIS
  • S2ZIP — Extract data to compressed (zip) files using SSIS

Use these alongside a primary integration template when you need to produce file-based exports from your data warehouse.
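To make the two utility patterns concrete, the sketch below shows the kind of output they produce: a flat-file extract (S2FIL) and the same extract inside a zip archive (S2ZIP). BimlFlex generates SSIS packages for this; the Python here is purely conceptual, and the file names are illustrative.

```python
import csv
import io
import zipfile

# Stand-in for rows extracted from the data warehouse.
rows = [("id", "name"), (1, "alpha"), (2, "beta")]

# S2FIL pattern: write the extract as a flat file (CSV here).
buf = io.StringIO()
csv.writer(buf).writerows(rows)

# S2ZIP pattern: place the same flat file inside a compressed archive.
with zipfile.ZipFile("export.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("extract.csv", buf.getvalue())
```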

Combining Templates

A single BimlFlex solution can use multiple integration templates across different Projects. For example:

  • Project A (SSIS): Extract from on-premises Oracle into a SQL Server staging area
  • Project B (ADF): Load from staging into Azure Synapse using ADF Copy Activity + stored procedures
  • Project C (Databricks): Transform staged data into a Delta Lake lakehouse

Each project independently selects its template. The template choice applies to all objects within that project.
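The project-level scoping described above can be modeled as a simple invariant: one template per project, applied to everything in it. The class and field names below are illustrative, not BimlFlex metadata.

```python
# Illustrative model of project-level template selection; not BimlFlex code.
from dataclasses import dataclass

VALID_TEMPLATES = {"SSIS", "ADF", "DBR", "FDF"}

@dataclass
class Project:
    name: str
    integration_template: str  # applies to every pipeline in the project

    def __post_init__(self):
        if self.integration_template not in VALID_TEMPLATES:
            raise ValueError(
                f"unknown template {self.integration_template!r}")

# The mixed-template solution from the example above.
solution = [
    Project("Project A", "SSIS"),  # Oracle -> SQL Server staging
    Project("Project B", "ADF"),   # staging -> Azure Synapse
    Project("Project C", "DBR"),   # staged data -> Delta Lake
]
print(sorted({p.integration_template for p in solution}))  # ['ADF', 'DBR', 'SSIS']
```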

Setting the Integration Template

The integration template is configured in the Project Editor:

  1. Open the BimlFlex App and navigate to Projects
  2. Select or create the target project
  3. In the Integration Template dropdown, select the desired template (SSIS, ADF, DBR, or FDF)
  4. Save the project

The integration template works alongside the Connection properties (Connection Type, System Type, Integration Stage) and Batch configuration to determine the full pipeline generation behavior. See Connection Editor and Project Editor for details.