Choosing an Integration Template
An Integration Template determines how BimlFlex generates your data pipelines. It is set at the Project level, and every data pipeline in that project uses the selected template. Choosing the right template is the first architectural decision in any BimlFlex implementation.
Decision Matrix
| Factor | SSIS | ADF | Databricks (DBR) | Fabric (FDF) |
|---|---|---|---|---|
| Runtime | SQL Server Integration Services | Azure Data Factory Execute Pipelines | ADF orchestration + Databricks Spark notebooks | Microsoft Fabric Data Pipelines |
| Best for | On-premises SQL Server, existing SSIS investments | Cloud-first Azure SQL / Synapse workloads | Large-scale data lake / lakehouse with Spark processing | Microsoft Fabric-native environments |
| Compute | SSIS server (on-prem or Azure-SSIS IR) | ADF managed compute + SQL stored procedures | Databricks clusters (job or serverless) | Fabric capacity units |
| Pushdown processing | Optional — SSIS Data Flows when pushdown is disabled, stored procedures when enabled | Always — Copy Activity + stored procedures | Always — Spark notebooks handle all transformation | Always — Fabric pipelines + stored procedures or Spark |
| Source support | Broadest — any SSIS-compatible source via ODBC/OLEDB | Azure-native sources + on-prem via Self-Hosted IR | Sources accessible via ADF Copy Activity (ADF orchestrates ingestion) | Fabric-connected sources + on-prem via gateway |
| Target platforms | SQL Server, Azure SQL, Synapse | Azure SQL, Synapse, Snowflake (via landing area) | Databricks / Delta Lake / Unity Catalog | Fabric Lakehouse, Fabric Warehouse |
| CI/CD | SSIS Catalog deployment via PowerShell or SSDT | ARM template deployment via PowerShell or Azure Portal | ADF ARM templates + Databricks notebook deployment | Fabric deployment pipelines |
| Licensing | SQL Server license required | ADF consumption-based pricing | Databricks + ADF pricing | Fabric capacity licensing |
| Team skills needed | SSIS, T-SQL, Visual Studio | ADF, T-SQL, Azure Portal | Spark/Python/SQL, ADF, Databricks workspace | Fabric, T-SQL or Spark, Power BI ecosystem |
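One way to reason about the matrix is to start from your target platform and see which templates can reach it. The sketch below is purely illustrative (the `TEMPLATE_TARGETS` mapping and `candidate_templates` helper are not part of BimlFlex); it simply restates the "Target platforms" row of the table in Python:

```python
# Hypothetical helper, not part of BimlFlex: maps each integration
# template to the target platforms listed in the decision matrix.
TEMPLATE_TARGETS = {
    "SSIS": {"SQL Server", "Azure SQL", "Synapse"},
    "ADF": {"Azure SQL", "Synapse", "Snowflake"},       # Snowflake via landing area
    "DBR": {"Databricks", "Delta Lake", "Unity Catalog"},
    "FDF": {"Fabric Lakehouse", "Fabric Warehouse"},
}

def candidate_templates(target: str) -> list[str]:
    """Return the templates whose target platforms include `target`."""
    return [t for t, targets in TEMPLATE_TARGETS.items() if target in targets]

print(candidate_templates("Synapse"))  # → ['SSIS', 'ADF']
```

When more than one template can reach your target (as with Synapse here), the remaining rows of the matrix (compute, licensing, team skills) break the tie.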
When to Choose Each Template
SSIS — SQL Server Integration Services
Choose SSIS when:
- Your data warehouse runs on on-premises SQL Server and there is no near-term cloud migration plan
- Your team has existing SSIS expertise and a library of SSIS packages
- You need the broadest source connectivity (mainframe, legacy ODBC drivers, flat files with complex parsing)
- You require fine-grained data flow control (row-level transformations in the SSIS pipeline)
Avoid SSIS when:
- Your targets are cloud-native (Snowflake, Databricks, Fabric) — SSIS cannot directly target these without an intermediate landing zone
- You want serverless, pay-per-use compute — SSIS requires provisioned servers
Related docs: SSIS Technology Overview, SSIS Deployment Guide
ADF — Azure Data Factory
Choose ADF when:
- Your target is Azure SQL Database, Azure Synapse Analytics, or Snowflake (via landing area)
- You want managed, serverless orchestration without maintaining SSIS servers
- Your transformations are primarily SQL-based (stored procedures) rather than row-level data flows
- You are moving from on-prem to Azure and want a cloud-native orchestration layer
Avoid ADF when:
- You need Spark-based transformations (use Databricks template instead)
- You are building on Microsoft Fabric (use the Fabric template for native integration)
Related docs: ADF Technology Overview, ADF Deployment, Landing Area Configuration
Databricks (DBR)
Choose Databricks when:
- You are building a lakehouse architecture on Delta Lake / Unity Catalog
- Your data volumes require Spark-based distributed processing
- You want to leverage Databricks serverless compute for cost optimization
- Your team has Python/Spark skills alongside SQL
Avoid Databricks when:
- Your target is a traditional SQL Server data warehouse — the Databricks template targets Delta Lake, not SQL Server
- You need simplicity — Databricks adds operational complexity (cluster management, notebook versioning)
Related docs: Databricks Technology Overview, Databricks Configuration, Implementing Databricks in ADF
Fabric (FDF) — Data Factory for Fabric
Choose Fabric when:
- You are building on Microsoft Fabric as your unified analytics platform
- You want to leverage Fabric Lakehouse (Spark + Delta) or Fabric Warehouse (T-SQL)
- Your organization has Fabric capacity licensing already in place
- You want tight integration with the Power BI / OneLake ecosystem
Avoid Fabric when:
- Fabric is not available in your region or tenant
- You need mature CI/CD pipelines — Fabric deployment tooling is still evolving
Related docs: Fabric Technology Overview, Fabric Configuration, Fabric Lakehouse, Fabric Warehouse
Utility Templates: S2FIL and S2ZIP
These are special-purpose SSIS templates for file extraction scenarios:
- S2FIL — Extract data to flat files using SSIS
- S2ZIP — Extract data to compressed (zip) files using SSIS
Use these alongside a primary integration template when you need to produce file-based exports from your data warehouse.
Combining Templates
A single BimlFlex solution can use multiple integration templates across different Projects. For example:
- Project A (SSIS): Extract from on-premises Oracle into a SQL Server staging area
- Project B (ADF): Load from staging into Azure Synapse using ADF Copy Activity + stored procedures
- Project C (Databricks): Transform staged data into a Delta Lake lakehouse
Each project independently selects its template. The template choice applies to all objects within that project.
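The multi-template arrangement above can be pictured as data. The `Project` class and field names here are hypothetical, not BimlFlex's actual object model; the point is only that the template is a per-project property, so one solution can mix runtimes:

```python
# Illustrative sketch (class and field names are hypothetical).
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    template: str  # one of "SSIS", "ADF", "DBR", "FDF"

solution = [
    Project("Project A", "SSIS"),  # on-prem Oracle -> SQL Server staging
    Project("Project B", "ADF"),   # staging -> Azure Synapse
    Project("Project C", "DBR"),   # staged data -> Delta Lake lakehouse
]

# Each project's template applies to every pipeline within that project.
templates_in_use = {p.template for p in solution}
print(sorted(templates_in_use))  # → ['ADF', 'DBR', 'SSIS']
```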
Setting the Integration Template
The integration template is configured in the Project Editor:
- Open the BimlFlex App and navigate to Projects
- Select or create the target project
- In the Integration Template dropdown, select the desired template (SSIS, ADF, DBR, or FDF)
- Save the project
The integration template works alongside the Connection properties (Connection Type, System Type, Integration Stage) and Batch configuration to determine the full pipeline generation behavior. See Connection Editor and Project Editor for details.
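As a rough mental model, the inputs to pipeline generation can be pictured as one nested configuration. All field names in this sketch are hypothetical placeholders, not BimlFlex's actual metadata schema; it only illustrates that the project-level template combines with connection and batch settings:

```python
# Hypothetical structure, not BimlFlex's metadata schema: generation
# behaviour is driven by the project-level template together with the
# connection-level properties and the batch configuration.
pipeline_config = {
    "project": {"name": "Project B", "integration_template": "ADF"},
    "connection": {
        "connection_type": "...",     # set in the Connection Editor
        "system_type": "...",
        "integration_stage": "...",
    },
    "batch": {"name": "..."},
}

print(pipeline_config["project"]["integration_template"])
```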