Data Vault Extension Points
The Data Vault category has the following available Extension Points defined.
Data Vault Bridge Custom SQL
Configure custom SQL for the Data Vault Bridge
Parameters
| Name | Type | Description |
|---|---|---|
| table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the SQL will be added |
Outputs
| Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| DvBridgeLagSql | String | Overrides the lag calculation SQL used by the Bridge load. |
| DvBridgeDateWhereSql | String | Appends a condition to the date-range WHERE clause. |
| DvBridgeDateOverrideWhereSql | String | Replaces the date-range WHERE clause entirely. |
| DvBridgeWhereSql | String | Appends a condition to the main Bridge WHERE clause. |
| DvBridgeOverrideWhereSql | String | Replaces the main Bridge WHERE clause entirely. |
| DvBridgeHubWhereSql | String | Appends a condition to the Hub join WHERE clause. |
| DvBridgeHubOverrideWhereSql | String | Replaces the Hub join WHERE clause entirely. |
| DvBridgeColumnSelect | String | Adds columns to the SELECT list of the Bridge query. |
| DvBridgeColumnInsert | String | Adds columns to the INSERT column list of the Bridge query. |
| DvBridgeColumnElements | String | Biml `<Column>` element definitions for any added columns. |
| DvBridgeOverrideSql | String | Completely replaces the generated Bridge SQL. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgeSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<!-- Use DvBridgeSql combined with the below samples to add additional logic to the Bridge load procedure. -->
<#*
<# CustomOutput.ObjectInherit = false; #>
<# CustomOutput.DvBridgeColumnSelect = @"";#>
<# CustomOutput.DvBridgeColumnInsert = @""; #>
<# CustomOutput.DvBridgeDateWhereSql = @""; #>
<# CustomOutput.DvBridgeDateOverrideWhereSql = @""; #>
<# CustomOutput.DvBridgeWhereSql = @""; #>
<# CustomOutput.DvBridgeOverrideWhereSql = @""; #>
<# CustomOutput.DvBridgeHubWhereSql = @""; #>
<# CustomOutput.DvBridgeHubOverrideWhereSql = @""; #>
<# CustomOutput.DvBridgeLagSql = @""; #>
<# CustomOutput.DvBridgeColumnElements = @""; #>
<# CustomOutput.DvBridgeOverrideSql = @"Add SQL here to completely override the BRIDGE Code."; #>
*#>
What It Does
Injects custom SQL fragments into the generated Bridge (BRG) table load stored procedure. Bridge tables pre-join Hub and Link structures to flatten the Data Vault model for downstream query performance. This extension point lets you control column selection, WHERE conditions, hub join logic, and date filtering -- or completely override the entire Bridge SQL with your own implementation.
Where It Fires
Registered in the BimlFlex extension point manifest for Bridge table (BRG) SQL customization. The extension point is consumed during stored procedure generation for any object with a Bridge table type. Multiple custom outputs are available: DvBridgeColumnSelect, DvBridgeColumnInsert, DvBridgeWhereSql, DvBridgeHubWhereSql, DvBridgeOverrideSql, and others. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Add a tenant or security filter to Bridge queries by setting `DvBridgeWhereSql` to append a `WHERE` clause condition -- for example, filtering `dv.BRG_SalesOrder` to only include active orders from the `AWLT_SRC` source system.
- Include additional columns from Satellites in the Bridge output by setting `DvBridgeColumnSelect` and `DvBridgeColumnInsert` -- use this instead of manually creating a view on top of the Bridge table.
- Override the date-range filtering logic with `DvBridgeDateWhereSql` when the default temporal join behavior does not match your business rules.
- Completely replace the generated Bridge SQL with `DvBridgeOverrideSql` when the standard template cannot express your join pattern -- use this instead of creating a post-deploy script, so the logic stays inside the BimlFlex lifecycle.
- Customize Hub join conditions with `DvBridgeHubWhereSql` when you need to exclude ghost records or filter by record source during the Bridge build for `LOAD_BFX_DV` batch processing.
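The ghost-record exclusion via the Hub join condition could be sketched as below. This is a minimal illustration, not generated output: the record source value `'GHOST'` is a placeholder -- the actual ghost record marker depends on your BimlFlex configuration.

```biml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgeSql" target="BRG_SalesOrder" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
<# CustomOutput.DvBridgeHubWhereSql = @"AND h.[FlexRowRecordSource] <> 'GHOST'"; #>
```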
Prerequisites
- Target object must be configured as a Bridge (BRG) type in the BimlFlex metadata.
- The Bridge object must reference at least one Hub and one or more Links or Satellites.
- The `BFX_DV` connection (or equivalent Data Vault target connection) must be configured.
- Familiarity with the generated Bridge stored procedure structure is recommended -- build once without extension points and review the output.
Implementation Steps
- In BimlFlex, open the Extension Points editor and create a new extension point file.
- Set the `extensionpoint` directive to `DvBridgeSql` and the `target` attribute to your Bridge object.
- Uncomment and populate only the `CustomOutput` properties you need.
- Build the project and review the generated stored procedure to confirm your SQL fragments appear in the correct locations.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgeSql" target="BRG_SalesOrder" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
<# CustomOutput.DvBridgeWhereSql = @"AND h.[FlexRowRecordSource] = 'AWLT'"; #>
<# CustomOutput.DvBridgeColumnSelect = @",sat.[OrderStatus]"; #>
<# CustomOutput.DvBridgeColumnInsert = @",[OrderStatus]"; #>
Example
Before (default generated Bridge procedure fragment):
SELECT h.[Customer_BK], l.[SalesOrder_BK]
FROM dv.HUB_Customer h
INNER JOIN dv.LNK_SalesOrder l ON h.Customer_SK = l.Customer_SK
WHERE 1 = 1
After (with DvBridgeWhereSql and DvBridgeColumnSelect applied):
SELECT h.[Customer_BK], l.[SalesOrder_BK], sat.[OrderStatus]
FROM dv.HUB_Customer h
INNER JOIN dv.LNK_SalesOrder l ON h.Customer_SK = l.Customer_SK
WHERE 1 = 1
AND h.[FlexRowRecordSource] = 'AWLT'
Common Mistakes
- Setting `DvBridgeOverrideSql` while also setting individual fragment outputs -- Symptom: the individual fragments are silently ignored because the override replaces the entire procedure body. Fix: use either `DvBridgeOverrideSql` or the granular outputs, not both.
- Forgetting to include `AND` at the start of `DvBridgeWhereSql` -- Symptom: SQL syntax error during procedure execution. Fix: always prefix WHERE-clause fragments with `AND` since they are appended after an existing `WHERE 1 = 1` condition.
- Adding columns in `DvBridgeColumnSelect` without matching `DvBridgeColumnInsert` -- Symptom: column count mismatch error during INSERT...SELECT. Fix: always set both properties together with matching column lists.
- Setting `ObjectInherit = true` without realizing it applies to all Bridge objects in the batch -- Symptom: unintended filters applied to every Bridge in the `LOAD_BFX_DV` batch. Fix: use `ObjectInherit = false` and target individual Bridge objects with the `target` attribute.
Related Extension Points
- DvBridgePreProcessSql / DvBridgePostProcessSql -- use these to add setup or cleanup SQL around the Bridge procedure rather than modifying the core query logic.
- DvPitSql -- the PIT equivalent of this extension point; use when building Point-in-Time tables instead of Bridge tables.
- DvPreProcessSql / DvPostProcessSql -- for wrapping the main Data Vault load (Hub, Link, Satellite) procedures; these do not affect Bridge objects.
Data Vault Bridge Post Process SQL
Configure SQL to be injected after the Bridge Query
Parameters
| Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
Outputs
| Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgePostProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment that executes immediately after the Bridge table load query completes within the generated stored procedure. This is useful for post-load validation, logging, index maintenance, or cleanup operations that should run every time the Bridge table is refreshed. The injected SQL becomes part of the same transaction scope as the Bridge load itself.
Where It Fires
Consumed as a pre/post process SQL wrapper for Bridge table generation. The SQL fragment is appended after the main Bridge INSERT...SELECT statement in the generated stored procedure. Applied only to objects with a Bridge (BRG) type. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Log the row count loaded into `dv.BRG_SalesOrder` to an audit table after each Bridge refresh -- use this instead of adding a separate post-deploy script that runs outside the stored procedure.
- Rebuild or update statistics on the Bridge table after the load completes to ensure downstream queries have optimal execution plans.
- Execute a data quality check after the Bridge load and raise an error if integrity constraints are violated -- for example, verifying no orphaned Link keys exist in `dv.BRG_SalesOrder`.
- Send a notification or update a control table to signal that the Bridge refresh for the `LOAD_BFX_DV` batch is complete.
Prerequisites
- Target object must be a Bridge (BRG) type in BimlFlex metadata.
- The `BFX_DV` connection must be configured.
- The SQL fragment must be valid for the target platform (e.g., T-SQL for MSSQL, Snowflake SQL for SFLDW).
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvBridgePostProcessSql` and the `target` attribute to your Bridge object.
- Write the SQL fragment that should execute after the Bridge load.
- Build and review the generated stored procedure to confirm placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgePostProcessSql" target="BRG_SalesOrder" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
SELECT 'BRG_SalesOrder', @@ROWCOUNT, GETDATE();
Example
Before (end of generated Bridge procedure):
-- ... Bridge INSERT...SELECT completes ...
END
After (with DvBridgePostProcessSql applied):
-- ... Bridge INSERT...SELECT completes ...
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
SELECT 'BRG_SalesOrder', @@ROWCOUNT, GETDATE();
END
Common Mistakes
- Referencing `@@ROWCOUNT` after additional statements -- Symptom: audit log always shows 0 or incorrect counts. Fix: capture `@@ROWCOUNT` in the very first statement of your post-process SQL since any intervening statement resets it.
- Using platform-specific SQL without checking the target connection type -- Symptom: build errors when deploying to Snowflake with T-SQL syntax. Fix: use BimlScript conditionals to emit platform-appropriate SQL, or create separate extension points per target platform.
- Raising errors that break the transaction without a TRY/CATCH -- Symptom: the entire Bridge load rolls back with an unhandled exception. Fix: wrap validation logic in a TRY/CATCH block if you want to log errors without aborting the load.
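The `@@ROWCOUNT` capture and TRY/CATCH patterns described above can be sketched together as one T-SQL post-process fragment. This is an illustrative sketch: the `dv.AuditLog` table and its columns are assumed, not part of the generated procedure.

```sql
-- Capture @@ROWCOUNT immediately: any intervening statement resets it.
DECLARE @BridgeRows INT = @@ROWCOUNT;

INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
SELECT 'BRG_SalesOrder', @BridgeRows, GETDATE();

-- Wrap validation in TRY/CATCH so a failed check is logged
-- rather than rolling back the entire Bridge load.
BEGIN TRY
    IF @BridgeRows = 0
        RAISERROR('BRG_SalesOrder load inserted no rows.', 16, 1);
END TRY
BEGIN CATCH
    INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
    SELECT 'BRG_SalesOrder: ' + ERROR_MESSAGE(), @BridgeRows, GETDATE();
END CATCH;
```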
Related Extension Points
- DvBridgePreProcessSql -- the counterpart that runs before the Bridge query; use together for setup/teardown patterns.
- DvBridgeSql -- for modifying the Bridge query logic itself rather than adding steps before or after it.
- DvPostProcessSql -- similar post-process hook but for Hub, Link, and Satellite loads rather than Bridge objects.
Data Vault Bridge Pre Process SQL
Configure SQL to be injected before the Bridge Query
Parameters
| Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
Outputs
| Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgePreProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment that executes immediately before the Bridge table load query begins within the generated stored procedure. This is commonly used for creating temporary tables, setting session variables, truncating the target Bridge table for a full reload, or performing pre-load validation checks. The injected SQL runs in the same transaction scope as the Bridge load.
Where It Fires
Consumed as a pre/post process SQL wrapper for Bridge table generation. The SQL fragment is inserted before the main Bridge INSERT...SELECT statement in the generated stored procedure. Applied only to objects with a Bridge (BRG) type. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Truncate the Bridge table before a full reload -- use this instead of configuring a separate TRUNCATE step in your orchestration when you always want `dv.BRG_SalesOrder` fully refreshed.
- Create a temporary table or table variable to stage intermediate results that the Bridge query will reference.
- Set session-level options such as `SET NOCOUNT ON` or `SET ANSI_WARNINGS OFF` that affect the Bridge query behavior.
- Log the start time of the Bridge load to an audit table for `LOAD_BFX_DV` batch monitoring.
Prerequisites
- Target object must be a Bridge (BRG) type in BimlFlex metadata.
- The `BFX_DV` connection must be configured.
- The SQL fragment must be valid for the target platform.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvBridgePreProcessSql` and the `target` attribute to your Bridge object.
- Write the SQL fragment that should execute before the Bridge load.
- Build and review the generated stored procedure to confirm placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvBridgePreProcessSql" target="BRG_SalesOrder" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
TRUNCATE TABLE dv.BRG_SalesOrder;
INSERT INTO dv.AuditLog (ObjectName, EventType, EventDate)
SELECT 'BRG_SalesOrder', 'BridgeLoadStart', GETDATE();
Example
Before (start of generated Bridge procedure):
CREATE PROCEDURE [dv].[usp_BRG_SalesOrder]
AS
BEGIN
-- Bridge INSERT...SELECT begins ...
After (with DvBridgePreProcessSql applied):
CREATE PROCEDURE [dv].[usp_BRG_SalesOrder]
AS
BEGIN
TRUNCATE TABLE dv.BRG_SalesOrder;
INSERT INTO dv.AuditLog (ObjectName, EventType, EventDate)
SELECT 'BRG_SalesOrder', 'BridgeLoadStart', GETDATE();
-- Bridge INSERT...SELECT begins ...
Common Mistakes
- Truncating a Bridge table that is referenced by active queries -- Symptom: blocking or deadlocks during concurrent reads. Fix: use a staging pattern with a swap (rename) strategy, or schedule Bridge refreshes during low-usage windows.
- Creating a temp table with a `#` prefix in Snowflake -- Symptom: syntax error. Fix: use Snowflake temporary table syntax (`CREATE TEMPORARY TABLE`) or use BimlScript conditionals to handle platform differences.
- Placing DML that depends on the Bridge load results in PreProcess instead of PostProcess -- Symptom: the DML runs on stale data. Fix: move result-dependent logic to `DvBridgePostProcessSql`.
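The swap (rename) strategy suggested above for avoiding blocking during a full reload might be sketched as follows on SQL Server. The staging table `dv.BRG_SalesOrder_stage` is an assumption -- it must exist with a schema identical to the live Bridge table.

```sql
-- Load into a staging copy so readers keep hitting the live table.
TRUNCATE TABLE dv.BRG_SalesOrder_stage;
-- ... point the Bridge INSERT...SELECT at dv.BRG_SalesOrder_stage ...

-- Swap the tables inside a short transaction; sp_rename holds only a
-- brief schema lock, so readers see either the old or the new data.
BEGIN TRANSACTION;
EXEC sp_rename 'dv.BRG_SalesOrder', 'BRG_SalesOrder_old';
EXEC sp_rename 'dv.BRG_SalesOrder_stage', 'BRG_SalesOrder';
EXEC sp_rename 'dv.BRG_SalesOrder_old', 'BRG_SalesOrder_stage';
COMMIT TRANSACTION;
```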
Related Extension Points
- DvBridgePostProcessSql -- the counterpart that runs after the Bridge query; use together for setup/teardown patterns.
- DvBridgeSql -- for modifying the Bridge query logic itself rather than adding steps before or after it.
- DvPreProcessSql -- similar pre-process hook but for Hub, Link, and Satellite loads rather than Bridge objects.
Data Vault Pit Custom SQL
Configure custom SQL for the Data Vault PIT
Parameters
| Name | Type | Description |
|---|---|---|
| table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the SQL will be added |
Outputs
| Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| DvPitAddParameter | String | Adds a parameter to the generated PIT stored procedure. |
| DvPitLagSql | String | Overrides the lag calculation SQL (for example, SET @Lag = -7). |
| DvPitDateWhereSql | String | Appends a condition to the date-range WHERE clause. |
| DvPitDateOverrideWhereSql | String | Replaces the date-range WHERE clause entirely. |
| DvPitHubWhereSql | String | Appends a condition to the Hub join WHERE clause. |
| DvPitHubOverrideWhereSql | String | Replaces the Hub join WHERE clause entirely. |
| DvPitSatWhereSql | String | Appends a condition to the Satellite join WHERE clause. |
| DvPitSatOverrideWhereSql | String | Replaces the Satellite join WHERE clause entirely. |
| DvPitDeleteWhereSql | String | Appends a condition to the PIT delete WHERE clause. |
| DvPitDeleteOverrideWhereSql | String | Replaces the PIT delete WHERE clause entirely. |
| DvPitSourceAddSelect | String | Adds columns to the source SELECT list. |
| DvPitTargetAddInsert | String | Adds columns to the target INSERT column list. |
| DvPitTargetAddSelect | String | Adds columns to the target SELECT list. |
| DvPitColumnElements | String | Biml `<Column>` element definitions for any added columns. |
| DvPitOverrideSql | String | Completely replaces the generated PIT SQL. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPitSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<!-- Use DvPitSql combined with the below samples to add additional logic to the PIT load procedure. -->
<#*
<# CustomOutput.ObjectInherit = false; #>
<# CustomOutput.DvPitAddParameter = @" ,@TenantCode VARCHAR(10) = NULL"; #>
<# CustomOutput.DvPitLagSql = @"SET @Lag = -7"; #>
<# CustomOutput.DvPitDateWhereSql = @"AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitDateOverrideWhereSql = @"WHERE [FlexRowEffectiveFromDate] > '1900-01-01' AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitDeleteWhereSql = @"AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitDeleteOverrideWhereSql = @"WHERE [FlexRowEffectiveFromDate] > '1900-01-01' AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitHubWhereSql = @"AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitHubOverrideWhereSql = @"WHERE [FlexRowEffectiveFromDate] > '1900-01-01' AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitSatWhereSql = @"AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitSatOverrideWhereSql = @"WHERE [FlexRowEffectiveFromDate] > '1900-01-01' AND [FlexRowRecordSource] = 'AW'"; #>
<# CustomOutput.DvPitSourceAddSelect = @",l1.[TenantCode] "; #>
<# CustomOutput.DvPitTargetAddInsert = @",[TenantCode] "; #>
<# CustomOutput.DvPitTargetAddSelect = @",[TenantCode] "; #>
<# CustomOutput.DvPitColumnElements = @"<Column Name=""TenantCode"" DataType=""AnsiString"" Length=""3"" IsNullable=""true"" />"; #>
<# CustomOutput.DvPitOverrideSql = @"Add SQL here to completely override the PIT Code."; #>
*#>
What It Does
Injects custom SQL fragments into the generated Point-in-Time (PIT) table load stored procedure. PIT tables create snapshot records that join a Hub with all its Satellites at specific points in time, enabling efficient temporal queries without complex multi-way outer joins. This extension point lets you add parameters, modify lag calculations, customize WHERE conditions for Hub and Satellite joins, add or override columns, and control delete logic -- or completely override the entire PIT SQL.
Where It Fires
Consumed during PIT table SQL generation. Multiple custom outputs are available: DvPitLagSql, DvPitAddParameter, DvPitHubWhereSql, DvPitSatWhereSql, DvPitSourceAddSelect, DvPitTargetAddInsert, DvPitTargetAddSelect, DvPitDeleteWhereSql, DvPitColumnElements, and DvPitOverrideSql. Applied only to objects with a PIT table type. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Add a tenant filter parameter to the PIT procedure using `DvPitAddParameter` and `DvPitHubWhereSql` -- for example, filtering `dv.PIT_Customer` by a `@TenantCode` parameter so that multi-tenant deployments only process relevant records.
- Change the default lag period with `DvPitLagSql` to control how far back the PIT table looks for satellite changes -- use this instead of modifying the BimlFlex setting when different PIT tables need different lag windows.
- Filter Satellite rows with `DvPitSatWhereSql` to exclude specific record sources -- for example, only including rows from the `AWLT_SRC` source in `dv.SAT_Customer_Details_awlt`.
- Add custom columns to the PIT output using `DvPitSourceAddSelect`, `DvPitTargetAddInsert`, `DvPitTargetAddSelect`, and `DvPitColumnElements` together -- use this when downstream consumers need denormalized attributes directly in the PIT table.
- Completely replace the PIT generation SQL with `DvPitOverrideSql` when the standard PIT pattern does not match your requirements -- use this instead of creating a separate post-deploy stored procedure.
Prerequisites
- Target object must be configured as a PIT type in the BimlFlex metadata.
- The PIT object must reference a Hub and one or more Satellites.
- The `BFX_DV` connection (or equivalent Data Vault target connection) must be configured.
- When adding columns via `DvPitColumnElements`, the Biml XML element must conform to the `<Column>` schema with valid `DataType`, `Length`, and `IsNullable` attributes.
- Familiarity with the generated PIT stored procedure structure is recommended -- build once without extension points and review the output.
Implementation Steps
- In BimlFlex, open the Extension Points editor and create a new extension point file.
- Set the `extensionpoint` directive to `DvPitSql` and the `target` attribute to your PIT object.
- Uncomment and populate only the `CustomOutput` properties you need.
- If adding columns, set all four related properties: `DvPitSourceAddSelect`, `DvPitTargetAddInsert`, `DvPitTargetAddSelect`, and `DvPitColumnElements`.
- Build the project and review the generated stored procedure to confirm your SQL fragments appear correctly.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPitSql" target="PIT_Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
<# CustomOutput.DvPitAddParameter = @" ,@TenantCode VARCHAR(10) = NULL"; #>
<# CustomOutput.DvPitLagSql = @"SET @Lag = -7"; #>
<# CustomOutput.DvPitHubWhereSql = @"AND h.[FlexRowRecordSource] = 'AWLT'"; #>
<# CustomOutput.DvPitSatWhereSql = @"AND sat.[FlexRowRecordSource] = 'AWLT'"; #>
Example
Before (default generated PIT procedure fragment):
CREATE PROCEDURE [dv].[usp_PIT_Customer]
AS
BEGIN
DECLARE @Lag INT = -1
-- ... Hub and Satellite joins ...
WHERE 1 = 1
After (with DvPitAddParameter, DvPitLagSql, and DvPitHubWhereSql applied):
CREATE PROCEDURE [dv].[usp_PIT_Customer]
@TenantCode VARCHAR(10) = NULL
AS
BEGIN
SET @Lag = -7
-- ... Hub and Satellite joins ...
WHERE 1 = 1
AND h.[FlexRowRecordSource] = 'AWLT'
Common Mistakes
- Setting `DvPitOverrideSql` while also setting individual fragment outputs -- Symptom: the individual fragments are silently ignored because the override replaces the entire procedure body. Fix: use either `DvPitOverrideSql` or the granular outputs, not both.
- Adding columns with `DvPitSourceAddSelect` but forgetting `DvPitTargetAddInsert`, `DvPitTargetAddSelect`, or `DvPitColumnElements` -- Symptom: column count mismatch or missing column definition errors. Fix: always set all four column-related outputs together.
- Malformed XML in `DvPitColumnElements` -- Symptom: BimlScript compilation error during build. Fix: ensure the column element is valid Biml XML, e.g., `<Column Name="TenantCode" DataType="AnsiString" Length="10" IsNullable="true" />`.
- Forgetting the `AND` prefix in `DvPitHubWhereSql` or `DvPitSatWhereSql` -- Symptom: SQL syntax error at runtime. Fix: always start these fragments with `AND` since they append to an existing `WHERE 1 = 1` clause.
- Using `DvPitDeleteWhereSql` without understanding the PIT delete pattern -- Symptom: stale rows remain in the PIT table or valid rows are deleted. Fix: review the generated delete logic first and ensure your filter aligns with the existing delete WHERE conditions.
Related Extension Points
- DvPitPreProcessSql / DvPitPostProcessSql -- use these to add setup or cleanup SQL around the PIT procedure rather than modifying the core query logic.
- DvBridgeSql -- the Bridge equivalent of this extension point; use when building Bridge tables instead of PIT tables.
- DvPreProcessSql / DvPostProcessSql -- for wrapping the main Data Vault load (Hub, Link, Satellite) procedures; these do not affect PIT objects.
Data Vault Pit Post Process SQL
Configure SQL to be injected after the Point In Time Query
Parameters
| Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
Outputs
| Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPitPostProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment that executes immediately after the PIT table load query completes within the generated stored procedure. This is useful for post-load audit logging, index rebuilds, statistics updates, or data quality validation on the freshly loaded PIT table. The injected SQL runs in the same transaction scope as the PIT load.
Where It Fires
Consumed as a pre/post process SQL wrapper for PIT table generation. The SQL fragment is appended after the main PIT INSERT...SELECT and DELETE statements in the generated stored procedure. Applied only to objects with a PIT table type. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Log the row count and load completion timestamp for `dv.PIT_Customer` to an audit table -- use this instead of a separate orchestration step so the logging stays within the procedure transaction.
- Rebuild clustered columnstore indexes on the PIT table after a full refresh to optimize query performance for downstream reporting.
- Run a validation query that checks the PIT table for duplicate surrogate key entries and logs any anomalies.
- Update a watermark or control table to record the latest successful PIT refresh for the `LOAD_BFX_DV` batch.
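The duplicate-key validation mentioned above could be sketched like this. The surrogate key column `PIT_Customer_SK` and the `dv.AuditLog` table are illustrative assumptions -- substitute the key columns your PIT table actually carries.

```sql
-- Log any surrogate keys that appear more than once in the fresh PIT load.
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
SELECT 'PIT_Customer duplicate key: ' + CAST(p.[PIT_Customer_SK] AS VARCHAR(50)),
       COUNT(*), GETDATE()
FROM dv.PIT_Customer p
GROUP BY p.[PIT_Customer_SK]
HAVING COUNT(*) > 1;
```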
Prerequisites
- Target object must be a PIT type in BimlFlex metadata.
- The `BFX_DV` connection must be configured.
- The SQL fragment must be valid for the target platform (e.g., T-SQL for MSSQL, Snowflake SQL for SFLDW).
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPitPostProcessSql` and the `target` attribute to your PIT object.
- Write the SQL fragment that should execute after the PIT load.
- Build and review the generated stored procedure to confirm placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPitPostProcessSql" target="PIT_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
SELECT 'PIT_Customer', @@ROWCOUNT, GETDATE();
UPDATE STATISTICS dv.PIT_Customer;
Example
Before (end of generated PIT procedure):
-- ... PIT INSERT and DELETE complete ...
END
After (with DvPitPostProcessSql applied):
-- ... PIT INSERT and DELETE complete ...
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, LoadDate)
SELECT 'PIT_Customer', @@ROWCOUNT, GETDATE();
UPDATE STATISTICS dv.PIT_Customer;
END
Common Mistakes
- Referencing `@@ROWCOUNT` after other statements -- Symptom: audit log always shows the row count of the preceding audit INSERT, not the PIT load. Fix: capture `@@ROWCOUNT` into a variable as the very first statement.
- Running ALTER INDEX REBUILD on a large PIT table without a sufficient maintenance window -- Symptom: long-running blocking operation. Fix: use `ALTER INDEX ... REORGANIZE` for online maintenance or schedule full rebuilds during off-hours.
- Placing logic that depends on pre-load state in PostProcess instead of PreProcess -- Symptom: the check sees post-load data and produces incorrect results. Fix: move pre-load checks to `DvPitPreProcessSql`.
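The online-friendly maintenance suggested above might be sketched as below; the columnstore index name `CCI_PIT_Customer` is illustrative, not generated.

```sql
-- REORGANIZE is always an online operation, unlike a full REBUILD.
ALTER INDEX [CCI_PIT_Customer] ON dv.PIT_Customer REORGANIZE;

-- Refresh statistics so downstream plans reflect the new row distribution.
UPDATE STATISTICS dv.PIT_Customer;
```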
Related Extension Points
- DvPitPreProcessSql -- the counterpart that runs before the PIT query; use together for setup/teardown patterns.
- DvPitSql -- for modifying the PIT query logic itself (column selection, WHERE conditions, parameters) rather than adding steps before or after.
- DvBridgePostProcessSql -- the equivalent post-process hook for Bridge tables.
Data Vault Pit Pre Process SQL
Configure SQL to be injected before the Point In Time Query
Parameters
| Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
Outputs
| Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPitPreProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment that executes immediately before the PIT table load query begins within the generated stored procedure. This is commonly used for creating temporary staging tables, setting session variables, truncating the PIT table for a full rebuild, or capturing pre-load state for comparison after the load completes. The injected SQL runs in the same transaction scope as the PIT load.
Where It Fires
Consumed as a pre/post process SQL wrapper for PIT table generation. The SQL fragment is inserted before the main PIT INSERT...SELECT statement in the generated stored procedure. Applied only to objects with a PIT table type. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Truncate `dv.PIT_Customer` before a full rebuild -- use this instead of a separate orchestration step when the PIT table should always be fully refreshed.
- Create a temporary table to capture the current PIT row count for comparison in `DvPitPostProcessSql`.
- Set session-level options that affect query behavior, such as `SET LOCK_TIMEOUT` for large PIT loads against `BFX_DV`.
- Log the PIT load start time to an audit table for monitoring of the `LOAD_BFX_DV` batch.
Prerequisites
- Target object must be a PIT type in BimlFlex metadata.
- The `BFX_DV` connection must be configured.
- The SQL fragment must be valid for the target platform.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPitPreProcessSql` and the `target` attribute to your PIT object.
- Write the SQL fragment that should execute before the PIT load.
- Build and review the generated stored procedure to confirm placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPitPreProcessSql" target="PIT_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
TRUNCATE TABLE dv.PIT_Customer;
INSERT INTO dv.AuditLog (ObjectName, EventType, EventDate)
SELECT 'PIT_Customer', 'PitLoadStart', GETDATE();
Example
Before (start of generated PIT procedure):
CREATE PROCEDURE [dv].[usp_PIT_Customer]
AS
BEGIN
DECLARE @Lag INT = -1
-- ... PIT INSERT...SELECT begins ...
After (with DvPitPreProcessSql applied):
CREATE PROCEDURE [dv].[usp_PIT_Customer]
AS
BEGIN
TRUNCATE TABLE dv.PIT_Customer;
INSERT INTO dv.AuditLog (ObjectName, EventType, EventDate)
SELECT 'PIT_Customer', 'PitLoadStart', GETDATE();
DECLARE @Lag INT = -1
-- ... PIT INSERT...SELECT begins ...
Common Mistakes
- Truncating a PIT table referenced by active dashboard queries -- Symptom: users see empty results during the refresh window. Fix: use a swap-table pattern (load into a staging copy, then rename) or schedule PIT refreshes during off-hours.
- Creating temp tables with names that collide with BimlFlex-generated temp tables -- Symptom: unexpected data or errors. Fix: use a distinctive naming prefix like `#ep_` for extension point temp tables.
- Adding logic that should run after the load -- Symptom: the SQL executes on stale PIT data. Fix: move post-load logic to `DvPitPostProcessSql`.
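The swap-table pattern mentioned above can be sketched as follows. This is a minimal illustration, not generated BimlFlex code: it assumes you maintain a structure-identical copy named `dv.PIT_Customer_New` that the PIT load targets, and that the rename happens in a `DvPitPostProcessSql` fragment.

```sql
-- Hypothetical swap-table pattern: the load writes into dv.PIT_Customer_New,
-- then the freshly loaded copy is swapped into place as a metadata-only operation.
BEGIN TRANSACTION;
EXEC sp_rename 'dv.PIT_Customer', 'PIT_Customer_Old';
EXEC sp_rename 'dv.PIT_Customer_New', 'PIT_Customer';
EXEC sp_rename 'dv.PIT_Customer_Old', 'PIT_Customer_New';
COMMIT TRANSACTION;
```

Because the rename is near-instant, dashboard queries see either the old or the new PIT contents, never an empty table.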
Related Extension Points
- DvPitPostProcessSql -- the counterpart that runs after the PIT query; use together for setup/teardown patterns.
- DvPitSql -- for modifying the PIT query logic itself rather than adding steps before or after it.
- DvBridgePreProcessSql -- the equivalent pre-process hook for Bridge tables.
Data Vault Post Process SQL
Configure SQL to be injected at the end of the Data Vault stored procedure
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPostProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment at the very end of the generated Data Vault stored procedure, after all Hub, Link, and Satellite load logic has completed. This is the outermost post-processing hook for the core Data Vault load and is ideal for final audit logging, notification triggers, post-load cleanup, or orchestration control-flow updates. The injected SQL executes within the stored procedure scope.
Where It Fires
Consumed in Code/DataVaultProvider.cs (lines ~169-173), as well as Code/DatabricksProvider.cs and Code/FabricProvider.cs. Injected into the generated stored procedure after the main Data Vault load logic. Applied to Hub, Link, Satellite, and Reference objects. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Log the completion of a Satellite load (e.g., `dv.SAT_Customer_Details_awlt`) to an audit table with row counts and timestamps -- use this instead of adding orchestration-level logging when you want the audit record within the same transaction.
- Execute post-load data quality checks against `dv.HUB_Customer` to verify no duplicate business keys were inserted during the load from `BFX_STG`.
- Update a control or watermark table to signal that the Data Vault load step in the `LOAD_BFX_DV` batch has completed successfully.
- Send a Service Broker message or insert into a queue table to trigger downstream processing of the loaded Data Vault objects.
- Rebuild indexes on Satellite tables after large delta loads -- use this instead of `DvPitPostProcessSql` or `DvBridgePostProcessSql` when the target is a core Data Vault object, not a PIT or Bridge.
Prerequisites
- Target object must be a Hub, Link, Satellite, or Reference type in BimlFlex metadata.
- The `BFX_DV` connection must be configured with the appropriate Data Vault target database.
- The SQL fragment must be valid for the target platform.
- Does not apply to PIT or Bridge objects -- use `DvPitPostProcessSql` or `DvBridgePostProcessSql` for those.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPostProcessSql` and the `target` attribute to your Data Vault object.
- Write the SQL that should execute after the Data Vault load completes.
- Build and review the generated stored procedure to confirm the SQL appears at the end of the procedure body.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPostProcessSql" target="SAT_Customer_Details_awlt" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, EventType, LoadDate)
SELECT '<#=targetTable.Name#>', @@ROWCOUNT, 'DvLoadComplete', GETDATE();
Example
Before (end of generated DV stored procedure):
-- ... Hub/Link/Satellite INSERT completes ...
END
After (with DvPostProcessSql applied):
-- ... Hub/Link/Satellite INSERT completes ...
INSERT INTO dv.AuditLog (ObjectName, RowsAffected, EventType, LoadDate)
SELECT 'SAT_Customer_Details_awlt', @@ROWCOUNT, 'DvLoadComplete', GETDATE();
END
Common Mistakes
- Using this extension point for PIT or Bridge objects -- Symptom: the extension point is ignored; no SQL appears in the generated procedure. Fix: use `DvPitPostProcessSql` or `DvBridgePostProcessSql` for PIT and Bridge objects respectively.
- Referencing `@@ROWCOUNT` after other statements -- Symptom: incorrect row count captured. Fix: capture `@@ROWCOUNT` into a variable as the very first statement of your post-process SQL.
- Setting `ObjectInherit = true` without realizing it applies to all objects in the batch -- Symptom: the same post-process SQL runs for every Hub, Link, and Satellite in `LOAD_BFX_DV`. Fix: use `ObjectInherit = false` and target specific objects, or use BimlScript conditionals to filter by object type.
- Confusing DvPostProcessSql with DvPostTargetSql -- Symptom: SQL appears in the wrong location. Fix: `DvPostProcessSql` runs at the very end of the procedure; `DvPostTargetSql` runs after the target INSERT statement but before any subsequent logic.
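Because `@@ROWCOUNT` in T-SQL reflects only the immediately preceding statement, a safe post-process fragment captures it before anything else. A minimal sketch (the `dv.AuditLog` table and `@ep_` prefix are illustrative):

```sql
-- Capture the row count first, before any other statement resets it
DECLARE @ep_RowCount INT = @@ROWCOUNT;

INSERT INTO dv.AuditLog (ObjectName, RowsAffected, EventType, LoadDate)
SELECT 'HUB_Customer', @ep_RowCount, 'DvLoadComplete', GETDATE();
```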
Related Extension Points
- DvPreProcessSql -- the counterpart that runs at the start of the procedure; use together for setup/teardown patterns.
- DvPostTargetSql -- runs after the target INSERT query specifically, not at the end of the entire procedure.
- DvPostSourceSql -- runs after the source (staging) query within the procedure.
- DvPitPostProcessSql / DvBridgePostProcessSql -- the equivalent hooks for PIT and Bridge objects.
Data Vault Post Source SQL
Configure SQL to be injected after the staging query in the Data Vault stored procedure
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPostSourceSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment immediately after the source (staging) query within the generated Data Vault stored procedure. In an ELT pattern, the source query reads from the staging area (e.g., BFX_STG) and prepares the data for insertion into the Data Vault target. This extension point lets you modify, filter, or augment the staged result set after it has been queried but before it is inserted into the Hub, Link, or Satellite.
Where It Fires
Consumed in Code/DataVaultProvider.cs (lines ~184-194), Code/DatabricksProvider.cs, and Code/FabricProvider.cs. The SQL fragment is placed after the source query SQL block in the generated stored procedure. Applied to Hub, Link, Satellite, and Reference objects. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Add a post-source data quality check that validates the staged data before it reaches the Hub or Satellite insert -- for example, verifying that no NULL business keys exist in the staging query results for `dv.HUB_Customer`.
- Insert the staged result set into an additional audit or debug table for troubleshooting load issues -- use this instead of adding a separate logging step in orchestration.
- Apply additional filtering or transformation to a temp table created by the source query before the target insert consumes it.
- Log the count of records retrieved from `BFX_STG` for the current load cycle of `SalesLT.Customer` data into the Data Vault.
Prerequisites
- Target object must be a Hub, Link, Satellite, or Reference type in BimlFlex metadata.
- The `BFX_DV` connection and `BFX_STG` (or equivalent staging connection) must be configured.
- Familiarity with the generated stored procedure structure is recommended -- the source query typically populates a CTE or temp table that the target INSERT references.
- Does not apply to PIT or Bridge objects.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPostSourceSql` and the `target` attribute to your Data Vault object.
- Write the SQL fragment that should execute after the source query.
- Build and review the generated stored procedure to verify placement between the source query and the target insert.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPostSourceSql" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
-- Validate no NULL business keys before Hub insert
IF EXISTS (SELECT 1 FROM #SrcData WHERE [Customer_BK] IS NULL)
BEGIN
RAISERROR('NULL business key detected in staging data for HUB_Customer', 16, 1);
RETURN;
END
Example
Before (generated procedure with source query and target insert):
-- Source query populates temp table
SELECT [Customer_BK], [FlexRowHashType1], ...
INTO #SrcData
FROM [BFX_STG].[awlt].[SalesLT_Customer]
-- Target INSERT into Hub
INSERT INTO [dv].[HUB_Customer] ...
SELECT ... FROM #SrcData
After (with DvPostSourceSql applied):
-- Source query populates temp table
SELECT [Customer_BK], [FlexRowHashType1], ...
INTO #SrcData
FROM [BFX_STG].[awlt].[SalesLT_Customer]
-- Validate no NULL business keys before Hub insert
IF EXISTS (SELECT 1 FROM #SrcData WHERE [Customer_BK] IS NULL)
BEGIN
RAISERROR('NULL business key detected in staging data for HUB_Customer', 16, 1);
RETURN;
END
-- Target INSERT into Hub
INSERT INTO [dv].[HUB_Customer] ...
SELECT ... FROM #SrcData
Common Mistakes
- Modifying the source temp table in a way that breaks the downstream INSERT -- Symptom: column mismatch or missing data in the target. Fix: add columns rather than removing them, or use a separate temp table for your logic.
- Confusing DvPostSourceSql with DvPostProcessSql -- Symptom: SQL appears at the wrong location. Fix: `DvPostSourceSql` runs after the source query; `DvPostProcessSql` runs at the very end of the procedure.
- Using RAISERROR without RETURN -- Symptom: the procedure continues to the target INSERT despite the error. Fix: follow RAISERROR with RETURN to halt execution.
- Applying this to PIT or Bridge objects -- Symptom: the extension point is ignored. Fix: PIT and Bridge objects have their own extension points (`DvPitSql`, `DvBridgeSql`).
Related Extension Points
- DvPreSourceSql -- runs before the source query; use together to bracket the source query with pre/post logic.
- DvPostTargetSql -- runs after the target INSERT; use when you need to act on the data after it has been written to the Hub/Link/Satellite.
- DvPostProcessSql -- runs at the very end of the procedure; use for final cleanup or logging.
Data Vault Post Target SQL
Configure SQL to be injected after the target insert query in the Data Vault stored procedure
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPostTargetSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment immediately after the target INSERT query in the generated Data Vault stored procedure. This runs after data has been written to the Hub, Link, or Satellite table but before the procedure-level post-process logic. It is useful for capturing insert metrics, performing immediate post-insert validation, or triggering dependent operations that need to act on freshly inserted data.
Where It Fires
Consumed in Code/DataVaultProvider.cs (lines ~184-194), Code/DatabricksProvider.cs, and Code/FabricProvider.cs. The SQL fragment is placed after the target INSERT...SELECT statement in the generated stored procedure. Applied to Hub, Link, Satellite, and Reference objects. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Capture `@@ROWCOUNT` immediately after the Satellite insert to log how many new or changed records were written to `dv.SAT_Customer_Details_awlt` -- use this instead of `DvPostProcessSql` when you need the exact insert row count before any other statements reset it.
- Run a referential integrity check to verify that every row inserted into `dv.LNK_SalesOrder` has matching Hub keys in `dv.HUB_Customer`.
- Trigger a dependent operation such as updating a materialized view or refreshing a cache immediately after the Hub insert.
- Insert newly loaded business keys into a processing queue for downstream consumption by the `LOAD_BFX_DV` batch.
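The Link-to-Hub integrity check described above might look like the following sketch. The surrogate key column name (`Customer_SK`) is an assumption based on typical BimlFlex naming, not taken from the generated procedure:

```sql
-- Fail fast if any Link row references a Hub key that does not exist
IF EXISTS (
    SELECT 1
    FROM [dv].[LNK_SalesOrder] lnk
    WHERE NOT EXISTS (
        SELECT 1
        FROM [dv].[HUB_Customer] hub
        WHERE hub.[Customer_SK] = lnk.[Customer_SK]))
BEGIN
    RAISERROR('Orphan rows detected in dv.LNK_SalesOrder after insert', 16, 1);
    RETURN;
END
```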
Prerequisites
- Target object must be a Hub, Link, Satellite, or Reference type in BimlFlex metadata.
- The `BFX_DV` connection must be configured.
- Familiarity with the distinction between `DvPostTargetSql` (after target INSERT) and `DvPostProcessSql` (end of procedure) is important.
- Does not apply to PIT or Bridge objects.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPostTargetSql` and the `target` attribute to your Data Vault object.
- Write the SQL fragment that should execute after the target INSERT.
- Build and review the generated stored procedure to verify placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPostTargetSql" target="SAT_Customer_Details_awlt" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
DECLARE @InsertedRows INT = @@ROWCOUNT;
INSERT INTO dv.AuditLog (ObjectName, RowsInserted, EventType, LoadDate)
SELECT 'SAT_Customer_Details_awlt', @InsertedRows, 'PostTargetInsert', GETDATE();
Example
Before (generated procedure target INSERT section):
-- Target INSERT into Satellite
INSERT INTO [dv].[SAT_Customer_Details_awlt] ([Customer_SK], [FlexRowHashType1], ...)
SELECT [Customer_SK], [FlexRowHashType1], ...
FROM #SrcData src
WHERE NOT EXISTS (SELECT 1 FROM [dv].[SAT_Customer_Details_awlt] tgt WHERE ...)
-- ... procedure continues ...
After (with DvPostTargetSql applied):
-- Target INSERT into Satellite
INSERT INTO [dv].[SAT_Customer_Details_awlt] ([Customer_SK], [FlexRowHashType1], ...)
SELECT [Customer_SK], [FlexRowHashType1], ...
FROM #SrcData src
WHERE NOT EXISTS (SELECT 1 FROM [dv].[SAT_Customer_Details_awlt] tgt WHERE ...)
DECLARE @InsertedRows INT = @@ROWCOUNT;
INSERT INTO dv.AuditLog (ObjectName, RowsInserted, EventType, LoadDate)
SELECT 'SAT_Customer_Details_awlt', @InsertedRows, 'PostTargetInsert', GETDATE();
-- ... procedure continues ...
Common Mistakes
- Confusing DvPostTargetSql with DvPostProcessSql -- Symptom: SQL appears at the end of the procedure instead of right after the INSERT. Fix: use `DvPostTargetSql` for logic that must run immediately after the INSERT; use `DvPostProcessSql` for end-of-procedure logic.
- Not capturing `@@ROWCOUNT` as the first statement -- Symptom: `@@ROWCOUNT` returns the count from your first SQL statement, not the INSERT. Fix: always declare a variable and capture `@@ROWCOUNT` in the very first line.
- Running expensive queries against the target table immediately after insert -- Symptom: performance degradation or blocking. Fix: keep post-target logic lightweight; defer heavy operations to `DvPostProcessSql` or a separate step.
Related Extension Points
- DvPreTargetSql -- runs before the target INSERT; use together to bracket the INSERT with pre/post logic.
- DvPostSourceSql -- runs after the source query but before the target INSERT.
- DvPostProcessSql -- runs at the very end of the entire procedure; use for final cleanup or logging.
Data Vault Pre Process SQL
Configure SQL to be injected at the start of the Data Vault stored procedure
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPreProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment at the very start of the generated Data Vault stored procedure, before any Hub, Link, or Satellite load logic executes. This is the outermost pre-processing hook for the core Data Vault load and is ideal for session configuration, variable initialization, creating temporary tables, or logging the start of a load cycle. The injected SQL runs within the stored procedure scope.
Where It Fires
Consumed in Code/DataVaultProvider.cs (lines ~169-173), as well as Code/DatabricksProvider.cs and Code/FabricProvider.cs. Injected into the generated stored procedure before the main Data Vault load logic. Applied to Hub, Link, Satellite, and Reference objects. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Set session-level options such as `SET NOCOUNT ON`, `SET XACT_ABORT ON`, or `SET LOCK_TIMEOUT` that should apply to the entire Data Vault load procedure for `dv.HUB_Customer` or `dv.SAT_Customer_Details_awlt`.
- Declare and initialize variables that will be referenced by other extension points later in the procedure (e.g., a `@LoadBatchId` variable used by both `DvPostSourceSql` and `DvPostProcessSql`).
- Create temporary tables or table variables that will be populated and consumed during the load -- use this instead of `DvPreSourceSql` when the temp table should be available to the entire procedure, not just the source query.
- Log the start of the Data Vault load for the current object to an audit table within the `LOAD_BFX_DV` batch.
- Execute a pre-flight check (e.g., verify the staging table has data) and abort early if conditions are not met -- use this instead of orchestration-level checks when you want the logic embedded in the generated procedure.
- Prepare hub lookup tables before Data Vault loading begins. Insert SQL that populates or refreshes a temporary lookup table with the latest hub keys before the main Data Vault stored procedure starts its hub/link/satellite inserts. This pre-warms the lookup and avoids repeated full-table scans during satellite loading.
- Run a SQL-based data quality check before DV processing. Add a pre-process SQL fragment that validates referential integrity between staging and the target hub -- logging orphan records to an error table and optionally aborting the load if the error count exceeds a threshold.
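The threshold-based data quality check described above could be sketched as follows. The error table, threshold value, and column names are illustrative assumptions, not BimlFlex defaults:

```sql
-- Log staging rows whose business key fails validation, then abort
-- the load if the error count exceeds a tolerance threshold.
DECLARE @ep_ErrorCount INT;

INSERT INTO dv.ErrorLog (ObjectName, BusinessKey, EventType, LoadDate)
SELECT 'HUB_Customer', stg.[Customer_BK], 'InvalidKey', GETDATE()
FROM [BFX_STG].[awlt].[SalesLT_Customer] stg
WHERE stg.[Customer_BK] IS NULL;

SET @ep_ErrorCount = @@ROWCOUNT;

IF @ep_ErrorCount > 100  -- illustrative threshold
BEGIN
    RAISERROR('Data quality threshold exceeded for HUB_Customer load', 16, 1);
    RETURN;
END
```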
Prerequisites
- Target object must be a Hub, Link, Satellite, or Reference type in BimlFlex metadata.
- The `BFX_DV` connection must be configured.
- Does not apply to PIT or Bridge objects -- use `DvPitPreProcessSql` or `DvBridgePreProcessSql` for those.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPreProcessSql` and the `target` attribute to your Data Vault object.
- Write the SQL that should execute at the start of the procedure.
- Build and review the generated stored procedure to confirm the SQL appears at the beginning of the procedure body.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPreProcessSql" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
SET XACT_ABORT ON;
DECLARE @LoadBatchId UNIQUEIDENTIFIER = NEWID();
INSERT INTO dv.AuditLog (ObjectName, BatchId, EventType, LoadDate)
SELECT 'HUB_Customer', @LoadBatchId, 'DvLoadStart', GETDATE();
Example
Before (start of generated DV stored procedure):
CREATE PROCEDURE [dv].[usp_HUB_Customer]
AS
BEGIN
-- Source query begins ...
After (with DvPreProcessSql applied):
CREATE PROCEDURE [dv].[usp_HUB_Customer]
AS
BEGIN
SET XACT_ABORT ON;
DECLARE @LoadBatchId UNIQUEIDENTIFIER = NEWID();
INSERT INTO dv.AuditLog (ObjectName, BatchId, EventType, LoadDate)
SELECT 'HUB_Customer', @LoadBatchId, 'DvLoadStart', GETDATE();
-- Source query begins ...
Common Mistakes
- Using this extension point for PIT or Bridge objects -- Symptom: the extension point is ignored. Fix: use `DvPitPreProcessSql` or `DvBridgePreProcessSql` for PIT and Bridge objects respectively.
- Declaring variables that conflict with BimlFlex-generated variable names -- Symptom: "variable already declared" error during procedure creation. Fix: use a distinctive naming prefix like `@ep_` for extension point variables, and review the generated procedure to check for conflicts.
- Placing post-load logic in PreProcess -- Symptom: the logic runs on stale data from the previous load. Fix: move result-dependent logic to `DvPostProcessSql` or `DvPostTargetSql`.
- Setting `ObjectInherit = true` when only one object needs pre-processing -- Symptom: the pre-process SQL runs for every Hub, Link, and Satellite in the `LOAD_BFX_DV` batch. Fix: use `ObjectInherit = false` and set the `target` attribute to the specific object.
Related Extension Points
- DvPostProcessSql -- the counterpart that runs at the end of the procedure; use together for setup/teardown patterns.
- DvPreSourceSql -- runs before the source query specifically, not at the start of the entire procedure.
- DvPreTargetSql -- runs before the target INSERT specifically.
- DvPitPreProcessSql / DvBridgePreProcessSql -- the equivalent hooks for PIT and Bridge objects.
Data Vault Pre Source SQL
Configure SQL to be injected before the staging query in the Data Vault stored procedure
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPreSourceSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment immediately before the source (staging) query within the generated Data Vault stored procedure. In an ELT pattern, the source query reads from the staging area and prepares data for the Data Vault target. This extension point lets you set up temporary structures, apply filters, or execute preparatory logic right before the staging data is queried for the Hub, Link, or Satellite load.
Where It Fires
Consumed in Code/DataVaultProvider.cs (lines ~184-194), Code/DatabricksProvider.cs, and Code/FabricProvider.cs. The SQL fragment is placed before the source query SQL block in the generated stored procedure. Applied to Hub, Link, Satellite, and Reference objects. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Create an index on the staging table before the source query executes to improve join performance when loading `dv.HUB_Customer` from `BFX_STG` -- use this instead of permanently indexing staging tables when indexes are only needed during the DV load window.
- Populate a lookup temp table that the source query will reference -- for example, a list of valid customer IDs to filter the `SalesLT.Customer` staging data before it reaches the Hub.
- Update statistics on the staging table to ensure the query optimizer makes good choices for the source query against `BFX_STG`.
- Log the pre-source state (e.g., record counts in staging) for later comparison with `DvPostSourceSql`.
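The lookup temp table pattern mentioned above might be sketched like this. The table name, filter rule, and `#ep_` prefix are illustrative:

```sql
-- Build a filter list of valid customer IDs before the source query runs
CREATE TABLE #ep_ValidCustomers (CustomerID INT PRIMARY KEY);

INSERT INTO #ep_ValidCustomers (CustomerID)
SELECT [CustomerID]
FROM [BFX_STG].[awlt].[SalesLT_Customer]
WHERE [EmailAddress] IS NOT NULL;  -- example business rule
```

Note that the generated source query will only consume this temp table if its SQL is also customized to join against it; on its own, this fragment just prepares the structure.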
Prerequisites
- Target object must be a Hub, Link, Satellite, or Reference type in BimlFlex metadata.
- The `BFX_DV` and `BFX_STG` connections must be configured.
- Understanding of where this SQL lands relative to `DvPreProcessSql` is important: `DvPreProcessSql` runs at the start of the procedure, while `DvPreSourceSql` runs specifically before the source query block.
- Does not apply to PIT or Bridge objects.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvPreSourceSql` and the `target` attribute to your Data Vault object.
- Write the SQL fragment that should execute before the source query.
- Build and review the generated stored procedure to verify placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPreSourceSql" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
-- Create temp index on staging table for join performance
CREATE NONCLUSTERED INDEX IX_STG_Customer_BK
ON [BFX_STG].[awlt].[SalesLT_Customer] ([CustomerID])
WITH (DATA_COMPRESSION = PAGE);
Example
Before (generated procedure source query section):
-- ... procedure start and DvPreProcessSql ...
-- Source query
SELECT [Customer_BK], [FlexRowHashType1], ...
INTO #SrcData
FROM [BFX_STG].[awlt].[SalesLT_Customer]
After (with DvPreSourceSql applied):
-- ... procedure start and DvPreProcessSql ...
-- Create temp index on staging table for join performance
CREATE NONCLUSTERED INDEX IX_STG_Customer_BK
ON [BFX_STG].[awlt].[SalesLT_Customer] ([CustomerID])
WITH (DATA_COMPRESSION = PAGE);
-- Source query
SELECT [Customer_BK], [FlexRowHashType1], ...
INTO #SrcData
FROM [BFX_STG].[awlt].[SalesLT_Customer]
Common Mistakes
- Creating indexes that persist after the procedure and accumulate over time -- Symptom: staging table performance degrades due to excessive indexes. Fix: use `DvPostSourceSql` to drop the index after the source query completes.
- Confusing DvPreSourceSql with DvPreProcessSql -- Symptom: SQL appears at the wrong location in the procedure. Fix: `DvPreSourceSql` runs right before the source query; `DvPreProcessSql` runs at the very start of the procedure.
- Modifying staging data in place (UPDATE/DELETE on the staging table) -- Symptom: other objects that reference the same staging table see modified data. Fix: use a temp table copy instead of modifying the shared staging table.
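To keep the load-window index from accumulating, a matching `DvPostSourceSql` fragment could drop it after the source query. A sketch using the index name from the example above (the `DROP INDEX IF EXISTS` form assumes SQL Server 2016 or later):

```sql
-- Drop the temporary staging index created in DvPreSourceSql
DROP INDEX IF EXISTS IX_STG_Customer_BK
    ON [BFX_STG].[awlt].[SalesLT_Customer];
```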
Related Extension Points
- DvPostSourceSql -- runs after the source query; use together to bracket the source query with setup/teardown logic.
- DvPreProcessSql -- runs at the very start of the procedure; use for session-level setup that applies to the entire procedure.
- DvPreTargetSql -- runs before the target INSERT, not the source query.
Data Vault Pre Target SQL
Configure SQL to be injected before the target source query in the Data Vault stored procedure
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPreTargetSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */
What It Does
Injects a custom SQL fragment immediately before the target INSERT query in the generated Data Vault stored procedure. This runs after the source query has been executed and staged data is ready, but before it is written to the Hub, Link, or Satellite. It is useful for disabling constraints or triggers on the target table, setting up the target for a specific insert strategy, or performing final transformations on the staged data.
Where It Fires
Consumed in Code/DataVaultProvider.cs (lines ~184-194), Code/DatabricksProvider.cs, and Code/FabricProvider.cs. The SQL fragment is placed before the target INSERT...SELECT statement in the generated stored procedure. Applied to Hub, Link, Satellite, and Reference objects. Supports MSSQL, SQLDB, SQLDW, SQLMI, Snowflake (SFLDW), and Fabric targets.
When to Use It
- Disable non-clustered indexes on the Satellite table before a large initial load to improve insert performance -- for example, disabling indexes on `dv.SAT_Customer_Details_awlt` and rebuilding them via `DvPostTargetSql`.
- Disable triggers on the target table that should not fire during the ELT load process.
- Apply a final transformation to the staged temp table (e.g., deduplication) before the INSERT into `dv.HUB_Customer`.
- Set a table lock hint context or adjust transaction isolation level specifically for the target INSERT operation.
Prerequisites
- Target object must be a Hub, Link, Satellite, or Reference type in BimlFlex metadata.
- The BFX_DV connection must be configured.
- Understanding of the procedure execution order is important: DvPreProcessSql > DvPreSourceSql > source query > DvPostSourceSql > DvPreTargetSql > target INSERT > DvPostTargetSql > DvPostProcessSql.
- Does not apply to PIT or Bridge objects.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the extensionpoint directive to DvPreTargetSql and the target attribute to your Data Vault object.
- Write the SQL fragment that should execute before the target INSERT.
- Build and review the generated stored procedure to verify placement.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvPreTargetSql" target="SAT_Customer_Details_awlt" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
-- Disable non-clustered indexes for bulk insert performance
ALTER INDEX IX_SAT_Customer_Details_Hash ON dv.SAT_Customer_Details_awlt DISABLE;
Example
Before (generated procedure target INSERT section):
-- ... source query and DvPostSourceSql complete ...
-- Target INSERT into Satellite
INSERT INTO [dv].[SAT_Customer_Details_awlt] ([Customer_SK], [FlexRowHashType1], ...)
SELECT ...
After (with DvPreTargetSql applied):
-- ... source query and DvPostSourceSql complete ...
-- Disable non-clustered indexes for bulk insert performance
ALTER INDEX IX_SAT_Customer_Details_Hash ON dv.SAT_Customer_Details_awlt DISABLE;
-- Target INSERT into Satellite
INSERT INTO [dv].[SAT_Customer_Details_awlt] ([Customer_SK], [FlexRowHashType1], ...)
SELECT ...
Common Mistakes
- Disabling indexes without re-enabling them in DvPostTargetSql -- Symptom: downstream queries fail or perform poorly because indexes remain disabled. Fix: always pair ALTER INDEX ... DISABLE in DvPreTargetSql with ALTER INDEX ... REBUILD in DvPostTargetSql.
- Confusing DvPreTargetSql with DvPreSourceSql -- Symptom: SQL runs before the source query instead of before the INSERT. Fix: use DvPreTargetSql for pre-INSERT logic and DvPreSourceSql for pre-source-query logic.
- Dropping the temp table that the target INSERT will read from -- Symptom: "Invalid object name" error during the INSERT. Fix: never drop or truncate the staged temp table in DvPreTargetSql.
Related Extension Points
- DvPostTargetSql -- runs after the target INSERT; use together to bracket the INSERT (e.g., disable/rebuild indexes).
- DvPreSourceSql -- runs before the source query; use when your setup logic relates to the source read, not the target write.
- DvPreProcessSql -- runs at the very start of the procedure; use for session-level setup.
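The disable/rebuild bracket described above needs its closing half in DvPostTargetSql. A minimal sketch, reusing the illustrative index and table names from the example in this section:

```sql
-- DvPostTargetSql fragment (sketch): rebuild the index that the
-- DvPreTargetSql fragment disabled before the target INSERT
ALTER INDEX IX_SAT_Customer_Details_Hash ON dv.SAT_Customer_Details_awlt REBUILD;
```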
Data Vault Source Override
Configure override for the Data Vault Source transformation node
Parameters
Name | Type | Description |
|---|---|---|
| table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the override will be added |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourceOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<!-- NOTE: The type of source override must be consistent with the connection type. -->
<# CustomOutput.ObjectInherit = false; #>
<OleDbSource Name="OLE_SRC - MyTable" ConnectionName="MY_SRC" CommandTimeout="0" ValidateExternalMetadata="false">
<DirectInput>SELECT [Code], [Name] FROM [dbo].[MyTable]</DirectInput>
<Columns>
<Column SourceColumn="Name" TargetColumn="MyTableName" />
<Column SourceColumn="Code" TargetColumn="MyTableCode" />
</Columns>
<# CustomOutput.OutputPathName = @"OLE_SRC - MyTable.Output"; #>
</OleDbSource>
What It Does
Completely replaces the default SSIS dataflow source component for a Data Vault load package. Instead of using the BimlFlex-generated OLE DB Source or ADO.NET Source that reads from the staging area, you provide your own source component with custom SQL, a different connection, or an entirely different source type. This is an ETL-only (SSIS) extension point that gives you full control over where and how the Data Vault dataflow reads its input data.
Where It Fires
Consumed in 90.flx-i-get-target-source-transformation.biml. This template generates the SSIS dataflow source component for Data Vault objects. When this extension point is defined, it replaces the entire default source node in the dataflow. SSIS-only -- does not apply to ELT/stored procedure patterns (Snowflake, Databricks, Fabric). The CustomOutput.OutputPathName must be set to connect the custom source to the downstream dataflow transformations.
When to Use It
- Read from a different source table or connection than what BimlFlex generates by default -- for example, sourcing dv.HUB_Customer data from a pre-processed view in AWLT_SRC instead of the staging table in BFX_STG.
- Apply a custom SQL query with complex joins or filters at the source level -- use this instead of DvSourcePipelinePre when you need to replace the entire source, not just add a transformation after it.
- Use a flat file or Excel source for a one-time data migration into the Data Vault when the data does not come from a relational staging area.
- Override the source to read from an ODS connection (BFX_ODS) for a re-load scenario where data needs to be re-ingested into the Data Vault from an intermediate layer.
Prerequisites
- SSIS (ETL) architecture must be in use. This extension point does not apply to ELT patterns.
- The connection referenced in the source component must be defined in BimlFlex metadata.
- CustomOutput.OutputPathName must be set to the output path of your source component so downstream transformations can connect.
- The source component type must be consistent with the connection type (e.g., OleDbSource for OLE DB connections, AdoNetSource for ADO.NET connections).
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the extensionpoint directive to DvSourceOverride and the target attribute to your Data Vault object.
- Define the replacement source component in Biml XML, including the connection, query, and column mappings.
- Set CustomOutput.OutputPathName to the output path name of your source component.
- Build and verify the generated SSIS package in Visual Studio to confirm the source component is wired correctly.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourceOverride" target="HUB_Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
<OleDbSource Name="OLE_SRC - Customer_Filtered" ConnectionName="AWLT_SRC" CommandTimeout="0" ValidateExternalMetadata="false">
<DirectInput>
SELECT [CustomerID] AS [Customer_BK], [CustomerName]
FROM [SalesLT].[Customer]
WHERE [IsActive] = 1
</DirectInput>
<# CustomOutput.OutputPathName = @"OLE_SRC - Customer_Filtered.Output"; #>
</OleDbSource>
Example
Before (default generated SSIS source component):
<OleDbSource Name="OLE_SRC - BFX_STG_awlt_Customer" ConnectionName="BFX_STG" CommandTimeout="0">
<DirectInput>SELECT * FROM [awlt].[SalesLT_Customer]</DirectInput>
</OleDbSource>
After (with DvSourceOverride applied):
<OleDbSource Name="OLE_SRC - Customer_Filtered" ConnectionName="AWLT_SRC" CommandTimeout="0" ValidateExternalMetadata="false">
<DirectInput>
SELECT [CustomerID] AS [Customer_BK], [CustomerName]
FROM [SalesLT].[Customer]
WHERE [IsActive] = 1
</DirectInput>
</OleDbSource>
Common Mistakes
- Forgetting to set CustomOutput.OutputPathName -- Symptom: the SSIS package fails validation because downstream transformations have no connected input. Fix: always set CustomOutput.OutputPathName to "<SourceName>.Output", matching your source component's Name attribute.
- Column name or data type mismatches between the custom source and expected downstream columns -- Symptom: SSIS validation errors or data truncation. Fix: ensure column names and types in your custom source match what the downstream transformations expect; use <Columns> with SourceColumn/TargetColumn mappings to alias if needed.
- Using this extension point with ELT architecture -- Symptom: the extension point is ignored; the stored procedure source query is unchanged. Fix: for ELT patterns, use DvPreSourceSql/DvPostSourceSql to modify the source query SQL.
- Using an incompatible source component type for the connection -- Symptom: SSIS build error about an incompatible connection manager. Fix: match the source component to the connection type (e.g., OleDbSource for OLE DB, AdoNetSource for ADO.NET).
Related Extension Points
- DvSourcePipelinePre / DvSourcePipelinePost -- use these to add transformations before or after the source in the dataflow rather than replacing the source entirely.
- DvTargetOverride -- the target-side equivalent; replaces the destination component in the SSIS dataflow.
- DvPreSourceSql / DvPostSourceSql -- ELT equivalents for modifying the source query in stored procedures.
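The ODS re-load scenario mentioned above can be sketched with the same OleDbSource shape used in the template. This is an illustration only: the view [ods].[Customer_Current] is hypothetical, and the BFX_ODS connection must already exist in your metadata.

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourceOverride" target="HUB_Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<# CustomOutput.ObjectInherit = false; #>
<!-- Hypothetical ODS view; replace with your own object -->
<OleDbSource Name="OLE_SRC - Customer_Ods" ConnectionName="BFX_ODS" CommandTimeout="0" ValidateExternalMetadata="false">
    <DirectInput>SELECT [Customer_BK], [CustomerName] FROM [ods].[Customer_Current]</DirectInput>
    <# CustomOutput.OutputPathName = @"OLE_SRC - Customer_Ods.Output"; #>
</OleDbSource>
```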
Data Vault Source Pipeline Post
Configure pipeline logic that will be injected after the Data Vault source transformations
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
| inputPath | String | Contains the output path of the preceding task |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourcePipelinePost" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>
What It Does
Injects custom SSIS dataflow transformation components after the Data Vault source-side transformations but before the data reaches the target-side logic. This extension point lets you add data conversions, derived columns, lookups, conditional splits, or any other SSIS dataflow component into the pipeline after the source data has been read and initial transformations have been applied. Use it to reshape or enrich the data in-flight before it is written to the Hub, Link, or Satellite.
Where It Fires
Consumed in 90.flx-i-get-target-source-conversion.biml and 90.flx-i-get-source-rowcount.biml. These templates generate the SSIS dataflow source-side transformation chain for Data Vault objects. The custom component is inserted after BimlFlex's standard source-side transformations (row counts, data conversions). SSIS-only -- does not apply to ELT/stored procedure patterns.
When to Use It
- Add a Data Conversion component to change the data type of a column coming from the AWLT_SRC source before it reaches the Hub insert -- use this instead of DvSourceOverride when you want to keep the default source and just add a transformation.
- Add a Derived Column transformation to compute a composite business key or apply a string normalization function before loading dv.HUB_Customer.
- Add a Lookup transformation against a reference table to enrich Satellite attributes before they are inserted into dv.SAT_Customer_Details_awlt.
- Insert a Conditional Split to route error rows to a separate output path for logging while allowing valid rows to continue to the Data Vault target.
- Add a Row Count transformation for custom audit logging -- use this instead of DvSourcePipelinePre when you need the count after source-side transformations have been applied.
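The Conditional Split scenario can be sketched as follows. Treat this as an assumption-heavy illustration: the Customer_BK column, the output path names, and the SSIS expression are illustrative and should be verified against your model and Biml version.

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourcePipelinePost" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<ConditionalSplit Name="CSPL - Filter Invalid Keys">
    <InputPath OutputPathName="<#=inputPath #>" />
    <OutputPaths>
        <!-- Valid rows continue to the Data Vault target -->
        <OutputPath Name="ValidRows">
            <Expression>!ISNULL(Customer_BK) &amp;&amp; LTRIM(RTRIM(Customer_BK)) != ""</Expression>
        </OutputPath>
    </OutputPaths>
    <!-- Invalid rows fall through to the default output for separate handling -->
    <# CustomOutput.OutputPathName = @"CSPL - Filter Invalid Keys.ValidRows"; #>
</ConditionalSplit>
```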
Prerequisites
- SSIS (ETL) architecture must be in use. This extension point does not apply to ELT patterns.
- The inputPath parameter provides the output path name from the preceding component; you must reference it in your component's <InputPath> element.
- CustomOutput.OutputPathName must be set so downstream components can connect.
- The transformation component must produce column names and types compatible with what downstream components expect.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the extensionpoint directive to DvSourcePipelinePost and the target attribute to your Data Vault object.
- Define the SSIS dataflow component in Biml XML, referencing inputPath for the input and setting CustomOutput.OutputPathName for the output.
- Build and verify the generated SSIS package in Visual Studio.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourcePipelinePost" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<DerivedColumns Name="DER - Normalize CustomerName">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column Name="Customer_BK" DataType="String" Length="50" ReplaceExisting="true">UPPER(TRIM(Customer_BK))</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - Normalize CustomerName.Output"; #>
</DerivedColumns>
Example
Before (default SSIS dataflow source-side chain):
<!-- BimlFlex source transformations -->
<RowCount Name="RC_SRC - Customer" VariableName="User.SourceRowCount">
<InputPath OutputPathName="OLE_SRC - Customer.Output" />
</RowCount>
<!-- Data flows directly to target-side transformations -->
After (with DvSourcePipelinePost applied):
<!-- BimlFlex source transformations -->
<RowCount Name="RC_SRC - Customer" VariableName="User.SourceRowCount">
<InputPath OutputPathName="OLE_SRC - Customer.Output" />
</RowCount>
<DerivedColumns Name="DER - Normalize CustomerName">
<InputPath OutputPathName="RC_SRC - Customer.Output" />
<Columns>
<Column Name="Customer_BK" DataType="String" Length="50" ReplaceExisting="true">UPPER(TRIM(Customer_BK))</Column>
</Columns>
</DerivedColumns>
<!-- Data flows to target-side transformations from DER output -->
Common Mistakes
- Forgetting to set CustomOutput.OutputPathName -- Symptom: downstream transformations have no input connection; the SSIS package fails validation. Fix: always set CustomOutput.OutputPathName to "<ComponentName>.Output".
- Hardcoding the input path instead of using <#=inputPath #> -- Symptom: the component connects to the wrong upstream output or fails if BimlFlex changes the preceding component. Fix: always use the inputPath parameter.
- Renaming or removing columns that downstream components depend on -- Symptom: SSIS validation error about missing columns. Fix: use ReplaceExisting="true" for in-place modifications, or add new columns rather than removing existing ones.
- Using this extension point with ELT architecture -- Symptom: the extension point is ignored. Fix: for ELT, use DvPostSourceSql to add transformations in the stored procedure.
Related Extension Points
- DvSourcePipelinePre -- runs before the source-side transformations; use when you need to transform data immediately after the source read.
- DvTargetPipelinePre / DvTargetPipelinePost -- for adding transformations on the target side of the dataflow.
- DvSourceOverride -- for replacing the entire source component rather than adding transformations after it.
Data Vault Source Pipeline Pre
Configure pipeline logic that will be injected after the Data Vault source transformation node
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
| inputPath | String | Contains the output path of the preceding task |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourcePipelinePre" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>
What It Does
Injects custom SSIS dataflow transformation components immediately after the Data Vault source component but before the standard source-side transformations (such as row counts and data conversions). This extension point lets you add transformations that act on the raw source data before BimlFlex applies its own processing. Use it for early-stage filtering, type casting, or data cleansing that must happen before the standard pipeline logic.
Where It Fires
Consumed in 90.flx-i-get-target-source-conversion.biml and 90.flx-i-get-source-rowcount.biml. These templates generate the SSIS dataflow source-side transformation chain for Data Vault objects. The custom component is inserted after the source component but before BimlFlex's standard source-side transformations. SSIS-only -- does not apply to ELT/stored procedure patterns.
When to Use It
- Add an early Data Conversion to cast columns from the AWLT_SRC source to the expected types before BimlFlex's own data conversion runs -- use this instead of DvSourcePipelinePost when the type change must happen before row counts or hash calculations.
- Insert a Conditional Split immediately after the source to discard invalid rows before they enter the standard transformation chain.
- Add a Derived Column to create a surrogate or composite column that BimlFlex's standard transformations will reference downstream.
- Insert a data sampling or auditing component for debugging purposes early in the pipeline.
Prerequisites
- SSIS (ETL) architecture must be in use. This extension point does not apply to ELT patterns.
- The inputPath parameter provides the output path name from the source component; reference it in your <InputPath> element.
- CustomOutput.OutputPathName must be set so the standard BimlFlex transformations can connect downstream.
- The transformation must preserve all columns expected by subsequent BimlFlex-generated components.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the extensionpoint directive to DvSourcePipelinePre and the target attribute to your Data Vault object.
- Define the SSIS dataflow component in Biml XML, referencing inputPath and setting CustomOutput.OutputPathName.
- Build and verify the generated SSIS package in Visual Studio.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvSourcePipelinePre" target="SAT_Customer_Details_awlt" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Cast EmailAddress">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="EmailAddress" TargetColumn="cnv_EmailAddress" DataType="String" Length="320" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Cast EmailAddress.Output"; #>
</DataConversion>
Example
Before (default SSIS dataflow beginning):
<OleDbSource Name="OLE_SRC - Customer" ConnectionName="BFX_STG" CommandTimeout="0">
<DirectInput>SELECT * FROM [awlt].[SalesLT_Customer]</DirectInput>
</OleDbSource>
<!-- BimlFlex standard source transformations (row count, etc.) follow -->
After (with DvSourcePipelinePre applied):
<OleDbSource Name="OLE_SRC - Customer" ConnectionName="BFX_STG" CommandTimeout="0">
<DirectInput>SELECT * FROM [awlt].[SalesLT_Customer]</DirectInput>
</OleDbSource>
<DataConversion Name="DCV - Cast EmailAddress">
<InputPath OutputPathName="OLE_SRC - Customer.Output" />
<Columns>
<Column SourceColumn="EmailAddress" TargetColumn="cnv_EmailAddress" DataType="String" Length="320" />
</Columns>
</DataConversion>
<!-- BimlFlex standard source transformations (row count, etc.) follow from DCV output -->
Common Mistakes
- Removing or renaming columns that BimlFlex's standard transformations expect -- Symptom: SSIS validation errors in downstream components. Fix: only add new columns or use ReplaceExisting for in-place type changes; never drop columns.
- Hardcoding the input path -- Symptom: breaks if BimlFlex changes the source component name. Fix: always use <#=inputPath #>.
- Confusing Pre with Post -- Symptom: the transformation runs after BimlFlex row counts and hash calculations instead of before. Fix: use DvSourcePipelinePre for early-stage transformations and DvSourcePipelinePost for post-standard-transformation logic.
- Using this with ELT patterns -- Symptom: the extension point is ignored. Fix: for ELT, use DvPreSourceSql to modify the source query in the stored procedure.
Related Extension Points
- DvSourcePipelinePost -- runs after the standard source-side transformations; use when you need to transform data after BimlFlex's processing.
- DvSourceOverride -- for replacing the entire source component rather than adding transformations after it.
- DvTargetPipelinePre -- the target-side equivalent for adding transformations before target-side processing.
Data Vault Target Override
Configure override for the Data Vault target transformation node
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| inputPath | String | Contains the output path of the preceding task |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetOverride" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<!-- NOTE: The type of target override must be consistent with the connection type. -->
<# var targetConnection = targetTable.GetTargetConnection();
   var targetScopedName = targetTable.GetTargetScopedName(targetConnection);
   var targetSsisSafeScopedName = targetScopedName.MakeSsisSafe();
#>
<# CustomOutput.ObjectInherit = false; #>
<OleDbDestination Name="OLE_DST - <#=targetSsisSafeScopedName#>" ConnectionName="<#=targetConnection#>" MaximumInsertCommitSize="500000" BatchSize="500000" CheckConstraints="false">
<InputPath OutputPathName="<#=inputPath#>" />
<ExternalTableOutput Table="[dbo].[HUB_MyTable]" />
</OleDbDestination>
What It Does
Completely replaces the default SSIS dataflow destination component for a Data Vault load package. Instead of using the BimlFlex-generated OLE DB Destination that writes to the Hub, Link, or Satellite table, you provide your own destination component with custom connection settings, table mappings, or an entirely different destination type. This is an ETL-only (SSIS) extension point that gives you full control over how and where the Data Vault dataflow writes its output.
Where It Fires
Consumed in 90.flx-i-get-target-transformation.biml (line ~17-22). Checked AFTER DwhTargetOverride (the Data Mart target override takes precedence). The fallback order is: DwhTargetOverride > DvTargetOverride > default destination. When defined, the custom destination replaces the entire default target node in the dataflow. SSIS-only -- does not apply to ELT/stored procedure patterns.
When to Use It
- Redirect the Data Vault output to a different table or schema than what BimlFlex metadata specifies -- for example, writing dv.HUB_Customer to a partitioned staging table with a subsequent swap operation.
- Change the destination component type -- for example, use an ADO.NET Destination instead of OLE DB for a specific connectivity requirement to the BFX_DV target.
- Adjust batch size, commit size, or constraint-checking settings on the destination for performance tuning during initial loads.
- Write to a flat file or other non-relational destination for a one-time export of Data Vault data through the standard pipeline.
- Add a custom OLE DB Command destination for UPSERT logic -- use this instead of the default INSERT when the target does not support the standard BimlFlex merge pattern.
Prerequisites
- SSIS (ETL) architecture must be in use. This extension point does not apply to ELT patterns.
- The inputPath parameter provides the output path name from the preceding transformation; reference it in your component's <InputPath> element.
- The connection referenced in the destination must be defined in BimlFlex metadata.
- The destination component type must be consistent with the connection type.
- Be aware that DwhTargetOverride takes precedence if both are defined for the same object.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the extensionpoint directive to DvTargetOverride and the target attribute to your Data Vault object.
- Define the replacement destination component in Biml XML, referencing inputPath for the input.
- Build and verify the generated SSIS package in Visual Studio to confirm the destination component is wired correctly.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetOverride" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# var targetConnection = targetTable.GetTargetConnection();
var targetScopedName = targetTable.GetTargetScopedName(targetConnection);
var targetSsisSafeScopedName = targetScopedName.MakeSsisSafe();
#>
<# CustomOutput.ObjectInherit = false; #>
<OleDbDestination Name="OLE_DST - <#=targetSsisSafeScopedName#>" ConnectionName="<#=targetConnection#>" MaximumInsertCommitSize="200000" BatchSize="200000" CheckConstraints="false">
<InputPath OutputPathName="<#=inputPath#>" />
<ExternalTableOutput Table="[dv].[HUB_Customer]" />
</OleDbDestination>
Example
Before (default generated SSIS destination component):
<OleDbDestination Name="OLE_DST - dv_HUB_Customer" ConnectionName="BFX_DV" MaximumInsertCommitSize="500000" BatchSize="500000" CheckConstraints="false">
<InputPath OutputPathName="RC_TGT - HUB_Customer.Output" />
<ExternalTableOutput Table="[dv].[HUB_Customer]" />
</OleDbDestination>
After (with DvTargetOverride applied -- reduced batch sizes):
<OleDbDestination Name="OLE_DST - dv_HUB_Customer" ConnectionName="BFX_DV" MaximumInsertCommitSize="200000" BatchSize="200000" CheckConstraints="false">
<InputPath OutputPathName="RC_TGT - HUB_Customer.Output" />
<ExternalTableOutput Table="[dv].[HUB_Customer]" />
</OleDbDestination>
Common Mistakes
- Having both DwhTargetOverride and DvTargetOverride defined for the same object -- Symptom: DvTargetOverride is silently ignored because DwhTargetOverride takes precedence. Fix: use only one target override per object; remove the DwhTargetOverride if you intend the DV-specific override to apply.
- Forgetting to reference inputPath in the <InputPath> element -- Symptom: SSIS validation fails because the destination has no input connection. Fix: always use <InputPath OutputPathName="<#=inputPath#>" />.
- Mismatching the table schema in <ExternalTableOutput> -- Symptom: column mapping errors in the SSIS package. Fix: ensure the Table attribute references the correct schema and table name with proper bracket escaping.
- Using this with ELT architecture -- Symptom: the extension point is ignored; the stored procedure target INSERT is unchanged. Fix: for ELT, use DvPreTargetSql/DvPostTargetSql to modify the target INSERT behavior.
Related Extension Points
- DvTargetPipelinePre / DvTargetPipelinePost -- use these to add transformations before or after the destination rather than replacing it entirely.
- DvSourceOverride -- the source-side equivalent; replaces the source component in the SSIS dataflow.
- DvPreTargetSql / DvPostTargetSql -- ELT equivalents for modifying the target INSERT in stored procedures.
Data Vault Target Pipeline Post
Configure pipeline logic that will be injected before the target transformation node
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added |
| inputPath | String | Contains the output path of the preceding task |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetPipelinePost" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>
What It Does
Injects custom SSIS dataflow transformation components after the Data Vault target-side transformations but before the destination component. This extension point lets you add final data conversions, derived columns, row counts, lookups, or other SSIS transformations that act on the data immediately before it is written to the Hub, Link, or Satellite. Use it for last-mile data shaping that must happen after all standard target-side processing.
Where It Fires
Consumed in 90.flx-i-get-target-transformation.biml (line ~12). Conditional logic applies: if the integrationStage is "DWH", the Dwh variant is used; otherwise the Dv variant fires. This ensures the correct extension point activates based on the object's integration stage. SSIS-only -- does not apply to ELT/stored procedure patterns.
When to Use It
- Add a final Data Conversion to ensure column types exactly match the Hub or Satellite table schema before the destination insert -- use this instead of `DvTargetPipelinePre` when you need the conversion applied after all standard target-side transformations.
- Insert a Row Count transformation immediately before the destination for custom audit logging of the exact number of rows reaching `dv.SAT_Customer_Details_awlt`.
- Add a Multicast to split the data stream and send a copy to a secondary destination (e.g., an audit table) while the primary stream goes to the Data Vault target.
- Insert a Derived Column transformation to add a load timestamp or batch identifier column just before the data reaches `dv.HUB_Customer`.
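The Multicast scenario above can be sketched as follows. This is a minimal illustration, not generated BimlFlex output: the `AuditDb` connection, the `[audit].[SAT_Customer_Details_Copy]` table, and the output path names are assumed placeholders.

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetPipelinePost" target="SAT_Customer_Details_awlt" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<!-- Split the stream: one path continues to the Data Vault target, the other feeds an audit copy. -->
<Multicast Name="MLT - Audit Copy">
  <InputPath OutputPathName="<#=inputPath #>" />
  <OutputPaths>
    <OutputPath Name="ToTarget" />
    <OutputPath Name="ToAudit" />
  </OutputPaths>
</Multicast>
<!-- Hypothetical audit destination; connection and table names are placeholders. -->
<OleDbDestination Name="OLE_DST - Audit Copy" ConnectionName="AuditDb">
  <InputPath OutputPathName="MLT - Audit Copy.ToAudit" />
  <ExternalTableOutput Table="[audit].[SAT_Customer_Details_Copy]" />
</OleDbDestination>
<!-- Hand the primary path on to the destination component. -->
<# CustomOutput.OutputPathName = @"MLT - Audit Copy.ToTarget"; #>
```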
Prerequisites
- SSIS (ETL) architecture must be in use. This extension point does not apply to ELT patterns.
- The `inputPath` parameter provides the output path from the preceding target-side transformation; reference it in your `<InputPath>` element.
- `CustomOutput.OutputPathName` must be set so the destination component can connect.
- The object's integration stage must be Data Vault (not Data Mart/DWH) for the Dv variant to fire.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvTargetPipelinePost` and the `target` attribute to your Data Vault object.
- Define the SSIS dataflow component in Biml XML, referencing `inputPath` and setting `CustomOutput.OutputPathName`.
- Build and verify the generated SSIS package in Visual Studio.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetPipelinePost" target="SAT_Customer_Details_awlt" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<RowCount Name="RC_AUDIT - SAT_Customer_Details" VariableName="User.AuditInsertCount">
<InputPath OutputPathName="<#=inputPath #>" />
<# CustomOutput.OutputPathName = @"RC_AUDIT - SAT_Customer_Details.Output"; #>
</RowCount>
Example
Before (default SSIS dataflow target-side chain before destination):
<!-- BimlFlex target-side transformations complete -->
<RowCount Name="RC_TGT - SAT_Customer_Details" VariableName="User.TargetRowCount">
<InputPath OutputPathName="LKP - SAT_Customer_Details.Output" />
</RowCount>
<!-- Destination component follows -->
After (with DvTargetPipelinePost applied):
<!-- BimlFlex target-side transformations complete -->
<RowCount Name="RC_TGT - SAT_Customer_Details" VariableName="User.TargetRowCount">
<InputPath OutputPathName="LKP - SAT_Customer_Details.Output" />
</RowCount>
<RowCount Name="RC_AUDIT - SAT_Customer_Details" VariableName="User.AuditInsertCount">
<InputPath OutputPathName="RC_TGT - SAT_Customer_Details.Output" />
</RowCount>
<!-- Destination component connects to RC_AUDIT output -->
Common Mistakes
- Forgetting to set `CustomOutput.OutputPathName` -- Symptom: the destination component has no input; SSIS package fails validation. Fix: always set `CustomOutput.OutputPathName` to your component's output path.
- Hardcoding the input path instead of using `<#=inputPath #>` -- Symptom: breaks if BimlFlex changes the preceding component name. Fix: always use the `inputPath` parameter.
- Adding transformations that filter or remove rows without accounting for downstream expectations -- Symptom: fewer rows arrive at the destination than expected; audit counts mismatch. Fix: if you need to filter, use a Conditional Split and direct both outputs appropriately.
- Confusing DvTargetPipelinePost with DvTargetPipelinePre -- Symptom: transformation runs before target-side processing instead of after. Fix: use `DvTargetPipelinePost` for post-target-transformation logic.
Related Extension Points
- DvTargetPipelinePre -- runs before the target-side transformations; use for early target-side data shaping.
- DvTargetOverride -- for replacing the entire destination component rather than adding transformations before it.
- DvSourcePipelinePost -- the source-side equivalent for adding post-source transformations.
- DvPostTargetSql -- ELT equivalent for adding logic after the target INSERT in stored procedures.
Data Vault Target Pipeline Pre
Configure pipeline logic that will be injected before the target transformations
Parameters
Name | Type | Description |
|---|---|---|
| sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object of the pipeline |
| targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added |
| inputPath | String | Contains the output path of the preceding task |
Outputs
Name | Type | Description |
|---|---|---|
| ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch. |
| OutputPathName | String | You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task. |
Template
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetPipelinePre" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>
What It Does
Injects custom SSIS dataflow transformation components before the Data Vault target-side transformations begin. This extension point lets you add transformations that act on the data after source-side processing is complete but before BimlFlex applies its own target-side logic (such as lookups, hash comparisons, and row filtering for Hub/Link/Satellite inserts). Use it for early target-side data preparation, additional lookups, or filtering.
Where It Fires
Consumed in 90.flx-i-get-target-transformation.biml (line ~12). Conditional logic applies: if the integrationStage is "DWH", the Dwh variant is used; otherwise the Dv variant fires. This ensures the correct extension point activates based on the object's integration stage. SSIS-only -- does not apply to ELT/stored procedure patterns.
When to Use It
- Add a Lookup transformation against a reference table to enrich rows with classification codes before BimlFlex's target-side Hub/Link key lookups execute -- use this instead of `DvSourcePipelinePost` when the enrichment must happen on the target side of the dataflow.
- Insert a Conditional Split to filter out rows that should not participate in target-side processing -- for example, excluding test records before loading `dv.LNK_SalesOrder`.
- Add a Derived Column transformation to compute a value that the target-side lookup or hash comparison will reference.
- Insert an Aggregate transformation to deduplicate rows before they enter the target-side processing for `dv.HUB_Customer`, preventing duplicate business key inserts.
- Add a Data Conversion component to change column types to match what target-side lookups expect -- use this instead of `DvTargetPipelinePost` when the type change must happen before the lookup.
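The Conditional Split scenario above can be sketched as follows. This is an illustration, not generated BimlFlex output: `IsTestRecord` is a hypothetical flag column in the source data, and only the non-test output path is handed on to the target-side transformations.

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetPipelinePre" target="LNK_SalesOrder" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<!-- Route test records away from target-side processing; IsTestRecord is a hypothetical source column. -->
<ConditionalSplit Name="CSPL - Exclude Test Records">
  <InputPath OutputPathName="<#=inputPath #>" />
  <OutputPaths>
    <OutputPath Name="LiveRecords">
      <Expression>IsTestRecord == FALSE</Expression>
    </OutputPath>
  </OutputPaths>
</ConditionalSplit>
<!-- Rows not matching the expression fall to the default output and are discarded here. -->
<# CustomOutput.OutputPathName = @"CSPL - Exclude Test Records.LiveRecords"; #>
```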
Prerequisites
- SSIS (ETL) architecture must be in use. This extension point does not apply to ELT patterns.
- The `inputPath` parameter provides the output path from the source-side chain; reference it in your `<InputPath>` element.
- `CustomOutput.OutputPathName` must be set so BimlFlex's target-side transformations can connect.
- The object's integration stage must be Data Vault (not Data Mart/DWH) for the Dv variant to fire.
Implementation Steps
- Create a new extension point file in the BimlFlex Extension Points editor.
- Set the `extensionpoint` directive to `DvTargetPipelinePre` and the `target` attribute to your Data Vault object.
- Define the SSIS dataflow component in Biml XML, referencing `inputPath` and setting `CustomOutput.OutputPathName`.
- Build and verify the generated SSIS package in Visual Studio.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DvTargetPipelinePre" target="HUB_Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<# CustomOutput.ObjectInherit = false; #>
<Aggregate Name="AGG - Deduplicate Customer BK">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="Customer_BK" Operation="GroupBy" />
<Column SourceColumn="FlexRowEffectiveFromDate" Operation="Maximum" />
</Columns>
<# CustomOutput.OutputPathName = @"AGG - Deduplicate Customer BK.Output"; #>
</Aggregate>
Example
Before (default SSIS dataflow transition from source-side to target-side):
<!-- Source-side transformations complete -->
<RowCount Name="RC_SRC - Customer" VariableName="User.SourceRowCount">
<InputPath OutputPathName="DCV - Customer.Output" />
</RowCount>
<!-- BimlFlex target-side transformations (lookups, hash checks) begin -->
<Lookup Name="LKP - HUB_Customer" ...>
<InputPath OutputPathName="RC_SRC - Customer.Output" />
After (with DvTargetPipelinePre applied):
<!-- Source-side transformations complete -->
<RowCount Name="RC_SRC - Customer" VariableName="User.SourceRowCount">
<InputPath OutputPathName="DCV - Customer.Output" />
</RowCount>
<Aggregate Name="AGG - Deduplicate Customer BK">
<InputPath OutputPathName="RC_SRC - Customer.Output" />
<Columns>
<Column SourceColumn="Customer_BK" Operation="GroupBy" />
<Column SourceColumn="FlexRowEffectiveFromDate" Operation="Maximum" />
</Columns>
</Aggregate>
<!-- BimlFlex target-side transformations connect to AGG output -->
<Lookup Name="LKP - HUB_Customer" ...>
<InputPath OutputPathName="AGG - Deduplicate Customer BK.Output" />
Common Mistakes
- Forgetting to set `CustomOutput.OutputPathName` -- Symptom: target-side transformations have no input; SSIS package fails validation. Fix: always set `CustomOutput.OutputPathName` to your component's output path.
- Hardcoding the input path instead of using `<#=inputPath #>` -- Symptom: breaks if BimlFlex changes the preceding source-side component. Fix: always use the `inputPath` parameter.
- Removing columns that BimlFlex's target-side lookups depend on -- Symptom: SSIS validation error about missing columns in the Lookup component. Fix: preserve all columns expected by downstream components; only add new columns or aggregate existing ones without dropping them.
- Using this with an object that has integration stage DWH -- Symptom: the Dv variant does not fire; the Dwh variant fires instead. Fix: verify the object's integration stage is Data Vault, or use `DwhTargetPipelinePre` for Data Mart objects.
- Confusing Pre with Post -- Symptom: transformation runs after target-side processing instead of before. Fix: use `DvTargetPipelinePre` for pre-target-transformation logic and `DvTargetPipelinePost` for post-target-transformation logic.
Related Extension Points
- DvTargetPipelinePost -- runs after the target-side transformations; use for final data shaping before the destination.
- DvTargetOverride -- for replacing the entire destination component rather than adding transformations before it.
- DvSourcePipelinePre / DvSourcePipelinePost -- the source-side equivalents for adding transformations in the source portion of the dataflow.
- DvPreTargetSql -- ELT equivalent for adding logic before the target INSERT in stored procedures.