
Data Mart Extension Points

The following Extension Points are defined for the Data Mart category.

Data Warehouse Insert Pipeline

Configure insert pipeline logic that will be injected before the target transformation node

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the Batch.
OutputPathName | String | Set CustomOutput.OutputPathName in your last task to connect it to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhInsertPipeline" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations into the insert path of a Data Mart load pipeline. When BimlFlex generates the SSIS package that loads dimension or fact tables, the insert pipeline handles new rows that do not yet exist in the target. This extension point lets you add transformations -- such as data conversions, derived columns, or lookups -- immediately before those new rows reach the destination component. It gives you fine-grained control over the data just before the insert without modifying the generated package by hand.

Where It Fires

Consumed in 95.flx-i-process-insert.biml. The emitted Biml is placed inside the SSIS Data Flow task that handles the insert path for new rows destined for DIM or FCT tables in the Data Mart layer.

When to Use It

  1. You need to convert a column data type (e.g., cast CustomerName from DT_WSTR to DT_STR) before inserting into dim.Customer in the BFX_DM connection.
  2. A derived column must be computed on the fly (e.g., concatenating FirstName and LastName into FullName) before the row lands in the fact or dimension table.
  3. You want to add a Lookup transformation that enriches incoming rows from dv.SAT_Customer_Details_awlt with reference data before they are inserted into dim.Customer.
  4. Use this instead of DwhTargetPipelinePre when your transformation applies only to the insert path and should not affect Type 1 or Type 2 update paths.
  5. A row-level audit column (e.g., InsertBatchId) needs to be stamped via a Derived Column transformation specifically for newly inserted rows in fact.SalesOrder.
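For case 5 above, a minimal Derived Column sketch is shown below. The User::BatchId variable, column name, and task name are illustrative assumptions, not generated metadata:

```xml
<!-- Hypothetical sketch: stamp an audit column on the insert path only -->
<DerivedColumns Name="DER - Stamp InsertBatchId">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<!-- User::BatchId is an assumed SSIS variable; adjust to your package -->
<Column Name="InsertBatchId" DataType="Int64">@[User::BatchId]</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - Stamp InsertBatchId.Output"; #>
</DerivedColumns>
```

Because the transformation sits on the insert path only, the stamped value never touches rows that flow through the Type 1 or Type 2 update paths.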

Prerequisites

  • The target object must be a DIM or FCT table in a Data Mart integration stage (connection BFX_DM).
  • The object must be part of a batch such as LOAD_BFX_DM.
  • SSIS deployment model; this extension point modifies the SSIS Data Flow and does not apply to ADF or SQL-only builds.
  • You must set CustomOutput.OutputPathName to the output of your last transformation so the pipeline chain remains connected.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhInsertPipeline.
  3. Set the target attribute to the specific Data Mart object (e.g., dim.Customer).
  4. Add your Data Flow transformation Biml, connecting it to the inputPath parameter and setting CustomOutput.OutputPathName.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhInsertPipeline" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<DerivedColumns Name="DER - DefaultStatus">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column Name="IsActive" DataType="Boolean">true</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - DefaultStatus.Output"; #>
</DerivedColumns>
  5. Build the project in BimlStudio and verify the Data Flow contains the new transformation before the OLE DB Destination.

Example

Before -- the insert pipeline sends rows directly from the source split to the destination:

[Conditional Split] --> [OLE_DST - dim.Customer]

After -- with the extension point applied, a Derived Column transformation is inserted:

[Conditional Split] --> [DER - DefaultStatus] --> [OLE_DST - dim.Customer]

Common Mistakes

Mistake | Symptom | Fix
Not setting CustomOutput.OutputPathName after your last transformation. | Build error: the downstream destination cannot find its input path. | Always set CustomOutput.OutputPathName to @"YourLastTask.Output".
Using this extension point for SQL-based or ADF pipelines. | The extension point is silently ignored; no transformation appears. | Use a SQL extension point such as DwhPreType1InsertSql for stored procedure-based loads.
Applying to an object that is not in the DWH/DM integration stage. | Extension point does not fire; no error but no effect. | Verify the target object's connection is set to a Data Mart integration stage (e.g., BFX_DM).
Adding a transformation that changes column names without updating downstream mappings. | Runtime error: column not found in destination. | Ensure destination column mappings match the output columns of your transformation.

Related Extension Points
  • DwhTargetPipelinePre -- fires for all rows entering the target (not just inserts); use when the transformation should apply to updates as well.
  • DwhType1Pipeline / DwhType2Pipeline -- insert transformations into the Type 1 or Type 2 specific paths instead of the insert path.
  • DwhSourcePipelinePost -- add transformations immediately after the source rather than immediately before the insert destination.
  • DwhTargetOverride -- replaces the entire destination component; use when you need a completely custom target rather than injecting a step before it.

Data Warehouse Post Process SQL

Configure SQL to be injected at the end of the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment at the very end of a Data Mart stored procedure, after all source queries, delta detection, Type 1 updates, Type 2 inserts, and insert operations have completed. This is the last point in the generated procedure where you can execute SQL logic. Use it to perform post-load cleanup, update audit tables, rebuild indexes, or execute any SQL that should run only after the entire Data Mart load for a specific object has finished successfully.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure at the final position, after all other DM load logic has executed.

When to Use It

  1. After loading dim.Customer from dv.SAT_Customer_Details_awlt, you need to rebuild a nonclustered index to maintain query performance for downstream BI reports.
  2. You want to insert a row into a custom audit table recording the completion timestamp and row count after fact.SalesOrder finishes loading in the LOAD_BFX_DM batch.
  3. A post-load data quality check must run to flag any dim.Product rows with NULL values in required columns, logging violations to a QA table.
  4. Use this instead of DwhPostSqlCallProcess when you need the SQL to execute inside the stored procedure itself rather than as a separate SSIS Execute SQL task after the procedure call.
  5. A materialized view or summary table needs to be refreshed after the dimension load completes.
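For the index rebuild case, the fragment can be as small as a couple of maintenance statements. A minimal sketch (the index and table names are illustrative):

```sql
-- Hypothetical post-load index maintenance; runs after all DM load logic for this object
ALTER INDEX [IX_Customer_CustomerName] ON [dim].[Customer] REBUILD;
UPDATE STATISTICS [dim].[Customer];
```

Keep in mind this runs inside the generated procedure, so long-running maintenance extends the procedure's execution time.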

Prerequisites

  • The target object must be a DIM or FCT table in a Data Mart integration stage (connection BFX_DM).
  • The stored procedure for the object must be generated by BimlFlex (SQL-based or Snowflake SQL load pattern).
  • If ObjectInherit = true, the SQL is appended to all objects in the batch.
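When ObjectInherit = true, avoid hardcoding a single object name in the fragment, since the same SQL is appended for every object in the batch. A sketch, assuming the ObjectsWrapper exposes a Name property (verify against your BimlFlex version) and an illustrative audit table:

```
<# CustomOutput.ObjectInherit = true; #>
-- Runs at the end of every object's stored procedure in the batch
INSERT INTO [BFX_DM].[audit].[LoadLog] (ObjectName, LoadEndTime, Status)
VALUES ('<#=targetTable.Name#>', GETDATE(), 'Complete');
```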

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostProcessSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment that should execute at the end of the procedure.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostProcessSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Log completion to audit table
INSERT INTO [BFX_DM].[audit].[LoadLog] (ObjectName, LoadEndTime, Status)
VALUES ('dim.Customer', GETDATE(), 'Complete');
  5. Build the project and inspect the generated stored procedure to confirm the SQL appears at the end.

Example

Before -- the generated stored procedure ends with the final insert/update operation:

-- ... Type 1 updates and inserts above ...
END

After -- with the extension point applied, your SQL appears just before the procedure ends:

-- ... Type 1 updates and inserts above ...
INSERT INTO [BFX_DM].[audit].[LoadLog] (ObjectName, LoadEndTime, Status)
VALUES ('dim.Customer', GETDATE(), 'Complete');
END

Common Mistakes

Mistake | Symptom | Fix
Wrapping your SQL in BEGIN...END or adding GO statements. | Syntax error in the generated stored procedure. | Provide raw SQL statements only; the procedure body already has its own BEGIN...END block.
Setting ObjectInherit = true unintentionally. | The post-process SQL runs for every object in the batch, not just the intended target. | Set CustomOutput.ObjectInherit = false and use the target attribute to scope it.
Referencing temp tables created in DwhPreProcessSql that have gone out of scope. | "Invalid object name" error at runtime. | Use permanent staging tables or table variables that persist through the entire procedure.

Related Extension Points
  • DwhPreProcessSql -- injects SQL at the start of the same stored procedure; pair them for setup/teardown patterns.
  • DwhPostSqlCallProcess -- adds an SSIS Execute SQL task after the stored procedure call; use when the SQL must run outside the procedure.
  • DwhPostType1UpdateSql / DwhPostType1InsertSql / DwhPostType2InsertSql -- inject SQL after specific sub-operations rather than at the very end.

Data Warehouse Post Source SQL

Configure SQL to be injected after the staging query in the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostSourceSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately after the staging/source query within the Data Mart stored procedure. In the generated load procedure, BimlFlex first queries the source data (typically from Data Vault satellites or hubs) into a working set. This extension point fires right after that source query completes, allowing you to manipulate, filter, or augment the working data before delta detection and Type 1/Type 2 processing begins.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately after the source staging query that populates the working data set.

When to Use It

  1. After the source query pulls data from dv.SAT_Customer_Details_awlt into the working set for dim.Customer, you need to update a computed column (e.g., derive CustomerSegment from revenue thresholds) before delta detection runs.
  2. You want to remove duplicate rows from the working set based on business rules before they are compared against the target dim.Product.
  3. A business transformation requires joining the staged source data with a reference table to enrich rows (e.g., mapping region codes to region names) before the Type 1/Type 2 logic in the LOAD_BFX_DM batch.
  4. Use this instead of DwhPreType1DeltaSql when your logic must run immediately after source extraction rather than specifically before the delta detection step.
  5. Data quality rules need to be enforced (e.g., filtering out rows with invalid keys) before the data enters any downstream processing.
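Case 5 can be as small as a single DELETE against the working set. A sketch; the #src table name and CustomerKey column are assumptions — confirm the actual names in your generated procedure:

```sql
-- Hypothetical DQ filter: remove working-set rows with invalid business keys
DELETE FROM #src
WHERE CustomerKey IS NULL OR CustomerKey = '';
```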

Prerequisites

  • The target object must be a DIM or FCT table in a Data Mart integration stage (connection BFX_DM).
  • The stored procedure must use a source query pattern (SQL-based load).
  • Your SQL must reference the temp table or CTE structure used by the generated source query.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostSourceSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment that should execute after the source query.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostSourceSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Remove duplicates from working set, keeping latest record
DELETE s FROM #src s
INNER JOIN (
SELECT CustomerKey, MAX(FlexRowEffectiveFromDate) AS MaxDate
FROM #src GROUP BY CustomerKey
) keep ON s.CustomerKey = keep.CustomerKey
AND s.FlexRowEffectiveFromDate < keep.MaxDate;
  5. Build the project and inspect the generated stored procedure to confirm the SQL appears after the source query.

Example

Before -- the source query populates the working set and delta detection follows immediately:

-- Source query
SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...
-- Delta detection begins
SELECT ... FROM #src s LEFT JOIN [dim].[Customer] t ...

After -- your deduplication logic runs between source extraction and delta detection:

SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...
-- Extension Point: DwhPostSourceSql
DELETE s FROM #src s
INNER JOIN (
SELECT CustomerKey, MAX(FlexRowEffectiveFromDate) AS MaxDate
FROM #src GROUP BY CustomerKey
) keep ON s.CustomerKey = keep.CustomerKey
AND s.FlexRowEffectiveFromDate < keep.MaxDate;
-- Delta detection begins
SELECT ... FROM #src s LEFT JOIN [dim].[Customer] t ...

Common Mistakes

Mistake | Symptom | Fix
Referencing the wrong temp table name in your SQL. | "Invalid object name" error at runtime. | Inspect the generated stored procedure to identify the exact temp table name (e.g., #src).
Dropping the source temp table. | All downstream steps fail with missing object errors. | Only modify or filter rows; do not drop tables that subsequent steps depend on.
Adding logic that should run before the source query rather than after it. | Data is not available yet when your SQL executes. | Use DwhPreSourceSql instead if your logic must precede the source query.

Related Extension Points
  • DwhPreSourceSql -- injects SQL before the source query; use for creating helper temp tables or setting session options.
  • DwhPostProcessSql -- injects SQL at the very end of the procedure; use for cleanup after all operations complete.
  • DwhPreType1DeltaSql -- injects SQL specifically before the Type 1 delta detection step, which is the next logical stage after the source query.

Data Warehouse Post SQL Call Process

Add After the Data Warehouse SQL Call

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added
precedenceConstraint | String | Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch.
OutputPathName | String | Set CustomOutput.OutputPathName in your last task to connect it to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostSqlCallProcess" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var sourceObject = new ObjectViewModel(sourceTable, sourceTable.Connection.RelatedItem.IntegrationStage, "SRC");
var targetObject = new ObjectViewModel(targetTable, targetTable.Connection.RelatedItem.IntegrationStage, "TGT");
#>
<# CustomOutput.ObjectInherit = true; #>
<ExecuteSQL Name="SQL - DwhPostSqlCallProcess" ConnectionName="<#=targetObject.Connection.Name#>">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Parameters>
<Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
</Parameters>
<DirectInput>EXEC <#=targetObject.SystemType.StoredProcedureName(sourceObject, targetObject.Schema, targetObject.StartDelimiter, targetObject.EndDelimiter)#> ?</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - DwhPostSqlCallProcess.Output"; #>
</ExecuteSQL>

What It Does

Adds an SSIS Execute SQL task (or other control flow task) that runs immediately after the main Data Mart stored procedure call completes. Unlike SQL extension points that inject code inside the stored procedure, this extension point operates at the SSIS package level, allowing you to execute independent SQL commands, call separate procedures, or perform post-load orchestration steps as discrete tasks in the control flow.

Where It Fires

Consumed in 90.flx-i-get-pkg-sql-dm-batch.biml (line ~209) and 90.flx-i-get-pkg-snowflake-sql-batch.biml. The emitted Biml is placed in the SSIS control flow after the Execute SQL task that calls the DM stored procedure. It uses a precedence constraint to chain after the main SQL call.

When to Use It

  1. After the dim.Customer stored procedure finishes in the LOAD_BFX_DM batch, you need to call a separate stored procedure that refreshes a dependent summary table in a different database.
  2. You want to execute a data quality validation procedure against fact.SalesOrder after it loads, and log the results to an external audit system via a different connection.
  3. A downstream notification (e.g., calling a Service Broker queue or sending a database mail) should fire after the DM load procedure completes for dim.Product.
  4. Use this instead of DwhPostProcessSql when the post-load logic must run as a separate SSIS task (e.g., against a different connection) rather than inside the generated stored procedure.
  5. An index rebuild or statistics update needs to run as a separate step with its own timeout and error handling, independent of the main load procedure.
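The index rebuild in case 5 could run as its own Execute SQL task with an independent timeout. A sketch following the template's wiring pattern; the task name, connection name, and table are illustrative assumptions:

```xml
<!-- Hypothetical sketch: index rebuild as a discrete control flow task -->
<ExecuteSQL Name="SQL - Rebuild SalesOrder Indexes" ConnectionName="BFX_DM">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>ALTER INDEX ALL ON [fact].[SalesOrder] REBUILD;</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Rebuild SalesOrder Indexes.Output"; #>
</ExecuteSQL>
```

Running the rebuild as a separate task keeps its duration and any failure isolated from the main load procedure.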

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point adds control flow tasks and does not apply to ADF-only builds.
  • You must set CustomOutput.OutputPathName so that subsequent tasks in the control flow can chain correctly.
  • The precedenceConstraint parameter must be wired to connect your task to the preceding step.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostSqlCallProcess.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Execute SQL task or other control flow task, using the precedenceConstraint to chain it after the main SQL call.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostSqlCallProcess" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var targetObject = new ObjectViewModel(targetTable, targetTable.Connection.RelatedItem.IntegrationStage, "TGT"); #>
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Refresh Customer Summary" ConnectionName="<#=targetObject.Connection.Name#>">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>EXEC [dm].[usp_RefreshCustomerSummary]</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Refresh Customer Summary.Output"; #>
</ExecuteSQL>
  5. Build the project and verify the new task appears in the SSIS control flow after the main stored procedure call.

Example

Before -- the control flow runs the DM stored procedure and proceeds to the next object:

[SQL - Load dim.Customer] --> [Next Task]

After -- with the extension point, a post-call task is inserted:

[SQL - Load dim.Customer] --> [SQL - Refresh Customer Summary] --> [Next Task]

Common Mistakes

Mistake | Symptom | Fix
Not setting CustomOutput.OutputPathName. | The pipeline chain breaks; subsequent tasks have no input to connect to. | Always set CustomOutput.OutputPathName to your task's output (e.g., @"SQL - MyTask.Output").
Ignoring the precedenceConstraint parameter. | Your task runs in parallel with the main SQL call instead of after it. | Always wire the <PrecedenceConstraints> using the provided precedenceConstraint value.
Using this for logic that belongs inside the stored procedure. | Unnecessary round-trips between SSIS and the database, adding latency. | Use DwhPostProcessSql if the SQL can run inside the same procedure.

Related Extension Points
  • DwhPreSqlCallProcess -- adds a task before the stored procedure call; pair with this for pre/post orchestration.
  • DwhSqlCallOverride -- replaces the entire SQL call task; use when you need to completely change how the procedure is invoked.
  • DwhPostProcessSql -- injects SQL inside the stored procedure at the end; use when separate SSIS task overhead is not needed.

Data Warehouse Post Type1 Delta SQL

Configure SQL to be injected after the Type1 delta query in the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType1DeltaSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately after the Type 1 SCD delta detection query in the Data Mart stored procedure. During a Type 1 load, BimlFlex identifies which rows have changed by comparing source and target columns. This extension point fires right after that comparison completes, giving you an opportunity to manipulate the delta result set before the actual Type 1 update and insert statements execute. Use it to filter, augment, or log the set of changed rows.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately after the delta detection query that identifies changed rows for Type 1 processing.

When to Use It

  1. After delta detection identifies changed rows for dim.Customer, you want to log the count of changed records to an audit table before the updates are applied.
  2. You need to exclude certain rows from the Type 1 delta set based on business rules (e.g., skip updates for customers flagged as "frozen" in a control table) before updates hit dim.Customer in BFX_DM.
  3. A data quality check must validate that the delta set does not contain unexpected NULL values in key columns before proceeding with the update of dim.Address.
  4. Use this instead of DwhPostSourceSql when your logic should apply specifically after delta detection rather than after the initial source extraction.
  5. You want to capture a before-image of the rows about to be updated by copying them to a history table before the Type 1 overwrite occurs.
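The before-image capture in case 5 might look like the following. The history table is an illustrative assumption, and the #type1_delta name should be checked against the actual delta table name in your generated procedure:

```sql
-- Hypothetical before-image: preserve current target values for rows about to be overwritten
INSERT INTO [BFX_DM].[hist].[Customer_BeforeImage] (CustomerKey, CustomerName, CapturedAt)
SELECT t.CustomerKey, t.CustomerName, GETDATE()
FROM [dim].[Customer] t
INNER JOIN #type1_delta d ON t.CustomerKey = d.CustomerKey;
```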

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The object must have columns configured with ChangeType = Type1 in BimlFlex metadata.
  • SQL-based load pattern (stored procedure generation).

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostType1DeltaSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment that manipulates or inspects the delta result set.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType1DeltaSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Log Type 1 delta count before applying updates
INSERT INTO [BFX_DM].[audit].[DeltaLog] (ObjectName, DeltaType, RowCount, LogTime)
SELECT 'dim.Customer', 'Type1', COUNT(*), GETDATE()
FROM #type1_delta;
  5. Build the project and inspect the generated stored procedure to confirm the SQL appears after the delta query.

Example

Before -- delta detection flows directly into the Type 1 update:

-- Delta detection
SELECT ... INTO #type1_delta FROM #src s JOIN [dim].[Customer] t ...
-- Type 1 Update
UPDATE t SET ... FROM [dim].[Customer] t JOIN #type1_delta d ...

After -- your logging SQL is injected between delta detection and the update:

SELECT ... INTO #type1_delta FROM #src s JOIN [dim].[Customer] t ...
-- Extension Point: DwhPostType1DeltaSql
INSERT INTO [BFX_DM].[audit].[DeltaLog] (ObjectName, DeltaType, RowCount, LogTime)
SELECT 'dim.Customer', 'Type1', COUNT(*), GETDATE() FROM #type1_delta;
-- Type 1 Update
UPDATE t SET ... FROM [dim].[Customer] t JOIN #type1_delta d ...

Common Mistakes

Mistake | Symptom | Fix
Modifying the delta temp table structure (adding/dropping columns). | Downstream Type 1 update fails with a column mismatch. | Only filter rows or add logging; do not alter the schema of the delta temp table.
Assuming this fires for objects without Type 1 columns. | Extension point is silently skipped. | Verify the target object has columns with ChangeType = Type1 in BimlFlex metadata.
Confusing this with DwhPreType1DeltaSql (which fires before delta detection). | SQL runs at the wrong stage. | Use Post to act on the delta results; use Pre to prepare data before the comparison.

Related Extension Points
  • DwhPreType1DeltaSql -- fires before the delta detection query; use for pre-comparison data preparation.
  • DwhPostType1UpdateSql -- fires after the Type 1 update statement; use for post-update actions.
  • DwhPostType1InsertSql -- fires after the Type 1 insert statement; use for post-insert actions.
  • DwhPostSourceSql -- fires earlier, immediately after the source query; use for broader source data manipulation.

Data Warehouse Post Type1 Insert SQL

Configure SQL to be injected after the Type1 insert query in the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType1InsertSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately after the Type 1 SCD insert statement in the Data Mart stored procedure. During a Type 1 load, after delta detection identifies new rows that do not exist in the target, BimlFlex inserts them. This extension point fires right after that insert completes, allowing you to perform post-insert actions such as logging row counts, updating surrogate key lookup tables, or triggering dependent processes for the newly added dimension or fact rows.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately after the INSERT statement that adds new rows to the target as part of Type 1 processing.

When to Use It

  1. After new rows are inserted into dim.Customer, you need to populate a surrogate key mapping table that other fact tables reference during their loads in the LOAD_BFX_DM batch.
  2. You want to capture the @@ROWCOUNT of inserted rows and write it to an audit table for operational monitoring of dim.Product loads.
  3. A downstream fact load depends on newly inserted dimension keys; you need to refresh a lookup cache or temp table after dim.Address inserts complete.
  4. Use this instead of DwhPostProcessSql when your logic must execute specifically after the Type 1 insert and before any subsequent Type 1 update or Type 2 operations.
  5. A notification or flag must be set when new dimension members are detected (e.g., flagging new customers for review in a control table).

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The stored procedure must include a Type 1 insert step (columns configured with ChangeType = Type1).
  • SQL-based load pattern.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostType1InsertSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType1InsertSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Log newly inserted dimension members
INSERT INTO [BFX_DM].[audit].[InsertLog] (ObjectName, RowsInserted, LogTime)
VALUES ('dim.Customer', @@ROWCOUNT, GETDATE());
  5. Build and verify the SQL appears in the generated stored procedure after the Type 1 insert.

Example

Before -- the Type 1 insert runs and the procedure moves on:

-- Type 1 Insert
INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ...)
SELECT ... FROM #new_rows;
-- Next operation (Type 1 Update or end)

After -- your logging SQL is injected immediately after the insert:

INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ...)
SELECT ... FROM #new_rows;
-- Extension Point: DwhPostType1InsertSql
INSERT INTO [BFX_DM].[audit].[InsertLog] (ObjectName, RowsInserted, LogTime)
VALUES ('dim.Customer', @@ROWCOUNT, GETDATE());
-- Next operation

Common Mistakes

  • Mistake: Using @@ROWCOUNT after additional SQL statements in your fragment. Symptom: @@ROWCOUNT returns the count of your own statements, not the insert. Fix: Capture @@ROWCOUNT in the very first statement of your extension point SQL.
  • Mistake: Confusing this with DwhPreType1InsertSql (which fires before the insert). Symptom: SQL runs before rows exist in the target, so lookups against newly inserted rows return nothing. Fix: Use Post for actions that depend on inserted data; use Pre for preparation.
  • Mistake: Applying this to an object with no new rows to insert. Symptom: The extension point SQL still fires but @@ROWCOUNT is 0, which may cause misleading audit entries. Fix: Add a conditional check (e.g., IF @@ROWCOUNT > 0) in your SQL.
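
The first and third fixes combine into one safe pattern. A minimal sketch, reusing the illustrative audit table from the example above:

-- Capture the insert's row count before any other statement changes it
DECLARE @inserted INT = @@ROWCOUNT;
IF @inserted > 0
INSERT INTO [BFX_DM].[audit].[InsertLog] (ObjectName, RowsInserted, LogTime)
VALUES ('dim.Customer', @inserted, GETDATE());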

Related Extension Points

  • DwhPreType1InsertSql -- fires before the Type 1 insert; use for pre-insert validation or temp table setup.
  • DwhPostType1UpdateSql -- fires after the Type 1 update; use when you need to act on updated rows instead of inserted rows.
  • DwhPostType1DeltaSql -- fires after delta detection; use to manipulate the delta set before any inserts or updates.
  • DwhPostProcessSql -- fires at the very end of the procedure; use for final cleanup after all operations.

Data Warehouse Post Type1 Update SQL

Configure SQL to be injected after the Type1 update query in the Data Warehouse stored procedure

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType1UpdateSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately after the Type 1 SCD update statement in the Data Mart stored procedure. During a Type 1 load, BimlFlex overwrites existing attribute values in the target with the latest values from the source (no history is preserved). This extension point fires right after that UPDATE completes, allowing you to log the number of rows overwritten, trigger change notifications, or perform corrective actions on the updated rows.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately after the UPDATE statement that overwrites existing Type 1 attribute values in the target table.

When to Use It

  1. After Type 1 attributes are updated in dim.Customer, you need to log which columns changed and how many rows were affected to a change-tracking audit table in BFX_DM.
  2. A downstream reporting cache must be invalidated whenever dim.Product attributes are overwritten, so you set a flag in a control table after the update completes.
  3. You want to capture @@ROWCOUNT from the Type 1 update and compare it against expected thresholds; if too many rows changed, insert a warning into an alert table for the LOAD_BFX_DM batch.
  4. Use this instead of DwhPostType1DeltaSql when you need to act after the update is physically applied rather than after the delta is detected.
  5. Updated dimension rows need a secondary transformation (e.g., recomputing a hash column or updating a full-text search index) that must run immediately after the overwrite.
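
Scenario 5 can be sketched as follows. The RowHash column and the attribute list are hypothetical, and the #type1_delta name must be verified against your generated procedure:

-- Hypothetical: recompute a hash column on the rows just overwritten
UPDATE t
SET t.RowHash = HASHBYTES('SHA2_256', CONCAT(t.CustomerName, '|', t.CustomerEmail))
FROM [dim].[Customer] t
JOIN #type1_delta s ON s.CustomerKey = t.CustomerKey;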

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The stored procedure must include a Type 1 update step (columns configured with ChangeType = Type1).
  • SQL-based load pattern.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostType1UpdateSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType1UpdateSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Alert if an unusually large number of rows were updated
IF @@ROWCOUNT > 10000
INSERT INTO [BFX_DM].[audit].[Alerts] (ObjectName, AlertType, Message, LogTime)
VALUES ('dim.Customer', 'HighUpdateCount', 'Type 1 update affected more than 10000 rows', GETDATE());
  5. Build and inspect the generated stored procedure to confirm placement after the UPDATE statement.

Example

Before -- the Type 1 update runs and the procedure continues:

-- Type 1 Update
UPDATE t SET t.CustomerName = s.CustomerName, ...
FROM [dim].[Customer] t JOIN #type1_delta s ON ...
-- Next operation

After -- your alert SQL is injected after the update:

UPDATE t SET t.CustomerName = s.CustomerName, ...
FROM [dim].[Customer] t JOIN #type1_delta s ON ...
-- Extension Point: DwhPostType1UpdateSql
IF @@ROWCOUNT > 10000
INSERT INTO [BFX_DM].[audit].[Alerts] (ObjectName, AlertType, Message, LogTime)
VALUES ('dim.Customer', 'HighUpdateCount', 'Type 1 update affected more than 10000 rows', GETDATE());
-- Next operation

Common Mistakes

  • Mistake: Running additional DML before capturing @@ROWCOUNT. Symptom: @@ROWCOUNT reflects your DML, not the Type 1 update. Fix: Capture @@ROWCOUNT in a variable as the first statement: DECLARE @cnt INT = @@ROWCOUNT;.
  • Mistake: Confusing Type 1 update (overwrite) with Type 2 insert (versioning). Symptom: Logic intended for Type 2 versioned rows executes against overwritten rows. Fix: Use DwhPostType2InsertSql for Type 2 scenarios.
  • Mistake: Setting ObjectInherit = true when the alert threshold is only meaningful for one dimension. Symptom: All objects in the batch get the same threshold, generating false alerts. Fix: Set ObjectInherit = false and use the target attribute.

Related Extension Points

  • DwhPreType1UpdateSql -- fires before the Type 1 update; use for pre-update validation or snapshots.
  • DwhPostType1InsertSql -- fires after the Type 1 insert; use for actions specific to new rows.
  • DwhPostType1DeltaSql -- fires after delta detection; use to manipulate the delta set before updates and inserts.
  • DwhPostType2InsertSql -- fires after the Type 2 insert; use for Type 2 versioning scenarios.

Data Warehouse Post Type2 Insert SQL

Configure SQL to be injected after the Type2 insert query in the Data Warehouse stored procedure

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType2InsertSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately after the Type 2 SCD insert statement in the Data Mart stored procedure. During a Type 2 load, BimlFlex creates new versioned rows with effective date ranges when attribute values change, preserving the full history of a dimension. This extension point fires right after those new version rows are inserted, allowing you to log version counts, update related lookup structures, or trigger processes that depend on the newly created historical records.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately after the INSERT statement that creates new versioned rows as part of Type 2 processing.

When to Use It

  1. After new version rows are inserted into dim.Customer (Type 2), you need to update the CurrentFlag or IsCurrent column on the previous version rows to reflect that they are no longer the active version.
  2. You want to log how many new Type 2 versions were created per load cycle for dim.Product to monitor dimension growth in the LOAD_BFX_DM batch.
  3. A downstream process needs to be notified when new historical versions appear (e.g., triggering a slowly changing dimension audit report for dim.Address).
  4. Use this instead of DwhPostProcessSql when your logic must execute specifically after Type 2 inserts and before any subsequent steps in the procedure.
  5. Surrogate key assignment or sequence management needs to be validated after the new versioned rows are created.
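
Scenario 1 can be sketched as follows. Most generated Type 2 patterns already maintain the current flag, so treat this as a fallback sketch only; the CustomerBusinessKey column is hypothetical, and FlexRowIsCurrent, FlexRowEffectiveFromDate, and #type2_changes must be verified against your generated procedure:

-- Hypothetical: clear the current flag on superseded versions
UPDATE t
SET t.FlexRowIsCurrent = 0
FROM [dim].[Customer] t
JOIN #type2_changes s ON s.CustomerBusinessKey = t.CustomerBusinessKey
WHERE t.FlexRowIsCurrent = 1
AND t.FlexRowEffectiveFromDate < s.FlexRowEffectiveFromDate;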

Prerequisites

  • The target object must be a DIM table with Type 2 SCD columns in a Data Mart integration stage.
  • The object must have columns configured with ChangeType = Type2 in BimlFlex metadata.
  • SQL-based load pattern (stored procedure generation).

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPostType2InsertSql.
  3. Set the target attribute to the specific Data Mart dimension.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPostType2InsertSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Log Type 2 version creation
INSERT INTO [BFX_DM].[audit].[VersionLog] (ObjectName, NewVersions, LogTime)
VALUES ('dim.Customer', @@ROWCOUNT, GETDATE());
  5. Build and verify the SQL appears in the generated stored procedure after the Type 2 insert.

Example

Before -- the Type 2 insert creates new version rows and the procedure continues:

-- Type 2 Insert (new versions)
INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ..., FlexRowEffectiveFromDate, FlexRowIsCurrent)
SELECT ... FROM #type2_changes;
-- Next operation

After -- your version logging SQL is injected after the insert:

INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ..., FlexRowEffectiveFromDate, FlexRowIsCurrent)
SELECT ... FROM #type2_changes;
-- Extension Point: DwhPostType2InsertSql
INSERT INTO [BFX_DM].[audit].[VersionLog] (ObjectName, NewVersions, LogTime)
VALUES ('dim.Customer', @@ROWCOUNT, GETDATE());
-- Next operation

Common Mistakes

  • Mistake: Attempting to use this extension point on a table with no Type 2 columns. Symptom: The extension point never fires; no SQL is injected. Fix: Verify the dimension has columns with ChangeType = Type2 in metadata.
  • Mistake: Running DML before capturing @@ROWCOUNT. Symptom: The row count reflects your statement, not the Type 2 insert. Fix: Capture @@ROWCOUNT immediately: DECLARE @newVersions INT = @@ROWCOUNT;.
  • Mistake: Confusing this with DwhPostType1InsertSql for Type 1 inserts. Symptom: Logic intended for versioned rows runs against non-versioned inserts. Fix: Use this extension point only for Type 2 scenarios; use the Type 1 variant for overwrites.

Related Extension Points

  • DwhPreType2InsertSql -- fires before the Type 2 insert; use for pre-insert validation or preparing staging data.
  • DwhPostType1InsertSql -- fires after the Type 1 insert; use for non-versioned insert scenarios.
  • DwhType2Pipeline -- modifies the SSIS Data Flow for Type 2 processing; use when working with pipeline transformations rather than SQL.
  • DwhPostProcessSql -- fires at the very end of the procedure; use for final post-load actions.

Data Warehouse Pre Process SQL

Configure SQL to be injected at the start of the Data Warehouse stored procedure

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreProcessSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment at the very beginning of a Data Mart stored procedure, before any source queries, delta detection, or load operations execute. This is the first point in the generated procedure where you can run SQL. Use it to set session-level options, create temporary tables, declare variables, truncate staging structures, or perform any initialization that subsequent steps in the procedure depend on.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure at the very first position, before all other DM load logic.

When to Use It

  1. You need to create a temp table that will be used by subsequent extension points (e.g., DwhPostSourceSql or DwhPostType1DeltaSql) during the load of dim.Customer from dv.SAT_Customer_Details_awlt.
  2. A session-level SET option (e.g., SET NOCOUNT ON or SET ANSI_NULLS ON) must be configured before the stored procedure logic runs for objects in the LOAD_BFX_DM batch.
  3. You want to truncate a work table or clear a staging area before the load of fact.SalesOrder begins, ensuring a clean state.
  4. Use this instead of DwhPreSqlCallProcess when the initialization logic must run inside the stored procedure rather than as a separate SSIS task before the procedure is called.
  5. A variable needs to be declared and populated (e.g., the current batch run timestamp) for use by SQL injected via other extension points later in the procedure.

Prerequisites

  • The target object must be a DIM or FCT table in a Data Mart integration stage (connection BFX_DM).
  • The stored procedure must be generated by BimlFlex (SQL-based or Snowflake SQL load pattern).
  • If ObjectInherit = true, the SQL is prepended to all objects in the batch.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreProcessSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment that should execute at the start of the procedure.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreProcessSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Create a temp table for audit logging used by later extension points
CREATE TABLE #AuditLog (
StepName NVARCHAR(100),
[RowCount] INT, -- bracketed: ROWCOUNT is a T-SQL reserved keyword
LogTime DATETIME DEFAULT GETDATE()
);
  5. Build the project and inspect the generated stored procedure to confirm the SQL appears at the beginning.

Example

Before -- the stored procedure starts directly with the source query:

CREATE PROCEDURE [dm].[usp_Load_dim_Customer] ...
AS
BEGIN
-- Source query
SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...

After -- your initialization SQL appears first:

CREATE PROCEDURE [dm].[usp_Load_dim_Customer] ...
AS
BEGIN
-- Extension Point: DwhPreProcessSql
CREATE TABLE #AuditLog (
StepName NVARCHAR(100),
[RowCount] INT,
LogTime DATETIME DEFAULT GETDATE()
);
-- Source query
SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...

Common Mistakes

  • Mistake: Adding GO statements or batch separators. Symptom: Syntax error, because GO is not valid inside a stored procedure. Fix: Use only T-SQL statements that are valid within a procedure body.
  • Mistake: Creating a temp table with a name that conflicts with BimlFlex-generated temp tables. Symptom: "There is already an object named '#src'" error. Fix: Inspect the generated procedure first and choose unique temp table names.
  • Mistake: Setting ObjectInherit = true when the initialization is only relevant to one object. Symptom: All objects in the batch receive the same setup SQL, which may conflict or cause errors. Fix: Use ObjectInherit = false and the target attribute.

Related Extension Points

  • DwhPostProcessSql -- injects SQL at the end of the same procedure; pair them for setup/teardown patterns.
  • DwhPreSqlCallProcess -- adds an SSIS task before the procedure call; use when initialization must occur outside the procedure.
  • DwhPreSourceSql -- fires before the source query but after DwhPreProcessSql; use for source-specific preparation.

Data Warehouse Pre Source SQL

Configure SQL to be injected before the staging query in the Data Warehouse stored procedure

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreSourceSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately before the staging/source query within the Data Mart stored procedure. BimlFlex generates a source query that extracts data from the Data Vault (or other source layer) into a working set. This extension point fires right before that query runs, allowing you to prepare the environment -- create helper tables, populate reference data, or set variables -- that the source query or subsequent steps will use.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately before the source staging query.

When to Use It

  1. You need to populate a reference temp table with lookup values (e.g., region code mappings) that the source query for dim.Customer will join against in BFX_DM.
  2. A session variable or temp table must be set before the source query runs to control filtering logic (e.g., setting a snapshot date for fact.SalesOrder loads in the LOAD_BFX_DM batch).
  3. You want to log the start time of the source extraction phase for dim.Address to an audit table for performance monitoring.
  4. Use this instead of DwhPreProcessSql when the logic is specifically related to source query preparation rather than general procedure initialization.
  5. A control table must be read to determine whether the source query should use an incremental or full load pattern for dim.Product.
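
Scenario 5 can be sketched as follows. The control table, its columns, and the #LoadWindow temp table are all hypothetical; note that the generated source query will not consume this automatically -- it is useful where SQL injected via later extension points (e.g., DwhPostSourceSql) filters the extracted set:

-- Hypothetical: derive a load window from a control table
DECLARE @LoadType VARCHAR(10);
SELECT @LoadType = LoadType
FROM [BFX_DM].[ctl].[LoadControl]
WHERE ObjectName = 'dim.Product';

SELECT CASE WHEN @LoadType = 'FULL'
THEN CAST('1900-01-01' AS DATETIME2)
ELSE DATEADD(DAY, -1, SYSDATETIME()) END AS FromDate
INTO #LoadWindow;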

Prerequisites

  • The target object must be a DIM or FCT table in a Data Mart integration stage (connection BFX_DM).
  • The stored procedure must use a source query pattern (SQL-based load).
  • Your SQL must not interfere with temp tables that BimlFlex creates for the source query.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreSourceSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreSourceSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Create a lookup table for region codes before the source query
SELECT RegionCode, RegionName
INTO #RegionLookup
FROM [BFX_DM].[ref].[Region]
WHERE IsActive = 1;
  5. Build and inspect the generated stored procedure to confirm the SQL appears before the source query.

Example

Before -- the source query runs directly:

-- Source query
SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...

After -- your lookup table is created before the source query:

-- Extension Point: DwhPreSourceSql
SELECT RegionCode, RegionName
INTO #RegionLookup
FROM [BFX_DM].[ref].[Region]
WHERE IsActive = 1;
-- Source query
SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...

Common Mistakes

  • Mistake: Creating a temp table with the same name BimlFlex uses for the source query. Symptom: "There is already an object named '#src'" error. Fix: Use unique names for your temp tables (e.g., #RegionLookup).
  • Mistake: Adding logic that depends on source data that has not been extracted yet. Symptom: References to #src fail because it does not exist yet. Fix: Use DwhPostSourceSql if your logic requires the source data.
  • Mistake: Confusing this with DwhPreProcessSql, which fires even earlier. Symptom: Logic that should be the very first statement ends up after other initialization. Fix: Use DwhPreProcessSql for the earliest possible injection point.

Related Extension Points

  • DwhPostSourceSql -- fires after the source query; use for post-extraction manipulation of the source data.
  • DwhPreProcessSql -- fires at the very start of the procedure; use for general initialization before any logic runs.
  • DwhSourceOverride -- replaces the SSIS Data Flow source component entirely; use for pipeline-level source changes rather than SQL-level.

Data Warehouse Pre SQL Call Process

Add Before the Data Warehouse SQL Call

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added
precedenceConstraintStringContains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathNameStringYou must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreSqlCallProcess" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var sourceObject = new ObjectViewModel(sourceTable, sourceTable.Connection.RelatedItem.IntegrationStage, "SRC");
var targetObject = new ObjectViewModel(targetTable, targetTable.Connection.RelatedItem.IntegrationStage, "TGT");
#>
<# CustomOutput.ObjectInherit = true; #>
<ExecuteSQL Name="SQL - DwhPreSqlCallProcess" ConnectionName="<#=targetObject.Connection.Name#>">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Parameters>
<Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
</Parameters>
<DirectInput>EXEC <#=targetObject.SystemType.StoredProcedureName(sourceObject, targetObject.Schema, targetObject.StartDelimiter, targetObject.EndDelimiter)#> ?</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - DwhPreSqlCallProcess.Output"; #>
</ExecuteSQL>

What It Does

Adds an SSIS Execute SQL task (or other control flow task) that runs immediately before the main Data Mart stored procedure call. This extension point operates at the SSIS package level, allowing you to execute preparatory SQL commands, call setup procedures, or perform pre-load orchestration steps as discrete tasks in the control flow before the DM load procedure begins.

Where It Fires

Consumed in 90.flx-i-get-pkg-sql-dm-batch.biml (line ~209) and 90.flx-i-get-pkg-snowflake-sql-batch.biml. The emitted Biml is placed in the SSIS control flow before the Execute SQL task that calls the DM stored procedure. It uses a precedence constraint to chain before the main SQL call.

When to Use It

  1. Before the dim.Customer stored procedure runs in the LOAD_BFX_DM batch, you need to call a staging cleanup procedure that truncates intermediate tables on the BFX_DM connection.
  2. You want to execute a pre-validation check against dv.SAT_Customer_Details_awlt in the BFX_DV connection to ensure source data is available before the DM load starts.
  3. A lock or semaphore must be acquired in a control table before fact.SalesOrder begins loading to prevent concurrent execution conflicts.
  4. Use this instead of DwhPreProcessSql when the pre-load logic must run as a separate SSIS task (e.g., against a different connection or with its own timeout) rather than inside the generated stored procedure.
  5. Environment setup (e.g., enabling change tracking or setting database options) needs to happen outside the stored procedure context.
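
Scenario 3 can be sketched with sp_getapplock inside the task's DirectInput. The resource name and timeout are illustrative:

DECLARE @result INT;
EXEC @result = sp_getapplock
@Resource = 'fact.SalesOrder.load',
@LockMode = 'Exclusive',
@LockOwner = 'Session',
@LockTimeout = 60000;
IF @result < 0
RAISERROR('Could not acquire load lock for fact.SalesOrder', 16, 1);

A session-owned lock persists until sp_releaseapplock is called or the connection closes, so pair this with a release task (e.g., via DwhPostSqlCallProcess).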

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point adds control flow tasks and does not apply to ADF-only builds.
  • You must set CustomOutput.OutputPathName so the main SQL call task can chain after your task.
  • The precedenceConstraint parameter must be wired to connect your task to the preceding step.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreSqlCallProcess.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Execute SQL task, using the precedenceConstraint to chain it.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreSqlCallProcess" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var targetObject = new ObjectViewModel(targetTable, targetTable.Connection.RelatedItem.IntegrationStage, "TGT"); #>
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Pre-Validate Source Data" ConnectionName="<#=targetObject.Connection.Name#>">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>EXEC [dm].[usp_ValidateSourceData] 'dim.Customer'</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Pre-Validate Source Data.Output"; #>
</ExecuteSQL>
  5. Build the project and verify the new task appears in the SSIS control flow before the main stored procedure call.

Example

Before -- the control flow runs the DM stored procedure directly:

[Previous Task] --> [SQL - Load dim.Customer]

After -- with the extension point, a pre-call validation task is inserted:

[Previous Task] --> [SQL - Pre-Validate Source Data] --> [SQL - Load dim.Customer]

Common Mistakes

  • Mistake: Not setting CustomOutput.OutputPathName. Symptom: The main SQL call task has no input to connect to, and the pipeline chain breaks. Fix: Always set CustomOutput.OutputPathName to your task's output.
  • Mistake: Ignoring the precedenceConstraint parameter. Symptom: Your task runs in parallel with the preceding task instead of after it. Fix: Always wire the <PrecedenceConstraints> using the provided precedenceConstraint value.
  • Mistake: Using this for logic that could run inside the stored procedure. Symptom: Unnecessary SSIS task overhead and additional database round-trips. Fix: Use DwhPreProcessSql if the SQL can run inside the same procedure.

Related Extension Points

  • DwhPostSqlCallProcess -- adds a task after the stored procedure call; pair with this for pre/post orchestration.
  • DwhSqlCallOverride -- replaces the entire SQL call task; use when the standard procedure invocation must be completely changed.
  • DwhPreProcessSql -- injects SQL inside the stored procedure at the start; use when separate SSIS task overhead is not needed.

Data Warehouse Pre Type1 Delta SQL

Configure SQL to be injected before the Type1 delta query in the Data Warehouse stored procedure

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType1DeltaSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately before the Type 1 SCD delta detection query in the Data Mart stored procedure. Delta detection compares source rows against the current target to identify which rows have changed. This extension point fires right before that comparison, giving you an opportunity to prepare the data, create indexes on temp tables for performance, or apply filtering rules that influence which rows enter delta detection.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately before the delta detection query that compares source and target rows for Type 1 processing.

When to Use It

  1. You need to add a nonclustered index on the source temp table's key columns before delta detection runs for dim.Customer, improving join performance when the working set is large.
  2. Business rules require filtering certain rows out of the source data before they enter the Type 1 comparison (e.g., excluding test accounts from dv.SAT_Customer_Details_awlt before loading dim.Customer in BFX_DM).
  3. A calculated column must be added to the source temp table that the delta detection query will use for comparison but that does not come from the raw source data.
  4. Use this instead of DwhPostSourceSql when your manipulation is specifically tied to optimizing or preparing for the Type 1 delta detection step rather than general source post-processing.
  5. You want to log the row count entering delta detection for dim.Product in the LOAD_BFX_DM batch for performance tracking.
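For case 5, the row-count logging could be sketched as below. The log.LoadMetrics table is a hypothetical name for illustration; #src is the source temp table as named in the examples in this section, but verify the name your build actually generates.

```sql
-- Log the row count entering Type 1 delta detection.
-- [log].[LoadMetrics] is a hypothetical audit table; adjust to your environment.
INSERT INTO [log].[LoadMetrics] (ObjectName, MetricName, MetricValue, CapturedAt)
SELECT 'dim.Product', 'PreDeltaRowCount', COUNT(*), GETDATE()
FROM #src;
```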

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The object must have columns configured with ChangeType = Type1 in BimlFlex metadata.
  • SQL-based load pattern.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreType1DeltaSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType1DeltaSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Add index to improve delta detection performance
CREATE NONCLUSTERED INDEX IX_src_CustomerKey ON #src (CustomerKey);
  5. Build and inspect the generated stored procedure to confirm the SQL appears before the delta detection query.

Example

Before -- delta detection runs directly against the unindexed source temp table:

SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...
-- Delta detection
SELECT ... INTO #type1_delta FROM #src s JOIN [dim].[Customer] t ...

After -- an index is created before the comparison for better performance:

SELECT ... INTO #src FROM [dv].[SAT_Customer_Details_awlt] ...
-- Extension Point: DwhPreType1DeltaSql
CREATE NONCLUSTERED INDEX IX_src_CustomerKey ON #src (CustomerKey);
-- Delta detection
SELECT ... INTO #type1_delta FROM #src s JOIN [dim].[Customer] t ...

Common Mistakes

Mistake | Symptom | Fix
Dropping or truncating the source temp table before delta detection. | Delta detection finds zero rows; no updates or inserts occur. | Only add indexes, filter rows, or augment columns; do not remove the source data.
Assuming this fires for objects without Type 1 columns. | Extension point is silently skipped. | Verify the target has columns with ChangeType = Type1.
Creating an index with a name that conflicts with an existing index. | "An index with name already exists" error. | Use unique, descriptive index names.

Related Extension Points

  • DwhPostType1DeltaSql -- fires after the delta detection query; use to manipulate the delta result set.
  • DwhPreType1InsertSql -- fires before the Type 1 insert; use to prepare for the insert operation specifically.
  • DwhPostSourceSql -- fires earlier, right after the source query; use for general source data manipulation.
  • DwhPreType1UpdateSql -- fires before the Type 1 update; use to prepare for updates specifically.

Data Warehouse Pre Type1 Insert SQL

Configure SQL to be injected before the Type1 insert query in the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object of the load
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object of the load

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType1InsertSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately before the Type 1 SCD insert statement in the Data Mart stored procedure. After delta detection identifies rows that are new (not yet present in the target), BimlFlex inserts them. This extension point fires right before that insert, allowing you to validate the new rows, apply default values, generate surrogate keys, or perform any last-moment preparation on the data about to be inserted into the dimension or fact table.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately before the INSERT statement that adds new rows to the target as part of Type 1 processing.

When to Use It

  1. Before new rows are inserted into dim.Customer, you need to assign a default value to a CustomerSegment column that is not sourced from dv.SAT_Customer_Details_awlt but is required by the target in BFX_DM.
  2. You want to validate that no duplicate keys exist in the set of new rows before they are inserted into dim.Product in the LOAD_BFX_DM batch, raising an error if duplicates are found.
  3. A surrogate key sequence needs to be pre-allocated or a key mapping table needs to be consulted before inserting new rows into dim.Address.
  4. Use this instead of DwhPreType1DeltaSql when your logic is specifically about preparing for the insert operation rather than preparing for delta detection.
  5. A pre-insert snapshot of the target table's row count needs to be captured for reconciliation after the insert completes.
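Case 1 above could be sketched as follows; #new_rows is the new-row set as named in this section's examples, and the CustomerSegment column and its default value are illustrative assumptions:

```sql
-- Default a column that is not sourced from the Data Vault.
-- CustomerSegment and 'Unclassified' are illustrative names, not generated objects.
UPDATE #new_rows
SET CustomerSegment = 'Unclassified'
WHERE CustomerSegment IS NULL;
```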

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The stored procedure must include a Type 1 insert step.
  • SQL-based load pattern.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreType1InsertSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType1InsertSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Validate no duplicate keys before insert
IF EXISTS (
    SELECT CustomerKey FROM #new_rows
    GROUP BY CustomerKey HAVING COUNT(*) > 1
)
    RAISERROR('Duplicate CustomerKey found in new rows for dim.Customer', 16, 1);
  5. Build and verify the SQL appears before the Type 1 insert in the generated stored procedure.

Example

Before -- the insert runs directly after delta detection:

-- Delta detection produced #new_rows
-- Type 1 Insert
INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ...)
SELECT ... FROM #new_rows;

After -- your validation runs before the insert:

-- Delta detection produced #new_rows
-- Extension Point: DwhPreType1InsertSql
IF EXISTS (
    SELECT CustomerKey FROM #new_rows
    GROUP BY CustomerKey HAVING COUNT(*) > 1
)
    RAISERROR('Duplicate CustomerKey found in new rows for dim.Customer', 16, 1);
-- Type 1 Insert
INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ...)
SELECT ... FROM #new_rows;

Common Mistakes

Mistake | Symptom | Fix
Modifying the structure of the new-rows temp table. | The subsequent INSERT fails because column names or types no longer match. | Only add validation or logging; do not alter the temp table schema.
Using RAISERROR with severity < 16 expecting the load to stop. | The error is logged as a warning but the insert still proceeds. | Use severity 16 or higher with RAISERROR to abort execution.
Confusing this with DwhPostType1InsertSql, which fires after the insert. | Validation logic runs too late; bad rows are already in the target. | Use Pre for validation before insert; use Post for post-insert logging.

Related Extension Points

  • DwhPostType1InsertSql -- fires after the Type 1 insert; use for post-insert logging or notifications.
  • DwhPreType1UpdateSql -- fires before the Type 1 update; use for pre-update preparation.
  • DwhPreType1DeltaSql -- fires before delta detection; use for earlier-stage data preparation.
  • DwhPreType2InsertSql -- fires before the Type 2 insert; use for Type 2 versioned insert scenarios.

Data Warehouse Pre Type1 Update SQL

Configure SQL to be injected before the Type1 update query in the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object of the load
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object of the load

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType1UpdateSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately before the Type 1 SCD update statement in the Data Mart stored procedure. During a Type 1 load, after delta detection identifies changed rows, BimlFlex overwrites the existing attribute values in the target. This extension point fires right before that UPDATE, allowing you to capture a before-image of the rows about to be overwritten, validate the update set, or apply last-minute transformations to the delta data.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately before the UPDATE statement that overwrites existing Type 1 attribute values.

When to Use It

  1. Before Type 1 attributes are overwritten in dim.Customer, you want to capture the current (pre-update) values into a history table so you have a record of what changed, even though Type 1 does not normally preserve history.
  2. A business rule requires validation that the new values for dim.Product are not blank or NULL before the overwrite occurs in the LOAD_BFX_DM batch.
  3. You need to update a "last modified" timestamp column in the delta set before the Type 1 update applies the changes to dim.Address in BFX_DM.
  4. Use this instead of DwhPostType1DeltaSql when your logic must run specifically before the update statement rather than before both the update and insert operations.
  5. A pre-update row count needs to be captured for reconciliation with the post-update row count logged by DwhPostType1UpdateSql.
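Case 3 above amounts to a one-statement stamp on the delta set; #type1_delta is the delta temp table as named in this section's examples, and the LastModifiedDate column is an illustrative assumption:

```sql
-- Stamp a last-modified column on the delta rows before the UPDATE applies them.
-- LastModifiedDate is an illustrative column name.
UPDATE #type1_delta
SET LastModifiedDate = GETDATE();
```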

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The stored procedure must include a Type 1 update step (columns configured with ChangeType = Type1).
  • SQL-based load pattern.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreType1UpdateSql.
  3. Set the target attribute to the specific Data Mart object.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType1UpdateSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Capture before-image of rows about to be overwritten
INSERT INTO [BFX_DM].[audit].[CustomerHistory] (CustomerKey, CustomerName, CaptureTime)
SELECT t.CustomerKey, t.CustomerName, GETDATE()
FROM [dim].[Customer] t
INNER JOIN #type1_delta d ON t.CustomerKey = d.CustomerKey;
  5. Build and inspect the generated stored procedure to confirm the SQL appears before the UPDATE statement.

Example

Before -- the Type 1 update runs directly:

-- Type 1 Update
UPDATE t SET t.CustomerName = d.CustomerName, ...
FROM [dim].[Customer] t JOIN #type1_delta d ON ...

After -- your before-image capture runs first:

-- Extension Point: DwhPreType1UpdateSql
INSERT INTO [BFX_DM].[audit].[CustomerHistory] (CustomerKey, CustomerName, CaptureTime)
SELECT t.CustomerKey, t.CustomerName, GETDATE()
FROM [dim].[Customer] t
INNER JOIN #type1_delta d ON t.CustomerKey = d.CustomerKey;
-- Type 1 Update
UPDATE t SET t.CustomerName = d.CustomerName, ...
FROM [dim].[Customer] t JOIN #type1_delta d ON ...

Common Mistakes

Mistake | Symptom | Fix
Modifying the delta temp table in a way that breaks the subsequent UPDATE. | Column mismatch or missing data in the update. | Only read from or augment the delta data; do not drop or rename columns.
Assuming the before-image capture includes all columns. | Missing columns in your history table. | Explicitly list the columns you need in the SELECT.
Placing logic here that should run after the update. | Data captured reflects pre-update state when you wanted post-update state. | Use DwhPostType1UpdateSql for post-update actions.

Related Extension Points

  • DwhPostType1UpdateSql -- fires after the Type 1 update; use for post-update logging or notifications.
  • DwhPreType1InsertSql -- fires before the Type 1 insert; use for pre-insert preparation.
  • DwhPostType1DeltaSql -- fires after delta detection; use to manipulate the delta set before both updates and inserts.
  • DwhPreType2InsertSql -- fires before the Type 2 insert; use for Type 2 versioned scenarios.

Data Warehouse Pre Type2 Insert SQL

Configure SQL to be injected before the Type2 insert query in the Data Warehouse stored procedure

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object of the load
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object of the load

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType2InsertSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
/* Add your SQL fragment here */

What It Does

Injects a custom SQL fragment immediately before the Type 2 SCD insert statement in the Data Mart stored procedure. During a Type 2 load, BimlFlex creates new versioned rows with effective dates when attribute values change. This extension point fires right before those new version rows are inserted, allowing you to validate the versioning data, adjust effective dates, set default values on new version rows, or perform any preparation needed before the historical records are created.

Where It Fires

Registered in Code/DataWarehouseViewModel.cs (lines ~45-62) via GetDataWarehouseExtensionPoints() and consumed via GetTargetObjectExtensionPoint(). The SQL fragment is injected into the generated stored procedure immediately before the INSERT statement that creates new versioned rows as part of Type 2 processing.

When to Use It

  1. Before new version rows are inserted into dim.Customer (Type 2), you need to expire the current version by updating its FlexRowEffectiveToDate and setting FlexRowIsCurrent = 0 in BFX_DM.
  2. You want to validate that the effective date range for new versions of dim.Product does not overlap with existing versions in the LOAD_BFX_DM batch.
  3. A default value needs to be set on the new version rows (e.g., setting an ApprovedBy column to NULL) before they are inserted into dim.Address.
  4. Use this instead of DwhPreType1InsertSql when dealing with Type 2 versioned dimension inserts rather than Type 1 non-versioned inserts.
  5. A business rule requires logging all rows about to receive new versions to a change notification queue before the insert occurs.
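Case 1 (expiring the current version) might look like the sketch below. #type2_changes is the staged change set as named in this section's examples, and the FlexRow* columns are BimlFlex system columns; verify against the names your build actually generates.

```sql
-- Close out the current version before the new version rows are inserted.
-- Column and temp table names follow this section's examples; verify your build.
UPDATE t
SET t.FlexRowEffectiveToDate = GETDATE(),
    t.FlexRowIsCurrent = 0
FROM [dim].[Customer] t
INNER JOIN #type2_changes c ON t.CustomerKey = c.CustomerKey
WHERE t.FlexRowIsCurrent = 1;
```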

Prerequisites

  • The target object must be a DIM table with Type 2 SCD columns in a Data Mart integration stage.
  • The object must have columns configured with ChangeType = Type2 in BimlFlex metadata.
  • SQL-based load pattern (stored procedure generation).

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhPreType2InsertSql.
  3. Set the target attribute to the specific Data Mart dimension.
  4. Write the SQL fragment.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhPreType2InsertSql" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
-- Validate no overlapping effective dates before inserting new versions
IF EXISTS (
    SELECT 1 FROM #type2_changes c
    INNER JOIN [dim].[Customer] t
        ON c.CustomerKey = t.CustomerKey
        AND c.FlexRowEffectiveFromDate <= t.FlexRowEffectiveToDate
        AND t.FlexRowIsCurrent = 1
)
    RAISERROR('Overlapping effective dates detected for dim.Customer Type 2 insert', 16, 1);
  5. Build and verify the SQL appears before the Type 2 insert in the generated stored procedure.

Example

Before -- the Type 2 insert runs directly:

-- Type 2 Insert (new versions)
INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ..., FlexRowEffectiveFromDate, FlexRowIsCurrent)
SELECT ... FROM #type2_changes;

After -- your validation runs before the insert:

-- Extension Point: DwhPreType2InsertSql
IF EXISTS (
    SELECT 1 FROM #type2_changes c
    INNER JOIN [dim].[Customer] t
        ON c.CustomerKey = t.CustomerKey
        AND c.FlexRowEffectiveFromDate <= t.FlexRowEffectiveToDate
        AND t.FlexRowIsCurrent = 1
)
    RAISERROR('Overlapping effective dates detected for dim.Customer Type 2 insert', 16, 1);
-- Type 2 Insert (new versions)
INSERT INTO [dim].[Customer] (CustomerKey, CustomerName, ..., FlexRowEffectiveFromDate, FlexRowIsCurrent)
SELECT ... FROM #type2_changes;

Common Mistakes

Mistake | Symptom | Fix
Attempting to use this on a table with no Type 2 columns. | The extension point never fires. | Verify the dimension has columns with ChangeType = Type2 in metadata.
Modifying the #type2_changes temp table structure. | The subsequent INSERT fails with column mismatch. | Only validate or log; do not alter the schema of the staging temp table.
Confusing this with DwhPreType1InsertSql for Type 1 inserts. | Logic intended for versioned rows runs against non-versioned inserts. | Use the correct variant matching your SCD type.
Not accounting for the case where #type2_changes is empty. | Validation runs unnecessarily; possible false positives. | Add a row count check before your validation logic.

Related Extension Points

  • DwhPostType2InsertSql -- fires after the Type 2 insert; use for post-insert logging or notifications.
  • DwhPreType1InsertSql -- fires before the Type 1 insert; use for non-versioned insert scenarios.
  • DwhType2Pipeline -- modifies the SSIS Data Flow for Type 2 processing; use when working with pipeline transformations rather than SQL.
  • DwhPreType1UpdateSql -- fires before the Type 1 update; a different SCD pattern.

Data Warehouse Source Override

Configure override for the Data Mart Source transformation node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the override will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | Set CustomOutput.OutputPathName on the last task so it connects to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourceOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- NOTE: The type of source override must be consistent with the connection type. -->
<# CustomOutput.ObjectInherit = false; #>
<OleDbSource Name="OLE_SRC - MyTable" ConnectionName="MY_SRC" CommandTimeout="0" ValidateExternalMetadata="false">
<DirectInput>SELECT [Code], [Name] FROM [dbo].[MyTable]</DirectInput>
<Columns>
<Column SourceColumn="Name" TargetColumn="MyTableName" />
<Column SourceColumn="Code" TargetColumn="MyTableCode" />
</Columns>
<# CustomOutput.OutputPathName = @"OLE_SRC - MyTable.Output"; #>
</OleDbSource>

What It Does

Completely replaces the default SSIS Data Flow source component that BimlFlex generates for a Data Mart load pipeline. Instead of using the standard OLE DB Source (or other source type) that BimlFlex derives from metadata, this extension point lets you define an entirely custom source component with your own SQL query, connection, and column mappings. Use it when the default source generation does not meet your requirements and you need full control over where and how data is read.

Where It Fires

Consumed in 90.flx-i-get-target-source-transformation.biml. The emitted Biml replaces the entire source component in the SSIS Data Flow task for the Data Mart object. This is an override, so the default source component is not generated when this extension point is present.

When to Use It

  1. The default source query generated by BimlFlex for dim.Customer does not include a complex join or business logic that you need; you want to write a completely custom SQL query against dv.HUB_Customer and dv.SAT_Customer_Details_awlt.
  2. You need to source data from a different connection (e.g., AWLT_SRC directly) rather than from the Data Vault layer for a specific dimension in BFX_DM.
  3. The source requires a stored procedure call rather than a SELECT query, and BimlFlex's default source does not support your specific procedure signature.
  4. Use this instead of DwhSourcePipelinePre/Post when you need to replace the source entirely rather than adding transformations before or after it.
  5. You need to use a Flat File Source or ADO.NET Source instead of the default OLE DB Source for a particular fact.SalesOrder load in the LOAD_BFX_DM batch.
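For the stored-procedure case (item 3), a minimal sketch is shown below. The procedure name is hypothetical, and ValidateExternalMetadata="false" avoids design-time metadata validation of the EXEC call; the procedure's result set must still supply the columns the downstream pipeline expects.

```xml
<OleDbSource Name="OLE_SRC - Customer Proc" ConnectionName="BFX_DV" CommandTimeout="0" ValidateExternalMetadata="false">
<DirectInput>EXEC [dbo].[usp_GetDimCustomerSource];</DirectInput>
<# CustomOutput.OutputPathName = @"OLE_SRC - Customer Proc.Output"; #>
</OleDbSource>
```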

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • The source component type must match the connection type (e.g., OLE DB Source for OLE DB connections).
  • You must set CustomOutput.OutputPathName so downstream transformations can connect.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhSourceOverride.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the complete source component with connection, query, and output path.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourceOverride" target="dim.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<OleDbSource Name="OLE_SRC - Custom Customer" ConnectionName="BFX_DV" CommandTimeout="0" ValidateExternalMetadata="false">
<DirectInput>
SELECT h.CustomerKey, s.CustomerName, s.Email
FROM [dv].[HUB_Customer] h
INNER JOIN [dv].[SAT_Customer_Details_awlt] s
ON h.HUB_Customer_SK = s.HUB_Customer_SK
WHERE s.FlexRowIsCurrent = 1
</DirectInput>
<# CustomOutput.OutputPathName = @"OLE_SRC - Custom Customer.Output"; #>
</OleDbSource>
  5. Build the project and verify the SSIS Data Flow uses your custom source instead of the default.

Example

Before -- BimlFlex generates the default source:

<OleDbSource Name="OLE_SRC - dim.Customer" ConnectionName="BFX_DV" ...>
<DirectInput>SELECT ... FROM [dv].[SAT_Customer_Details_awlt]</DirectInput>
</OleDbSource>

After -- your override replaces it entirely:

<OleDbSource Name="OLE_SRC - Custom Customer" ConnectionName="BFX_DV" ...>
<DirectInput>
SELECT h.CustomerKey, s.CustomerName, s.Email
FROM [dv].[HUB_Customer] h
INNER JOIN [dv].[SAT_Customer_Details_awlt] s
ON h.HUB_Customer_SK = s.HUB_Customer_SK
WHERE s.FlexRowIsCurrent = 1
</DirectInput>
</OleDbSource>

Common Mistakes

Mistake | Symptom | Fix
Not setting CustomOutput.OutputPathName. | Downstream transformations cannot find their input; build error. | Always set CustomOutput.OutputPathName to your source's output.
Using a source component type that does not match the connection. | Runtime error: connection manager type mismatch. | Match the source type to the connection (e.g., OLE DB Source for OLE DB connections).
Returning different columns than the downstream pipeline expects. | Column not found errors in subsequent transformations or destinations. | Ensure your SELECT returns the same columns (names and types) that the rest of the pipeline expects.
Forgetting to set ValidateExternalMetadata="false" when the query uses temp tables or dynamic SQL. | Build-time validation failure. | Add ValidateExternalMetadata="false" to suppress design-time validation.

Related Extension Points

  • DwhSourcePipelinePre / DwhSourcePipelinePost -- add transformations before or after the source without replacing it; use when the default source is acceptable but needs augmentation.
  • DwhTargetOverride -- replaces the target destination component; often paired with DwhSourceOverride for fully custom pipelines.
  • DwhSqlCallOverride -- replaces the SSIS Execute SQL task for SQL-based loads; use for SQL-based overrides rather than Data Flow overrides.

Data Warehouse Source Pipeline Post

Configure pipeline logic that will be injected after the Data Mart source transformations

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | Set CustomOutput.OutputPathName on the last task so it connects to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourcePipelinePost" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations immediately after the source transformations (source component plus any standard conversions) in a Data Mart load pipeline. This extension point fires after the data has been read from the source and after BimlFlex's built-in source conversions have been applied, but before the data moves into the main pipeline logic (row counts, target transformations, etc.). Use it to add additional data conversions, derived columns, or lookups right after the source stage.

Where It Fires

Consumed in 90.flx-i-get-target-source-conversion.biml and 90.flx-i-get-source-rowcount.biml. The emitted Biml is placed in the SSIS Data Flow after the source component and its standard conversions. SSIS-only; does not apply to ADF or SQL-based builds.

When to Use It

  1. After the source reads data from dv.SAT_Customer_Details_awlt, you need to apply a data type conversion (e.g., convert CustomerCode from Unicode to ANSI) before the data enters the main pipeline for dim.Customer in BFX_DM.
  2. A derived column must be computed immediately after the source (e.g., trimming whitespace from ProductName) before any downstream transformations in the LOAD_BFX_DM batch.
  3. You want to add a Lookup transformation that enriches source rows with reference data (e.g., looking up RegionName from a reference table) right after the source for dim.Address.
  4. Use this instead of DwhSourceOverride when the default source is fine but you need additional transformations after it; use instead of DwhTargetPipelinePre when the transformation should happen at the source stage rather than near the target.
  5. A Multicast transformation needs to split the source data into two paths -- one for the main DM load and one for a separate audit destination.

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • You must connect to the inputPath parameter and set CustomOutput.OutputPathName.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhSourcePipelinePost.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Data Flow transformation.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourcePipelinePost" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<DerivedColumns Name="DER - TrimCustomerName">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column Name="CustomerName" DataType="String" Length="100" ReplaceExisting="true">TRIM(CustomerName)</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - TrimCustomerName.Output"; #>
</DerivedColumns>
  5. Build and verify the transformation appears in the SSIS Data Flow after the source component.

Example

Before -- data flows from the source directly to the row count:

[OLE_SRC - dim.Customer] --> [Source Conversions] --> [Row Count] --> [Target Pipeline]

After -- your derived column transformation is inserted after source conversions:

[OLE_SRC - dim.Customer] --> [Source Conversions] --> [DER - TrimCustomerName] --> [Row Count] --> [Target Pipeline]

Common Mistakes

Mistake | Symptom | Fix
Not connecting to the inputPath parameter. | Build error: transformation has no input. | Always use <InputPath OutputPathName="<#=inputPath #>" />.
Not setting CustomOutput.OutputPathName. | Downstream pipeline cannot find its input; build error. | Always set CustomOutput.OutputPathName.
Applying this on an ADF or SQL-only build. | Extension point is silently ignored. | This is SSIS-only; use SQL extension points for non-SSIS builds.

Related Extension Points

  • DwhSourcePipelinePre -- fires before the source conversions (immediately after the source component); use for transformations that should precede BimlFlex's built-in conversions.
  • DwhSourceOverride -- replaces the entire source component; use when you need a completely custom source.
  • DwhTargetPipelinePre -- fires later in the pipeline, near the target; use for target-stage transformations.

Data Warehouse Source Pipeline Pre

Configure pipeline logic that will be injected after the Data Mart source transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the source object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- if CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch
  • OutputPathName (String) -- set CustomOutput.OutputPathName in the last task to connect it to the next Data Flow task
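Setting the ObjectInherit output is how a single Extension Point fans out across a whole batch. The following is a minimal sketch of that pattern; the AuditStamp column and expression are illustrative assumptions, not part of the BimlFlex samples:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourcePipelinePre" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# // Apply this Extension Point to every Object associated with the Batch, not just the targeted one #>
<# CustomOutput.ObjectInherit = true; #>
<DerivedColumns Name="DER - AuditStamp">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<!-- Illustrative column; rename to match your model -->
<Column Name="AuditStamp" DataType="DateTime">GETDATE()</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - AuditStamp.Output"; #>
</DerivedColumns>
```

Because each object generates its own Data Flow, the fixed transformation name does not collide across the inherited objects.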

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourcePipelinePre" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations immediately after the source component in a Data Mart load pipeline, before BimlFlex's built-in source conversions. This is the earliest point in the Data Flow where you can intercept the raw source data. Use it to apply transformations that must happen before any standard conversions, such as adding audit columns to the raw stream, performing early-stage filtering with a Conditional Split, or applying transformations that the built-in conversions depend on.

Where It Fires

Consumed in 90.flx-i-get-target-source-conversion.biml and 90.flx-i-get-source-rowcount.biml. The emitted Biml is placed in the SSIS Data Flow immediately after the source component but before BimlFlex's standard source conversion transformations. SSIS-only; does not apply to ADF or SQL-based builds.

When to Use It

  1. You need to add a Derived Column that stamps each raw source row with an ExtractTimestamp before any conversions are applied during the load of dim.Customer from dv.SAT_Customer_Details_awlt.
  2. A Conditional Split is needed to route invalid rows (e.g., rows with NULL key values) to an error output before they enter the standard conversion pipeline for fact.SalesOrder in BFX_DM.
  3. You want to apply an early-stage data type conversion that BimlFlex's built-in conversions do not handle (e.g., converting a binary column to a string) for dim.Product in the LOAD_BFX_DM batch.
  4. Use this instead of DwhSourcePipelinePost when the transformation must occur before BimlFlex's standard source conversions rather than after them.
  5. A row-level hash column needs to be computed from the raw source data before any other transformations modify the values.

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • You must connect to the inputPath parameter and set CustomOutput.OutputPathName.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhSourcePipelinePre.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Data Flow transformation.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSourcePipelinePre" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<DerivedColumns Name="DER - AddExtractTimestamp">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column Name="ExtractTimestamp" DataType="DateTime">GETDATE()</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - AddExtractTimestamp.Output"; #>
</DerivedColumns>
  5. Build and verify the transformation appears in the SSIS Data Flow immediately after the source component.

Example

Before -- data flows from the source to BimlFlex's built-in conversions:

[OLE_SRC - dim.Customer] --> [Standard Source Conversions] --> [Row Count]

After -- your transformation is inserted before the standard conversions:

[OLE_SRC - dim.Customer] --> [DER - AddExtractTimestamp] --> [Standard Source Conversions] --> [Row Count]

Common Mistakes

  • Mistake: Adding a transformation that changes column names used by the downstream standard conversions. Symptom: build error -- the standard conversions cannot find the expected column names. Fix: preserve the original column names, or use ReplaceExisting="true" carefully.
  • Mistake: Confusing Pre (before the standard conversions) with Post (after them). Symptom: the transformation fires at the wrong stage. Fix: use DwhSourcePipelinePre for before the conversions, DwhSourcePipelinePost for after.
  • Mistake: Applying this on an ADF or SQL-only build. Symptom: the Extension Point is silently ignored. Fix: this is SSIS-only; use the SQL extension points for non-SSIS builds.

Related Extension Points

  • DwhSourcePipelinePost -- fires after the source conversions; use when the transformation should operate on already-converted data.
  • DwhSourceOverride -- replaces the entire source component; use when you need a completely custom source rather than adding a transformation after it.
  • DwhTargetPipelinePre -- fires later in the pipeline, near the target; use for transformations closer to the destination.
  • DwhInsertPipeline -- fires specifically in the insert path; use for insert-specific transformations.

Data Warehouse SQL Call Override

Override the Data Warehouse SQL Call

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the source object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the target object to which the pipeline will be added
  • precedenceConstraint (String) -- contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

  • ObjectInherit (Boolean) -- if CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch
  • OutputPathName (String) -- set CustomOutput.OutputPathName in the last task to connect it to the next task in the control flow

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSqlCallOverride" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var sourceObject = new ObjectViewModel(sourceTable, sourceTable.Connection.RelatedItem.IntegrationStage, "SRC");
var targetObject = new ObjectViewModel(targetTable, targetTable.Connection.RelatedItem.IntegrationStage, "TGT");
#>
<# CustomOutput.ObjectInherit = true; #>
<ExecuteSQL Name="SQL - DwhSqlCallOverride" ConnectionName="<#=targetObject.Connection.Name#>">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Parameters>
<Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
</Parameters>
<DirectInput>EXEC <#=targetObject.SystemType.StoredProcedureName(sourceObject, targetObject.Schema, targetObject.StartDelimiter, targetObject.EndDelimiter)#> ?</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - DwhSqlCallOverride.Output"; #>
</ExecuteSQL>

What It Does

Completely replaces the SSIS Execute SQL task that calls the Data Mart stored procedure. By default, BimlFlex generates an Execute SQL task that invokes the DM load stored procedure with the execution ID parameter. This override lets you replace that entire task with your own implementation -- calling a different procedure, passing additional parameters, or using an entirely different task type. Use it when the standard stored procedure invocation pattern does not meet your requirements.

Where It Fires

Consumed in 90.flx-i-get-pkg-sql-dm-batch.biml (line ~209) and 90.flx-i-get-pkg-snowflake-sql-batch.biml. The emitted Biml replaces the default Execute SQL task in the SSIS control flow. This override applies only to DM/DWH integration stages.

When to Use It

  1. You need to call a completely custom stored procedure instead of the BimlFlex-generated one for dim.Customer in the LOAD_BFX_DM batch (e.g., a hand-written procedure with complex business logic).
  2. The stored procedure for fact.SalesOrder requires additional parameters beyond the standard ExecutionID, and you need to pass them via the Execute SQL task.
  3. You want to replace the Execute SQL task with a Script Task or a different task type that implements custom orchestration logic for the DM load.
  4. Use this instead of DwhPreSqlCallProcess/DwhPostSqlCallProcess when you need to replace the entire SQL call rather than adding tasks before or after it.
  5. The DM load for dim.Product must call multiple procedures in sequence within a single control flow task, and the default single-procedure call pattern is insufficient.
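Scenario 5 -- several procedures in one task -- can be sketched by batching the calls in a single DirectInput. Both procedure names below are hypothetical, and each ? marker needs its own parameter entry:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSqlCallOverride" target="dim.Product" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Multi Step DM Load" ConnectionName="BFX_DM">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Parameters>
<!-- One OLE DB parameter per ? marker, in order of appearance -->
<Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
<Parameter Name="1" VariableName="User.ExecutionID" DataType="Int64" />
</Parameters>
<!-- Hypothetical procedures; both receive the ExecutionID so auditing stays intact -->
<DirectInput>EXEC [dm].[usp_PreLoad_dim_Product] ?; EXEC [dm].[usp_Load_dim_Product] ?;</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Multi Step DM Load.Output"; #>
</ExecuteSQL>
```

Running both statements in one task keeps them in a single transaction scope; split them into separate tasks if you need independent precedence constraints or retry behavior.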

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point replaces an SSIS control flow task.
  • You must set CustomOutput.OutputPathName so subsequent tasks in the control flow can chain correctly.
  • The precedenceConstraint parameter must be wired to connect your task to the preceding step.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhSqlCallOverride.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the replacement Execute SQL task (or other task type).
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhSqlCallOverride" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var targetObject = new ObjectViewModel(targetTable, targetTable.Connection.RelatedItem.IntegrationStage, "TGT"); #>
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Custom DM Load" ConnectionName="<#=targetObject.Connection.Name#>">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Parameters>
<Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
</Parameters>
<DirectInput>EXEC [dm].[usp_CustomLoad_dim_Customer] ?</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Custom DM Load.Output"; #>
</ExecuteSQL>
  5. Build and verify the SSIS package uses your custom task instead of the default stored procedure call.

Example

Before -- BimlFlex generates the default Execute SQL task:

<ExecuteSQL Name="SQL - Load dim.Customer" ConnectionName="BFX_DM">
<DirectInput>EXEC [dm].[usp_Load_dim_Customer] ?</DirectInput>
</ExecuteSQL>

After -- your override replaces it with a custom procedure call:

<ExecuteSQL Name="SQL - Custom DM Load" ConnectionName="BFX_DM">
<DirectInput>EXEC [dm].[usp_CustomLoad_dim_Customer] ?</DirectInput>
</ExecuteSQL>

Common Mistakes

  • Mistake: Not setting CustomOutput.OutputPathName. Symptom: the pipeline chain breaks; subsequent tasks have no input. Fix: always set CustomOutput.OutputPathName.
  • Mistake: Forgetting to pass the ExecutionID parameter. Symptom: the BimlFlex audit/logging framework cannot track the execution. Fix: include the ExecutionID parameter unless you intentionally want to bypass auditing.
  • Mistake: Using this when the Pre/Post process extension points would suffice. Symptom: over-engineering -- the default procedure call works fine but is replaced unnecessarily. Fix: use DwhPreSqlCallProcess/DwhPostSqlCallProcess for additions around the standard call.
  • Mistake: Not wiring the precedenceConstraint. Symptom: the task runs in parallel with its predecessor instead of sequentially. Fix: always wire <PrecedenceConstraints> using the provided parameter.

Related Extension Points

  • DwhPreSqlCallProcess / DwhPostSqlCallProcess -- add tasks before or after the SQL call without replacing it; use for augmentation rather than replacement.
  • DwhPreProcessSql / DwhPostProcessSql -- inject SQL inside the stored procedure itself; use when modifications should be within the procedure rather than at the SSIS task level.
  • DwhTargetOverride -- replaces the SSIS Data Flow target component; a different kind of override for the pipeline rather than the control flow.

Data Warehouse Target Override

Configure override for the Data Mart target transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the source object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- if CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetOverride" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- NOTE: The type of target override must be consistent with the connection type. -->
<# var targetConnection = targetTable.GetTargetConnection();
var targetScopedName = targetTable.GetTargetScopedName(targetConnection);
var targetSsisSafeScopedName = targetScopedName.MakeSsisSafe();
#>
<# CustomOutput.ObjectInherit = false; #>
<OleDbDestination Name="OLE_DST - <#=targetSsisSafeScopedName#>" ConnectionName="<#=targetConnection#>" MaximumInsertCommitSize="500000" BatchSize="500000" CheckConstraints="false">
<InputPath OutputPathName="<#=inputPath#>" />
<ExternalTableOutput Table="[dbo].[HUB_MyTable]" />
</OleDbDestination>

What It Does

Completely replaces the default SSIS Data Flow destination component that BimlFlex generates for a Data Mart load pipeline. Instead of using the standard OLE DB Destination (or other destination type) derived from metadata, this extension point lets you define an entirely custom destination with your own connection, table mapping, and insert behavior. Use it when the default destination does not meet requirements -- for example, when you need a different commit size, a custom table name, or an entirely different destination type.

Where It Fires

Consumed in 90.flx-i-get-target-transformation.biml (line ~17). This override is checked before DvTargetOverride, meaning the Data Mart variant takes precedence when both exist. Also referenced in 90.flx-i-get-target-error-transformation.biml and 00.flx-i-get-object-variables.biml.

When to Use It

  1. You need to redirect the output of the dim.Customer pipeline to a different target table (e.g., a staging table [dm_staging].[Customer]) instead of the default [dim].[Customer] in BFX_DM.
  2. The default OLE DB Destination batch size and commit settings need to be changed for fact.SalesOrder to handle very large row volumes with specific performance tuning.
  3. You want to use an ADO.NET Destination or Flat File Destination instead of the default OLE DB Destination for a specific object in the LOAD_BFX_DM batch.
  4. Use this instead of DwhTargetPipelinePost when you need to replace the entire destination rather than adding a transformation before it.
  5. The destination table has a different schema or column mapping than what BimlFlex generates by default for dim.Product.

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point replaces the SSIS Data Flow destination.
  • The destination component type must be compatible with the connection type.
  • The inputPath parameter must be used to connect the destination to the upstream pipeline.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhTargetOverride.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the complete destination component.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetOverride" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<OleDbDestination Name="OLE_DST - dim.Customer_Custom" ConnectionName="BFX_DM" MaximumInsertCommitSize="100000" BatchSize="100000" CheckConstraints="false">
<InputPath OutputPathName="<#=inputPath#>" />
<ExternalTableOutput Table="[dim].[Customer]" />
</OleDbDestination>
  5. Build and verify the SSIS Data Flow uses your custom destination.

Example

Before -- BimlFlex generates the default destination:

<OleDbDestination Name="OLE_DST - dim.Customer" ConnectionName="BFX_DM" MaximumInsertCommitSize="500000" BatchSize="500000">
<InputPath OutputPathName="..." />
<ExternalTableOutput Table="[dim].[Customer]" />
</OleDbDestination>

After -- your override changes the batch size and commit settings:

<OleDbDestination Name="OLE_DST - dim.Customer_Custom" ConnectionName="BFX_DM" MaximumInsertCommitSize="100000" BatchSize="100000" CheckConstraints="false">
<InputPath OutputPathName="..." />
<ExternalTableOutput Table="[dim].[Customer]" />
</OleDbDestination>

Common Mistakes

  • Mistake: Using a destination type that does not match the connection. Symptom: runtime error -- connection manager type mismatch. Fix: match the destination type to the connection (e.g., OLE DB Destination for OLE DB connections).
  • Mistake: Specifying a table name with an incorrect schema or delimiters. Symptom: "Invalid object name" error at runtime. Fix: double-check the full table name including schema (e.g., [dim].[Customer]).
  • Mistake: Forgetting to connect to inputPath. Symptom: build error -- the destination has no input. Fix: always include <InputPath OutputPathName="<#=inputPath#>" />.
  • Mistake: Having both DwhTargetOverride and DvTargetOverride for the same object. Symptom: DwhTargetOverride takes precedence; DvTargetOverride is ignored. Fix: remove the DV variant if you intend to use the DWH variant.

Related Extension Points

  • DwhTargetPipelinePre / DwhTargetPipelinePost -- add transformations before or after the target without replacing it; use when the default destination is acceptable.
  • DwhSourceOverride -- replaces the source component; often paired with DwhTargetOverride for fully custom pipelines.
  • DwhInsertPipeline -- injects transformations in the insert path before the destination; use for insert-path-specific customization.

Data Warehouse Target Pipeline Post

Configure pipeline logic that will be injected after the target transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the source object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- if CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch
  • OutputPathName (String) -- set CustomOutput.OutputPathName in the last task to connect it to the next Data Flow task

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetPipelinePost" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations immediately after the target transformation area in a Data Mart load pipeline. This extension point fires after the data has passed through the target-stage processing but before it reaches the final destination component. It allows you to add last-mile transformations, auditing components, or secondary outputs after the main target pipeline logic has been applied. Note that this fires in the target area of the pipeline, after DwhTargetPipelinePre and before the destination.

Where It Fires

Consumed in 90.flx-i-get-target-transformation.biml (line ~12). The emitted Biml is placed in the SSIS Data Flow in the target transformation area. Conditional: when integrationStage=="DWH", BimlFlex uses the Dwh variant of the target pipeline extension points.

When to Use It

  1. After the main target transformations have been applied to dim.Customer, you need to add a Row Count transformation to capture the number of rows entering the destination for operational monitoring in the LOAD_BFX_DM batch.
  2. A Multicast transformation is needed to send a copy of the data to both the main dim.Product destination and a secondary audit table.
  3. You want to add a final data quality check (e.g., a Conditional Split that routes invalid rows to an error output) after all target transformations but before the destination for fact.SalesOrder in BFX_DM.
  4. Use this instead of DwhTargetOverride when you want to add transformations after the target pipeline area without replacing the destination component.
  5. A derived column needs to compute a final hash or checksum on the data immediately before it enters the destination for dim.Address.
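Scenario 2 above -- a Multicast feeding a secondary audit destination -- could look like the following sketch. The [audit].[Product] table and the output path names are assumptions, not BimlFlex defaults:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetPipelinePost" target="dim.Product" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<Multicast Name="MLT - Audit Copy">
<InputPath OutputPathName="<#=inputPath #>" />
<OutputPaths>
<OutputPath Name="Main" />
<OutputPath Name="Audit" />
</OutputPaths>
</Multicast>
<!-- Secondary destination for the audit copy; the table name is hypothetical -->
<OleDbDestination Name="OLE_DST - Product Audit" ConnectionName="BFX_DM">
<InputPath OutputPathName="MLT - Audit Copy.Audit" />
<ExternalTableOutput Table="[audit].[Product]" />
</OleDbDestination>
<# CustomOutput.OutputPathName = @"MLT - Audit Copy.Main"; #>
```

Only the Main path is handed on via CustomOutput.OutputPathName; the Audit path terminates in its own destination and does not affect the generated main pipeline.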

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • You must connect to the inputPath parameter and set CustomOutput.OutputPathName.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhTargetPipelinePost.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Data Flow transformation.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetPipelinePost" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<RowCount Name="RC - Target Row Count" VariableName="User.TargetRowCount">
<InputPath OutputPathName="<#=inputPath #>" />
<# CustomOutput.OutputPathName = @"RC - Target Row Count.Output"; #>
</RowCount>
  5. Build and verify the transformation appears in the SSIS Data Flow after the target pipeline area.

Example

Before -- data flows from target transformations directly to the destination:

[Target Transformations] --> [OLE_DST - dim.Customer]

After -- your row count transformation is inserted after the target pipeline:

[Target Transformations] --> [RC - Target Row Count] --> [OLE_DST - dim.Customer]

Common Mistakes

  • Mistake: Not setting CustomOutput.OutputPathName. Symptom: the destination component cannot find its input; build error. Fix: always set CustomOutput.OutputPathName.
  • Mistake: Confusing Post (after the target transformations) with Pre (before them). Symptom: the transformation fires at the wrong stage in the pipeline. Fix: use DwhTargetPipelinePost for after, DwhTargetPipelinePre for before.
  • Mistake: Adding transformations that change column names or types. Symptom: destination column mapping errors. Fix: ensure the output columns match what the destination expects.

Related Extension Points

  • DwhTargetPipelinePre -- fires before the target transformations; use for earlier-stage transformations near the target.
  • DwhTargetOverride -- replaces the entire destination component; use when you need a completely custom destination.
  • DwhInsertPipeline -- fires in the insert-specific path; use for transformations that only apply to new rows.
  • DwhSourcePipelinePost -- fires after the source; use for transformations closer to the source stage.

Data Warehouse Target Pipeline Pre

Configure pipeline logic that will be injected before the target transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the source object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- if CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the batch
  • OutputPathName (String) -- set CustomOutput.OutputPathName in the last task to connect it to the next Data Flow task

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetPipelinePre" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations into the pipeline immediately before the target transformation area in a Data Mart load. This extension point fires after source-stage processing is complete and the data is approaching the target section of the Data Flow. Use it to add transformations that apply to all rows (inserts, updates) entering the target area, such as lookups for surrogate keys, derived columns for computed values, or data type conversions needed by the destination.

Where It Fires

Consumed in 90.flx-i-get-target-transformation.biml (line ~12). The emitted Biml is placed in the SSIS Data Flow before the target transformation area. Conditional: when integrationStage=="DWH", BimlFlex uses the Dwh variant of the target pipeline extension points.

When to Use It

  1. You need to add a Lookup transformation that resolves surrogate keys from a reference dimension before rows are inserted into fact.SalesOrder in the LOAD_BFX_DM batch.
  2. A Derived Column transformation must compute a composite business key or a calculated measure before the data enters the target area for dim.Customer in BFX_DM.
  3. You want to apply a Union All transformation to merge data from multiple sources before the target transformations process the combined stream for dim.Product.
  4. Use this instead of DwhInsertPipeline when the transformation should apply to all rows entering the target (both inserts and updates), not just the insert path.
  5. A data type conversion is needed for a column that the destination requires in a specific format (e.g., converting NVARCHAR to VARCHAR) for dim.Address.

Prerequisites

  • The target object must be in a Data Mart integration stage (connection BFX_DM).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • You must connect to the inputPath parameter and set CustomOutput.OutputPathName.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhTargetPipelinePre.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Data Flow transformation.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhTargetPipelinePre" target="fact.SalesOrder" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<Lookup Name="LKP - Customer SK" OleDbConnectionName="BFX_DM" NoMatchBehavior="IgnoreFailure">
<InputPath OutputPathName="<#=inputPath #>" />
<Inputs>
<Column SourceColumn="CustomerKey" TargetColumn="CustomerKey" />
</Inputs>
<Outputs>
<Column SourceColumn="Customer_SK" TargetColumn="Customer_SK" />
</Outputs>
<DirectInput>SELECT CustomerKey, Customer_SK FROM [dim].[Customer] WHERE FlexRowIsCurrent = 1</DirectInput>
<# CustomOutput.OutputPathName = @"LKP - Customer SK.Output"; #>
</Lookup>
  5. Build and verify the transformation appears in the SSIS Data Flow before the target transformation area.

Example

Before -- data flows from source processing directly to the target area:

[Source Pipeline] --> [Target Transformations] --> [OLE_DST - fact.SalesOrder]

After -- your lookup transformation is inserted before the target area:

[Source Pipeline] --> [LKP - Customer SK] --> [Target Transformations] --> [OLE_DST - fact.SalesOrder]

Common Mistakes

  • Mistake: Not setting CustomOutput.OutputPathName. Symptom: the target transformations cannot find their input; build error. Fix: always set CustomOutput.OutputPathName.
  • Mistake: Using NoMatchBehavior="FailComponent" without handling unmatched rows. Symptom: the pipeline fails at runtime when lookup misses occur. Fix: use IgnoreFailure or RedirectRowsToNoMatchOutput and handle the unmatched rows.
  • Mistake: Confusing Pre (before the target area) with Post (after it). Symptom: the transformation fires at the wrong stage. Fix: use Pre for before the target transformations, Post for after.
  • Mistake: Adding this for insert-only logic when DwhInsertPipeline would be more appropriate. Symptom: the transformation runs for all rows (including updates) when you only intended it for inserts. Fix: use DwhInsertPipeline for insert-path-specific transformations.

Related Extension Points

  • DwhTargetPipelinePost -- fires after the target transformations; use for last-mile transformations before the destination.
  • DwhTargetOverride -- replaces the entire destination; use when you need a completely custom target.
  • DwhInsertPipeline -- fires in the insert-specific path; use for insert-only transformations.
  • DwhSourcePipelinePost -- fires after the source; use for earlier-stage transformations.

Data Warehouse Type1 Pipeline

Configure type1 update pipeline logic that will be injected before the target transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the source object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- if CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the Batch
  • OutputPathName (String) -- set CustomOutput.OutputPathName in the last task to connect it to the next Data Flow task

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhType1Pipeline" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations into the Type 1 SCD processing path of a Data Mart load pipeline. During a Type 1 load, BimlFlex generates a Data Flow that identifies changed rows and applies overwrites to existing dimension or fact table records. This extension point lets you inject transformations into that Type 1-specific path -- for example, adding data conversions, derived columns, or lookups that only apply to the Type 1 update stream and not to inserts or Type 2 versioning.

Where It Fires

Consumed in 95.flx-i-process-type1.biml. The emitted Biml is placed inside the SSIS Data Flow task that handles the Type 1 SCD update logic. This extension point only fires when HasType1Columns==true and the target is a DIM or FCT table.

When to Use It

  1. You need to add a Derived Column transformation that sets a LastModifiedDate to the current timestamp on rows flowing through the Type 1 update path for dim.Customer in BFX_DM.
  2. A data type conversion is required on Type 1 columns before the update is applied to dim.Product in the LOAD_BFX_DM batch.
  3. You want to add a Conditional Split that routes certain rows to an error output based on business rules before the Type 1 update hits dim.Address.
  4. Use this instead of DwhTargetPipelinePre when the transformation should only affect the Type 1 update path and not the insert or Type 2 paths.
  5. A Lookup transformation must resolve a reference value (e.g., mapping a status code to a description) specifically for rows being updated via Type 1 logic in fact.SalesOrder.
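
Scenario 5 might be sketched roughly as below. The ref.StatusCode query, the column names, and the OleDbConnectionName value are all hypothetical; substitute your own reference table and connection:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhType1Pipeline" target="fact.SalesOrder" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<!-- Hypothetical lookup: resolve StatusCode to StatusDescription before the Type 1 update. -->
<Lookup Name="LKP - Resolve Status" OleDbConnectionName="BFX_DM" NoMatchBehavior="IgnoreFailure">
<InputPath OutputPathName="<#=inputPath #>" />
<DirectInput>SELECT StatusCode, StatusDescription FROM ref.StatusCode</DirectInput>
<Inputs>
<Column SourceColumn="StatusCode" TargetColumn="StatusCode" />
</Inputs>
<Outputs>
<Column SourceColumn="StatusDescription" TargetColumn="StatusDescription" />
</Outputs>
<!-- Connect the lookup's match output to the next task in the flow. -->
<# CustomOutput.OutputPathName = @"LKP - Resolve Status.Match"; #>
</Lookup>
```

NoMatchBehavior="IgnoreFailure" passes unmatched rows through with a NULL StatusDescription; use RedirectRowsToNoMatchOutput instead if unmatched rows need separate handling.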

Prerequisites

  • The target object must be a DIM or FCT table with Type 1 SCD columns in a Data Mart integration stage.
  • The object must have columns configured with ChangeType = Type1 in BimlFlex metadata (HasType1Columns==true).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • You must connect to the inputPath parameter and set CustomOutput.OutputPathName.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhType1Pipeline.
  3. Set the target attribute to the specific Data Mart object.
  4. Define the Data Flow transformation.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhType1Pipeline" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<DerivedColumns Name="DER - SetLastModified">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column Name="LastModifiedDate" DataType="DateTime">GETDATE()</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - SetLastModified.Output"; #>
</DerivedColumns>
  5. Build and verify the transformation appears in the Type 1 Data Flow path.

Example

Before -- the Type 1 path sends changed rows directly to the update destination:

[Conditional Split - Type 1 Changes] --> [OLE_CMD - Update dim.Customer]

After -- your derived column is inserted in the Type 1 path:

[Conditional Split - Type 1 Changes] --> [DER - SetLastModified] --> [OLE_CMD - Update dim.Customer]

Common Mistakes

Mistake | Symptom | Fix
Applying this to an object without Type 1 columns. | Extension point never fires; no transformation appears. | Verify the object has columns with ChangeType = Type1.
Not setting CustomOutput.OutputPathName. | The update destination cannot find its input; build error. | Always set CustomOutput.OutputPathName.
Changing column names that the update command expects. | "Column not found" error in the OLE DB Command. | Preserve original column names or update downstream mappings.
Using this when the transformation should apply to all paths (inserts + updates). | Only Type 1 updates receive the transformation; inserts are unaffected. | Use DwhTargetPipelinePre for transformations that should apply to all data paths.
  • DwhType2Pipeline -- fires in the Type 2 SCD path; use for transformations specific to Type 2 versioning.
  • DwhInsertPipeline -- fires in the insert path; use for transformations specific to new row inserts.
  • DwhTargetPipelinePre -- fires before the target area for all paths; use when the transformation applies to all rows.
  • DwhPreType1UpdateSql / DwhPostType1UpdateSql -- SQL equivalents that inject into the stored procedure rather than the SSIS Data Flow.

Data Warehouse Type2 Pipeline

Configure Type 2 insert pipeline logic that will be injected before the target transformation node

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the source object for the pipeline
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true, the Extension Point is applied to all the Objects associated with the Batch.
OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhType2Pipeline" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom SSIS Data Flow transformations into the Type 2 SCD processing path of a Data Mart load pipeline. During a Type 2 load, BimlFlex generates a Data Flow that identifies rows where tracked attributes have changed and creates new versioned rows with effective date ranges to preserve the full history. This extension point lets you inject transformations into that Type 2-specific path -- for example, adding derived columns, lookups, or data conversions that only apply to the rows being versioned.

Where It Fires

Consumed in 95.flx-i-process-type2.biml. The emitted Biml is placed inside the SSIS Data Flow task that handles the Type 2 SCD versioning logic. This extension point only fires when HasType2Columns==true.

When to Use It

  1. You need to add a Derived Column that sets the FlexRowEffectiveFromDate to a specific business date (rather than the system date) for new version rows of dim.Customer in BFX_DM.
  2. A data type conversion is required on Type 2 columns before the versioned rows are inserted into dim.Product in the LOAD_BFX_DM batch.
  3. You want to add a Lookup transformation that enriches new version rows with reference data (e.g., resolving a manager hierarchy) before they are written to dim.Address.
  4. Use this instead of DwhType1Pipeline when the transformation applies specifically to the Type 2 versioning path rather than the Type 1 overwrite path.
  5. A Conditional Split is needed to route certain Type 2 version rows to an error output based on validation rules before they are inserted into the target dimension.
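
Scenario 5 could be sketched as below. The validation expression and the CustomerName column are illustrative, and the Invalid output path would normally be connected to an error destination of your own:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhType2Pipeline" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<!-- Hypothetical validation: divert new version rows with a missing CustomerName. -->
<ConditionalSplit Name="CSPL - Validate Versions">
<InputPath OutputPathName="<#=inputPath #>" />
<OutputPaths>
<OutputPath Name="Invalid">
<Expression>ISNULL(CustomerName)</Expression>
</OutputPath>
</OutputPaths>
<!-- Valid rows continue down the default output to the Type 2 insert. -->
<# CustomOutput.OutputPathName = @"CSPL - Validate Versions.Default"; #>
</ConditionalSplit>
```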

Prerequisites

  • The target object must be a DIM table with Type 2 SCD columns in a Data Mart integration stage.
  • The object must have columns configured with ChangeType = Type2 in BimlFlex metadata (HasType2Columns==true).
  • SSIS deployment model; this extension point modifies the SSIS Data Flow.
  • You must connect to the inputPath parameter and set CustomOutput.OutputPathName.

Implementation Steps

  1. Open the BimlFlex App or BimlStudio and navigate to the Extension Points editor.
  2. Create a new Extension Point and set its type to DwhType2Pipeline.
  3. Set the target attribute to the specific Data Mart dimension.
  4. Define the Data Flow transformation.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DwhType2Pipeline" target="dim.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<DerivedColumns Name="DER - SetVersionMetadata">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column Name="VersionReason" DataType="String" Length="50">"Attribute Change"</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - SetVersionMetadata.Output"; #>
</DerivedColumns>
  5. Build and verify the transformation appears in the Type 2 Data Flow path.

Example

Before -- the Type 2 path sends versioned rows directly to the insert destination:

[Conditional Split - Type 2 Changes] --> [OLE_DST - dim.Customer (Type 2 Insert)]

After -- your derived column is inserted in the Type 2 path:

[Conditional Split - Type 2 Changes] --> [DER - SetVersionMetadata] --> [OLE_DST - dim.Customer (Type 2 Insert)]

Common Mistakes

Mistake | Symptom | Fix
Applying this to an object without Type 2 columns. | Extension point never fires; no transformation appears. | Verify the object has columns with ChangeType = Type2.
Not setting CustomOutput.OutputPathName. | The insert destination cannot find its input; build error. | Always set CustomOutput.OutputPathName.
Modifying effective date columns in a way that breaks the versioning logic. | Overlapping date ranges or incorrect IsCurrent flags in the target. | Be cautious when overriding FlexRowEffectiveFromDate or FlexRowIsCurrent.
Using this when the transformation should apply to all paths. | Only Type 2 inserts receive the transformation; Type 1 updates and new inserts are unaffected. | Use DwhTargetPipelinePre for transformations that should apply to all data paths.
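
If you do need to override the effective date (for example, to use a business date rather than the system date), a cautious sketch follows. BusinessDate is a hypothetical column assumed to already exist in the flow, and ReplaceExisting overwrites the value BimlFlex computed, so verify afterwards that the resulting date ranges do not overlap:

```xml
<DerivedColumns Name="DER - BusinessEffectiveDate">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<!-- Replace the system-assigned effective date with the business date from the flow. -->
<Column Name="FlexRowEffectiveFromDate" DataType="DateTime" ReplaceExisting="true">BusinessDate</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DER - BusinessEffectiveDate.Output"; #>
</DerivedColumns>
```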
  • DwhType1Pipeline -- fires in the Type 1 SCD path; use for transformations specific to Type 1 overwrites.
  • DwhInsertPipeline -- fires in the insert path for new rows; use for insert-specific transformations.
  • DwhTargetPipelinePre -- fires before the target area for all paths; use when the transformation applies universally.
  • DwhPreType2InsertSql / DwhPostType2InsertSql -- SQL equivalents that inject into the stored procedure rather than the SSIS Data Flow.