
Object Extension Points

The following Extension Points are defined for the Object category.

Append Source SQL

Append to the source query SQL generated by BimlFlex

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="AppendSourceSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
FOR READ ONLY WITH CS

What It Does

Appends custom text to the end of the source SELECT query that BimlFlex generates for an object. Rather than replacing the entire query, this extension point adds clauses such as query hints, locking directives, or additional filtering after the generated SQL. This is particularly useful in ELT patterns where you need to control how the source database handles the read operation.

Where It Fires

The extension point is consumed during package and script generation when the source query is assembled. It appends text directly after the auto-generated SELECT statement for the targeted object. It is used in ELT patterns where the source query is emitted as part of a stored procedure or direct SQL command.

When to Use It

  • You need to add FOR READ ONLY WITH CS (cursor stability) to the source query for SalesLT.Customer on AWLT_SRC when extracting from a DB2 source system.
  • You want to append OPTION (MAXDOP 1) or OPTION (RECOMPILE) query hints to the extraction query for SalesLT.SalesOrderHeader to control query plan behavior.
  • You need to add WITH (NOLOCK) or other table hints at the query level for a specific source object extraction.
  • Use this instead of Override SQL when the auto-generated SELECT is correct and you only need to add trailing clauses or hints rather than rewriting the entire query.
  • You want to apply a consistent query hint to all objects in a batch by setting ObjectInherit = true.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • The appended text must be syntactically valid when concatenated to the end of the generated SELECT statement.
  • Understand the source database platform (SQL Server, DB2, Oracle, etc.) to use the correct hint syntax.
  • Test the complete query (generated SELECT + your appended text) against the source database to verify it parses correctly.
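
The concatenation behaviour these prerequisites describe can be sketched as follows. This is an illustrative Python model only, not BimlFlex's actual implementation; the function name is invented for this example:

```python
# Illustrative sketch of appending a fragment to a generated SELECT.
# Models why the appended text needs separating whitespace or a newline.

def assemble_source_query(generated_select: str, appended_sql: str) -> str:
    """Join the generated SELECT and the appended fragment safely."""
    fragment = appended_sql.strip()
    if not fragment:
        return generated_select
    # Guard against "...[Customer]OPTION (MAXDOP 1)" run-ons by always
    # separating the two parts with a newline.
    return generated_select.rstrip() + "\n" + fragment

generated = "SELECT [CustomerID], [FirstName]\nFROM [SalesLT].[Customer]"
print(assemble_source_query(generated, "OPTION (MAXDOP 1)"))
```

If the fragment were concatenated without the newline guard, the hint would run directly into the closing bracket of the table name and the query would fail to parse.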

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Append Source SQL from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC) or leave it unset and use ObjectInherit = true to apply to all objects in the batch.
  4. Enter the text to append (e.g., FOR READ ONLY WITH CS or OPTION (MAXDOP 1)).
  5. Build the project and review the generated source query to confirm the appended text appears at the end.
  6. Execute a test extraction to verify the query runs successfully against the source database.

Example

Before (default generated source query):

SELECT [CustomerID]
,[FirstName]
,[LastName]
,[EmailAddress]
,[ModifiedDate]
FROM [SalesLT].[Customer]

After (with AppendSourceSql extension point):

SELECT [CustomerID]
,[FirstName]
,[LastName]
,[EmailAddress]
,[ModifiedDate]
FROM [SalesLT].[Customer]
OPTION (MAXDOP 1, RECOMPILE)

Common Mistakes

  • Adding a full SELECT statement instead of a clause: This extension point appends text, not replaces the query. Adding a full SELECT statement creates invalid SQL. Use Override SQL instead if you need a complete replacement.
  • Syntax errors from missing spaces: The appended text is concatenated directly; ensure it starts with appropriate whitespace or a newline to avoid running into the preceding SQL.
  • Using database-specific hints on the wrong platform: DB2 hints like FOR READ ONLY WITH CS cause errors on SQL Server and vice versa. Match your hint syntax to the source connection type.
  • Setting ObjectInherit to true with platform-specific SQL: If your batch contains objects from different source platforms, a platform-specific hint applied to all objects causes failures on incompatible sources.
  • Confusing with Override SQL: If you need to change column selection or add WHERE clauses, use Override SQL. Append Source SQL is only for trailing text after the existing query.
Related Extension Points

  • Override SQL -- Replaces the entire source query SQL rather than appending to it.
  • Source Create SQL -- Deploys a physical view or table to the source database.
  • Lookup SQL -- Overrides the SQL used for lookup transformations.

Analysis Metadata

Configure Analysis Metadata for an object. This extension point controls how BimlFlex generates SSAS (SQL Server Analysis Services) metadata for multidimensional and tabular models.

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the analysis metadata will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="AnalysisMetadata" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var objectViewModel = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage, this); #>
<AnalysisMetadata>
  <Tabular Name="<#=objectViewModel.ObjectName#>" DataCategory="Uncategorized" AutoCreateSinglePartition="true">
    <Columns>
      <# foreach (var column in objectViewModel.Columns) { #>
      <Column Name="<#=column.ColumnName#>" SourceColumnName="<#=column.ColumnName#>" />
      <# } #>
    </Columns>
  </Tabular>
</AnalysisMetadata>

What It Does

Defines how BimlFlex generates SQL Server Analysis Services (SSAS) metadata for Data Mart objects. This controls whether objects appear as dimensions or measure groups in multidimensional models, or as tables with typed columns in tabular models. The extension point wraps SSAS-specific metadata (attributes, hierarchies, relationships, partitions) around the object's column definitions.

Where It Fires

Referenced in 00.flx-i-get-ssas-multi.biml (multidimensional models) and 00.flx-i-get-ssas-tabular.biml (tabular models). Fires during the SSAS metadata generation phase after Data Mart objects are defined. The output is consumed as <AnalysisMetadata> elements containing either <Dimension> nodes (multidimensional) or <Tabular> nodes (tabular). Sub-extension points (SsasDimension, SsasAttributes, SsasAttribute, SsasRelationships, SsasAttributeHierarchies, SsasMeasureGroup) provide fine-grained control within the AnalysisMetadata structure.

When to Use It

  1. Configure a tabular model for a Data Mart dimension. Your dim.Customer table in BFX_DM needs specific column categorisations (SummarizeBy, DataCategory) for Power BI consumption via a tabular model.
  2. Define a multidimensional dimension with custom attributes. Your dim.Product needs a product category hierarchy with parent-child relationships that BimlFlex does not generate by default.
  3. Add measure groups for fact tables. Your fact.SalesOrder needs custom measure definitions (SUM, COUNT, AVERAGE) exposed through the SSAS cube.
  4. Use this instead of manually configuring SSAS projects when you want the analysis metadata to be generated alongside your Data Mart objects and stay in sync with metadata changes.

Prerequisites

  • Object types: Applies to Data Mart objects (dimensions and facts) — object types DMV (dimension view) and FCV (fact view)
  • Deployment target: SSAS multidimensional or tabular models
  • Required metadata: Objects must be defined in BimlFlex metadata with appropriate column mappings
  • Related extension points: Sub-extension points SsasDimension, SsasAttributes, SsasRelationships, SsasAttributeHierarchies, SsasMeasureGroup provide finer control within the AnalysisMetadata structure
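
The dynamic column generation recommended above can also be sketched outside of Biml. The following Python sketch mirrors the objectViewModel.Columns loop in the template; the function name and sample columns are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: generate <Tabular> column entries from a metadata
# column list, mirroring the objectViewModel.Columns loop in the template.
# Column names here are sample data, not from a live BimlFlex model.

def build_tabular_metadata(object_name, column_names):
    root = ET.Element("AnalysisMetadata")
    tabular = ET.SubElement(root, "Tabular", {
        "Name": object_name,
        "DataCategory": "Uncategorized",
        "AutoCreateSinglePartition": "true",
    })
    columns = ET.SubElement(tabular, "Columns")
    for name in column_names:
        # Keeping Name and SourceColumnName in sync avoids SSAS binding errors.
        ET.SubElement(columns, "Column", {"Name": name, "SourceColumnName": name})
    return ET.tostring(root, encoding="unicode")

print(build_tabular_metadata("Customer", ["Customer_BK", "CustomerName"]))
```

Generating the column list from metadata rather than hand-typing it is what keeps the SSAS definition in sync when columns are added or renamed.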

Implementation Steps

  1. In BimlFlex, open the Extension Points editor.
  2. Create a new extension point file. Select AnalysisMetadata as the type.
  3. Set the target attribute to the object name (e.g., dim.Customer).
  4. Define the <AnalysisMetadata> element with either <Tabular> (for tabular models) or <Dimension> (for multidimensional).
  5. Configure column properties, partitions, and hierarchies as needed.
  6. Build and verify the SSAS metadata is generated correctly.

A complete example template targeting dim.Customer:
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="AnalysisMetadata" target="dim.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var objectViewModel = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage, this); #>
<AnalysisMetadata>
  <Tabular Name="Customer" DataCategory="Uncategorized" AutoCreateSinglePartition="true">
    <Columns>
      <Column Name="Customer_BK" SummarizeBy="None" DataType="Automatic" SourceColumnName="Customer_BK" />
      <Column Name="CustomerName" SummarizeBy="None" DataType="Automatic" SourceColumnName="CustomerName" />
      <Column Name="EmailAddress" SummarizeBy="None" DataType="Automatic" SourceColumnName="EmailAddress" />
    </Columns>
    <Partitions>
      <Partition Name="Customer">
        <Source ConnectionName="BFX_DM">
          <Query>SELECT Customer_BK, CustomerName, EmailAddress FROM dim.Customer</Query>
        </Source>
      </Partition>
    </Partitions>
  </Tabular>
</AnalysisMetadata>

Example

Before (default — no SSAS metadata):

The Data Mart object dim.Customer generates only the SQL table. No SSAS metadata is produced.

After (AnalysisMetadata applied):

<AnalysisMetadata>
  <Tabular Name="Customer" DataCategory="Uncategorized" AutoCreateSinglePartition="true">
    <Columns>
      <Column Name="Customer_BK" SummarizeBy="None" DataType="Automatic" SourceColumnName="Customer_BK" />
      <Column Name="CustomerName" SummarizeBy="None" DataType="Automatic" SourceColumnName="CustomerName" />
    </Columns>
  </Tabular>
</AnalysisMetadata>

Common Mistakes

  • Mistake: Applying AnalysisMetadata to a staging or Data Vault object. Symptom: The metadata is generated but has no effect — SSAS models are built from Data Mart objects. Fix: Only apply to DM objects (dimensions and facts).
  • Mistake: Column names in the Tabular definition do not match the actual table columns. Symptom: SSAS processing fails with column binding errors. Fix: Use objectViewModel.Columns to dynamically generate column definitions that match the metadata.
  • Mistake: Missing the <Partitions> section in a tabular model. Symptom: The tabular model has no data source — processing returns empty results. Fix: Always include at least one <Partition> with a <Source> query.
Related Extension Points

  • CreateSql — Control the DDL for the underlying Data Mart table that the SSAS model reads from
  • OverrideMerge — Customise how data is loaded into the DM table before SSAS processing

Azure Post Archive Stage

Configure pipeline logic that will be injected after the blobs are archived

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
precedenceConstraint | String | Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | Set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="AzurePostArchiveStage" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var sourceObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage, "SRC");
var targetObject = new ObjectViewModel(table, sourceObject.TargetConnection.IntegrationStage, "TGT");
var persistentConnection = sourceObject.IsPersistent ? sourceObject.PersistentConnection : sourceObject.ProjectStageConnection;
persistentConnection = new[] {"AZB", "AZDLS"}.Contains(persistentConnection.ConnectionType) ? sourceObject.ProjectTargetConnection : persistentConnection;
#>
<# CustomOutput.ObjectInherit = true; #>
<ExecuteSQL Name="SQL - AzurePreArchiveStage" ConnectionName="<#=persistentConnection.Name#>">
  <# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
  <PrecedenceConstraints>
    <Inputs>
      <Input OutputPathName="<#=precedenceConstraint #>" />
    </Inputs>
  </PrecedenceConstraints>
  <# } #>
  <Parameters>
    <Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
  </Parameters>
  <DirectInput>EXEC <#=targetObject.SystemType.StoredProcedureName(sourceObject, targetObject.Schema, targetObject.StartDelimiter, targetObject.EndDelimiter)#> ?</DirectInput>
  <# CustomOutput.OutputPathName = @"SQL - AzurePreArchiveStage.Output"; #>
</ExecuteSQL>

What It Does

Injects custom control flow tasks after blobs have been archived in Azure-based SSIS packages. In Azure landing zone architectures, BimlFlex archives processed blobs (Azure Blob Storage or Azure Data Lake Storage) after data is extracted and staged. This extension point lets you add post-archive logic -- such as executing stored procedures, logging archive completion, or triggering downstream processes -- after the archive step completes.

Where It Fires

The extension point is consumed during control flow generation in Azure-based extraction packages. It fires once per object (or once per batch with ObjectInherit = true) after the blob archive step. The precedenceConstraint parameter connects to the preceding archive task. You must set CustomOutput.OutputPathName so subsequent tasks can chain after your custom post-archive logic.

When to Use It

  1. Execute a post-archive stored procedure. After archiving blobs for SalesLT.Customer in the Azure landing zone, execute a stored procedure that updates a tracking table with the archive completion timestamp.
  2. Trigger a downstream process. After the archive step completes for SalesLT.SalesOrderHeader, trigger a Data Vault load or notify an orchestration system.
  3. Clean up temporary Azure resources. After archiving, delete temporary blobs or containers that were used during the extraction process.
  4. Log archive metrics. Record the number of blobs archived, total bytes processed, or archive duration for monitoring.
  5. Apply to all objects in a batch. Set ObjectInherit = true to execute the same post-archive logic for every object in the batch.

Prerequisites

  • The object must be configured for Azure-based extraction (Azure Blob Storage or Azure Data Lake Storage connections).
  • The precedenceConstraint parameter must be handled to connect to the preceding archive task.
  • You must set CustomOutput.OutputPathName so downstream tasks can chain.
  • Connections referenced (e.g., persistent staging connection) must be available.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Azure Post Archive Stage from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects.
  4. Write the control flow task Biml including precedenceConstraint handling and OutputPathName.
  5. Build and verify the post-archive task appears after the archive step in the generated package.

Example

Before (default package flow -- archive and stop):

... -> Archive Blobs -> [end]

After (AzurePostArchiveStage adds a stored procedure call):

... -> Archive Blobs -> SQL - AzurePostArchiveStage -> [end]

Common Mistakes

  • Not setting OutputPathName: Without CustomOutput.OutputPathName, subsequent tasks cannot connect, potentially breaking the package.
  • Not handling precedenceConstraint: Omitting the conditional precedence constraint pattern causes connection errors.
  • Applying to non-Azure objects: This extension point only applies to Azure-based extraction packages. It has no effect on standard on-premises SSIS packages.
  • Template uses "AzurePreArchiveStage" in the name: Note that the default template's ExecuteSQL task is named SQL - AzurePreArchiveStage despite being the Post Archive extension point. Rename the task to avoid confusion.
Related Extension Points

  • Azure Pre Archive Stage -- Injects tasks before the blob archive step (counterpart to Post Archive).
  • Post Data Flow -- Injects tasks after the data flow in non-Azure packages.
  • Source File Archive Override -- Overrides file archiving for on-premises file-based extractions.

Azure Pre Archive Stage

Configure pipeline logic that will be injected before the blobs are archived

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
precedenceConstraint | String | Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | Set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="AzurePreArchiveStage" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# var sourceObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage, "SRC");
var targetObject = new ObjectViewModel(table, sourceObject.TargetConnection.IntegrationStage, "TGT");
var persistentConnection = sourceObject.IsPersistent ? sourceObject.PersistentConnection : sourceObject.ProjectStageConnection;
persistentConnection = new[] {"AZB", "AZDLS"}.Contains(persistentConnection.ConnectionType) ? sourceObject.ProjectTargetConnection : persistentConnection;
#>
<# CustomOutput.ObjectInherit = true; #>
<ExecuteSQL Name="SQL - AzurePreArchiveStage" ConnectionName="<#=persistentConnection.Name#>">
  <# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
  <PrecedenceConstraints>
    <Inputs>
      <Input OutputPathName="<#=precedenceConstraint #>" />
    </Inputs>
  </PrecedenceConstraints>
  <# } #>
  <Parameters>
    <Parameter Name="0" VariableName="User.ExecutionID" DataType="Int64" />
  </Parameters>
  <DirectInput>EXEC <#=targetObject.SystemType.StoredProcedureName(sourceObject, targetObject.Schema, targetObject.StartDelimiter, targetObject.EndDelimiter)#> ?</DirectInput>
  <# CustomOutput.OutputPathName = @"SQL - AzurePreArchiveStage.Output"; #>
</ExecuteSQL>

What It Does

Injects custom control flow tasks before blobs are archived in Azure-based SSIS packages. In Azure landing zone architectures, BimlFlex archives processed blobs after extraction. This extension point lets you add pre-archive logic -- such as executing stored procedures to process the staged data, validating data quality, or performing cleanup on the staging tables -- before the archive step moves or deletes the source blobs.

Where It Fires

The extension point is consumed during control flow generation in Azure-based extraction packages. It fires once per object (or once per batch with ObjectInherit = true) before the blob archive step. The precedenceConstraint parameter connects to the preceding task (typically the staging load). You must set CustomOutput.OutputPathName so the archive step can chain after your custom pre-archive logic.

When to Use It

  1. Execute a stored procedure before archiving. After staging data from SalesLT.Customer blobs, execute a stored procedure that merges staged data into the persistent staging table before the blobs are archived.
  2. Validate data before archive. Run data quality checks on the staged data from SalesLT.SalesOrderHeader. If validation fails, prevent archiving so the blobs can be reprocessed.
  3. Process staged data. Execute ELT stored procedures that transform the staged data from SalesLT.Product before the source blobs are moved to the archive location.
  4. Update a tracking table. Record the processing status in a control table before the blobs are archived.
  5. Apply to all objects in a batch. Set ObjectInherit = true to execute the same pre-archive logic for every object in the batch.

Prerequisites

  • The object must be configured for Azure-based extraction (Azure Blob Storage or Azure Data Lake Storage connections).
  • The precedenceConstraint parameter must be handled.
  • You must set CustomOutput.OutputPathName so the archive step can connect.
  • The persistent staging connection must be available for executing stored procedures.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Azure Pre Archive Stage from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects.
  4. Write the control flow task Biml including precedenceConstraint handling and OutputPathName.
  5. Build and verify the pre-archive task appears before the archive step in the generated package.

Example

Before (default package flow -- extract, stage, archive):

... -> Stage Data -> Archive Blobs

After (AzurePreArchiveStage adds a stored procedure call before archive):

... -> Stage Data -> SQL - AzurePreArchiveStage -> Archive Blobs

Common Mistakes

  • Not setting OutputPathName: Without CustomOutput.OutputPathName, the archive step cannot connect, and blobs are not archived.
  • Not handling precedenceConstraint: Omitting the conditional precedence constraint pattern causes connection errors.
  • Long-running procedures blocking archive: If the pre-archive stored procedure takes too long, it delays the archive step, potentially causing timeout issues. Optimize the procedure.
  • Applying to non-Azure objects: This extension point only applies to Azure-based extraction packages. It has no effect on standard on-premises packages.
Related Extension Points

  • Azure Post Archive Stage -- Injects tasks after the blob archive step (counterpart to Pre Archive).
  • Pre Data Flow -- Injects tasks before the data flow in non-Azure packages.
  • Source File Archive Override -- Overrides file archiving for on-premises file-based extractions.

Create SQL

Configure override Create DDL for an Object.

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the Object to which the SQL will be added.

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="CreateSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
IF NOT EXISTS (
SELECT *
FROM sys.schemas
WHERE NAME = N'awlt'
)
EXEC ('CREATE SCHEMA [awlt] AUTHORIZATION [dbo]')
GO

IF NOT EXISTS (
SELECT *
FROM sys.objects
WHERE object_id = OBJECT_ID(N'[awlt].[Account]')
AND type IN (N'U')
)
CREATE TABLE [awlt].[Account] (
-- Columns Definition
[AccountCodeAlternateKey] INT NOT NULL
,[ParentAccountCodeAlternateKey] INT
,[AccountDescription] NVARCHAR(50)
,[AccountType] NVARCHAR(50)
,[Operator] NVARCHAR(50)
,[CustomMembers] NVARCHAR(300)
,[ValueType] NVARCHAR(50)
,[CustomMemberOptions] NVARCHAR(200)
,[FlexRowEffectiveFromDate] DATETIME2(7) NOT NULL
,[FlexRowEffectiveToDate] DATETIME2(7) NOT NULL
,[FlexRowAuditId] BIGINT NOT NULL
,[FlexRowRecordSource] VARCHAR(10) NOT NULL
,[FlexRowSourceId] INT NOT NULL
,[FlexRowIsCurrent] BIT NOT NULL
,[FlexRowHash] VARCHAR(80)
,[FlexRowHashKey] VARCHAR(40)
-- Constraints
,CONSTRAINT [PK_awlt_Account] PRIMARY KEY CLUSTERED (
[AccountCodeAlternateKey] ASC
,[FlexRowEffectiveFromDate] ASC
) WITH (
PAD_INDEX = OFF
,IGNORE_DUP_KEY = OFF
)
ON "default"
) ON "default"
WITH (DATA_COMPRESSION = NONE)
GO

What It Does

Replaces the entire CREATE TABLE DDL statement that BimlFlex generates for an object. Instead of relying on the metadata-driven table definition, you provide the complete DDL script, giving you full control over column definitions, data types, constraints, compression settings, filegroup placement, and any other table-level options that the standard generation does not expose.

Where It Fires

The extension point is stored as an annotation during metadata processing and consumed during deployment in the 0.90.x-flx-deploy-*-script.biml files. It replaces the full CREATE TABLE statement for the targeted object. It fires once per object during the DDL script generation phase.

When to Use It

  • You need a custom CREATE TABLE for awlt.Customer in BFX_STG that includes data compression, specific filegroup placement, or partition schemes not available through metadata settings.
  • The staging table for awlt.SalesOrderHeader requires a schema check and conditional creation pattern (IF NOT EXISTS) that differs from the standard generated DDL.
  • You want to add inline CHECK constraints, DEFAULT bindings, or computed columns to a table in BFX_ODS that cannot be expressed through BimlFlex column metadata.
  • Use this instead of Override Primary Key SQL or Override Unique Key SQL when you need to change the entire table definition, not just a specific constraint.
  • Use this instead of Source Create SQL when the target object is a staging, ODS, Data Vault, or Data Mart table rather than a source-side object.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • Your custom DDL must define all columns that downstream pipelines, stored procedures, and hash calculations expect.
  • You must have CREATE TABLE permissions on the target database and schema.
  • If the table already exists, include appropriate IF NOT EXISTS or DROP/CREATE logic to make the script idempotent.
  • Coordinate with any Pre Create SQL or Post Create SQL extension points defined for the same object.
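
As a pre-deployment sanity check for the second prerequisite, a script can verify that a custom DDL script still mentions the framework columns the generated pipelines expect. The helper below is an illustrative sketch, not a BimlFlex feature, and the column list is an assumption drawn from the template above:

```python
import re

# Framework columns referenced by the generated pipelines (assumed here,
# based on the CreateSql template shown earlier in this section).
REQUIRED_FRAMEWORK_COLUMNS = [
    "FlexRowEffectiveFromDate",
    "FlexRowAuditId",
    "FlexRowRecordSource",
    "FlexRowHash",
]

def missing_framework_columns(ddl: str) -> list:
    """Return the required framework columns not defined in the DDL script."""
    return [c for c in REQUIRED_FRAMEWORK_COLUMNS
            if not re.search(rf"\[{c}\]", ddl)]

ddl = """CREATE TABLE [awlt].[Customer] (
  [CustomerID] INT NOT NULL,
  [FlexRowEffectiveFromDate] DATETIME2(7) NOT NULL,
  [FlexRowAuditId] BIGINT NOT NULL
)"""
print(missing_framework_columns(ddl))  # any columns still to add
```

Running a check like this before deployment catches the most common failure mode listed below: pipelines that break because a framework column was dropped from the custom DDL.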

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Create SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_STG).
  4. Write the complete CREATE TABLE DDL including schema creation, existence checks, column definitions, and all constraints.
  5. Include BimlFlex framework columns (FlexRowEffectiveFromDate, FlexRowAuditId, FlexRowRecordSource, etc.) to ensure pipeline compatibility.
  6. Build the project and verify the generated deployment script contains your custom DDL.
  7. Deploy the script and confirm the table structure matches expectations.

Example

Before (default auto-generated DDL):

CREATE TABLE [awlt].[Customer] (
[CustomerID] INT NOT NULL,
[FirstName] NVARCHAR(50),
[LastName] NVARCHAR(50),
[FlexRowEffectiveFromDate] DATETIME2(7) NOT NULL,
[FlexRowAuditId] BIGINT NOT NULL,
CONSTRAINT [PK_awlt_Customer] PRIMARY KEY CLUSTERED ([CustomerID] ASC)
)

After (with CreateSql extension point):

IF NOT EXISTS (SELECT * FROM sys.schemas WHERE name = N'awlt')
EXEC ('CREATE SCHEMA [awlt] AUTHORIZATION [dbo]')
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[awlt].[Customer]') AND type IN (N'U'))
CREATE TABLE [awlt].[Customer] (
[CustomerID] INT NOT NULL,
[FirstName] NVARCHAR(50),
[LastName] NVARCHAR(50),
[FlexRowEffectiveFromDate] DATETIME2(7) NOT NULL,
[FlexRowAuditId] BIGINT NOT NULL,
[FlexRowRecordSource] VARCHAR(10) NOT NULL,
CONSTRAINT [PK_awlt_Customer] PRIMARY KEY CLUSTERED (
[CustomerID] ASC,
[FlexRowEffectiveFromDate] ASC
) WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF)
ON [PRIMARY]
) ON [PRIMARY]
WITH (DATA_COMPRESSION = PAGE)
GO

Common Mistakes

  • Missing BimlFlex framework columns: Omitting columns like FlexRowAuditId, FlexRowHash, or FlexRowRecordSource causes pipeline failures because the generated stored procedures expect them.
  • Column data type mismatches: If column types in your custom DDL differ from the metadata (e.g., NVARCHAR(50) vs. NVARCHAR(100)), data truncation or conversion errors occur at runtime.
  • Forgetting idempotent checks: Without IF NOT EXISTS guards, re-deploying the script fails when the table already exists.
  • Not including GO separators: Schema creation and table creation must be in separate batches in SQL Server.
  • Overriding without necessity: If you only need to change the PK or add a post-deployment index, use the more targeted extension points (Override Primary Key SQL, Post Create SQL) instead of replacing the entire DDL.
Related Extension Points

  • Source Create SQL -- Overrides CREATE DDL specifically for source-side objects (views, tables in the source database).
  • Override Primary Key SQL -- Overrides just the PK constraint within the CREATE TABLE.
  • Override Unique Key SQL -- Overrides just the unique key constraint within the CREATE TABLE.
  • Pre Create SQL / Post Create SQL -- Inject SQL before or after the CREATE DDL.
  • Override Alter SQL -- Overrides ALTER TABLE statements for modifying existing tables.

Data Flow Override

Override the Data Flow

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added
precedenceConstraint | String | Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | Set CustomOutput.OutputPathName on the last task to connect it to the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DataflowOverride" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<Dataflow Name="DFT - Load MySource" DelayValidation="true">
  <# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
  <PrecedenceConstraints>
    <Inputs>
      <Input OutputPathName="<#=precedenceConstraint #>" />
    </Inputs>
  </PrecedenceConstraints>
  <# } #>
  <Transformations>
    <OleDbSource Name="OLE_SRC - MySource" ConnectionName="MySourceConnection">
      <DirectInput>SELECT * FROM [dbo].[MySourceTable]</DirectInput>
    </OleDbSource>
    <OleDbDestination Name="OLE_DST - MyTarget" ConnectionName="MyTargetConnection">
      <InputPath OutputPathName="OLE_SRC - MySource.Output" />
      <ExternalTableOutput Table="[dbo].[MyTargetTable]" />
    </OleDbDestination>
  </Transformations>
</Dataflow>
<# CustomOutput.OutputPathName = @"DFT - Load MySource.Output"; #>

What It Does

Replaces the entire SSIS data flow task that BimlFlex generates for an object. Instead of the standard data flow containing the BimlFlex source, transformations (hashing, lookups, derived columns), and destination, you provide a complete custom <Dataflow> element with your own source, transformations, and destination. The control flow tasks around the data flow (pre/post processing, logging) remain intact. You must set CustomOutput.OutputPathName so subsequent control flow tasks can chain after your custom data flow.

Where It Fires

The extension point is consumed during SSIS control flow generation and replaces the data flow task inside the SEQC - Main sequence container. It fires once per object when the data flow is rendered. The precedenceConstraint parameter connects your data flow to preceding control flow tasks. Unlike Override Main, which replaces the entire sequence container, Data Flow Override replaces only the data flow task, preserving pre/post processing, logging, and other control flow tasks.

When to Use It

  • You need a completely custom data flow for extracting SalesLT.Customer from AWLT_SRC into awlt.Customer on BFX_STG that cannot be achieved by combining Source Override, Source Pipeline, Target Pipeline, and Target Override.
  • You want to implement a data flow with multiple sources (e.g., merging SalesLT.Customer with a lookup table) that the standard BimlFlex pattern does not support.
  • You need a data flow with a non-standard transformation chain (e.g., custom script components, third-party transformations) for SalesLT.SalesOrderHeader.
  • Use this instead of Override Main when you want to keep the control flow tasks (pre/post processing, logging) but replace only the data flow logic.
  • Use this instead of Source Override + Target Override when you need to replace the entire transformation chain, not just the endpoints.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • You must set CustomOutput.OutputPathName to the output path of your custom data flow task so subsequent control flow tasks can connect.
  • The precedenceConstraint parameter must be used to connect your data flow to preceding tasks.
  • All connections referenced in your custom data flow must be defined in the project.
  • Column outputs from your data flow must match what downstream processes expect (if applicable).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Data Flow Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete <Dataflow> element including source, transformations, and destination.
  5. Include the precedenceConstraint connection logic and set CustomOutput.OutputPathName.
  6. Build the project in BimlStudio and verify the data flow in the generated package.
  7. Execute a test run to confirm end-to-end data movement through your custom data flow.

Example

Before (default BimlFlex data flow):

DFT - Extract SalesLT.Customer
├─ OLE_SRC - SalesLT.Customer
├─ [Hashing/Transformations]
└─ OLE_DST - awlt.Customer

After (with DataflowOverride replacing the entire data flow):

<Dataflow Name="DFT - Load Customer Custom" DelayValidation="true">
    <PrecedenceConstraints>
        <Inputs>
            <Input OutputPathName="<#=precedenceConstraint #>" />
        </Inputs>
    </PrecedenceConstraints>
    <Transformations>
        <OleDbSource Name="OLE_SRC - Customer" ConnectionName="AWLT_SRC">
            <DirectInput>SELECT [CustomerID], [FirstName], [LastName] FROM [SalesLT].[Customer]</DirectInput>
        </OleDbSource>
        <OleDbDestination Name="OLE_DST - Customer" ConnectionName="BFX_STG">
            <InputPath OutputPathName="OLE_SRC - Customer.Output" />
            <ExternalTableOutput Table="[awlt].[Customer]" />
        </OleDbDestination>
    </Transformations>
</Dataflow>
<# CustomOutput.OutputPathName = @"DFT - Load Customer Custom.Output"; #>

Common Mistakes

  • Forgetting to set OutputPathName: Without CustomOutput.OutputPathName, subsequent control flow tasks (Post Data Flow, post-processing) cannot connect to your data flow, breaking the execution chain.
  • Not handling the precedenceConstraint: Omitting the precedenceConstraint connection disconnects your data flow from preceding tasks (Pre Data Flow, pre-processing), causing execution order issues.
  • Losing BimlFlex transformations: All BimlFlex-generated transformations (hashing, row numbering, audit columns, change detection) are lost. You must implement any you still need in your custom data flow.
  • Using Data Flow Override when targeted extension points suffice: If you only need to change the source, use Source Override. If you only need to add transformations, use Source Pipeline or Target Pipeline. Data Flow Override should be reserved for cases where the entire data flow must be custom.
  • Mismatched data flow name in OutputPathName: The name in CustomOutput.OutputPathName must match the Name attribute of your <Dataflow> element plus .Output.

Related Extension Points

  • Override Main -- Replaces the entire SEQC - Main container including all control flow tasks (more invasive than Data Flow Override).
  • Source Override -- Replaces only the source component within the data flow.
  • Target Override -- Replaces only the target destination within the data flow.
  • Source Pipeline / Target Pipeline -- Injects transformations inside the data flow without replacing it.
  • Data Flow Properties -- Adds expressions to the data flow task container.
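
As a minimal sketch of the OutputPathName naming rule called out under Common Mistakes: the value assigned to CustomOutput.OutputPathName must be the Dataflow element's Name attribute with the literal suffix .Output appended. The names below are illustrative only.

```xml
<Dataflow Name="DFT - Load Customer Custom" DelayValidation="true">
    <!-- source, transformations, and destination go here -->
</Dataflow>
<!-- Must match the Name attribute above, plus ".Output": -->
<# CustomOutput.OutputPathName = @"DFT - Load Customer Custom.Output"; #>
```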

Data Flow Properties

Configure logic that will be injected into the Data Flow element

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the Data Flow element will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DataflowProperties" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<Expression ExternalProperty="SsisBufferTempStoragePath">@[User::SsisBufferTempStoragePath]</Expression>
<Expression ExternalProperty="SsisBLOBTempStoragePath">@[User::SsisBLOBTempStoragePath]</Expression>

What It Does

Adds custom property expressions to the SSIS data flow task container. These expressions control data flow engine behavior at the task level -- such as buffer temp storage paths, BLOB temp storage paths, default buffer sizes, and engine thread counts. This is distinct from adding properties to individual source or destination components; it configures the data flow engine itself.

Where It Fires

The extension point is consumed during data flow task generation and adds <Expression> elements to the Dataflow task's <Expressions> collection. It is referenced in two separate Biml template files that generate data flow tasks. It fires once per object when the data flow task container is rendered.

When to Use It

  • You need to redirect SSIS buffer temp storage to a faster disk (e.g., SSD) for the SalesLT.Customer extraction by setting SsisBufferTempStoragePath to a variable-controlled path.
  • You want to configure the BLOB temp storage path for SalesLT.SalesOrderHeader extraction where large text or binary columns are extracted, preventing temp files from filling the default temp directory.
  • You need to override DefaultBufferSize or DefaultBufferMaxRows via expressions to optimize memory usage for a specific high-volume extraction.
  • Use this instead of Source Property or Target Property when you need to configure data flow engine-level settings rather than individual component properties.
  • You want to apply the same data flow properties to all objects in a batch by setting ObjectInherit = true.
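
For the buffer-tuning scenario above, a hedged sketch of what the extension point body could contain. DefaultBufferSize, DefaultBufferMaxRows, and EngineThreads are standard SSIS data flow task properties; the values shown are illustrative only and should be tuned for your workload.

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DataflowProperties" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<!-- Illustrative values; literal numbers are valid SSIS expressions. -->
<Expression ExternalProperty="DefaultBufferSize">104857600</Expression>
<Expression ExternalProperty="DefaultBufferMaxRows">50000</Expression>
<Expression ExternalProperty="EngineThreads">4</Expression>
```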

Prerequisites

  • The object must exist in BimlFlex metadata with a valid data flow.
  • The variables referenced in expressions (e.g., User::SsisBufferTempStoragePath) must be defined in the package, either through BimlFlex metadata or via a Package Variable extension point.
  • The ExternalProperty values must be valid SSIS data flow task properties.
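
As a hedged sketch of the variable prerequisite: a package variable such as User::SsisBufferTempStoragePath could be declared through a Package Variable extension point along these lines. The path value is illustrative.

```xml
<!-- Illustrative path; point this at fast local storage in your environment. -->
<Variable Name="SsisBufferTempStoragePath" DataType="String">E:\SsisTemp\Buffers</Variable>
```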

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Data Flow Properties from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC) or leave it unset and use ObjectInherit = true for all objects.
  4. Add one or more <Expression> elements with valid data flow task property names and SSIS expression values.
  5. Ensure the referenced variables are defined in the package.
  6. Build the project in BimlStudio and verify the expressions appear on the data flow task in the generated package.
  7. Execute the package and confirm the data flow engine uses the specified property values.

Example

Before (default data flow task without custom expressions):

<Dataflow Name="DFT - Extract SalesLT.Customer" DelayValidation="true">
    <Transformations>
        ...
    </Transformations>
</Dataflow>

After (with DataflowProperties adding buffer path expressions):

<Dataflow Name="DFT - Extract SalesLT.Customer" DelayValidation="true">
    <Expressions>
        <Expression ExternalProperty="SsisBufferTempStoragePath">@[User::SsisBufferTempStoragePath]</Expression>
        <Expression ExternalProperty="SsisBLOBTempStoragePath">@[User::SsisBLOBTempStoragePath]</Expression>
    </Expressions>
    <Transformations>
        ...
    </Transformations>
</Dataflow>

Common Mistakes

  • Invalid ExternalProperty names: The property name must match a valid SSIS data flow task property exactly (e.g., SsisBufferTempStoragePath, DefaultBufferSize). Misspelled names cause build errors.
  • Referencing undefined variables: Using @[User::SsisBufferTempStoragePath] without defining the variable causes a validation error. Define variables via Package Variable extension points or BimlFlex settings.
  • Confusing with Source/Target Property: Data Flow Properties configures the data flow task container, not individual source or destination components. Use Source Property or Target Property for component-level properties.
  • Expression syntax errors: SSIS expression syntax differs from SQL. Use @[User::VariableName] for package variables and @[$Project::ParameterName] for project parameters.

Related Extension Points

  • Source Property -- Adds properties to the source component (component-level, not task-level).
  • Target Property -- Adds properties to the target destination component.
  • Data Flow Override -- Replaces the entire data flow task (includes the ability to set any property).
  • Package Variable -- Defines custom package variables referenced in data flow expressions.

Delete Detect Apply Dv Override

Override the SQL that is used by the Delete Detection Raw Data Vault process

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
precedenceConstraint | String | Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DeleteDetectionApplyDv" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = true; #>
<#
var psaIntegrationStage = "PSA";
var stagingTable = new ObjectViewModel(table, "STG", "STG");
var persistentTable = new ObjectViewModel(table, "PSA", "PSA");
var tableConfigurations = new TableConfigurations(table, "STG");
var psaTableConfigurations = new TableConfigurations(table, "PSA");
var enableEndDatePsa = tableConfigurations.EnableEndDatePsa;
var enableEndDateDv = tableConfigurations.DvEndDateSatellite;
var isCurrent = tableConfigurations.RowIsCurrent.PersistentStagingAttribute;
var deleteObjectNamePattern = tableConfigurations.DeleteObjectNamePattern;

var stagingConnection = stagingTable.Connection;
var stagingScopedName = stagingTable.ScopedName;
var stagingSchemaQualifiedName = stagingTable.SchemaQualifiedName;
var stagingObjectName = stagingTable.ObjectName;
var stagingQualifiedName = stagingTable.QualifiedName;

var deleteScopedName = deleteObjectNamePattern.Replace("@@this", stagingScopedName);

var sd = persistentTable.StartDelimiter;
var ed = persistentTable.EndDelimiter;
var persistentScopedName = persistentTable.ScopedName;
var persistentSchemaQualifiedName = persistentTable.SchemaQualifiedName;
var persistentQualifiedName = persistentTable.QualifiedName;
var persistentFullyQualifiedName = persistentSchemaQualifiedName == persistentQualifiedName ? $"{sd}{persistentTable.Database}{ed}.{persistentQualifiedName}" : persistentQualifiedName;

var qualifiedEffectiveFromDate = tableConfigurations.QualifiedRowEffectiveFromDate;
var qualifiedEffectiveToDate = tableConfigurations.QualifiedRowEffectiveToDate;
var qualifiedRowChangeType = tableConfigurations.QualifiedRowChangeType;
var indexPrimaryKeyColumns = table.GetIndexPrimaryKeyColumns(psaIntegrationStage);
var joinClause = "ON\t" + indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed + " = DEL." + sd + s + ed)
.Aggregate((current, next) => current + "\nAND\t" + next);
var currentListJoinClause = "ON\t" + indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed + " = CL." + sd + s + ed)
.Aggregate((current, next) => current + "\nAND\t" + next);
var sqlPsaSelectPk = indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed)
.Aggregate((current, next) => current + "\n\t\t," + next);
var whereClause = "WHERE\t" + indexPrimaryKeyColumns.Select(s => "DEL." + sd + s + ed + " IS NOT NULL").First();

var psaPkList = "\t" + indexPrimaryKeyColumns.Select(s => sd + s + ed)
.Aggregate((current, next) => current + ", " + next);

var groupByClause = "\t" + indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed)
.Aggregate((current, next) => current + "\n," + next);

var deleteTable = RootNode.Tables[deleteScopedName];
var deleteTableDatabaseQualifiedName = deleteTable.DatabaseQualifiedName;
var deleteTableSchemaQualifiedName = deleteTable.SchemaQualifiedName;

var mappedHubObject = table.MetadataModelDynamicObject.Columns
.Where(e => e.Object.RelatedItem == table && e.TargetColumn.RelatedItem != null && new []{"HUB"}.Contains(e.TargetColumn.RelatedItem.Object.RelatedItem.ObjectType))
.Select(s => s.TargetColumn.RelatedItem.Object.RelatedItem).SingleOrDefault();
var mappedLinkColumns = table.MetadataModelDynamicObject.Columns
.Where(e => e.Object.RelatedItem == table && e.TargetColumn.RelatedItem != null && new []{"LNK"}.Contains(e.TargetColumn.RelatedItem.Object.RelatedItem.ObjectType));

var mappedSatelliteObjects = table.MetadataModelDynamicObject.Columns
.Where(e => e.Object.RelatedItem == table && e.TargetColumn.RelatedItem != null && new []{"LSAT","SAT"}.Contains(e.TargetColumn.RelatedItem.Object.RelatedItem.ObjectType))
.Select(s => s.TargetColumn.RelatedItem.Object.RelatedItem).Distinct();

var sqlPsaMultiActiveKeys = stagingTable.Columns.Where(cl => cl.IsPrimaryKey == "Y" && cl.ChangeType == "MAK");
var sqlPsaCompareMultiActiveKeys = "";
var sqlPsaSelectMultiActiveKeys = "";
var sqlPsaSatSelectMultiActiveKeys = "";
foreach (var multiActiveKey in sqlPsaMultiActiveKeys)
{
var sqlCompareMultiActiveKeys = multiActiveKey.GetSqlToBk(stagingTable.StartDelimiter, stagingTable.EndDelimiter, "PSA", "DV");
sqlPsaSelectMultiActiveKeys += $"\n\t\t\t\t,{sqlCompareMultiActiveKeys}";
sqlPsaSatSelectMultiActiveKeys += $"\n\t\t\t\t,{sqlCompareMultiActiveKeys} AS {sd}{multiActiveKey.GetColumnName()}{ed}";
sqlPsaCompareMultiActiveKeys += $"\n\t\t\tAND\t{sqlCompareMultiActiveKeys} = {sqlCompareMultiActiveKeys.Replace("PSA.", "DEL.")}";
}
#>
<# foreach(var satellite in mappedSatelliteObjects) {
var satellitePk = satellite.GetPrimaryKey();
var satelliteObjectType = satellite.ObjectType;
var hubPk = satellitePk.ReferenceColumn.RelatedItem;
var hubObject = hubPk.Object.RelatedItem;
var dvConnection = satellite.Connection.RelatedItem;
var satelliteTable = new ObjectViewModel(satellite, "DV", "SAT");
var hubTable = new ObjectViewModel(hubPk.Object.RelatedItem, "DV", "HUB");
var satelliteConfigurations = new TableConfigurations(satellite, "SAT");

var satelliteConfigurationsQualifiedRowIsCurrent = satelliteConfigurations.RowIsCurrent.SatelliteAttribute == "IGN" ? "" : satelliteConfigurations.QualifiedRowIsCurrent;

var sqlSatMultiActive = satellite.GetSqlSatMultiActive(satelliteTable.StartDelimiter, satelliteTable.EndDelimiter);
var sqlMultiActiveKeys = sqlSatMultiActive.Item1;
var sqlCompareMultiActiveKeys = sqlSatMultiActive.Item2;
var qualifiedSurrogateKey = satelliteTable.StartDelimiter + satellitePk.GetColumnName() + satelliteTable.EndDelimiter;

var qualifiedHubSurrogateKey = hubTable.StartDelimiter + hubPk.GetColumnName() + hubTable.EndDelimiter;
var linkHubJoinSk = "";
var linkHubJoinBk = "";
var linkPsaBk = "";
var psaKeyJoin = "";
var lsatJoinTest = "";
var hubAlias = "";

if (satelliteObjectType == "LSAT" && mappedHubObject == null){
var linkHubColumns = mappedLinkColumns.Where(s => s.TargetColumn.RelatedItem != null && s.TargetColumn.RelatedItem.Object.RelatedItem == hubObject && s.TargetColumn.RelatedItem.IsPrimaryKey != "Y");
var iHub = 1;
foreach(var linkHubColumn in linkHubColumns){
var targetLinkColumn = linkHubColumn.TargetColumn.RelatedItem;
var targetLinkHubColumn = targetLinkColumn.ReferenceColumn.RelatedItem;
if (targetLinkHubColumn == null) continue;
var linkHubTable = new ObjectViewModel(targetLinkHubColumn.Object.RelatedItem, "DV", "HUB");
//.Where(c => c.ReferenceColumn.RelatedItem != null && new []{"HUB"}.Contains(c.ReferenceColumn.RelatedItem.Object.RelatedItem.ObjectType) && c.Object.RelatedItem == hubObject);

linkHubJoinSk += $"\n\t\tINNER JOIN {linkHubTable.QualifiedName} H{iHub}\n\t\t\tON\tHUB.{sd}{targetLinkColumn.GetColumnName()}{ed} = H{iHub}.{linkHubTable.PrimaryKeyQualifiedName}";
linkHubJoinBk += $"\n\t\tAND\tH{iHub}.{linkHubTable.IntegrationKeyQualifiedName} = PSA.{sd}{linkHubColumn.GetColumnName()}{ed}";
var linkPsaBkExpression = linkHubColumn.GetSqlToBk(sd, ed, "PSA", "PSA");
linkPsaBk += $",{linkPsaBkExpression} AS {sd}{linkHubColumn.GetColumnName()}{ed}";

lsatJoinTest += $"DEL.{sd}{linkHubColumn.GetColumnName()}{ed} = H{iHub}.{linkHubTable.IntegrationKeyQualifiedName} AND ";
iHub ++;

}
lsatJoinTest = lsatJoinTest.Substring(0, lsatJoinTest.Length-4);


linkPsaBk = linkPsaBk.Substring(1);
var lsatHubTable = new ObjectViewModel(mappedHubObject, "DV", "HUB");
//linkHubJoin = $"\n\t\tINNER JOIN {lsatHubTable.QualifiedName} LHUB\n\t\t\tON\tHUB.[{linkHubColumn.GetColumnName()}] = LHUB.[{lsatHubTable.GetPrimaryKey().GetColumnName()}]";
//psaKeyJoin = $"\tON\tLHUB.[{lsatHubTable.BusinessKey.GetColumnName()}] = ";
} else if (satelliteObjectType == "LSAT" && mappedHubObject != null) {
var lsatHubTable = new ObjectViewModel(mappedHubObject, "DV", "HUB");
linkHubJoinSk = $"\n\t\tINNER JOIN {lsatHubTable.QualifiedName} LHUB\n\t\t\tON\tHUB.{lsatHubTable.PrimaryKeyQualifiedName} = LHUB.{lsatHubTable.PrimaryKeyQualifiedName}";
linkHubJoinBk = $"\n\t\tAND\tLHUB.{lsatHubTable.IntegrationKeyQualifiedName} = PSA.{stagingTable.IntegrationKeyQualifiedName}";
hubAlias = "LHUB";
} else {
linkHubJoinBk = $"\n\t\tAND HUB.{hubTable.IntegrationKeyQualifiedName} = PSA.{stagingTable.IntegrationKeyQualifiedName}";
hubAlias = "HUB";
}
#>
<ExecuteSQL Name="SQL - Insert Deleted Into <#=satellite.ObjectName.MakeSsisSafe()#>" ConnectionName="<#=dvConnection.Name #>" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>DECLARE @<#=satelliteConfigurations.RowEffectiveFromDateValue#> DATETIME2(7)
SELECT @<#=satelliteConfigurations.RowEffectiveFromDateValue#> = ISNULL(MAX(<#=tableConfigurations.QualifiedRowEffectiveFromDate#>), GETDATE()) FROM <#=deleteTableDatabaseQualifiedName#>

DECLARE @RowsInDelTable INT
SELECT @RowsInDelTable = CASE WHEN EXISTS (SELECT 1 FROM <#=deleteTableDatabaseQualifiedName#>) THEN 1 ELSE 0 END
IF @RowsInDelTable = 1
BEGIN
BEGIN TRANSACTION [<#=stagingTable.Id.Replace("-", "")#>]
BEGIN TRY
;WITH currentList AS
(
SELECT *
FROM
(
SELECT
<#=satellitePk.GetColumnName() #>
,<#=qualifiedRowChangeType #>
,ROW_NUMBER() OVER (PARTITION BY <#=satellitePk.GetColumnName() #> ORDER BY <#=qualifiedEffectiveFromDate #> DESC) as rn
FROM <#=satelliteTable.QualifiedName #>
) cur
WHERE rn = 1 AND <#=qualifiedRowChangeType #> &lt;&gt; 'D'
)
INSERT INTO <#=satelliteTable.QualifiedName #>
(<#=satelliteTable.InsertColumnsList.Replace("SAT.", "")#>)
SELECT <#=satelliteTable.InsertColumnsList.Replace($"SAT.{qualifiedRowChangeType}", $"CONVERT(VARCHAR(1), 'D') AS {qualifiedRowChangeType}").Replace($"SAT.{satelliteConfigurations.QualifiedRowEffectiveFromDate}", $"@{satelliteConfigurations.RowEffectiveFromDateValue} AS {satelliteConfigurations.QualifiedRowEffectiveFromDate}")#>
FROM <#=hubTable.QualifiedName #> HUB
INNER JOIN <#=satelliteTable.QualifiedName #> SAT
ON HUB.[<#=hubPk.GetColumnName()#>] = SAT.[<#=satellitePk.GetColumnName() #>]<#=linkHubJoinSk #>
AND EXISTS (
SELECT 1 FROM (
SELECT <#if (!string.IsNullOrEmpty(linkPsaBk)) {#><#=linkPsaBk #><#} else {#><#=stagingTable.IntegrationKey.GetSqlToBk(sd, ed, "PSA", "DV")#> AS <#=stagingTable.IntegrationKeyQualifiedName#>
<#} #><#=sqlPsaSatSelectMultiActiveKeys #>
FROM <#=deleteTableDatabaseQualifiedName #> PSA ) AS DEL
WHERE
<#if ( !string.IsNullOrEmpty(linkPsaBk)) {#><#=lsatJoinTest #><#} else {#> DEL.<#=stagingTable.IntegrationKeyQualifiedName#> = <#=hubAlias #>.<#=stagingTable.IntegrationKeyQualifiedName#>
<#} #><#=sqlCompareMultiActiveKeys.Replace("TGT.", "SAT.").Replace("SRC.", "DEL.") #>
)
INNER JOIN currentList CL ON CL.[<#=satellitePk.GetColumnName()#>] = SAT.[<#=satellitePk.GetColumnName() #>]

COMMIT TRANSACTION [<#=stagingTable.Id.Replace("-", "")#>]
END TRY
BEGIN CATCH
DECLARE @ErrorMessage NVARCHAR(4000), @ErrorSeverity INT, @ErrorState INT;
SELECT @ErrorMessage = ERROR_MESSAGE(), @ErrorSeverity = ERROR_SEVERITY(), @ErrorState = ERROR_STATE();
RAISERROR ( @ErrorMessage, @ErrorSeverity, @ErrorState );
ROLLBACK TRANSACTION [<#=stagingTable.Id.Replace("-", "")#>]
END CATCH
END
</DirectInput>
</ExecuteSQL>
<# } #>

What It Does

Replaces the SQL logic that applies delete detection results to the Raw Data Vault. When BimlFlex detects that source records have been deleted (via the delete detection process), this extension point controls how those deletions are propagated to Data Vault satellites -- typically by inserting a row with FlexRowChangeType = 'D' into each affected satellite table. The default template handles hub lookups, satellite inserts, multi-active key logic, and link-satellite join patterns.
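
Stripped of the metadata-driven plumbing, the pattern the default template generates looks roughly like the following. All table and column names here are illustrative (modeled on this document's examples); the real statement is assembled from BimlFlex metadata and additionally handles multi-active keys, link satellites, and the currentList filter that restricts the insert to rows not already marked deleted.

```sql
-- Simplified sketch only; the generated SQL is assembled from BimlFlex metadata.
DECLARE @FlexRowEffectiveFromDate DATETIME2(7) = GETDATE();

INSERT INTO [dv].[SAT_Customer_Details_awlt]
    ([CustomerSK], [FlexRowEffectiveFromDate], [FlexRowChangeType], [FirstName], [LastName])
SELECT
    SAT.[CustomerSK],
    @FlexRowEffectiveFromDate,
    'D',                          -- delete marker row
    SAT.[FirstName],
    SAT.[LastName]
FROM [dv].[HUB_Customer] HUB
INNER JOIN [dv].[SAT_Customer_Details_awlt] SAT
    ON HUB.[CustomerSK] = SAT.[CustomerSK]
WHERE EXISTS (                    -- key appears in the delete detection results
    SELECT 1
    FROM [awlt_del].[Customer] DEL
    WHERE DEL.[CustomerID] = HUB.[CustomerID]);
```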

Where It Fires

The extension point is consumed during the delete detection package generation for Data Vault targets. It fires once per object and emits ExecuteSQL tasks that insert "deleted" marker rows into satellite tables. The template iterates over all mapped satellite objects and generates an INSERT statement for each one, joining the delete detection results with hub and satellite tables to produce the correct surrogate key references.

When to Use It

  1. Customize the delete marker logic. The default inserts a row with FlexRowChangeType = 'D'. You need to also update the FlexRowEffectiveToDate on the previous current record when a deletion is detected for SalesLT.Customer in BFX_DV.
  2. Handle multi-active satellites differently. Your dv.SAT_Customer_Details_awlt has multi-active keys and the default join logic does not match your pattern.
  3. Skip certain satellites. Not all satellites should receive delete markers. Override to exclude specific satellites from the delete detection apply step.
  4. Add transaction handling. Wrap the delete detection apply in a custom transaction scope with different error handling than the default.
  5. Use this when the default Data Vault delete detection apply logic does not match your modeling pattern (e.g., pit tables, bridge tables, or custom satellite structures).

Prerequisites

  • Delete detection must be enabled for the object in BimlFlex metadata.
  • The object must have Data Vault target mappings (hubs, satellites, links).
  • The delete detection process must have already identified deleted records (via Delete Detect Data Flow Override or default logic).
  • The delete staging table (produced by the delete detection data flow) must be populated.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Delete Detect Apply Dv Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Customize the SQL that inserts delete markers into the satellite tables.
  5. Handle the precedenceConstraint parameter for control flow chaining.
  6. Build and test the delete detection package end-to-end.

Example

Before (default delete detection apply):

The default template inserts rows with FlexRowChangeType = 'D' into each mapped satellite, using hub lookups to resolve surrogate keys.

After (custom delete detection that also end-dates the previous record):

The custom logic adds an UPDATE statement to set FlexRowEffectiveToDate on the current record before inserting the delete marker row.
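
A hedged sketch of what that additional statement could look like, reusing the illustrative names from this document; the '9999-12-31' open end date follows the convention shown in the PSA template, and your model's column names may differ.

```sql
-- Illustrative only: end-date the current satellite row
-- before the delete marker row is inserted.
UPDATE SAT
SET    SAT.[FlexRowEffectiveToDate] = @FlexRowEffectiveFromDate
FROM   [dv].[SAT_Customer_Details_awlt] SAT
WHERE  SAT.[FlexRowEffectiveToDate] = '9999-12-31'
  AND EXISTS (
        SELECT 1
        FROM [awlt_del].[Customer] DEL
        INNER JOIN [dv].[HUB_Customer] HUB
            ON HUB.[CustomerID] = DEL.[CustomerID]
        WHERE HUB.[CustomerSK] = SAT.[CustomerSK]);
```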

Common Mistakes

  • Breaking the hub/satellite join logic: The default template carefully joins delete detection results with hub tables to resolve surrogate keys. Modifying the join incorrectly causes orphaned satellite rows or missing delete markers.
  • Not handling multi-active keys: If the satellite has multi-active keys, the delete detection must account for them in the join. The default template handles this, but custom overrides often miss it.
  • Missing transaction handling: The default wraps the operation in a transaction. Removing the transaction can leave the Data Vault in an inconsistent state if the process fails partway.
  • Not handling the precedenceConstraint: Omitting the precedence constraint disconnects the task from the control flow.

Related Extension Points

  • Delete Detect Apply Psa Override -- Applies delete detection to the Persistent Staging Area (instead of Data Vault).
  • Delete Detect Data Flow Override -- Overrides the data flow that identifies deleted records.
  • Delete Detect Raw File Override -- Overrides the raw file format used by delete detection.
  • Delete Initialize Raw File Data Flow Override -- Overrides the initialization of the raw file used for delete detection.

Delete Detect Apply Psa Override

Override the SQL that is used by the Delete Detection Persistent Staging process

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
precedenceConstraint | String | Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DeleteDetectionApplyPsa" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = true; #>
<#
var psaIntegrationStage = "PSA";
var stagingTable = new ObjectViewModel(table, "STG", "STG");
var persistentTable = new ObjectViewModel(table, "PSA", "PSA");
var tableConfigurations = new TableConfigurations(table, "STG");
var psaTableConfigurations = new TableConfigurations(table, "PSA");
var enableEndDatePsa = tableConfigurations.EnableEndDatePsa;
var enableEndDateDv = tableConfigurations.DvEndDateSatellite;
var isCurrent = tableConfigurations.RowIsCurrent.PersistentStagingAttribute;
var deleteObjectNamePattern = tableConfigurations.DeleteObjectNamePattern;

var stagingConnection = stagingTable.Connection;
var stagingScopedName = stagingTable.ScopedName;
var stagingSchemaQualifiedName = stagingTable.SchemaQualifiedName;
var stagingObjectName = stagingTable.ObjectName;
var stagingQualifiedName = stagingTable.QualifiedName;

var deleteScopedName = deleteObjectNamePattern.Replace("@@this", stagingScopedName);

var sd = persistentTable.StartDelimiter;
var ed = persistentTable.EndDelimiter;
var persistentScopedName = persistentTable.ScopedName;
var persistentSchemaQualifiedName = persistentTable.SchemaQualifiedName;
var persistentQualifiedName = persistentTable.QualifiedName;
var persistentFullyQualifiedName = persistentSchemaQualifiedName == persistentQualifiedName ? $"{sd}{persistentTable.Database}{ed}.{persistentQualifiedName}" : persistentQualifiedName;

var qualifiedEffectiveFromDate = tableConfigurations.QualifiedRowEffectiveFromDate;
var qualifiedEffectiveToDate = tableConfigurations.QualifiedRowEffectiveToDate;
var qualifiedRowChangeType = tableConfigurations.QualifiedRowChangeType;
var indexPrimaryKeyColumns = table.GetIndexPrimaryKeyColumns(psaIntegrationStage);
var joinClause = "ON\t" + indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed + " = DEL." + sd + s + ed)
.Aggregate((current, next) => current + "\nAND\t" + next);
var currentListJoinClause = "ON\t" + indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed + " = CL." + sd + s + ed)
.Aggregate((current, next) => current + "\nAND\t" + next);
var sqlPsaSelectPk = indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed)
.Aggregate((current, next) => current + "\n\t\t," + next);
var whereClause = "WHERE\t" + indexPrimaryKeyColumns.Select(s => "DEL." + sd + s + ed + " IS NOT NULL").First();

var psaPkList = "\t" + indexPrimaryKeyColumns.Select(s => sd + s + ed)
.Aggregate((current, next) => current + ", " + next);

var groupByClause = "\t" + indexPrimaryKeyColumns.Select(s => "PSA." + sd + s + ed)
.Aggregate((current, next) => current + "\n," + next);

var deleteTable = RootNode.Tables[deleteScopedName];
var deleteTableDatabaseQualifiedName = deleteTable.DatabaseQualifiedName;
var deleteTableSchemaQualifiedName = deleteTable.SchemaQualifiedName;

var mappedHubObject = table.MetadataModelDynamicObject.Columns
.Where(e => e.Object.RelatedItem == table && e.TargetColumn.RelatedItem != null && new []{"HUB"}.Contains(e.TargetColumn.RelatedItem.Object.RelatedItem.ObjectType))
.Select(s => s.TargetColumn.RelatedItem.Object.RelatedItem).SingleOrDefault();
var mappedLinkColumns = table.MetadataModelDynamicObject.Columns
.Where(e => e.Object.RelatedItem == table && e.TargetColumn.RelatedItem != null && new []{"LNK"}.Contains(e.TargetColumn.RelatedItem.Object.RelatedItem.ObjectType));

var mappedSatelliteObjects = table.MetadataModelDynamicObject.Columns
.Where(e => e.Object.RelatedItem == table && e.TargetColumn.RelatedItem != null && new []{"LSAT","SAT"}.Contains(e.TargetColumn.RelatedItem.Object.RelatedItem.ObjectType))
.Select(s => s.TargetColumn.RelatedItem.Object.RelatedItem).Distinct();

var sqlPsaMultiActiveKeys = stagingTable.Columns.Where(cl => cl.IsPrimaryKey == "Y" && cl.ChangeType == "MAK");
var sqlPsaCompareMultiActiveKeys = "";
var sqlPsaSelectMultiActiveKeys = "";
var sqlPsaSatSelectMultiActiveKeys = "";
foreach (var multiActiveKey in sqlPsaMultiActiveKeys)
{
var sqlCompareMultiActiveKeys = multiActiveKey.GetSqlToBk(stagingTable.StartDelimiter, stagingTable.EndDelimiter, "PSA", "DV");
sqlPsaSelectMultiActiveKeys += $"\n\t\t\t\t,{sqlCompareMultiActiveKeys}";
sqlPsaSatSelectMultiActiveKeys += $"\n\t\t\t\t,{sqlCompareMultiActiveKeys} AS {sd}{multiActiveKey.GetColumnName()}{ed}";
sqlPsaCompareMultiActiveKeys += $"\n\t\t\tAND\t{sqlCompareMultiActiveKeys} = {sqlCompareMultiActiveKeys.Replace("PSA.", "DEL.")}";
}
#>
<ExecuteSQL Name="SQL - Insert Deleted Into Persistent Staging" ConnectionName="<#=stagingConnection.Name #>" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>DECLARE @<#=tableConfigurations.RowEffectiveFromDateValue#> DATETIME2(7)
SELECT @<#=tableConfigurations.RowEffectiveFromDateValue#> = ISNULL(MAX(<#=tableConfigurations.QualifiedRowEffectiveFromDate#>), GETDATE()) FROM <#=deleteTableSchemaQualifiedName#>

DECLARE @RowsInDelTable INT
SELECT @RowsInDelTable = CASE WHEN EXISTS (SELECT 1 FROM <#=deleteTableSchemaQualifiedName#>) THEN 1 ELSE 0 END
IF @RowsInDelTable = 1
BEGIN
BEGIN TRANSACTION [<#=stagingTable.Id.Replace("-", "")#>]
BEGIN TRY
;WITH currentList AS
(
SELECT *
FROM
(
SELECT
<#=psaPkList #>
,<#=qualifiedRowChangeType #>
,ROW_NUMBER() OVER (PARTITION BY <#=psaPkList #> ORDER BY <#=qualifiedEffectiveFromDate #> DESC) as rn
FROM <#=persistentQualifiedName #>
) cur
WHERE rn = 1 AND <#=qualifiedRowChangeType #> &lt;&gt; 'D'
)

INSERT INTO <#=persistentQualifiedName #>
(<#=persistentTable.InsertColumnsList#>)
SELECT <#=persistentTable.InsertColumnsList.Replace($"PSA.{qualifiedRowChangeType}", $"CONVERT(VARCHAR(1), 'D') AS {qualifiedRowChangeType}").Replace($"PSA.{tableConfigurations.QualifiedRowEffectiveFromDate}", $"@{tableConfigurations.RowEffectiveFromDateValue} AS {tableConfigurations.QualifiedRowEffectiveFromDate}")#>
FROM <#=persistentQualifiedName #> PSA
INNER JOIN currentList CL <#=currentListJoinClause #>
LEFT OUTER JOIN <#=deleteTableSchemaQualifiedName #> DEL
<#=joinClause #>
<#=whereClause #><# if (enableEndDatePsa){ #>
UPDATE ODS
SET <#=tableConfigurations.QualifiedRowEffectiveToDate#> = @<#=tableConfigurations.RowEffectiveFromDateValue#><# if (isCurrent != "IGN"){ #>,<#=tableConfigurations.QualifiedRowIsCurrent#> = 0
<#} #>
FROM <#=persistentQualifiedName #> ODS
INNER JOIN
(
SELECT <#=groupByClause#>
,MIN(PSA.<#=tableConfigurations.QualifiedRowEffectiveFromDate#>) AS <#=tableConfigurations.QualifiedRowEffectiveFromDate#>
,@<#=tableConfigurations.RowEffectiveFromDateValue#> AS <#=tableConfigurations.QualifiedRowEffectiveToDate#>
FROM <#=persistentQualifiedName #> PSA
LEFT OUTER JOIN <#=deleteTableSchemaQualifiedName #> DEL
<#=joinClause #>
WHERE PSA.<#=tableConfigurations.QualifiedRowEffectiveToDate#> = '9999-12-31'
<#=whereClause #>
GROUP BY <#=groupByClause #>
) SRC
<#=joinClause.Replace("PSA.", "ODS.").Replace("DEL.", "SRC.") #>
AND ODS.<#=tableConfigurations.QualifiedRowEffectiveFromDate#> = SRC.<#=tableConfigurations.QualifiedRowEffectiveFromDate#>
AND ODS.<#=qualifiedRowChangeType #> &lt;&gt; 'D'

<#} #>

COMMIT TRANSACTION [<#=stagingTable.Id.Replace("-", "")#>]
END TRY
BEGIN CATCH
DECLARE @ErrorMessage NVARCHAR(4000), @ErrorSeverity INT, @ErrorState INT;
SELECT @ErrorMessage = ERROR_MESSAGE(), @ErrorSeverity = ERROR_SEVERITY(), @ErrorState = ERROR_STATE();
RAISERROR ( @ErrorMessage, @ErrorSeverity, @ErrorState );
ROLLBACK TRANSACTION [<#=stagingTable.Id.Replace("-", "")#>]
END CATCH
END
</DirectInput>
</ExecuteSQL>

What It Does

Replaces the SQL logic that applies delete detection results to the Persistent Staging Area (PSA). When BimlFlex detects that source records have been deleted, this extension point controls how those deletions are recorded in the PSA table -- typically by inserting a new row with FlexRowChangeType = 'D' and optionally end-dating the previous current row. The default template handles the full PSA insert, end-dating (if enabled), and transaction management.

Where It Fires

The extension point is consumed during the delete detection package generation for PSA targets. It fires once per object and emits an ExecuteSQL task that inserts "deleted" marker rows into the persistent staging table. The template builds the INSERT from the current PSA records that match the delete detection results, replacing the change type with 'D' and using the max effective date from the delete staging table.

When to Use It

  1. Customize the PSA delete marker. The default inserts a row with FlexRowChangeType = 'D'. You need to also set FlexRowIsCurrent = 0 on the previous current record for awlt.Customer in BFX_ODS.
  2. Change the end-dating logic. The default end-dating behavior for awlt.SalesOrderHeader in BFX_ODS does not match your PSA pattern. Override to use a different effective-to-date calculation.
  3. Add custom logging. Insert an audit record into a logging table alongside the PSA delete marker for awlt.Product.
  4. Handle soft deletes differently. Instead of inserting a 'D' row, update the existing PSA record to mark it as deleted.
  5. Use this when the default PSA delete detection apply logic does not match your persistent staging pattern.
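As a sketch of the fourth scenario, the 'D' insert could be replaced with an in-place update of the current PSA row. All table and column names below ([awlt].[Customer], [awlt].[Customer_DEL], the FlexRow* columns, and @FlexRowEffectiveFromDate) are illustrative assumptions based on common BimlFlex naming, not the generated defaults:

```sql
-- Hedged sketch: mark the current PSA row as deleted in place instead of
-- inserting a new 'D' row. Names are illustrative assumptions.
UPDATE PSA
SET    PSA.[FlexRowChangeType]      = 'D',
       PSA.[FlexRowIsCurrent]       = 0,
       PSA.[FlexRowEffectiveToDate] = @FlexRowEffectiveFromDate
FROM   [awlt].[Customer] PSA
INNER JOIN [awlt].[Customer_DEL] DEL
        ON PSA.[CustomerID] = DEL.[CustomerID]
WHERE  PSA.[FlexRowIsCurrent] = 1;
```

Note that an in-place update departs from the insert-only PSA pattern, so downstream consumers that expect an unbroken row history should be reviewed first.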

Prerequisites

  • Delete detection must be enabled for the object in BimlFlex metadata.
  • The persistent staging table must exist in the target connection (e.g., BFX_ODS).
  • The delete detection process must have already identified deleted records.
  • The delete staging table must be populated before this step runs.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Delete Detect Apply Psa Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Customize the SQL that inserts delete markers into the PSA table.
  5. Handle the precedenceConstraint parameter and transaction logic.
  6. Build and test the delete detection package end-to-end.

Example

Before (default PSA delete apply):

The default inserts a row with FlexRowChangeType = 'D' into the PSA table for each deleted record, using a CTE to find the current row.

After (custom PSA delete apply with additional audit logging):

The custom logic adds an INSERT into an audit table before the standard PSA delete marker insert, tracking which records were marked as deleted and when.
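A minimal sketch of such an audit insert, placed inside the TRY block before the standard PSA insert, is shown below. The audit table [log].[DeleteAudit] and its columns are hypothetical; the effective-date variable stands in for the @<RowEffectiveFromDateValue> variable the default template declares:

```sql
-- Hypothetical audit table and columns; adjust to your logging schema.
INSERT INTO [log].[DeleteAudit] ([TableName], [DeletedKey], [MarkedDeletedOn])
SELECT 'awlt.Product',
       DEL.[ProductID],
       @FlexRowEffectiveFromDate
FROM   [awlt].[Product_DEL] DEL;
```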

Common Mistakes

  • Breaking the transaction logic: The default wraps the operation in a transaction with TRY/CATCH. Removing or incorrectly modifying this can leave the PSA in an inconsistent state.
  • Not handling the end-date setting: If enableEndDatePsa is true in the configuration, the template updates FlexRowEffectiveToDate on the existing PSA records. Omitting this logic causes PSA records to appear current even after deletion.
  • Incorrect primary key join: The join between PSA and the delete staging table uses the primary key columns. Modifying this incorrectly causes missed or duplicate delete markers.
  • Not handling the precedenceConstraint: Omitting the precedence constraint disconnects the task from the control flow.

Related Extension Points

  • Delete Detect Apply Dv Override -- Applies delete detection to the Data Vault (instead of PSA).
  • Delete Detect Data Flow Override -- Overrides the data flow that identifies deleted records.
  • Delete Detect Raw File Override -- Overrides the raw file format used by delete detection.
  • Delete Initialize Raw File Data Flow Override -- Overrides the raw file initialization data flow.

Delete Detect Data Flow Override

Configure pipeline logic that will override the Delete Detect Data Flow

Parameters

Name                  Type                                 Description
table                 BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the pipeline will be added
precedenceConstraint  String                               Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name            Type     Description
ObjectInherit   Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName  String   You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DeleteDetectDataflowOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<#
var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage);
var stageConnection = tableObject.Project.StageConnection.RelatedItem;
var sourceSsisSafeScopedName = tableObject.SsisSafeScopedName;
var columns = tableObject.Columns;
var primaryKeyColumns = columns.GetIndexPrimaryKeyColumns("DEL");
var sourceColumns = columns.Where(c => primaryKeyColumns.Contains(c.ColumnName));
#>
<# CustomOutput.ObjectInherit = false; #>
<Dataflow Name="DFT - Detect Deletes ExtensionPoint" >
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Transformations>
<!-- Insert Your Logic Here-->
</Transformations>
<# CustomOutput.OutputPathName = @"DFT - Detect Deletes ExtensionPoint.Output"; #>
</Dataflow>

What It Does

Replaces the entire SSIS data flow task that BimlFlex generates for detecting deleted source records. The default delete detection data flow compares current source records against previously extracted records (stored in a raw file) to identify which records have been removed from the source. This extension point lets you implement a completely custom detection strategy -- such as using a different comparison method, querying a change tracking table, or using a merge-based approach.

Where It Fires

The extension point is consumed during delete detection package generation and replaces the default delete detection data flow task. It fires once per object when the delete detection data flow is rendered. The precedenceConstraint parameter connects to preceding tasks. You must set CustomOutput.OutputPathName so subsequent tasks (the Apply Dv or Apply Psa steps) can chain after your custom data flow.

When to Use It

  1. Use a different detection strategy. Instead of the default raw file comparison, use SQL Server Change Tracking to detect deletes for SalesLT.Customer on AWLT_SRC.
  2. Implement a merge-based comparison. Use a MERGE statement (via ExecuteSQL) rather than a data flow to compare current source records with the previously captured set for SalesLT.SalesOrderHeader.
  3. Query a CDC table. Your source system provides a CDC (Change Data Capture) table that explicitly tracks deletes. Replace the data flow with a query against the CDC table.
  4. Optimize for large tables. The default raw file comparison loads all primary keys into memory. For very large tables, implement a database-side comparison using hash joins or temp tables.
  5. Use this when the default delete detection data flow does not match your source system's capabilities for tracking record removals.
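For the Change Tracking scenario above, the replacement data flow can collapse into a single set-based statement emitted via an ExecuteSQL task. This is a hedged sketch only: the @LastSyncVersion variable and the [awlt].[Customer_DEL] results table are assumptions, and Change Tracking must already be enabled on the source table:

```sql
-- Hedged sketch: detect deletes via SQL Server Change Tracking instead of
-- the raw file comparison. @LastSyncVersion and the _DEL table are assumed.
INSERT INTO [awlt].[Customer_DEL] ([CustomerID])
SELECT CT.[CustomerID]
FROM   CHANGETABLE(CHANGES [SalesLT].[Customer], @LastSyncVersion) AS CT
WHERE  CT.SYS_CHANGE_OPERATION = 'D';
```

This approach sidesteps the raw file entirely, so the raw file initialization and update steps become unnecessary for this object.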

Prerequisites

  • Delete detection must be enabled for the object in BimlFlex metadata.
  • You must set CustomOutput.OutputPathName so the subsequent apply steps can connect.
  • The precedenceConstraint parameter must be handled.
  • Your custom data flow must produce output that the Delete Detect Apply Dv Override or Delete Detect Apply Psa Override steps can consume.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Delete Detect Data Flow Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete <Dataflow> element with your custom delete detection logic.
  5. Include precedenceConstraint handling and set CustomOutput.OutputPathName.
  6. Build and test the delete detection package to verify deletes are correctly identified.

Example

Before (default delete detection data flow):

DFT - Detect Deletes
├─ OLE_SRC - Current Source Keys
├─ RTF_SRC - Previous Keys (Raw File)
├─ MRG_JN - Full Outer Join
└─ RTF_DST - Deleted Keys Output

After (custom delete detection using a database comparison):

<Dataflow Name="DFT - Detect Deletes ExtensionPoint">
<Transformations>
<OleDbSource Name="OLE_SRC - Detect Deletes" ConnectionName="BFX_STG">
<DirectInput>
SELECT PK.[CustomerID] FROM [awlt].[Customer_DEL] PK
LEFT JOIN [SalesLT].[Customer] SRC ON PK.[CustomerID] = SRC.[CustomerID]
WHERE SRC.[CustomerID] IS NULL
</DirectInput>
</OleDbSource>
<OleDbDestination Name="OLE_DST - Deleted Keys" ConnectionName="BFX_STG">
<InputPath OutputPathName="OLE_SRC - Detect Deletes.Output" />
<ExternalTableOutput Table="[awlt].[Customer_DEL_Results]" />
</OleDbDestination>
</Transformations>
</Dataflow>

Common Mistakes

  • Forgetting to set OutputPathName: Without it, the apply steps (Apply Dv or Apply Psa) cannot connect.
  • Producing incompatible output: The downstream apply steps expect specific columns (primary key columns) in the delete detection results. Ensure your output matches.
  • Not handling the precedenceConstraint: Omitting precedence constraint handling disconnects the task from the control flow.
  • Losing the raw file update: The default data flow also updates the raw file with current source keys. If you override the data flow, you may need to handle this update separately.

Related Extension Points

  • Delete Detect Apply Dv Override -- Applies detected deletes to the Data Vault.
  • Delete Detect Apply Psa Override -- Applies detected deletes to the PSA.
  • Delete Detect Raw File Override -- Overrides the raw file format used for delete detection.
  • Delete Initialize Raw File Data Flow Override -- Overrides the raw file initialization.

Delete Detect Raw File Override

Override the Raw File Format used by the Delete Detection process

Parameters

Name   Type                                 Description
table  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the pipeline will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DeleteDetectRawFileOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<#
var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage);
var stageConnection = tableObject.Project.StageConnection.RelatedItem;
var sourceSsisSafeScopedName = tableObject.SsisSafeScopedName;
var columns = tableObject.Columns;
var primaryKeyColumns = columns.GetIndexPrimaryKeyColumns("DEL");
var sourceColumns = columns.Where(c => primaryKeyColumns.Contains(c.ColumnName));
#>
<# CustomOutput.ObjectInherit = false; #>
<RawFileFormat Name="rawDEL_<#=sourceSsisSafeScopedName#>" LogicalDisplayFolder="Raw">
<Columns>
<# var i = 1;
foreach(var sourceColumn in sourceColumns) {
var applyDataTypeMappingStg = table.GetConfigurationValue("ApplyDataTypeMappingStg") == "Y";
var bimlDataType = applyDataTypeMappingStg
? sourceColumn.GetMappedDataTypeBiml()
: sourceColumn.GetDataTypeBiml();
#>
<Column Name="<#=sourceColumn.ColumnName#>" <#=bimlDataType#> IndexPosition="<#=i #>" />
<# i ++; #>
<# } #>
</Columns>
</RawFileFormat>

What It Does

Replaces the raw file format definition used by the delete detection process. The delete detection mechanism stores a snapshot of source primary key values in a SSIS raw file, then compares the current source keys against this snapshot to identify deletions. This extension point lets you customize the raw file format -- changing column definitions, data types, index positions, or adding additional columns to the raw file structure.

Where It Fires

The extension point is consumed during the delete detection package generation and replaces the <RawFileFormat> element used by the raw file source and destination components. It fires once per object. The raw file format defines the schema of the .raw file that stores the snapshot of primary key values between executions.

When to Use It

  1. Change column data types. The default raw file format uses metadata-driven data types. You need to override the data types for SalesLT.Customer on AWLT_SRC to use mapped data types instead.
  2. Add columns to the raw file. Beyond primary keys, you want to include additional columns (e.g., ModifiedDate) in the raw file for SalesLT.SalesOrderHeader to support more sophisticated comparison logic.
  3. Fix data type mapping issues. The auto-generated data type mapping for the raw file causes truncation or conversion errors. Override to specify exact data types.
  4. Change the index positions. Reorder the index positions for the raw file columns to match a custom sort order.
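For scenario 2, a raw file format that carries ModifiedDate alongside the key might look like the following sketch. The format name follows the default rawDEL_<scopedName> convention, and the data types are assumptions for the AdventureWorksLT sample schema:

```xml
<RawFileFormat Name="rawDEL_AWLT_SRC_SalesLT_SalesOrderHeader" LogicalDisplayFolder="Raw">
    <Columns>
        <Column Name="SalesOrderID" DataType="Int32" IndexPosition="1" />
        <Column Name="ModifiedDate" DataType="DateTime" IndexPosition="2" />
    </Columns>
</RawFileFormat>
```

Any extra column must also be produced by the initialization and detection data flows, or the raw file reads will fail.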

Prerequisites

  • Delete detection must be enabled for the object in BimlFlex metadata.
  • The raw file format name must match what the delete detection data flow expects (default: rawDEL_<scopedName>).
  • Column definitions must include all primary key columns used by the delete detection comparison.
  • The data types must be compatible with SSIS raw file format specifications.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Delete Detect Raw File Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Define the <RawFileFormat> element with custom column definitions.
  5. Ensure column names and data types match the primary key columns used by the comparison.
  6. Build and verify the raw file format appears correctly in the generated package.

Example

Before (default raw file format):

The raw file format is auto-generated with all primary key columns using metadata data types.

After (custom raw file format with explicit data types):

<RawFileFormat Name="rawDEL_AWLT_SRC_SalesLT_Customer" LogicalDisplayFolder="Raw">
<Columns>
<Column Name="CustomerID" DataType="Int32" IndexPosition="1" />
</Columns>
</RawFileFormat>

Common Mistakes

  • Mismatched format name: The raw file format name must match the naming convention expected by the data flow (rawDEL_<scopedName>). A mismatch causes a "cannot resolve file format" error.
  • Missing primary key columns: All primary key columns must be included in the raw file format. Omitting a column causes incorrect delete detection results.
  • Incompatible data types: SSIS raw files have specific data type requirements. Using incompatible types causes runtime errors when reading or writing the raw file.
  • Not regenerating after metadata changes: If the primary key columns change in metadata, the raw file format override must be updated to match.

Related Extension Points

  • Delete Initialize Raw File Data Flow Override -- Overrides the data flow that initializes the raw file with current source keys.
  • Delete Detect Data Flow Override -- Overrides the data flow that detects deleted records using the raw file.
  • Delete Detect Apply Dv Override -- Applies detected deletes to the Data Vault.
  • Delete Detect Apply Psa Override -- Applies detected deletes to the PSA.

Delete Initialize Raw File Data Flow Override

Configure pipeline logic that will override the Delete Initialize Raw File Data Flow

Parameters

Name                  Type                                 Description
table                 BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the pipeline will be added
precedenceConstraint  String                               Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="DeleteInitializeRawFileDataflowOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<#
var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage);
var stageConnection = tableObject.Project.StageConnection.RelatedItem;
var sourceSsisSafeScopedName = tableObject.SsisSafeScopedName;
var columns = tableObject.Columns;
var primaryKeyColumns = columns.GetIndexPrimaryKeyColumns("DEL");
var sourceColumns = columns.Where(c => primaryKeyColumns.Contains(c.ColumnName));
#>
<# CustomOutput.ObjectInherit = false; #>
<Dataflow Name="DFT - Initialise Raw File ExtensionPoint">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Transformations>
<OleDbSource Name="OLE_SRC - <#=sourceSsisSafeScopedName#>" ConnectionName="<#=tableObject.SourceConnection.Name#>" CommandTimeout="0" ValidateExternalMetadata="true" >
<DirectInput>
SELECT [MyColumn] FROM <#=tableObject.SourceSchemaQualifiedName#> ORDER BY [MyColumn]
</DirectInput>
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="MyColumn" SortKeyPosition="1" />
</Columns>
</OleDbSource>
<RawFileDestination Name="RTF_DST - Initialise Raw Output File">
<InputPath OutputPathName="OLE_SRC - <#=sourceSsisSafeScopedName#>.Output" />
<FileFromVariableOutput VariableName="User.SourceRawFilePath"/>
</RawFileDestination>
</Transformations>
</Dataflow>

What It Does

Replaces the data flow that initializes the raw file used by the delete detection process. On the first run (or when the raw file does not exist), this data flow extracts current source primary key values, sorts them, and writes them to a raw file. This raw file then serves as the baseline for subsequent delete detection runs. The extension point lets you customize the source query, sort order, column selection, or destination format for this initialization.

Where It Fires

The extension point is consumed during delete detection package generation and replaces the default raw file initialization data flow. It fires once per object when the initialization data flow is rendered. The data flow runs only when the raw file does not yet exist or needs to be re-initialized. The precedenceConstraint parameter connects to preceding tasks.

When to Use It

  1. Customize the source query. The default queries all primary key columns from the source. You need to add a filter to exclude test records from SalesLT.Customer on AWLT_SRC when initializing the raw file.
  2. Change the sort order. The default sorts by primary key ascending. You need a different sort order for SalesLT.SalesOrderHeader to optimize the merge join in the detection step.
  3. Include additional columns. Beyond primary keys, include ModifiedDate in the raw file for SalesLT.Product to support more granular change detection.
  4. Use a different source connection. Initialize the raw file from a staging table instead of the source system for SalesLT.Customer.
  5. Use this when the default raw file initialization does not read from the correct source or does not produce the format expected by your custom Delete Detect Data Flow Override.

Prerequisites

  • Delete detection must be enabled for the object in BimlFlex metadata.
  • The source connection must be accessible and the source query must return valid results.
  • The raw file destination must use the User.SourceRawFilePath variable for the file path.
  • The output columns must match the raw file format (default or overridden via Delete Detect Raw File Override).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Delete Initialize Raw File Data Flow Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete <Dataflow> element with source, optional transformations, and raw file destination.
  5. Include precedenceConstraint handling.
  6. Build and test to confirm the raw file is correctly initialized.

Example

Before (default raw file initialization):

DFT - Initialise Raw File
├─ OLE_SRC - Source Keys (sorted by PK)
└─ RTF_DST - Write to Raw File

After (custom initialization with filtered source):

<Dataflow Name="DFT - Initialise Raw File ExtensionPoint">
<Transformations>
<OleDbSource Name="OLE_SRC - Customer" ConnectionName="AWLT_SRC" CommandTimeout="0">
<DirectInput>SELECT [CustomerID] FROM [SalesLT].[Customer] WHERE [IsActive] = 1 ORDER BY [CustomerID]</DirectInput>
<Columns>
<Column SourceColumn="CustomerID" TargetColumn="CustomerID" SortKeyPosition="1" />
</Columns>
</OleDbSource>
<RawFileDestination Name="RTF_DST - Initialise Raw Output File">
<InputPath OutputPathName="OLE_SRC - Customer.Output" />
<FileFromVariableOutput VariableName="User.SourceRawFilePath" />
</RawFileDestination>
</Transformations>
</Dataflow>

Common Mistakes

  • Not sorting the output: The delete detection merge join expects sorted input. If the raw file is not sorted by primary key, the detection step fails or produces incorrect results.
  • Column mismatch with raw file format: The columns written by the initialization data flow must match the Delete Detect Raw File Override format (or the default format). Mismatches cause read errors in subsequent detection runs.
  • Wrong raw file variable: The destination must use User.SourceRawFilePath. Using a different variable or hardcoded path breaks the file management logic.
  • Not handling precedenceConstraint: Omitting the precedence constraint disconnects the initialization from the control flow.

Related Extension Points

  • Delete Detect Raw File Override -- Overrides the raw file format that this data flow writes to.
  • Delete Detect Data Flow Override -- Overrides the data flow that reads the raw file for comparison.
  • Delete Detect Apply Dv Override -- Applies detected deletes to the Data Vault.
  • Delete Detect Apply Psa Override -- Applies detected deletes to the PSA.

Get Parameter

Configure an override to retrieve parameter values

Parameters

Name   Type                                 Description
table  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the process will be added

Outputs

Name            Type     Description
ObjectInherit   Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName  String   You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="GetParameter" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = true; #>
<# var sourceConnection = table.GetSourceConnection();
var sourceConnectionType = sourceConnection.ConnectionType;
var sourceDatabase = table.GetSourceCatalog();
var sourceScopedName = table.GetSourceScopedName(sourceConnection);
var sourceQualifiedName = table.GetSourceQualifiedName(sourceConnection);
var sourceSchemaQualifiedName = table.GetSourceSchemaQualifiedName(sourceConnection);
var sourceSsisSafeScopedName = sourceScopedName.MakeSsisSafe();
var sourceDisplayName = table.GetSourceDisplayName(sourceConnection);
#>
<# if (table.GetParameters().Any()) { #>
<ExecuteSQL Name="SQL - Get LastModified" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] '<#=sourceConnection.Name#>', '<#=table.Name#>.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Get LastModified.Output"; #>
</ExecuteSQL>
<# } #>

What It Does

Retrieves parameter values from the BimlCatalog configuration store before an object's data flow executes. This is the read counterpart to Set Parameter: Get Parameter reads watermark or configuration values (such as LastModifiedDate) that were persisted by a previous execution, making them available as SSIS variables for use in parameterized source queries, where clauses, or other data flow logic.

Where It Fires

Consumed in the object control flow generation templates. The extension point is placed before the main data flow, typically as one of the first steps in the object's SSIS package. It fires once per object when parameter columns are defined in the metadata (table.GetParameters().Any()). The template conditionally emits an ExecuteSQL task that calls [ssis].[GetConfigVariable] on the BimlCatalog connection and sets CustomOutput.OutputPathName so the data flow can chain after it.

When to Use It

  1. Read a watermark for incremental extraction. Before extracting SalesLT.Customer from AWLT_SRC, retrieve the LastModifiedDate from the BimlCatalog so the source query can filter by WHERE ModifiedDate > @LastModifiedDate.
  2. Retrieve a custom high-water mark. Your extraction of SalesLT.SalesOrderHeader uses a sequence-based parameter (LastOrderID). Override Get Parameter to read that value from the BimlCatalog.
  3. Read multiple parameter values. The default template reads one parameter. Override to retrieve both LastModifiedDate and LastRowVersion for SalesLT.Product.
  4. Change the retrieval mechanism. Replace the [ssis].[GetConfigVariable] call with a query against a different configuration store or table.
  5. Use this instead of Pre Data Flow when the only pre-processing needed is reading a parameter value.
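Once retrieved, the value is typically consumed by a parameterized source query (see the Source Parameter extension point). A hedged Biml sketch of such a source, with the connection and object names carried over from the examples above:

```xml
<OleDbSource Name="OLE_SRC - SalesLT_Customer" ConnectionName="AWLT_SRC">
    <DirectInput>SELECT * FROM [SalesLT].[Customer] WHERE [ModifiedDate] &gt; ?</DirectInput>
    <Parameters>
        <Parameter Name="0" VariableName="User.CurrentModifiedDate" />
    </Parameters>
</OleDbSource>
```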

Prerequisites

  • The object must have parameter columns defined in BimlFlex metadata (table.GetParameters() must return results).
  • The BimlCatalog connection must be available in the project.
  • The [ssis].[GetConfigVariable] stored procedure must exist in the BimlCatalog database.
  • The SSIS variable User.CurrentModifiedDate (or your custom variable) must be defined in the package (via Package Variable or BimlFlex metadata).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Get Parameter from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects with parameters.
  4. Modify the [ssis].[GetConfigVariable] call to use your connection name, parameter key, and variable name.
  5. Set CustomOutput.OutputPathName so the data flow can chain after the parameter retrieval.
  6. Build the project and verify the Get Parameter task appears before the data flow.

For example, a targeted override that disables object inheritance and scopes the retrieval to a single object might look like this:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="GetParameter" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var sourceConnection = table.GetSourceConnection(); #>
<# if (table.GetParameters().Any()) { #>
<ExecuteSQL Name="SQL - Get LastModified" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] '<#=sourceConnection.Name#>', '<#=table.Name#>.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Get LastModified.Output"; #>
</ExecuteSQL>
<# } #>

Example

Before (no Get Parameter -- full extract every time):

Every execution extracts all rows from SalesLT.Customer because there is no watermark value.

After (Get Parameter reads the watermark):

<ExecuteSQL Name="SQL - Get LastModified" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

Common Mistakes

  • Not pairing with Set Parameter: Get Parameter reads values, but they must be written first. On the first execution, the value may be null or default. Ensure Set Parameter runs after the data flow to persist the updated value.
  • Undefined SSIS variable: The variable User.CurrentModifiedDate must be defined via Package Variable or BimlFlex metadata.
  • Missing OutputPathName: Without CustomOutput.OutputPathName, the data flow cannot connect to the Get Parameter task, breaking the execution chain.
  • Wrong parameter key: The key in GetConfigVariable (e.g., 'SalesLT.Customer.LastModifiedDate') must match the key used in Set Parameter. A mismatch means the value is never found.
  • Applying ObjectInherit = true to objects without parameters: The template includes a conditional check, but if you remove it, objects without parameters get an empty task.

Related Extension Points

  • Set Parameter -- Writes parameter values back to the BimlCatalog after execution.
  • Pre Data Flow -- Injects control flow tasks before the data flow (alternative location for reading parameters).
  • Source Parameter -- Binds parameters to the source query (consumes the values retrieved by Get Parameter).
  • Package Variable -- Defines the SSIS variables that Get Parameter populates.

Lookup Cache

Configure a cache file for the Lookup SQL

Parameters

Name         Type                                 Description
sourceTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the source object to which the lookup cache will be added
targetTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the target object to which the lookup cache will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupCache" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<# var sourceConnection = sourceTable.GetSourceConnection();
var sourceScopedName = sourceTable.GetSourceScopedName(sourceConnection);
var sourceSsisSafeScopedName = sourceScopedName.MakeSsisSafe();
#>
<Dataflow Name="DFT - CACHE <#=sourceSsisSafeScopedName#>">
<Transformations>
<OleDbSource Name="OLE_SRC - Select <#=sourceSsisSafeScopedName#>" ConnectionName="<#=sourceConnection.Name#>">
<DirectInput>SELECT SRC.[MyTableCode]
,SRC.[MyTableKey]
,SRC.[FlexRowHashType1] AS [lkpFlexRowHashType1]
FROM [dim].[MyTable] SRC
WHERE SRC.[TenantCode] &lt;&gt; 'UNK'</DirectInput>
</OleDbSource>
<Cache Name="CT - <#=sourceSsisSafeScopedName#>" ConnectionName="MyTableCache">
<InputPath OutputPathName="OLE_SRC - Select <#=sourceSsisSafeScopedName#>.Output" />
</Cache>
</Transformations>
</Dataflow>

What It Does

Replaces the default lookup behavior with a cache-based lookup by defining a custom data flow that pre-populates a cache file. Instead of using the standard OLE DB lookup against a live database connection, this extension point creates a separate data flow that reads lookup data into a cache file. The Lookup component in the main data flow then reads from this cache, improving performance for frequently accessed lookup tables.
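The performance idea behind the cache (load the lookup set once, then probe it in memory for every source row) is the same as a dictionary lookup; a minimal sketch with hypothetical data:

```python
# Hypothetical lookup rows: (Customer_BK, Customer_SK), including the
# unknown member that the cache query filters out with <> 'UNK'.
lookup_rows = [("C1", 101), ("C2", 102), ("UNK", -1)]

# Populate the "cache file" once, before the main data flow runs.
cache = {bk: sk for bk, sk in lookup_rows if bk != "UNK"}

# The main data flow then probes the cache per row instead of
# issuing a database query per row.
source_rows = ["C1", "C2", "C1"]
resolved = [cache.get(bk) for bk in source_rows]
print(resolved)  # [101, 102, 101]
```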

Where It Fires

The extension point is consumed during package generation in 0.01.1-flx-import-cache.biml and related lookup generation files (referenced in 6 files). It fires once per object when the lookup cache data flow is rendered. The cache data flow runs before the main extraction data flow, populating the cache file that the Lookup component references via a CacheConnectionName. When this extension point is defined, the Lookup component uses the cache instead of a direct database query.

When to Use It

  1. Pre-cache a dimension lookup. Your SalesLT.Customer extraction from AWLT_SRC performs a lookup against dim.MyTable in BFX_DM. Pre-caching avoids repeated queries to the dimension table during the data flow.
  2. Filter the lookup dataset. The default lookup returns all rows. You need to exclude rows where TenantCode = 'UNK' from the lookup for SalesLT.SalesOrderHeader.
  3. Optimize large lookup tables. The lookup table for SalesLT.Product has millions of rows. A cache-based lookup is faster than full-cache OLE DB lookups because the data is read once into a local file.
  4. Use a different connection for the lookup. The lookup data lives on a different server than the source or target. Define a custom cache data flow that reads from that server.
  5. Use this with Root Add to define the CacheConnection and FileConnection needed by the cache.

Prerequisites

  • A Root Add extension point must define the CacheConnection, FileConnection, and RawFileFormat referenced by the cache data flow.
  • The source query in the cache data flow must return columns that match the lookup key and output columns expected by the main data flow.
  • The cache file path must be writable at runtime.
  • The cache data flow runs before the main extraction; ensure the lookup data is current.

Implementation Steps

  1. Create a Root Add extension point to define the cache connection, file connection, and file format.
  2. Create a Lookup Cache extension point for the same object.
  3. Define a data flow that reads lookup data from the source and writes it to the cache connection.
  4. Verify that the Lookup component in the main data flow references the cache connection.
  5. Build and test the package to confirm the cache is populated before the main data flow runs.

Targeted example (SalesLT.Customer):

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupCache" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var sourceConnection = sourceTable.GetSourceConnection();
var sourceScopedName = sourceTable.GetSourceScopedName(sourceConnection);
var sourceSsisSafeScopedName = sourceScopedName.MakeSsisSafe(); #>
<Dataflow Name="DFT - CACHE <#=sourceSsisSafeScopedName#>">
<Transformations>
<OleDbSource Name="OLE_SRC - Select <#=sourceSsisSafeScopedName#>" ConnectionName="BFX_DM">
<DirectInput>SELECT SRC.[Customer_BK], SRC.[Customer_SK] FROM [dim].[Customer] SRC WHERE SRC.[Customer_BK] &lt;&gt; 'UNK'</DirectInput>
</OleDbSource>
<Cache Name="CT - <#=sourceSsisSafeScopedName#>" ConnectionName="CustomerCache">
<InputPath OutputPathName="OLE_SRC - Select <#=sourceSsisSafeScopedName#>.Output" />
</Cache>
</Transformations>
</Dataflow>

Example

Before (default OLE DB lookup against a live database):

The Lookup component queries dim.MyTable on every execution via a direct OLE DB connection.

After (cache-based lookup via LookupCache):

A separate data flow pre-populates a local cache file from dim.MyTable, and the Lookup component reads from the cache file instead.

Common Mistakes

  • Missing Root Add companion: The CacheConnection referenced in the cache data flow must be defined via a Root Add extension point. Without it, the build fails with "Cannot resolve connection."
  • Cache column mismatch: The columns in the cache data flow must match the columns expected by the Lookup component in the main data flow. Mismatches cause lookup failures.
  • Stale cache data: The cache is populated once before the main data flow. If the lookup source changes during the package execution, the cache contains outdated data.
  • File path permissions: The cache file path must be writable by the SSIS service account. Permission errors cause the cache population to fail silently.

Related Extension Points

  • Root Add -- Defines the CacheConnection, FileConnection, and RawFileFormat needed by the cache data flow.
  • Lookup SQL -- Overrides the SQL used by the Lookup component when not using cache mode.
  • Lookup Join -- Adds JOIN clauses to the lookup SQL.
  • Lookup Parameter -- Adds parameters to the lookup SQL.

Lookup Join

Configure a Join statement that will be added to the Lookup SQL

Parameters

Name         Type                                 Description
sourceTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the source object to which the lookup Join will be added
targetTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the target object to which the lookup Join will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupJoin" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
INNER JOIN [dbo].[MyJoinTable] LKP
ON SRC.[MyTableCode] = LKP.[MyTableCode]

What It Does

Appends a JOIN clause to the lookup SQL query that BimlFlex generates for lookup transformations. This allows you to extend the lookup query with additional table joins without replacing the entire query. The JOIN text is appended to the FROM clause of the lookup SQL, adding related tables that provide additional columns or filtering criteria.

Where It Fires

The extension point is consumed during data flow generation in the lookup query assembly (referenced in 7 files including 90.flx-i-get-derived-columns.biml). It fires once per object when the lookup SQL is constructed. The JOIN clause is appended after the main FROM table in the lookup query. It works in combination with the auto-generated lookup SQL; use Lookup SQL instead if you need to replace the entire query.

When to Use It

  1. Join a reference table into the lookup. The lookup for SalesLT.Customer on AWLT_SRC needs to include columns from a related [dbo].[CustomerCategory] table. Add an INNER JOIN to bring in the category code.
  2. Filter the lookup by joining to a control table. Join to a [dbo].[ActiveTenants] table to restrict the lookup for SalesLT.SalesOrderHeader to only active tenants.
  3. Add a LEFT JOIN for optional enrichment. Enrich the lookup for SalesLT.Product with optional data from a [dbo].[ProductExtension] table using a LEFT OUTER JOIN.
  4. Use this instead of Lookup SQL when the base lookup query is correct and you only need to add a JOIN, not rewrite the entire query.

Prerequisites

  • The object must exist in BimlFlex metadata with lookup columns defined.
  • The joined table must exist on the same connection as the lookup source.
  • The JOIN condition must use the correct alias (SRC for the main lookup table).
  • Columns from the joined table that you want in the lookup output must be added to the lookup column list via Lookup SQL or metadata.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Lookup Join from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the JOIN clause (INNER JOIN, LEFT JOIN, etc.) with the table name, alias, and ON condition.
  5. Build the project and verify the generated lookup query includes the JOIN.
  6. Execute and confirm the lookup returns the expected results.

Targeted example (SalesLT.Customer):

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupJoin" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
INNER JOIN [dbo].[CustomerCategory] CAT
ON SRC.[CategoryCode] = CAT.[CategoryCode]

Example

Before (default lookup query):

SELECT SRC.[Customer_BK], SRC.[Customer_SK]
FROM [dim].[Customer] SRC

After (with LookupJoin adding a JOIN):

SELECT SRC.[Customer_BK], SRC.[Customer_SK], CAT.[CategoryName]
FROM [dim].[Customer] SRC
INNER JOIN [dbo].[CustomerCategory] CAT
ON SRC.[CategoryCode] = CAT.[CategoryCode]
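The row-count effect of a correct ON condition versus a missing one (the cartesian-product pitfall) can be checked with a small sketch; the tables are hypothetical:

```python
# (CustomerCode, CategoryCode) and (CategoryCode, CategoryName)
customers = [("C1", "CAT-A"), ("C2", "CAT-B")]
categories = [("CAT-A", "Retail"), ("CAT-B", "Online"), ("CAT-C", "Wholesale")]

# INNER JOIN ... ON SRC.CategoryCode = CAT.CategoryCode
joined = [(cust, name) for cust, cat in customers
          for code, name in categories if cat == code]

# A join with no usable ON condition pairs every row with every row.
cartesian = [(cust, name) for cust, _ in customers for _, name in categories]

print(len(joined), len(cartesian))  # 2 6
```

With real table sizes the multiplication is what exhausts memory in the Lookup component.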

Common Mistakes

  • Wrong alias: The main lookup table uses SRC as its alias. Using a different alias in the ON condition causes a SQL error.
  • Missing columns in the SELECT: Adding a JOIN does not automatically add columns to the SELECT list. You may need to combine this with Lookup SQL to include columns from the joined table.
  • Cartesian product: A JOIN without a proper ON condition creates a cartesian product, returning far more rows than expected and potentially causing memory issues.
  • Using this when Lookup SQL is more appropriate: If you need to change the SELECT list, WHERE clause, and JOIN simultaneously, use Lookup SQL to replace the entire query instead of combining multiple lookup extension points.

Related Extension Points

  • Lookup SQL -- Replaces the entire lookup query (more comprehensive than Lookup Join).
  • Lookup Cache -- Pre-populates a cache file for the lookup (alternative to direct DB lookup).
  • Lookup Parameter -- Adds parameters to the lookup query.

Lookup Parameter

Configure a parameter for the Lookup SQL used by Lookup Cache

Parameters

Name         Type                                 Description
sourceTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the source object to which the lookup parameter will be added
targetTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the target object to which the lookup parameter will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupParameter" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<Parameter Name="Parameter1" VariableName="User.CurrentModifiedDate" />

What It Does

Adds parameters to the lookup SQL query used by the Lookup component in the SSIS data flow. These parameters bind SSIS package variables to the lookup query's parameter placeholders (?), enabling dynamic filtering of the lookup dataset at runtime. This is used when the lookup query needs to reference runtime values such as watermark dates, tenant codes, or execution identifiers.

Where It Fires

The extension point is consumed during data flow generation and adds <Parameter> elements to the Lookup component's parameter collection. It fires once per object when the lookup component is rendered. The parameters are bound to SSIS variables and passed to the lookup query at runtime. This works with the Lookup Cache generation process.

When to Use It

  1. Filter the lookup by date. The lookup for SalesLT.Customer on AWLT_SRC queries dim.Customer but should only include records modified after CurrentModifiedDate. Add a parameter to pass the date value.
  2. Restrict the lookup by tenant. The lookup for SalesLT.SalesOrderHeader should only return rows for the current tenant. Pass the TenantCode variable as a parameter.
  3. Use with Lookup SQL. Combine with Lookup SQL that includes ? placeholders to create a parameterized lookup query.
  4. Dynamic cache population. When using Lookup Cache, add parameters to the cache population query to filter the cached data.

Prerequisites

  • The lookup query must contain ? parameter placeholders that match the number and order of parameters defined.
  • The SSIS variables referenced must be defined in the package (via Package Variable or BimlFlex metadata).
  • Parameters are bound by ordinal position, not by name.
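
Ordinal binding can be demonstrated with Python's stdlib sqlite3 module, which uses the same qmark (?) placeholder style as SSIS lookup queries; the table and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (bk TEXT, modified TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [("C1", "2024-01-01"), ("C2", "2024-06-01")])

# Values are matched to ? placeholders strictly by position, mirroring
# how SSIS binds Parameter1, Parameter2, ... by ordinal.
rows = conn.execute(
    "SELECT bk FROM customer WHERE modified >= ? AND bk <> ?",
    ("2024-03-01", "UNK"),  # first ? = date filter, second ? = key filter
).fetchall()
print(rows)  # [('C2',)]

# Swapping the tuple binds the wrong value to each ? -- the classic
# ordinal mismatch: the query runs but silently returns the wrong result.
wrong = conn.execute(
    "SELECT bk FROM customer WHERE modified >= ? AND bk <> ?",
    ("UNK", "2024-03-01"),
).fetchall()
print(wrong)  # []
```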

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Lookup Parameter from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Define <Parameter> elements with sequential names and SSIS variable bindings.
  5. Ensure the lookup query (via Lookup SQL or default) contains matching ? placeholders.
  6. Build and test to confirm the parameterized lookup returns the correct results.

Targeted example (SalesLT.Customer):

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupParameter" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<Parameter Name="Parameter1" VariableName="User.CurrentModifiedDate" />

Example

Before (lookup query with no parameters):

SELECT SRC.[Customer_BK], SRC.[Customer_SK] FROM [dim].[Customer] SRC

After (parameterized lookup with LookupParameter + LookupSQL):

Lookup SQL:

SELECT SRC.[Customer_BK], SRC.[Customer_SK] FROM [dim].[Customer] SRC WHERE SRC.[ModifiedDate] >= ?

LookupParameter:

<Parameter Name="Parameter1" VariableName="User.CurrentModifiedDate" />

Common Mistakes

  • Parameter order mismatch: Parameters are bound by ordinal position. If the ? placeholders in the query are in a different order than your parameters, incorrect values are passed.
  • Undefined SSIS variable: Referencing User.CurrentModifiedDate without defining it (via Package Variable or BimlFlex metadata) causes a validation error.
  • No matching placeholder in the query: Adding a parameter without a corresponding ? in the lookup query has no effect or causes errors.
  • Using named parameters: SSIS lookup queries use positional ? placeholders, not named parameters. Use sequential Parameter1, Parameter2 names.

Related Extension Points

  • Lookup SQL -- Overrides the lookup query (where you add the ? placeholders).
  • Lookup Join -- Adds JOIN clauses to the lookup query.
  • Lookup Cache -- Pre-populates a cache file for the lookup.
  • Package Variable -- Defines the SSIS variables that parameters reference.

Lookup SQL

Override the default generated Lookup SQL

Parameters

Name         Type                                 Description
sourceTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the source object to which the lookup SQL will be added
targetTable  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the target object to which the lookup SQL will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupSql" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
SELECT SRC.[MyTableCode]
,SRC.[MyTableKey]
,SRC.[FlexRowHashType1] AS [lkpFlexRowHashType1]
FROM [dim].[MyTable] SRC
WHERE SRC.[TenantCode] <> 'UNK'

What It Does

Replaces the entire SQL query used by the Lookup component in the SSIS data flow. Instead of the auto-generated lookup query derived from metadata, the Lookup component executes your custom query, giving you complete control over which columns are returned, how the lookup table is filtered, and what joins or transformations are applied. This is only used when the Lookup Cache extension point is empty (not defined); if Lookup Cache is defined, the cache data flow provides the lookup data instead.

Where It Fires

The extension point is consumed during data flow generation and injected into the Lookup component's DirectInput in 90.flx-i-get-derived-columns.biml and 8 other files. It fires once per object when the lookup component is rendered. The custom SQL replaces the auto-generated lookup SELECT statement entirely.

When to Use It

  1. Filter the lookup table. The default lookup for SalesLT.Customer on AWLT_SRC returns all rows from dim.Customer. You need to exclude rows where TenantCode = 'UNK'.
  2. Select specific columns. The default lookup returns all lookup columns from metadata. You want to return only MyTableCode, MyTableKey, and FlexRowHashType1 to reduce memory usage.
  3. Rename columns in the lookup. Alias the lookup columns (e.g., FlexRowHashType1 AS lkpFlexRowHashType1) to avoid column name collisions with the main data flow.
  4. Add computed columns. Include calculated values in the lookup that are not stored in the lookup table.
  5. Use this instead of Lookup Join when you need to change the SELECT list, WHERE clause, and JOINs simultaneously, rather than just adding a JOIN.

Prerequisites

  • The object must exist in BimlFlex metadata with lookup columns defined.
  • The Lookup Cache extension point must NOT be defined for this object (if it is, LookupSQL is ignored and the cache provides the data).
  • The returned columns must include the lookup key and all output columns expected by the main data flow.
  • The query must be valid SQL for the lookup connection.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Lookup SQL from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete SELECT query including column aliases, FROM clause, and WHERE filters.
  5. Ensure the query returns the lookup key column and all required output columns.
  6. Build the project and verify the Lookup component uses your custom query.
  7. Execute and confirm the lookup returns the expected results.

Targeted example (SalesLT.Customer):

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="LookupSql" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
SELECT SRC.[Customer_BK]
,SRC.[Customer_SK]
,SRC.[FlexRowHashType1] AS [lkpFlexRowHashType1]
FROM [dim].[Customer] SRC
WHERE SRC.[Customer_BK] <> 'UNK'

Example

Before (default auto-generated lookup query):

SELECT [Customer_BK], [Customer_SK], [FlexRowHashType1]
FROM [dim].[Customer]

After (with LookupSql filtering and aliasing):

SELECT SRC.[Customer_BK]
,SRC.[Customer_SK]
,SRC.[FlexRowHashType1] AS [lkpFlexRowHashType1]
FROM [dim].[Customer] SRC
WHERE SRC.[Customer_BK] <> 'UNK'

Common Mistakes

  • Defining both LookupSQL and LookupCache: If Lookup Cache is defined, it takes precedence and LookupSQL is ignored. Choose one approach.
  • Missing lookup key column: The query must return the column(s) used as the lookup key. Omitting them causes the Lookup component to fail.
  • Column alias mismatch: If downstream transformations expect specific column names from the lookup, your aliases must match exactly.
  • Returning too many columns: Including unnecessary columns in the lookup increases memory usage. Only return the key and required output columns.
  • Not testing the query independently: Run your custom query against the database independently to verify it returns the expected results before adding it as an extension point.

Related Extension Points

  • Lookup Join -- Adds a JOIN clause to the lookup query (less invasive than replacing the entire query).
  • Lookup Parameter -- Adds parameters to the lookup query for runtime filtering.
  • Lookup Cache -- Pre-populates a cache file for the lookup (alternative approach; takes precedence over LookupSQL).
  • Override SQL -- Overrides the main source query (different from the lookup query).

Override Alter SQL

Configure SQL that will override the Alter SQL for an object

Parameters

Name   Type                                 Description
table  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the SQL will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideAlterSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<# var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage); #>
ALTER TABLE [<#=tableObject.Schema#>].[<#=tableObject.ObjectName#>] ADD CONSTRAINT [DF_ep_<#=tableObject.Schema#>_<#=tableObject.ObjectName#>_FlexRowEffectiveFromDate] DEFAULT ('0001-01-01 00:00:00.000') FOR [FlexRowEffectiveFromDate]
GO

What It Does

Replaces the ALTER TABLE DDL statements that BimlFlex generates during deployment for an object. This lets you define custom ALTER TABLE logic, such as adding default constraints, modifying column definitions, or applying schema changes that differ from what the standard metadata-driven generation produces, giving you precise control over how existing tables are modified.

Where It Fires

The extension point is consumed during the DDL generation phase in the 0.90.x-flx-deploy-*-script.biml deployment script files. It replaces the ALTER TABLE statements for the targeted object. It fires once per object during deployment script generation when ALTER logic is needed.

When to Use It

  • You need to add a DEFAULT constraint for FlexRowEffectiveFromDate on awlt.Customer in BFX_STG that the standard generation does not include.
  • The auto-generated ALTER TABLE for awlt.SalesOrderHeader adds columns in an order that conflicts with your database standards, and you need to reorder them.
  • You want to add a computed column or column-level check constraint to dv.SAT_Customer_Details_awlt in BFX_DV that cannot be expressed through metadata alone.
  • Use this instead of Pre Create SQL / Post Create SQL when you need to replace the ALTER logic itself rather than inject additional SQL before or after the CREATE.
  • Use this instead of Create SQL when the table already exists and you only need to modify it (add columns, change constraints) rather than rebuild it entirely.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • The target table must already exist in the database for ALTER TABLE statements to succeed.
  • You must have ALTER permissions on the target table and schema.
  • Understand the current table structure so your ALTER statements reference correct column names and data types.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override Alter SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_STG).
  4. Write your custom ALTER TABLE statements using the tableObject helper variable for dynamic schema and table names.
  5. Include GO separators between independent DDL batches.
  6. Build the project and review the deployment script to confirm the ALTER statements are correct.
  7. Deploy the script and verify the table modifications were applied.

Example

Before (default generated ALTER):

ALTER TABLE [awlt].[Customer] ADD [NewColumn] NVARCHAR(100) NULL
GO

After (with OverrideAlterSql extension point):

ALTER TABLE [awlt].[Customer] ADD CONSTRAINT [DF_ep_awlt_Customer_FlexRowEffectiveFromDate]
DEFAULT ('0001-01-01 00:00:00.000') FOR [FlexRowEffectiveFromDate]
GO
ALTER TABLE [awlt].[Customer] ADD [NewColumn] NVARCHAR(100) NOT NULL
CONSTRAINT [DF_awlt_Customer_NewColumn] DEFAULT ('N/A')
GO

Common Mistakes

  • Applying ALTER to a table that does not exist yet: Override Alter SQL fires during ALTER logic, not CREATE. If the table is new, this extension point has no effect and you should use Create SQL or Post Create SQL instead.
  • Missing GO separators: Multiple ALTER TABLE statements in the same batch can cause unexpected errors in SQL Server.
  • Not using the tableObject helper: Hardcoding schema and table names makes the extension point non-portable across environments or when object names change.
  • Conflicting with other DDL extension points: If Create SQL is also defined for the same object, the ALTER may be unnecessary or may conflict with the custom CREATE logic.
  • Setting ObjectInherit to true on environment-specific SQL: Applying the same ALTER to all objects in a batch can fail if the referenced columns or constraints do not exist on every table.

Related Extension Points

  • Create SQL -- Overrides the entire CREATE TABLE DDL when you need to rebuild rather than alter.
  • Pre Create SQL -- Injects SQL before CREATE DDL, useful for dropping constraints before a rebuild.
  • Post Create SQL -- Injects SQL after CREATE DDL, useful for adding constraints after a rebuild.
  • Override Primary Key SQL -- Overrides the primary key constraint within the CREATE TABLE DDL.

Override Initialize

Override the initializing of staging tables

Parameters

Name                  Type                                 Description
sourceTable           BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the pipeline will be added
targetTable           BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the target object to which the pipeline will be added
precedenceConstraint  String                               Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name            Type     Description
ObjectInherit   Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName  String   You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideInitialize" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<# var initializeConnection = sourceTable.GetInitializeConnection();
var initializeSafeScopedName = sourceTable.GetInitializeScopedName(initializeConnection, sourceTable.Connection.RelatedItem.IntegrationStage).MakeFlexSafeName();
var initializeSchemaQualifiedName = sourceTable.GetInitializeSchemaQualifiedName(initializeConnection, sourceTable.Connection.RelatedItem.IntegrationStage);
#>
<ExecuteSQL Name="SQL - Initialize <#=initializeSafeScopedName#>" ConnectionName="<#=initializeConnection.Name#>" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>DELETE FROM <#=initializeSchemaQualifiedName#> --WHERE 1=2</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Initialize " + initializeSafeScopedName + ".Output"; #>
</ExecuteSQL>

What It Does

Replaces the staging table initialization step that BimlFlex generates before the main data flow. By default, BimlFlex truncates or deletes from the staging table to prepare it for the incoming data load. This extension point lets you override that behavior with custom initialization logic -- such as conditional deletes, archive-before-truncate patterns, or disabling the initialization entirely.

Where It Fires

The extension point is consumed during control flow generation and replaces the default initialize (truncate/delete) task inside the SEQC - Main container. It fires once per object before the data flow task. The precedenceConstraint parameter connects the initialize task to any preceding control flow tasks. You must set CustomOutput.OutputPathName so the data flow task can chain after your custom initialization.

When to Use It

  1. Use DELETE with a WHERE clause instead of TRUNCATE. The default initialization truncates awlt.Customer in BFX_STG. You want to delete only rows from a specific source (WHERE FlexRowRecordSource = 'AWLT') while preserving rows from other sources.
  2. Archive staging data before clearing. Before truncating awlt.SalesOrderHeader in BFX_STG, insert the current contents into an archive table for audit trail purposes.
  3. Disable initialization entirely. For append-only staging patterns where you want to keep historical extractions in the staging table, replace the initialization with a no-op.
  4. Add conditional initialization logic. Only truncate the staging table if the previous execution completed successfully, by checking a status flag before executing the DELETE.
  5. Use this instead of Pre Data Flow when you need to replace the initialization step itself, not add tasks before or after it.
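
Scenario 2 (archive before clearing) reduces to a copy-then-clear step; a sketch with lists standing in for the staging and archive tables:

```python
# Lists stand in for the staging and archive tables (data hypothetical).
staging = [("SO1", "AWLT"), ("SO2", "AWLT")]
archive = []

def initialize(staging, archive):
    """Archive-before-clear initialization for an append-only audit trail."""
    archive.extend(staging)  # INSERT INTO archive SELECT * FROM staging
    staging.clear()          # TRUNCATE TABLE staging (or a filtered DELETE)
    return staging, archive

initialize(staging, archive)
print(len(staging), len(archive))  # 0 2
```

In the extension point, both steps would be statements inside the one DirectInput batch so they execute atomically relative to the data flow.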

Prerequisites

  • The object must exist in BimlFlex metadata with a staging connection configured.
  • You must set CustomOutput.OutputPathName so the data flow can chain after your task.
  • The precedenceConstraint parameter must be handled to connect to preceding tasks.
  • The initialization connection (initializeConnection) must be accessible.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override Initialize from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the custom initialization SQL (DELETE, TRUNCATE, or conditional logic).
  5. Include precedenceConstraint handling and set CustomOutput.OutputPathName.
  6. Build the project and verify the initialization task is replaced in the generated package.
  7. Execute and confirm the staging table is prepared according to your custom logic.

A complete extension point implementing the conditional delete from scenario 1:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideInitialize" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<# var initializeConnection = sourceTable.GetInitializeConnection();
var initializeSafeScopedName = sourceTable.GetInitializeScopedName(initializeConnection, sourceTable.Connection.RelatedItem.IntegrationStage).MakeFlexSafeName();
var initializeSchemaQualifiedName = sourceTable.GetInitializeSchemaQualifiedName(initializeConnection, sourceTable.Connection.RelatedItem.IntegrationStage);
#>
<ExecuteSQL Name="SQL - Initialize <#=initializeSafeScopedName#>" ConnectionName="<#=initializeConnection.Name#>" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) { #>
    <PrecedenceConstraints>
        <Inputs>
            <Input OutputPathName="<#=precedenceConstraint #>" />
        </Inputs>
    </PrecedenceConstraints>
<# } #>
    <DirectInput>DELETE FROM <#=initializeSchemaQualifiedName#> WHERE [FlexRowRecordSource] = 'AWLT'</DirectInput>
    <# CustomOutput.OutputPathName = @"SQL - Initialize " + initializeSafeScopedName + ".Output"; #>
</ExecuteSQL>

Example

Before (default initialization -- full truncate):

TRUNCATE TABLE [awlt].[Customer]

After (Override Initialize with conditional delete):

DELETE FROM [awlt].[Customer] WHERE [FlexRowRecordSource] = 'AWLT'

Common Mistakes

  • Forgetting to set OutputPathName: Without CustomOutput.OutputPathName, the data flow cannot connect to the initialization task, breaking the execution chain.
  • Not handling precedenceConstraint: Omitting the precedence constraint handling disconnects the initialization from preceding tasks.
  • Using TRUNCATE when DELETE is needed: TRUNCATE cannot have a WHERE clause and ignores foreign key constraints. Use DELETE when you need conditional or constraint-aware clearing.
  • Leaving old data in staging: If your custom initialization is too narrow (e.g., deletes fewer rows than expected), stale data remains in the staging table and gets processed downstream.
  • Not testing with the full package flow: The initialization runs before the data flow. Verify the complete sequence (initialize -> extract -> load) works end-to-end.

Related Extension Points

  • Pre Data Flow -- Injects tasks before the data flow without replacing the initialization step.
  • Data Flow Override -- Replaces the data flow itself (not the initialization).
  • Override Main -- Replaces the entire SEQC - Main container including initialization and data flow.
  • Pre Tasks -- Injects tasks before all tasks in SEQC - Main with source/target context.

Override Main

Completely override the main Data Flow (SEQC - Main)

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the override will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideMain" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This option is added for flexibility, but not recommended, as it will be disconnected from the BimlFlex patterns. -->
<Tasks>
    <Dataflow Name="DFT - Load MySource" DelayValidation="true">
        <Transformations>
            <OleDbSource Name="OLE_SRC - MySource" ConnectionName="MySourceConnection">
                <DirectInput>SELECT * FROM [dbo].[MySourceTable]</DirectInput>
            </OleDbSource>
            <OleDbDestination Name="OLE_DST - MyTarget" ConnectionName="MyTargetConnection">
                <InputPath OutputPathName="OLE_SRC - MySource.Output" />
                <ExternalTableOutput Table="[dbo].[MyTargetTable]" />
            </OleDbDestination>
        </Transformations>
    </Dataflow>
</Tasks>

What It Does

Completely replaces the SEQC - Main sequence container in the generated SSIS package with your own custom task definition. This is the most invasive object-level extension point: it removes all BimlFlex-generated logic (source extraction, transformations, hashing, destination loading, error handling) and substitutes your own end-to-end pipeline. The generated package retains its batch framework (logging, execution tracking) but the core data movement is entirely custom.

Where It Fires

The extension point is consumed during package generation and replaces the entire <Tasks> content of the SEQC - Main sequence container. It fires once per object. Because it replaces the main container, none of the standard BimlFlex data flow components, transformations, or control flow tasks are generated for that object. This effectively disconnects the object from all BimlFlex patterns.

When to Use It

  • You have a SalesLT.Customer extraction on AWLT_SRC that requires a completely custom data flow with specialized transformations, multiple destinations, or non-standard logic that cannot be achieved through other extension points.
  • You need to implement a package that uses an Execute Process Task or Script Task as its main operation instead of a data flow for SalesLT.SalesOrderHeader.
  • You want to call external ETL tools or REST APIs as the primary operation for a specific object while still keeping it registered in the BimlFlex batch framework.
  • Use this only as a last resort when Data Flow Override, Source Override, Target Override, and other targeted extension points cannot achieve your requirements.
  • Use this instead of creating a standalone SSIS package outside of BimlFlex when you still want the object to participate in batch orchestration and execution logging.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • You must provide a complete <Tasks> block containing all the control flow tasks for the object's package.
  • Any connections referenced in your custom tasks must be defined in the project.
  • Understand that all BimlFlex-generated logic (source extraction, hashing, staging, error handling) is removed. You are responsible for the entire pipeline.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override Main from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete <Tasks> block with all control flow tasks (data flows, Execute SQL tasks, Script Tasks, etc.).
  5. Include connection references, column mappings, and any error handling your custom pipeline requires.
  6. Build the project and carefully review the generated package -- no BimlFlex standard logic will be present.
  7. Execute the package and verify end-to-end data movement works as expected.
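
Where the main operation is not a data flow at all, the <Tasks> block can hold any control flow task. A minimal sketch of an ELT-style main operation using an Execute SQL Task; the stored procedure name is hypothetical:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideMain" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<Tasks>
    <ExecuteSQL Name="SQL - Custom Load Customer" ConnectionName="BFX_STG" ResultSet="None">
        <!-- Hypothetical stored procedure; you own the entire load logic here -->
        <DirectInput>EXEC [awlt].[usp_LoadCustomer];</DirectInput>
    </ExecuteSQL>
</Tasks>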

Example

Before (default BimlFlex-generated SEQC - Main):

SEQC - Main
├─ SQL - Pre Process
├─ DFT - Extract SalesLT.Customer
│ ├─ OLE_SRC - SalesLT.Customer
│ ├─ [Hashing/Transformations]
│ └─ OLE_DST - awlt.Customer
└─ SQL - Post Process

After (with OverrideMain replacing everything):

<Tasks>
    <Dataflow Name="DFT - Load Customer Custom" DelayValidation="true">
        <Transformations>
            <OleDbSource Name="OLE_SRC - Customer" ConnectionName="AWLT_SRC">
                <DirectInput>SELECT * FROM [SalesLT].[Customer]</DirectInput>
            </OleDbSource>
            <OleDbDestination Name="OLE_DST - Customer" ConnectionName="BFX_STG">
                <InputPath OutputPathName="OLE_SRC - Customer.Output" />
                <ExternalTableOutput Table="[awlt].[Customer]" />
            </OleDbDestination>
        </Transformations>
    </Dataflow>
</Tasks>

Common Mistakes

  • Using Override Main when a targeted extension point would suffice: This is the nuclear option. If you only need to change the source, use Source Override. If you only need to change the data flow, use Data Flow Override. Override Main removes everything.
  • Losing BimlFlex-generated transformations: Hashing, row numbering, audit columns, change detection -- all are lost. You must implement any of these you still need in your custom tasks.
  • Forgetting execution logging: The batch framework still logs execution start/end, but internal logging (row counts, error counts) that BimlFlex normally generates is lost. Add your own logging if needed.
  • Not wrapping in a Tasks element: The extension point expects a <Tasks> root element. Omitting it causes a build error.
  • Breaking batch orchestration: While the object still participates in the batch, any expected downstream dependencies (e.g., staging table populated before Data Vault load) may break if your custom logic does not produce the expected output.

Related Extension Points

  • Data Flow Override -- Replaces only the data flow task, preserving other control flow tasks (pre/post processing).
  • Source Override -- Replaces only the source component in the data flow.
  • Target Override -- Replaces only the target destination in the data flow.
  • Pre Data Flow / Post Data Flow -- Injects control flow tasks before/after the data flow without replacing anything.

Override Merge

Override the Merge statement of target tables

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the override will be added

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideMerge" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<!-- A scenario might be that you have an insert only fact and you might want to override the Merge to only insert and ignore updates. -->
MERGE [dim].[MyTable] AS TGT
USING [dim].[stage_MyTable] AS SRC
ON TGT.[MyTableCode] = SRC.[MyTableCode]
--WHEN MATCHED
-- THEN
-- UPDATE
-- SET [MyTableCode] = SRC.[MyTableCode]
-- ,[MyTableName] = SRC.[MyTableName]
-- ,[FlexRowAuditId] = SRC.[FlexRowAuditId]
-- ,[FlexRowHashType1] = SRC.[FlexRowHashType1]
-- ,[FlexRowIsInferred] = SRC.[FlexRowIsInferred]
WHEN NOT MATCHED BY TARGET
THEN
INSERT (
[MyTableCode]
,[MyTableName]
,[FlexRowAuditId]
,[FlexRowHashType1]
,[FlexRowIsInferred]
)
VALUES (
SRC.[MyTableCode]
,SRC.[MyTableName]
,SRC.[FlexRowAuditId]
,SRC.[FlexRowHashType1]
,SRC.[FlexRowIsInferred]
);

What It Does

Replaces the entire MERGE statement that BimlFlex generates for loading data into target tables (typically Data Mart dimensions and facts, or ODS tables). The default merge performs both INSERT and UPDATE operations based on hash comparison. This extension point lets you customize the merge logic -- for example, making it insert-only for fact tables, adding custom WHEN MATCHED conditions, or implementing slowly changing dimension (SCD) logic that differs from the default.

Where It Fires

The extension point is consumed during stored procedure or ELT script generation. It replaces the MERGE statement in the generated load procedure for the targeted object. It fires once per object during the merge logic generation phase. The MERGE statement is typically used in Data Mart loads (staging-to-DM) and ODS loads (staging-to-ODS).

When to Use It

  1. Make a fact table insert-only. Your fact.SalesOrder in BFX_DM is an append-only fact table. Override the merge to remove the WHEN MATCHED (UPDATE) clause, keeping only the WHEN NOT MATCHED (INSERT).
  2. Add custom SCD Type 2 logic. The default merge for dim.Customer in BFX_DM does not handle SCD Type 2 the way your business requires. Override to add row expiration and new row insertion logic.
  3. Add a WHEN NOT MATCHED BY SOURCE clause. Soft-delete records in dim.Product in BFX_DM that no longer exist in the source by adding a DELETE or UPDATE clause for unmatched target rows.
  4. Customize the match condition. The default merge matches on the business key. You need to match on a composite key or add additional conditions for dim.Customer.
  5. Use this instead of Data Flow Override when the data flow is correct but the merge/load procedure needs custom SQL logic.
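
Scenario 3 (soft delete) can be sketched with a WHEN NOT MATCHED BY SOURCE clause. The column names and the [FlexRowIsDeleted] flag are illustrative; the flag column must exist on the target table:

MERGE [dim].[Product] AS TGT
USING [dim].[stage_Product] AS SRC
ON TGT.[Product_BK] = SRC.[Product_BK]
WHEN NOT MATCHED BY TARGET
THEN
INSERT ([Product_BK], [ProductName], [FlexRowAuditId], [FlexRowHashType1])
VALUES (SRC.[Product_BK], SRC.[ProductName], SRC.[FlexRowAuditId], SRC.[FlexRowHashType1])
WHEN NOT MATCHED BY SOURCE
THEN
UPDATE SET TGT.[FlexRowIsDeleted] = 1; -- illustrative soft-delete flag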

Prerequisites

  • The object must exist in BimlFlex metadata with a target connection that uses merge-based loading (typically Data Mart or ODS).
  • The MERGE statement must reference the correct staging and target table names.
  • Column lists in INSERT and UPDATE clauses must match the actual table columns.
  • The merge must be syntactically valid SQL Server MERGE, terminated with a semicolon.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override Merge from the dropdown.
  3. Set the Target to the specific object (e.g., dim.Customer on BFX_DM).
  4. Write the complete MERGE statement including USING, ON, WHEN MATCHED, WHEN NOT MATCHED, and the terminating semicolon.
  5. Use the table parameter to dynamically generate table names if needed.
  6. Build the project and verify the generated stored procedure contains your custom MERGE.
  7. Execute the load and verify data is merged according to your custom logic.

A complete extension point implementing a hash-aware merge for dim.Customer:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideMerge" target="dim.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

MERGE [dim].[Customer] AS TGT
USING [dim].[stage_Customer] AS SRC
ON TGT.[Customer_BK] = SRC.[Customer_BK]
WHEN MATCHED AND TGT.[FlexRowHashType1] <> SRC.[FlexRowHashType1]
THEN
UPDATE
SET [CustomerName] = SRC.[CustomerName]
,[EmailAddress] = SRC.[EmailAddress]
,[FlexRowAuditId] = SRC.[FlexRowAuditId]
,[FlexRowHashType1] = SRC.[FlexRowHashType1]
WHEN NOT MATCHED BY TARGET
THEN
INSERT ([Customer_BK], [CustomerName], [EmailAddress], [FlexRowAuditId], [FlexRowHashType1])
VALUES (SRC.[Customer_BK], SRC.[CustomerName], SRC.[EmailAddress], SRC.[FlexRowAuditId], SRC.[FlexRowHashType1]);

Example

Before (default merge with INSERT and UPDATE):

MERGE [dim].[Customer] AS TGT
USING [dim].[stage_Customer] AS SRC
ON TGT.[Customer_BK] = SRC.[Customer_BK]
WHEN MATCHED
THEN UPDATE SET ...
WHEN NOT MATCHED BY TARGET
THEN INSERT (...) VALUES (...);

After (Override Merge -- insert-only for a fact table):

MERGE [dim].[MyTable] AS TGT
USING [dim].[stage_MyTable] AS SRC
ON TGT.[MyTableCode] = SRC.[MyTableCode]
WHEN NOT MATCHED BY TARGET
THEN INSERT ([MyTableCode], [MyTableName], [FlexRowAuditId], [FlexRowHashType1])
VALUES (SRC.[MyTableCode], SRC.[MyTableName], SRC.[FlexRowAuditId], SRC.[FlexRowHashType1]);

Common Mistakes

  • Missing the terminating semicolon: SQL Server MERGE statements require a semicolon at the end. Omitting it causes a syntax error.
  • Column mismatch between INSERT and VALUES: The INSERT column list and VALUES list must match in count and order. Mismatches cause runtime errors.
  • Forgetting BimlFlex framework columns: The merge must include framework columns like FlexRowAuditId, FlexRowHashType1, and FlexRowIsInferred to maintain compatibility with BimlFlex auditing and change detection.
  • Not testing with the staging table: The MERGE reads from the staging table (e.g., [dim].[stage_Customer]). Ensure the staging table is populated before the merge runs.
  • Applying to the wrong object type: Override Merge is typically for Data Mart or ODS loads. Applying it to a Data Vault or staging object has no effect.

Related Extension Points

  • Create SQL -- Overrides the DDL for the target table that the merge writes to.
  • Override SQL -- Overrides the source query (not the merge logic).
  • Override Initialize -- Overrides the staging table initialization before the data flow.
  • Data Flow Override -- Replaces the data flow task (different from the merge procedure).

Override Primary Key SQL

Configure SQL that will override the Primary Key SQL for an object

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the SQL will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverridePrimaryKeySql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<# var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage);
var primaryKeySql = tableObject.IndexPrimaryKeyColumns.GetPrimaryKeySelectSql(tableObject.StartDelimiter, tableObject.EndDelimiter);
#>
CONSTRAINT [PK_ep_<#=tableObject.Schema#>_<#=tableObject.ObjectName#>] PRIMARY KEY CLUSTERED (<#=primaryKeySql#>, [FlexRowEffectiveFromDate])

What It Does

Replaces the primary key constraint definition that BimlFlex generates within the CREATE TABLE DDL for an object. This gives you full control over which columns participate in the primary key, whether it is clustered or nonclustered, and any advanced options such as fill factor or filegroup placement that the standard metadata-driven generation does not expose.

Where It Fires

The extension point is consumed during the DDL generation phase in the 0.90.x-flx-deploy-*-script.biml deployment files. It replaces the CONSTRAINT ... PRIMARY KEY clause within the CREATE TABLE statement for the targeted object. It fires once per object during deployment script generation.

When to Use It

  • The default primary key on awlt.Customer in BFX_STG needs to include FlexRowEffectiveFromDate to support Type 2 historization, but the metadata-driven PK only includes the business key.
  • You want to change the primary key on dv.HUB_Customer in BFX_DV from clustered to nonclustered so a different column can serve as the clustered index.
  • The auto-generated PK on awlt.SalesOrderHeader needs custom fill factor or data compression settings for performance tuning.
  • Use this instead of Create SQL when you only need to change the PK constraint and want BimlFlex to continue generating the rest of the CREATE TABLE automatically.
  • Use this instead of Post Create SQL when you want the PK defined inline in the CREATE TABLE rather than as a separate ALTER TABLE statement.

Prerequisites

  • The object must exist in BimlFlex metadata with columns marked as primary keys.
  • You must understand the existing PK definition to know what you are replacing.
  • The columns referenced in your custom PK must exist in the table definition.
  • If combining with Override Unique Key SQL, ensure the PK and UK do not both attempt to be clustered.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override Primary Key SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_STG).
  4. Write the complete CONSTRAINT clause including the constraint name, key type, and column list.
  5. Use the tableObject and primaryKeySql helper variables from the template to dynamically incorporate the default key columns.
  6. Build the project and inspect the generated CREATE TABLE DDL to verify the primary key is correct.
  7. Deploy the script and confirm the constraint exists with the expected definition.
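
Steps 4 and 5 combine into a complete extension point. It reuses the helper variables from the template above so the default key columns are still generated dynamically from metadata:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverridePrimaryKeySql" target="awlt.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage);
var primaryKeySql = tableObject.IndexPrimaryKeyColumns.GetPrimaryKeySelectSql(tableObject.StartDelimiter, tableObject.EndDelimiter);
#>
CONSTRAINT [PK_ep_<#=tableObject.Schema#>_<#=tableObject.ObjectName#>] PRIMARY KEY CLUSTERED (<#=primaryKeySql#>, [FlexRowEffectiveFromDate])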

Example

Before (default generated primary key):

CONSTRAINT [PK_awlt_Customer] PRIMARY KEY CLUSTERED ([CustomerID] ASC)

After (with OverridePrimaryKeySql extension point):

CONSTRAINT [PK_ep_awlt_Customer] PRIMARY KEY CLUSTERED ([CustomerID] ASC, [FlexRowEffectiveFromDate] ASC)
WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, DATA_COMPRESSION = PAGE)

Common Mistakes

  • Referencing columns not in the table: If you add a column to the PK that does not exist in the metadata, the CREATE TABLE statement fails.
  • Conflicting clustered indexes: Defining both the PK and the unique key (via Override Unique Key SQL) as clustered causes deployment failure since SQL Server allows only one clustered index per table.
  • Incorrect constraint naming: Using a name that collides with existing constraints or does not follow your naming convention causes confusion during troubleshooting.
  • Omitting the ASC/DESC keyword: While defaults apply, explicit ordering prevents ambiguity across environments.
  • Not testing with the full CREATE TABLE: The PK constraint is embedded within the CREATE TABLE DDL; test the complete generated script, not just the constraint fragment.

Related Extension Points

  • Override Unique Key SQL -- Overrides the unique key constraint within the same CREATE TABLE DDL.
  • Create SQL -- Overrides the entire CREATE TABLE DDL, including all constraints.
  • Post Create SQL -- Adds constraints after the CREATE TABLE as separate ALTER statements.
  • Pre Create SQL -- Drops existing constraints before the CREATE TABLE executes.

Override SQL

Override the source query SQL generated by BimlFlex

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the SQL will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
SELECT DISTINCT [PropertyId]
,[EnterpriseId]
,CONVERT(INT, ISNULL([MemberId], - 1)) AS [MemberId]
FROM [dbo].[MyTable]

What It Does

Replaces the entire source query SQL that BimlFlex generates for an object. Instead of using the auto-generated SELECT statement derived from metadata, the pipeline executes your custom query, giving you complete control over column selection, filtering, joins, type conversions, and deduplication in the extraction or ELT source query.

Where It Fires

The extension point is consumed in 0.90.x deployment scripts and 90.flx-i-get-* package generation files. It replaces the source SELECT statement used by the data flow or ELT stored procedure for the targeted object. It fires once per object during package or script generation.

When to Use It

  • You need to extract only distinct records from SalesLT.Customer in AWLT_SRC, removing duplicates at the source before data reaches BFX_STG.
  • The source table SalesLT.SalesOrderHeader requires a CONVERT or CAST on specific columns (e.g., converting a string MemberId to INT) that cannot be expressed through metadata alone.
  • You want to add a WHERE clause to filter out test or soft-deleted records from SalesLT.Customer during extraction.
  • Use this instead of Append Source SQL when you need to replace the entire SELECT statement rather than just appending clauses (like hints or FOR READ ONLY) to the end.
  • Use this instead of Source Create SQL when you want to change the query the pipeline uses without deploying a physical view to the source database.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • Your custom query must return columns that match the expected column names and data types defined in the object metadata.
  • The query must be valid SQL for the source connection type (SQL Server, Snowflake, Databricks, etc.).
  • Test your custom query independently against the source database before adding it as an extension point.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override SQL from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write your complete SELECT statement, ensuring all required columns from the metadata are included.
  5. Verify column aliases match the expected metadata column names exactly.
  6. Build the project and review the generated package or stored procedure to confirm the custom SQL is used.
  7. Execute a test run to verify data flows correctly through the pipeline.
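
Putting the steps together, a complete extension point for the filtered, deduplicated extraction might look as follows; the WHERE-clause columns must exist on the source table:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideSql" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
SELECT DISTINCT
[CustomerID]
,[FirstName]
,[LastName]
,LOWER([EmailAddress]) AS [EmailAddress]
,[ModifiedDate]
FROM [SalesLT].[Customer]
WHERE [IsActive] = 1
AND [CustomerID] > 0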

Example

Before (default auto-generated query):

SELECT [CustomerID]
,[FirstName]
,[LastName]
,[EmailAddress]
,[ModifiedDate]
FROM [SalesLT].[Customer]

After (with OverrideSql extension point):

SELECT DISTINCT
[CustomerID]
,[FirstName]
,[LastName]
,LOWER([EmailAddress]) AS [EmailAddress]
,[ModifiedDate]
FROM [SalesLT].[Customer]
WHERE [IsActive] = 1
AND [CustomerID] > 0

Common Mistakes

  • Column name mismatch: Your custom query returns columns with different names or in a different order than the metadata expects, causing mapping failures at runtime.
  • Missing columns: Omitting a column that downstream transformations or hash calculations depend on causes build or runtime errors.
  • Confusing Override SQL with Append Source SQL: Override SQL replaces the entire query; if you only need to add WITH (NOLOCK) or FOR READ ONLY, use Append Source SQL instead.
  • Not handling NULLs consistently: Adding ISNULL or COALESCE wrappers can change hash calculations downstream, leading to false change detection in staging.
  • Hardcoding environment-specific values: Avoid hardcoding server names or database names; use the BimlFlex connection metadata to stay environment-agnostic.

Related Extension Points

  • Append Source SQL -- Appends text to the end of the generated source query rather than replacing it entirely.
  • Source Create SQL -- Deploys a physical view or table to the source database, rather than just changing the pipeline query.
  • Lookup SQL -- Overrides the SQL used for lookup transformations, not the main source query.
  • Create SQL -- Overrides the CREATE TABLE DDL for target objects, not the source query.

Override Unique Key SQL

Configure SQL that will override the Unique Key SQL for an object

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the SQL will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideUniqueKeySql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<# var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage); #>
CONSTRAINT [UK_ep_<#=tableObject.Schema#>_<#=tableObject.ObjectName#>] UNIQUE NONCLUSTERED (<#=tableObject.IntegrationKeyQualifiedName#>, [FlexRowEffectiveFromDate])

What It Does

Replaces the unique key constraint definition that BimlFlex generates within the CREATE TABLE DDL for an object. This lets you customize which columns participate in the unique constraint, change the constraint type (clustered vs. nonclustered), or add special options like included columns or filter predicates that the standard metadata-driven generation does not support.

Where It Fires

The extension point is consumed during the DDL generation phase in the 0.90.x-flx-deploy-*-script.biml deployment files. It replaces the CONSTRAINT ... UNIQUE clause within the CREATE TABLE statement for the targeted object. It fires once per object during deployment script generation.

When to Use It

  • The default unique key on awlt.Customer in BFX_STG needs to include FlexRowEffectiveFromDate as part of a composite unique constraint to support Type 2 history tracking.
  • You need to change the unique key on dv.SAT_Customer_Details_awlt in BFX_DV from nonclustered to clustered for query performance reasons.
  • The auto-generated unique constraint on awlt.SalesOrderHeader conflicts with your partitioning strategy and needs to be defined with specific filegroup placement.
  • Use this instead of Post Create SQL when you want to define the unique key inline within the CREATE TABLE statement rather than adding it as a separate ALTER TABLE after creation.
  • You want to remove the unique key entirely for a specific object by providing an empty or commented-out extension point body.

Prerequisites

  • The object must exist in BimlFlex metadata with at least one column marked as an integration key.
  • You must understand the current unique key definition to know what you are replacing.
  • The columns referenced in your custom unique constraint must exist in the table definition.
  • If combining with Override Primary Key SQL, ensure the PK and UK do not conflict (e.g., both trying to be clustered).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Override Unique Key SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_STG).
  4. Write the complete CONSTRAINT clause including the constraint name, key type, and column list.
  5. Use the tableObject helper variable from the template to dynamically generate schema and table names.
  6. Build the project and inspect the generated CREATE TABLE DDL to verify the unique constraint is correct.
  7. Deploy the script and confirm the constraint exists with the expected definition.
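
Steps 4 and 5 combine into a complete extension point; the tableObject helper from the template generates the schema, table, and integration key names dynamically:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="OverrideUniqueKeySql" target="awlt.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage); #>
CONSTRAINT [UK_ep_<#=tableObject.Schema#>_<#=tableObject.ObjectName#>] UNIQUE NONCLUSTERED (<#=tableObject.IntegrationKeyQualifiedName#>, [FlexRowEffectiveFromDate])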

Example

Before (default generated unique key):

CONSTRAINT [UK_awlt_Customer] UNIQUE NONCLUSTERED ([CustomerID] ASC)

After (with OverrideUniqueKeySql extension point):

CONSTRAINT [UK_ep_awlt_Customer] UNIQUE NONCLUSTERED ([CustomerID] ASC, [FlexRowEffectiveFromDate] ASC)
WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, FILLFACTOR = 90)

Common Mistakes

  • Mismatched constraint name conventions: Using a name that conflicts with the primary key constraint or other indexes on the table.
  • Referencing columns not in the CREATE TABLE: If you reference a column that does not exist in the table metadata, the deployment script fails.
  • Creating both a clustered PK and clustered UK: SQL Server allows only one clustered index per table. If Override Primary Key SQL defines a clustered PK, the unique key must be nonclustered.
  • Omitting the ASC/DESC keyword: While optional, inconsistent ordering between environments can cause subtle differences in query plans.
  • Forgetting to coordinate with Override Primary Key SQL: Both extension points modify constraints within the same CREATE TABLE. Changes to one may require adjustments to the other.

Related Extension Points

  • Override Primary Key SQL -- Overrides the primary key constraint within the same CREATE TABLE DDL.
  • Create SQL -- Overrides the entire CREATE TABLE DDL, including all constraints.
  • Post Create SQL -- Adds constraints after the CREATE TABLE, as an alternative to inline definition.
  • Override Alter SQL -- Overrides ALTER TABLE statements if you need to add the unique key as a post-deployment change.

Package Add Connection Reference

Configure additional Connection references for the object's package

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the connection will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageAddConnection" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- Note that the connection must be a valid connection defined within the BimlFlex metadata. Connections can be added manually if required. -->
<# CustomOutput.ObjectInherit = true; #>
<Connection ConnectionName="MY_SOURCE" />

What It Does

Adds additional connection references to the object's SSIS package. When your extension points reference connections that BimlFlex does not automatically include in the package (because they are not part of the standard source-to-target mapping), this extension point ensures those connections are registered in the package's connection manager list so they are available at runtime.

Where It Fires

The extension point is consumed during SSIS package generation and adds <Connection> elements to the package's connection reference list. It fires once per object (or once per batch when ObjectInherit = true). The referenced connection must already exist in the BimlFlex metadata or be defined via a Root Add extension point.

When to Use It

  1. Add a connection used by a custom pre/post process. Your Post Data Flow extension point for SalesLT.Customer executes a stored procedure on an audit database (AUDIT_DB). Use PackageAddConnection to include the AUDIT_DB connection in the package.
  2. Reference a lookup connection. Your Lookup Cache extension point for SalesLT.SalesOrderHeader reads from a dimension table on a BFX_DM connection that is not the object's source or target. Add it as a package connection reference.
  3. Include a BimlCatalog connection explicitly. Some package configurations require the BimlCatalog connection to be explicitly referenced beyond the default auto-inclusion.
  4. Apply to all packages in a batch. Set ObjectInherit = true to ensure every package in the LOAD_BFX_DV batch has access to a shared connection like MY_SOURCE.

Prerequisites

  • The connection must be defined in BimlFlex metadata or via a Root Add extension point.
  • The connection name must exactly match the name defined in metadata.
  • The connection must be accessible at runtime (server reachable, credentials valid).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Package Add Connection Reference from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects.
  4. Add <Connection ConnectionName="..." /> elements for each connection to include.
  5. Build the project and verify the connections appear in the generated package's connection managers.
A targeted example:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageAddConnection" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<Connection ConnectionName="AUDIT_DB" />

Example

Before (package only has AWLT_SRC and BFX_STG connections):

The package for SalesLT.Customer references only the source and target connections.

After (PackageAddConnection adds AUDIT_DB):

The package now also references AUDIT_DB, allowing Post Data Flow tasks to execute stored procedures on the audit database.

Common Mistakes

  • Misspelled connection name: The ConnectionName must exactly match a connection defined in BimlFlex metadata. A typo causes a build error: "Cannot resolve connection reference."
  • Connection not defined in metadata: The connection must exist in BimlFlex metadata or be created via Root Add. Referencing a non-existent connection fails.
  • Adding a connection that is already auto-included: BimlFlex automatically includes source and target connections. Adding them again may cause a duplicate reference warning.
  • Using ObjectInherit = true when not all packages need the connection: Adds unnecessary connection references to packages that do not use them, which can cause validation warnings.
Related Extension Points

  • Root Add -- Defines connections at the root level of the project (creates the connection; PackageAddConnection references it).
  • Package Variable -- Adds variables to the package (often used alongside custom connections).
  • Package Parameter -- Adds parameters to the package.

Package Configurations

Add PackageConfigurations if you have unselected Use Project Deployment

Parameters

Name   Type                                 Description
table  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the configurations will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageConfigurations" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- Use PackageConfigurations to add Configurations if you have unselected Use Project Deployment in the BimlStudio project.
When using Package Deployment instead of Project Deployment you might need to specify additional configurations. Configurations can also be added to the Batch using PackageConfigurations combined with CustomOutput.ObjectInherit = true; -->
<# CustomOutput.ObjectInherit = true; #>
<# var configurationPath = table.GetConfigurationValue("ConfigurationPath");
configurationPath = string.IsNullOrEmpty(configurationPath) ? @"C:\Varigence\Configurations" : configurationPath; #>
<PackageConfiguration Name="LOAD_MY_Configurations">
<ExternalFileInput ExternalFilePath="<#=configurationPath#>\MY_BATCH_Configurations.dtsConfig" FileUsageType="ExistingFile" RelativePath="false" />
</PackageConfiguration>

What It Does

Adds SSIS package configurations (.dtsConfig files) to the object's package. This is used when the BimlStudio project is set to Package Deployment mode (not Project Deployment). Package configurations allow external files to override connection strings, variable values, and other package properties at runtime, providing environment-specific settings without modifying the packages themselves.

Where It Fires

The extension point is consumed during SSIS package generation and adds <PackageConfiguration> elements to the package. It fires once per object (or once per batch with ObjectInherit = true). The configuration file is loaded by SSIS at package startup before any tasks execute.

When to Use It

  1. Override connection strings per environment. Your SalesLT.Customer package on AWLT_SRC uses Package Deployment and needs a .dtsConfig file to switch connection strings between dev, test, and prod.
  2. Set variable values from an external file. A configuration file provides the TenantCode and BatchSize values for all packages in the LOAD_BFX_DV batch.
  3. Apply batch-level configurations. Set ObjectInherit = true to apply the same configuration file to every package in the batch.
  4. Support legacy SSIS environments. Older SSIS environments that use Package Deployment require .dtsConfig files; this extension point bridges that need.

Prerequisites

  • The BimlStudio project must be set to Package Deployment mode (not Project Deployment).
  • The .dtsConfig file must exist at the specified path on the SSIS server at runtime.
  • The configuration path can be read from BimlFlex metadata using table.GetConfigurationValue("ConfigurationPath").

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Package Configurations from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects.
  4. Define the <PackageConfiguration> element pointing to the .dtsConfig file.
  5. Use table.GetConfigurationValue("ConfigurationPath") for dynamic path resolution.
  6. Build the project and verify the configuration reference appears in the generated package.
A targeted example:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageConfigurations" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = true; #>
<# var configurationPath = table.GetConfigurationValue("ConfigurationPath");
configurationPath = string.IsNullOrEmpty(configurationPath) ? @"C:\Varigence\Configurations" : configurationPath; #>
<PackageConfiguration Name="LOAD_MY_Configurations">
<ExternalFileInput ExternalFilePath="<#=configurationPath#>\LOAD_BFX_DV_Configurations.dtsConfig" FileUsageType="ExistingFile" RelativePath="false" />
</PackageConfiguration>

Example

Before (Package Deployment with no custom configurations):

The package uses hardcoded connection strings and variable values.

After (PackageConfigurations adds an external config file):

<PackageConfiguration Name="LOAD_MY_Configurations">
<ExternalFileInput ExternalFilePath="C:\Varigence\Configurations\LOAD_BFX_DV_Configurations.dtsConfig" FileUsageType="ExistingFile" RelativePath="false" />
</PackageConfiguration>
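For reference, the .dtsConfig file referenced above is a small XML document in the standard SSIS configuration format. This sketch overrides a connection string; the server name and value are illustrative:

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Illustrative entry: overrides the ConnectionString property of the
       AWLT_SRC connection manager at package startup. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[AWLT_SRC].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=PRODSQL01;Initial Catalog=AdventureWorksLT;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```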

Common Mistakes

  • Using with Project Deployment: Package configurations only work with Package Deployment mode. In Project Deployment, use project parameters and environments instead.
  • Configuration file not found at runtime: The .dtsConfig file must exist at the specified path on the SSIS server. Missing files cause package startup failures.
  • Incorrect configuration path: Hardcoded paths like C:\Varigence\Configurations may not exist on all servers. Use table.GetConfigurationValue() for environment-aware path resolution.
  • Overriding critical BimlFlex variables: Configuration files can override any package variable. Accidentally overriding BimlFlex framework variables (e.g., ExecutionID) can break execution logging.
Related Extension Points

  • Package Variable -- Defines package variables that can be overridden by configurations.
  • Package Parameter -- Adds package parameters (used with Project Deployment instead of configurations).
  • Package Add Connection Reference -- Adds connection references that the configuration file may configure.

Package Parameter

Configure parameter bound values to the package

Parameters

Name   Type                                 Description
table  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the parameter will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageParameter" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- The example below adds a PackageParameter to the package. PackageParameter is normally combined with a ParameterBindings Extension Point -->
<# CustomOutput.ObjectInherit = true; #>
<Parameter Name="BatchId" DataType="String"></Parameter>
<Parameter Name="BatchInstanceId" DataType="String">0</Parameter>

What It Does

Adds custom package-level parameters to the SSIS package. Package parameters are used with Project Deployment to pass values into a package at execution time from the SSIS catalog, SQL Server Agent, or other calling mechanisms. They differ from package variables in that parameters are externally visible and can be set by the SSIS execution environment.

Where It Fires

The extension point is consumed during SSIS package generation and adds <Parameter> elements to the package definition. It fires once per object (or once per batch with ObjectInherit = true). The parameters are available throughout the package and can be referenced in expressions, connection strings, and task properties.

When to Use It

  1. Pass a BatchId from the orchestrator. Your SSIS catalog execution of the SalesLT.Customer package on AWLT_SRC needs to receive a BatchId from the calling SQL Agent job step.
  2. Define a BatchInstanceId for parallel execution. When running multiple instances of the same package simultaneously, pass a BatchInstanceId to differentiate executions.
  3. Expose configuration values as parameters. Instead of using package configurations, define parameters like SourceConnectionString that can be set in the SSIS catalog environment.
  4. Pass tenant or partition identifiers. In multi-tenant architectures, pass a TenantCode parameter to filter extraction for a specific tenant.
  5. Use this instead of Package Configurations when using Project Deployment mode, where parameters and environments replace .dtsConfig files.

Prerequisites

  • The project must use Project Deployment mode for parameters to be externally configurable.
  • Parameter names must be unique within the package.
  • Parameter data types must match the values that will be passed at execution time.
  • Consuming tasks must reference parameters using @[$Package::ParameterName] syntax.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Package Parameter from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects.
  4. Define <Parameter> elements with name, data type, and optional default value.
  5. Build the project and verify the parameters appear in the generated package.
  6. Configure the SSIS catalog environment to set parameter values for each deployment target.
A targeted example:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageParameter" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = true; #>
<Parameter Name="BatchId" DataType="String"></Parameter>
<Parameter Name="BatchInstanceId" DataType="String">0</Parameter>

Example

Before (package with no custom parameters):

The package has only BimlFlex-generated parameters.

After (PackageParameter adds BatchId and BatchInstanceId):

<Parameter Name="BatchId" DataType="String"></Parameter>
<Parameter Name="BatchInstanceId" DataType="String">0</Parameter>
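Once the parameters exist, package components reference them in the $Package namespace. As a sketch, a variable expression could consume them like this (the variable name is illustrative):

```xml
<!-- Illustrative: a package variable whose value is derived from the
     BatchId and BatchInstanceId parameters via an SSIS expression. -->
<Variable Name="AuditLabel" DataType="String" EvaluateAsExpression="true">@[$Package::BatchId] + "_" + @[$Package::BatchInstanceId]</Variable>
```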

Common Mistakes

  • Using with Package Deployment: Package parameters are only externally configurable with Project Deployment. In Package Deployment mode, use Package Configurations instead.
  • Duplicate parameter names: Adding a parameter that BimlFlex already generates causes a build error. Check existing parameters before adding custom ones.
  • Wrong data type: Defining a parameter as String when the consuming expression expects Int64 causes a runtime type conversion error.
  • Not setting default values: Parameters without defaults require values at every execution. If a value is not provided, the package fails to start.
  • Referencing parameters incorrectly: Package parameters use @[$Package::BatchId] syntax in SSIS expressions, not @[User::BatchId].
Related Extension Points

  • Package Variable -- Adds package-scoped variables (internal to the package, not externally visible like parameters).
  • Package Configurations -- Adds .dtsConfig configuration files (used with Package Deployment instead of parameters).
  • Package Add Connection Reference -- Adds connection references that parameters may configure.

Package Variable

Configure additional package variables or override default BimlFlex variables

Parameters

Name   Type                                 Description
table  BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the variable will be added

Outputs

Name           Type     Description
ObjectInherit  Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageVariable" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- Variables can also be added to all the packages for the Batch using PackageVariable combined with CustomOutput.ObjectInherit = true; -->
<# CustomOutput.ObjectInherit = true; #>
<Variable Name="TenantCode" DataType="String">UNK</Variable>
<Variable Name="CurrentModifiedDate" DataType="String" Namespace="User">1900-01-01</Variable>

What It Does

Adds custom package-scoped variables to the SSIS package or overrides default BimlFlex-generated variable values. Package variables store runtime values used by data flow expressions, control flow tasks, connection string expressions, and other SSIS components. This extension point lets you define variables that your custom extension points (Pre Data Flow, Source Pipeline, etc.) need at runtime.

Where It Fires

The extension point is consumed during SSIS package generation and adds <Variable> elements to the package definition. It fires once per object (or once per batch with ObjectInherit = true). Variables are available throughout the package immediately from the start of execution.

When to Use It

  1. Define a TenantCode variable. Your Override SQL extension point for SalesLT.Customer on AWLT_SRC filters by TenantCode, which needs to be defined as a package variable with a default value.
  2. Add a CurrentModifiedDate variable. Your Pre Data Flow and Set Parameter extension points read and write a watermark date. Define the User.CurrentModifiedDate variable so both extension points can access it.
  3. Override a BimlFlex default variable. Change the default value of an existing BimlFlex variable (e.g., override the default SSIS buffer temp storage path).
  4. Define variables for Data Flow Properties. Your Data Flow Properties extension point references User.SsisBufferTempStoragePath. Define that variable here.
  5. Apply to all packages in a batch. Set ObjectInherit = true to ensure every package in the batch has the same custom variables available.
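Where a consuming expression expects a date rather than a string, the variable can be typed accordingly. The name and default below are illustrative:

```xml
<!-- Illustrative: a DateTime-typed variable avoids string-to-date
     conversions in consuming expressions. -->
<Variable Name="LastLoadDate" DataType="DateTime" Namespace="User">1900-01-01</Variable>
```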

Prerequisites

  • Variable names must be unique within the package (unless overriding an existing BimlFlex variable).
  • The Namespace attribute defaults to User if not specified.
  • Data types must match what consuming expressions and tasks expect.
  • Default values should be sensible for development; runtime values are typically set by Pre Data Flow or Get Parameter extension points.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Package Variable from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects.
  4. Define <Variable> elements with name, data type, namespace, and default value.
  5. Build the project and verify the variables appear in the generated package.
  6. Confirm that consuming extension points can reference the variables at runtime.
A targeted example:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PackageVariable" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = true; #>
<Variable Name="TenantCode" DataType="String">UNK</Variable>
<Variable Name="CurrentModifiedDate" DataType="String" Namespace="User">1900-01-01</Variable>

Example

Before (no custom variables -- Pre Data Flow references an undefined variable):

A Pre Data Flow task references User.CurrentModifiedDate but the variable does not exist, causing a validation error.

After (PackageVariable defines the variable):

<Variable Name="CurrentModifiedDate" DataType="String" Namespace="User">1900-01-01</Variable>

Common Mistakes

  • Duplicate variable names: Adding a variable that BimlFlex already generates causes a build error. Check existing package variables before adding custom ones.
  • Wrong data type: Defining a variable as String when the consuming expression expects DateTime causes a runtime conversion error. Match the data type to the consumer.
  • Missing Namespace: If you omit the Namespace attribute, the variable may not be found by tasks referencing User.VariableName. Explicitly set Namespace="User".
  • Overriding framework variables accidentally: Changing the default value of a BimlFlex framework variable (like ExecutionID) can break execution logging.
  • Not applying ObjectInherit when needed: If your Pre Data Flow extension point uses ObjectInherit = true, the corresponding Package Variable must also use ObjectInherit = true to ensure the variable exists in every package.
Related Extension Points

  • Package Parameter -- Adds externally visible parameters (different from internal variables).
  • Package Configurations -- Adds .dtsConfig files that can override variable values at runtime.
  • Data Flow Properties -- References variables in data flow task expressions.
  • Pre Data Flow / Post Data Flow -- Tasks that commonly read/write the variables defined here.

Persistent Staging Initial Target Pipeline

Configure pipeline logic that will be injected into the source to staging package before the Persistent Staging Initial target destination node

Parameters

Name       Type                                 Description
table      BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the pipeline will be added
inputPath  String                               Contains the output path of the preceding task

Outputs

Name            Type     Description
ObjectInherit   Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName  String   You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PersistentInitialTargetPipeline" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom data flow transformations into the source-to-staging package immediately before the persistent staging initial destination node. This extension point only fires during the initial (first-run) load for the persistent staging table, allowing you to apply transformations specifically for the initial bulk load of the PSA without affecting subsequent incremental loads.

Where It Fires

The extension point is consumed during data flow generation in the source-to-staging package and places transformations before the persistent staging initial destination component. It fires only during the initial load path, which is used when the persistent staging table is empty or when BimlFlex generates a separate initial load data flow. You must set CustomOutput.OutputPathName so the PSA initial destination can connect to your last transformation.

When to Use It

  1. Add a Data Conversion before the initial PSA load. When bootstrapping awlt.Customer in BFX_ODS, convert character columns from ANSI to Unicode before the initial bulk insert.
  2. Insert a Row Count for initial load auditing. Count how many rows are being written during the initial load of awlt.SalesOrderHeader into the persistent staging table, separately from incremental counts.
  3. Apply a Conditional Split during bootstrapping. Route historical records differently during the initial PSA load of awlt.Product in BFX_ODS.
  4. Use this instead of Persistent Staging Target Pipeline when the transformation should only apply during the initial load, not during every incremental extraction.
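The row-count scenario above can be sketched as a minimal template body. The counter variable name is illustrative and would need to exist in the package (for example via a Package Variable extension point):

```xml
<!-- Illustrative: count rows flowing into the initial PSA load.
     User.InitialLoadRowCount is assumed to be defined as a package variable. -->
<# CustomOutput.ObjectInherit = false; #>
<RowCount Name="CNT - Initial Load Rows" VariableName="User.InitialLoadRowCount">
  <InputPath OutputPathName="<#=inputPath #>" />
</RowCount>
<# CustomOutput.OutputPathName = @"CNT - Initial Load Rows.Output"; #>
```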

Prerequisites

  • The object must exist in BimlFlex metadata with a persistent staging connection configured (e.g., BFX_ODS).
  • The BimlFlex project must be configured to generate a separate initial load path.
  • You must set CustomOutput.OutputPathName to the output path of your last transformation.
  • The inputPath parameter provides the output path from the preceding transformation.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Persistent Staging Initial Target Pipeline from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add data flow transformation Biml, connecting inputPath to your first transformation and setting OutputPathName on the last one.
  5. Build the project and verify the transformations appear before the PSA initial destination.
  6. Execute the initial load and confirm the transformations fire. Then run an incremental load and confirm they do not fire.
A targeted example:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PersistentInitialTargetPipeline" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert CustomerName">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="CustomerName" TargetColumn="cnv_CustomerName" DataType="String" Length="200" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert CustomerName.Output"; #>
</DataConversion>

Example

Before (default initial PSA load):

... -> OLE_DST - awlt.Customer (persistent staging initial)

After (with PersistentInitialTargetPipeline adding a conversion):

... -> DCV - Convert CustomerName -> OLE_DST - awlt.Customer (persistent staging initial)

Common Mistakes

  • Expecting it to fire on every load: This extension point only fires during the initial load path. For transformations on every load, use Persistent Staging Target Pipeline.
  • Not setting OutputPathName: Failing to set CustomOutput.OutputPathName disconnects the PSA initial destination from the data flow.
  • Confusing with Staging Initial Target Pipeline: Staging Initial Target Pipeline targets the staging destination. Persistent Staging Initial Target Pipeline targets the persistent staging destination. These are different destinations in multi-hop architectures.
  • Not testing the initial load path: The initial load path may only run once. Clear the PSA table to re-trigger it.
Related Extension Points

  • Persistent Staging Target Pipeline -- Injects transformations before the PSA destination on every load (not just initial).
  • Staging Initial Target Pipeline -- Injects transformations before the staging initial destination (different from PSA).
  • Source Pipeline -- Injects transformations after the source component.
  • Target Pipeline -- Injects transformations before the primary target destination.

Persistent Staging Target Pipeline

Configure pipeline logic that will be injected into the source to staging package before the Persistent Staging target destination node

Parameters

Name       Type                                 Description
table      BimlFlexModelWrapper.ObjectsWrapper  Contains all information related to the object to which the pipeline will be added
inputPath  String                               Contains the output path of the preceding task

Outputs

Name            Type     Description
ObjectInherit   Boolean  If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName  String   You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PersistentTargetPipeline" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom data flow transformations into the source-to-staging package immediately before the persistent staging destination node. This extension point fires on every load (both initial and incremental), allowing you to apply transformations -- such as data type conversions, derived columns, or conditional logic -- right before data is written to the persistent staging table.

Where It Fires

The extension point is consumed during data flow generation in the source-to-staging package and places transformations between the last BimlFlex-generated transformation and the persistent staging destination component. It fires once per object on every execution (unlike Persistent Staging Initial Target Pipeline which only fires on the initial load). You must set CustomOutput.OutputPathName so the PSA destination can connect to your last transformation.

When to Use It

  1. Add a Data Conversion before writing to PSA. Convert character columns from ANSI to Unicode before every load of awlt.Customer into the persistent staging table in BFX_ODS.
  2. Insert a Derived Column for audit information. Add a load timestamp or source file name column before writing awlt.SalesOrderHeader to the PSA table.
  3. Apply a Conditional Split before PSA. Filter out rows that do not meet validation criteria before they reach the persistent staging table for awlt.Product.
  4. Add a Row Count for monitoring. Audit the number of rows written to the PSA table on every execution.
  5. Use this instead of Persistent Staging Initial Target Pipeline when the transformation should apply to every execution, not just the initial load.
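The audit-column scenario above can be sketched as a template body using a Derived Column; the column name is illustrative:

```xml
<!-- Illustrative: stamp each row with a load timestamp before the PSA write. -->
<# CustomOutput.ObjectInherit = false; #>
<DerivedColumns Name="DER - Audit Columns">
  <InputPath OutputPathName="<#=inputPath #>" />
  <Columns>
    <Column Name="LoadTimestamp" DataType="DateTime">GETDATE()</Column>
  </Columns>
</DerivedColumns>
<# CustomOutput.OutputPathName = @"DER - Audit Columns.Output"; #>
```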

Prerequisites

  • The object must exist in BimlFlex metadata with a persistent staging connection configured (e.g., BFX_ODS).
  • You must set CustomOutput.OutputPathName to the output path of your last transformation.
  • The inputPath parameter provides the output path from the preceding transformation.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Persistent Staging Target Pipeline from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add data flow transformation Biml, connecting inputPath to your first transformation and setting OutputPathName on the last one.
  5. Build the project and verify the transformations appear before the PSA destination.
  6. Execute both an initial and incremental load to confirm the transformations fire on every run.
A targeted example:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PersistentTargetPipeline" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert EmailAddress">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="EmailAddress" TargetColumn="cnv_EmailAddress" DataType="String" Length="200" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert EmailAddress.Output"; #>
</DataConversion>

Example

Before (default PSA data flow ending):

... -> OLE_DST - awlt.Customer (persistent staging)

After (with PersistentTargetPipeline adding a conversion):

... -> DCV - Convert EmailAddress -> OLE_DST - awlt.Customer (persistent staging)

Common Mistakes

  • Not setting OutputPathName: Failing to set CustomOutput.OutputPathName disconnects the PSA destination from the data flow, causing a build error.
  • Confusing with Persistent Staging Initial Target Pipeline: Initial fires only on the first load; this fires on every load. If you need both, implement both extension points.
  • Confusing with Staging Target Pipeline: Staging Target Pipeline targets the staging destination. Persistent Staging Target Pipeline targets the persistent staging destination. In multi-hop architectures (source -> staging -> PSA), these are different.
  • Using control flow tasks: This extension point operates inside the data flow. Use Post Data Flow or Post Tasks for control flow tasks.
Related Extension Points

  • Persistent Staging Initial Target Pipeline -- Injects transformations before the PSA destination during the initial load only.
  • Staging Target Pipeline -- Injects transformations before the staging destination (different from PSA).
  • Source Pipeline -- Injects transformations after the source component.
  • Target Pipeline -- Injects transformations before the primary target destination.

Post Create SQL

Configure SQL that will be executed after the Create DDL for a object

Parameters

Name
TypeDescription
tableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the SQL will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PostCreateSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
ALTER TABLE [awlt].[Account] ADD CONSTRAINT [PK_awlt_Account] PRIMARY KEY CLUSTERED
(
[AccountCodeAlternateKey] ASC,
[FlexRowEffectiveFromDate] ASC
) ON [PRIMARY]
GO

What It Does

Injects custom SQL that executes immediately after the CREATE TABLE DDL for an object during deployment. This allows you to add constraints, indexes, permissions, or seed data to a table right after it has been created, ensuring the object is fully configured before any data loading begins.

Where It Fires

The extension point is stored as an annotation during metadata processing and consumed in the 0.90.x-flx-deploy-*-script.biml deployment script files. It fires once per object during the DDL deployment phase, immediately after the corresponding CREATE TABLE statement has executed.

When to Use It

  • You need to add a clustered primary key to awlt.Customer in BFX_STG after the CREATE TABLE statement because the key definition requires special options not supported by the standard metadata.
  • After creating a Data Vault hub table dv.HUB_Customer in BFX_DV, you want to add a non-clustered index on the business key for query performance.
  • You want to grant SELECT permissions on dim.Customer in BFX_DM to a reporting service account immediately after the table is created.
  • Use this instead of Override Primary Key SQL when you need to add the constraint as a separate ALTER statement rather than inline in the CREATE TABLE.
  • You need to insert default seed rows (e.g., unknown member records) into dim.Customer after it is created in BFX_DM.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • The CREATE TABLE DDL must complete successfully before Post Create SQL runs; ensure there are no errors in the CREATE statement itself.
  • You must have appropriate DDL privileges (ALTER TABLE, CREATE INDEX, GRANT) on the target database.
  • If your Post Create SQL references other objects (e.g., foreign keys to other tables), those objects must already exist.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Post Create SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_STG).
  4. Write your SQL statements to add constraints, indexes, or other DDL after the CREATE TABLE.
  5. Use GO separators between independent batches as needed.
  6. Build the project and review the deployment script to confirm your SQL appears after the CREATE TABLE statement.
  7. Deploy and verify the constraints or indexes exist on the table.

Example

Before (table created without custom primary key):

-- Default CREATE TABLE generates the table but PK needs special filegroup placement
CREATE TABLE [awlt].[Customer] (
[CustomerID] INT NOT NULL,
[FirstName] NVARCHAR(50),
[FlexRowEffectiveFromDate] DATETIME2(7) NOT NULL,
[FlexRowAuditId] BIGINT NOT NULL
)

After (with PostCreateSql extension point adding PK and index):

-- Post Create SQL fires after CREATE TABLE
ALTER TABLE [awlt].[Customer] ADD CONSTRAINT [PK_awlt_Customer] PRIMARY KEY CLUSTERED
(
[CustomerID] ASC,
[FlexRowEffectiveFromDate] ASC
) ON [PRIMARY]
GO
CREATE NONCLUSTERED INDEX [IX_awlt_Customer_AuditId]
ON [awlt].[Customer] ([FlexRowAuditId] ASC)
GO

Common Mistakes

  • Referencing columns that do not exist: If the CREATE TABLE DDL was overridden by a Create SQL extension point with different columns, your Post Create SQL may reference non-existent columns.
  • Missing GO separators: Multiple DDL statements (ALTER TABLE, CREATE INDEX) in the same batch can cause syntax errors in SQL Server.
  • Duplicate constraint names: If the CREATE TABLE already includes a primary key and your Post Create SQL tries to add another, deployment fails. Coordinate with Override Primary Key SQL to avoid conflicts.
  • Not guarding with IF NOT EXISTS: On re-deployments, constraints may already exist. Use conditional checks to make scripts idempotent.
  • Setting ObjectInherit to true carelessly: This applies the same post-create SQL to every object in the batch, which may cause failures for objects with different schemas.
  • Pre Create SQL -- Injects SQL before the CREATE DDL (the natural counterpart to Post Create SQL).
  • Create SQL -- Overrides the entire CREATE TABLE DDL itself.
  • Override Primary Key SQL -- Overrides the inline PK constraint within the CREATE statement.
  • Override Alter SQL -- Overrides ALTER TABLE statements during deployment.

Post Data Flow

Configure logic that will be injected after the main Data Flow

Parameters

Name
TypeDescription
tableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the Data Flow element will be added
precedenceConstraintStringContains the Precedence Constraint of the preceding task

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PostDataflow" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>


<!-- This can be any anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>EXEC [ssis].[SetConfigVariable] 'MY_SRC', 'MY_SRC.dbo.EJTable.CurrentModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks after the main data flow task within the object's sequence container. This allows you to run post-processing logic -- such as writing watermark values back to the BimlCatalog, executing stored procedures, rebuilding indexes, or updating audit tables -- after data extraction and loading completes. The extension point receives a precedenceConstraint from the data flow so your tasks execute in the correct order.

Where It Fires

The extension point is consumed during SSIS control flow generation and places tasks inside the SEQC - Main sequence container after the data flow task completes. It fires once per object when the control flow is assembled. The precedenceConstraint parameter connects your first task to the data flow task's output. Unlike Pre Data Flow, you do not need to set OutputPathName unless there are additional downstream tasks that need to chain after your post-data-flow logic.

When to Use It

  • You need to write the CurrentModifiedDate watermark back to the BimlCatalog after successfully extracting SalesLT.Customer from AWLT_SRC, enabling the next incremental load to pick up where this one left off.
  • You want to execute a stored procedure on BFX_STG to rebuild indexes on awlt.SalesOrderHeader after the bulk insert completes.
  • You need to update an audit or reconciliation table after data arrives in BFX_ODS from AWLT_SRC.
  • You want to send a notification (via Database Mail or a Script Task) after a specific object's extraction completes.
  • Use this instead of Target Post Process when you only need the single table parameter and the precedenceConstraint connection, and do not need separate source/target table context.

Prerequisites

  • The object must exist in BimlFlex metadata with valid connections.
  • Any connections referenced (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in your tasks (e.g., User.CurrentModifiedDate, User.ExecutionID) must be defined in the package.
  • The precedenceConstraint parameter must be used to connect your task to the preceding data flow.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Post Data Flow from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add SSIS control flow Biml (ExecuteSQL, Script Task, etc.) that includes the precedenceConstraint connection logic.
  5. Build the project in BimlStudio and verify the tasks appear after the data flow in the generated package.
  6. Execute the package and confirm the post-data-flow logic runs after extraction completes.

Example

Before (default package flow):

SEQC - Main
└─ DFT - Extract SalesLT.Customer

After (with PostDataflow writing back a watermark):

<ExecuteSQL Name="SQL - Set CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="None">
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<DirectInput>EXEC [ssis].[SetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.ModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

Common Mistakes

  • Not handling the precedenceConstraint: If precedenceConstraint is not empty, you must include it to ensure your task runs after the data flow. The template shows the conditional pattern -- always use it.
  • Assuming the data flow succeeded: Post Data Flow fires regardless of whether the data flow completed successfully or failed. Add conditional precedence constraints (e.g., EvaluationOperation="ExpressionAndConstraint") if your post-processing should only run on success.
  • Referencing undefined variables: Variables like User.CurrentModifiedDate must be defined via BimlFlex metadata, Package Variable extension points, or populated by Pre Data Flow.
  • Confusing with Target Post Process: Target Post Process provides both sourceTable and targetTable but no precedenceConstraint. Post Data Flow provides precedenceConstraint but only a single table parameter. Choose based on your needs.
  • Putting data flow transformations here: Post Data Flow is for control flow tasks, not data flow transformations. Use Target Pipeline for transformations inside the data flow.
  • Pre Data Flow -- Injects control flow tasks before the data flow.
  • Target Post Process -- Injects control flow tasks after the process with source and target table context.
  • Post Process -- Injects control flow tasks after the main process with a single table parameter.
  • Set Parameter -- Writes parameter values back to the BimlCatalog (commonly used with Post Data Flow).

Post Process

Configure logic that will be injected after the main Object process

Parameters

Name
TypeDescription
tableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the process will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PostProcess" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This can be any anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set SnapshotDate" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>EXEC [ssis].[SetConfigVariable] 'LOAD_DATAMART', 'LOAD_DATAMART.SnapshotDate', 'SnapshotDate', @VariableValue</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
</Parameters>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks after the main object process, wrapped inside the SEQC - Post Process container. Unlike Post Data Flow which fires inside SEQC - Main after the data flow, Post Process fires at the outermost level of the object's package, after the main sequence container completes. This makes it suitable for package-level cleanup tasks such as persisting configuration values, updating audit records, or sending notifications.

Where It Fires

Consumed during SSIS control flow generation in 90.flx-i-get-post-process.biml. The tasks are placed inside the SEQC - Post Process sequence container, which executes after SEQC - Main completes. It fires once per object. The extension point does not provide a precedenceConstraint parameter because it sits at the beginning of the Post Process container. It does not require OutputPathName.

When to Use It

  1. Persist a snapshot date after processing. After loading dim.Customer in BFX_DM, write the snapshot date back to the BimlCatalog so it can be used by the next execution.
  2. Rebuild indexes after bulk insert. After data arrives in awlt.SalesOrderHeader on BFX_STG, execute a stored procedure that rebuilds non-clustered indexes.
  3. Update statistics on target tables. After loading awlt.Product into BFX_ODS, run UPDATE STATISTICS to ensure the query optimizer has current information.
  4. Send a notification or log an audit record. Insert a completion record into a custom audit table after SalesLT.Customer processing finishes.
  5. Use this instead of Post Data Flow when you do not need precedenceConstraint chaining and want the task to run outside the SEQC - Main container, after all main processing completes.

Prerequisites

  • The object must exist in BimlFlex metadata with valid connections.
  • Any connections referenced (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in parameter bindings must be defined in the package.
  • Post Process runs after SEQC - Main, so variables populated during the data flow are available.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Post Process from the dropdown.
  3. Set the Target to the specific object (e.g., dim.Customer on BFX_DM) or use ObjectInherit = true for all objects.
  4. Add SSIS control flow Biml (ExecuteSQL, Script Task, etc.).
  5. Build the project and verify the tasks appear in the SEQC - Post Process container.
  6. Execute the package and confirm the post-process logic runs after the main data flow completes.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PostProcess" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set SnapshotDate" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>EXEC [ssis].[SetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.SnapshotDate', 'SnapshotDate', @VariableValue</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
</Parameters>
</ExecuteSQL>

Example

Before (no Post Process -- package ends after data flow):

SEQC - Main
└─ DFT - Extract SalesLT.Customer

After (Post Process persists a snapshot date):

SEQC - Main
└─ DFT - Extract SalesLT.Customer
SEQC - Post Process
└─ SQL - Set SnapshotDate

Common Mistakes

  • Confusing with Post Data Flow: Post Process fires in SEQC - Post Process (after the main container). Post Data Flow fires inside SEQC - Main (after the data flow task). Choose based on where you need the task to execute.
  • Confusing with Target Post Process: Target Post Process provides both sourceTable and targetTable parameters. Post Process provides only a single table parameter.
  • Referencing undefined variables: Variables like User.SnapshotDate must be defined and populated earlier in the package.
  • Assuming Post Process only runs on success: Post Process may fire even if the main data flow failed, depending on the package configuration. Add conditional logic if your post-processing should only run on success.
  • Pre Process -- Injects control flow tasks before the main process (counterpart to Post Process).
  • Post Data Flow -- Injects tasks inside SEQC - Main after the data flow.
  • Post Tasks -- Injects tasks after all tasks in the main flow with source/target context and precedenceConstraint.
  • Target Post Process -- Injects tasks after the process with both source and target table parameters.

Post Tasks

Configure logic that will be injected after all the Tasks in the main Data Flow

Parameters

Name
TypeDescription
sourceTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the pipeline will be added
targetTableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the target object to which the pipeline will be added
precedenceConstraintStringContains the Precedence Constraint of the preceding task

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PostTasks" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>


<!-- This can be any anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>EXEC [ssis].[SetConfigVariable] 'MY_SRC', 'MY_SRC.dbo.EJTable.CurrentModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks after all the tasks in the SEQC - Main sequence container, providing both source and target table context plus a precedence constraint for chaining. This is similar to Post Data Flow but provides both sourceTable and targetTable parameters, making it useful when post-processing logic needs to reference both the source and target object metadata.

Where It Fires

Consumed during SSIS control flow generation. The tasks are placed at the very end of the SEQC - Main sequence container, after the data flow and any post-data-flow tasks. It fires once per source-to-target object mapping. The precedenceConstraint parameter connects your first task to the last preceding task in the container.

When to Use It

  1. Persist a watermark value using source and target context. After extracting SalesLT.Customer from AWLT_SRC to awlt.Customer on BFX_STG, write the CurrentModifiedDate back to the BimlCatalog.
  2. Rebuild indexes on the target table. After loading awlt.SalesOrderHeader in BFX_STG, execute ALTER INDEX REBUILD using the targetTable parameter for dynamic naming.
  3. Update statistics on the staging table. Run UPDATE STATISTICS on the specific target table after the data flow completes for awlt.Product in BFX_ODS.
  4. Execute target-specific cleanup procedures. Run a stored procedure that deduplicates the target staging table after the bulk insert completes.
  5. Use this instead of Post Data Flow when you need both source and target table context, or instead of Target Post Process when you need precedenceConstraint chaining.

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • The precedenceConstraint parameter must be handled to connect to the preceding task.
  • Any connections referenced (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in your tasks must be defined in the package.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Post Tasks from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add SSIS control flow Biml including precedenceConstraint handling.
  5. Build the project and verify the tasks appear at the end of SEQC - Main.
  6. Execute the package and confirm the post-tasks logic runs after the data flow completes.
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PostTasks" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>EXEC [ssis].[SetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.CurrentModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

Example

Before (no Post Tasks -- SEQC - Main ends after data flow):

SEQC - Main
└─ DFT - Extract SalesLT.Customer

After (Post Tasks persists a watermark):

SEQC - Main
├─ DFT - Extract SalesLT.Customer
└─ SQL - Set CurrentModifiedDate

Common Mistakes

  • Not handling precedenceConstraint: The conditional pattern in the template must be preserved. Without it, the task is disconnected from the preceding data flow.
  • Confusing with Post Data Flow: Post Data Flow provides a single table parameter with a precedenceConstraint. Post Tasks provides sourceTable, targetTable, and precedenceConstraint. Choose based on your parameter needs.
  • Confusing with Post Process: Post Process fires in SEQC - Post Process (outside the main container). Post Tasks fires inside SEQC - Main (after all tasks).
  • Referencing undefined variables: Variables like User.CurrentModifiedDate must be populated earlier in the package by the data flow or a Pre Tasks / Get Parameter extension point.
  • Assuming it only runs on success: Post Tasks fires based on the precedence constraint connection, which by default runs on completion (success or failure). Add a conditional expression if you need it to run only on success.
  • Pre Tasks -- Injects control flow tasks before all tasks in SEQC - Main (counterpart to Post Tasks).
  • Post Data Flow -- Injects tasks after the data flow with a single table parameter.
  • Post Process -- Injects tasks in SEQC - Post Process (outside the main container).
  • Target Post Process -- Injects tasks after the process with source and target context but no precedence constraint.

Pre Create SQL

Configure SQL that will be executed prior to the Create DDL for a object

Parameters

Name
TypeDescription
tableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the SQL will be added

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PreCreateSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
ALTER TABLE [awlt].[Account] DROP CONSTRAINT [PK_awlt_Account] WITH ( ONLINE = OFF )
GO

What It Does

Injects custom SQL that executes immediately before the CREATE TABLE DDL for an object during deployment. This allows you to prepare the database environment by dropping constraints, removing indexes, or performing cleanup tasks that must complete before the table is recreated or altered, ensuring the CREATE statement succeeds without conflicts.

Where It Fires

The extension point is stored as an annotation during metadata processing and consumed in the 0.90.x-flx-deploy-*-script.biml deployment script files. It fires once per object during the DDL deployment phase, just before the corresponding CREATE TABLE statement is emitted.

When to Use It

  • You need to drop an existing primary key constraint on awlt.Customer in BFX_STG before the CREATE DDL rebuilds the table with a new key structure.
  • A schema migration requires dropping a foreign key from awlt.SalesOrderHeader before the table can be recreated with modified columns.
  • You want to back up data from an existing table into a temporary table before the CREATE DDL drops and recreates the object in BFX_ODS.
  • Use this instead of Override Alter SQL when you need actions to run before the full CREATE, not as a replacement for an ALTER statement.
  • You need to disable change tracking or drop indexed views that reference the table before it can be dropped and recreated in BFX_DV.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • You must have DDL privileges (ALTER TABLE, DROP CONSTRAINT) on the target database and schema.
  • Understand the current table structure and constraint names so your DROP statements reference the correct objects.
  • If setting ObjectInherit = true, ensure the SQL is generic enough to apply safely to all objects in the batch.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Pre Create SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_STG).
  4. Write your SQL to drop constraints, indexes, or perform other pre-CREATE tasks.
  5. Include conditional checks (IF EXISTS) so the script does not fail if the constraint does not exist yet.
  6. Build the project and review the deployment script to confirm your SQL appears before the CREATE TABLE statement.
  7. Deploy and verify the table is recreated successfully.

Example

Before (no extension point -- deployment fails):

-- CREATE TABLE fails because PK_awlt_Customer already exists with different columns
CREATE TABLE [awlt].[Customer] (
[CustomerID] INT NOT NULL,
[FirstName] NVARCHAR(50),
[FlexRowEffectiveFromDate] DATETIME2(7) NOT NULL,
CONSTRAINT [PK_awlt_Customer] PRIMARY KEY CLUSTERED ([CustomerID] ASC, [FlexRowEffectiveFromDate] ASC)
)

After (with PreCreateSql extension point):

-- Pre Create SQL fires first
IF EXISTS (SELECT * FROM sys.indexes WHERE name = 'PK_awlt_Customer' AND object_id = OBJECT_ID('[awlt].[Customer]'))
ALTER TABLE [awlt].[Customer] DROP CONSTRAINT [PK_awlt_Customer] WITH (ONLINE = OFF)
GO
-- Then the normal CREATE TABLE DDL follows

Common Mistakes

  • Missing IF EXISTS guards: The script fails on first deployment when the constraint does not yet exist. Always wrap DROP statements in conditional checks.
  • Incorrect constraint names: Referencing the wrong constraint name causes the DROP to silently skip, then the CREATE fails with a duplicate constraint error.
  • Missing GO separators: Batch-scoped statements in SQL Server require GO delimiters; omitting them causes syntax errors.
  • Running destructive operations without ObjectInherit = false: Setting ObjectInherit to true applies the same DROP to every object in the batch, which can inadvertently damage unrelated tables.
  • Forgetting the matching Post Create SQL: If you drop a constraint in Pre Create SQL, you may need a corresponding Post Create SQL to re-add it if the CREATE DDL does not include it.
  • Post Create SQL -- Injects SQL after the CREATE DDL (the natural counterpart to Pre Create SQL).
  • Create SQL -- Overrides the entire CREATE TABLE DDL itself.
  • Override Alter SQL -- Overrides ALTER TABLE DDL when you only need to modify existing structures.
  • Override Primary Key SQL -- Overrides just the primary key constraint within the CREATE DDL.

Pre Data Flow

Configure logic that will be injected before the main Data Flow

Parameters

Name
TypeDescription
tableBimlFlexModelWrapper.ObjectsWrapperContains all information related to the object to which the Data Flow element will be added
precedenceConstraintStringContains the Precedence Constraint of the preceding task unless it is the first task

Outputs

Name
TypeDescription
ObjectInheritBooleanIf CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathNameStringYou must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PreDataflow" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>


<!-- This can be any anything defined within the SSIS Control Flow. -->
<!-- NOTE: You must add CustomOutput.OutputPathName with the last task to connect it with the Data Flow -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Get CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'MY_SRC', 'MY_SRC.dbo.EJTable.CurrentModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Get CurrentModifiedDate.Output"; #>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks before the main data flow task within the object's sequence container. This allows you to run preparatory logic -- such as retrieving watermark values from the BimlCatalog, truncating work tables, setting configuration variables, or executing stored procedures -- before the data extraction begins. You must set CustomOutput.OutputPathName so the data flow task can connect to your last pre-data-flow task via a precedence constraint.

Where It Fires

The extension point is consumed during SSIS control flow generation and places tasks inside the SEQC - Main sequence container before the data flow task. It fires once per object when the control flow is assembled. The precedenceConstraint parameter connects your first task to any preceding tasks in the container. You must set CustomOutput.OutputPathName on your last task so the data flow task can follow it in the execution chain.

When to Use It

  • You need to retrieve a CurrentModifiedDate watermark from the BimlCatalog before extracting SalesLT.Customer from AWLT_SRC, so the data flow can use it as a filter parameter.
  • You want to execute a stored procedure on BFX_STG to truncate or prepare the staging table for awlt.SalesOrderHeader before the data flow runs.
  • You need to set SSIS variables (e.g., configuration values, batch parameters) that the data flow will reference during extraction.
  • Use this instead of Target Pre Process when you only need the single table parameter and do not need separate source/target table context.
  • Use this instead of Pre Process when you need the precedenceConstraint parameter to chain your task into the existing control flow sequence.

Prerequisites

  • The object must exist in BimlFlex metadata with valid connections.
  • You must set CustomOutput.OutputPathName on your last task so the data flow can connect via a precedence constraint.
  • Any connections referenced (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in your tasks (e.g., User.CurrentModifiedDate) must be defined in the package.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Pre Data Flow from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add SSIS control flow Biml (ExecuteSQL, Script Task, etc.) that includes the precedenceConstraint connection logic.
  5. Set CustomOutput.OutputPathName on the last task (e.g., @"SQL - Get CurrentModifiedDate.Output").
  6. Build the project in BimlStudio and verify the tasks appear before the data flow in the generated package.
  7. Execute the package and confirm the pre-data-flow logic runs before extraction begins.

Example

Before (default package flow):

SEQC - Main
└─ DFT - Extract SalesLT.Customer

After (with PreDataflow retrieving a watermark):

<ExecuteSQL Name="SQL - Get CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.ModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Get CurrentModifiedDate.Output"; #>
</ExecuteSQL>

Common Mistakes

  • Forgetting to set OutputPathName: Without CustomOutput.OutputPathName, the data flow task cannot connect to your pre-data-flow task, causing either a build error or the data flow running in parallel instead of after your task.
  • Not handling the precedenceConstraint: If precedenceConstraint is not empty, your first task must consume it as a precedence constraint input. The template shows the conditional pattern -- always use it to avoid breaking the control flow chain.
  • Referencing undefined variables: Variables like User.CurrentModifiedDate must be defined via BimlFlex metadata or Package Variable extension points.
  • Confusing with Target Pre Process: Target Pre Process provides both sourceTable and targetTable but no precedenceConstraint or OutputPathName. Pre Data Flow provides precedenceConstraint and requires OutputPathName but only has a single table parameter.
  • Putting data flow transformations here: Pre Data Flow is for control flow tasks (ExecuteSQL, Script Task, etc.), not data flow transformations. Use Source Pipeline for transformations inside the data flow.
Related Extension Points

  • Post Data Flow -- Injects control flow tasks after the data flow.
  • Target Pre Process -- Injects control flow tasks before the process with source and target table context.
  • Pre Process -- Injects control flow tasks before the main process with a single table parameter.
  • Get Parameter -- Retrieves parameter values from the BimlCatalog (commonly used with Pre Data Flow).
  • Package Variable -- Defines custom package variables used by pre-data-flow tasks.

Pre Process

Configure logic that will be injected before the main Object process

Parameters

  • table (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the process will be added

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PreProcess" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Get SnapshotDate" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<Results>
<Result Name="0" VariableName="User.SnapshotDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'LOAD_DATAMART', 'LOAD_DATAMART.SnapshotDate', 'SnapshotDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks before the main object process, wrapped inside the SEQC - Pre Process container. Unlike Pre Data Flow which fires inside the SEQC - Main container, Pre Process fires at the outermost level of the object's package, before the main sequence container begins. This makes it suitable for package-level setup tasks such as retrieving configuration values, setting snapshot dates, or executing preparatory stored procedures.
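Because no precedence constraint or OutputPathName is required here, Pre Process tasks are simpler than their Pre Data Flow counterparts. As an illustrative sketch, an audit-logging task could record the start of processing; the [audit].[ProcessLog] table and its columns are hypothetical and would need to exist in your environment:

```xml
<!-- Sketch only: the [audit].[ProcessLog] table and its columns are hypothetical. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Log Process Start" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>INSERT INTO [audit].[ProcessLog] ([ObjectName], [ExecutionId], [StartedAt]) VALUES ('<#=table.Name#>', @ExecutionID, GETDATE())</DirectInput>
<Parameters>
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>
```

Note that no precedence constraint handling and no CustomOutput.OutputPathName appear: the Pre Process container manages ordering on its own.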

Where It Fires

Consumed during SSIS control flow generation in 90.flx-i-get-pre-process.biml. The tasks are placed inside the SEQC - Pre Process sequence container, which executes before SEQC - Main. It fires once per object. The extension point does not provide a precedenceConstraint parameter because it sits at the beginning of the Pre Process container. It does not require OutputPathName because the Pre Process container connects to SEQC - Main automatically.

When to Use It

  1. Retrieve a snapshot date before processing. Before loading dim.Customer in BFX_DM, retrieve the current snapshot date from the BimlCatalog so downstream merge logic can stamp rows with the correct date.
  2. Execute a stored procedure that prepares the environment. Run a cleanup procedure on BFX_STG that truncates work tables before SalesLT.SalesOrderHeader extraction begins.
  3. Set a package variable from a configuration store. Query an external configuration table to populate User.TenantCode before the data flow uses it as a filter.
  4. Log the start of processing. Insert an audit record into a custom logging table to track when processing of SalesLT.Customer from AWLT_SRC begins.
  5. Use this instead of Pre Data Flow when you do not need precedenceConstraint chaining and want the task to run outside the SEQC - Main container.

Prerequisites

  • The object must exist in BimlFlex metadata with valid connections.
  • Any connections referenced (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in parameter bindings must be defined in the package.
  • Pre Process runs before SEQC - Main, so variables set here are available to all subsequent tasks.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Pre Process from the dropdown.
  3. Set the Target to the specific object (e.g., dim.Customer on BFX_DM) or use ObjectInherit = true for all objects in the batch.
  4. Add SSIS control flow Biml (ExecuteSQL, Script Task, etc.).
  5. Build the project and verify the tasks appear in the SEQC - Pre Process container.
  6. Execute the package and confirm the pre-process logic runs before the main data flow.

A complete extension point targeting a specific object looks like this:
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PreProcess" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Get SnapshotDate" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<Results>
<Result Name="0" VariableName="User.SnapshotDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.SnapshotDate', 'SnapshotDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

Example

Before (no Pre Process -- package starts directly with data flow):

SEQC - Main
└─ DFT - Extract SalesLT.Customer

After (Pre Process retrieves a snapshot date first):

SEQC - Pre Process
└─ SQL - Get SnapshotDate
SEQC - Main
└─ DFT - Extract SalesLT.Customer

Common Mistakes

  • Confusing with Pre Data Flow: Pre Process fires in SEQC - Pre Process (before the main container). Pre Data Flow fires inside SEQC - Main (before the data flow task). Choose based on where you need the task to execute.
  • Referencing undefined variables: Variables like User.SnapshotDate must be defined via BimlFlex metadata or Package Variable extension points.
  • Assuming Pre Process has precedenceConstraint: Unlike Pre Data Flow and Pre Tasks, Pre Process does not provide a precedenceConstraint parameter. Tasks within the Pre Process container execute based on the container's internal ordering.
  • Not testing with ObjectInherit = true: When applied to all objects, ensure the pre-processing logic is valid for every object in the batch.
Related Extension Points

  • Post Process -- Injects control flow tasks after the main process (counterpart to Pre Process).
  • Pre Data Flow -- Injects tasks inside SEQC - Main before the data flow.
  • Pre Tasks -- Injects tasks before all tasks in the main flow with source/target context and precedenceConstraint.
  • Target Pre Process -- Injects tasks before the process with both source and target table parameters.

Pre Tasks

Configure logic that will be injected before all the Tasks in the main Data Flow

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the target object to which the pipeline will be added
  • precedenceConstraint (String) -- Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
  • OutputPathName (String) -- You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PreTasks" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<!-- NOTE: You must add CustomOutput.OutputPathName with the last task to connect it with the Data Flow -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Get CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'MY_SRC', 'MY_SRC.dbo.EJTable.CurrentModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Get CurrentModifiedDate.Output"; #>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks before all the tasks in the SEQC - Main sequence container, providing both source and target table context plus a precedence constraint for chaining. This is similar to Pre Data Flow but provides both sourceTable and targetTable parameters, making it useful when your pre-processing logic needs to reference both the source and target object metadata.
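To show why the target context matters, here is a sketch that disables target indexes before a bulk load. Only targetTable.Name is used, since that is the one target member shown in this documentation; the BFX_ODS connection name and the assumption that Name resolves to a usable table identifier are illustrative (in practice you would typically disable non-clustered indexes individually rather than ALL):

```xml
<!-- Sketch only: BFX_ODS and the use of targetTable.Name as a table identifier are assumptions. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Disable Target Indexes" ConnectionName="BFX_ODS" ResultSet="None">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<DirectInput>ALTER INDEX ALL ON <#=targetTable.Name#> DISABLE</DirectInput>
<# CustomOutput.OutputPathName = @"SQL - Disable Target Indexes.Output"; #>
</ExecuteSQL>
```

A matching Post Tasks extension point would then rebuild the indexes after the load completes.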

Where It Fires

Consumed during SSIS control flow generation. The tasks are placed at the very beginning of the SEQC - Main sequence container, before the initialize step and the data flow. It fires once per source-to-target object mapping. You must set CustomOutput.OutputPathName so subsequent tasks in the container can chain to your last task via precedence constraints.

When to Use It

  1. Retrieve a watermark value using source and target context. Before extracting SalesLT.Customer from AWLT_SRC to awlt.Customer on BFX_STG, retrieve the CurrentModifiedDate from the BimlCatalog using the source connection name.
  2. Truncate the target staging table before load. Execute a TRUNCATE on awlt.SalesOrderHeader in BFX_STG before the data flow populates it, using the targetTable parameter for dynamic table naming.
  3. Disable indexes on the target before bulk insert. Drop non-clustered indexes on the target table to speed up the load of awlt.Product in BFX_ODS, then rebuild them in Post Tasks.
  4. Set up conditional logic based on source-target mapping. Check whether the target table has data and set a variable that controls whether the data flow performs a full or incremental load.
  5. Use this instead of Pre Data Flow when you need both source and target table context, or instead of Target Pre Process when you need the precedenceConstraint parameter for control flow chaining.

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • You must set CustomOutput.OutputPathName so subsequent tasks can connect.
  • The precedenceConstraint parameter must be handled to connect to preceding tasks.
  • Variables and connections referenced in your tasks must be defined in the package.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Pre Tasks from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add SSIS control flow Biml including precedenceConstraint handling and OutputPathName setting.
  5. Build the project and verify the tasks appear at the beginning of SEQC - Main.
  6. Execute the package and confirm the pre-tasks logic runs before the data flow.

A complete extension point targeting a specific object looks like this:
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="PreTasks" target="SalesLT.Customer" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Get CurrentModifiedDate" ConnectionName="BimlCatalog" ResultSet="SingleRow">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<Results>
<Result Name="0" VariableName="User.CurrentModifiedDate" />
</Results>
<DirectInput>EXEC [ssis].[GetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.CurrentModifiedDate', 'CurrentModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Get CurrentModifiedDate.Output"; #>
</ExecuteSQL>

Example

Before (no Pre Tasks -- data flow starts immediately):

SEQC - Main
└─ DFT - Extract SalesLT.Customer

After (Pre Tasks retrieves a watermark before the data flow):

SEQC - Main
├─ SQL - Get CurrentModifiedDate
└─ DFT - Extract SalesLT.Customer

Common Mistakes

  • Forgetting to set OutputPathName: Without CustomOutput.OutputPathName, the data flow and subsequent tasks cannot connect to your pre-task, causing a disconnected control flow.
  • Not handling precedenceConstraint: The conditional pattern in the template must be preserved. Omitting it causes connection errors when preceding tasks exist.
  • Confusing with Pre Data Flow: Pre Tasks provides both sourceTable and targetTable parameters. Pre Data Flow provides only a single table parameter. Choose based on whether you need source+target context.
  • Confusing with Pre Process: Pre Process fires in SEQC - Pre Process (outside the main container) and has no precedence constraint. Pre Tasks fires inside SEQC - Main and chains via precedence constraints.
  • Using data flow transformations: Pre Tasks is for control flow tasks. Use Source Pipeline for transformations inside the data flow.
Related Extension Points

  • Post Tasks -- Injects control flow tasks after all tasks in SEQC - Main (counterpart to Pre Tasks).
  • Pre Data Flow -- Injects tasks before the data flow with a single table parameter.
  • Pre Process -- Injects tasks in SEQC - Pre Process (outside the main container).
  • Target Pre Process -- Injects tasks before the process with source and target context but no precedence constraint.

Reload From PSA Where SQL

Add a WHERE clause to the PSA reload package. Currently this cannot be parameterized.

Parameters

  • table (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the SQL will be added

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="ReloadFromPsaWhereSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
WHERE [MyColumn] = 1

What It Does

Adds a WHERE clause to the SELECT statement used in PSA (Persistent Staging Area) reload packages. When you need to reload data from the PSA back into the Data Vault or other downstream layers, this extension point lets you filter which PSA records are included in the reload, rather than reprocessing the entire PSA table. The WHERE clause is appended directly to the PSA reload query.

Where It Fires

The extension point is consumed during the PSA reload package generation. It fires once per object when the reload query is assembled. The WHERE clause text is appended after the FROM clause of the PSA SELECT statement. Note that the current implementation does not support parameterized queries -- the WHERE clause must contain literal values or expressions that can be resolved at generation time.

When to Use It

  1. Reload a specific date range from PSA. After a failed Data Vault load, you need to reload only records from awlt.Customer in BFX_ODS where FlexRowEffectiveFromDate >= '2024-01-01' rather than the entire PSA history.
  2. Filter by record source during reload. Your PSA table awlt.SalesOrderHeader in BFX_ODS contains records from multiple sources. You want to reload only records where FlexRowRecordSource = 'AWLT'.
  3. Exclude soft-deleted records from reload. Filter out records where FlexRowChangeType = 'D' during the PSA-to-Data Vault reload for awlt.Product.
  4. Limit reload volume for testing. During development, add WHERE CustomerID < 1000 to limit the reload to a small subset of awlt.Customer records.

Prerequisites

  • The object must exist in BimlFlex metadata with a PSA connection configured.
  • The WHERE clause must reference columns that exist in the PSA table.
  • The clause cannot be parameterized at runtime -- values must be known at build time or hardcoded.
  • The PSA reload packages must be generated (these are separate from the normal extraction packages).
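Although the clause cannot be parameterized at runtime, it is resolved at build time, so BimlScript expressions can compute literal values when the project is built. As a sketch, a rolling 30-day window (the window length is illustrative) could be emitted like this:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="ReloadFromPsaWhereSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<!-- The date literal is computed once, when the project is built, not at each execution. -->
WHERE [FlexRowEffectiveFromDate] >= '<#=DateTime.Now.AddDays(-30).ToString("yyyy-MM-dd")#>'
```

Keep in mind the value is frozen into the generated package: rebuilding the project is required to refresh it.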

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Reload From PSA Where SQL from the dropdown.
  3. Set the Target to the specific object (e.g., awlt.Customer on BFX_ODS).
  4. Write the WHERE clause including the WHERE keyword.
  5. Build the project and verify the reload package contains the filtered query.
  6. Execute the reload package and confirm only the filtered rows are processed.

A complete extension point targeting a specific object looks like this:
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="ReloadFromPsaWhereSql" target="awlt.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
WHERE [FlexRowEffectiveFromDate] >= '2024-01-01'
AND [FlexRowRecordSource] = 'AWLT'

Example

Before (default reload query -- full PSA table):

SELECT [CustomerID], [FirstName], [LastName], [FlexRowEffectiveFromDate], [FlexRowAuditId]
FROM [awlt].[Customer]

After (with ReloadFromPsaWhereSql filtering by date):

SELECT [CustomerID], [FirstName], [LastName], [FlexRowEffectiveFromDate], [FlexRowAuditId]
FROM [awlt].[Customer]
WHERE [FlexRowEffectiveFromDate] >= '2024-01-01'
AND [FlexRowRecordSource] = 'AWLT'

Common Mistakes

  • Forgetting the WHERE keyword: The extension point expects the full clause including the WHERE keyword. Omitting it produces invalid SQL (e.g., FROM [awlt].[Customer] [FlexRowEffectiveFromDate] >= ...).
  • Referencing columns not in the PSA table: The PSA table may have different column names than the source or Data Vault tables. Verify column names against the actual PSA table structure.
  • Leaving the filter in place after a one-time reload: This extension point is often used for a one-time corrective reload. Forgetting to remove or disable it means all future reloads are filtered, missing records.
  • Expecting parameterization: The current implementation does not support runtime parameters. Hardcode values or use BimlFlex configuration values resolved at build time.
  • Using ObjectInherit = true with object-specific filters: A WHERE clause specific to one table's columns fails when applied to other objects in the batch.
Related Extension Points

  • Override SQL -- Overrides the source query for normal extraction packages (not PSA reload).
  • Append Source SQL -- Appends text to source queries (normal extraction, not PSA reload).
  • Override Merge -- Overrides the merge logic that the PSA reload feeds into.

Root Add

Add element to the root of the BimlFlex project

Parameters

  • table (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the root elements will be added

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="RootAdd" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- RootAdd is normally combined with other extension points. An example would be adding a PreProcess that reads data and saves the output into a cache file to be used in a custom lookup in SourcePipeline.
The example below demonstrates adding the required connection and file format.-->
<# CustomOutput.ObjectInherit = true; #>
<# var sourceConnection = table.GetSourceConnection();
var sourceConnectionType = sourceConnection.ConnectionType;
var sourceDatabase = table.GetSourceCatalog();
var sourceScopedName = table.GetSourceScopedName(sourceConnection);
var sourceQualifiedName = table.GetSourceQualifiedName(sourceConnection);
var sourceSchemaQualifiedName = table.GetSourceSchemaQualifiedName(sourceConnection);
var sourceSsisSafeScopedName = sourceScopedName.MakeSsisSafe();
var sourceDisplayName = table.GetSourceDisplayName(sourceConnection); #>
<Connections>
<CacheConnection Name="<#=sourceSsisSafeScopedName#>" PersistFileConnectionName="<#=sourceSsisSafeScopedName#>_File" RawFileFormatName="<#=sourceSsisSafeScopedName#>" />
<FileConnection Name="<#=sourceSsisSafeScopedName#>_File" FilePath="C:\Varigence\<#=sourceSsisSafeScopedName#>.caw" />
</Connections>
<FileFormats>
<RawFileFormat Name="<#=sourceSsisSafeScopedName#>">
<Columns>
<Column Name="MyTable_BK" DataType="AnsiString" Length="60" CodePage="1252" IndexPosition="1" />
<Column Name="MyTable_SK" DataType="AnsiString" Length="40" CodePage="1252" />
</Columns>
</RawFileFormat>
</FileFormats>

What It Does

Adds root-level Biml elements -- such as connections, file formats, and cache connections -- to the BimlFlex project. These elements are not tied to a specific data flow or control flow task; they exist at the top level of the Biml object model. The most common use case is defining cache connections and raw file formats that are consumed by custom lookup transformations added through Lookup Cache or Source Pipeline extension points.

Where It Fires

The extension point is consumed during the Biml project compilation phase and adds elements to the root <Biml> node. It fires once per object (or once per batch when ObjectInherit = true). The root-level elements become available to all packages and data flows within the project. It is typically used in combination with other extension points that reference the connections or file formats defined here.
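Beyond cache connections, any root-level Biml element can be registered this way. As a sketch, an extension-point-only audit database connection could be added at the root level; the connection name, server, and connection string are hypothetical:

```xml
<!-- Sketch only: the AuditDb name and the connection string values are hypothetical. -->
<# CustomOutput.ObjectInherit = false; #>
<Connections>
<Connection Name="AuditDb" ConnectionString="Data Source=MyServer;Initial Catalog=Audit;Provider=SQLNCLI11;Integrated Security=SSPI;" />
</Connections>
```

Extension points elsewhere in the project can then reference AuditDb by name, without the connection appearing in BimlFlex metadata.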

When to Use It

  1. Define a cache connection for a custom lookup. Your SalesLT.Customer extraction from AWLT_SRC needs a cache-based lookup against a dimension table. Use RootAdd to define the CacheConnection and FileConnection, then use Lookup Cache to populate it.
  2. Add a raw file format for delete detection. Override the raw file format used by the delete detection process for SalesLT.SalesOrderHeader by defining a custom RawFileFormat at the root level.
  3. Register a custom connection. Your extension points reference a connection (e.g., an audit database or external API endpoint) that is not defined in BimlFlex metadata. Use RootAdd to add it as a <Connection> element.
  4. Add a flat file format. A custom source pipeline reads from a flat file and needs a FlatFileFormat definition at the root level.
  5. Use this instead of adding connections through metadata when the connection is only needed by extension point logic and should not appear in the standard BimlFlex metadata.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • Element names must be unique across the project; duplicate connection or file format names cause build errors.
  • If using ObjectInherit = true, ensure the generated element names include object-specific tokens (e.g., sourceSsisSafeScopedName) to avoid collisions.
  • Other extension points that consume these root elements must reference the exact names defined here.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Root Add from the dropdown.
  3. Set the Target to the specific object or use ObjectInherit = true for all objects in the batch.
  4. Define your root-level Biml elements (<Connections>, <FileFormats>, etc.) using the table parameter for dynamic naming.
  5. Create companion extension points (Lookup Cache, Source Pipeline, etc.) that reference the connections or formats by name.
  6. Build the project and verify the root elements appear in the generated Biml output.

A complete extension point targeting a specific object looks like this:
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="RootAdd" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var sourceConnection = table.GetSourceConnection();
var sourceScopedName = table.GetSourceScopedName(sourceConnection);
var sourceSsisSafeScopedName = sourceScopedName.MakeSsisSafe(); #>
<Connections>
<CacheConnection Name="<#=sourceSsisSafeScopedName#>_Cache" PersistFileConnectionName="<#=sourceSsisSafeScopedName#>_File" RawFileFormatName="<#=sourceSsisSafeScopedName#>_Format" />
<FileConnection Name="<#=sourceSsisSafeScopedName#>_File" FilePath="C:\Varigence\Cache\<#=sourceSsisSafeScopedName#>.caw" />
</Connections>

Example

Before (no RootAdd -- custom lookup has no cache connection):

A Lookup Cache extension point references MyTableCache but no such connection exists, causing a build error.

After (RootAdd defines the cache connection):

<Connections>
<CacheConnection Name="MyTableCache" PersistFileConnectionName="MyTable_File" RawFileFormatName="MyTable_Format" />
<FileConnection Name="MyTable_File" FilePath="C:\Varigence\Cache\MyTable.caw" />
</Connections>
<FileFormats>
<RawFileFormat Name="MyTable_Format">
<Columns>
<Column Name="MyTable_BK" DataType="AnsiString" Length="60" CodePage="1252" IndexPosition="1" />
<Column Name="MyTable_SK" DataType="AnsiString" Length="40" CodePage="1252" />
</Columns>
</RawFileFormat>
</FileFormats>

Common Mistakes

  • Duplicate element names: If two objects both define a connection with the same name (especially with ObjectInherit = true and static names), the build fails. Use dynamic naming with sourceSsisSafeScopedName.
  • Mismatched names between RootAdd and consuming extension points: The ConnectionName in a Lookup Cache data flow must exactly match the Name in the RootAdd <CacheConnection>. A typo causes a build error.
  • Forgetting FileFormats for cache connections: A CacheConnection requires a corresponding RawFileFormatName. Without the matching <RawFileFormat>, the cache connection fails validation.
  • Adding connections that duplicate metadata connections: If a connection with the same name already exists in BimlFlex metadata, the RootAdd definition creates a conflict. Use unique names for extension-point-only connections.
  • Not testing the file path: FileConnection paths must be accessible at runtime. Hardcoded paths like C:\Varigence\ may not exist on the SSIS server.
Related Extension Points

  • Lookup Cache -- Populates a cache file using a data flow (often requires connections defined in RootAdd).
  • Lookup SQL -- Overrides the SQL used in lookup transformations.
  • Source Pipeline -- Can reference cache connections defined in RootAdd for custom lookups.
  • Package Add Connection Reference -- Adds connection references to a specific package (different from root-level connections).

Set Parameter

Configure override to publish parameter values

Parameters

  • table (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the process will be added

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
  • OutputPathName (String) -- You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SetParameter" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = true; #>
<# var sourceConnection = table.GetSourceConnection();
var sourceConnectionType = sourceConnection.ConnectionType;
var sourceDatabase = table.GetSourceCatalog();
var sourceScopedName = table.GetSourceScopedName(sourceConnection);
var sourceQualifiedName = table.GetSourceQualifiedName(sourceConnection);
var sourceSchemaQualifiedName = table.GetSourceSchemaQualifiedName(sourceConnection);
var sourceSsisSafeScopedName = sourceScopedName.MakeSsisSafe();
var sourceDisplayName = table.GetSourceDisplayName(sourceConnection);
#>
<# if (table.GetParameters().Any()) { #>
<ExecuteSQL Name="SQL - Set LastModified" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>EXEC [ssis].[SetConfigVariable] '<#=sourceConnection.Name#>', '<#=table.Name#>.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Set LastModified.Output"; #>
</ExecuteSQL>
<# } #>

What It Does

Writes parameter values back to the BimlCatalog configuration store after an object's data flow completes. This is the counterpart to Get Parameter: while Get Parameter reads watermark or configuration values before extraction, Set Parameter persists updated values (such as the latest LastModifiedDate) so the next incremental load can pick up where this execution left off.

Where It Fires

Consumed in the object control flow generation templates. The extension point is placed after the main data flow and post-processing tasks, typically as one of the last steps in the object's SSIS package. It fires once per object when parameter columns are defined in the metadata (table.GetParameters().Any()). The template conditionally emits an ExecuteSQL task that calls [ssis].[SetConfigVariable] on the BimlCatalog connection.

When to Use It

  1. Persist a watermark after incremental extraction. After extracting SalesLT.Customer from AWLT_SRC, write the latest ModifiedDate back to the BimlCatalog so the next run starts from that point.
  2. Store a custom high-water mark. Your extraction of SalesLT.SalesOrderHeader uses a sequence-based parameter (LastOrderID) instead of a date. Override Set Parameter to call SetConfigVariable with the max OrderID.
  3. Write multiple parameter values. The default template writes one parameter. Override to persist both LastModifiedDate and LastRowVersion for SalesLT.Product in a single post-execution step.
  4. Change the storage mechanism. Replace the [ssis].[SetConfigVariable] call with a custom stored procedure that writes to a different audit table or external configuration store.
  5. Use this instead of Post Data Flow when the only post-processing needed is persisting a parameter value, keeping the extension point purpose-specific.
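
Use case 2 can be sketched by adapting the default template. The LastOrderID key and the User.MaxOrderID variable are illustrative names only -- the variable must be populated earlier in the package:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SetParameter" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var sourceConnection = table.GetSourceConnection(); #>
<# if (table.GetParameters().Any()) { #>
<!-- Persist the hypothetical LastOrderID high-water mark instead of a date -->
<ExecuteSQL Name="SQL - Set LastOrderID" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>EXEC [ssis].[SetConfigVariable] '<#=sourceConnection.Name#>', '<#=table.Name#>.LastOrderID', 'LastOrderID', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.MaxOrderID" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Set LastOrderID.Output"; #>
</ExecuteSQL>
<# } #>
```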

Prerequisites

  • The object must have parameter columns defined in BimlFlex metadata (table.GetParameters() must return results).
  • The BimlCatalog connection must be available in the project.
  • The SSIS variable User.CurrentModifiedDate (or your custom variable) must be populated earlier in the package, typically by the data flow or a Pre Data Flow / Get Parameter extension point.
  • The [ssis].[SetConfigVariable] stored procedure must exist in the BimlCatalog database (deployed by default with BimlFlex).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Set Parameter from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC) or leave it unset and use ObjectInherit = true for all objects with parameters.
  4. Modify the [ssis].[SetConfigVariable] call to use your connection name, parameter key, and variable name.
  5. Set CustomOutput.OutputPathName so subsequent tasks can chain after this one.
  6. Build the project and verify the Set Parameter task appears after the data flow in the generated package.
A complete override targeting a single object looks like this:

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SetParameter" target="SalesLT.Customer" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<# var sourceConnection = table.GetSourceConnection(); #>
<# if (table.GetParameters().Any()) { #>
<ExecuteSQL Name="SQL - Set LastModified" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>EXEC [ssis].[SetConfigVariable] '<#=sourceConnection.Name#>', '<#=table.Name#>.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
<# CustomOutput.OutputPathName = @"SQL - Set LastModified.Output"; #>
</ExecuteSQL>
<# } #>

Example

Before (no Set Parameter -- watermark is never updated):

After each incremental extraction of SalesLT.Customer, the next run re-extracts all rows because the high-water mark in the BimlCatalog was never updated.

After (Set Parameter persists the watermark):

<ExecuteSQL Name="SQL - Set LastModified" ConnectionName="BimlCatalog" ResultSet="None">
<DirectInput>EXEC [ssis].[SetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>

Common Mistakes

  • Not pairing with Get Parameter: Set Parameter writes values, but the next execution must read them. Without a Get Parameter extension point, the persisted value is never consumed.
  • Writing the wrong variable: Setting User.CurrentModifiedDate when the data flow populated User.MaxModifiedDate means the wrong value is persisted. Verify the variable name matches what the data flow produces.
  • Missing OutputPathName: Without CustomOutput.OutputPathName, tasks that follow (e.g., Post Process) cannot connect, breaking the execution chain.
  • Overwriting with stale values on failure: If the data flow fails partway through, the Set Parameter still runs and may persist an incorrect watermark. Add conditional precedence constraints to only persist on success.
  • Applying ObjectInherit = true to objects without parameters: The template includes a conditional check (table.GetParameters().Any()), but if you remove that check while using ObjectInherit, objects without parameters get an empty task.
Related Extension Points

  • Get Parameter -- Reads parameter values from the BimlCatalog before extraction.
  • Post Data Flow -- Injects control flow tasks after the data flow (alternative location for persisting values).
  • Post Process -- Injects control flow tasks after the main process.
  • Package Variable -- Defines the SSIS variables that Set Parameter writes back.
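
The success-only persistence noted under Common Mistakes can be sketched with an explicit precedence constraint. EvaluationValue="Success" is standard Biml, but the "DFT - Main" path name is illustrative and must match your actual data flow task:

```xml
<ExecuteSQL Name="SQL - Set LastModified" ConnectionName="BimlCatalog" ResultSet="None">
<PrecedenceConstraints>
<Inputs>
<!-- Only persist the watermark when the preceding data flow succeeded -->
<Input OutputPathName="DFT - Main.Output" EvaluationOperation="Constraint" EvaluationValue="Success" />
</Inputs>
</PrecedenceConstraints>
<DirectInput>EXEC [ssis].[SetConfigVariable] 'AWLT_SRC', 'SalesLT.Customer.LastModifiedDate', 'LastModifiedDate', @VariableValue, @ExecutionID</DirectInput>
<Parameters>
<Parameter Name="@VariableValue" VariableName="User.CurrentModifiedDate" Length="-1" DataType="String" />
<Parameter Name="@ExecutionID" Direction="Input" DataType="Int64" VariableName="User.ExecutionID" />
</Parameters>
</ExecuteSQL>
```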

Source Create SQL

Configure an override of the CREATE DDL for a source object, such as a view.

Parameters

  • table (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the SQL will be added

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceCreateSql" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<!-- The example below adds a view CREATE statement to the project. -->
IF EXISTS (SELECT * from sys.objects WHERE object_id = OBJECT_ID(N'[aw].[dimProduct]') AND type IN (N'V'))
DROP VIEW [aw].[dimProduct]
GO
CREATE VIEW [aw].[dimProduct]
AS
WITH [CategoryCTE]([ParentProductCategoryID], [ProductCategoryID], [Name]) AS
(
SELECT [ParentProductCategoryID]
,[ProductCategoryID]
,[Name]
FROM [aw].[SalesLT_ProductCategory]
WHERE [ParentProductCategoryID] IS NULL
UNION ALL
SELECT C.[ParentProductCategoryID]
,C.[ProductCategoryID]
,C.[Name]
FROM [aw].[SalesLT_ProductCategory] AS C
INNER JOIN [CategoryCTE] AS BC
ON BC.[ProductCategoryID] = C.[ParentProductCategoryID]
)

SELECT p.[ProductID]
,p.[Name] AS [ProductName]
,p.[ProductNumber]
,p.[Color]
,p.[StandardCost]
,p.[ListPrice]
,p.[Size]
,p.[Weight]
,ccte.[ProductCategoryID]
,ccte.[Name] as [ProductCategoryName]
,pc.[ParentProductCategoryID]
,pc.[Name] AS [ParentProductCategoryName]
,p.[ProductModelID]
,pm.[Name] AS [ProductModel]
,pd.[Description] AS [ProductModelDescription]
,p.[SellStartDate]
,p.[SellEndDate]
,p.[DiscontinuedDate]
FROM [aw].[SalesLT_Product] p
INNER JOIN [aw].[SalesLT_ProductModel] pm
ON p.[ProductModelID] = pm.[ProductModelID]
INNER JOIN [aw].[SalesLT_ProductModelProductDescription] pmx
ON pm.[ProductModelID] = pmx.[ProductModelID]
INNER JOIN [aw].[SalesLT_ProductDescription] pd
ON pmx.[ProductDescriptionID] = pd.[ProductDescriptionID]
INNER JOIN [CategoryCTE] ccte
ON ccte.[ProductCategoryID] = p.[ProductCategoryID]
INNER JOIN [aw].[SalesLT_ProductCategory] AS pc
ON pc.[ProductCategoryID] = ccte.[ParentProductCategoryID]
WHERE pmx.[Culture] = 'en'

What It Does

Replaces the entire CREATE DDL statement for a source-system object such as a view or table used during extraction. This lets you define hand-crafted views that reshape, filter, or pre-join source data before BimlFlex pulls it into the staging layer, giving you full control over the SQL object that the extraction pipeline reads from.

Where It Fires

The extension point is stored as an annotation during metadata import in 00.flx-i-get-table.biml and 00.flx-i-get-view.biml. It is consumed at deployment time in 0.90.1-flx-deploy-tables-script.biml and referenced by 13 files across the build process. It fires once per object during the DDL script generation phase.

When to Use It

  • You need to create a flattened view in AWLT_SRC that pre-joins SalesLT.Product with SalesLT.ProductCategory before extraction, rather than extracting raw tables and joining downstream.
  • The source system exposes only tables but your ELT pattern requires reading through a view with business logic (e.g., filtering soft-deleted records from SalesLT.Customer).
  • You want to add a CTE-based transformation on the source side so that the staging layer in BFX_STG receives a pre-aggregated dataset.
  • Use this instead of Override SQL when you need to physically deploy a SQL object (view or table) to the source database, rather than just changing the SELECT query the pipeline uses.
  • You need to union multiple source tables into a single view before BimlFlex ingests from AWLT_SRC.
  • Create a source view that joins multiple tables and adds computed columns. Your Wide World Importers source needs a GetSale view that joins Sales.Invoices, Sales.InvoiceLines, Warehouse.StockItems, Warehouse.PackageTypes, and Sales.Customers — calculating Total Excluding Tax, Total Chiller Items, and Total Dry Items as derived columns. SourceCreateSql deploys this view to the source database so BimlFlex can extract from it like a regular table.
  • Create a source view with windowed functions for temporal data. Your GetCity source needs a LEAD() window function partitioned by [WWI City ID] and ordered by [Valid From] to compute a [Valid To] date — turning insert-only temporal data into a Type 2 range format before extraction.
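
The windowed-function case can be sketched as follows. The schema, table, and column names loosely follow the Wide World Importers example and are illustrative only:

```sql
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[Application].[GetCity]') AND type IN (N'V'))
DROP VIEW [Application].[GetCity]
GO
CREATE VIEW [Application].[GetCity]
AS
SELECT c.[WWI City ID]
,c.[City Name]
,c.[Valid From]
-- Derive [Valid To] from the next row's [Valid From]; open-ended rows get a high date
,LEAD(c.[Valid From], 1, '9999-12-31') OVER (PARTITION BY c.[WWI City ID] ORDER BY c.[Valid From]) AS [Valid To]
FROM [Application].[Cities] c
```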

Prerequisites

  • The source connection (e.g., AWLT_SRC) must be configured in BimlFlex metadata with appropriate permissions to create views or tables.
  • The target object must exist in BimlFlex metadata so the extension point can bind to it.
  • You must have DDL privileges (CREATE VIEW / CREATE TABLE) on the source database.
  • Ensure column names and data types in your custom DDL match the columns defined in BimlFlex metadata for the object.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source Create SQL from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Replace the template body with your custom CREATE VIEW or CREATE TABLE statement.
  5. Ensure the column list in your DDL matches the metadata columns exactly, including data types.
  6. Build the project and verify the generated deployment script in the output folder contains your custom DDL.
  7. Execute the deployment script against the source database or let your CI/CD pipeline deploy it.

Example

Before (default generated DDL):

CREATE VIEW [SalesLT].[vw_Customer]
AS
SELECT [CustomerID], [FirstName], [LastName]
FROM [SalesLT].[Customer]

After (with SourceCreateSql extension point):

IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[SalesLT].[vw_Customer]') AND type IN (N'V'))
DROP VIEW [SalesLT].[vw_Customer]
GO
CREATE VIEW [SalesLT].[vw_Customer]
AS
SELECT c.[CustomerID]
,c.[FirstName]
,c.[LastName]
,a.[City]
,a.[StateProvince]
FROM [SalesLT].[Customer] c
INNER JOIN [SalesLT].[CustomerAddress] ca ON c.[CustomerID] = ca.[CustomerID]
INNER JOIN [SalesLT].[Address] a ON ca.[AddressID] = a.[AddressID]
WHERE ca.[AddressType] = 'Main Office'

Common Mistakes

  • Column mismatch with metadata: The columns in your custom DDL do not match the BimlFlex metadata column definitions, causing build errors or runtime failures. Always keep the two in sync.
  • Missing DROP before CREATE: Omitting the IF EXISTS / DROP statement means re-deployment fails when the object already exists.
  • Forgetting the GO separator: SQL Server requires GO between DROP and CREATE for views; omitting it causes a batch parse error.
  • Using this for non-source objects: This extension point only applies to source objects. For staging, ODS, or Data Vault tables, use Create SQL instead.
  • Setting ObjectInherit to true unintentionally: This causes your custom DDL to be applied to every object in the batch rather than just the targeted one.
Related Extension Points

  • Create SQL -- Overrides CREATE DDL for non-source objects (staging, ODS, Data Vault, Data Mart tables).
  • Override SQL -- Overrides the SELECT query used by the extraction pipeline without deploying a physical object.
  • Pre Create SQL -- Injects SQL to run before the CREATE DDL (e.g., dropping constraints).
  • Post Create SQL -- Injects SQL to run after the CREATE DDL (e.g., adding indexes or constraints).

Source Error Handling

Configure pipeline logic that is injected into the error output of the source transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- Contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true, the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceErrorHandling" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
</DataConversion>

What It Does

Defines custom data flow transformations that process the error output from the source component. When the source component is configured to redirect error rows (via Source Error Handling Configuration or metadata settings), this extension point receives those error rows and defines what happens to them -- logging to an error table, writing to a flat file, applying data conversions to fix the data, or routing errors to multiple destinations.

Where It Fires

The extension point is consumed during data flow generation and connects to the error output of the source component. It fires once per object when the data flow error handling path is assembled. The inputPath parameter provides the error output path from the source component. Unlike Source Pipeline, this extension point processes only rows that failed at the source, not the main data flow.

When to Use It

  • You need to capture rows from SalesLT.Customer on AWLT_SRC that fail data type conversion at the source and log them to an error table in BFX_STG for later review.
  • You want to apply a Data Conversion transformation to fix truncated columns from SalesLT.SalesOrderHeader error rows and redirect them back into the pipeline or to an alternate destination.
  • You need to write source error rows to a flat file for manual remediation when extracting from AWLT_SRC.
  • Use this instead of letting the package fail when you want to capture and process source-level errors gracefully.
  • Pair with Source Error Handling Configuration to configure ErrorRowDisposition="RedirectRow" on the source component, which enables the error output that this extension point consumes.

Prerequisites

  • The source component must be configured to redirect error rows (ErrorRowDisposition="RedirectRow"), either through Source Error Handling Configuration or metadata settings.
  • The object must exist in BimlFlex metadata with valid source and target connections.
  • The inputPath parameter provides the error output path from the source; use it in your first transformation's InputPath.
  • An error destination (table, flat file, etc.) must exist if you are logging errors.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. First create a Source Error Handling Configuration extension point with ErrorRowDisposition="RedirectRow" for the same object.
  3. Create a new extension point and select Source Error Handling from the dropdown.
  4. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  5. Add data flow transformations that process the error rows, connecting inputPath to your first transformation.
  6. Build the project and verify the error handling path appears in the generated data flow.
  7. Test by introducing data that triggers source errors and confirm the error rows are processed as expected.

Example

Before (default behavior -- source fails on error):

OLE_SRC - SalesLT.Customer → [fails on bad data]

After (with SourceErrorHandling logging errors to a table):

<OleDbDestination Name="OLE_DST - ErrorLog" ConnectionName="BFX_STG">
<InputPath OutputPathName="<#=inputPath #>" />
<ExternalTableOutput Table="[err].[Customer_Errors]" />
</OleDbDestination>
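
The [err].[Customer_Errors] table referenced above must exist before the package runs. A possible shape -- the data columns are assumptions and must mirror your source's error output, while ErrorCode and ErrorColumn are added by SSIS:

```sql
CREATE TABLE [err].[Customer_Errors] (
[CustomerID] INT NULL,
[FirstName] NVARCHAR(50) NULL,
[LastName] NVARCHAR(50) NULL,
[ErrorCode] INT NULL, -- SSIS error code for the failing row
[ErrorColumn] INT NULL, -- lineage ID of the failing column
[LoadDateTime] DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
)
```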

Common Mistakes

  • Not configuring the error output first: This extension point only works if the source component redirects error rows. Without Source Error Handling Configuration setting ErrorRowDisposition="RedirectRow", there is no error output to consume.
  • Expecting main data flow columns: The error output includes error code and error column metadata columns in addition to the data columns. Your transformations must account for these extra columns.
  • Confusing with Source Error Handling Configuration: The Configuration extension point sets how errors are handled at the source. This extension point defines what happens to the redirected error rows. You typically need both.
  • Not creating the error destination table: If you log errors to a table (e.g., [err].[Customer_Errors]), that table must exist with columns matching the error output schema.
  • Forgetting to test with actual error data: Error handling paths are only exercised when errors occur. Test with deliberately bad data to verify the path works.
Related Extension Points

  • Source Error Handling Configuration -- Configures ErrorRowDisposition and TruncationRowDisposition on the source component.
  • Source Override -- Replaces the entire source component (can include inline error handling).
  • Source Pipeline -- Injects transformations after the source component's main output (not the error output).

Source Error Handling Configuration

Configure error handling logic that is injected into the source transformation node

Parameters

  • sourceTable (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the pipeline will be added
  • targetTable (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the target object to which the pipeline will be added
  • inputPath (String) -- Contains the output path of the preceding task

Outputs

  • ObjectInherit (Boolean) -- If CustomOutput.ObjectInherit = true, the Extension Point will be applied to all the Objects associated with the batch.
  • OutputPathName (String) -- You must set CustomOutput.OutputPathName on the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceErrorHandlingConfiguration" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be any valid ErrorHandling for the source component. -->
<# CustomOutput.ObjectInherit = false; #>
<ErrorHandling ErrorRowDisposition="RedirectRow" TruncationRowDisposition="IgnoreFailure" />

What It Does

Configures the error handling behavior (ErrorRowDisposition and TruncationRowDisposition) on the source component itself, controlling how the source adapter responds when it encounters data conversion errors or truncation during extraction. This is different from handling errors after they occur -- it configures the source component's built-in error output disposition before any data flows downstream.

Where It Fires

The extension point is consumed during data flow generation and adds an <ErrorHandling> element directly to the source component (OLE DB Source, ADO NET Source, etc.). It fires once per object when the source component is rendered. The configuration applies at the source output level, determining whether error/truncated rows are redirected, ignored, or cause the component to fail.

When to Use It

  • You need to redirect error rows from the SalesLT.Customer source on AWLT_SRC to an error output instead of failing the entire package, by setting ErrorRowDisposition="RedirectRow".
  • You want to ignore truncation warnings when extracting SalesLT.SalesOrderHeader from AWLT_SRC where source columns exceed the metadata-defined lengths, by setting TruncationRowDisposition="IgnoreFailure".
  • You need to configure the source to fail on data conversion errors but ignore truncation, or vice versa, for specific objects.
  • Use this instead of Source Error Handling when you want to configure the source component's built-in error disposition rather than adding custom transformations to process the error output.
  • Pair with Source Error Handling when you want to both redirect errors and process them downstream.
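
The mixed-disposition case reduces to a single element; only FailComponent, RedirectRow, and IgnoreFailure are valid values:

```xml
<!-- Fail on conversion errors, but let truncated rows pass through -->
<ErrorHandling ErrorRowDisposition="FailComponent" TruncationRowDisposition="IgnoreFailure" />
```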

Prerequisites

  • The object must exist in BimlFlex metadata with a valid source connection.
  • The source component must support error outputs (most OLE DB and ADO.NET sources do).
  • If using RedirectRow, you should also implement a Source Error Handling extension point to handle the redirected error rows.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source Error Handling Configuration from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Define the <ErrorHandling> element with the desired ErrorRowDisposition and TruncationRowDisposition values (FailComponent, RedirectRow, or IgnoreFailure).
  5. If using RedirectRow, create a companion Source Error Handling extension point to process the redirected rows.
  6. Build the project and verify the error handling configuration appears on the source component.
  7. Test with data that triggers errors/truncation to confirm the behavior matches expectations.

Example

Before (default source error handling -- fail on any error):

<OleDbSource Name="OLE_SRC - SalesLT.Customer" ConnectionName="AWLT_SRC">
<DirectInput>SELECT ... FROM [SalesLT].[Customer]</DirectInput>
</OleDbSource>

After (with SourceErrorHandlingConfiguration redirecting errors):

<OleDbSource Name="OLE_SRC - SalesLT.Customer" ConnectionName="AWLT_SRC">
<DirectInput>SELECT ... FROM [SalesLT].[Customer]</DirectInput>
<ErrorHandling ErrorRowDisposition="RedirectRow" TruncationRowDisposition="IgnoreFailure" />
</OleDbSource>

Common Mistakes

  • Using RedirectRow without a Source Error Handling extension point: Redirected error rows go to the error output, but if nothing consumes that output, the rows are silently lost. Always pair with Source Error Handling to capture or log error rows.
  • Using IgnoreFailure for critical data: Setting IgnoreFailure on errors means bad data is silently discarded. Only use this when data loss is acceptable for the affected columns.
  • Confusing with Source Error Handling: This extension point configures how errors are handled at the source component level (redirect, ignore, fail). Source Error Handling defines what to do with redirected error rows downstream.
  • Invalid disposition values: Only FailComponent, RedirectRow, and IgnoreFailure are valid. Misspelled values cause build errors.
Related Extension Points

  • Source Error Handling -- Handles the error output from the source component (processes redirected error rows).
  • Source Override -- Replaces the entire source component (can include inline error handling configuration).
  • Source Property -- Adds properties to the source component (different from error handling configuration).

Source File Archive Override

Configure control flow logic that overrides the SC_FILE_ARCHIVE script task call. You may also need to add a ProjectScriptFile extension point.

Parameters

  • table (BimlFlexModelWrapper.ObjectsWrapper) -- Contains all information related to the object to which the pipeline will be added
  • precedenceConstraint (String) -- Contains the Precedence Constraint of the preceding task unless it is the first task

Outputs

  • OutputPathName (String) -- You must set CustomOutput.OutputPathName on the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceFileArchiveOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="precedenceConstraint" type="String" #>

<FileSystem Name="FST - YOUR FILE ARCHIVE" Operation="MoveFile" OverwriteDestination="true">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<VariableOutput VariableName="User.ArchivePath" />
<VariableInput VariableName="User.SourceFullFilePath" />
<# CustomOutput.OutputPathName = @"FST - YOUR FILE ARCHIVE.Output"; #>
</FileSystem>

What It Does

Replaces the default file archive script task (SC_FILE_ARCHIVE) that BimlFlex generates in file-based extraction packages. Instead of the standard script task that archives processed files, you provide a custom control flow task -- such as a File System Task, Execute Process Task, or Script Task -- that handles file archival according to your specific requirements (move, copy, rename, delete, or upload to cloud storage).

Where It Fires

The extension point is consumed during control flow generation in file-based extraction packages and replaces the SC_FILE_ARCHIVE script task. It fires once per object when the archive step is rendered. You must set CustomOutput.OutputPathName so subsequent tasks in the control flow can connect to your custom archive task. The precedenceConstraint parameter connects your task to the preceding step in the control flow.

When to Use It

  • You need to replace the default archive script with a File System Task that moves processed files from the import directory to an archive directory for SalesLT.Customer file extractions.
  • You want to use an Execute Process Task to call an external tool (e.g., 7-Zip, AzCopy) to compress and archive files after extraction from SalesLT.SalesOrderHeader.
  • The default archive script does not support your file system or storage infrastructure (e.g., network shares with specific authentication requirements).
  • You want to upload processed files to Azure Blob Storage or AWS S3 after extraction, replacing the local file move with a cloud upload task.
  • Use this instead of modifying the default archive behavior when you need a completely different archival mechanism. Pair with ProjectScriptFile if your custom task requires a script project file.
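
Where processed files should be removed rather than kept, the template's pattern can be adapted with a different operation. The task name is illustrative; verify against your Biml version that DeleteFile takes only the source variable:

```xml
<FileSystem Name="FST - Delete Processed File" Operation="DeleteFile">
<# if (!string.IsNullOrEmpty(precedenceConstraint)) {#>
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<# } #>
<!-- DeleteFile needs only the source path; no destination variable -->
<VariableInput VariableName="User.SourceFullFilePath" />
<# CustomOutput.OutputPathName = @"FST - Delete Processed File.Output"; #>
</FileSystem>
```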

Prerequisites

  • The object must be configured for file-based extraction with file archiving enabled.
  • You must set CustomOutput.OutputPathName to the output path of your custom archive task.
  • The precedenceConstraint parameter must be used to connect to the preceding control flow step.
  • Variables referenced (e.g., User.ArchivePath, User.SourceFullFilePath) must be defined in the package.
  • If using a custom script task, you may also need to add a ProjectScriptFile extension point for the script assembly.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source File Archive Override from the dropdown.
  3. Set the Target to the specific file-based object.
  4. Write the custom control flow task Biml (File System Task, Script Task, etc.) that handles file archival.
  5. Include the precedenceConstraint connection logic and set CustomOutput.OutputPathName.
  6. If needed, create a companion ProjectScriptFile extension point for any custom script assemblies.
  7. Build the project in BimlStudio and verify the archive task in the generated package.
  8. Execute the package and confirm files are archived according to your custom logic.

Example

Before (default archive script task):

<ScriptTask Name="SC_FILE_ARCHIVE" ...>
<!-- Default BimlFlex file archive script -->
</ScriptTask>

After (with SourceFileArchiveOverride using a File System Task):

<FileSystem Name="FST - Archive Customer File" Operation="MoveFile" OverwriteDestination="true">
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="<#=precedenceConstraint #>" />
</Inputs>
</PrecedenceConstraints>
<VariableOutput VariableName="User.ArchivePath" />
<VariableInput VariableName="User.SourceFullFilePath" />
<# CustomOutput.OutputPathName = @"FST - Archive Customer File.Output"; #>
</FileSystem>

Common Mistakes

  • Forgetting to set OutputPathName: Without CustomOutput.OutputPathName, subsequent tasks in the control flow cannot connect, breaking the package execution chain.
  • Not handling the precedenceConstraint: Omitting the precedenceConstraint connection disconnects your task from the preceding step, causing it to execute out of order or not at all.
  • Missing variable definitions: Variables like User.ArchivePath and User.SourceFullFilePath are provided by BimlFlex. Do not redefine them, but ensure your task references them correctly.
  • Forgetting the ProjectScriptFile companion: If your custom archive logic requires a Script Task with compiled code, you must also add a ProjectScriptFile extension point to include the assembly.
  • Archive path permissions: Ensure the SSIS service account has write permissions to the archive destination directory.
Related Extension Points

  • Source File Loop Expression -- Adds expressions to the ForEachFileLoop container (often used alongside file archive overrides).
  • Source Override -- Replaces the source component in the data flow for file-based sources.
  • ProjectScriptFile -- Adds script project files needed by custom Script Tasks.
  • Pre Data Flow / Post Data Flow -- Injects control flow tasks before/after the data flow.

Source File Loop Expression

Configure expressions that will be added to the source ForEachFileLoop container

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the property will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceFileLoopExpression" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<Expression PropertyName="Directory">@[$Project::ImportPath]</Expression>

What It Does

Adds custom property expressions to the ForEachFileLoop container that BimlFlex generates for file-based source extractions. These expressions allow you to dynamically control loop properties at runtime -- such as the directory path, file specification (wildcard pattern), or other ForEachLoop properties -- using SSIS project parameters or package variables.

Where It Fires

The extension point is consumed during package generation in the file-based extraction templates and adds <Expression> elements to the ForEachFileLoop container. It is referenced in three separate Biml template files that generate file-based extraction packages. It fires once per object when the ForEachFileLoop container is rendered.

When to Use It

  • You need to override the file directory at runtime using a project parameter (e.g., @[$Project::ImportPath]) so the file extraction path for SalesLT.Customer can differ between development, test, and production environments.
  • You want to change the file specification (wildcard pattern) dynamically based on a variable, allowing the loop to pick up files matching a date-based pattern.
  • You need to set the FileNameRetrievalType or other ForEachFileLoop properties via expressions for the SalesLT.SalesOrderHeader file extraction.
  • Use this instead of hard-coding file paths in connection strings when the path needs to change between environments or at runtime.
  • You want to apply the same loop expressions to all file-based objects in a batch by setting ObjectInherit = true.
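The date-based wildcard scenario above can be sketched as a FileSpec expression; the User.LoadDate variable (assumed to hold a string such as 20240131) and the Customer_ file prefix are illustrative assumptions, not BimlFlex defaults:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceFileLoopExpression" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = false; #>
<!-- Hypothetical: pick up files such as Customer_20240131.csv -->
<Expression PropertyName="FileSpec">"Customer_" + @[User::LoadDate] + "*.csv"</Expression>
```

As with the Directory expression in the template above, the value is an SSIS expression evaluated at runtime, so the loop re-reads it on every execution.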

Prerequisites

  • The object must be configured for file-based extraction (e.g., Flat File, Excel, or other file source connection type).
  • Project parameters or package variables referenced in expressions (e.g., $Project::ImportPath) must be defined.
  • The expression syntax must follow SSIS expression language rules.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source File Loop Expression from the dropdown.
  3. Set the Target to the specific object (e.g., a file-based source object on AWLT_SRC).
  4. Add one or more <Expression> elements with valid SSIS property names and expression values.
  5. Build the project in BimlStudio and verify the expressions appear on the ForEachFileLoop container in the generated package.
  6. Execute the package and confirm the loop uses the dynamic values at runtime.

Example

Before (default ForEachFileLoop with static directory):

<ForEachFileLoop Name="FELC - Customer Files" FileSpecification="*.csv" Folder="C:\Import\Customer">
...
</ForEachFileLoop>

After (with SourceFileLoopExpression overriding the directory):

<ForEachFileLoop Name="FELC - Customer Files" FileSpecification="*.csv" Folder="C:\Import\Customer">
<Expressions>
<Expression PropertyName="Directory">@[$Project::ImportPath]</Expression>
</Expressions>
...
</ForEachFileLoop>

Common Mistakes

  • Invalid expression syntax: SSIS expressions use a specific syntax (e.g., @[$Project::ImportPath] for project parameters, @[User::MyVariable] for package variables). Using incorrect syntax causes a build or validation error.
  • Referencing undefined parameters: Using $Project::ImportPath without defining it in the project causes a validation error. Ensure the parameter exists.
  • Applying to non-file objects: This extension point only applies to file-based extractions with a ForEachFileLoop. Applying it to database source objects has no effect.
  • Overriding the wrong property: The PropertyName must match a valid ForEachFileLoop property (e.g., Directory, FileSpec). Misspelled or invalid property names cause build errors.
Related Extension Points

  • Source File Archive Override -- Replaces the default file archive script task.
  • Source Override -- Replaces the source component in the data flow (can be used for file sources).
  • Source Property -- Adds properties to the source component.
  • Package Variable -- Defines custom variables that can be used in loop expressions.

Source Override

Configure override for the Object source transformation node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the override will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<# var tableObject = new ObjectViewModel(table, table.Connection.RelatedItem.IntegrationStage);
var sourceColumns = tableObject.SourceColumns;
#>
<CustomComponent Name="CozyRoc_ParquetSource" ComponentTypeName="CozyRoc.ParquetSource" ContactInfo="Extracts data from Parquet file source.; COZYROC LLC; SSIS+; © 2006-2022 COZYROC LLC; All Rights Reserved; http://www.cozyroc.com/;0">
<DataflowOverrides>
<OutputPath OutputPathName="Parquet Source Output">
<Columns>
<# foreach(var column in sourceColumns.Where(sc => sc.IsDerived != "Y")){ #>
<Column ColumnName="<#=column.ColumnName #>" ErrorRowDisposition="FailComponent" TruncationRowDisposition="FailComponent" />
<# } #>
</Columns>
</OutputPath>
</DataflowOverrides>
<CustomProperties>
<CustomProperty Name="Schema" DataType="System.String" UITypeEditor="CozyRoc.SqlServer.SSIS.TextDialogEditor, CozyRoc.SSISPlus.UI.2019, Version=1.0.0.0, Culture=Neutral, PublicKeyToken=16cf490bb80c34ea" SupportsExpression="true" Description="JSON string representing the schema of the data file.">[
<# var isFirst = true;
foreach(var column in sourceColumns.Where(sc => sc.IsDerived != "Y")){ #>
<#=isFirst ? "" : ", " #>{
"Name" : "<#=column.ColumnName #>",
"Type" : "<#=column.DataType.ToLower() #>"
}
<# isFirst = false; } #>]</CustomProperty>
<CustomProperty Name="ComponentObject" DataType="System.Null" TypeConverter="NOTBROWSABLE">
</CustomProperty>
</CustomProperties>
<OutputPaths>
<OutputPath Name="Parquet Source Output">
<OutputColumns>
<# foreach(var column in sourceColumns.Where(sc => sc.IsDerived != "Y")){ #>
<OutputColumn ExternalMetadataColumnName="<#=column.ColumnName #>" Name="<#=column.ColumnName #>" <#=column.GetDataTypeBiml() #> ErrorOrTruncationOperation="Conversion" ErrorRowDisposition="FailComponent" TruncationRowDisposition="FailComponent" />
<# } #>
</OutputColumns>
<ExternalColumns>
<# foreach(var column in sourceColumns.Where(sc => sc.IsDerived != "Y")){ #>
<ExternalColumn Name="<#=column.ColumnName #>" <#=column.GetDataTypeBiml() #> />
<# } #>
</ExternalColumns>
</OutputPath>
<OutputPath Name="Parquet Source Error Output" IsErrorOutput="true">
<OutputColumns>
<# foreach(var column in sourceColumns.Where(sc => sc.IsDerived != "Y")){ #>
<OutputColumn Name="<#=column.ColumnName #>" <#=column.GetDataTypeBiml() #> />
<# } #>
</OutputColumns>
</OutputPath>
</OutputPaths>
<Connections>
<Connection Name="FILE" ConnectionName="CUSTOM_SRC" />
</Connections>
<# CustomOutput.OutputPathName = @"CozyRoc_ParquetSource.Parquet Source Output"; #>
</CustomComponent>

What It Does

Replaces the entire source component in the SSIS data flow with a custom source definition. This gives you complete control over how data is extracted, including the source type (OLE DB, ADO.NET, Flat File, custom third-party component), connection, query, column mappings, and all component properties. The common use case is integrating non-standard source types (e.g., CozyRoc Parquet Source, REST API source) that BimlFlex does not natively generate.

Where It Fires

The extension point is consumed during data flow generation and replaces the default source component that BimlFlex would generate. It fires once per object when the source component is rendered. You must set CustomOutput.OutputPathName to the output path of your custom source component so downstream transformations (hashing, lookups, etc.) can connect to it.

When to Use It

  • You need to replace the default OLE DB Source for SalesLT.Customer on AWLT_SRC with a CozyRoc Parquet Source to read from Parquet files instead of a database.
  • You want to use a custom REST API source component to extract data that is not available through a standard database connection.
  • You need to replace the source with an ADO.NET Source when the OLE DB provider does not support certain features required by your extraction.
  • You want to read from a custom file format (Avro, JSON, etc.) using a third-party SSIS component as the source.
  • Use this instead of Source Property when you need to change the source component type or connection entirely, not just add properties to the existing component.

Prerequisites

  • The object must exist in BimlFlex metadata so the extension point can bind to it.
  • You must set CustomOutput.OutputPathName to the output path of your custom source component.
  • The output columns from your custom source must match what downstream BimlFlex transformations expect (column names and data types).
  • For third-party components (e.g., CozyRoc), the component must be installed on the SSIS server and referenced in the project.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete source component Biml, including output columns that match the object's metadata columns.
  5. Set CustomOutput.OutputPathName to the output path of your source component (e.g., @"CozyRoc_ParquetSource.Parquet Source Output").
  6. Use tableObject.SourceColumns to dynamically generate column definitions from metadata rather than hard-coding them.
  7. Build the project in BimlStudio and verify the source component in the generated package matches your specification.
  8. Execute a test extraction to confirm data flows from your custom source through the BimlFlex transformation chain.

Example

Before (default OLE DB Source):

<OleDbSource Name="OLE_SRC - SalesLT.Customer" ConnectionName="AWLT_SRC">
<DirectInput>SELECT [CustomerID], [FirstName], [LastName] FROM [SalesLT].[Customer]</DirectInput>
</OleDbSource>

After (with SourceOverride using a Flat File Source):

<FlatFileSource Name="FF_SRC - Customer" ConnectionName="CUSTOM_SRC">
<Columns>
<Column SourceColumn="CustomerID" TargetColumn="CustomerID" />
<Column SourceColumn="FirstName" TargetColumn="FirstName" />
<Column SourceColumn="LastName" TargetColumn="LastName" />
</Columns>
</FlatFileSource>
<# CustomOutput.OutputPathName = @"FF_SRC - Customer.Output"; #>

Common Mistakes

  • Forgetting to set OutputPathName: Without CustomOutput.OutputPathName, downstream transformations (hashing, lookups) cannot connect to your source, causing a build error.
  • Column name or type mismatches: The output columns from your custom source must match what BimlFlex metadata defines. Mismatches cause downstream transformation failures or data truncation.
  • Not using dynamic column generation: Hard-coding columns makes the extension point brittle. Use tableObject.SourceColumns to loop over metadata columns dynamically, as shown in the template.
  • Missing third-party component references: Custom components like CozyRoc must be installed and the assembly referenced. Missing references cause build or runtime failures.
  • Confusing with Source Pipeline: Source Override replaces the source component itself. Source Pipeline adds transformations after the source component.
Related Extension Points

  • Source Pipeline -- Injects transformations after the source component (does not replace it).
  • Source Property -- Adds properties to the existing source component without replacing it.
  • Source Parameter -- Adds parameters to the source OLE DB command.
  • Target Override -- Replaces the target destination component (analogous to Source Override for the target side).
  • Data Flow Override -- Replaces the entire data flow task, including both source and destination.

Source Parameter

Configure parameters for the Object source transformation node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the parameter will be added

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceParameter" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
<Parameter Name="Parameter0" VariableName="User.LastModifiedDate" />
<Parameter Name="Parameter1" VariableName="User.CurrentModifiedDate" />

What It Does

Adds custom parameters to the source OLE DB command in the SSIS data flow. These parameters bind SSIS package variables to the source query's parameter placeholders (using ? markers), enabling parameterized queries for incremental extraction, date-range filtering, or any scenario where runtime values need to be passed into the source SQL statement.

Where It Fires

The extension point is consumed during data flow generation and adds <Parameter> elements to the source OLE DB Source component's parameter collection. It fires once per object when the source component is rendered. The parameters are bound to SSIS variables and passed to the source query at runtime.

When to Use It

  • You need to pass LastModifiedDate and CurrentModifiedDate watermark values to a parameterized source query for SalesLT.Customer on AWLT_SRC to implement incremental extraction (e.g., WHERE ModifiedDate BETWEEN ? AND ?).
  • You want to filter SalesLT.SalesOrderHeader by a date range using package variables populated by a Pre Data Flow or Get Parameter extension point.
  • You need to pass a batch identifier or execution ID to a stored procedure source on AWLT_SRC.
  • Use this instead of hardcoding filter values in Override SQL when the filter values change at runtime (e.g., watermarks, execution dates).
  • You want to apply the same parameter bindings to all objects in a batch by setting ObjectInherit = true.

Prerequisites

  • The source query must contain parameter placeholders (? for OLE DB) that match the number and order of the parameters you define.
  • The SSIS variables referenced (e.g., User.LastModifiedDate, User.CurrentModifiedDate) must be defined in the package, either through BimlFlex metadata, Package Variable extension points, or populated by Pre Data Flow / Get Parameter extension points.
  • The source connection must support parameterized queries (OLE DB connections do; not all source types support this).
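If the watermark variables are not already supplied by BimlFlex metadata, they can be declared through a Package Variable extension point. This sketch assumes that extension point accepts standard Biml <Variable> elements and that the seed values suit your initial load:

```xml
<!-- Hypothetical Package Variable extension point body declaring the watermark variables -->
<Variable Name="LastModifiedDate" DataType="DateTime" Namespace="User">1900-01-01</Variable>
<Variable Name="CurrentModifiedDate" DataType="DateTime" Namespace="User">9999-12-31</Variable>
```

At runtime these would typically be overwritten by a Pre Data Flow or Get Parameter extension point before the source query executes.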

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source Parameter from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Define parameters using <Parameter> elements with sequential names (Parameter0, Parameter1, etc.) mapped to SSIS variables.
  5. Ensure the source query (either auto-generated or via Override SQL) contains matching ? placeholders in the correct order.
  6. Build the project in BimlStudio and verify the parameters appear on the source component in the generated package.
  7. Execute the package and confirm the parameterized query runs correctly with the variable values.

Example

Before (source query without parameters):

SELECT [CustomerID], [FirstName], [LastName], [ModifiedDate]
FROM [SalesLT].[Customer]

After (source query with parameters via Override SQL + SourceParameter):

Source query (Override SQL):

SELECT [CustomerID], [FirstName], [LastName], [ModifiedDate]
FROM [SalesLT].[Customer]
WHERE [ModifiedDate] BETWEEN ? AND ?

SourceParameter extension point:

<Parameter Name="Parameter0" VariableName="User.LastModifiedDate" />
<Parameter Name="Parameter1" VariableName="User.CurrentModifiedDate" />

Common Mistakes

  • Parameter order mismatch: Parameters are bound by ordinal position (Parameter0, Parameter1, etc.), not by name. If the ? placeholders in the query are in a different order than your parameters, the wrong values are passed.
  • Undefined SSIS variables: Referencing User.LastModifiedDate without defining it (via Package Variable or BimlFlex metadata) causes a validation error at package execution time.
  • Using named parameters with OLE DB: OLE DB uses positional ? placeholders, not named parameters like @LastModifiedDate. Use ? in the query and sequential Parameter0, Parameter1 names.
  • Missing Override SQL: If the auto-generated source query does not contain ? placeholders, adding parameters has no effect or causes errors. Pair this with Override SQL or Append Source SQL to add the parameterized WHERE clause.
  • Data type mismatches: The SSIS variable data type must be compatible with the column data type in the WHERE clause. A string variable compared to a datetime column causes implicit conversion issues.
Related Extension Points

  • Override SQL -- Replaces the source query to include parameterized WHERE clauses.
  • Get Parameter -- Retrieves parameter values from the BimlCatalog at runtime.
  • Set Parameter -- Writes parameter values back to the BimlCatalog after execution.
  • Pre Data Flow -- Populates SSIS variables before the data flow executes.
  • Source Property -- Adds properties to the source component (different from parameters).

Source Pipeline

Configure pipeline logic that will be injected after the source transformation node

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourcePipeline" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom data flow transformations into the pipeline immediately after the source component and before any BimlFlex-generated transformations (hashing, row numbering, etc.). This allows you to manipulate data at the earliest possible point after extraction -- applying data type conversions, derived columns, conditional splits, or data cleansing logic -- before BimlFlex applies its standard transformation chain.

Where It Fires

The extension point is consumed during data flow generation and places transformations between the source component and the first BimlFlex-generated transformation. It fires once per object when the data flow is assembled. You must set CustomOutput.OutputPathName so the next transformation in the pipeline can connect to your last transformation. The inputPath parameter provides the output path from the source component.

When to Use It

  • You need to add a Data Conversion after extracting SalesLT.Customer from AWLT_SRC to convert character columns from ANSI to Unicode before BimlFlex hashing occurs.
  • You want to insert a Derived Column transformation to calculate a composite key or clean up data values from SalesLT.SalesOrderHeader before they reach the staging destination.
  • You need a Conditional Split after the source to filter out test or invalid rows from SalesLT.Product before they enter the BimlFlex transformation chain.
  • You want to add a Multicast after the source to send raw data to an audit table while the main flow continues through BimlFlex transformations.
  • Use this instead of Source Override when the default source component is correct and you only need to add transformations after it, not replace it entirely.
  • Apply a data type conversion after the source. Your MyColumn column arrives as nvarchar(max) from the source but the staging table expects nvarchar(100). Insert a DataConversion transformation after the source node to convert and rename the column (e.g., cnv_MyColumn), then set CustomOutput.OutputPathName so downstream transformations receive the converted data.
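The Conditional Split scenario above might be sketched as follows; the Valid Rows output name and the FirstName != "TEST" filter are illustrative assumptions:

```xml
<ConditionalSplit Name="CSPL - Filter Test Rows">
<InputPath OutputPathName="<#=inputPath #>" />
<OutputPaths>
<OutputPath Name="Valid Rows">
<!-- Hypothetical rule: rows failing this SSIS expression fall through to the default output -->
<Expression>FirstName != "TEST"</Expression>
</OutputPath>
</OutputPaths>
<# CustomOutput.OutputPathName = @"CSPL - Filter Test Rows.Valid Rows"; #>
</ConditionalSplit>
```

Note that OutputPathName points at the named split output, so only rows passing the filter continue into the BimlFlex transformation chain.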

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • You must set CustomOutput.OutputPathName to the output path of your last transformation so the next BimlFlex transformation can connect.
  • The inputPath parameter provides the output path from the source component; use it in your first transformation's InputPath.
  • Column names and data types in your transformation outputs must be compatible with downstream BimlFlex transformations (hashing, lookups, etc.).

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source Pipeline from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add data flow transformation Biml, connecting inputPath to your first transformation and setting OutputPathName on the last one.
  5. Build the project in BimlStudio and verify the transformations appear after the source component in the generated data flow.
  6. Execute a test extraction and confirm data passes through your transformations before reaching BimlFlex's standard transformations.

Example

Before (default data flow after source):

OLE_SRC - SalesLT.Customer → [BimlFlex hashing/transformations] → OLE_DST - awlt.Customer

After (with SourcePipeline adding a data conversion):

<DataConversion Name="DCV - Convert LastName">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="LastName" TargetColumn="cnv_LastName" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert LastName.Output"; #>
</DataConversion>

Common Mistakes

  • Not setting OutputPathName: Failing to set CustomOutput.OutputPathName disconnects the downstream BimlFlex transformations from the data flow, causing a build error.
  • Changing column names without updating metadata: If you rename columns in a Data Conversion (e.g., cnv_MyColumn), downstream transformations still reference the original column name. Use Derived Column with the same column name for in-place replacement, or update column mappings.
  • Breaking hash calculations: BimlFlex calculates row hashes after the source pipeline. If your transformation changes column values or types, hash calculations may produce different results than expected, affecting change detection.
  • Using control flow tasks instead of data flow transformations: Source Pipeline operates inside the data flow. Use Pre Data Flow for control flow tasks before the data flow.
  • Confusing with Source Override: Source Override replaces the source component itself. Source Pipeline adds transformations after the source component.
Related Extension Points

  • Source Override -- Replaces the entire source component in the data flow.
  • Source Property -- Adds properties to the source component.
  • Target Pipeline -- Injects transformations before the target destination (analogous position at the end of the data flow).
  • Staging Target Pipeline -- Injects transformations before the staging destination.
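The in-place replacement recommended under Common Mistakes can be sketched with a Derived Column that overwrites the existing column, so downstream column references stay valid; the TRIM cleanup rule is an illustrative assumption:

```xml
<DerivedColumns Name="DC - Clean LastName">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<!-- ReplaceExisting keeps the original column name, so downstream BimlFlex transformations still resolve it -->
<Column Name="LastName" DataType="String" Length="100" ReplaceExisting="true">TRIM(LastName)</Column>
</Columns>
<# CustomOutput.OutputPathName = @"DC - Clean LastName.Output"; #>
</DerivedColumns>
```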

Source Property

Configure additional properties that will be added to the source transformation node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the property will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceProperty" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
BindCharColumnAs="true" ExposeCharColumnsAsUnicode="false"

What It Does

Adds custom properties to the source component inside the SSIS data flow. These properties control how the source adapter reads data, such as how character columns are bound, whether character data is exposed as Unicode, or how the command timeout is configured. This gives you fine-grained control over source component behavior without replacing the entire source component.

Where It Fires

The extension point is consumed during data flow generation and emits property attributes directly on the source component (OLE DB Source, ADO NET Source, etc.). It fires once per object when the source component is rendered. The properties are appended to the source element's attribute list in the generated Biml.

When to Use It

  • You need to set ExposeCharColumnsAsUnicode="false" on the OLE DB Source for SalesLT.Customer on AWLT_SRC to prevent SSIS from automatically converting CHAR/VARCHAR columns to NCHAR/NVARCHAR, which can cause data truncation.
  • You want to set BindCharColumnAs="true" when extracting from SalesLT.SalesOrderHeader to control how the OLE DB provider binds character columns.
  • You need to increase the command timeout for a long-running source query by adding CommandTimeout="600" to the source component.
  • Use this instead of Source Override when the default source component is correct and you only need to add or change a few properties rather than replacing the entire component.
  • You want to apply the same source properties to all objects in a batch by setting ObjectInherit = true.
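A batch-wide variant follows the same template shape with ObjectInherit switched on; CommandTimeout is used here as the example property, per the long-running-query bullet above:

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="SourceProperty" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# /* Apply the property to every object in the batch, not just the targeted one */
CustomOutput.ObjectInherit = true; #>
CommandTimeout="600"
```

Keep the mixed-connection caveat in mind: with ObjectInherit = true, the properties must be valid for every source type in the batch.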

Prerequisites

  • The object must exist in BimlFlex metadata with a valid source connection.
  • The property names must be valid SSIS source component properties for the connection type (OLE DB, ADO.NET, Flat File, etc.).
  • Verify the property syntax matches the Biml schema for the specific source type.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Source Property from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC) or leave it unset and use ObjectInherit = true for all objects.
  4. Enter the property attributes as space-separated name-value pairs (e.g., BindCharColumnAs="true" ExposeCharColumnsAsUnicode="false").
  5. Build the project in BimlStudio and inspect the generated package to confirm the properties appear on the source component.
  6. Execute a test extraction to verify the behavior matches expectations.

Example

Before (default generated source):

<OleDbSource Name="OLE_SRC - SalesLT.Customer" ConnectionName="AWLT_SRC">
<DirectInput>SELECT ... FROM [SalesLT].[Customer]</DirectInput>
</OleDbSource>

After (with SourceProperty extension point):

<OleDbSource Name="OLE_SRC - SalesLT.Customer" ConnectionName="AWLT_SRC" BindCharColumnAs="true" ExposeCharColumnsAsUnicode="false">
<DirectInput>SELECT ... FROM [SalesLT].[Customer]</DirectInput>
</OleDbSource>

Common Mistakes

  • Using invalid property names for the connection type: Properties like ExposeCharColumnsAsUnicode are specific to OLE DB sources. Using them on a Flat File source causes a build error. Check the Biml reference for valid properties per source type.
  • Conflicting properties with metadata settings: If BimlFlex already sets a property and you add the same property, the build may fail with a duplicate attribute error.
  • Missing quotation marks around values: Property values must be quoted (e.g., ExposeCharColumnsAsUnicode="false" not ExposeCharColumnsAsUnicode=false). Missing quotes cause Biml parse errors.
  • Applying OLE DB properties when ObjectInherit is true: If the batch contains mixed source connection types, OLE DB-specific properties fail on non-OLE DB sources.
  • Confusing with Source Override: If you need to change the source type, connection, or query, use Source Override. Source Property only adds attributes to the existing source element.
Related Extension Points

  • Source Override -- Replaces the entire source component.
  • Source Pipeline -- Injects data flow transformations after the source component.
  • Target Property -- Adds properties to the target destination component (analogous to Source Property for the target side).
  • Data Flow Properties -- Adds expressions to the data flow task container rather than to individual components.

Staging Initial Target Pipeline

Configure pipeline logic that will be injected into the source-to-staging package before the Staging Initial target destination node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | You must set CustomOutput.OutputPathName on the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="StagingInitialTargetPipeline" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
<InputPath OutputPathName="<#=inputPath #>" />
<Columns>
<Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
</Columns>
<# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom data flow transformations into the source-to-staging package immediately before the staging initial destination node. This extension point only fires during the initial (first-run) load when the staging table is empty and BimlFlex uses a separate initial load path. It allows you to apply transformations specifically for the initial bulk load without affecting subsequent incremental extractions.

Where It Fires

The extension point is consumed during data flow generation in the source-to-staging package and places transformations before the staging initial destination component. It fires only during the initial load path, which is typically used when the staging table is empty or when BimlFlex generates a separate initial load data flow. You must set CustomOutput.OutputPathName so the staging initial destination can connect to your last transformation.

When to Use It

  • You need to add a Data Conversion before the initial bulk load of SalesLT.Customer from AWLT_SRC into awlt.Customer on BFX_STG, but do not want that conversion to run on subsequent incremental loads.
  • You want to insert a Row Count transformation during the initial load of SalesLT.SalesOrderHeader to audit the full initial row count separately from incremental counts.
  • You need to apply a Conditional Split during the initial load to route historical rows differently than current rows when bootstrapping awlt.Customer.
  • Use this instead of Staging Target Pipeline when the transformation should only apply to the initial load, not to every incremental extraction.
  • Use this instead of Target Pipeline when you specifically want to add transformations before the staging initial destination, not the primary target destination.

Prerequisites

  • The object must exist in BimlFlex metadata with a staging connection configured (e.g., BFX_STG).
  • The BimlFlex project must be configured to generate a separate initial load path (this is the default for most configurations).
  • You must set CustomOutput.OutputPathName to the output path of your last transformation.
  • The inputPath parameter provides the output path from the preceding transformation; use it in your first transformation's InputPath.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Staging Initial Target Pipeline from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add data flow transformation Biml, connecting inputPath to your first transformation and setting OutputPathName on the last one.
  5. Build the project in BimlStudio and verify the transformations appear before the staging initial destination in the generated data flow.
  6. Execute the initial load and confirm the transformations fire. Then execute an incremental load and confirm they do not fire.

Example

Before (default initial staging data flow ending):

... → OLE_DST - awlt.Customer (initial staging)

After (with StagingInitialTargetPipeline adding a data conversion):

<DataConversion Name="DCV - Convert CompanyName">
    <InputPath OutputPathName="<#=inputPath #>" />
    <Columns>
        <Column SourceColumn="CompanyName" TargetColumn="cnv_CompanyName" DataType="String" Length="200" />
    </Columns>
    <# CustomOutput.OutputPathName = @"DCV - Convert CompanyName.Output"; #>
</DataConversion>

Common Mistakes

  • Expecting it to fire on every load: This extension point only fires during the initial load path. If you need transformations on every load, use Staging Target Pipeline instead.
  • Not setting OutputPathName: Failing to set CustomOutput.OutputPathName disconnects the staging initial destination from the data flow.
  • Confusing with Staging Target Pipeline: Staging Initial Target Pipeline is for the initial load only; Staging Target Pipeline is for every load. If you need both, implement both extension points.
  • Not testing the initial load path: The initial load path may only run once. Test by clearing the staging table and re-running the package to trigger the initial load logic.

Related Extension Points

  • Staging Target Pipeline -- Injects transformations before the staging destination on every load (not just the initial load).
  • Target Pipeline -- Injects transformations before the primary target destination.
  • Persistent Staging Initial Target Pipeline -- Injects transformations before the persistent staging initial destination.
  • Source Pipeline -- Injects transformations after the source component.

Staging Target Pipeline

Configure pipeline logic that will be injected into the source-to-staging package before the staging target destination node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.
OutputPathName | String | You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="StagingTargetPipeline" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn">
    <InputPath OutputPathName="<#=inputPath #>" />
    <Columns>
        <Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
    </Columns>
    <# CustomOutput.OutputPathName = @"DCV - Convert MyColumn.Output"; #>
</DataConversion>

What It Does

Injects custom data flow transformations into the source-to-staging package immediately before the staging destination node. This allows you to apply transformations -- such as data type conversions, derived columns, or conditional splits -- right before data is written to the staging table. Unlike Target Pipeline, which fires before the primary target destination, this extension point specifically targets the staging destination in packages where staging is a separate step.

Where It Fires

The extension point is consumed during data flow generation in the source-to-staging package and places transformations between the last BimlFlex-generated transformation and the staging destination component. It fires once per object when the staging destination is rendered. You must set CustomOutput.OutputPathName so the staging destination can connect to your last transformation.

When to Use It

  • You need to add a Data Conversion before writing to awlt.Customer in BFX_STG to change column data types (e.g., converting ANSI strings to Unicode) specifically for the staging load.
  • You want to insert a Derived Column transformation before the staging destination to add computed columns (e.g., a load timestamp or source file name) to awlt.SalesOrderHeader in BFX_STG.
  • You need to apply a Conditional Split before the staging destination to filter out rows that do not meet staging criteria for the SalesLT.Product extraction.
  • Use this instead of Target Pipeline when you specifically want to add transformations before the staging destination, not the primary target destination.
  • Use this instead of Staging Initial Target Pipeline when the transformations should apply to the regular staging load, not the initial (first-run) staging load.

Prerequisites

  • The object must exist in BimlFlex metadata with a staging connection configured (e.g., BFX_STG).
  • You must set CustomOutput.OutputPathName to the output path of your last transformation so the staging destination can connect.
  • The inputPath parameter provides the output path from the preceding transformation; use it in your first transformation's InputPath.
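
When chaining more than one transformation, only the first consumes the inputPath parameter and only the last sets CustomOutput.OutputPathName; each intermediate component references the previous component's named output. A minimal sketch of a two-transformation chain (the component and column names are illustrative, and the DerivedColumns syntax is assumed from standard Biml):

```xml
<!-- First transformation: consumes the generated inputPath. -->
<DerivedColumns Name="DER - Add LoadDate">
    <InputPath OutputPathName="<#=inputPath #>" />
    <Columns>
        <Column Name="LoadDate" DataType="DateTime">GETDATE()</Column>
    </Columns>
</DerivedColumns>
<!-- Second transformation: consumes the first transformation's output by name. -->
<DataConversion Name="DCV - Convert EmailAddress">
    <InputPath OutputPathName="DER - Add LoadDate.Output" />
    <Columns>
        <Column SourceColumn="EmailAddress" TargetColumn="cnv_EmailAddress" DataType="String" Length="200" />
    </Columns>
</DataConversion>
<!-- Only the last transformation publishes the path the staging destination connects to. -->
<# CustomOutput.OutputPathName = @"DCV - Convert EmailAddress.Output"; #>
```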

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Staging Target Pipeline from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add data flow transformation Biml, connecting inputPath to your first transformation and setting OutputPathName on the last one.
  5. Build the project in BimlStudio and verify the transformations appear before the staging destination in the generated data flow.
  6. Execute a test extraction and confirm data passes through your transformations before reaching the staging table.

Example

Before (default staging data flow ending):

... → OLE_DST - awlt.Customer (staging)

After (with StagingTargetPipeline adding a data conversion):

<DataConversion Name="DCV - Convert EmailAddress">
    <InputPath OutputPathName="<#=inputPath #>" />
    <Columns>
        <Column SourceColumn="EmailAddress" TargetColumn="cnv_EmailAddress" DataType="String" Length="200" />
    </Columns>
    <# CustomOutput.OutputPathName = @"DCV - Convert EmailAddress.Output"; #>
</DataConversion>

Common Mistakes

  • Not setting OutputPathName: Failing to set CustomOutput.OutputPathName disconnects the staging destination from the data flow, causing a build error.
  • Confusing with Target Pipeline: Staging Target Pipeline fires before the staging destination. Target Pipeline fires before the primary target destination. In a multi-hop architecture (source -> staging -> ODS), these are different destinations.
  • Confusing with Staging Initial Target Pipeline: Staging Target Pipeline applies to every execution. Staging Initial Target Pipeline only applies to the initial (first-run) load. Use the correct one for your scenario.
  • Using control flow tasks instead of data flow transformations: This extension point operates inside the data flow. Use control flow extension points (Pre Data Flow, Post Data Flow) for tasks outside the data flow.

Related Extension Points

  • Staging Initial Target Pipeline -- Injects transformations before the staging destination during the initial load only.
  • Target Pipeline -- Injects transformations before the primary target destination.
  • Persistent Staging Target Pipeline -- Injects transformations before the persistent staging destination.
  • Source Pipeline -- Injects transformations after the source component.

Target Override

Configure override for the Object target transformation node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the override will be added
inputPath | String | Contains the output path of the preceding task

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetOverride" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>


<!-- NOTE: The type of target override must be consistent with the connection type. -->
<# var targetConnection = table.GetTargetConnection();
   var targetScopedName = table.GetTargetScopedName(targetConnection);
   var targetSsisSafeScopedName = targetScopedName.MakeSsisSafe();
#>
<# CustomOutput.ObjectInherit = false; #>
<OleDbDestination Name="OLE_DST - <#=targetSsisSafeScopedName#>" ConnectionName="<#=targetConnection#>" MaximumInsertCommitSize="500000" BatchSize="500000" CheckConstraints="false">
    <InputPath OutputPathName="<#=inputPath#>" />
    <ExternalTableOutput Table="[dbo].[MyTable]" />
</OleDbDestination>

What It Does

Replaces the entire target destination component in the SSIS data flow with a custom destination definition. This gives you complete control over how data is written to the target, including the destination type (OLE DB, ADO.NET, Flat File, custom component), connection, table mapping, batch sizes, and all other destination properties. The replacement must be consistent with the target connection type.

Where It Fires

The extension point is consumed during data flow generation and replaces the default destination component that BimlFlex would generate. It fires once per object when the target destination is rendered. The inputPath parameter provides the output path from the preceding transformation so your custom destination can connect to the data flow.

When to Use It

  • You need to change the destination batch size or commit size for awlt.Customer in BFX_STG to optimize bulk insert performance (e.g., setting MaximumInsertCommitSize="500000" and BatchSize="500000").
  • You want to replace the default OLE DB Destination with an ADO.NET Destination for loading into awlt.SalesOrderHeader on BFX_STG when the target requires ADO.NET-specific features.
  • You need to write to a custom table name or schema that differs from what BimlFlex metadata generates (e.g., writing to a temporary table [dbo].[MyTable] instead of the metadata-defined target).
  • You want to disable check constraints during loading by setting CheckConstraints="false" while also customizing other destination properties.
  • Use this instead of Target Property when you need to change the destination type, connection, or table mapping rather than just adding a few properties to the existing destination.

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • The destination type must be compatible with the target connection (e.g., OLE DB Destination for OLE DB connections, ADO.NET Destination for ADO.NET connections).
  • The inputPath parameter must be used to connect the destination to the preceding data flow transformation.
  • Column mappings in the destination must match the columns arriving from the upstream data flow.
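
When upstream column names differ from the target table's columns, the destination can declare explicit mappings rather than relying on SSIS auto-mapping. A hedged sketch, assuming the standard Biml Columns collection on OleDbDestination (the column names are illustrative):

```xml
<OleDbDestination Name="OLE_DST - awlt.Customer" ConnectionName="BFX_STG">
    <InputPath OutputPathName="<#=inputPath#>" />
    <ExternalTableOutput Table="[awlt].[Customer]" />
    <Columns>
        <!-- Map the converted upstream column onto the differently named target column. -->
        <Column SourceColumn="cnv_EmailAddress" TargetColumn="EmailAddress" />
    </Columns>
</OleDbDestination>
```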

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Target Override from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Write the complete destination component Biml, using <#=inputPath#> for the input connection and optionally using table.GetTargetConnection() and table.GetTargetScopedName() for dynamic names.
  5. Build the project in BimlStudio and verify the destination component in the generated package matches your specification.
  6. Execute a test load to confirm data writes correctly to the target table.

Example

Before (default generated destination):

<OleDbDestination Name="OLE_DST - awlt.Customer" ConnectionName="BFX_STG">
    <InputPath OutputPathName="..." />
    <ExternalTableOutput Table="[awlt].[Customer]" />
</OleDbDestination>

After (with TargetOverride customizing batch size and constraints):

<OleDbDestination Name="OLE_DST - awlt.Customer" ConnectionName="BFX_STG" MaximumInsertCommitSize="500000" BatchSize="500000" CheckConstraints="false">
    <InputPath OutputPathName="<#=inputPath#>" />
    <ExternalTableOutput Table="[awlt].[Customer]" />
</OleDbDestination>

Common Mistakes

  • Mismatched destination type and connection type: Using an OLE DB Destination with an ADO.NET connection (or vice versa) causes a runtime error. The destination type must match the connection type.
  • Forgetting to use inputPath: Hard-coding the input path instead of using <#=inputPath#> causes the destination to disconnect from the data flow when upstream transformations change.
  • Missing column mappings: If the target table has different column names than the upstream data flow, you need explicit column mappings. Otherwise SSIS cannot auto-map columns.
  • Not using dynamic naming: Hard-coding the destination name instead of using <#=targetSsisSafeScopedName#> makes the extension point non-reusable with ObjectInherit = true.
  • Confusing with Target Pipeline: Target Override replaces the destination component itself. Target Pipeline adds transformations before the destination without replacing it.

Related Extension Points

  • Target Pipeline -- Injects data flow transformations before the target destination (does not replace it).
  • Target Property -- Adds properties to the existing target destination without replacing it.
  • Source Override -- Replaces the source component in the data flow (analogous to Target Override for the source side).
  • Data Flow Override -- Replaces the entire data flow task, including both source and destination.

Target Pipeline

Configure pipeline logic that will be injected before the target transformation node

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added
inputPath | String | Contains the output path of the preceding task
thread | String | Contains the thread number of the preceding task

Outputs

Name | Type | Description
OutputPathName | String | You must add CustomOutput.OutputPathName with the last task to connect it with the next Data Flow task.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetPipeline" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="inputPath" type="String" #>
<#@ property name="thread" type="String" #>


<!-- This can be anything defined within the SSIS Data Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<DataConversion Name="DCV - Convert MyColumn<#=thread #>">
    <InputPath OutputPathName="<#=inputPath #>" />
    <Columns>
        <Column SourceColumn="MyColumn" TargetColumn="cnv_MyColumn" DataType="String" Length="100" />
    </Columns>
    <# CustomOutput.OutputPathName = @"DCV - Convert MyColumn" + thread + ".Output"; #>
</DataConversion>

What It Does

Injects custom data flow transformations into the pipeline immediately before the target destination component. This allows you to apply last-mile transformations -- such as data type conversions, conditional splits, multicast operations, or row count audits -- right before data is written to the target table. The extension point provides a thread parameter to support parallel load scenarios where multiple threads write to the same destination.

Where It Fires

The extension point is consumed during data flow generation and places transformations between the last BimlFlex-generated transformation and the target destination component. It fires once per object (or once per thread when parallel loads are configured). You must set CustomOutput.OutputPathName so the target destination can connect to your last transformation, and you must include the thread suffix in component names to avoid naming collisions in parallel load scenarios.

When to Use It

  • You need to add a Data Conversion transformation before writing to awlt.Customer in BFX_STG to convert character columns from ANSI to Unicode.
  • You want to insert a Row Count transformation before the destination to audit the number of rows being written to awlt.SalesOrderHeader in BFX_STG.
  • You need a Conditional Split before the target to route error rows to a separate destination while valid rows continue to awlt.Customer.
  • You want to add a Multicast before the destination to send data both to BFX_STG and to an audit/logging table simultaneously.
  • Use this instead of Target Override when the default destination component is correct and you only need to add transformations before it, not replace it entirely.

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • You must set CustomOutput.OutputPathName to the output path of your last transformation so the target destination can connect.
  • You must append the thread parameter to transformation names to ensure uniqueness in parallel load configurations.
  • The inputPath parameter provides the output path from the preceding transformation; use it in your first transformation's InputPath.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Target Pipeline from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add data flow transformation Biml, connecting inputPath to your first transformation and setting OutputPathName on the last one.
  5. Append <#=thread #> to all transformation names to support parallel loads.
  6. Build the project in BimlStudio and verify the transformations appear before the destination in the generated data flow.
  7. Execute a test load to confirm data passes through your transformations before reaching the target.

Example

Before (default data flow ending):

... → OLE_DST - awlt.Customer

After (with TargetPipeline adding a data conversion):

<DataConversion Name="DCV - Convert EmailAddress<#=thread #>">
    <InputPath OutputPathName="<#=inputPath #>" />
    <Columns>
        <Column SourceColumn="EmailAddress" TargetColumn="cnv_EmailAddress" DataType="String" Length="200" />
    </Columns>
    <# CustomOutput.OutputPathName = @"DCV - Convert EmailAddress" + thread + ".Output"; #>
</DataConversion>

Common Mistakes

  • Forgetting the thread suffix: Omitting <#=thread #> from transformation names causes duplicate component name errors when parallel loads are configured. Always append the thread parameter.
  • Not setting OutputPathName: Failing to set CustomOutput.OutputPathName disconnects the target destination from the data flow, causing a build error or an empty destination.
  • Using the wrong inputPath: The inputPath variable already contains the correct output path from the preceding transformation. Do not hard-code a path name.
  • Adding control flow tasks instead of data flow transformations: Target Pipeline operates inside the data flow. Use Target Pre Process or Target Post Process for control flow tasks.
  • Confusing with Staging Target Pipeline: Target Pipeline fires before the primary target destination. Staging Target Pipeline fires before the staging-specific destination in source-to-staging packages.

Related Extension Points

  • Target Override -- Replaces the entire target destination component.
  • Target Property -- Adds properties to the target destination component.
  • Source Pipeline -- Injects transformations after the source component (analogous position in the data flow).
  • Staging Target Pipeline -- Injects transformations before the staging destination node.
  • Target Pre Process / Target Post Process -- Injects control flow tasks before/after the main process with source and target context.

Target Post Process

Configure logic that will be injected after the main Object process, using the Source and Target tables as parameters

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetPostProcess" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set SnapshotDate" ConnectionName="BimlCatalog" ResultSet="None">
    <DirectInput>EXEC [ssis].[SetConfigVariable] 'LOAD_DATAMART', 'LOAD_DATAMART.SnapshotDate', 'SnapshotDate', @VariableValue</DirectInput>
    <Parameters>
        <Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
    </Parameters>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks after the main object process when both source and target table context are available. This allows you to run post-processing logic -- such as writing back watermark values, updating audit tables, rebuilding indexes, or executing stored procedures -- that requires knowledge of the specific source-to-target mapping. The extension point fires after all data flow and downstream processing for the object has completed.

Where It Fires

The extension point is consumed during SSIS control flow generation and places tasks inside the object's sequence container after the main processing completes. It fires once per source-to-target object mapping. The tasks appear after the Post Data Flow and any other post-processing tasks in the package execution sequence.

When to Use It

  • You need to write a snapshot date back to the BimlCatalog after successfully loading SalesLT.Customer from AWLT_SRC into dim.Customer on BFX_DM.
  • You want to rebuild indexes on awlt.SalesOrderHeader in BFX_STG after the bulk insert completes to restore query performance.
  • You need to execute a stored procedure on BFX_ODS that updates audit or reconciliation tables after data arrives from AWLT_SRC.
  • Use this instead of Post Data Flow when you need both source and target table context in your post-processing logic.
  • Use this instead of Post Process when you need the specific source-to-target mapping rather than just the single table context.

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • Any referenced connections (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in parameter bindings (e.g., User.SnapshotDate) must be defined in the package, either through BimlFlex metadata or via a Package Variable extension point.
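
If User.SnapshotDate is not supplied by BimlFlex metadata, it can be declared through a Package Variable extension point. A hedged sketch of such a declaration, assuming the standard Biml Variable element (the variable name and default value are illustrative):

```xml
<!-- Declares the User-namespace variable the ExecuteSQL parameter binding expects. -->
<Variable Name="SnapshotDate" DataType="String" Namespace="User">1900-01-01</Variable>
```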

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Target Post Process from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add valid SSIS control flow Biml (ExecuteSQL, Script Task, etc.) that uses the sourceTable and targetTable parameters as needed.
  5. Build the project in BimlStudio and verify the tasks appear after the main data flow in the generated package.
  6. Execute the package and confirm the post-processing logic runs after data loading completes.

Example

Before (default package flow for SalesLT.Customer):

SEQC - Main
└─ DFT - Load awlt.Customer

After (with TargetPostProcess writing back a snapshot date):

<ExecuteSQL Name="SQL - Set SnapshotDate" ConnectionName="BimlCatalog" ResultSet="None">
    <DirectInput>EXEC [ssis].[SetConfigVariable] 'LOAD_DATAMART', 'LOAD_DATAMART.SnapshotDate', 'SnapshotDate', @VariableValue</DirectInput>
    <Parameters>
        <Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
    </Parameters>
</ExecuteSQL>

Common Mistakes

  • Confusing with Post Data Flow: Post Data Flow fires in the control flow with only a single table parameter and a precedenceConstraint. Target Post Process provides both sourceTable and targetTable but no precedence constraint. Choose based on whether you need source+target context.
  • Assuming the data flow succeeded: Target Post Process fires regardless of whether the data flow completed successfully or failed. Add error handling or conditional logic if your post-processing should only run on success.
  • Referencing undefined variables: Using User.SnapshotDate without defining it via a Package Variable extension point or BimlFlex metadata causes a runtime validation error.
  • Not testing with ObjectInherit = true: When applied to all objects, ensure the post-processing logic is valid for every source-to-target mapping in the batch.

Related Extension Points

  • Target Pre Process -- Injects control flow tasks before the main object process with source and target context.
  • Post Data Flow -- Injects control flow tasks after the data flow with a single table parameter and precedence constraint support.
  • Post Process -- Injects control flow tasks after the main process with a single table parameter.
  • Package Variable -- Defines custom package variables used by post-process tasks.

Target Pre Process

Configure logic that will be injected before the main Object process, using the Source and Target tables as parameters

Parameters

Name | Type | Description
sourceTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the pipeline will be added
targetTable | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the target object to which the pipeline will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetPreProcess" #>
<#@ property name="sourceTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>
<#@ property name="targetTable" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<!-- This can be anything defined within the SSIS Control Flow. -->
<# CustomOutput.ObjectInherit = false; #>
<ExecuteSQL Name="SQL - Set SnapshotDate" ConnectionName="BimlCatalog" ResultSet="None">
    <DirectInput>EXEC [ssis].[SetConfigVariable] 'LOAD_DATAMART', 'LOAD_DATAMART.SnapshotDate', 'SnapshotDate', @VariableValue</DirectInput>
    <Parameters>
        <Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
    </Parameters>
</ExecuteSQL>

What It Does

Injects one or more SSIS control flow tasks before the main object process when both source and target table context are available. This allows you to run preparatory logic -- such as retrieving configuration values, truncating staging tables, or setting snapshot dates -- that requires knowledge of the specific source-to-target mapping. Unlike Pre Data Flow, which only provides a single table parameter, this extension point provides both sourceTable and targetTable wrappers.

Where It Fires

The extension point is consumed during SSIS control flow generation and places tasks inside the object's sequence container before the main processing begins. It fires once per source-to-target object mapping. The tasks appear before the Pre Data Flow and Data Flow tasks in the package execution sequence.

When to Use It

  • You need to retrieve a snapshot date or configuration variable from the BimlCatalog before loading SalesLT.Customer from AWLT_SRC into dim.Customer on BFX_DM.
  • You want to truncate a staging table or temporary work table on BFX_STG before the data flow begins, with access to both source and target table metadata.
  • You need to execute a stored procedure on BFX_ODS that prepares the target table (e.g., disabling indexes) before data arrives from AWLT_SRC.
  • Use this instead of Pre Data Flow when you need both source and target table context in your pre-processing logic.
  • Use this instead of Pre Process when you need the specific source-to-target mapping rather than just the single table context.

Prerequisites

  • The object must exist in BimlFlex metadata with valid source and target connections.
  • Any referenced connections (e.g., BimlCatalog) must be available in the project.
  • Variables referenced in parameter bindings (e.g., User.SnapshotDate) must be defined in the package, either through BimlFlex metadata or via a Package Variable extension point.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Target Pre Process from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC).
  4. Add valid SSIS control flow Biml (ExecuteSQL, Script Task, etc.) that uses the sourceTable and targetTable parameters as needed.
  5. Build the project in BimlStudio and verify the tasks appear before the main data flow in the generated package.
  6. Execute the package and confirm the pre-processing logic runs before data extraction begins.

Example

Before (default package flow for SalesLT.Customer):

SEQC - Main
└─ DFT - Load awlt.Customer

After (with TargetPreProcess retrieving a snapshot date):

<ExecuteSQL Name="SQL - Set SnapshotDate" ConnectionName="BimlCatalog" ResultSet="None">
    <DirectInput>EXEC [ssis].[SetConfigVariable] 'LOAD_DATAMART', 'LOAD_DATAMART.SnapshotDate', 'SnapshotDate', @VariableValue</DirectInput>
    <Parameters>
        <Parameter Name="@VariableValue" VariableName="User.SnapshotDate" Length="-1" DataType="String" />
    </Parameters>
</ExecuteSQL>

Common Mistakes

  • Confusing with Pre Data Flow: Pre Data Flow fires in the control flow with only a single table parameter and a precedenceConstraint. Target Pre Process provides both sourceTable and targetTable but no precedence constraint output. Choose based on whether you need source+target context.
  • Referencing undefined variables: Using User.SnapshotDate without defining it via a Package Variable extension point or BimlFlex metadata causes a runtime validation error.
  • Not testing with ObjectInherit = true: When applied to all objects, ensure the pre-processing logic is valid for every source-to-target mapping in the batch, not just the one you tested.
  • Ordering confusion with Pre Process / Pre Data Flow: Target Pre Process, Pre Process, and Pre Data Flow all fire at different points. Target Pre Process fires earliest with source+target context; verify execution order in the generated package.

Related Extension Points

  • Target Post Process -- Injects control flow tasks after the main object process with source and target context.
  • Pre Data Flow -- Injects control flow tasks before the data flow with a single table parameter and precedence constraint support.
  • Pre Process -- Injects control flow tasks before the main process with a single table parameter.
  • Package Variable -- Defines custom package variables used by pre-process tasks.

Target Property

Configure additional properties that will be added to the target transformation node

Parameters

Name | Type | Description
table | BimlFlexModelWrapper.ObjectsWrapper | Contains all information related to the object to which the property will be added

Outputs

Name | Type | Description
ObjectInherit | Boolean | If CustomOutput.ObjectInherit = true then the Extension Point will be applied to all the Objects associated with the batch.

Template

<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetProperty" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>


<# CustomOutput.ObjectInherit = false; #>
KeepNulls="true" KeepIdentity="false"

What It Does

Adds custom properties to the target destination component inside the SSIS data flow. These properties control destination behavior such as whether null values are preserved during insert, whether identity column values are retained from the source, or whether check constraints are evaluated. This gives you fine-grained control over how SSIS writes data into the target table without replacing the entire destination component.

Where It Fires

The extension point is consumed during data flow generation and emits property attributes directly on the target destination component (OLE DB Destination, ADO NET Destination, etc.). It fires once per object when the target component is rendered. The properties are appended to the destination element's attribute list in the generated Biml.

When to Use It

  • You need to set KeepNulls="true" on the OLE DB Destination for awlt.Customer in BFX_STG so that source NULL values are preserved rather than being replaced by column defaults.
  • You want to enable KeepIdentity="true" when loading awlt.SalesOrderHeader into BFX_STG to retain the original identity values from AWLT_SRC.
  • You need to disable check constraint evaluation during bulk inserts by setting CheckConstraints="false" for performance on large loads into BFX_ODS.
  • Use this instead of Target Override when the default destination component is correct and you only need to add or change a few properties rather than replacing the entire component.
  • You want to apply the same properties to all objects in a batch by setting ObjectInherit = true.
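
When ObjectInherit = true is combined with BimlScript logic, the emitted properties can still vary per object. A sketch, assuming the wrapper exposes a Name property (the actual ObjectsWrapper member names are not documented here and should be verified):

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetProperty" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = true; #>
<# // Hypothetical: preserve NULLs only for the Customer load; other objects keep the default.
   if (table.Name == "awlt.Customer") { #>
KeepNulls="true"
<# } #>
```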

Prerequisites

  • The object must exist in BimlFlex metadata with a valid target connection.
  • The property names must be valid SSIS destination component properties for the connection type (OLE DB, ADO.NET, Flat File, etc.).
  • Verify the property syntax matches the Biml schema for the specific destination type.

Implementation Steps

  1. Open the BimlFlex App and navigate to the Extension Points screen.
  2. Create a new extension point and select Target Property from the dropdown.
  3. Set the Target to the specific object (e.g., SalesLT.Customer on AWLT_SRC targeting BFX_STG).
  4. Enter the property attributes as space-separated name-value pairs (e.g., KeepNulls="true" KeepIdentity="false").
  5. Build the project in BimlStudio and inspect the generated package to confirm the properties appear on the destination component.
  6. Execute a test load and verify the behavior matches expectations (e.g., NULL values are preserved, identity values are kept).

Example

Before (default generated destination):

<OleDbDestination Name="OLE_DST - awlt.Customer" ConnectionName="BFX_STG" MaximumInsertCommitSize="500000">
<InputPath OutputPathName="..." />
<ExternalTableOutput Table="[awlt].[Customer]" />
</OleDbDestination>

After (with TargetProperty extension point adding KeepNulls="true"):

<OleDbDestination Name="OLE_DST - awlt.Customer" ConnectionName="BFX_STG" MaximumInsertCommitSize="500000" KeepNulls="true" KeepIdentity="false">
<InputPath OutputPathName="..." />
<ExternalTableOutput Table="[awlt].[Customer]" />
</OleDbDestination>

Common Mistakes

  • Using invalid property names for the connection type: Properties like KeepNulls are specific to OLE DB destinations. Using them on an ADO.NET destination causes a build error. Check the Biml reference for valid properties per destination type.
  • Conflicting properties with metadata settings: If BimlFlex already sets a property (e.g., MaximumInsertCommitSize) and you add the same property, the build may fail with a duplicate attribute error. Review the generated Biml before adding properties.
  • Missing quotation marks around values: Property values must be quoted (e.g., KeepNulls="true" not KeepNulls=true). Missing quotes cause Biml parse errors.
  • Applying OLE DB properties to flat file destinations: When ObjectInherit = true is set and the batch contains mixed connection types, OLE DB-specific properties fail on non-OLE DB targets.
  • Confusing with Target Override: If you need to change the destination type or table mapping, use Target Override. Target Property only adds attributes to the existing destination element.
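
The mixed-connection pitfall above can also be guarded inside the template itself. A sketch using a hypothetical TargetConnectionType member (the real ObjectsWrapper property name may differ; treat this purely as an illustration of the guard pattern):

```xml
<#@ extension bundle="BimlFlex.bimlb" extensionpoint="TargetProperty" #>
<#@ property name="table" type="BimlFlexModelWrapper.ObjectsWrapper" #>

<# CustomOutput.ObjectInherit = true; #>
<# // Hypothetical check: emit OLE DB-only attributes only for OLE DB targets.
   if (table.TargetConnectionType == "OleDb") { #>
KeepNulls="true" KeepIdentity="false"
<# } #>
```
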

Related Extension Points

  • Target Override -- Replaces the entire target destination component.
  • Target Pipeline -- Injects data flow transformations before the target destination.
  • Source Property -- Adds properties to the source component (analogous to Target Property for the source side).
  • Data Flow Properties -- Adds expressions to the data flow task container rather than to individual components.