
# BimlFlex Settings Reference Documentation

This document outlines the metadata and framework settings available in BimlFlex. These settings drive the behavior of the BimlFlex product: by changing them, the produced artifacts can adapt to specific requirements for file locations, naming conventions, data conventions, and so on. Align these settings with the organization's best practices and environmental requirements.

> **Note:** For additional information about using Settings in BimlFlex, please refer to the Settings Editor documentation.

## AzCopy

| Setting | Type | Description |
| --- | --- | --- |
| Path | Text Datatype | The file path to the local installation of the AzCopy command. |
| Log Location | Text Datatype | Sets the log location for AzCopy v.10 log files to the `AZCOPY_LOG_LOCATION` environment variable. |
| Version | Text Datatype | The AzCopy version used. This should match the AzCopy command found in the AzCopyPath setting. Use a numeric value such as 8 or 10 to specify the AzCopy version. |
| Log Level | Text Datatype | The log level for AzCopy v.10 logs. Available log levels are: NONE, DEBUG, INFO, WARNING, ERROR, PANIC, and FATAL. |
| Cap Mbps | Text Datatype | The AzCopy v.10 transfer speed cap in Mbps. |
| Concurrency | Text Datatype | Specifies the number of concurrent AzCopy operations to start. To reuse the current environment variable without setting this value for each object, leave this blank when using AzCopy v.10. |
| Use AzCopy | Text Datatype | Should the generated files be uploaded using AzCopy as part of the SSIS packages. Disable if the solution uses an alternative file management process. |
| Create Container | Text Datatype | Determines if the pipeline includes a step to create the destination container. Only enable this when needed, such as to allow load scenarios where the target container is removed between loads. |
| Set Environment Variables | Text Datatype | Determines if the environment variables controlling AzCopy v.10 are set before each object is loaded. |
| Use SAS Token | Text Datatype | Determines if AzCopy uses a SAS Token for access. This is the only supported authentication mechanism for AzCopy v.10. |
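The settings above map fairly directly onto an AzCopy v.10 invocation. The sketch below shows one way that mapping could look; the account, container, and token values are placeholders, and BimlFlex generates the real command from these settings at build time.

```python
def build_azcopy_command(local_path, account, container, sas_token,
                         log_location=None, cap_mbps=None, concurrency=None):
    """Illustrative mapping of the AzCopy settings onto a v.10 copy command.

    All parameter values are hypothetical examples, not BimlFlex output."""
    env = {}
    if log_location:
        # Log Location setting -> AZCOPY_LOG_LOCATION environment variable
        env["AZCOPY_LOG_LOCATION"] = log_location
    if concurrency:
        # Concurrency setting -> AZCOPY_CONCURRENCY_VALUE environment variable
        env["AZCOPY_CONCURRENCY_VALUE"] = str(concurrency)
    # SAS token appended to the destination URL: the only supported
    # authentication mechanism for AzCopy v.10
    destination = f"https://{account}.blob.core.windows.net/{container}?{sas_token}"
    cmd = ["azcopy", "copy", local_path, destination]
    if cap_mbps:
        # Cap Mbps setting -> --cap-mbps flag
        cmd.append(f"--cap-mbps={cap_mbps}")
    return env, cmd
```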

## Azure

| Setting | Type | Description |
| --- | --- | --- |
| Data Factory Name | Text Datatype | The default Azure Data Factory name to use. |
| Data Factory Location | Text Datatype | The geographical location (region) where the Azure Data Factory instance and its associated data operations are hosted. |
| Subscription Id | Text Datatype | The default Azure Subscription Id to use. |
| Resource Group | Text Datatype | The default Azure Resource Group name to use. |
| Key Vault | Text Datatype | The default Azure Key Vault name to use for Linked Services. |
| ActivityLimit | Text Datatype | Increases the number of activities in a single pipeline, up to 80, enabling more parallel calls. This is especially beneficial for batch pipelines, as it allows more concurrent activities within a single data processing workflow. |
| Integration Runtime Name | Text Datatype | The default Azure Data Factory Self Hosted Integration Runtime name to use for Linked Services. |
| Integration Runtime Is Shared | Text Datatype | Is the referenced Self Hosted Integration Runtime shared from another Data Factory? |
| Linked Resource Id | Text Datatype | The Resource Id for the referenced Self Hosted Integration Runtime Linked Service that is shared to this Data Factory. |
| Deployment Container | Text Datatype | The Storage Container to use when accessing Blob Storage for linked ARM template deployment. |
| Deployment Account Name | Text Datatype | The Storage Account name to use when accessing Blob Storage for linked ARM template deployment. |
| Deployment Account Key | Text Datatype | The Storage Access Key to use when accessing Blob Storage for linked ARM template deployment. |
| Deployment SAS Token | Text Datatype | The Storage Access SAS Token to use when accessing Blob Storage for linked ARM template deployment. |
| Emit Powershell Environment Checks | Text Datatype | Determines if the PowerShell deployment files include environment verification by calling `Get-InstalledModule -Name Az`. |
| File Compression Codec | Text Datatype | The compression type (codec) to use for the Azure Data Factory File Dataset. |
| File Compression Level | Text Datatype | The compression level to apply for the Azure Data Factory target File Dataset. |
| File Encoding Name | Text Datatype | The File Encoding Name for the Azure Data Factory target File Dataset. |
| File Null Value | Text Datatype | The Null value definition to use for the Azure Data Factory target File Dataset. |
| Archive Landing Files | Text Datatype | Determines if the landed files are moved to the defined archive container once processing is completed. |
| OnError Landing Files | Text Datatype | Determines if the landed files are moved to the defined error container on error, once processing is completed. |
| Delete Landing Files | Text Datatype | Determines if the landed files are deleted once processing is completed. |
| Archive Source Files | Text Datatype | Determines if the source files are moved to the defined archive container once processing is completed. |
| OnError Source Files | Text Datatype | Determines if the source files are moved to the defined error container on error, once processing is completed. |
| Delete Source Files | Text Datatype | Determines if the source files are deleted once processing is completed. |
| Archive Stage | Text Datatype | Determines if the staged Blob Storage files are moved to the defined archive container once processing is completed. This uses AzCopy v.10 commands, so it requires AzCopy v.10 and SAS Tokens for access. |
| Stage On Extract | Text Datatype | Determines if the Azure-based Extract and Load process runs the stage process for the extracted data in the destination database. |
| Create Dummy File | Text Datatype | Should the Staging package copy a placeholder.dummy file to accommodate the PolyBase limitation when no files exist. |
| Create External On Stage | Text Datatype | Should the Staging process DROP and CREATE EXTERNAL TABLE before running the Staging Stored Procedure. |
| External File Conversion | Text Datatype | By default, the extraction process from a source to blob storage applies several conversions to create files that are supported by the target storage type. This setting allows control of this conversion process. |
| Distribute Round Robin Temporary Tables | Text Datatype | Enable to use Round Robin distribution in Azure Synapse temporary tables instead of the default Hash distribution. |

## Azure Copy

| Setting | Type | Description |
| --- | --- | --- |
| Enable Staging | Text Datatype | Determines if the Azure Data Factory Copy Activity uses Staging. Use this together with the Copy Method PolyBase to load data to PolyBase-supported targets. |
| Staging Settings | Text Datatype | The staging settings to use when enabling Staging for the Copy Activity. Use @@this to automatically use the Linked Service associated with the PolyBase Landing connection. |
| Enable Logging | Text Datatype | Enable to use logging in the Azure Data Factory Copy Activity. |
| Log Settings | Text Datatype | The settings for the logging in the Copy Activity, when logging is enabled. Use @@this to automatically use the Linked Service associated with the PolyBase Landing connection. |
| Copy Method | Text Datatype | Specifies the Copy Method to use for the Copy Activity. Bulk Insert allows direct inserts to the target. PolyBase allows automatic staging in a Blob Container and loading through external tables and PolyBase to supported targets. |
| PolyBase Settings | Text Datatype | The default Azure PolyBase settings to use when using the Copy Method PolyBase and enabling Staging for the Copy Activity. |
| Copy Behavior | Text Datatype | Sets the method of data transfer from source to destination in the Azure Copy Activity, especially when existing data files are present at the destination. |
| Source Settings | Text Datatype | Adjusts how data is read, controlling query timeout, partitioning for parallel reads, and fault tolerance. Enhances control and optimization of data extraction. |
| Is Recursive | Text Datatype | Determines whether the Copy Activity recurses into sub-folders of the specified source directory when reading files. |
| Retry Attempts | Text Datatype | Maximum number of retry attempts for the Azure Copy Activity. |
| Retry Interval | Text Datatype | The number of seconds between each retry attempt for the Azure Copy Activity. |
| Timeout | Text Datatype | Maximum amount of time the Azure Copy Activity can run. Default is 7 days. Format is D.HH:MM:SS. |
| Secure Input | Text Datatype | When enabled, input from the Azure Copy Activity will not be captured in Azure Data Factory logging. |
| Secure Output | Text Datatype | When enabled, output from the Azure Copy Activity will not be captured in Azure Data Factory logging. |
| Data Integration Units | Text Datatype | Specifies the compute power of the Azure Copy Activity executor. Values can range from 2 to 256. When you choose Auto, Data Factory dynamically applies the optimal DIU setting based on your source-sink pair and data pattern. |
| Degree of Copy Parallelism | Text Datatype | Specifies the degree of parallelism that data loading should use for the Azure Copy Activity. |
| Data Consistency Verification | Text Datatype | Determines if the Azure Copy Activity validates data consistency for supported sources and sinks. |
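The Timeout and Retry Interval values above use Data Factory's D.HH:MM:SS duration format. As a minimal sketch of that format (the helper name is illustrative, not part of BimlFlex):

```python
from datetime import timedelta

def parse_adf_timeout(value):
    """Parse an ADF-style duration string in D.HH:MM:SS (or HH:MM:SS) format."""
    days = 0
    # A leading "D." day component is optional, e.g. "7.00:00:00"
    if "." in value.split(":")[0]:
        day_part, value = value.split(".", 1)
        days = int(day_part)
    hours, minutes, seconds = (int(p) for p in value.split(":"))
    return timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)
```

For example, the default Copy Activity timeout of 7 days is written as `7.00:00:00`.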

## Azure Storage

| Setting | Type | Description |
| --- | --- | --- |
| Archive Container | Text Datatype | The Container Name to use for the archive process. This should be indicative of the purpose of the contents, such as archive. |
| Archive Account Name | Text Datatype | The Azure Blob Storage Account Name to use for archiving data as files in blob storage. |
| Archive Account Key | Text Datatype | The storage Access Key to use when accessing the Blob storage. |
| Archive SAS Token | Text Datatype | The storage access SAS Token to use when accessing the Blob storage. |
| Error Container | Text Datatype | The Container Name to use for the error process. |
| Error Account Name | Text Datatype | The Azure Blob Storage Account Name to use for error files in blob storage. |
| Error Account Key | Text Datatype | The storage Access Key to use when accessing the error file Blob storage. |
| Error SAS Token | Text Datatype | The storage access SAS Token to use when accessing the error file Blob storage. |
| Stage Container | Text Datatype | The Container Name to use for the staging process. |
| Stage Account Name | Text Datatype | The Azure Blob Storage Account Name to use for staging data as files in blob storage. |
| Stage Account Key | Text Datatype | The storage Access Key to use when accessing the staging Blob storage. |
| Stage SAS Token | Text Datatype | The storage access SAS Token to use when accessing the staging Blob storage. |
| Blob Storage Domain | Text Datatype | The AzCopy domain to use. |
| External File Format | Text Datatype | The default External File Format definition to use. |
| Folder Naming Convention | Text Datatype | Naming convention to use for Azure Storage Folders. |
| File Naming Convention | Text Datatype | Naming convention to use for Azure Storage Files. |

## Core

| Setting | Type | Description |
| --- | --- | --- |
| Add SQL Defaults | Text Datatype | Enable to add SQL Default constraints to created tables. SQL Defaults are always added to staging layer tables. |
| Global Default Date | Text Datatype | The global default date to use for timelines. Please note that this value does not override the RowEffectiveFromDate. |
| Convert GUID To String | Text Datatype | Determines if a source column of type GUID/UniqueIdentifier is automatically converted to a String data type. |
| Use BimlCatalog | Text Datatype | Determines if BimlFlex uses the BimlCatalog database for logging, auditing, and configuration variables. |
| Use Custom Components | Text Datatype | Determines if BimlFlex uses the SSIS Custom Components to log row counts and generate hash keys. |
| Lookup Table Pattern | Text Datatype | The table naming pattern for the Lookup Table. |
| Integration Key To Upper | Text Datatype | Determines if strings in the Integration Key are upper-cased automatically. |
| String Concatenator | Text Datatype | The string value used in concatenating Integration Keys and Hash values (sanding element). Defaults to ~. |
| System Column Placement | Text Datatype | Controls the placement of system columns relative to the defined table columns. Choose whether system columns are added before or after the table columns. |
| Database References | Text Datatype | A comma-delimited list specifying additional databases for matching that may not be defined as a connection when the Use Database Reference option is checked; applicable for Snowflake or SSDT. |
| Root Path | Text Datatype | The default root path for any other BimlFlex-related file operations. |
| Archive Path | Text Datatype | The default path for file archive operations. |
| Export Path | Text Datatype | The default export path for file exports. |
| Import Path | Text Datatype | The default import path for file imports. |
| Lookup Cache Path | Text Datatype | The default path for Cache files used in lookups. |
| Configuration Path | Text Datatype | The folder where SSIS configuration files are stored. |
| 7 Zip Path | Text Datatype | The location of the 7-Zip executable for zip operations requiring 7-Zip. |
| Hash Algorithm | Text Datatype | The hashing algorithm to use (MD5, SHA1, SHA2_256, or SHA2_512). |
| Hash Binary | Text Datatype | Determines if the generated hash value is stored as a binary representation or as a string representation. |
| Hash Integration Key | Text Datatype | Determines if the Integration Key is hashed. |
| Use SQL Compatible Hash | Text Datatype | Determines if the SSIS custom inline hashing component uses a hashing approach compatible with the SQL Server HASHBYTES() function. |
| Use SQL Compatible Row Hash | Text Datatype | Determines if the SSIS custom inline hashing component for Full Row Hashing uses a hashing approach compatible with the SQL Server HASHBYTES() function. |
| Cast Boolean to True False | Text Datatype | Determines if the SQL inline hashing function for MSSQL, SQLDB, and Synapse converts BIT (Boolean) values to True/False instead of 1/0. |
| Hash Null Value Replacement | Text Datatype | The Null value replacement to be used when hashing. |
| SSIS Hash Null Value Replacement | Text Datatype | The Null value replacement to be used when hashing using the Varigence BimlFlex SSIS Custom component. |
| Use User Null Assignment | Text Datatype | Specifies whether all non-key columns are set as nullable or if their nullability is determined by user-defined input. Setting columns as non-nullable is advised against, as it may result in data load failures. |
| Hide Secondary Exclusions | Text Datatype | Global toggle to hide secondary metadata in BimlStudio due to specific projects or objects being excluded or unmapped. |
| Archive Import | Text Datatype | Determines if imported files are archived after processing. |
| Zip Archive Import | Text Datatype | Determines if imported files are compressed when they are archived. |
| Zip Output File | Text Datatype | Determines if the created output file is zipped. |
| Zip Extract File In Memory | Text Datatype | Determines if the file zip process runs in-memory or through file streams. Files larger than 2 GB are always zipped through file streams. |
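Several Core settings interact when hash values are produced: the sanding element joins the key parts, nulls are substituted with the Hash Null Value Replacement, and Integration Key To Upper normalizes casing before hashing. A minimal sketch of that combination, assuming illustrative defaults (the `"N/A"` null replacement and exact formatting are assumptions, not documented BimlFlex behavior):

```python
import hashlib

def business_key_hash(values, concatenator="~", null_replacement="N/A",
                      key_to_upper=True, algorithm="sha256"):
    """Illustrative combination of the Core hashing settings:
    null-replace each part, optionally upper-case, join with the
    sanding element, then hash (sha256 ~ SHA2_256)."""
    parts = [null_replacement if v is None else str(v) for v in values]
    if key_to_upper:
        parts = [p.upper() for p in parts]
    payload = concatenator.join(parts)
    return hashlib.new(algorithm, payload.encode("utf-8")).hexdigest()
```

With upper-casing enabled, `"customer"` and `"CUSTOMER"` hash identically, which is the point of the Integration Key To Upper setting.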

## Data Mart

| Setting | Type | Description |
| --- | --- | --- |
| Append Dimension | Text Datatype | The string to append to Dimension object names. |
| Append Fact | Text Datatype | The string to append to Fact object names. |
| Append Staging | Text Datatype | The string to append to the Data Mart staging object names. |
| Append Identity | Text Datatype | The string to append to the object name when cloning and checking the Add Identity Column option. |
| Infer Dimension Members | Text Datatype | Determines if the Data Mart process infers Dimension Members. |
| Stage On Initial Load | Text Datatype | Determines if the Data Mart stage process executes as part of an initial load. |
| Apply Lookup Filter | Text Datatype | Determines if the SSIS Lookup checks for existing rows and applies a filter condition when joining the source table to the destination table. This applies to Data Mart processing and aims to optimize memory usage. |
| Display Database Name | Text Datatype | Determines if the Database name is included in Data Mart object names. |
| Display Schema Name | Text Datatype | Determines if the Schema name is included in Data Mart object names. |
| Constraint Mode | Text Datatype | The table reference constraint mode to apply for the Data Mart database. Allowed values are DoNotCreate, CreateAndNoCheck, and CreateAndCheck. |

## Data Vault

| Setting | Type | Description |
| --- | --- | --- |
| Derive Staged Object | Text Datatype | |
| Use Hash Keys | Text Datatype | Determines if the Data Vault uses Hash Keys. Alternatively, Natural Keys can be used by disabling this setting. |
| Accelerate Link Satellite | Text Datatype | Determines if the BimlFlex Accelerator creates Link Satellites from source metadata, containing attributes and effectiveness attributes. |
| Accelerate Link Integration Keys | Text Datatype | |
| Apply Data Type Mapping DV | Text Datatype | Determines if the Data Type Mappings are applied to the Data Vault. The Data Type Mappings function allows expansion of data types. |
| Naming Convention | Text Datatype | Naming convention to use for the Data Vault Accelerator. |
| Column Naming Convention | Text Datatype | Naming convention for Columns to use for the Data Vault Accelerator. |
| Accelerate Hub Keys | Text Datatype | Determines if the BimlFlex Accelerator adds source key columns to the Hub in addition to the Integration Key. |
| Accelerate Link Keys | Text Datatype | Determines if the BimlFlex Accelerator adds source key columns to the Link in addition to the Integration Key. |
| Accelerate Link Satellite Keys | Text Datatype | Determines if the BimlFlex Accelerator adds the Integration Key to Link Satellites. |
| Accelerate Correct Key Names | Text Datatype | Should the Accelerator correct Integration Key names based on the Object Business Name. |
| Accelerate Show Columns | Text Datatype | Enable to set the default Accelerator view to show all Columns. |
| Accelerate Show Expanded | Text Datatype | Enable to set the default Accelerator view to show the Expanded view (Hubs, Links, and Satellites) instead of the Data Vault Backbone (only Hubs and Links). |
| Reduce Link Keys in Staging | Text Datatype | Enable this to reduce additional Link hash keys in the staging table. |
| Infer Link Hub | Text Datatype | Determines if the Data Vault process loads all involved Hubs when a Link table is loaded, independent of Hub loads from referenced tables. Enabling this setting forces BimlFlex to always load all corresponding Hub tables when a Link is loaded from a given source, even though this could be redundant because the Hub information may be provided by referenced tables. This applies to scenarios where the source system reliably manages referential integrity. |
| Process On Stage | Text Datatype | Determines if the Data Vault stored procedure is called after the Extract has been done. For Azure projects this must be combined with AzureStageOnExtract. |
| Uniform Change Type | Text Datatype | When enabled, all source inserts and updates are treated alike. If RowChangeType is the only difference, no new record is added, preventing duplication and preserving attribute history. |
| Apply Lookup Filter DV | Text Datatype | For Staging-to-Data Vault processes, determines if the SSIS Lookup checks for existing rows by applying a filter condition joining the staging table to the destination table. This is to optimize memory usage. |
| End Date Satellite | Text Datatype | Determines if end dating is applied to the Data Vault Satellites. |
| Delta Collapse Rows | Text Datatype | Enable to keep only the initial row in Satellite sequences of identical values, discarding later timestamped duplicates. |
| ELT Delta Is Derived | Text Datatype | Enable if loading into the Data Vault and the Delta has already been derived. This provides optimized ELT loads for scenarios like streams or insert-only transaction source tables. |
| Use Cache Lookup | Text Datatype | Enables using the file-based cache lookup feature of SSIS (for lookups). |
| Use Transactions | Text Datatype | Enable to wrap Data Vault ELT load processes in transaction statements. |
| Use Where Exists | Text Datatype | Enable to use WHERE EXISTS type SQL statements instead of LEFT JOIN type statements for lookups in ELT processing. |
| Pushdown Parallel Processing | Text Datatype | |
| Stage Surrogate Keys | Text Datatype | |
| Insert Script With Table | Text Datatype | When enabled, this setting ensures that the default insert script is retained and bundled together with the CREATE TABLE file. This is useful if you wish to keep the initial data insertion logic intact alongside the table structure. |
| Create Satellite Views | Text Datatype | Generate easy-to-use Data Vault satellite views that include effectivity timelines and current statuses, making it simpler to analyze your data over time. |
| Prefix View Name | Text Datatype | Prefix for Current (CURR) and Effectivity (EFDT) satellite views. |
| Current View Suffix | Text Datatype | Suffix for Current satellite views; the default is _CURR. |
| Effectivity View Suffix | Text Datatype | Suffix for Effectivity satellite views; the default is _EFDT. |
| Add Zero Keys | Text Datatype | Enable to automatically use a Zero Key for Link Keys when the key column is empty. This ensures that all records are linked, even if some key data is missing. |
| Zero Key Expression | Text Datatype | Override the default SQL expression to generate Zero Keys in Data Vault insert scripts. Use with caution: altering this expression may affect data consistency. |
| Bridge Lag Days | Text Datatype | Specifies the number of days the Bridge process should go back and look for changes to reprocess. |
| Bridge Add Surrogate Key | Text Datatype | Enables the use of a concatenated surrogate key in the Bridge table, comprising the Link keys. |
| Pit Lag Days | Text Datatype | Specifies the number of days the Point-In-Time process should go back and look for changes to reprocess. |
| Pit Add Surrogate Key | Text Datatype | Enables the use of a concatenated surrogate key in the PointInTime table, comprising the Hub key and the RowEffectiveFromDate. |
| Constraint Mode | Text Datatype | The table reference constraint mode to apply for the Data Vault database. Allowed values are DoNotCreate, CreateAndNoCheck, and CreateAndCheck. |
| Enable Rollback | Text Datatype | For SSIS only. Determines if the Batch orchestration engine rolls back (deletes) committed changes to the Data Vault database in case of a failed process. |
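The Use Where Exists setting toggles between two equivalent lookup shapes in the generated ELT. A minimal sketch of the difference, with placeholder table and column names (the generated BimlFlex SQL is more involved than this):

```python
def lookup_filter_sql(stage_table, target_table, key_column,
                      use_where_exists=True):
    """Illustrative shapes of the two lookup styles toggled by the
    'Use Where Exists' setting: find staged rows not yet in the target."""
    if use_where_exists:
        # WHERE EXISTS style: correlated subquery per candidate row
        return (
            f"SELECT s.* FROM {stage_table} s "
            f"WHERE NOT EXISTS (SELECT 1 FROM {target_table} t "
            f"WHERE t.{key_column} = s.{key_column})"
        )
    # LEFT JOIN style: anti-join via NULL check on the joined key
    return (
        f"SELECT s.* FROM {stage_table} s "
        f"LEFT JOIN {target_table} t ON t.{key_column} = s.{key_column} "
        f"WHERE t.{key_column} IS NULL"
    )
```

Both return the same rows; the setting exists because one form can optimize better than the other depending on the target platform.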

## Data Vault Naming

| Setting | Type | Description |
| --- | --- | --- |
| Append Hub | Text Datatype | The string to append to Hub names. |
| Append Link | Text Datatype | The string to append to Link names. |
| Append Link Satellite | Text Datatype | The string to append to Link Satellite names. |
| Append Satellite | Text Datatype | The string to append to Satellite names. |
| Append Point In Time | Text Datatype | The string to append to Point In Time table names. |
| Append Bridge | Text Datatype | The string to append to Bridge table names. |
| Default Schema | Text Datatype | The default schema to use for the Data Vault in the Accelerator. |
| Append Surrogate Key | Text Datatype | The string to append to Surrogate/Hash Key column names. |
| Prefix Surrogate Key | Text Datatype | Determines if the Hub and Link Surrogate Keys are prefixed. |
| Append Reference | Text Datatype | The string to append to Reference table names. |
| Append Hierarchy Link | Text Datatype | The string to append to Hierarchical Link names. |
| Append Same As Link | Text Datatype | The string to append to Same-As Link names. |
| Schema Hub | Text Datatype | Override the Default Schema used in the Accelerator for Hub tables. |
| Schema Link | Text Datatype | Override the Default Schema used in the Accelerator for Link tables. |
| Schema Link Satellite | Text Datatype | Override the Default Schema used in the Accelerator for Link Satellite tables. |
| Schema Satellite | Text Datatype | Override the Default Schema used in the Accelerator for Satellite tables. |
| Schema Point In Time | Text Datatype | Override the Default Schema used in the Create Point In Time dialog. |
| Schema Bridge | Text Datatype | Override the Default Schema used in the Create Bridge dialog. |
| Display Database Name | Text Datatype | Determines if the database name is displayed in the Data Vault. |
| Display Schema Name | Text Datatype | Determines if the schema name is displayed in the Data Vault. |
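The append strings above are applied to entity base names to produce the physical Data Vault table names. A minimal sketch, assuming example affix values (`_HUB`, `_LNK`, etc. are illustrative choices, not BimlFlex defaults):

```python
def dv_table_name(base_name, entity_type, prefix=False, affixes=None):
    """Illustrative application of the Data Vault Naming append strings.
    The affix values below are example conventions only."""
    affixes = affixes or {"Hub": "_HUB", "Link": "_LNK",
                          "Satellite": "_SAT", "LinkSatellite": "_LSAT"}
    affix = affixes[entity_type]
    if prefix:
        # Analogous to prefix-style naming (see Prefix Surrogate Key)
        return f"{affix.strip('_')}_{base_name}"
    return f"{base_name}{affix}"
```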

## Databricks

| Setting | Type | Description |
| --- | --- | --- |
| Retry Attempts | Text Datatype | Specifies how many times the Data Factory service will attempt to re-run the Databricks Notebook Activity in case of failure. |
| Retry Interval | Text Datatype | Specifies the amount of time, in seconds, to wait before attempting to retry a failed Databricks Notebook Activity. A value of 0 indicates no retries will be attempted. A value between 30 and 600 seconds is recommended for most use cases. |
| Timeout | Text Datatype | Maximum amount of time that Azure Data Factory will wait for the Databricks Notebook Activity to complete. Default is 12 hours. Format is D.HH:MM:SS. |
| Secure Input | Text Datatype | Enable this option to protect sensitive data passed into the Databricks Notebook. When this setting is enabled, the input values are masked and not visible in logs or metadata. This is recommended for passing confidential or sensitive information. |
| Secure Output | Text Datatype | When enabled, the output of the Databricks Notebook Activity is securely stored and not visible in activity logs, ensuring sensitive data is protected. Disabling this option may expose sensitive output data in logs. |
| Build Output Path | Text Datatype | The folder where Databricks files are created upon build. |
| Repository Name | Text Datatype | The name of the Repository where the Databricks files are located for runtime. |
| Notebook Path | Text Datatype | The folder where the Databricks files are located for runtime. |
| Append Notebook Name | Text Datatype | The string to append to generated Databricks Notebook names. |
| Add Sql Comments | Text Datatype | Enable this option to include user-defined metadata as SQL comments in your CREATE TABLE scripts. |
| Add Sql Tags | Text Datatype | Enable this option to include user-defined business metadata as SQL tags in your CREATE TABLE scripts. |
| Use Unity Catalog | Text Datatype | Specifies if the table create scripts should use Unity Catalog or the LOCATION clause. |
| Use Managed Tables | Text Datatype | Specifies if the table create scripts should create managed tables. |
| Use Global Parameters | Text Datatype | Specifies if Azure Data Factory will call Databricks Notebooks using Global Parameters. |
| Use Copy Into | Text Datatype | When enabled, notebooks use COPY INTO to read files instead of the CREATE OR REPLACE TEMPORARY VIEW SQL syntax. |
| Use Temporary Views | Text Datatype | When enabled, notebooks use the CREATE OR REPLACE TEMPORARY VIEW SQL statement to store data in memory for quicker access, rather than the CREATE TABLE IF NOT EXISTS statement, which stores data on disk. |
| Use Liquid Clustering | Text Datatype | When enabled, uses Delta Lake liquid clustering to replace traditional table partitioning and ZORDER for simplified data layout and optimized query performance. |
| Add Create Catalog | Text Datatype | Specifies if the table create scripts should include the CREATE CATALOG IF NOT EXISTS statement. |
| Add Truncate Notebooks | Text Datatype | Enable to generate notebooks that truncate all existing tables in the workspace. Recommended for development environments only. |
| Add Drop Notebooks | Text Datatype | Enable to generate notebooks that drop all existing tables in the workspace. Recommended for development environments only. |
| Table Properties | Text Datatype | Specifies the table properties to be used for creating tables in Databricks using the CREATE TABLE statement. |
| Read Files Options | Text Datatype | Customize READ_FILES options for the file operation. The default is READ_FILES (mergeSchema => 'true'). |
| Copy Options | Text Datatype | Customize COPY_OPTIONS for the COPY INTO operation. The default is COPY_OPTIONS ('mergeSchema' = 'true'). |
| Display Time Zone | Text Datatype | Sets the time zone for displaying timestamps in notebooks. This setting does not alter the original data but converts displayed timestamps to the chosen time zone for easier interpretation. |
| Data Time Zone | Text Datatype | Sets the time zone for loading timestamps in notebooks. This setting does not alter the original data but converts timestamps to the chosen time zone. |
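When Use Copy Into is enabled together with the Copy Options setting, the generated notebooks emit Databricks `COPY INTO` statements. A minimal sketch of that statement shape, with placeholder table and location values (the real notebooks generated by BimlFlex contain more than this):

```python
def copy_into_statement(target_table, source_location, file_format="PARQUET",
                        copy_options="'mergeSchema' = 'true'"):
    """Illustrative Databricks COPY INTO statement, defaulting to the
    documented COPY_OPTIONS ('mergeSchema' = 'true')."""
    return (
        f"COPY INTO {target_table}\n"
        f"FROM '{source_location}'\n"
        f"FILEFORMAT = {file_format}\n"
        f"COPY_OPTIONS ({copy_options})"
    )
```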

## Delete Detection

| Setting | Type | Description |
| --- | --- | --- |
| Enable Delete Detection | Text Datatype | Determines if BimlFlex applies a separate key load pattern that enables detection of hard deletes in the source. |
| Process On Stage | Text Datatype | Determines if the Delete Detection batch should be called after the source extract to staging process has been completed. |
| Apply Delete Detection PSA | Text Datatype | Use the default process to insert detected deletes into the Persistent Staging Area table. |
| Apply Delete Detection DV | Text Datatype | Use the default process to insert detected deletes into the Data Vault Satellite tables. |
| Archive Detection Files | Text Datatype | Determines if the delete detection files are moved to the defined archive container once processing is completed. |
| OnError Detection Files | Text Datatype | Determines if the delete detection files are moved to the defined error container on error, once processing is completed. |
| Delete Detection Files | Text Datatype | Determines if the delete detection files are deleted once processing is completed. |

## Model

| Setting | Type | Description |
| --- | --- | --- |
| Entity Naming Convention | Text Datatype | Specifies the naming convention used for entities in your business model. Choose from predefined conventions like PascalCase, camelCase, UPPER_CASE, lower_case, etc. |
| Entity Technical Naming Convention | Text Datatype | Specifies the technical naming convention used for entities in your business model. Choose from predefined conventions like PascalCase, camelCase, UPPER_CASE, lower_case, etc. |
| Attribute Naming Convention | Text Datatype | Specifies the naming convention used for attributes in your business model. Choose from predefined conventions like PascalCase, camelCase, UPPER_CASE, lower_case, etc. |
| Attribute Technical Naming Convention | Text Datatype | Specifies the technical naming convention used for attributes in your business model. Choose from predefined conventions like PascalCase, camelCase, UPPER_CASE, lower_case, etc. |
| Use Short Names for Hubs | Text Datatype | |
| Apply Naming Convention | Text Datatype | Naming convention to use for objects and columns. |
| Infer Integration Key From | Text Datatype | The convention to infer the Integration Key from. Case-sensitive options are None, PrimaryKey, UniqueKey, FirstColumn, IdentityColumn, and ColumnName::[NameOfColumn]. |
| Apply Data Type Mappings | Text Datatype | Apply Data Type Mappings to Imported Objects. |
| Pad Integration Key | Text Datatype | Number of characters to pad the Integration Key to. |
| Append Integration Key | Text Datatype | The string to append to Integration Keys. |
| Key Ends With | Text Datatype | The strings that BimlFlex interprets as key identifiers. |
| Add Record Source To Integration Key | Text Datatype | Determines if Import Metadata adds @@rs to Integration Keys. |
| Change References To Integration Key | Text Datatype | Determines if the Import Metadata feature adds derived Integration Keys based on source references, or uses source columns for references. |
| Import Views | Text Datatype | Determines if database Views are imported when importing Metadata. |
| Integration Key Concatenation Order | Text Datatype | Determines the column order in the derived Integration Key. |
| FlexToBk on IntegrationKey | Text Datatype | Enable this setting to verify that all source objects define the FlexToBk expression for their Integration Keys. |
| FlexToBk on Reference | Text Datatype | Enable this setting to verify that all source objects define the FlexToBk expression for their reference keys. |
| @@rs on FlexToBk | Text Datatype | Enable this setting to verify that all keys using the FlexToBk expression include the @@rs parameter for source objects. |
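Several Model settings shape how an Integration Key value is assembled: the concatenation order of the source columns, the optional @@rs record source, and padding to a fixed width. A minimal sketch of that assembly, using the `~` sanding element as an assumed separator (this is illustrative only, not the exact BimlFlex FlexToBk expression syntax):

```python
def integration_key_value(values, concatenator="~", pad_to=0,
                          record_source=None):
    """Illustrative Integration Key derivation: join ordered key parts,
    optionally append the record source (@@rs), then pad to a fixed width."""
    parts = [str(v) for v in values]   # Integration Key Concatenation Order
    if record_source:
        parts.append(record_source)    # Add Record Source To Integration Key
    key = concatenator.join(parts)
    if pad_to:
        key = key.ljust(pad_to)        # Pad Integration Key
    return key
```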

Naming

Setting
Type
Description
Suffix Or Prefix ColumnText DatatypeThe SuffixOrPrefixColumn key defines the behavior when defining column names.
Suffix Or Prefix ObjectText DatatypeThe SuffixOrPrefixObject key defines the behavior when naming objects.
Append Procedure NameText DatatypeThe string to append to procedure names.
Append Batch NameText DatatypeThe string to append to Batch names.
Append Load From Psa NameText DatatypeThe string to append to the Load From PSA process name.
Stage Column With Business NameText DatatypeWhen defining a Business Name for an Column in the Business Overrides section, setting this to true will use the Business Name as the staging column name.
Stage Object With Business NameText DatatypeWhen defining a Business Name for an Object in the Business Overrides section, setting this to true will use the Business Name as the staging table name.
Use Record Source As AppendText DatatypeSpecifies if the record source should be appended to object names.
Use Record Source As SchemaText DatatypeDetermines if the Record Source is used as the schema name for Staging and Persistent Staging Area tables.

Operations

Setting
Type
Description
Archive Retention Period (Days)Text DatatypeThe archive data retention period in days to use for the BimlFlex database cleanup process.
Snapshot Retention Period (Days)Text DatatypeThe snapshot data retention period in days to use for the BimlFlex database cleanup process.

Orchestration

Setting
Type
Description

Snowflake

Setting
Type
Description
AccountText DatatypeThe Snowflake account name to use.
RegionText DatatypeThe Snowflake region to use.
WarehouseText DatatypeThe Snowflake warehouse name to use.
DatabaseText DatatypeThe Snowflake database name to use.
SchemaText DatatypeThe Snowflake schema name to use.
PasswordText DatatypeThe Snowflake password to use.
UserText DatatypeThe Snowflake user name to use.
SnowSQL ConfigText DatatypeLocation of the Snowflake SnowSQL configuration file.
SnowSQL PathText DatatypeThe path to the local installation of the Snowflake SnowSQL CLI Client tool.
SnowSQL ConnectionText DatatypeThe Snowflake SnowSQL connection to use.
File FormatText DatatypeThe Snowflake file format to use.
Execute AsText DatatypeChoose how to execute commands in Snowflake: as the CALLER initiating the operation or as the OWNER of the object being accessed.
Remove StageText DatatypeDetermines if the Snowflake stage is removed prior to loading the new stage file.
Auto SuspendText DatatypeDetermines whether the Snowflake Warehouse can Auto Suspend.
Scale UpText DatatypeDetermines if the Snowflake processing should scale up the Snowflake Warehouse at the start of the Batch.
Scale Up SizeText DatatypeThe size the Snowflake Warehouse can be scaled up to.
Scale DownText DatatypeDetermines if the Snowflake processing should scale down the Snowflake Warehouse at end of the Batch.
Scale Down SizeText DatatypeThe size the Snowflake Warehouse can be scaled down to.
Output PathText DatatypeThe folder where SnowDT database files are created.
Clean Output PathText DatatypeSpecifies whether the output folder for the SnowDT (Snowflake Data Tools) project should be cleared during the build process.
Add Sql CommentsText DatatypeEnable this option to include user-defined metadata as SQL comments in your CREATE TABLE scripts.
Use Database ReferencesText DatatypeEnable the use of database variables for compatibility with SchemaChange for dynamic cross-database interactions, enhancing modularity and eliminating the need to hardcode database names.
Database VariableText DatatypeThe database variable for compatibility with SchemaChange for dynamic cross-database interactions, enhancing modularity and eliminating the need to hardcode database names.
Create TableText DatatypeThe syntax to be used for the Snowflake CREATE TABLE DDL. It enables the use of CREATE OR REPLACE and CREATE OR ALTER syntax options, when supported by the Snowflake environment.
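The SnowSQL-related settings above (Account, Region, Warehouse, Database, Schema, User, Password, SnowSQL Config, SnowSQL Connection) map onto the SnowSQL configuration file. As a hedged sketch, a named connection in a SnowSQL config file might look like this; the connection name and all values are placeholders:

```ini
[connections.bimlflex]
accountname   = xy12345
region        = east-us-2.azure
username      = BFX_USER
password      = <secret>
warehousename = COMPUTE_WH
dbname        = BFX_DW
schemaname    = PUBLIC
```

The SnowSQL Connection setting would then reference the connection name (here `bimlflex`), and SnowSQL Config would point at the file location.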

SSDT

Setting
Type
Description
Include .NET Core Project SupportText DatatypeDetermines if SSDT Project files and build script files are created with .NET Core support.
.NET Core Targets PathText DatatypeThe folder where the .NET Core Target and build support files are located.
Suppress TSql WarningsText DatatypeSuppress TSql Build Warnings.
Use Database ReferencesText DatatypeSSDT (SQL Server Data Tools) database projects will be able to use database variables, which allows for more dynamic cross-database interactions. This facilitates modularity and eliminates the need to hardcode database names.
Solution NameText DatatypeThe SSDT Solution Name used when the 'Use Database References' feature is enabled. This name can help in better organization and identification of the project within your development environment.
Output PathText DatatypeThe folder where SSDT database projects are created.
Clean Output PathText DatatypeSpecifies whether the output folder for the SSDT (SQL Server Data Tools) project should be cleared during the build process.
Visual Studio VersionText DatatypeSpecifies which version of Visual Studio is used or targeted for the project.
Include External TablesText DatatypeDetermines if External Tables are included in the generated SSDT Project.
Overwrite External Table DefaultsText DatatypeDetermines if existing external table-related files are overwritten.
Include Master KeyText DatatypeDetermines if the Master Key statement is included in the SSDT Project.
Default Master KeyText DatatypeThe default Master Key SQL Statement to use.
Include CredentialText DatatypeDetermines if the Credential statement is included in the SSDT Project.
Default CredentialText DatatypeThe default Credential SQL Statement to use.
Include External Data SourceText DatatypeDetermines if the External Data Source statement is included in the SSDT Project.
Default External Data SourceText DatatypeThe default External Data Source SQL Statement to use.
Include External File FormatText DatatypeDetermines if the External File Format statement is included in the generated SSDT Project.
Default External File FormatText DatatypeThe default External File Format SQL Statement to use.
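The four Default settings above hold the SQL statements emitted into the SSDT project for PolyBase-style external tables. A minimal, hypothetical set of defaults could look like the following; names, secrets, and locations are placeholders, not BimlFlex's shipped defaults:

```sql
-- Placeholders throughout; align with your storage account and security policy.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';

CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SasToken>';

CREATE EXTERNAL DATA SOURCE BlobStage
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://staging@myaccount.blob.core.windows.net',
      CREDENTIAL = BlobCredential);

CREATE EXTERNAL FILE FORMAT DelimitedText
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',',
                      STRING_DELIMITER = '"',
                      USE_TYPE_DEFAULT = TRUE));
```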

SSIS

Setting
Type
Description
Convert Date To String With ScaleText DatatypeUsed to control the converted DateTime in the FlexToBk to ensure compatibility with the SQL code.
Use Compatible Date FormatText DatatypeUsed to control the converted DateTime in the FlexToBk to ensure compatibility with the SQL code.
ServerText DatatypeThe SSIS Server name to use for generated deployment script files.
SSISDBText DatatypeThe SSISDB database name to use for generated deployment script files.
FolderText DatatypeThe SSIS Catalog folder name to use for generated deployment script files.
Create FolderText DatatypeAdd Create SSIS Catalog Folder in SSIS deployment script files.
SqlCmd OverrideText DatatypeOverride the sqlcmd connection in the Create SSIS Catalog folder in the deployment script.
BLOB Temp Storage PathText DatatypeThe Blob Temporary Storage Path that SSIS uses to spool temporary data to disk when it runs out of memory.
Buffer Temp Storage PathText DatatypeThe Buffer Temporary Storage Path that SSIS uses to spool temporary data to disk when it runs out of memory.
Command TimeoutText DatatypeSSIS Command Timeout to use. Override the value here to change the default SSIS behavior.
Auto Adjust Buffer SizeText DatatypeSSIS Auto Adjust Buffer Size configuration for supported SQL Server versions.
Check ConstraintsText DatatypeSSIS Destination configuration for checking constraints. Defaults to False, as that is recommended for data warehouse destinations.
Default Buffer Max RowsText DatatypeSSIS Data Flow configuration for Default Buffer Max Rows for supported destinations.
Default Buffer SizeText DatatypeSSIS Data Flow configuration for Default Buffer Size for supported destinations.
Delay ValidationText DatatypeDetermines if generated SSIS packages use delayed validation for metadata validation.
Engine ThreadsText DatatypeMaximum number of SSIS engine threads to employ.
Max Concurrent ExecutablesText DatatypeMaximum number of concurrent SSIS executables to employ.
Maximum Insert Commit SizeText DatatypeSSIS Data Flow configuration for Maximum Insert Commit Size for supported destinations.
Process SubfoldersText DatatypeDetermines if a flat file source loop loads files in subfolders of the specified source folder.
Rows Per BatchText DatatypeSSIS Data Flow configuration for Rows Per Batch for supported destinations.
Validate External MetadataText DatatypeDetermines if generated SSIS packages validate external metadata.
Use UTF8 Data ConversionText DatatypeDetermines if SSIS Express-based extract packages apply UTF8 data conversion.
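Several of the SSIS settings above (Server, SSISDB, Folder, Create Folder) feed the generated deployment scripts. The folder-creation step can be sketched against the standard SSIS catalog objects as follows; the folder name is a placeholder:

```sql
-- Create the SSIS Catalog folder if it does not already exist.
IF NOT EXISTS (SELECT 1 FROM [SSISDB].[catalog].[folders] WHERE [name] = N'BimlFlex')
    EXEC [SSISDB].[catalog].[create_folder] @folder_name = N'BimlFlex';
```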

Staging

Setting
Type
Description
Constraint ModeText DatatypeThe table reference constraint mode to apply for the STG (Staging) tables in BimlStudio for diagram previewing. Allowed values are DoNotCreate, CreateAndNoCheck and CreateAndCheck.
Persist HistoryText DatatypeProvides an option to override the Connection level attribute PersistHistory for more granular control.
Apply Data Type MappingsText DatatypeDetermines if the Data Type Mappings that are applied to source tables are used in the Staging and Persistent Staging databases.
ConfigurationsText DatatypeChoose how to handle the StagingAttribute for each object. Options include Derived, Source, and Inherit. This setting allows you to override the default behavior for greater customization.
Delete Import FileText DatatypeDetermines if imported files are deleted after processing.
Use TRY_CAST ConversionText DatatypeDetermines if the select-to-stage tables use TRY_CAST and TRY_CONVERT.
Add Row Hash Key IndexText DatatypeEnable to add a unique, non-clustered constraint on the FlexRowHashKey and EffectiveFromDate columns in staging tables.
Select Blob Row Order ByText DatatypeRow Order definition to use for Blob source queries.
Select Stage Row DistinctText DatatypeDetermines if the select-to-stage tables apply a row number function based on the defined key, or use a distinct based on the full row including the hash.
Apply Extract Conversion In Data FlowText DatatypeDetermines if SSIS extracts-to-file apply data conversion for target files in the SSIS Data Flow instead of in the source select statement.
Extract File EncodingText DatatypeFor an extracted file, specify a different encoding than the standard Unicode as produced by the BimlFlex source to file process. Valid choices are ASCII, BigEndianUnicode, UTF32, UTF7, UTF8, Unicode.
Extract File Split SizeText DatatypeThe file size to split extracted files into multiple files for.
Create Persistent ViewText DatatypeEnables the creation of SQL Views to query data from the Persistent Staging Area (PSA), simulating Staging Tables. This feature facilitates full reload processing of your Data Vault tables.
Prefix Persistent ViewText DatatypePrefix for Persistent views, simulating Staging tables.
Row Hash Persistent ViewText DatatypeWhen enabled, the RowHash will be included in the Persistent views. Note: This may reduce performance during Data Vault table reloads.
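To illustrate the Use TRY_CAST Conversion setting: TRY_CAST returns NULL for values that cannot be converted, instead of failing the whole load. A standalone T-SQL example:

```sql
-- TRY_CAST yields NULL for the invalid date rather than raising an error.
SELECT TRY_CAST('2024-02-31' AS date) AS InvalidDate,  -- NULL (no Feb 31)
       TRY_CAST('2024-02-29' AS date) AS ValidDate;    -- 2024-02-29 (leap year)
```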

Staging Naming

Setting
Type
Description
Append Name ExternalText DatatypeThe string to append to External tables when using PolyBase.
Append Name LandingText DatatypeThe string to append to Landing tables when deploying using Azure Data Factory Copy directly to the database.
Schema Name PatternText DatatypeSpecific override behavior for the schema name for staging tables and other database assets.
Object Name PatternText DatatypeSpecific override behavior for the object name for staging tables.
Delete Object Name PatternText DatatypeThe name to use for the Delete Objects when using Delete Detection.
Delete Schema Name PatternText DatatypeThe name to use for the Delete objects schema when using Delete detection.
Append Record SourceText DatatypeDetermines if the Record Source Code from the connection is appended to the staging object name.
Append SchemaText DatatypeDetermines if the source Schema is appended to the object name in the staging layer.
Display Database NameText DatatypeControls if the source database name should be included in the generated SSIS package name.
Display Schema NameText DatatypeControls if the source schema name should be included in the generated SSIS package name.

Staging Persistent

Setting
Type
Description
Rollback STGText DatatypeFor SSIS Only. Determines if the Batch orchestration engine rolls back (deletes) committed changes to the Staging database in case of a failed process.
Rollback PSAText DatatypeFor SSIS Only. Determines if the Batch orchestration engine rolls back (deletes) committed changes to the Persistent Staging database in case of a failed process.
Append SchemaText DatatypeThe string to add to the PSA objects when Staging and Persistent Staging are co-located in the same database.
Temporal Table Pattern NameText DatatypeThe string to add to the Temporal PSA objects when Staging and Persistent Staging are co-located in the same database.
Enable End DateText DatatypeApply end dating in the PSA. This will allow timelines to be maintained in the PSA. Disable this to configure an insert-only approach for the PSA for optimized load performance.
Bypass Persistent ChecksText DatatypeEnable this to bypass lookups and directly apply all records to the Staging and Persistent Staging tables.
Use Cache LookupText DatatypeDetermines if the PSA lookup caches the data to disk when using SSIS. Use this if it is not possible to use the normal in-memory lookup behavior due to memory constraints.
Disable Stage, Persist OnlyText DatatypeEnable this to skip the staging layer and only persist changes directly in the PSA. This applies to SSIS output only.
Delta Use Hash DiffText DatatypeEnable this option to use Hash Diff comparisons for Change Data Capture and Change Tracking sources. This method provides a more robust way to identify and capture only the changes, but may increase load time.
Delta Detection in SQLText DatatypeDetermines if the delta detection applied when loading changes to the PSA uses a SQL procedure that runs on the Staging Area table, rather than running as part of the PSA pattern.
Delta Is Late ArrivingText DatatypeDetermines if the PSA load includes late-arriving deltas.
Delta Stage All RowsText DatatypeToggle to control whether to stage every row or compress by removing duplicate rows with the same values.
Delta Is DerivedText DatatypeDetermines if a PSA table already receives a data delta. Enable this if loading into PSA and the delta has already been derived earlier.
Merge All RowsText DatatypeEnable this setting to merge all data from a manually mapped source to a PSA table target, bypassing delta detection with rowhash diff comparison.
Truncate If Source Has RowsText DatatypeDetermines if the Persistent Staging Area (PSA) should be truncated when there is source data pending processing. Currently, this feature is not supported in Databricks and Snowflake environments.
Delta Collapse RowsText DatatypeEnable to keep only the initial row in sequences of identical values, discarding later timestamped duplicates.
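As a hedged illustration of rowhash-based delta detection against the PSA (table and column names are invented for this sketch, not BimlFlex's generated code):

```sql
-- New or changed staging rows: no current PSA row, or a different row hash.
SELECT stg.*
FROM stg.Customer AS stg
LEFT JOIN psa.Customer AS psa
    ON  psa.CustomerKey = stg.CustomerKey
    AND psa.EffectiveToDate IS NULL      -- current PSA version
WHERE psa.CustomerKey IS NULL            -- new key
   OR psa.RowHash <> stg.RowHash;        -- changed attributes
```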