Binary file modified content/assets/images/data-refresh-view.png
Binary file modified content/assets/images/data-refresh-view2.png
Binary file modified content/assets/images/dax-package-manager-overview.png
52 changes: 52 additions & 0 deletions content/features/Command-line-Options.md
Original file line number Diff line number Diff line change
@@ -214,6 +214,58 @@ $p = Start-Process -filePath TabularEditor.exe -Wait -NoNewWindow -PassThru `
exit $p.ExitCode
```

### Passing Parameters to Scripts via Environment Variables

When executing C# scripts with the `-S` switch in Azure DevOps pipelines, the recommended way to pass parameters is through environment variables rather than command-line arguments. C# scripts can read environment variables using `Environment.GetEnvironmentVariable()`, and Azure DevOps automatically makes non-secret pipeline variables available as environment variables (secret variables must be mapped explicitly using the `env` keyword).

**Example - Setting environment variables in YAML:**

```yaml
variables:
deployEnv: 'Production'
serverName: 'prod-sql-server'

steps:
- script: TabularEditor.exe "Model.bim" -S "UpdateModel.csx" -D "$(serverName)" "MyDatabase" -O -V -E -W
displayName: 'Deploy with Script Parameters'
env:
DEPLOY_ENV: $(deployEnv)
SERVER_NAME: $(serverName)
```

**Example - PowerShell Task with environment variables:**

```yaml
- task: PowerShell@2
displayName: 'Run Tabular Editor Script'
env:
DEPLOY_ENV: 'UAT'
CONNECTION_STRING: $(sqldwConnectionString)
inputs:
targetType: 'inline'
script: |
$p = Start-Process -filePath TabularEditor.exe -Wait -NoNewWindow -PassThru `
-ArgumentList "`"Model.bim`" -S `"ConfigureModel.csx`" -B `"output/Model.bim`" -V"
exit $p.ExitCode
```

**In your C# script (e.g., UpdateModel.csx):**

```csharp
var deployEnv = Environment.GetEnvironmentVariable("DEPLOY_ENV");
var serverName = Environment.GetEnvironmentVariable("SERVER_NAME");

Info($"Configuring model for {deployEnv} environment on {serverName}");

// Apply environment-specific changes
foreach(var ds in Model.DataSources.OfType<ProviderDataSource>())
{
ds.ConnectionString = ds.ConnectionString.Replace("{SERVER}", serverName);
}
```

This approach is cleaner and more maintainable than hardcoding values in scripts or using complex string replacement techniques. For more information on using environment variables in C# scripts, see [C# Scripts - Accessing Environment Variables](xref:csharp-scripts#accessing-environment-variables).

## Running the Best Practice Analyzer

You can use the `-A` switch to have Tabular Editor scan your model for all objects that are in violation of any Best Practice Rules defined on the local machine (in the %AppData%\..\Local\TabularEditor\BPARules.json file), or as annotations within the model itself. Alternatively, you can specify the path of a .json file containing Best Practice Rules after the `-A` switch, to scan the model using the rules defined in that file. Objects that are in violation are output to the console.
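
The two invocation styles described above can be sketched as follows (the rule-file path is illustrative, not a required location):

```
TabularEditor.exe "Model.bim" -A
TabularEditor.exe "Model.bim" -A "C:\Rules\MyCustomRules.json"
```

The first form scans using locally defined rules and model annotations; the second scans against the rules defined in the specified file.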
25 changes: 10 additions & 15 deletions content/features/built-in-bpa-rules.md
@@ -43,9 +43,7 @@ The 27 built-in rules cover four areas:
- **Maintenance**: Descriptions, calculation groups, unused objects

### Global and Per-Rule Control
<!--
![Placeholder: Screenshot showing BPA preferences with global enable/disable toggle and per-rule checkboxes](~/content/assets/images/features/bpa-built-in-rules-preferences.png)
-->
![Screenshot showing BPA preferences with global enable/disable toggle and per-rule checkboxes](~/content/assets/images/features/bpa-built-in-rules-preferences.png)
You can enable or disable built-in rules globally or individually. Settings persist across sessions and work independently of your custom rules.

To manage built-in rules:
@@ -55,28 +53,25 @@ To manage built-in rules:
4. Use the BPA Manager to enable or disable individual rules

### First-Run Notification
<!--
![Placeholder: Screenshot of first-run notification dialog introducing built-in BPA rules](~/content/assets/images/features/bpa-built-in-rules-notification.png)
-->
![Screenshot of first-run notification dialog introducing built-in BPA rules](~/content/assets/images/features/bpa-built-in-rules-notification.png)

The first time you open a model after upgrading to a version with built-in rules, you'll see a notification explaining the feature with a link to preferences. This notification only appears once.

### Knowledge Base Integration

<!--
![Placeholder: Screenshot showing BPA window with rule selected and "View Documentation" button highlighted](~/content/assets/images/features/bpa-built-in-rules-kb-link.png)
-->

![Screenshot showing BPA window with rule selected and "View Documentation" button highlighted](~/content/assets/images/features/bpa-built-in-rules-kb-link.png)

Every built-in rule links to a knowledge base article through the `KnowledgeBaseArticle` property. Each article explains what the rule checks, why it matters, and how to fix violations.

To view documentation, select a rule in the Best Practice Analyzer window.


### Read-Only Protection

Built-in rules can't be edited, cloned, or deleted. This ensures all users have the same rule definitions. You can disable individual rules, but the rule definitions themselves remain unchanged.

<!--
![Placeholder: Screenshot showing built-in rule with read-only badge/icon in BPA window](~/content/assets/images/features/bpa-built-in-rules-readonly.png)
-->
![Screenshot showing built-in rule with read-only badge/icon in BPA window](~/content/assets/images/features/bpa-built-in-rules-readonly.png)

### ID Collision Prevention

Built-in rules use reserved ID prefixes. When you create a custom rule, Tabular Editor validates that your ID doesn't conflict with built-in rules and shows an error if it does.
@@ -85,7 +80,7 @@ Built-in rules use reserved ID prefixes. When you create a custom rule, Tabular

The initial release includes the following rules:

<!--

### Error Prevention Rules
- [Avoid Invalid Characters in Object Names](xref:kb.bpa-avoid-invalid-characters-names)
- [Avoid Invalid Characters in Descriptions](xref:kb.bpa-avoid-invalid-characters-descriptions)
@@ -112,7 +107,7 @@ The initial release includes the following rules:
- [Calculation Groups Should Contain Items](xref:kb.bpa-calculation-groups-no-items)
- [Perspectives Should Contain Objects](xref:kb.bpa-perspectives-no-objects)
- [Use Latest Power BI Compatibility Level](xref:kb.bpa-powerbi-latest-compatibility)
-->


## Working with Built-in and Custom Rules

80 changes: 80 additions & 0 deletions content/features/csharp-scripts.md
@@ -226,6 +226,86 @@ In addition, the following .NET Framework assemblies are loaded by default:
- TabularEditor.Exe
- Microsoft.AnalysisServices.Tabular.Dll

## Accessing Environment Variables

When running C# scripts via the Tabular Editor CLI (especially in CI/CD pipelines), you can pass parameters to your scripts using environment variables. This is the recommended approach, as C# scripts executed by Tabular Editor CLI don't support traditional command-line arguments.

### Reading Environment Variables

Use the `Environment.GetEnvironmentVariable()` method to read environment variables in your script:

```csharp
// Read environment variables
var serverName = Environment.GetEnvironmentVariable("SERVER_NAME");
var environment = Environment.GetEnvironmentVariable("ENVIRONMENT");

// Use them in your script
foreach(var dataSource in Model.DataSources.OfType<ProviderDataSource>())
{
if(dataSource.Name == "SQLDW")
{
dataSource.ConnectionString = dataSource.ConnectionString
.Replace("{SERVER}", serverName)
.Replace("{ENV}", environment);
}
}

Info($"Updated connection strings for {environment} environment");
```

### Azure DevOps Integration

Environment variables integrate seamlessly with Azure DevOps pipelines: non-secret pipeline variables are automatically available as environment variables, while secret variables must be mapped explicitly using the `env` keyword.

**Example Azure DevOps YAML Pipeline:**

```yaml
variables:
targetServer: 'Production'
targetDatabase: 'AdventureWorks'

steps:
- task: PowerShell@2
displayName: 'Deploy Model with Parameters'
env:
SERVER_NAME: $(targetServer)
DATABASE_NAME: $(targetDatabase)
inputs:
targetType: 'inline'
script: |
TabularEditor.exe "Model.bim" -S "DeploymentScript.csx" -D "$(targetServer)" "$(targetDatabase)" -O -V -E -W
```

In this example, the script `DeploymentScript.csx` can access `SERVER_NAME` and `DATABASE_NAME` using `Environment.GetEnvironmentVariable()`.
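
A corresponding `DeploymentScript.csx` might begin by reading and validating these variables. The following is a hypothetical sketch (the variable names match the pipeline above; `Info` and `Error` are Tabular Editor script helpers, available only when the script runs inside Tabular Editor):

```csharp
// Hypothetical DeploymentScript.csx - names match the pipeline example above
var serverName = Environment.GetEnvironmentVariable("SERVER_NAME");
var databaseName = Environment.GetEnvironmentVariable("DATABASE_NAME");

// Fail fast if the pipeline did not supply the expected variables
if(string.IsNullOrEmpty(serverName) || string.IsNullOrEmpty(databaseName))
{
    Error("SERVER_NAME and DATABASE_NAME must be set before running this script.");
    return;
}

Info($"Deploying {databaseName} to {serverName}");
```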

### Common Use Cases

Environment variables are particularly useful for:

- **Dynamic connection strings**: Update data source connections based on deployment environment (Dev, UAT, Production)
- **Conditional logic**: Apply different transformations based on target environment
- **Deployment configuration**: Control which objects to deploy or modify based on parameters
- **Multi-environment support**: Use the same script across different environments with different values

**Example - Environment-specific modifications:**

```csharp
var environment = Environment.GetEnvironmentVariable("DEPLOY_ENV") ?? "Development";
var refreshPolicy = Environment.GetEnvironmentVariable("ENABLE_REFRESH_POLICY") == "true";

// Apply environment-specific settings
foreach(var table in Model.Tables)
{
if(environment == "Production" && !refreshPolicy)
{
// Disable incremental refresh policies in production if specified
table.EnableRefreshPolicy = false;
}
}

Info($"Configured model for {environment} environment");
```

## Compatibility

The scripting APIs for Tabular Editor 2 and Tabular Editor 3 are mostly compatible; however, there are cases where you may want to conditionally compile code depending on which version you're using. For this, you can use preprocessor directives, which were introduced in Tabular Editor 3.10.0.
13 changes: 13 additions & 0 deletions content/features/tmdl.md
@@ -36,6 +36,19 @@ When saving a new model for the first time, Tabular Editor (since v. 3.7.0), wil

![New Model Tmdl](~/content/assets/images/new-model-tmdl.png)

## TMDL and Microsoft Fabric Git Integration

TMDL is fully compatible with Microsoft Fabric's Git integration feature. When you use the **Save with supporting files** option in Tabular Editor 3, the TMDL serialization format creates a folder structure that includes all necessary metadata files required by Fabric's Git integration.

The resulting folder structure includes:
- **.platform** file with metadata (display name, description, logical ID)
- **definition.pbism** file with semantic model settings
- **definition/** folder containing your TMDL model files
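
To make this concrete, a saved semantic model folder might look like the following (a hypothetical layout; the actual file and folder names depend on your model and its tables):

```
MySemanticModel.SemanticModel/
├── .platform
├── definition.pbism
└── definition/
    ├── database.tmdl
    ├── model.tmdl
    ├── relationships.tmdl
    └── tables/
        ├── Sales.tmdl
        └── Customers.tmdl
```

Committing this folder to a Git repository connected to a Fabric workspace lets Fabric recognize the model as a semantic model item.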

This combination allows you to commit your semantic models to Git repositories and synchronize them with Microsoft Fabric workspaces using Fabric's built-in Git integration capabilities. The human-readable nature of TMDL makes it particularly well-suited for code reviews and tracking changes in version control systems.

For detailed information on using this feature, see [Save with supporting files](xref:save-with-supporting-files).

# Next steps

- [TMDL overview (Microsoft Learn)](https://learn.microsoft.com/en-us/analysis-services/tmdl/tmdl-overview).
5 changes: 4 additions & 1 deletion content/features/using-bpa.md
@@ -21,6 +21,9 @@ applies_to:

The Best Practice Analyzer (BPA) lets you define rules on the metadata of your model, to encourage certain conventions and best practices while developing your Power BI or Analysis Services Model.

> [!NOTE]
> Tabular Editor 3 includes a comprehensive set of [built-in Best Practice Analyzer rules](xref:built-in-bpa-rules) that are enabled by default for new users.

## BPA Overview
The BPA overview shows you all the rules defined in your model that are currently being broken:

@@ -39,7 +42,7 @@ Clicking the link (or pressing F10), brings up the full BPA window.

### Functionality

Whenever a change is made to the model, the Best Practice Analyzer scans your model for issues in the background. You can disable this feature under File > Preferences.
Whenever a change is made to the model, the Best Practice Analyzer scans your model for issues in the background. You can disable this feature under **Tools > Preferences > Best Practice Analyzer**.

The BPA window in both TE2 and TE3 can be docked on one side of your desktop while the main window occupies the other, allowing you to work with your model while keeping BPA issues in view.

Expand Down
22 changes: 21 additions & 1 deletion content/features/views/data-refresh-view.md
@@ -26,7 +26,27 @@ A new active refresh will appear when a new refresh is triggered through the TOM
<figcaption style="font-size: 12px; padding-top: 10px; padding-bottom: 15px; padding-left: 75px; padding-right: 75px; color:#00766e"><strong>Figure 1:</strong> Data Refresh View in Tabular Editor. New refresh can be started by right-clicking a table and selecting refresh </figcaption>
</figure>

A new refresh will run in the background so that you can continue to build your dataset, and Tabular Editor will let you know if the refresh fails with a pop-up.

## Data Refresh view columns

The Data Refresh view displays the following information for each refresh operation:

- **Object**: The name of the model object being refreshed (table, partition, or model)
- **Description**: Additional details about the refresh operation and its current state
- **Progress**: The number of rows that have been imported so far
- **Start Time**: The date and time when the refresh operation began. This is useful for tracking when operations were initiated, especially when multiple refreshes are queued
- **Duration**: The elapsed time since the refresh operation started, updated in real-time for active operations

### Sorting refresh operations

You can sort the refresh operations by clicking any column header. This is particularly useful for:

- Sorting by **Start Time** to order operations chronologically, with the most recent first (descending) or last (ascending)
- Sorting by **Duration** to identify long-running operations
- Sorting by **Object** to group refreshes by table or partition name

Click a column header once to sort ascending, and again to sort descending. This makes it easy to identify the latest operations when multiple refreshes are queued.

> [!NOTE]
> All the messages and durations shown in the Data Refresh window are estimates only. Tabular Editor listens to [trace events from SSAS](https://learn.microsoft.com/en-us/analysis-services/trace-events/analysis-services-trace-events?view=asallproducts-allversions) during processing. SSAS is not guaranteed to send all trace messages to the client (for example it may throttle the trace event notifications during times of peak CPU/memory consumption).
2 changes: 1 addition & 1 deletion content/features/views/user-interface.md
@@ -99,7 +99,7 @@ The **File** menu primarily contains menu items for dealing with loading and sav
> In Tabular Editor 3 Desktop Edition the **Open > Model from file...** and **Open > Model from folder...** options are not available and the **Open > File...** dialog only allows opening [supporting files](xref:supported-files#supported-file-types), not files containing metadata.

- **Revert**: This option lets you reload the model metadata from the source, discarding any changes that are made in Tabular Editor, which have not yet been saved. This option is useful when Tabular Editor 3 is used as an External Tool for Power BI Desktop, and a change is made in Power BI Desktop while Tabular Editor 3 is connected. By choosing **Revert**, Tabular Editor 3 can reload the model metadata from Power BI Desktop without having to reconnect.
- **Close**: This closes the active document (for example a DAX Query, a C# script or a data model diagram). If the document has unsaved changes, Tabular Editor will prompt you to save the changes before closing.
- **Close Document** (Ctrl+W): Closes the currently active document or panel in the main area, such as a DAX Query, a C# script, a data model diagram, or any other view with focus. If the document has unsaved changes, Tabular Editor will prompt you to save the changes before closing. This command is context-aware and will close whichever item is currently active in the main workspace area.
- **Close model**: This unloads the currently loaded model metadata from Tabular Editor. If you made changes to the metadata, Tabular Editor will prompt you to save the changes before closing.
- **Save**: This saves the active document back to the source file. If no document is active, this saves the model metadata back to the source, which could be a Model.bim file, a Database.json (folder structure) or a connected instance of Analysis Services (including Power BI Desktop) or the Power BI XMLA endpoint.
- **Save as...** This allows you to save the active document as a new file. If no document is active, this allows you to save the model metadata as a new file, using the .bim (JSON-based) file.
7 changes: 5 additions & 2 deletions content/getting-started/bpa.md
@@ -31,7 +31,7 @@ Things you can check with the Best Practice Analyzer:
The Best Practice Analyzer has access to the full metadata of the model, and can also access VertiPaq Analyzer statistics for more advanced scenarios.

> [!NOTE]
> Tabular Editor does not ship with any rules out-of-the-box. You will have to define your own rules initially, or use a set of standard rules such as [those recommended by the Power BI CAT Team](https://powerbi.microsoft.com/en-ca/blog/best-practice-rules-to-improve-your-models-performance/).
> Tabular Editor 3 includes a comprehensive set of [built-in Best Practice Analyzer rules](xref:built-in-bpa-rules) that are enabled by default.

# Managing Best Practice Rules

@@ -98,6 +98,9 @@ Tabular Editor displays the best practice rule violations within the **Best Prac

The **Best Practice Analyzer view** shows a list of all rules that have objects in violation. Below each rule is a list of the violating objects. You can double-click on an object in the list, to navigate to that object in the **TOM Explorer**.

> [!TIP]
> **Enterprise Edition users**: Built-in BPA rules are displayed alongside any custom rules you define. These rules are enabled by default and provide comprehensive best practice guidance. You can manage built-in rules through **Tools > Manage BPA Rules...** where they appear in the **(Built-in rules)** collection. For more information, see [Built-in BPA rules](xref:built-in-bpa-rules).

![Item options](~/content/assets/images/bpa-options.png)

When right-clicking on an object, you are presented with a number of options as shown above. These are:
@@ -116,7 +119,7 @@ The options shown above are also available as toolbar buttons at the top of the

In some cases, you may want to disable the Best Practice Analyzer background scan. For example, when you have rules that take a relatively long time to evaluate, or when you are working with very large models.

The background scan can be disabled under **Tools > Preferences > Features > Best Practice Analyzer** by unchecking the **Scan for Best Practice violations in the background**.
The background scan can be disabled under **Tools > Preferences > Best Practice Analyzer** by unchecking the **Scan for Best Practice violations in the background** option.

Note that you can still manually perform a scan using the **Refresh** button of the **Best Practice Analyzer view**, as mentioned above, even when background scans are disabled.
