Deployment pipelines and Power BI semantic models with Direct Lake on OneLake tables
- DataZoe
Deployment pipelines allow you to develop models and reports in a development workspace and then push them to a production workspace when ready. This ensures that as you make changes to your semantic models, you have time to update your reports and test them before impacting your report consumers. Apps help here too, but they only shield report changes from your consumers; apps are a great way to share your reports widely and with specific audiences.
When using Direct Lake tables in a semantic model, Direct Lake on SQL works well with deployment pipeline rules, because the connector it uses supports binding to different data sources.

Direct Lake on OneLake is a marked improvement over Direct Lake on SQL, allowing multiple Direct Lake source tables and tables from import sources in the same semantic model. In deployment pipeline rules, however, the connector doesn't support binding to different data sources. You can still use the parameter rules part of deployment pipeline rules.
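To see why only parameter rules apply, it helps to look at the shared expression behind a Direct Lake on OneLake connection. Before any parameterization, it looks roughly like the sketch below (the expression name and GUID placeholders are illustrative; yours will reflect your own workspace and Lakehouse). The OneLake URI is a literal string inside the M expression, so there is no separate data source for a rule to rebind.

```
expression 'DirectLake - Lakehouse' =
        let
            // The OneLake URI (workspace GUID / Lakehouse GUID) is baked into the source
            Source = AzureStorage.DataLake("https://onelake.dfs.fabric.microsoft.com/GUID1/GUID2", [HierarchicalNavigation=true])
        in
            Source
    annotation PBI_IncludeFutureArtifacts = False
```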

It is easy to set up, and the semantic model refreshes and works as expected. Unfortunately, for editing in Power BI Desktop there is currently a bug with these impacts:
- The parameter can't be set up in Power BI Desktop's TMDL view.
- You won't be able to live edit in Power BI Desktop after this change.
- You can open the model in web modeling once you have it set up, but Edit tables will be disabled.

So, for now, you can use other tools, such as Tabular Editor, XMLA in SSMS, or TMDL in any text editor, including editing the expression in a Git repo in the web. Here I will explain how to do it in VS Code using the Fabric Studio extension.

1. Get VS Code and the Fabric Studio extension, and log in to see your workspaces.
2. In the Fabric workspaces section, find your semantic model.
3. Expand the Definition folder.
4. Expand the definition folder (yes, there are two).
5. Click expressions.tmdl to open it in the editor.
6. Add the expression parameter and then reference it in your connection (see below).
7. Save the TMDL file.
8. Right-click the Definition folder and choose Publish to Fabric.
```
expression 'DirectLake - Lakehouse' =
        let
            Source = AzureStorage.DataLake(#"OneLakeURI", [HierarchicalNavigation=true])
        in
            Source
    annotation PBI_IncludeFutureArtifacts = False

expression OneLakeURI = "https://onelake.dfs.fabric.microsoft.com/GUID1/GUID2" meta [IsParameterQuery = true, IsParameterQueryRequired = true, Type = "Text"]
    annotation PBI_ResultType = Text
```

Here I took the existing URI and put it in a parameter. GUID1 is the workspace ID and GUID2 is your Fabric source ID, in this case my Lakehouse ID. Be sure not to use the SQL analytics endpoint ID for any Direct Lake on OneLake source. Also be sure that "expression" is not indented for either expression.
Now when you go to your deployment pipeline and click on the rules, the semantic model should show the parameter rules section. Change the parameter to the new data source for the next stage!
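For example, if the development and production stages each point at their own Lakehouse, the parameter rule simply swaps the URI value the parameter holds for that stage. The workspace and Lakehouse IDs below are placeholders for illustration:

```
// Development stage (the value saved in the model)
"https://onelake.dfs.fabric.microsoft.com/<dev-workspace-id>/<dev-lakehouse-id>"

// Production stage (the value set by the deployment pipeline parameter rule)
"https://onelake.dfs.fabric.microsoft.com/<prod-workspace-id>/<prod-lakehouse-id>"
```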
Read more about TMDL at https://learn.microsoft.com/en-us/analysis-services/tmdl/tmdl-overview and Direct Lake at https://learn.microsoft.com/en-us/fabric/fundamentals/direct-lake-overview.
