Over a year ago, I wrote a blog post about moving dataflows across workspaces using a PowerShell script. This is especially useful if you want to promote dataflow logic from your development workspace to a test, acceptance, or production workspace.
I received a bunch of feedback on this script and ran into some issues myself lately, so it was about time for an update! Below, I share the issues that are addressed in this new version and the new features that have been added to the script.
Issues addressed and new enhancements
The script leverages the Power BI REST API, specifically the Imports In Group endpoint. This API can be used to publish both pbix files and dataflows.
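As a rough illustration, a dataflow import through this endpoint could look like the sketch below. This is not the exact code from the script; $accessToken and $workspaceId are assumed to come from earlier authentication and setup steps. Note that for dataflow imports, the datasetDisplayName query parameter must be set to model.json.

```powershell
# Hedged sketch, not the exact script code: publish a dataflow definition
# (model.json) to a workspace via the Imports In Group endpoint.
# Assumed to exist already: $accessToken (AAD bearer token) and $workspaceId.
$modelJson = Get-Content -Path ".\model.json" -Raw

# For dataflows, datasetDisplayName must be "model.json"
$uri = "https://api.powerbi.com/v1.0/myorg/groups/$workspaceId/imports" +
       "?datasetDisplayName=model.json&nameConflict=GenerateUniqueName"

Invoke-RestMethod -Method Post -Uri $uri `
    -Headers @{ Authorization = "Bearer $accessToken" } `
    -ContentType "multipart/form-data" `
    -Body $modelJson
```

A real implementation would also build a proper multipart body and handle error responses; the sketch only shows the shape of the call.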
Conflict Handler Mode
One of the issues I faced early on relates to the conflict handler mode. This parameter in the API call defines whether an existing dataflow should be overwritten, ignored, or published under a generated unique name.
After some trial and error with this specific setting, this should now be addressed in the new version of the PowerShell script. On first-time deployment, it publishes the dataflows under their original names. If the script detects an existing dataflow with the same name, it uses the overwrite setting of the conflict handler mode.
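In pseudo-form, that decision could be sketched as follows. The endpoint is the public Get Dataflows In Group API; the exact logic and variable names in the script may differ, and $accessToken, $workspaceId, and $dataflowName are assumed to be set elsewhere.

```powershell
# Hedged sketch of the conflict handling step: list existing dataflows in the
# target workspace, then pick the nameConflict value accordingly.
$result = Invoke-RestMethod -Method Get `
    -Uri "https://api.powerbi.com/v1.0/myorg/groups/$workspaceId/dataflows" `
    -Headers @{ Authorization = "Bearer $accessToken" }

if ($result.value | Where-Object { $_.name -eq $dataflowName }) {
    $nameConflict = "Overwrite"          # existing dataflow: overwrite it
} else {
    $nameConflict = "GenerateUniqueName" # first deployment: no conflict, name is kept
}
```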
Changing connection strings
If you move a dataflow from Development to Production, which is a very likely scenario in a DTAP development approach, it is also likely that your data source changes at the same time. As the previous version of the script simply extracted the model.json from workspace A and uploaded it to workspace B, there was no easy way to change the connection string during deployment. This resulted in manual steps after the dataflow was moved from A to B.
As dataflows do not yet support changing parameter values from outside the dataflow, another solution had to be found. The new version of the script includes an option that performs a simple find & replace on a text string. So, if you specify the connection string for environment A as well as for environment B, the script does a find and replace during deployment. The example below shows the connection strings as specified in a variable of the script.
< rest of the script >
...
$replaceItems = @(
    [Replacement]::new("https://dlswedemod01.blob.core.windows.net","https://dlswedemop01.blob.core.windows.net")
)
...
< rest of the script >
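For context, here is a minimal, self-contained sketch of how such a Replacement type and the find & replace step could work. The class and property names are my assumptions, not necessarily those used in the actual script, and the sample model.json string is made up for illustration.

```powershell
# Hypothetical Replacement class; the actual script may define it differently.
class Replacement {
    [string]$SearchValue
    [string]$ReplaceValue
    Replacement([string]$search, [string]$replace) {
        $this.SearchValue  = $search
        $this.ReplaceValue = $replace
    }
}

$replaceItems = @(
    [Replacement]::new("https://dlswedemod01.blob.core.windows.net",
                       "https://dlswedemop01.blob.core.windows.net")
)

# Apply every replacement to the exported model.json text before uploading it
$modelJson = '{"connection":"https://dlswedemod01.blob.core.windows.net/sales"}'
foreach ($item in $replaceItems) {
    $modelJson = $modelJson.Replace($item.SearchValue, $item.ReplaceValue)
}
# $modelJson now references the production storage account
```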
Move multiple dataflows at a time
Finally, moving a whole bunch of dataflows could result in a lot of manual work, as the previous version of the script only supported moving one dataflow at a time.
The new version now supports moving multiple dataflows at once. By specifying another variable in the script, you can easily construct a list of the source workspace, destination workspace, and dataflow name for each dataflow you want to move during script execution.
< rest of the script >
...
$dataFlowItems = @(
    [DataFlow]::new("Sales Analytics DEV","Sales Analytics PRD","Sales Analytics Actuals"),
    [DataFlow]::new("Sales Analytics DEV","Sales Analytics PRD","Sales Analytics Budgets"),
    [DataFlow]::new("Sales Analytics DEV","Sales Analytics PRD","Sales Analytics Forecasts"),
    [DataFlow]::new("Sales Analytics DEV","Sales Analytics PRD","Sales Analytics Management Structure"),
    [DataFlow]::new("Sales Analytics DEV","Sales Analytics PRD","Sales Analytics Opportunity")
)
...
< rest of the script >
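Conceptually, the script can then loop over this list. In the hedged sketch below, the DataFlow class layout is my assumption, and Export-Dataflow and Import-Dataflow are hypothetical helper names standing in for the script's actual export and import steps.

```powershell
# Hypothetical DataFlow class matching the constructor calls above
class DataFlow {
    [string]$SourceWorkspace
    [string]$DestinationWorkspace
    [string]$DataflowName
    DataFlow([string]$src, [string]$dst, [string]$name) {
        $this.SourceWorkspace      = $src
        $this.DestinationWorkspace = $dst
        $this.DataflowName         = $name
    }
}

# Hypothetical driver loop: export the model.json from the source workspace,
# rewrite the connection strings, then import into the destination workspace.
foreach ($item in $dataFlowItems) {
    $modelJson = Export-Dataflow -Workspace $item.SourceWorkspace -Name $item.DataflowName
    foreach ($r in $replaceItems) {
        $modelJson = $modelJson.Replace($r.SearchValue, $r.ReplaceValue)
    }
    Import-Dataflow -Workspace $item.DestinationWorkspace -ModelJson $modelJson
}
```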
The full script, including a description of how to use it, is now available in my GitHub library for Power BI Automation. Find the full script there and take advantage of it!
Finally, I want to give a big shout-out to my colleague Ton Swart, who contributed to this script and helped make these changes happen!
5 thoughts on “Update: Move dataflows across workspaces”
Hey Marc, thanks for working on this and sharing your work.
I’ve been looking at automating our dataflows as well (like in PowerBI_MoveDataflows.ps1). Can you please help me understand how you came up with this script? Was it trial-and-error? Did you have guidance from anyone on the Power BI team at Microsoft?
I’m opening a support case with CSS, based on my (unfortunate) experiences with “CreateOrOverwrite”. This dataflow stuff is not documented well, and what little we do find in their documentation is wrong:
Their documentation is very wrong. As one case in point, you are using "Overwrite", but the docs clearly say that Abort and GenerateUniqueName are the only options allowed for dataflows.
I don’t particularly want to spend any more time banging my head against this, so I’m wondering if you know any contacts at Microsoft who built this mess. If you have any contacts, I’ll pass them along to the Microsoft CSS team.
Given the series of changes you made in your “PowerBI_MoveDataflows.ps1” over the course of the past few years, I’m nervous that their REST api is not well-defined and may continue to change at any time. I’m worried about putting any mission-critical dependencies on undocumented features of their REST api. Do you have any thoughts regarding the quality of the docs, or the likelihood that this API will continue to work as it does today?
One final note… your script doesn’t use “CreateOrOverwrite” (anymore) but that is briefly (and inconsistently) mentioned in the Microsoft docs here :
Quote: “Dataflows can be overwritten with the CreateOrOverwrite parameter”
… as a result of that note I spent an inordinate amount of time using that for the conflict-handler-mode. I will probably avoid it going forward, especially given that you had abandoned that conflict-handler-mode two years ago.
Thanks again for the tips you’ve shared about this REST api. I’ve had more luck with dataset-related operations than with the dataflow ones.
FYI, The method _postDataflowDefinition in PowerBI_MoveDataflows.ps1
… is returning something called an “import ID”
… but it is being labeled by your script as the “new dataflow with id : x”
It is a small detail, but there is actually a “dataflow ID” as well, and they are both guids so it is important not to confuse the two different things.