External Tools: Document your Power BI Model

In the Power BI Desktop July release, the External Tools capability was released (in preview). With that, third-party tool integration is now possible. Tools such as ALM Toolkit, DAX Studio and Tabular Editor integrate nicely with Power BI Desktop and are capable of writing to your Power BI model (if the enhanced metadata preview is enabled).

External Tools capabilities are open source, so anyone can create their own tools to integrate with Power BI. Inspired by all the great content already created by David Eldersveld for Python integration and Erik Svensen for Excel integration, I decided to build my own External Tool as well. In this blog I describe how I created my external tool to document your Power BI model and how you can start using it!

Documentation?

Yes, really! I built an External Tool to document your Power BI model. I know, documentation is not a very popular topic, but I believe it is a very important one! The lead time from data to insights is very short with Power BI, but often we forget to look back on what kind of monster we created. Especially if you want to share the dataset for reuse, I believe it is important to deliver some documentation as well. Maybe you will even promote or certify this dataset in the future, which implies that the model meets certain quality metrics and best practices.

I also posted a poll on Twitter to measure the sentiment around documenting Power BI solutions. I was a bit shocked by the results, though not surprised at the same time.

As we live in a busy era, we do not have that much time to document our solutions. Fellow MVP Ásgeir Gunnarsson posted a nice comment on my tweet, stating that documentation is often not prioritized by the customer.

Although the sentiment around documentation is not very positive, I believe that we as Data Analytics professionals should continue advising our customers to care about documentation. By providing an External Tool in Power BI Desktop, I hope that I can make it easier for a lot of people to deliver at least some basic documentation with their solutions.

External Tool Integration

As mentioned, in a few steps we can integrate a new External Tool in Power BI. Below I describe how you can integrate the Model Documentation External Tool yourself. But first, let's have a quick look at how it works and what the end result will be.

Please know that the description below might not match the current way of working 100%, as I'm continuously updating the tool based on your feedback! This applies to the way of working as well as to installing the tool. I advise you to check the change log as well.

How it works

Let me briefly describe how the approach works and what you can expect. Below is a short line-up of the steps, in order.

  1. Open the Power BI solution that you want to document.
  2. In the top ribbon under External Tools, click Document Model.
  3. After the click, you might see a PowerShell window appear and disappear quickly. This is the process that triggers all other actions.
  4. During execution of the script, a json file is dumped in the defined location. By default this is C:\temp, but you might have changed this to your own preference. See the next chapter as well.
  5. A new Power BI Desktop instance will open, and the .pbit template file will open directly. Here again, the location of the dumped file needs to be entered. The default location is listed in the template.
Opening the template. Please enter the file path as parameter. Default value is C:\temp\
  6. Click Load to start loading the report. You will be asked to authenticate to the source as well. Please choose Windows authentication here, as we query the localhost on your computer.
Windows authentication while loading the data
  7. The provided Power BI template report will load. The next chapter tells more about the template and what it looks like.
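To give an idea of what the dumped file contains: the payload the script builds is just the server address of the local Analysis Services instance (something like localhost:&lt;port&gt;) and the database name. The values below are illustrative, not taken from a real run; note that the script pipes this payload through ConvertTo-Json before writing, so the file on disk is a json-encoded string of this text.

```json
{
    "Server": "localhost:51842",
    "DatabaseName": "d1b2c3d4-5678-90ab-cdef-1234567890ab"
}
```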

What you will get

As described above, when you click the button in the External Tools ribbon, a PowerShell script is executed. This script dumps a file with the Server and DatabaseName in a local folder on your computer. As a second step, a Power BI template file is opened that leverages the file created earlier and provides you with a new Power BI report that describes your Power BI model.

There is a lot of content available online that gives insights into what you can document about your Power BI model. Therefore I decided to focus on the External Tools integration rather than building a stunning report. Of course, you can customize the Power BI report or build your own model documentation report. Didier Terrien listed a few links on his blog that might serve as inspiration for what is possible.

The report that is included in the solution will tell you more about:

  • Tables and columns in your data model
  • Power Query and DAX table expressions
  • Measure insights: expressions, locations, descriptions and more
  • Relationship overview: which relationships are in your model
Power BI Report: Table and Column overview

Besides these overviews, I also included some analytics on best practices, such as:

  • Flagging auto generated date tables
  • Suggesting improvements to DAX calculated columns
  • Errors in measures
  • Relationships with both-direction cross filtering
  • Many-to-many relationships
Power BI Report: Relationships overview

Integrate the External tool

Below I describe step by step what you need to do in order to integrate this External Tool in your own Power BI Desktop.

Step 1: Download necessary files
Download everything you need from my GitHub repository. This includes:

  • The Power BI Template file:
    ModelDocumentationTemplate.pbit
  • The PowerShell Script:
    Data-Marc_WriteConnectionDetailsToFile.ps1
  • External Tool integration file:
    Data-Marc_ModelDocumentor.pbitool.json

Step 2: Upload the integration file
The External Tool integration file is needed to get the button in the Power BI ribbon. To achieve this, you need to copy the Data-Marc_ModelDocumentor.pbitool.json file to the External Tools folder. For me this location was:
C:\Program Files (x86)\Common Files\microsoft shared\Power BI Desktop\External Tools

While copying the files, Windows might ask you to sign in with admin privileges before you can continue. This is mandatory to copy the files. If you cannot do this yourself, please contact your administrator.

pbitool.json file uploaded to the folder.

Step 3: Upload other files to temp folder
The two other files are referenced by the PowerShell script and need to be copied to a specific location as well. I have put them in a temp folder, since they are only used during script execution.

Please copy the PowerShell script and the Power BI template file to C:\temp. If this folder does not exist or you cannot copy files to it, you can put them in another location of your preference. Please know that this requires some minor edits to the PowerShell script as well.

Identical to the previous step, Windows might ask you to authenticate with admin privileges before you can continue.

Files to upload in C:\temp
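The two copy steps above can be sketched in PowerShell as follows. Run this from an elevated (admin) prompt; $DownloadFolder is an assumption for wherever you extracted the downloaded files, so adjust it to your situation.

```powershell
$DownloadFolder = "$env:USERPROFILE\Downloads"  # assumption: where you saved the files
$ExternalTools  = 'C:\Program Files (x86)\Common Files\microsoft shared\Power BI Desktop\External Tools'
$TempFolder     = 'C:\temp'

# Step 2: the integration file goes to the External Tools folder (admin rights needed).
Copy-Item (Join-Path $DownloadFolder 'Data-Marc_ModelDocumentor.pbitool.json') $ExternalTools

# Step 3: the script and template go to C:\temp; create the folder first if it does not exist.
if (-not (Test-Path $TempFolder)) { New-Item -ItemType Directory -Path $TempFolder | Out-Null }
Copy-Item (Join-Path $DownloadFolder 'Data-Marc_WriteConnectionDetailsToFile.ps1') $TempFolder
Copy-Item (Join-Path $DownloadFolder 'ModelDocumentationTemplate.pbit') $TempFolder
```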

Step 4: Restart Power BI Desktop
You have applied all required steps by now. Assuming that you did not change any of the file locations, a new button will appear in the External Tools section of the Power BI Desktop top ribbon. In case you had Power BI running already, please restart Power BI Desktop first.

If you made changes to the file locations, please read the next chapter first.

Power BI Desktop top ribbon – the new button Document Model is added

If something did not work out as expected, kindly check the FAQ that I added on GitHub to see if your question is already listed there. If not, please let me know and we will see if we can get it working for you!

Changes you can make

By following the four easy steps in the previous chapter, you can easily add the Document Model external tool to your Power BI Desktop. Still, you might want to customize some parts. Below I highlight some of the changes you can easily make.

Change of file location

In case you want to save the files in a different location than C:\temp, you must apply some changes to the PowerShell script. Looking at the script, there are two items that refer to this location. Both are declared as variables at the top of the script for easy adjustment.

  • $OutputFolder: the location where a temporary file with the server and database name will be saved.
  • $PBITLocation: the location where the Power BI template file is saved.

I added comments throughout the code, so you know what each step does.

# Below you can define your personal preference for file saving and reading.
# The OutputFolder will be used to dump a temporary file with the connection to the model.
# The PBITLocation defines where the templated PBIT file is saved.
$OutputFolder = 'c:\temp\'
$PBITLocation = 'C:\temp\ModelDocumentationTemplate.pbit'

# The section below defines the server and database name based on the input captured from the External Tools integration.
# These are defined as arguments \"%server%\" and \"%database%\" in the external tools json.
$Server = $args[0]
$DatabaseName = $args[1]

# Write Server and Database information to the screen.
Write-Host $Server
Write-Host $DatabaseName

# Generate the json payload based on the received server and database information.
$json = @"
    {
    "Server": "$Server", 
    "DatabaseName": "$DatabaseName"
    }
"@

# Writes the output in json format to the defined file location. This is a temp location and will be overwritten next time.
$OutputLocation = $OutputFolder + 'ModelDocumenterConnectionDetails.json'
$json  | ConvertTo-Json  | Out-File $OutputLocation

# Open the PBIT template file from the location defined in $PBITLocation.
Invoke-Item $PBITLocation
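If the json file does not show up in your output folder after clicking the button, you can run the script by hand from a PowerShell prompt with dummy values to see any errors. This debug call is my suggestion, not part of the original tooling; the server address is illustrative, as Power BI Desktop normally passes in something like localhost:&lt;port&gt;.

```powershell
# Manual debug run of the connection-details script (assumes it lives in C:\temp).
# The two arguments stand in for %server% and %database%; the values are illustrative.
powershell.exe -ExecutionPolicy Bypass `
    -File 'C:\temp\Data-Marc_WriteConnectionDetailsToFile.ps1' 'localhost:51842' 'MyDatabase'
```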

If you decide to change the location of the PowerShell script, do not forget to also change it in the External Tool integration file. This json file refers to the location where the script is saved. When the button is clicked, this script is executed.

In the arguments section, you see the file path that refers to the script. Other than that, your json might be a bit longer; for readability on my blog, I removed the base64 icon in the example below.

{
	"version": "1.0",
	"name": "Document Model",
	"description": "This tool documents your Power BI data model in a separate Power BI report",
	"path": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
	"arguments": "C:\\temp\\Data-Marc_WriteConnectionDetailsToFile.ps1 \"%server%\" \"%database%\"",
	"iconData": "image/png;base64,<encoded png icon data>"
} 
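Should Power BI Desktop complain about invalid json after you edit this file, a quick sanity check is to let PowerShell parse it; the path below is the default from step 2, so adjust it if yours differs.

```powershell
# Throws a parse error if the file is not valid json; otherwise shows the parsed fields.
Get-Content 'C:\Program Files (x86)\Common Files\microsoft shared\Power BI Desktop\External Tools\Data-Marc_ModelDocumentor.pbitool.json' -Raw |
    ConvertFrom-Json | Format-List
```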

Changes of the Power BI Template file

As said before, the provided Power BI Model Documentation template file is just a start. There are many more things you can do. It is completely up to you to use a different .pbit file or stick with this one.

As inspiration, have a look at what Dynamic Management Views (DMVs) can provide in relation to Tabular models. Also check out the earlier shared link to Didier Terrien's blog with a bunch of other examples.
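For example, a single DMV query returns every measure in the model with its expression. You can run this from DAX Studio or any client connected to the local model; the view name comes from the standard Analysis Services DMV set, not from this tool.

```sql
-- List every measure in the model, with its expression and home table.
SELECT [Name], [Expression], [TableID]
FROM $SYSTEM.TMSCHEMA_MEASURES
```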

Wrap up and remarks

I believe this new external tool will help you deliver at least some basic documentation with every Power BI solution you build from now on. This basic template provides you some insights into what has been built and some checks on best practices. Of course, everything can be customized to your own preference and the way you like it to work.

Personally, I tried to stay away from saving a file on the local machine, but I could not get anything working to pass the Server and DatabaseName directly into the Power BI template file. Therefore, temporary storage of this information was needed. Every time you click the Document Model button in Power BI Desktop, the file will be overwritten. With that, we prevent cluttering your computer. As far as I know, this is the only working approach at this moment. Maybe this changes in the future, maybe not? Who knows..?

Finally, I made everything available on my GitHub page. If you want to enhance it, change it or you have some awesome ideas, please contribute to it! I would love to improve this solution and let more people use it. In the end we all benefit from some proper documentation 😊.

Last but not least, I would like to thank Imke Feldmann and Erik Svensen for thinking along in this solution and sharing their knowledge!

Updates:
June 29th 2020: Added links to FAQ page on GitHub.

36 thoughts on “External Tools: Document your Power BI Model”

  1. Pingback: Documenting a Power BI Model – Curated SQL

  2. Pingback: Third-Party Tools to Ease Power BI Development and Increase Analyst Productivity – Olivier Travers

  3. Joep

    Hi Marc,

    I followed the steps and after opening a PBIX file and click on the Document Model button, I only see the PowerShell popup for a short time and after that, nothing happens. I used the same folder structure as you did so not sure why it’s not working. Any thoughts where I can find the missing link?


    1. Hi Joep,

      Sorry to hear that. Let’s see if we can debug and find what caused the issue.

      There are a few things you can check;
      – After the PowerShell popup disappears, can you please check C:\temp if there is a file dumped named: ModelDocumenterConnectionDetails.json ?
– If the above step works, please open the file and check if it includes both Server and DatabaseName
      – Double check if the pbit file is saved on the exact same location with the same name as is declared in the PowerShell script in the top two lines.
      – Finally, if that also succeeded, please open the pbit file separately. Does this work?

      Let me know the outcome!
      Marc


      1. Joep

        Hi Marc,

        The folder C:\temp does not contain a new file with the name ModelDocumenterConnectionDetails.json or any new file. only the 2 files as mentioned in your manual are in that folder.


      2. Hmm… okay, thanks for the prompt reply!
        Anything else in specific I should know? You didn’t make any changes to the pbitool.json file? Didn’t rename files? So I can try to reproduce on my side.


  4. Joep

    no I copied all files directly out the ZIP into the destination folders. It should work for any PBIX files and on every version? I have the most recent version of PB in place so think that could not be the issue.


    1. Hi Joep,

I quickly built an enhanced PowerShell script that has some error handling in it. Can you please drop me a message via the contact form, so I can mail it to you separately? If this all succeeds, I will of course upload it to GitHub as well.

      Thanks!


  5. anastasios akparidis

    Hi Marc , this is an amazing tool , and helps greatly in understanding the model and much needed documentation for them.

    I have run to a little problem though and i am not sure how to proceed. I have added the file to the required folders but when the pbit loads up i get the below :

    “Query ‘Tables’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.”

    I am not sure how to deal with this , any help will be greatly appreciated.

    I use a 64bit version of power bi desktop.

    Thank you
    AK


  6. Marcin

    Hi,
    I have an issue to add your tool. I download files from your GIT repository and paste to the valid location without modifying files.
    Error message:
    Unable to load the external tools. C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools\Data-Marc_ModelDocumentor.pbitool.json could not be loaded: Invalid json

    I use Version: 2.83.5894.961 64-bit (July 2020) of PowerBI
    and I have already ALM Toolkit and Tabular Editor.


    1. Hi Marcin,

      I’m sorry to hear that. I quickly tried to reproduce your case, but for me it works smoothly.

      – Reinstalled Power BI to the same version as you run.
      – I redownloaded the file from GitHub
      – Added the file,
      – Rebooted Power BI Desktop
      – Run the tool without any issues.

      What you possibly can do, is try to check the json file on errors with tools like VS Code. Maybe your system detects an error that is ignored on my side? Further, there is not much I can do, as I can’t reproduce the issue.


  7. Sebastiaan Vermeulen

    Hi Marc,
Interesting article! I missed your poll on twitter, but I would have responded that I do document my Power BI models. Especially the choices in the Power Query part, as all steps are consecutive and the reason why a particular step is performed at a specific moment in the query is not always clear from the outside. I have to say that 6 months later I am often not able to remember why I made a particular choice in my query.
    Nevertheless, Imke placed some strong warnings on activating the new meta data preview (https://www.thebiccountant.com/2020/04/13/power-bi-store-datasets-in-enhanced-metadata-format-warning/). Can you respond to the points she makes to be reluctant in activating this feature? Her blog post is from April. Is it safe to say that some of her comments are no longer valid? I am curious what your view is in this matter.


    1. Hi Sebastiaan,

      Thanks for your comment and happy to see that you do document your models!

Regarding the warnings around the new metadata format, it is in preview for a reason 🙂 You should always be careful with these things. Though, as you probably know, Power BI is evolving fast! As the blog post is from April, many things have changed already. Having said that, in the end we all need to move to the new metadata format, so my personal way of working is to experiment with it already.

If you doubt whether it is already stable, you can always make a copy of your pbix file first before you migrate it to the new metadata format. Though, I haven't experienced any issues with it lately.

      Cheers,
      Marc


  8. O_mii

    Hi Marc , Must say this is an amazing tool and can be start for lot of possibilities.

    I have run to a little problem though and i am not sure how to proceed. I have added the file to the required folders but when the pbit loads up i get the below :

    “Query ‘Tables’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.”

    I checked the FAQ page & did as instructed i.e. i set the Privacy level to “Always ignore Privacy level settings” But problem still exists.

    When i restart the PBI and enter the parameter value, It shows another error as below:

    “OLE DB: The following system error occurred: No connection could be made because the target machine actively refused it.”


    1. Hi,

Thanks for commenting and good to see that you were able to fix the first one. The second might be caused by a few things; can you kindly check the following:
      – did you trigger it from the PBI top ribbon?
      – if 1 applies, kindly check if there is a file saved in the c:\temp\ folder that includes the server and database. It should be a json file.
      – if this is also correct and the json includes both server and database name, did you enter the last \ in the path after the pbit opened? The full path should be c:\temp\
      – update the OLEDB drivers on your machine.

      Let me know which one you got stuck!
      – Marc


      1. O_mii

        Hi Marc,

        Thanks for quick reply. Appreciate it 🙂

        I’ve had changed the location of PBIT file and PowerShell Script but i missed “\” in file path while editing PS Script.

        Now it works perfect.

        Great work!


  9. TJ

    Hi Marc,

    This is amazing! I have been looking for a solution to better document my Power BI files.

    I am running into a problem. I’m unable to properly load the PBIT file. The queries in the PBIT all have the following message: “An error happened while reading data from the provider: ‘The ‘MSOLAP.8’ provider is not registered on the local machine.'”

    I have checked to make sure my Power BI version is the latest: Version: 2.83.5894.961 32-bit (July 2020)

    I also am running Excel 32-bit. I’m wondering if you have any thoughts on what might be the problem. Thanks!


    1. Hey TJ,

This is due to the MSOLAP connector that you're using. You have to download or upgrade this connector to match the bitness of your Power BI version.

      Cheers,
      Marc


  10. vosaxalo

    Hi Marc,
    I have noted following issues:

    1) I have an alias for my tabular server that has a backslash in it (e.g. ssas.server\tab). When your ps script prepares the json connection file, the Server is set to \”ssas.server\\tab\” , but in the PBI template I see “ssas.server ab”
    2) The name of my source data model contains some spaces (e.g. my data model). When your ps script prepares the json connection file, the Databasename is set to \”my \”

    As workaround I have modified manually the json file as follow:
    \”Server\”: \”ssas.server\\\\tab\”, \r\n \”DatabaseName\”: \”my data model\”\r\n

    Regards


  11. Nick

    Please can you confirm are the reports genersted locally on the machine. Is there any data that is exported or processed across the interent. I work In the Public Sector so conscious of Information Governance etc.


    1. Hi Nick,
No data is exported over the internet; everything is rendered locally and queried through the localhost on your computer.

      The only thing that might happen is an automatic download of the pbit template file, in case the template can't be found in the appropriate location. This has been implemented since version 1.1.0. Please know, this is a download, not an upload of your data.

      Hope this answers your question.
      Cheers,
      Marc


  12. Pingback: Herramientas externas en Power BI Desktop - dataXbi

  13. Brian Mather

    Hi Marc,

    I’m hitting a blocker (seemingly on all my models) which is that the DMV SELECT * FROM $SYSTEM.DISCOVER_CALC_DEPENDENCY is not working. It returns

    “Unsupported dependency node discovered”.

    I’m trying to figure out if there is some anti-pattern that I’m using regularly. Have you had any issues running that DMV?

    Here’s some others also experiencing issues with the DMV

    https://github.com/DaxStudio/DaxStudio/issues/236


    1. Hey Brian,

      I’m not familiar with this error and haven’t heard anyone else about it. I’m curious to hear your results! Let me know so I can add it to the FAQ as well.

      Cheers,
      Marc


      1. Brian Mather

        I’ve just done the homework I should have done first…. created a brand new pbix model. And it works… grrr.. Now to iterate through building back up and find when it breaks. Will let you know 🙂


  14. Brian Mather

    So I’ve tracked it down to M. It happens where I am calling a SQL source using the ServerName and DatabaseName as parameters

    let
        Source = Sql.Database(#"Azure SQL Server", #"Azure SQL Database"),
        dbo_vw_myview = Source{[Schema="dbo", Item="vw_myview"]}[Data]
    in
        dbo_vw_myview

    If I replace parameters with the actual strings, then the dependency DMV works. But surely parameterising the ServerName and DatabaseName is good practice in order to be able to switch between DEV and TEST?

    Interestingly, if I have two queries – one with the parameters and one where the parameters are replaced by the string for server and database, the DMV breaks – even if the query that uses the parameters is set to not load.

