External Tools: Document your Power BI Model

In the Power BI Desktop July release, the External Tools capability was released (in preview). With that, third-party tool integration is now possible. Tools such as ALM Toolkit, DAX Studio and Tabular Editor integrate nicely with Power BI Desktop and are capable of writing to your Power BI model (if the enhanced metadata preview is enabled).

External Tools capabilities are open source, so anyone can create their own tools to integrate with Power BI. Inspired by all the great content already created by David Eldersveld for Python integration and Erik Svensen for Excel integration, I decided to build my own external tool as well. In this blog I describe how I created my external tool to document your Power BI model and how you can start using it!

This version of the Power BI Model Documenter is outdated. Please check the latest information on the product page.

Documentation?

Yes, really! I built an External Tool to document your Power BI model. I know, documentation is not a very popular topic, but I believe it is a very important one! The lead time from data to insights is very short with Power BI, but we often forget to look back at what kind of monster we created. Especially if you want to share the dataset for reuse, I believe it is important to deliver some documentation as well. Maybe you will even promote or certify this dataset in the future, which implies that the model meets certain quality standards and best practices.

I also posted a poll on Twitter to measure the sentiment around documenting Power BI solutions. I was a bit shocked by the results, though not surprised at the same time.

As we live in a busy era, we do not have much time to document our solutions. Fellow MVP Ásgeir Gunnarsson posted a nice comment on my tweet, stating that documentation is often not prioritized by the customer.

Although the sentiment around documentation is not very positive, I believe that we as Data Analytics professionals should continue advising our customers to care about documentation. By providing an External Tool in Power BI Desktop, I hope that I can make it easier for a lot of people to deliver at least some basic documentation with their solutions.

External Tool Integration

As mentioned, a new External Tool can be integrated in Power BI in just a few steps. Below I describe how you can integrate the Model Documentation External Tool yourself. But first, let’s have a quick look at how it works and what the end result will be.

Please know that the description below might not match the current way of working 100%, as I’m continuously updating the tool based on your feedback! This applies to both the way of working and the installation of the tool. I advise checking the change log as well.

How it works

Let me briefly describe how the approach works and what you can expect. Below is a short line-up of the steps in order.

  1. Open the Power BI solution that you want to document.
  2. In the top ribbon under External Tools, click Document Model.
  3. After the click, you might see a PowerShell window appear and disappear quickly. This is the process that triggers all other actions.
  4. On execution of the script, a json file is dumped in the defined location. By default this is C:\temp, but you might have changed this to your own preference. See the next chapter as well.
  5. A new Power BI Desktop instance opens and the .pbit template file opens directly. Here again, the location of the dumped file needs to be entered. The default location is listed in the template.
Opening the template. Please enter the file path as parameter. Default value is C:\temp\
  6. Click Load to start loading the report. You will also be asked to authenticate to the source. Please choose Windows authentication here, as we query the local host on your computer.
Windows authentication while loading the data
  7. The provided Power BI template report will load. The next chapter tells more about the template and what it looks like.

What you will get

As described above, clicking the button in the External Tools ribbon executes a PowerShell script. This script dumps a file with the Server and DatabaseName in a local folder on your computer. As a second step, a Power BI template file is opened that reads the newly created file and provides you with a new Power BI report describing your Power BI model.
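To illustrate, the dumped connection file (ModelDocumenterConnectionDetails.json) looks roughly like the sketch below. The port in the Server value is assigned per Power BI Desktop session and the DatabaseName is a generated identifier, so both values shown here are only placeholders.

```json
{
    "Server": "localhost:57985",
    "DatabaseName": "<generated database identifier>"
}
```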

There is a lot of content available online that gives insight into what you can document about your Power BI model. Therefore I decided to focus on the External Tools integration rather than building a stunning report. Of course, you can customize the Power BI report or build your own model documentation report. Didier Terrien listed a few links on his blog that might serve as inspiration for what is possible.

The report that is included in the solution will tell you more about:

  • Tables and columns in your data model
  • Power Query and DAX table expressions
  • Measure insights: the expressions, location, descriptions and more
  • Relationship overview: which relationships exist in your model
Power BI Report: Table and Column overview

Next to these overviews, I also included some analytics on best practices, such as:

  • Flagging auto generated date tables
  • Suggesting improvements for DAX calculated columns
  • Errors in measures
  • Relationships with both-direction cross filtering
  • Many-to-many relationships
Power BI Report: Relationships overview

Integrate the External tool

Below I describe step by step what you need to do to integrate this External Tool into your own Power BI Desktop.

Step 1: Download necessary files
Download everything you need from my GitHub repository. This includes:

  • The Power BI Template file:
    ModelDocumentationTemplate.pbit
  • The PowerShell Script:
    Data-Marc_WriteConnectionDetailsToFile.ps1
  • External Tool integration file:
    Data-Marc_ModelDocumentor.pbitool.json

Step 2: Upload the integration file
The External Tool integration file is needed to get the button in the Power BI ribbon. To achieve this, you need to copy the Data-Marc_ModelDocumentor.pbitool.json file to the External Tools folder. For me this location was:
C:\Program Files (x86)\Common Files\microsoft shared\Power BI Desktop\External Tools
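If you prefer doing this from the command line, the copy could look like the sketch below; run it from an elevated PowerShell prompt and adjust the source path to wherever you downloaded the file (the destination assumes a default Power BI Desktop installation).

```powershell
# Copy the external tool definition to the folder that Power BI Desktop scans at startup.
Copy-Item -Path ".\Data-Marc_ModelDocumentor.pbitool.json" `
    -Destination "C:\Program Files (x86)\Common Files\microsoft shared\Power BI Desktop\External Tools\"
```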

While copying the files, Windows may ask you to log in with admin privileges before you can continue. This is required to save the files there. If you cannot do this yourself, please contact your administrator.

pbitool.json file uploaded to the folder.

Step 3: Upload other files to temp folder
The two other files are referenced by the PowerShell script and also need to be placed in a specific location. I put them in a temp folder, since they are only used during script execution.

Please upload the PowerShell script and the Power BI template file to C:\temp. If this folder does not exist or you cannot upload files to it, you can put them in another location of your preference. Please know that this requires some minor edits to the PowerShell script as well.
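For example, if you would place both files in a hypothetical folder D:\PowerBIDocs instead, the only edits needed are the two variables at the top of the PowerShell script:

```powershell
# Hypothetical custom location; point both variables at the folder you actually used.
$OutputFolder = 'D:\PowerBIDocs\'
$PBITLocation = 'D:\PowerBIDocs\ModelDocumentationTemplate.pbit'
```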

Identical to the previous step, Windows might ask you to authenticate with admin privileges before you can continue.

Files to upload in C:\temp

Step 4: Restart Power BI Desktop
You have applied all required steps by now. Assuming that you did not change any of the file locations, a new button will appear in the External Tools section of the Power BI Desktop top ribbon. In case you already had Power BI running, please restart Power BI Desktop first.

If you made changes to the file locations, please read the next chapter first.

Power BI Desktop top ribbon – the new button Document Model is added

Did something not work out as expected? Kindly check the FAQ that I added on GitHub to see if your question is already listed there. If not, please let me know and we will see if we can get it working for you!

Changes you can make

By following the four steps in the previous chapter, you can easily add the Document Model external tool to your Power BI Desktop. Still, you might want to customize some parts. Below I highlight some of the changes you can make.

Change of file location

In case you want to save the files in a different location than C:\temp, you must apply some changes to the PowerShell script. Looking at the script, there are two items that refer to this location. Both are declared as variables at the top of the script for easy adjustment.

  • $OutputFolder, the location where a temporary file with the server and database name will be saved.
  • $PBITLocation, the location where the Power BI template file is saved.

I added comments throughout the code, so you know what each step does and what it refers to.

# Below you can define your personal preference for file saving and reading. 
# The OutputFolder will be used to dump a temporary file with the connection to the model. 
# The PBITLocation defines where the template PBIT file is saved. 
$OutputFolder = 'c:\temp\'
$PBITLocation = 'C:\temp\ModelDocumentationTemplate.pbit'

# The section below defines the server and database name based on the input captured from the External Tools integration. 
# These are passed as arguments \"%server%\" and \"%database%\" in the external tools json. 
$Server = $args[0]
$DatabaseName = $args[1]

# Write Server and Database information to screen. 
Write-Host $Server 
Write-Host $DatabaseName 

# Generate json array based on the received server and database information.
$json = @"
    {
    "Server": "$Server", 
    "DatabaseName": "$DatabaseName"
    }
"@

# Writes the output in json format to the defined file location. This is a temp location and will be overwritten next time. 
$OutputLocation = $OutputFolder + 'ModelDocumenterConnectionDetails.json'
$json  | ConvertTo-Json  | Out-File $OutputLocation

# Open PBIT template file from PBITLocation as defined in the variable. 
Invoke-Item $PBITLocation

If you decide to change the location of the PowerShell script, do not forget to also change it in the External Tool integration file. This json file refers to the location where the script is saved; when you click the button, this script is executed.

In the arguments section, you see the file path that refers to the script. Apart from that, your json will be a bit longer: for readability on my blog, I removed the base64-encoded icon in the example below.

{
	"version": "1.0",
	"name": "Document Model",
	"description": "This tool documents your Power BI data model in a separate Power BI report",
	"path": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
	"arguments": "C:\\temp\\Data-Marc_WriteConnectionDetailsToFile.ps1 \"%server%\" \"%database%\"",
	"iconData": "image/png;base64,<encoded png icon data>"
} 

Changes of the Power BI Template file

As said before, the provided Power BI Model Documentation template file is just a start. There are many more things you can do. It is completely up to you to use a different .pbit file or stick with this one.

For inspiration, have a look at what Dynamic Management Views (DMVs) can provide for Tabular models. Also check out the earlier shared link to Didier Terrien’s blog with a bunch of other examples.
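As a small example of what a DMV query looks like, the sketch below lists all measures in the connected model; you can run it from DAX Studio against the local instance. TMSCHEMA_MEASURES is one of several schema DMVs exposed by Tabular models.

```sql
-- Lists every measure in the connected model, including its DAX expression.
SELECT [Name], [Expression], [Description]
FROM $SYSTEM.TMSCHEMA_MEASURES
```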

Wrap up and remarks

I believe this new external tool will help you deliver at least some basic documentation with every Power BI solution you build from now on. This basic template provides you with some insights into what has been built and some checks on best practices. Of course, everything can be customized to your own preference and to work the way you like.

Personally, I tried to stay away from saving a file on the local machine, but I could not get anything working to flow the Server and DatabaseName directly into the Power BI template file. Therefore, temporary storage of this information was needed. Every time you click the Document Model button in Power BI Desktop, the file is overwritten. With that, we prevent cluttering up your computer. As far as I know, this is the only working approach at this moment. Maybe this changes in the future, maybe not? Who knows..?

Finally, I made everything available on my GitHub page. If you want to enhance it, change it or you have some awesome ideas, please contribute to it! I would love to improve this solution and let more people use it. In the end we all benefit from some proper documentation 😊.

Last but not least, I would like to thank Imke Feldmann and Erik Svensen for thinking along in this solution and sharing their knowledge!


Updates:
June 29th 2020: Added links to FAQ page on GitHub.

80 thoughts on “External Tools: Document your Power BI Model”

  1. Pingback: Documenting a Power BI Model – Curated SQL

  2. Pingback: Third-Party Tools to Ease Power BI Development and Increase Analyst Productivity – Olivier Travers

  3. Joep

    Hi Marc,

    I followed the steps and after opening a PBIX file and click on the Document Model button, I only see the PowerShell popup for a short time and after that, nothing happens. I used the same folder structure as you did so not sure why it’s not working. Any thoughts where I can find the missing link?

    Like

    1. Hi Joep,

      Sorry to hear that. Let’s see if we can debug and find what caused the issue.

      There are a few things you can check;
      – After the PowerShell popup disappears, can you please check C:\temp if there is a file dumped named: ModelDocumenterConnectionDetails.json ?
      – If the above steps works, please open the file and check if it includes both Server and DatabaseName
      – Double check if the pbit file is saved on the exact same location with the same name as is declared in the PowerShell script in the top two lines.
      – Finally, if that also succeeded, please open the pbit file separately. Does this work?

      Let me know the outcome!
      Marc

      Like

      1. Joep

        Hi Marc,

        The folder C:\temp does not contain a new file with the name ModelDocumenterConnectionDetails.json or any new file. only the 2 files as mentioned in your manual are in that folder.

        Like

      2. Hmm… okay, thanks for the prompt reply!
        Anything else in specific I should know? You didn’t make any changes to the pbitool.json file? Didn’t rename files? So I can try to reproduce on my side.

        Like

      3. Gintare

        Hi Marc, I also get a flickering powershell and then nothing happens. When I checked C:\temp if there is a file dumped named: ModelDocumenterConnectionDetails.json – it was not there but it was in C:\BusinessOpsTemp and yeah it does not have neither the server nor DatabaseName.
        It just shows:
        ” {\r\n \”Server\”: \”\”, \r\n \”DatabaseName\”: \”\”\r\n }”

        I haven’t moved or did anything with the original docs that I downloaded, so not sure where to begin fixing

        Like

    2. Vee

      Hi.
      I can’t seem to get this to work on a new machine.
      I have followed all your instructions. When I click on Document Model, a PowerShell screen pops up (with some red text, which I can’t read as it closes after about 2 secs).
      I got this to work on one user’s machine, but on another machine I followed the exact same steps and it doesn’t work?

      Like

      1. Vee

        Hi. This has been resolved. Ran the following command : Set-ExecutionPolicy Unrestricted -Scope CurrentUser (all on your own responsibility and risk!) and it now runs.

        The powershell runs and it tries to open the pbit, but I keep receiving this error :

        Feedback Type:
        Frown (Error)

        Error Message:
        A Task’s exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread.

        When i press cancel, i get the following error

        “We’re sorry. This report ran out of memory. Power BI Desktop will close now. Please restart.

        Like

      2. Vee

        Hi. Ok, so its getting a bit further now – but still not working.

        When i load my report , i click on Document Model, the template file loads and it asks me to enter my default location (i enter C:\BusinessOpsTemp)…. I then get the following error :

        OLE DB: The following system error occurred. No connection could be made because the target machine actively refused it.

        Like

      3. Please check the FAQ, this will answer your question.
        Can I kindly ask you to only post your comments once, instead of reposting the same content multiple times.

        Thanks,
        Marc

        Like

      4. Vee

        Not sure if my last post posted successfully,

        In regards to the “OLE DB: The following system error occurred. No connection could be made because the target machine actively refused it” error.

        I am entering C:\BusinessOpsTemp\ (with the backslash at the end).
        The C:\BusinessOpsTemp\ folder has a json file called “ModelDocumenterConnectionDetails and this file contains a server and database name.

        Like

  4. Joep

    no, I copied all files directly out of the ZIP into the destination folders. It should work for any PBIX file and on every version? I have the most recent version of Power BI in place, so I think that could not be the issue.

    Like

    1. Hi Joep,

      I quickly built an enhanced PowerShell script that has some error handling in it. Can you please drop me a message via the contact form, so I can mail it to you separately? If this all succeeds, I will of course upload it to GitHub as well.

      Thanks!

      Like

  5. anastasios akparidis

    Hi Marc , this is an amazing tool , and helps greatly in understanding the model and much needed documentation for them.

    I have run to a little problem though and i am not sure how to proceed. I have added the file to the required folders but when the pbit loads up i get the below :

    “Query ‘Tables’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.”

    I am not sure how to deal with this , any help will be greatly appreciated.

    I use a 64bit version of power bi desktop.

    Thank you
    AK

    Like

  6. Marcin

    Hi,
    I have an issue adding your tool. I downloaded the files from your GIT repository and pasted them to the valid location without modifying the files.
    Error message:
    Unable to load the external tools. C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools\Data-Marc_ModelDocumentor.pbitool.json could not be loaded: Invalid json

    I use Version: 2.83.5894.961 64-bit (July 2020) of PowerBI
    and I have already ALM Toolkit and Tabular Editor.

    Like

    1. Hi Marcin,

      I’m sorry to hear that. I quickly tried to reproduce your case, but for me it works smoothly.

      – Reinstalled Power BI to the same version as you run.
      – I redownloaded the file from GitHub
      – Added the file,
      – Rebooted Power BI Desktop
      – Ran the tool without any issues.

      What you can possibly do is check the json file for errors with tools like VS Code. Maybe your system detects an error that is ignored on my side? Beyond that, there is not much I can do, as I can’t reproduce the issue.

      Like

  7. Sebastiaan Vermeulen

    Hi Marc,
    Interesting article! I missed your poll on Twitter, but I would have responded that I do document my Power BI models. Especially the choices in the Power Query part, as all steps are consecutive and the reason why a particular step is performed at a specific moment in the query is not always clear from the outside. I have to say that 6 months later I am often not able to remember why I made a particular choice in my query.
    Nevertheless, Imke placed some strong warnings on activating the new meta data preview (https://www.thebiccountant.com/2020/04/13/power-bi-store-datasets-in-enhanced-metadata-format-warning/). Can you respond to the points she makes to be reluctant in activating this feature? Her blog post is from April. Is it safe to say that some of her comments are no longer valid? I am curious what your view is in this matter.

    Like

    1. Hi Sebastiaan,

      Thanks for your comment and happy to see that you do document your models!

      Regarding the warnings around the new metadata format, it is in preview for a reason 🙂 You should always be careful with these things. Though, as you probably know, Power BI is evolving fast! As the blog post is from April, many things have changed already. Having said that, in the end we all need to move to the new metadata format, so my personal way of working is to experiment with it already.

      If you doubt if it is already stable, you can always first make a copy of your pbix file before you migrate it to the new meta data. Though, I haven’t experienced any issues with it lately.

      Cheers,
      Marc

      Like

  8. O_mii

    Hi Marc , Must say this is an amazing tool and can be start for lot of possibilities.

    I have run to a little problem though and i am not sure how to proceed. I have added the file to the required folders but when the pbit loads up i get the below :

    “Query ‘Tables’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.”

    I checked the FAQ page & did as instructed i.e. i set the Privacy level to “Always ignore Privacy level settings” But problem still exists.

    When i restart the PBI and enter the parameter value, It shows another error as below:

    “OLE DB: The following system error occurred: No connection could be made because the target machine actively refused it.”

    Like

    1. Hi,

      Thanks for commenting and good to see that you were able to fix the first one. The second might be caused by a few things; can you kindly check the following:
      – did you trigger it from the PBI top ribbon?
      – if 1 applies, kindly check if there is a file saved in the c:\temp\ folder that includes the server and database. It should be a json file
      – if this is also correct and the json includes both server and database name, did you enter the trailing \ in the path after the pbit opened? The full path should be c:\temp\
      – update the OLE DB drivers on your machine.

      Let me know at which step you got stuck!
      – Marc

      Like

      1. O_mii

        Hi Marc,

        Thanks for quick reply. Appreciate it 🙂

        I’ve had changed the location of PBIT file and PowerShell Script but i missed “\” in file path while editing PS Script.

        Now it works perfect.

        Great work!

        Like

  9. TJ

    Hi Marc,

    This is amazing! I have been looking for a solution to better document my Power BI files.

    I am running into a problem. I’m unable to properly load the PBIT file. The queries in the PBIT all have the following message: “An error happened while reading data from the provider: ‘The ‘MSOLAP.8’ provider is not registered on the local machine.'”

    I have checked to make sure my Power BI version is the latest: Version: 2.83.5894.961 32-bit (July 2020)

    I also am running Excel 32-bit. I’m wondering if you have any thoughts on what might be the problem. Thanks!

    Like

    1. Hey TJ,

      This is due to the MSOLAP connector that you’re using. You have to download / upgrade this connector to match your Power BI bits version.

      Cheers,
      Marc

      Like

  10. vosaxalo

    Hi Marc,
    I have noted following issues:

    1) I have an alias for my tabular server that has a backslash in it (e.g. ssas.server\tab). When your ps script prepares the json connection file, the Server is set to \”ssas.server\\tab\” , but in the PBI template I see “ssas.server ab”
    2) The name of my source data model contains some spaces (e.g. my data model). When your ps script prepares the json connection file, the Databasename is set to \”my \”

    As workaround I have modified manually the json file as follow:
    \”Server\”: \”ssas.server\\\\tab\”, \r\n \”DatabaseName\”: \”my data model\”\r\n

    Regards

    Like

  11. Nick

    Please can you confirm whether the reports are generated locally on the machine? Is there any data that is exported or processed across the internet? I work in the public sector, so I am conscious of information governance etc.

    Like

    1. Hi Nick,
      No data is exported over the internet; everything is rendered locally and queried through the local host on your computer.

      The only thing that might happen is an automatic download of the pbit template file, in case the template can’t be found in the appropriate location. This was implemented in version 1.1.0. Please note, this is a download, not an upload of your data.

      Hope this answers your question.
      Cheers,
      Marc

      Like

  12. Pingback: Herramientas externas en Power BI Desktop - dataXbi

  13. Brian Mather

    Hi Marc,

    I’m hitting a blocker (seemingly on all my models) which is that the DMV SELECT * FROM $SYSTEM.DISCOVER_CALC_DEPENDENCY is not working. It returns

    “Unsupported dependency node discovered”.

    I’m trying to figure out if there is some anti-pattern that I’m using regularly. Have you had any issues running that DMV?

    Here’s some others also experiencing issues with the DMV

    https://github.com/DaxStudio/DaxStudio/issues/236

    Like

    1. Hey Brian,

      I’m not familiar with this error and haven’t heard anyone else about it. I’m curious to hear your results! Let me know so I can add it to the FAQ as well.

      Cheers,
      Marc

      Like

      1. Brian Mather

        I’ve just done the homework I should have done first…. created a brand new pbix model. And it works… grrr.. Now to iterate through building back up and find when it breaks. Will let you know 🙂

        Like

  14. Brian Mather

    So I’ve tracked it down to M. It happens where I am calling a SQL source using the ServerName and DatabaseName as parameters

    let
    Source = Sql.Database(#”Azure SQL Server”,#”Azure SQL Database”),
    dbo_vw_myview = Source{[Schema=”dbo”,Item=”vw_myview”]}[Data]
    in
    dbo_vw_myview

    If I replace parameters with the actual strings, then the dependency DMV works. But surely parameterising the ServerName and DatabaseName is good practice in order to be able to switch between DEV and TEST?

    Interestingly, if I have two queries – one with the parameters and one where the parameters are replaced by the string for server and database, the DMV breaks – even if the query that uses the parameters is set to not load.

    Like

  15. Henry

    Hello. I keep getting the following error
    “OLE DB: The following system error occurred. No connection could be made because the target machine actively refused it”

    I have read the FAQs and the comments on this thread, but nothing has worked.

    I enter C:\BusinessOpsTemp\ when prompted to do so.
    You have mentioned updating the OLE DB driver… is there a link on where I can download it from? I saw this link: https://docs.microsoft.com/en-us/sql/connect/oledb/oledb-driver-for-sql-server?view=sql-server-ver15 – but not sure which one to download.

    Thanks.

    Like

  16. victorkuznetsovbl

    Hi Marc,

    Thank you so much for developing this amazing tool. It saves us a lot of time avoiding developing our own solution to document our reports.

    I was able to successfully run the documenter for local files. It works as expected and everyone is happy with the results. But as always, it couldn’t be that easy =)

    There are two things that I don’t know how to solve (sorry if this is something simple but I’m only at the beginning of my Power BI journey).

    First. In addition to the logic which exists in the reports our company is using data flows quite a lot and our auditors also would like to know what is happening there. The question is how is it possible to document the data flow model?

    Second. If the report was built on the dataset published online then in ModelDocumenterConnectionDetails.json file instead of \”Server\”: \”localhost:57985\” I get the link to Power BI API “Server\”: \”pbiazure://api.powerbi.com\”. When trying to refresh the report I’m getting the following error: “OLE DB: Connection failed: Required Identity Provider AAD parameters were not specified in the connection string.”. Would be great if you can point me the direction I need to dig into to solve this problem.

    Thank you and have a great day.

    Like

    1. Hi Victor,

      Glad to hear that it works for you 😉

      With regards to the dataflows, there is currently no option included in the tool to do this. What could be an option that you could investigate, is using the Power BI APIs to extract dataflow logic.

      About the live connection to datasets, that’s currently not supported either, as the exported connection string is not complete. This is a known limitation, but it’s on my list to investigate options to make this happen. Keep an eye out for upcoming updates, maybe… one day 🙂

      Cheers,
      Marc

      Like

  17. Grzegorz

    Hello Marc,
    This is really impressive work! Amazing!

    I have 2 follow up questions:
    1) I notice there is option to export to Excel you had in mind – how would that work? A power query model would be created in Excel?

    2) I am actually looking for another functionality – I would like to run your tool on 2 versions of my model, and compare what was changed. We are asked to document the changes incrementally, so somehow I need to extract only the parts that changed. Is there any plan to have such functionality in your tool?

    Like

    1. Hi,

      Thanks! To answer your questions:
      1) yes, that is exactly how it works. I’ve more or less copy-pasted the Power BI version into Excel. Personally I’m more a fan of the Power BI version, but people requested an Excel version, which was easier to enhance. Please also know that there is a current limitation on the Excel version. Please check GitHub to see the issue raised there.

      2) At this moment that’s not possible with my tool. Though, it’s something I would really love to integrate, but that takes time. If you want to extract something right now, I strongly suggest looking at ALM Toolkit. This tool allows you to compare two *.pbix files and then export the differences in an Excel format.

      Hope this helps!
      Cheers, Marc

      Like

      1. Grzegorz

        Great, thanks. I must have missed the Excel version, I will have a look at github. I thought it was not prepared yet.

        Regarding point 2 – I will look into ALM Toolkit. In the meantime I also found “Power BI Side tools” – this allows exporting all M code queries, DAX queries and the layout (json file) and editing these in Visual Studio Code. And from Visual Studio Code, I can use GitHub and run the version compare…

        A bit manual work, but will do the job I guess.

        Anyway, thanks a lot for your tool, it is really great.
        Cheers, Grzegorz

        Like

  18. Kris

    Hi I am having trouble downloading the files.
    I am able to download the ModelDocumentationTemplate.pbit but not the other two, they just open to scripts that are displayed. How do I download? Do I need a GitHub account to do so?
    The Power BI Template file:
    ModelDocumentationTemplate.pbit
    The PowerShell Script:
    Data-Marc_WriteConnectionDetailsToFile.ps1
    External Tool integration file:
    Data-Marc_ModelDocumentor.pbitool.json

    Like

  19. Pingback: Support of external tools in Power BI Desktop is GA - Power Community

  20. jrmoore05

    Getting the following error after downloading the OLE DB. Got the link from your github FAQ

    OLE DB: The following system error occurred: No connection could be made because the target machine actively refused it.

    Thanks
    Jarrett


    1. Hi Jarrett,

      Could it be that the dataset is an Analysis Services dataset or a live-connected dataset in the Power BI service?
      If so, that is unfortunately not supported yet at this moment. I’m working on it.

      Thanks,
      Marc


      1. jrmoore05

        Marc,

        Thanks for the reply. I finally got it to work. The dataset wasn’t an Analysis Services or live-connected dataset in the Power BI service, but good to know that’s not currently supported.

        Thanks
        Jarrett


  21. Pingback: Power BI Challenge 13 – Acing Your Documentation | Enterprise DNA

  22. Pingback: How to use Power BI External Tools to Document Models – TDG

  23. Pingback: Enterprise DNA May Updates: New Masterclasses, Showcases, and More

  24. Rishabh Kumar Jain

    I did try to run the Model Documenter tool several times, but it just keeps showing me this error. Can you please help me out with this?

    Tables
    Query ‘Tables’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    TablesMetadata
    Query ‘TablesMetadata’ (step ‘Changed Type’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    Table Expressions
    Query ‘Table Expressions’ (step ‘Changed Type’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    Measure Expressions
    Query ‘Measure Expressions’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    Relationships
    Query ‘Relationships’ (step ‘Changed Type’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    Columns
    Query ‘Columns’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    Columns Metadata
    Query ‘Columns Metadata’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.
    DataTypes
    Loading blocked by failures with other queries.
    Crossfilter
    Loading blocked by failures with other queries.
    Cardinality
    Loading blocked by failures with other queries.
    SummarizeBy
    Loading blocked by failures with other queries.
    Row Level Security
    Query ‘Row Level Security’ (step ‘Expanded Role Expression Details’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.


  25. Rishabh Kumar Jain

    Also, Marc, I would like to mention that this error comes up every time. I am working with import data, and sometimes with SQL import data. Can you please help me resolve this issue? Also, on the C drive it creates the folder C:\BusinessOpsTemp, not the temp one.

    So what PowerShell scripts and other changes do I need to make to get going with the work?


    1. The BusinessOpsTemp folder should be okay; this was changed after the initial release. You can find the latest version of the script on GitHub (link below), or go for the BusinessOps Installer, which configures everything for you.

      –Marc


  26. Pingback: Enterprise DNA May Updates: New Masterclasses, Showcases, and More – CoE Portal

  27. Desiree Ann Cayat

    Hey Marc, the Model Documenter tool’s export to a Power BI document is working, but the Excel one is not. The complete error looks something like this: Tables – Query ‘Tables’ (step ‘Renamed Columns1’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination. And: Loading blocked by failures with other queries.
    I already set the privacy level to ‘Always ignore Privacy Level settings’, but I still get the same error when choosing option 2 in the Model Documenter tool for Excel output.


    1. Hi Desiree,
      Just to be sure: did you save the Excel template file after you changed the privacy levels? This error is solely caused by privacy levels, so it might be that you only changed the setting for the current file and not in the template file itself.

      –Marc


  28. Benjamin

    Hello Marc!

    Awesome tool you have built, very useful.

    Currently we are separating our datasets from our reports, and I was wondering how hard, or easy depending on the perspective, it would be to have the documentation in the dataset itself, instead of creating a new report that only contains the documentation. So when you press the “Model Documenter” button, you would get the option to use the current file or create a new one.

    Especially in the case where you separate the dataset from the report, this would be a win-win situation.

    Please share your thoughts on this!


    1. Hi Benjamin,

      Thanks for your comment and great feedback! If I understand correctly, you want to put the documentation in the pbix of the data model itself. Sounds reasonable to me, though this is potentially a bit harder than we might think, simply because there is no way to script visuals and the like, as far as I know. Also, the backend of the Model Documenter relies on a local export of the vpax file, which is extracted at tool execution. Because of this, it cannot be refreshed as part of your data model refresh in the Power BI Service, which brings another challenge.

      I’ll keep this in mind; maybe things change in the future and it might be possible at some point. Who knows? Would you mind adding this suggestion as an “issue” on GitHub, so we can track new ideas there? 🙃

      Thanks!
      –Marc


      1. Benjamin

        Hello Marc,

        This is exactly what I mean!

        I think if the visuals are put in the “dataset template”, you can then start building, and once you are done you select “Document dataset” and it creates the necessary files.

        Regarding the data model refresh challenge, I do not think this will be an actual issue, because once you upload the dataset to the service, you are not really changing it. Only when you open it again and modify it, and then you can just re-document it.

        Thanks for your support, I will create “an issue”.


  29. Question: When I run the latest version of the Model Documenter I get an error:

    Column [Parameter] is part of a composite key but not all columns of the composite key are included in the expression.

    Error Exporting VPAX file.

    I think this is related to a field parameter table. Any ideas how to work around this?


  30. Alperen Cicek

    Hey Marc,

    In my company we use a lot of sensitive data. The data is not permitted to be seen by, or leaked to, people outside the company. Is the Model Documenter safe with regard to data privacy and security?

    Best regards Alperen


    1. Hi Alperen,

      Great question. You don’t have to worry about security: the Model Documenter only reads the metadata of the model and never touches the actual data. If you’re using DAX Studio today, the Model Documenter uses the same backend. It generates a vpax file and takes that as input for the Model Documenter report template.
      In case you need more details, please send me a message via the contact form. Happy to elaborate further by mail.

      –Marc


  31. Pingback: Power BI Challenge 13 – Acing Your Documentation – Quant Insights Network
