Brunner_BI

Benefits of using DAX INFO() functions instead of the DMVs via the XMLA endpoint in Power BI


Updated on Dec 12th 2024


The XMLA endpoint is the standard way to query a model in Power BI or Fabric for data and metadata. One major limitation, though, is that it is only available for workspaces on Premium or Fabric capacities. Every query is charged against your capacity, resulting in a CU cost (it shows up as interactive consumption in the Capacity Metrics app).
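For context, a metadata query over the XMLA endpoint is typically a DMV query; TMSCHEMA_MEASURES below is just one example of such a DMV, sent for instance from SSMS or DAX Studio:

-- Example DMV query sent through the XMLA endpoint
SELECT * FROM $SYSTEM.TMSCHEMA_MEASURES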


An interesting alternative to the XMLA endpoint is the Execute Queries API call. This method also allows querying models that reside in shared/Pro workspaces.

Additionally, XMLA requires Contributor permissions in the workspace, whereas this API call only requires Build permission on the model and the Viewer role in the workspace (I am still checking whether there is a way around the Viewer requirement as well).


At this point (Dec 2024) I am still exploring the cost differences between two identical queries: one sent via the XMLA endpoint and one using the DAX INFO() functions.


Since the DAX INFO() functions are becoming more and more complete (INFO.CALCDEPENDENCY was just added in the June update of Power BI), they have become a real alternative for querying semantic models.

However, there are also some constraints: each query is limited to 100,000 rows or 1,000,000 values, and there is a limit of 120 queries per minute per user. In most cases, the limitations will not be an issue.
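If you ever do need to send a longer batch of queries, a simple client-side pacing loop keeps a single user below the 120-queries-per-minute limit. This is only a sketch: run_queries_throttled and execute_fn are hypothetical names, and execute_fn stands in for whatever function actually sends a single DAX query (for example the script further down in this post).

import time

def run_queries_throttled(queries, execute_fn, min_interval=0.5):
    # 120 queries per minute per user allows roughly one query every 0.5 seconds
    results = []
    for dax_query in queries:
        results.append(execute_fn(dax_query))  # execute_fn sends one DAX query and returns the result
        time.sleep(min_interval)               # simple pacing; a production client might also back off on HTTP 429
    return results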

Using DAX INFO() functions, we can retrieve much of the metadata that we are used to getting from DMVs via the XMLA endpoint.


For example, INFO.CALCDEPENDENCY is the equivalent of the DISCOVER_CALC_DEPENDENCY DMV.
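Side by side, the two forms look like this (the DMV syntax below is the standard $SYSTEM form used over XMLA):

-- DAX INFO() function (can be sent via the Execute Queries API or the DAX query view)
EVALUATE INFO.CALCDEPENDENCY()

-- Equivalent DMV query via the XMLA endpoint
SELECT * FROM $SYSTEM.DISCOVER_CALC_DEPENDENCY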


You can see a list of INFO functions and their respective DMV queries here.


This post will demonstrate an easy way to execute a DAX query against a model in any workspace (if the user has access) using a simple Python script, and to export the results to an Excel file.

You can basically execute any DAX query using this API call.


Requirements 
Three requirements must be met to execute this API call: 
  1. The tenant setting "Dataset Execute Queries REST API" must be enabled (Admin portal > Tenant Settings > Integration settings) see below. 

  2. The user must have access to the workspace where the semantic model resides. 

  3. You need the dataset ID and group ID of the semantic model and workspace (group).

 
How to (Python 3.12): 
The URI Parameters for the Execute Queries In Group call are the Group ID (Workspace ID) and the Dataset ID (Semantic model ID). You can find these by opening the model in the Power BI service and checking your URL. 
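The browser URL typically follows a pattern like this (the IDs are placeholders):

https://app.powerbi.com/groups/<workspace id>/datasets/<semantic model id>/details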

In this example, we use Python version 3.12.4 along with the following packages: pandas 2.2.2, requests 2.32.3, and azure-identity 1.17.1 for authentication. Note that pandas also needs an Excel writer such as openpyxl installed to save .xlsx files.

After installing the packages, run the Python script below, replacing the variables workspace_id, semantic_model_id, dax_query, and excel_path with your values: 

Please replace WORKSPACEID and MODELID accordingly.

from azure.identity import InteractiveBrowserCredential 
import requests 
import pandas as pd 

# Replace the following variables with your own values 
workspace_id = "WORKSPACEID" 
semantic_model_id = "MODELID"
dax_query = "EVALUATE INFO.CALCDEPENDENCY()" 

excel_path = "daxQuery.xlsx" 

# Do not change the following code: 
api = "https://analysis.windows.net/powerbi/api/.default" 
authority = "https://login.microsoftonline.com/" 
auth = InteractiveBrowserCredential(authority=authority) 
access_token = auth.get_token(api).token 
url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{semantic_model_id}/executeQueries"
payload = {"queries": [{"query": dax_query}]} 
header = {"Authorization": f"Bearer {access_token}"} 

response = requests.post(url, headers=header, json=payload) 

if response.status_code == 200:  # Success
    data = response.json()
    rows = data['results'][0]['tables'][0]['rows']  # Extract the rows of the first result table
    df = pd.DataFrame(rows)  # Create the DataFrame
    df.to_excel(excel_path, index=False)  # Writing .xlsx requires an Excel writer such as openpyxl
else:
    print(f"Request failed: {response.status_code} - {response.text}")
 
The example script will execute the EVALUATE INFO.CALCDEPENDENCY() query and save the results as an Excel file. Step by step, it first opens a browser for authentication using the InteractiveBrowserCredential class. After that, it acquires the token and passes it in the request header. It then executes the POST request, reads the JSON response, parses its rows field, and creates a pandas DataFrame.
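For reference, the response body has roughly the following shape, which is why the script drills into data['results'][0]['tables'][0]['rows'] (the column names shown here are illustrative placeholders, they depend on the query):

{
  "results": [
    {
      "tables": [
        {
          "rows": [
            { "[Column1]": "value", "[Column2]": "value" }
          ]
        }
      ]
    }
  ]
}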

Conclusion 

Using the Execute Queries In Group API allows users to overcome some XMLA limitations, such as being restricted to Premium/Fabric workspaces. You can also get around the need to be a Contributor in a workspace in order to query a model. In general, it is a great and easy way to query small to medium-sized semantic models within your Power BI tenant.
