Get the display value of a Dataverse choice field in Power BI

I’m working on a simple report to pull some data from Dataverse into a Power BI report. The data includes some choice fields, and when I first generated the report, I only retrieved the internal value of the fields. So my Status field values looked something like this: 10000001. Clearly, no one wants to see that, and I needed to grab the display values instead: Ordered, Processing, Shipped

Part of the problem was related to using a Native Query to pull in the data and not selecting the correct field; Dataverse surfaces two or more columns for each choice field.
My query looked something like this:
Select title, status, createdon from orders where customerid = '875-6309'

The query should have been:
Select title, statusname, createdon from orders where customerid = '875-6309'

tl;dr
Try placing the word ‘name’ directly after your choice field name:
status would be statusname
state would be statename
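
To sanity-check the mapping, you can pull both columns side by side and compare the raw values to their labels (a quick sketch against the same hypothetical orders table):

Select status, statusname from orders where customerid = '875-6309'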

Dataverse Resource not found for the segment table_name

I’m working with the Dataverse Web API and ran into this error while trying to write to a table.
Invoke-RestMethod : {"error":{"code":"0x8006088a","message":"Resource not found for the segment 'table name'."}}

The fix is to use the plural name of the table, but sometimes my English isn’t the best, and I was struggling to figure out what the plural of Taco Order was (joking). If you want to quickly find all the tables in your environment, you can toss the API URL into a browser, which will list all the plural table names.

Example: https://taco.crm.dynamics.com/api/data/v9.1/
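
If you’d rather script it, here’s a minimal PowerShell sketch that reads the same service document. It assumes you’ve already acquired an OAuth bearer token into $token; how you get that token is out of scope here.

# Assumes $token already holds a valid bearer token for the environment
$headers = @{ Authorization = "Bearer $token" }

# The service document lists every entity set (the plural table names)
$response = Invoke-RestMethod -Uri "https://taco.crm.dynamics.com/api/data/v9.1/" -Headers $headers

# Each entry's url property is the segment to use in Web API calls
$response.value | Select-Object name, url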

How Do You Get a Power Pages Attachment That’s Stored in Blob Storage?

My Power Apps Portal (Power Pages) environments are configured to use Azure Blob Storage for form attachments. One of the primary reasons for doing this is to avoid filling up expensive Dataverse storage with endless attachments submitted by end users.

This article outlines how to set up Azure storage: link

What I’m going to demo is how to get ONE attachment that’s uploaded to a form. If your form allows multiple attachments, you’d simply loop through them.

In the example, I’m using the soon-to-be-obsolete Dataverse (legacy) connector, but the same basic flow design applies to the current connector.

When a row is added to my table, the flow is triggered.
The flow then queries the Note (annotation) table using the ID from the source table.
Filter query: (_objectid_value eq source_table_id)

The List rows Notes query will result in an array being returned, but I’m only dealing with one attachment, so there’s no need to loop through it. To avoid an unnecessary loop, a function can be used to target a single object from the array: first(body('List_rows_Notes')?['value'])?['annotationid']
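
If you do need to handle multiple attachments instead, a minimal sketch (the action names are assumptions) is to feed an Apply to each with the array from the list action and grab each note’s ID inside the loop:

Apply to each input: body('List_rows_Notes')?['value']
Inside the loop: items('Apply_to_each')?['annotationid']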

From the Get row Note action, annotationid and filename will be needed to form the path to the blob. Using the concat function, I’m combining the container name, annotationid, and filename. Also note the transformations on annotationid: the hyphens need to be removed, and the string needs to be lowercase. The last part of the transformation is to remove .azure.txt from the filename.

concat('/blobcontainer/',toLower(replace(outputs('Get_row_Note')?['body/annotationid'], '-', '')),'/', split(outputs('Get_row_Note')?['body/filename'], '.azure.txt')[0])

The end result of the transformation will be:
/blobcontainer/annotationid/filename
/blobcontainer/cf03e4cf7f72ad118561002248881923/example.pdf

With the path to the blob formed, the Get blob content action can retrieve the file.

It’s that simple.

A couple of notes:
It would be wise to leverage a virus-scanning tool like Cloudmersive.
If you haven’t already noticed, when a user uploads a file that contains special characters in the name, it’s saved to the Note table without the special characters, but when it’s moved to blob storage, the characters remain in the name. Yes, that’s a bug Microsoft has yet to fix. You can avoid it by adding JavaScript to the upload page to block files that fall into this category, or by writing another flow to clean file names before the form is processed (a sketch follows the example below).
Example:
Uploaded filename: my report 1:2:3.pdf
Note table: my report 123.pdf
Blob: my report 1:2:3.pdf
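
For the cleanup flow, the renaming logic can be as simple as nested replace calls, since Power Automate expressions don’t support regex. A minimal sketch, assuming the original name sits in a FileName variable (a hypothetical name) and only a few known offenders need stripping:

replace(replace(replace(variables('FileName'), ':', ''), '#', ''), '%', '')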

Filter a Power BI Source Before Importing

This has bugged me for a long time, and I wasn’t sure how to get around it. I was tasked with creating a Power BI report using data from a reasonably big data set, but the report only displayed a small part of it. The source is in the neighborhood of 10,000,000 rows, and my report displayed roughly 100,000. Why direct query, import, or incrementally refresh all of the data when such a small part of it is needed? I was looking for a way to filter the data set before adding the applied steps in the query editor. Something like: Select * from Orders where Status = Open

The answer I was looking for is Native Query, and I’m not sure how I completely failed to learn about this until now.

For this example, I’m using the Northwind Traders orders Dataverse table. Connect to the Dataverse environment, select the orders table, then open the Advanced Editor window. Change the query to this:

let
    Source = CommonDataService.Database("taco.crm.dynamics.com"),
    superQuery = Value.NativeQuery(Source, "Select * from nwind_orders")
in
    superQuery

All I’m doing here is checking that the newly formed query works. Next, right-click on the step where you created the Native Query in the Applied Steps window and select View Native Query. Update the query to show only New orders. This can be done by adding a Where clause:

Select * From nwind_orders Where nwind_orderstatusidname = 'New'

Click OK, and the query should refresh, showing only new orders.

let
    Source = CommonDataService.Database("taco.crm.dynamics.com"),
    superQuery = Value.NativeQuery(Source, "Select * From nwind_orders Where nwind_orderstatusidname = 'New'")
in
    superQuery

That’s all it takes to filter the data set before transforming it! Be sure to check the linked documentation and take a look at query folding.
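
Speaking of query folding: Value.NativeQuery accepts an optional options record, and setting EnableFolding to true asks Power Query to fold later transformation steps back into the native query where the connector supports it. A minimal sketch against the same hypothetical org:

let
    Source = CommonDataService.Database("taco.crm.dynamics.com"),
    superQuery = Value.NativeQuery(Source, "Select * From nwind_orders Where nwind_orderstatusidname = 'New'", null, [EnableFolding = true])
in
    superQuery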

Create Approvals That NEVER Expire

If you are reading this, you likely ran into an issue where you created an approval flow, but it expired before the recipient had time to approve or reject it. The timeout for an approval, or any flow, is thirty days; after that, it stops running. Yes, there are some clever workarounds to alert you when a flow times out, but who wants to mess with that?

The approach I took to solve this was to leverage some of the existing tooling, then add to it. When you create an approval, a row is created in the Dataverse Approval table. As we all know, a flow is trigger-based, so why not create one that simply monitors the Approval table, then handles things from there?

At a high level, here is the basic approach.

Start by creating a simple flow that initiates an approval, then run it. In my example, note the value in the Item Link field; this will come into play later.

Next, navigate to make.powerapps.com, expand the Dataverse section, and click on Tables. After the page loads, click the All link under Tables, then search for approval. If you search for approval and do not get a result, make sure you click the All link.



Open the Approval table; in it, you will see your approval, possibly more depending on how old your environment is or how many people in your company are using approvals. When looking at the data, the takeaway is what is stored in the table and what can be used in the flow that handles the outcome of the approval. In my case, the Item Link field is key to handling the approval response. With it, I can filter on the value and know whether I need to take action on the item.

When creating the flow that responds to the approval, you can filter it at the design level or in the trigger settings. I went with the trigger settings due to the number of approvals that could be firing across my organization in our default tenant. Why do you need to filter it? Just assume other approvals might be writing to the same Dataverse table.

Trigger Conditions

@contains(triggerBody()?['msdyn_flow_approval_itemlink'],'https://www.sharepointed.com/stuff/')

@not(equals(triggerBody()?['msdyn_flow_approval_result'], null))

The above conditions filter on the value I passed in the create-approval flow (Item Link) and on whether the item has been approved or rejected yet.

Here is an overview of the flow that handles the outcome of the approval. I mixed Dataverse connector types due to an issue with the trigger condition not working with the green Dataverse connector. In the Expand Query field, I used the FetchXML builder to query over to the Approval Response table to get the comment field; it’s not used in the example, but nonetheless, it’s there. From the Get a row by ID action, the response of the approval is available to handle the outcome (Result) of the approval.

To my knowledge, there is no reason why you can’t create an approval that is active for months, if not years.

Notes:
1) You can access and review the approval records using Power BI, Flow, Access, ___
2) You can bulk update the records using PowerShell, Flow, Access (be really careful), __
3) You can pass items in the Details field, then parse them out when handling the approval. Here is one simple example where I’m passing a SharePoint item ID from the approval and parsing it in the response flow:



Response flow compose statement that parses the Details field.

Expression: last(split(triggerBody()?['msdyn_flow_approval_details'],'**SPItemID:** '))
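
For completeness, the create-approval side needs to end the Details field with that marker followed by the ID. A sketch of what the Details value might look like (the surrounding text and the SharePoint trigger property are assumptions):

concat('Please review the attached request.', ' **SPItemID:** ', triggerBody()?['ID'])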




YES, this is a lot, but the general idea is simple: create an approval and handle the response.