Build Your Own ChatGPT-Style App Using Power Automate and Power Apps

Just for fun, I wanted to see how I could replicate a basic ChatGPT-like experience using the tools available to me in the Power Platform. I will cover setting up a Power Automate flow to interface with the OpenAI chat completions endpoint using GPT-4 Turbo, and a Power App for the UI. In a future post, I will create another flow (or flows) to work with the Assistants API and take this to the next level.

Here's a quick look at the app and how it works. The app keeps the chat context active using the completions endpoint until you want to clear it out. Like I said, it's basic!

Here is what I used to get this up and running:
Power App
Power Automate flow with the HTTP connector
OpenAI account
OpenAI API key (credit card required)

Here is the layout of the flow and Power App


The key to making this work and keeping the context of the conversation flowing is to pass all of the previous questions and responses back to the API each time a new question is asked. Yes, that can add up to a lot of tokens for a large chat.

The input is converted to the following format and passed to the API by the app.
role: user
content: the question asked
The response from the API uses the same shape.
role: assistant
content: the response from the API
Using the two, the collection is built and displayed in a gallery.
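For context, here's roughly what the messages array looks like after a couple of turns (a minimal sketch of the chat completions message format; the content values are made up):

[
  { "role": "user", "content": "What is a SharePoint list?" },
  { "role": "assistant", "content": "A SharePoint list is a tabular collection of data..." },
  { "role": "user", "content": "How do I create one?" }
]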

Power App setup
ButtonQuestion's OnSelect property does the following (a sketch follows this list):
Set the button text
Set the varQuestion variable to the value of TextInputQuestion
Add the question and user role to the colResponses collection
Convert the colResponses collection to JSON
Call the Get HTTP flow, passing in the collection
With the response from the flow, add the value(s) to the colResponses collection
Reset TextInputQuestion
Reset the button text
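Here's a rough Power Fx sketch of that OnSelect logic. The flow name (GetHTTP), its output property (response), and the button text variable are assumptions based on my setup; adjust them to match yours:

// show the user something is happening (assumes the button's Text property is bound to varButtonText)
Set(varButtonText, "Thinking...");
// capture the question and append it to the running conversation
Set(varQuestion, TextInputQuestion.Text);
Collect(colResponses, {role: "user", content: varQuestion});
// send the whole collection to the flow as JSON and append the reply
Set(varReply, GetHTTP.Run(JSON(colResponses, JSONFormat.Compact)));
Collect(colResponses, {role: "assistant", content: varReply.response});
// clean up for the next question
Reset(TextInputQuestion);
Set(varButtonText, "Ask")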

GalleryResponses properties:
Visible: If(IsBlank(ThisItem.content), false, true)
Text: ThisItem.content
X: If(ThisItem.role = "assistant", 40, 5)
AutoHeight: true

Now for the flow! The flow is triggered from the app and passes one value.
Messages: the new question from the app plus the previous questions and responses (the collection)

The Parse JSON action will take the Messages input and shape the JSON used in the Select Messages action.

The Select Messages action will use the output of the Parse JSON action to help form the correct input for the HTTP action.

Using a Compose action, the model and messages values are set. Note: in this example, I'm using the GPT-4 Turbo preview model.
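The Compose output ends up shaped something like this (a sketch; Select Messages is the action from the previous step, and the model string should match whichever model you have access to):

{
  "model": "gpt-4-turbo-preview",
  "messages": @{body('Select_Messages')}
}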

The HTTP Request action uses the following values:
Method: Post
URL: https://api.openai.com/v1/chat/completions
Headers:
{
  "Content-Type": "application/json",
  "Authorization": "Bearer @{outputs('Compose_API_Key')}"
}
Body: output of the Compose Message Body


Another Parse JSON action is used to handle the response from the HTTP Request.


In the Select Response action, the output of Parse JSON Response is used to populate the role and content values.
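If you're curious what that mapping looks like, here's a sketch based on the shape of the chat completions response (choices[].message); the action and key names are assumptions:

From: body('Parse_JSON_Response')?['choices']
Map:
role: item()?['message']?['role']
content: item()?['message']?['content']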

The final action is to respond to the Power App. The Respond to a PowerApp or flow step will not work if you use the dynamic content value to populate the response. You must use this expression: outputs('Select_Response')

Again, if data is not sent back to the Power App, chances are you missed the sentence before this one.

As I mentioned at the beginning of this post, this is a basic example. If you see anything wrong with the process or places where it could be streamlined, please let me know.

Effortlessly Trigger a Flow from a Power App: A Simple Step-by-Step Example

In this post, I want to show how easy it is to call a Flow from a Power App. The goal of the Power App is to pass values to the Flow, have it add them together, and return a result.

Starting from the Power Apps portal
click Create –> Blank app, select Blank canvas app, name the app, select Tablet for the format option, then click the Create button.

Power App overview:

Field Type | Field Name      | Option
Text Input | TextInputOne    | Format: Number
Text Input | TextInputTwo    | Format: Number
Label      | LabelNumberOne  |
Label      | LabelNumberTwo  |
Label      | LabelTotal      |
Label      | LabelMathResult |
Button     | ButtonCalc      |

Flow overview:
The Flow can be created directly in the Power App designer or the Power Platform portal. For this example, I’m going to use the portal.

From https://make.powerapps.com,
Click on New flow and select Automated cloud flow

Click the Skip button at the bottom of the window (this will make sense in a minute).

With the Flow designer open, click PowerApps or search for it, then click on PowerApps (V2)

In this step, add two number inputs to the action

I named my number inputs as follows: inputNumberOne and inputNumberTwo

The Flow will respond to the app using the Respond to a PowerApp or flow action. For the output, again select Number; I named mine outputNumber.

The formula should be: add(triggerBody()['number'],triggerBody()['number_1'])

Name the Flow Flow do Math and save it. You can test the Flow by clicking the Test button and supplying two input values. The Flow can be named something different, but this name aligns with the example below.

Back in the Power App, click the Power Automate icon.

With the Power Automate window open, click on Add flow and select the newly created Flow, or search for it and select it.

On the app design surface, select the button and update its OnSelect property to:
Set(varNumber, FlowDoMath.Run(TextInputOne.Text,TextInputTwo.Text).outputnumber)

Select the LabelMathResult field and set its Text value to varNumber

Run the app, input values in the text fields, then click the button.

What just happened?


The values of the two text input fields were passed to the Flow; the Flow added them together and returned the result in the outputnumber field; that value was then assigned to the varNumber variable.
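As an optional hardening step, you could guard against the flow call failing. Assuming the formula-level error management feature is enabled in your app's settings, the button formula could be wrapped in IfError:

Set(varNumber, IfError(FlowDoMath.Run(TextInputOne.Text, TextInputTwo.Text).outputnumber, Blank()))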

In future posts, I will dive deeper into more complex examples.



How do you find ALL the Flows that reference a SharePoint site or list?

I asked this question when I first started down the path of learning about Flow:
How do you find all the Flows running on or referencing a SharePoint list?

UPDATE / EDIT – READ THIS Part
Before you start on this, please ensure that your account or the account you are using to run the script has sufficient permissions to the target environment(s).

# Get a single flow to verify your permissions (swap in your own flow and environment GUIDs)
$oneFlow = Get-AdminFlow -FlowName "00000-ae95-4cab-96d8-0000000" -EnvironmentName "222222-4943-4068-8a2d-11111111"

# List the resources (sites, lists, connections) the flow references
$refResources = $oneFlow.Internal.properties.referencedResources
Write-Host $refResources



If you run that command and see an error in the returned properties, you do not have the correct permissions to move forward. You can check your permissions in the Power Platform admin center: https://admin.powerplatform.microsoft.com/

/end of update

Think about it: someone in your company creates a Flow that runs when a SharePoint item is updated. Fast forward a year or so, and that coworker has moved on, and the Flow needs to be updated. If you work for a small company or one that hasn’t fallen in love with Power Platform and Flow, you’re likely in luck, and finding the Flow will take a few minutes. In my case, there are currently 2,712 Flows in my tenant that span several environments.

The PowerShell script I've created will query a tenant using the Get-AdminFlow cmdlet, return all Flows, and then loop through them. The script can be adjusted to target a single environment using the EnvironmentName parameter. Note: running the script with the Get-Flow cmdlet instead will return only the Flows your AD account can access.

#Install-Module AzureAD
#Install-Module -Name Microsoft.PowerApps.Administration.PowerShell  
#Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber 

#connect-AzureAD

function Get-UserFromId($id) {
    try {
        $usr = Get-AzureADUser -ObjectId $id
        return $usr.displayName
    }
    catch {
        return $null
    }
}

#get all flows in the tenant
$adminFlows = Get-AdminFlow 

#set path for output
$Path = "$([Environment]::GetFolderPath('Desktop'))\Flow_Search_for_SharePoint_$(Get-Date -Format "yyyyMMdd_HHmmss").csv"

#set target site
$targetSPSite = "https://yourTenant.sharepoint.com/sites/yourSITE"
$targetSPList = "4f4604d2-fa8f-4bae-850f-4908b4708b07"
$targetSites = @()

foreach ($gFlow in $adminFlows) {

    #check if the flow references the target site
    $refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.site -eq $targetSPSite }

    #check if the flow references the target list
    #$refResources = $gFlow.Internal.properties.referencedResources | Where-Object { $_.resource.list -eq $targetSPList }

    if ($refResources -ne $null) {

        #optional - get the user who created the Flow
        $createdBy = Get-UserFromId($gFlow.internal.properties.creator.userId)

        $row = @{}
        $row.Add("EnvironmentName", $gFlow.EnvironmentName)
        $row.Add("Name", $gFlow.DisplayName)
        $row.Add("FlowEnabled", $gFlow.Enabled)
        $row.Add("FlowGUID", $gFlow.FlowName)
        $row.Add("CreatedByUser", $createdBy)
        $row.Add("CreatedDate", $gFlow.CreatedTime)
        $row.Add("LastModifiedDate", $gFlow.lastModifiedTime)
        
        $targetSites += $(new-object psobject -Property $row)
    }
}

#output to csv
$targetSites | Export-Csv -Path $Path -NoTypeInformation

If you don’t want to get the display name of the user who created the Flow, comment out the part of the script that calls the Get-UserFromId function, and you won’t need to connect to Azure.

And to answer my original question: How do you find all the Flows running on or referencing a SharePoint list?
In the script, comment out the part of the script that references $targetSPSite and un-comment $targetSPList. You can get the GUID of the list by navigating to list settings and looking at the URL. Another option is to open the list, view the Page Source, then look for the “listId” property.
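For reference, the list settings URL carries the GUID you're after; it looks something like this (tenant, site, and GUID swapped for the placeholders used in the script):

https://yourTenant.sharepoint.com/sites/yourSITE/_layouts/15/listedit.aspx?List=%7B4f4604d2-fa8f-4bae-850f-4908b4708b07%7D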

In a future post(s), I will outline how to search for all Flows that use different connectors, Dynamics 365 tables (dataverse), triggered from Power Apps, or other objects. All of the info is in the properties of the Flow; getting to it can be a little fun.

Use a Power Automate Flow to Scrub File Names of Unwanted Characters

Last year, my team rolled out a Power App Portal (Power Pages) to allow customers to submit requests with attachments. The attachments are stored in Azure Blob Storage, and we use Cloudmersive to virus scan the submitted attachments. Not to get too deep into the weeds, the process flows like this:
Attachment is uploaded –> lands in the Dataverse Note (annotation) table –> then is shipped to blob storage

Now the problem: users can name a file whatever they like and upload it. This quickly became an issue because Flow could not always find the blobs associated with a request if the filename contained special characters. Example: MyTrademark®.pdf

There are some great examples online for replacing special characters with a space or another supported character, but I wanted to take a different approach that seemed a lot more efficient to me. Where my example differs is the use of the Filter Array Flow action to only check the characters of the filename, as opposed to looping through each letter of the alphabet and comparing it to each letter in the filename.

Here’s the completed Flow, but I’ll dig into each step in this post.

The Compose Chars action holds the array of characters I will use to validate the characters in the supplied filename. This can be shortened if the input filename is to be set to uppercase or lowercase; only one set of the alphabet is needed.

[{"Char":"A"},{"Char":"B"},{"Char":"C"},{"Char":"D"},{"Char":"E"},{"Char":"F"},{"Char":"G"},{"Char":"H"},{"Char":"I"},{"Char":"J"},{"Char":"K"},{"Char":"L"},{"Char":"M"},{"Char":"N"},{"Char":"O"},{"Char":"P"},{"Char":"Q"},{"Char":"R"},{"Char":"S"},{"Char":"T"},{"Char":"U"},{"Char":"V"},{"Char":"W"},{"Char":"X"},{"Char":"Y"},{"Char":"Z"},{"Char":"a"},{"Char":"b"},{"Char":"c"},{"Char":"d"},{"Char":"e"},{"Char":"f"},{"Char":"g"},{"Char":"h"},{"Char":"i"},{"Char":"j"},{"Char":"k"},{"Char":"l"},{"Char":"m"},{"Char":"n"},{"Char":"o"},{"Char":"p"},
{"Char":"q"},{"Char":"r"},{"Char":"s"},{"Char":"t"},{"Char":"u"},{"Char":"v"},{"Char":"w"},{"Char":"x"},{"Char":"y"},{"Char":"z"},{"Char":"0"},{"Char":"1"},{"Char":"2"},{"Char":"3"},{"Char":"4"},{"Char":"5"},{"Char":"6"},{"Char":"7"},{"Char":"8"},{"Char":"9"}]

Compose Org Filename: string('my super 123 longer $%^&^ file /// name ^^^ with junk in it.xlsx')
Compose Split Extension: last(split(outputs('Compose_Org_Filename'), '.'))
Compose Concat Extension: concat('.', outputs('Compose_Split_Extension'))
Compose Get Filename: split(outputs('Compose_Org_Filename'), outputs('Compose_Concat_Extension'))[0]

The point of the Apply to each loop is to iterate over each character of the filename. Note: I'm using a Chunk function to break apart the filename. I first tried using a Split function, but there would be no end to what the delimiter might be.

Apply to each: chunk(outputs('Compose_Get_Filename'),1)

Filter array Chars: From: Compose Chars
char is equal to Current item
Here is the advanced view of the action:
@equals(item()?['char'], items('Apply_to_each'))
If you think of it like a SQL statement, it would be:
Select * from Compose Chars Where Char = Current item
The filter checks if the current item in the apply to each loop is in the Compose Chars array.

Condition: empty(body('Filter_array_Chars')) is equal to true
If the current item is not in the array, skip it (yes), else start building the filename (no)

This hack is needed because a Flow variable can't reference itself in a Set variable action. Think of it as a programmatic iteration: i++ or i = i + 1.

Compose Temp is a placeholder for the varNameBuilder variable.
Set variable Name Builder: concat(outputs('Compose_Temp'),items('Apply_to_each'))


Compose Clean Filename: concat(variables('varNameBuilder'),outputs('Compose_Concat_Extension'))

Copy of the Flow can be downloaded here:
https://www.sharepointed.com/wp-content/uploads/2023/02/FilenameScrubber_20230227.zip

Things to consider:
Empty filename –
What if the filename is nothing but special / unwanted characters? At the end of the Flow, you'd want to use a Length function to check that varNameBuilder is longer than zero characters.
Example: !@#@$#$%%^&.pdf
The result from the Flow would be .pdf, and updating the filename would fail. To my knowledge, you can't name a file like that, but you get the point.
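A minimal sketch of that guard, using a Condition action after the loop (the fallback branch is up to you):

Condition: @greater(length(variables('varNameBuilder')), 0)
If yes: Compose Clean Filename, as shown above
If no: handle the empty name, e.g., substitute a GUID for the filename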

Also, I’ve had users upload files with non-English characters, so there is a viable chance that someone, at some point, might upload a file like this: 我喜欢炸玉米饼.pdf

Making the Flow available to other flows –
If the Flow is created in a solution, it could be used more like a function, and other Flows in the solution could reference it. This would be a great example of a reusable Child Flow.

How Do You Get a Power Page Attachment That’s Stored in Blob Storage

My Power App Portal (Power Pages) environments are configured to use Azure Blob Storage for form attachments. One of the primary reasons for doing this is to avoid filling up expensive dataverse storage with endless attachments submitted by end users.

This article outlines how to set up Azure storage: link

What I'm going to demo is how to get ONE attachment that's uploaded to a form. If your form allows multiple attachments, you'd simply loop through them (a sketch of that loop appears later in this post).

In the example, I’m using the soon-to-be-obsolete dataverse connector, but the same basic flow design applies to the normal connector.

When a row is added to my table, the flow is triggered.
The flow then queries the Note (annotation) table using the ID from the source table.
filter query: (_objectid_value eq source_table_id)

The List rows Notes query returns an array, but I'm only dealing with one attachment, so there's no need to loop through it. To avoid an unnecessary loop, a function can be used to target a single object from the array: first(body('List_rows_Notes')?['value'])?['annotationid']

From the Get row Note action, annotationid and filename will be needed to help form the path to the blob. Using the concat function, I'm combining the container name, annotationid, and filename. Also, note the transformation on annotationid: the hyphens need to be removed, and the string needs to be lowercase. The last part of the transformation is to remove .azure.txt from the filename.

concat('/blobcontainer/',toLower(replace(outputs('Get_row_Note')?['body/annotationid'], '-', '')),'/', split(outputs('Get_row_Note')?['body/filename'], '.azure.txt')[0])

The end result of the transformation will be:
/blobcontainer/annotationid/filename
/blobcontainer/cf03e4cf7f72ad118561002248881923/example.pdf
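And as mentioned above, if the form allows multiple attachments, the same transformation can run inside an Apply to each over the notes query; a sketch using the action names from this post:

Apply to each: body('List_rows_Notes')?['value']
Blob path: concat('/blobcontainer/', toLower(replace(items('Apply_to_each')?['annotationid'], '-', '')), '/', split(items('Apply_to_each')?['filename'], '.azure.txt')[0])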

With the path to the blob formed, the get blob content action can retrieve the file.

It’s that simple.

A couple of notes:
It would be wise to leverage a virus-scanning tool like Cloudmersive.
If you haven't already noticed: when a user uploads a file with special characters in the name, it's saved to the Note table without the special characters, but when it's moved to blob storage, the characters remain in the name. Yes, that's a bug Microsoft has yet to fix. You can avoid it by adding JavaScript to the upload page to block files that fall into this category, or by writing another flow to clean file names before the form is processed.
Example:
Uploaded filename: my report 1:2:3.pdf
Note table: my report 123.pdf
Blob: my report 1:2:3.pdf

How to run a Databricks Notebook using Power Automate

Part of a project I was working on required mashing up some data from SharePoint with data stored in a data lake. We settled on creating a Databricks notebook to read an input file, query the data lake using the input file, and then export an enriched file.

Here’s a high-level overview of what’s going to be created:

Call the notebook, parse the JSON response, loop until the notebook has finished, then respond to the notebook’s output.

In my case, triggering the notebook will require knowing its URL, bearer token, job id, and input parameters.
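For reference, my HTTP call was shaped roughly like this (a sketch against the Databricks Jobs run-now endpoint; the instance URL, job id, and parameter names are placeholders for your own):

Method: POST
URL: https://<your-databricks-instance>/api/2.1/jobs/run-now
Headers: { "Authorization": "Bearer <your-PAT>", "Content-Type": "application/json" }
Body:
{
  "job_id": 12345,
  "notebook_params": { "input_file": "/mnt/input/requests.csv" }
}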


Parse the response from the HTTP call:


The notebook will take a little time to spin up and process the input file. The best way to handle this is to use a basic do-until loop to check the status of the notebook job. I opted to use a one-minute delay, call the API to get the job status, parse the response, then evaluate whether it's finished.


One thing to note about the Do until action: you don't want it to run for eternity, but to avoid adding complexity, you also don't want to bolt on extra evaluations like "if looped X times, stop."
If you expand the Change limits option, you can set how many times it loops or change the duration. Here I've set the action to stop looping after 20 tries. For more info on this, please check SPGuides for a detailed overview.
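The exit check can key off the run's life-cycle state, which Databricks reports as TERMINATED once the run has finished, regardless of success. A sketch, assuming a Parse JSON Status action over the runs/get response:

Status call: GET https://<your-databricks-instance>/api/2.1/jobs/runs/get?run_id=<run id from the trigger call>
Do until: @equals(body('Parse_JSON_Status')?['state']?['life_cycle_state'], 'TERMINATED')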

The last step in the flow is to process the response from the notebook. If the job is successful, get the file from blob storage and load it into SharePoint; otherwise, create a Slack alert.
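The success check can then look at the result state (same assumptions as the sketch above):

Condition: @equals(body('Parse_JSON_Status')?['state']?['result_state'], 'SUCCESS')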

That’s it; using the example above, you can trigger a Databricks notebook using a Flow.

Authentication
When I set this up, my company allowed the use of Personal Access Tokens (PAT).
https://docs.databricks.com/dev-tools/auth.html#pat
The PAT was then used in the Flow to trigger the notebook.

Power App Unable to add flow

I was in the process of porting a production Power App to a dev environment, and I ran into this error.

Unable to add flow
There was a problem adding your service. Please try again later.

The problem was that I tried to add an existing Flow to my Power App, but it was turned off.



Navigate to make.powerapps.com, locate the Flow you are trying to add to the Power App, turn it on, then try adding to the app.

Create Approvals That NEVER Expire

If you are reading this, you likely ran into an issue where you created an approval flow, but it expired before the recipient had time to approve or reject it. The timeout for an approval, or any flow, is thirty days; then it stops running. Yes, there are some clever workarounds to alert you if the flow times out, but who wants to mess with that?

The approach I took to solve this was to leverage the existing tooling, then add to it. When you create an approval, a row is created in the dataverse Approval table. As we all know, a flow is trigger-based, so why not create one that simply monitors the Approval table and handles things from there?

At a high level, here is the basic approach.

Start by creating a simple flow that initiates an approval, then run it. In my example, note the value in the Item Link field; this will come into play later.
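For reference, here's roughly how I filled in the Create an approval action (the assignee, URL, and details are placeholders; the Item Link value is what the monitoring flow filters on later):

Approval type: Approve/Reject - First to respond
Title: Review request
Assigned to: someone@yourtenant.com
Item link: https://www.sharepointed.com/stuff/123
Details: Please review this request. **SPItemID:** 42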

Next, navigate to make.powerapps.com, expand the Dataverse section, and click on Tables. After the page loads, click the All link under Tables, then search for approval. If you search for approval and do not get a result, make sure you click the All link.



Open the Approval table; in it, you will see your approval, and possibly more, depending on how old your environment is or how many people in your company are using approvals. When looking at the data, the takeaway is what is stored in the table and what can be used in the flow that handles the outcome of the approval. In my case, the Item Link field is key to handling the approval response. With it, I can filter the value and know whether I need to take action on the item.

When creating the flow that responds to the approval, you can filter it at the design level or in the trigger settings. I went with the trigger setting due to the number of approvals that could be firing across my organization in our default tenant. Why do you need to filter it? Just assume other approvals might be writing to the same dataverse table.

Trigger Conditions

@contains(triggerBody()?['msdyn_flow_approval_itemlink'],'https://www.sharepointed.com/stuff/')

@not(equals(triggerBody()?['msdyn_flow_approval_result'], null))

The above conditions filter the value I passed in the create approval flow (Item Link) and if the item has been approved or rejected.

Here is an overview of the flow that handles the outcome of the approval. I mixed dataverse connector types due to an issue with the trigger condition not working with the green dataverse connector. In the Expand Query field, I used the FetchXML builder to query the Approval Response table to get the comment field; it's not used in the example, but nonetheless, it's there. From the Get a row by ID action, the response of the approval is available to handle the outcome (Result) of the approval.

To my knowledge, there is no reason why you can’t create an approval that is active for months, if not years.

Notes:
1) You can access and review the approval records using Power BI, Flow, Access, ___
2) You can bulk update the records using PowerShell, Flow, Access (be real careful), __
3) You can pass items in the Details field, then parse them out when handling the approval. Here is one simple example where I’m passing a SharePoint item ID from the approval and parsing it in the response flow:



Response flow compose statement that parses the Details field.

Expression: last(split(triggerBody()?['msdyn_flow_approval_details'],'**SPItemID:** '))
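So if the Details field ends with **SPItemID:** 42, the expression splits on the marker and returns 42.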




YES, this is a lot, but the general idea is simple: create an approval and handle the response.

Create Dynamic Hyperlinks And Send An Email Action

Over the years, some updates to Flow have been better than others, and others, not so much. If memory serves, the Send an email action used to handle dynamic hyperlinks without much work, but something went sideways with one of the updates, causing dynamic hyperlinks not to work as you'd want.

Here is a basic example of including a hyperlink in an outgoing email; further down the page, I'll provide a more realistic example.


In this example, I set up the Flow to trigger when an item is added to a SharePoint library. The key thing to note in this example is the value of the varHyperlink variable; note the double quotes around the link to the item.
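Here's a sketch of the variable and the email body. The trigger's Link to item property and the display text are assumptions; the double quotes around the href value are the part that trips people up. In the Send an email (V2) body, toggle the </> code view so the anchor tag is treated as HTML:

varHyperlink (Initialize variable, String):
<a href="@{triggerOutputs()?['body/{Link}']}">View the document</a>

Send an email (V2) body, in code view:
A new document was uploaded: @{variables('varHyperlink')}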

Use Power Automate to Update a SharePoint Person Field

Using the SharePoint HTTP flow action to update a person or group field, I kept getting this error:

A 'PrimitiveValue' node with non-null value was found when trying to read the value of a navigation property; however, a 'StartArray' node, a 'StartObject' node, or a 'PrimitiveValue' node with null value was expected.

The field I was attempting to update is named Submitted By, with an internal name of Submitted_x0020_By. Each time I tried to update the field, I saw the error noted above. It wasn't until I looked at one of my previous flow runs that I noticed the issue: the field name I should have been using is Submitted_x0020_ById.



Update flow:
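A minimal sketch of the working request, assuming a list named YourList, item ID 1, and a target user ID of 6 (the __metadata type name will vary with your list's internal name):

Method: POST
Uri: _api/web/lists/getbytitle('YourList')/items(1)
Headers:
{
  "Content-Type": "application/json;odata=verbose",
  "Accept": "application/json;odata=verbose",
  "X-HTTP-Method": "MERGE",
  "IF-MATCH": "*"
}
Body:
{
  "__metadata": { "type": "SP.Data.YourListListItem" },
  "Submitted_x0020_ById": 6
}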

How do you update a Person field if the field allows for multiple selections? The example below will update the field with two different user values, but clearly, this could be extended to be more dynamic.

body('Send_an_HTTP_request_to_SharePoint_User_1')?['d']?['Id']
body('Send_an_HTTP_request_to_SharePoint_User_2')?['d']?['Id']

concat('[',outputs('Compose_1'),',',outputs('Compose_2'),']')
{
    "__metadata": {
        "type":"SP.Data.AssignedToListListItem"
    },
    "SubmittedByIDsId": {
         "results": [
                 6,
                 54
          ]
    }
}