About Ian Hayse

I've spent a lengthy career working as a SharePoint developer, admin, and architect, and I'm now working in the Power Platform and Azure spaces. What happened to InfoPath?

Use a Power Automate Flow to Scrub File Names of Unwanted Characters

Last year, my team rolled out a Power Apps Portal (Power Pages) to allow customers to submit requests with attachments. The attachments are stored in Azure Blob Storage, and we use Cloudmersive to virus-scan the submitted attachments. Without getting too deep into the weeds, the process flows like this:
Attachment is uploaded –> lands in the Dataverse Note (annotation) table –> then is shipped to blob storage

Now the problem: users can name a file whatever they like and upload it. This quickly became an issue because the Flow couldn't always find the blobs associated with a request if the filename contained special characters. Example: MyTrademark®.pdf

There are some great examples online for replacing special characters with a space or another supported character, but I wanted to take a different approach that seemed a lot more efficient to me. Where my example differs is the use of the Filter Array Flow action to only check the characters of the filename, as opposed to looping through each letter of the alphabet and comparing it to each letter in the filename.

Here’s the completed Flow, but I’ll dig into each step in this post.

The Compose Chars action holds the array of characters I will use to validate the characters in the supplied filename. This can be shortened if the input filename is first converted to uppercase or lowercase; then only one set of the alphabet is needed.

[{"Char":"A"},{"Char":"B"},{"Char":"C"},{"Char":"D"},{"Char":"E"},{"Char":"F"},{"Char":"G"},{"Char":"H"},{"Char":"I"},{"Char":"J"},{"Char":"K"},{"Char":"L"},{"Char":"M"},{"Char":"N"},{"Char":"O"},{"Char":"P"},{"Char":"Q"},{"Char":"R"},{"Char":"S"},{"Char":"T"},{"Char":"U"},{"Char":"V"},{"Char":"W"},{"Char":"X"},{"Char":"Y"},{"Char":"Z"},{"Char":"a"},{"Char":"b"},{"Char":"c"},{"Char":"d"},{"Char":"e"},{"Char":"f"},{"Char":"g"},{"Char":"h"},{"Char":"i"},{"Char":"j"},{"Char":"k"},{"Char":"l"},{"Char":"m"},{"Char":"n"},{"Char":"o"},{"Char":"p"},
{"Char":"q"},{"Char":"r"},{"Char":"s"},{"Char":"t"},{"Char":"u"},{"Char":"v"},{"Char":"w"},{"Char":"x"},{"Char":"y"},{"Char":"z"},{"Char":"0"},{"Char":"1"},{"Char":"2"},{"Char":"3"},{"Char":"4"},{"Char":"5"},{"Char":"6"},{"Char":"7"},{"Char":"8"},{"Char":"9"}]

Compose Org Filename: string('my super 123 longer $%^&^ file /// name ^^^ with junk in it.xlsx')
Compose Split Extension: last(split(outputs('Compose_Org_Filename'), '.'))
Compose Concat Extension: concat('.', outputs('Compose_Split_Extension'))
Compose Get Filename: split(outputs('Compose_Org_Filename'), outputs('Compose_Concat_Extension'))[0]

The point of the Apply to each loop is to iterate over each character in the filename. Note: I'm using a Chunk function to break apart the filename. I first tried using a Split function, but there would be no end to what the delimiter might be.

Apply to each: chunk(outputs('Compose_Get_Filename'),1)

Filter array Chars: From: Compose Chars
char is equal to Current item
Here is the advanced view of the action:
@equals(item()?['char'], items('Apply_to_each'))
If you think of it like a SQL statement, it would be:
Select * from Compose Chars Where Char = Current item
The filter checks if the current item in the apply to each loop is in the Compose Chars array.

Condition: empty(body('Filter_array_Chars')) is equal to true
If the current item is not in the array, skip it (yes), else start building the filename (no)

This hack is needed because a Set variable action can't reference the variable it's setting. Think of it like a programmatic increment: i++ or i = i + 1

Compose Temp is a placeholder for the varNameBuilder variable.
Set variable Name Builder: concat(outputs('Compose_Temp'),items('Apply_to_each'))


Compose Clean Filename: concat(variables('varNameBuilder'),outputs('Compose_Concat_Extension'))
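If you want to sanity-check the approach outside of Flow, here's a rough PowerShell sketch of the same allow-list idea; the variable names are illustrative, and the Flow expressions above remain the authoritative version:

# Keep only characters that appear in the allowed set, then re-append the extension
$orgFileName  = 'my super 123 longer $%^&^ file /// name ^^^ with junk in it.xlsx'
$allowedChars = [char[]]'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789'
$extension    = '.' + ($orgFileName -split '\.')[-1]
$baseName     = $orgFileName.Substring(0, $orgFileName.Length - $extension.Length)
# Equivalent of the Apply to each + Filter array: test each character against the allow-list
$nameBuilder  = -join ($baseName.ToCharArray() | Where-Object { $allowedChars -contains $_ })
$cleanFileName = $nameBuilder + $extension   # mysuper123longerfilenamewithjunkinit.xlsx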

Copy of the Flow can be downloaded here:
https://www.sharepointed.com/wp-content/uploads/2023/02/FilenameScrubber_20230227.zip

Things to consider:
Empty filename –
What if the filename is nothing but special / unwanted characters? At the end of the Flow, you’d want to use a Length function to check varNameBuilder to see if it’s greater than X.
example: !@#@$#$%%^&.pdf
The result from the Flow would be .pdf, and updating the filename would fail. To my knowledge, you can’t name a file like that, but you get the point.

Also, I’ve had users upload files with non-English characters, so there is a viable chance that someone, at some point, might upload a file like this: 我喜欢炸玉米饼.pdf

Making the Flow available to other flows –
If the Flow is created in a solution, it could be used more like a function, and other Flows in the solution could reference it. This would be a great example of a reusable Child Flow.

Power App and SharePoint List Form Hide Field on New Item Form

How do you hide a field on a Power App when opening a new form? The approach below uses a single-screen form instead of multiple screens for the various forms.

I started by creating a new SharePoint list and added two text fields:
Not on New Form
On New Form
Using the customize form option, I entered the Power App designer.

When the PowerApp designer opens, it will look like this:

To help see what’s going on with the form mode, add a text label to the form and set its Text property to: "Form Mode: " & Text(SharePointForm1.Mode)

Select the field (Data Card) that should not appear on the new item form, then select its Visible property and enter the following: If(SharePointForm1.Mode = 1, false, true). If your form control is named something other than SharePointForm1, use that name instead.

Breaking down the formula a little: If the SharePoint form mode is equal to 1, visible should be false, else true.

Save and publish the app, then check that it functions as planned.

New item form with Form Mode: 1

Display item form with Form Mode: 2

Edit item form with Form Mode: 0

Dataverse Resource not found for the segment table_name

I’m working with the Dataverse Web API and ran into this error while trying to write to a table.
Invoke-RestMethod : {"error":{"code":"0x8006088a","message":"Resource not found for the segment 'table name'."}}

The fix is to use the plural name of the table, but sometimes my engrish ain't the bestest, and I was struggling to figure out what the plural of Taco Order was (joking). If you want to quickly find all the tables in your environment, you can toss the API URL into a browser, and it will list all the plural table names.

Example: https://taco.crm.dynamics.com/api/data/v9.1/
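If you'd rather script it, the same service document can be pulled with PowerShell; this is a hedged sketch that assumes you already have a valid OAuth bearer token in $token:

# List every entity set (plural) name exposed by the Dataverse Web API
$resp = Invoke-RestMethod -Uri "https://taco.crm.dynamics.com/api/data/v9.1/" -Headers @{ Authorization = "Bearer $token" }
$resp.value | Select-Object -ExpandProperty name | Sort-Object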

How Do You Get a Power Page Attachment That’s Stored in Blob Storage

My Power Apps Portal (Power Pages) environments are configured to use Azure Blob Storage for form attachments. One of the primary reasons for doing this is to avoid filling up expensive Dataverse storage with endless attachments submitted by end users.

This article outlines how to set up Azure storage: link

What I’m going to demo is how to get ONE attachment that’s uploaded to a form. If your form allows multiple attachments, you’d simply loop through them.

In the example, I'm using the soon-to-be-deprecated Dataverse connector, but the same basic flow design applies to the current connector.

When a row is added to my table, the flow is triggered.
The flow then queries the Note (annotation) table using the ID from the source table.
filter query: (_objectid_value eq source_table_id)

The List rows Notes query will result in an array being returned, but I'm only dealing with one attachment, so there's no need to loop through it. To avoid an unnecessary loop, a function can be used to target a single object from the array: first(body('List_rows_Notes')?['value'])?['annotationid']

From the Get row Note action, annotationid and filename will be needed to help form the path to the blob. Using the concat function, I'm combining the container name, annotationid, and filename. Also note the transformation on annotationid: the hyphens need to be removed, and the string needs to be lowercase. The last part of the transformation is to remove .azure.txt from the filename.

concat('/blobcontainer/',toLower(replace(outputs('Get_row_Note')?['body/annotationid'], '-', '')),'/', split(outputs('Get_row_Note')?['body/filename'], '.azure.txt')[0])

The end result of the transformation will be:
/blobcontainer/annotationid/filename
/blobcontainer/cf03e4cf7f72ad118561002248881923/example.pdf
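To sanity-check the transformation outside of Flow, here's a rough PowerShell equivalent; the values are placeholders:

# Rebuild the blob path: strip hyphens, lowercase the annotation ID, drop .azure.txt
$annotationId = "cf03e4cf-7f72-ad11-8561-002248881923"   # from the Note (annotation) row
$fileName     = "example.pdf.azure.txt"                   # filename as stored on the note
$blobPath = "/blobcontainer/" + ($annotationId -replace '-', '').ToLower() + "/" + ($fileName -split '\.azure\.txt')[0]
$blobPath   # /blobcontainer/cf03e4cf7f72ad118561002248881923/example.pdf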

With the path to the blob formed, the get blob content action can retrieve the file.

It’s that simple.

A couple of notes:
It would be wise to leverage a virus-scanning tool like Cloudmersive.
If you haven't already noticed, when a user uploads a file with special characters in the name, it's saved to the Note table without the special characters, but when it's moved to blob storage, the characters remain in the name. Yes, that's a bug Microsoft has yet to fix. You can avoid it by adding JavaScript to the upload page to block files that fall into this category, or by writing another flow to clean filenames before the form is processed.
Example:
Uploaded filename: my report 1:2:3.pdf
Note table: my report 123.pdf
Blob: my report 1:2:3.pdf

Password Complexity Page using Azure B2C and Power Pages

I'm currently working on a project, and my UX team asked if it was possible to change the look of the B2C sign-up / password-change page to include visual hints for the password complexity requirements. We've all seen it before: you visit a site where you need to sign up, and the password needs to be X characters long and contain this and that, but some sites include a cute visual to help identify which requirements have been met.

image borrowed from jQuery Script


Articles and blog posts I used to get this working:
1. Customize the Azure AD B2C user interface for portals
2. Enable JavaScript and page layout versions in Azure Active Directory B2C
3. JS Password Validation
4. Customize the look and feel of your Azure AD B2C page

If you read the B2C documentation, it strongly warns against using JS libraries outside of the libraries native to B2C. I opted to keep my solution as simple as possible to avoid additional security gaps.

To get this working, I followed the steps outlined in link 1. There I created all of the needed assets in the Portal Management section of the Power Pages environment. Next, I used the content from link 3 to update the Web Template that I created in the previous step. After that, I updated the Web Template to include the div noted in link 4; this is extremely important and can’t be skipped. The last part of the process is to update the B2C user flow policy to reference the page created in step 1.

Here is a copy of my Web Template file from Portal Management.

<!DOCTYPE html>
<html>
<head>
<style>
      /* Style all input fields */
      input {
        width: 100%;
        padding: 12px;
        border: 1px solid #ccc;
        border-radius: 4px;
        box-sizing: border-box;
        margin-top: 6px;
        margin-bottom: 16px;
      }

      /* Style the submit button */
      input[type="submit"] {
        background-color: #04aa6d;
        color: white;
      }

      /* Style the container for inputs */
      .container {
        background-color: #f1f1f1;
        padding: 20px;
      }

      /* The message box is shown when the user clicks on the password field */
      #message {
        display: none;
        background: #f1f1f1;
        color: #000;
        position: relative;
        padding: 20px;
        margin-top: 10px;
      }

      #message p {
        padding: 10px 35px;
        font-size: 18px;
      }

      /* Add a green text color and a checkmark when the requirements are right */
      .valid {
        color: green;
      }

      .valid:before {
        position: relative;
        left: -35px;
        content: "✔";
      }

      /* Add a red text color and an "x" when the requirements are wrong */
      .invalid {
        color: red;
      }

      .invalid:before {
        position: relative;
        left: -35px;
        content: "✖";
      }
    </style>
</head>
<body>
 <!--this div is the most important part of the process--> 
   <div id="api"></div>
    <div id="message">
      <h3>Password must contain the following:</h3>
      <p id="letter" class="invalid">A <b>lowercase</b> letter</p>
      <p id="capital" class="invalid">A <b>capital (uppercase)</b> letter</p>
      <p id="number" class="invalid">A <b>number</b></p>
      <p id="length" class="invalid">Minimum <b>8 characters</b></p>
    </div>
    
    <script>
      var myInput = document.getElementById("password");
      var letter = document.getElementById("letter");
      var capital = document.getElementById("capital");
      var number = document.getElementById("number");
      var length = document.getElementById("length");

      // When the user clicks on the password field, show the message box
      myInput.onfocus = function () {
        document.getElementById("message").style.display = "block";
      };

      // When the user clicks outside of the password field, hide the message box
      myInput.onblur = function () {
        document.getElementById("message").style.display = "none";
      };

      // When the user starts to type something inside the password field
      myInput.onkeyup = function () {
        // Validate lowercase letters
        var lowerCaseLetters = /[a-z]/g;
        if (myInput.value.match(lowerCaseLetters)) {
          letter.classList.remove("invalid");
          letter.classList.add("valid");
        } else {
          letter.classList.remove("valid");
          letter.classList.add("invalid");
        }

        // Validate capital letters
        var upperCaseLetters = /[A-Z]/g;
        if (myInput.value.match(upperCaseLetters)) {
          capital.classList.remove("invalid");
          capital.classList.add("valid");
        } else {
          capital.classList.remove("valid");
          capital.classList.add("invalid");
        }

        // Validate numbers
        var numbers = /[0-9]/g;
        if (myInput.value.match(numbers)) {
          number.classList.remove("invalid");
          number.classList.add("valid");
        } else {
          number.classList.remove("valid");
          number.classList.add("invalid");
        }

        // Validate length
        if (myInput.value.length >= 8) {
          length.classList.remove("invalid");
          length.classList.add("valid");
        } else {
          length.classList.remove("valid");
          length.classList.add("invalid");
        }
      };      
</script>
</body>
</html>

The idea behind this was to keep it as simple as possible and to get a basic example created. Yes, you could store the file in blob storage, but I wanted to keep all the portal parts close together and avoid added complexity (not that creating this page in Portal Management was easy).

How to run a Databricks Notebook using Power Automate

Part of a project I was working on required mashing up some data from SharePoint with data stored in a data lake. We settled on creating a Databricks notebook to read an input file, query the data lake using the input file, and then export an enriched file.

Here’s a high-level overview of what’s going to be created:

Call the notebook, parse the JSON response, loop until the notebook has finished, then respond to the notebook’s output.

In my case, triggering the notebook requires knowing its URL, bearer token, job ID, and input parameters.
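For reference, here's roughly what that trigger call looks like as a script; this is a sketch against the Databricks Jobs 2.1 API, and the workspace URL, job ID, and parameter names are placeholders:

# Trigger the notebook job via the Jobs API run-now endpoint; $pat is a personal access token
$workspace = "https://adb-1234567890123456.7.azuredatabricks.net"
$headers   = @{ Authorization = "Bearer $pat" }
$body      = @{ job_id = 123; notebook_params = @{ input_file = "requests.csv" } } | ConvertTo-Json
$run = Invoke-RestMethod -Method Post -Uri "$workspace/api/2.1/jobs/run-now" -Headers $headers -Body $body -ContentType "application/json"
$run.run_id   # keep this; it's what you poll for status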


Parse the response from the HTTP call:


The notebook will take a little time to spin up and then process the input file. The best way to handle this is to leverage a basic do-until loop to check the status of the notebook job. I opted to use a one-minute delay, call the API to get the job status, parse the response, and then evaluate whether it's finished.


One thing to note about the Do until action: you don't want it to run for eternity, but to avoid adding complexity, you also don't want to bolt on extra evaluations like "if looped X times, stop".
If you expand the Change limits option, you can set how many times it loops or change the duration. Here I’ve set the action to stop looping after 20 tries. For more info on this, please check SPGuides for a detailed overview.
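Here's the same delay-check-evaluate pattern as a sketch, again assuming the Jobs 2.1 API and the $workspace, $headers, and $run values from the trigger call above:

# Poll runs/get once a minute; stop when the run terminates or after 20 tries
$tries = 0
do {
    Start-Sleep -Seconds 60
    $status = Invoke-RestMethod -Uri "$workspace/api/2.1/jobs/runs/get?run_id=$($run.run_id)" -Headers $headers
    $tries++
} until ($status.state.life_cycle_state -eq "TERMINATED" -or $tries -ge 20)
$status.state.result_state   # SUCCESS or FAILED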

The last step in the flow is to process the response from the notebook. If the job is successful, get the file from blob storage and load it into SharePoint; otherwise, create a Slack alert.

That’s it; using the example above, you can trigger a Databricks notebook using a Flow.

Authentication
When I set this up, my company allowed the use of Personal Access Tokens (PAT).
https://docs.databricks.com/dev-tools/auth.html#pat
The PAT was then used in the Flow to trigger the notebook.

Use Power Automate to Create Jira Tasks

I'm working on a Power Pages project that requires a Jira service desk task to be created for each portal submission. Out of the box, Jira provides a simple connector to create tasks and requests, but the connector falls short of handling field types other than simple text. This means choice, checkbox, and dropdown fields are not available. That leaves only a couple of options, and I opted to use a simple HTTP action to create the tasks.

Basic overview of what I’ll be creating:
Flow that’s triggered by a dataverse row creation
Create a Jira task and populate metadata
Attach a file to the Jira task

Jira fields and types:
Issue Type – Choice
Request Type – Choice
Tortilla – Choice
Meat – Choice
Veggies – Checkbox multi-select
Number of Tacos – Number
Pickup Date Time – Date and Time
Summary – Text
Attachment – Attachment

Interfacing with the Jira API requires knowing a little about the fields you’ll be updating and the project and issue type you want to use. If you haven’t created one already, you need a Jira API token to work with the API.


Request type:
Go to Project Settings, then look at the URL and copy the value after pid=
https://taco.atlassian.net/secure/project/EditProject!default.jspa?pid=10001

With the ID, you can query the service desk request-types endpoint
https://taco.atlassian.net/rest/servicedesk/1/servicedesk/request/10001/request-types
In the returned payload, note the portal key and key values; combine the two, and you have the request type value tr/9f7c4029-6d23-4cb1-bb8a-02d0050d944b
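If you'd rather pull the request types with a script than a browser, here's a hedged sketch using the same basic-auth key as the HTTP actions below and the endpoint noted above:

# Query the service desk request-types endpoint for project ID 10001
$headers = @{ Authorization = "Basic aWhddsfadfafa..NOT...A...REAL...KEY..dafdfdafd=" }
Invoke-RestMethod -Uri "https://taco.atlassian.net/rest/servicedesk/1/servicedesk/request/10001/request-types" -Headers $headers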

Project key:
The project key is available on the project settings page, listed under the name field.
Example: TACOS

Issue type:
For simplicity, I’m only dealing with one issue type, and I captured the issueType value using the request-types endpoint noted above.
Example: “issueType”: 10015

For the remaining field values, you can get them in one of two ways.
Create a new issue in the browser, then use the browser developer tools (F12 or Ctrl + Shift + I) to inspect each field’s HTML value.

The other option is to click the gear icon (top right), select Issues, click on Custom Fields, search for a field, click on it, click Edit detail, and then grab the ID value from the URL. Once the ID is captured, join it with customfield_, resulting in customfield_10073, which is the field’s internal value.

In this example, the summary and issue type fields are the only ones that do not follow the customfield_X naming convention. Some system-generated fields might have a different naming convention, but I'll dig into that another day.

Column Display Name | Column Internal Name | Column Type
Issue Type | issuetype | system
Request Type | customfield_10010 | system
Tortilla | customfield_10073 | Select List (single)
Meat | customfield_10074 | Select List (single)
Veggies | customfield_10075 | Checkboxes
Number of Tacos | customfield_10076 | Number Field
Pickup Date Time | customfield_10077 | Date Time Picker
Summary | summary | system

Endpoint URL:
https://taco.atlassian.net/rest/api/3/issue/
Headers: {"Content-Type": "application/json"}
Authentication: Raw
Key: Basic aWhddsfadfafa..NOT…A…REAL…KEY..dafdfdafd=
Example payload:

{
  "fields": {
    "project": {
      "key": "TACOS"
    },
    "customfield_10010": "tr/9f7c4029-6d23-4cb1-bb8a-02d0050d944b",
    "summary": "Taco order summary",
    "issuetype": {
      "id": "10015"
    },
    "customfield_10073": {"value": "Flour"},
    "customfield_10074": {"value": "Chicken"},
    "customfield_10075": [{"value": "Pico"},{"value": "Grilled Veggies"}],
    "customfield_10076": 2,
    "customfield_10077":"2022-11-05T11:05:00.000+0000"
  }
}
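If you want to test the call outside of Flow, here's a rough PowerShell sketch; the email, token, and payload file are placeholders, and Jira Cloud basic auth is your Atlassian account email plus an API token, base64-encoded:

# Create the issue with the example payload above
$pair    = "taco@example.com:YOUR_API_TOKEN"
$basic   = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($pair))
$headers = @{ Authorization = "Basic $basic" }
$payload = Get-Content ./taco-order.json -Raw
$issue = Invoke-RestMethod -Method Post -Uri "https://taco.atlassian.net/rest/api/3/issue/" -Headers $headers -Body $payload -ContentType "application/json"
$issue.key   # e.g., TACOS-123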

View of the task in Jira

How do you attach a file to a Jira task using Power Automate?
Attaching a file to a Jira task requires one more API call, and it’s simple!

Endpoint URL:
https://taco.atlassian.net/rest/api/3/issue/Key/attachments
Headers: {"X-Atlassian-Token": "no-check"}
Authentication: Raw
Key: Basic aWhddsfadfafa..NOT…A…REAL…KEY..dafdfdafd=

Example payload:

{
  "$content-type": "multipart/form-data",
  "$multipart": [
    {
      "headers": {
        "Content-Disposition": "form-data; name=\"file\"; filename=@{outputs('Get_file_properties')?['body/{FilenameWithExtension}']}"
      },
      "body": @{body('Get_file_content')}
    }
  ]
}
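And here's the attachment call as a sketch; -Form requires PowerShell 6.1 or later, and $basic and $issue come from the create-issue sketch above:

# Attach a local file to the issue; X-Atlassian-Token: no-check disables XSRF checking
$headers = @{ Authorization = "Basic $basic"; "X-Atlassian-Token" = "no-check" }
Invoke-RestMethod -Method Post -Uri "https://taco.atlassian.net/rest/api/3/issue/$($issue.key)/attachments" -Headers $headers -Form @{ file = Get-Item ./report.pdf }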

Attachment
I’m getting a file from SharePoint and passing its contents to the API call for the attachment. The same thing works with Azure blob storage or grabbing a file from the dataverse. If you want to attach more than one file, create additional HTTP attachment calls.

Here’s a simple overview of the Flow:

Parse JSON schema:

{
    "type": "object",
    "properties": {
        "id": {
            "type": "string"
        },
        "key": {
            "type": "string"
        },
        "self": {
            "type": "string"
        }
    }
}

How to Audit Power Platform and SharePoint

This post will be an ongoing adventure into using Microsoft Purview to audit, track, review, and learn about updates to objects within the Power Platform. My adventure into this tool was prompted by my in-house security team asking if I could help identify whether a specific SharePoint list had been viewed and who viewed it. In SharePoint on-prem, this sort of info could be mined from a site's audit logs, but with SharePoint Online, auditing is offloaded to Purview.

To kick things off, I will run a report to see who has accessed my SharePoint Dev site this week. From the audit page, you can set a date range for your search and select activities like deleting a file or adding someone to a group; in the file, folder, or site box, enter the site you want to target. Last but not least is the users box; this one is self-explanatory.

Search results are ready for viewing:

The results show that a user created a list item and then viewed the list a few times.
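The same unified audit log can also be searched from PowerShell via the Exchange Online module; a hedged sketch, assuming you have audit search permissions:

# Pull a week of SharePoint list activity for one user from the unified audit log
Connect-ExchangeOnline
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) -RecordType SharePointListOperation -UserIds "user@taco.com" -ResultSize 100 | Select-Object CreationDate, UserIds, Operations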


The audit logs are held for ~90 days; outside variables can impact this. Here is a warning if you try to search for items older than 90 days:

Audit log retention policies might impact search results. Activities that happened over 90 days ago will only show up in results for users who have licensing for long-term audit log retention.

That’s it for now; as you can see, this tool can be extremely valuable, especially when dealing with audits or if data magically goes missing.

Future updates to this article will show how to track changes to SharePoint lists, dataverse objects, Power BI, Power Automate (Flow), and more!

URL to access the compliance center / Purview: https://compliance.microsoft.com/auditlogsearch

Here is a new post showing how to search a single library in SharePoint:
https://www.sharepointed.com/2024/03/sharepoint-audit-using-purview/

Azure Runbook Job Name error: Token request failed..Exception

When you move from a SharePoint on-prem environment to SharePoint Online, you lose the server-side environment you’d normally use to run PowerShell scripts or tasks to interact with SharePoint. In my opinion, and please correct me if I’m wrong, the closest thing to a server-side environment in a cloud environment is Azure Runbooks or Azure Function Apps. I went with Azure Runbooks due to its ability to handle long-running tasks.

The error I recently encountered in my runbook was: runbook name error: Token request failed..Exception. At first, I thought there might be something wrong with the way I was connecting to Key Vault, but that wasn't it. Next was my connection to SharePoint; this is handled using a SharePoint-generated client ID and secret. Oddly enough, I had updated this just a few months back, so it wasn't an obvious candidate for a failure point.

I went to my target SharePoint site and created a new set of credentials using siteName/_layouts/15/AppRegNew.aspx and siteName/_layouts/15/appinv.aspx. After creating the credentials, I went back to the runbook, plugged them in, and it worked!

Long story short, if you get this error: Token request failed..Exception try creating a new client ID and secret and see if it helps clear things up.

You can also use this script to test your client ID and secret (see Connect-PnPOnline in the PnP PowerShell docs).

$siteUrl = "https://taco.sharepointonline/sites/burrito"
$testConn = Connect-PnPOnline -Url $siteUrl -AppId "1111-2222-3333-4444-555555555555" -AppSecret "X3tssvCebdl/c/gvXsTACOajvBurrito=" -ReturnConnection
$list = Get-PnPList "Tacos"
Write-Output $list