Power Query and Jira Search API Return Everything Project Related

From Power BI, how do you retrieve all the issues related to a Jira project?

To follow along with this post, you’ll need the following:
Power Query – Excel or Power BI
Jira API key
PowerShell, Python, or another tool to Base64 encode a string

To get a Jira API key, navigate to your Jira homepage, click your account icon (top right), and select Manage account. The shortcut to the account page is here: https://id.atlassian.com/manage-profile/security
From the account page, click on the Security header link, then click on Create and manage API tokens.


From the API Tokens page, click on the Create API token button. When the window opens, enter a value in the Label field and click the Create button. From your new API token window, copy the API token value. YOU WILL NEED this later in this demo.

When accessing the Jira API, basic auth is used, and for this example, we need to base64 encode the login credentials. You can use your favorite programming/scripting language or a website like Base64Encode.org to encode the credentials. In a production scenario, I do NOT suggest using a public website to encode/decode anything. This is just a demo; my API key will be revoked once I finish writing this.

Example string of what needs to be encoded:
 Your Jira login email address + colon + API token
 yourEmail@example.com:APItoken

Example:
 ian@example.com:ETETT3xFfGF009ETR_3bbA7Gk_ZLzPCAocvKcAvSzKe5-ysU_8fGwBxuRSsyx7efUcvaTACOSh3sgJWGtictvqGBF0yCy3ZzzKb39_gHBgYxnxjBURRITO3ofEWea_Wf4P9XWWmiNACHOWxUUA7O9cwFhH9WO6hq4-yAwEnbQUESOc=AEAA3875

Encoded value:
aWFuQGV4YW1wbGUuY29tOkVURVRUM3hGZkdGMDA5RVRSXzNiYkE3R2tfWkx6UENBb2N2S2NBdlN6S2U1LXlzVV84Zkd3Qnh1UlNzeXg3ZWZVY3ZhVEFDT1NoM3NnSldHdGljdHZxR0JGMHlDeTNaenpLYjM5X2dIQmdZeG54akJVUlJJVE8zb2ZFV2VhX1dmNFA5WFdXbWlOQUNIT1d4VVVBN085Y3dGaEg5V082aHE0LXlBd0VuYlFVRVNPYz1BRUFBMzg3NQ==  
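
If you'd rather not paste credentials into a public website, a few lines of PowerShell produce the same Base64 value locally. This is just a sketch; swap in your own email address and API token:

$pair    = "yourEmail@example.com:YourApiTokenValue"
$bytes   = [System.Text.Encoding]::UTF8.GetBytes($pair)
$encoded = [Convert]::ToBase64String($bytes)
$encoded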

Next, we need to open Power Query to capture the Jira data.
Excel: Data tab –> Get Data –> Launch Power Query Editor
Power BI: Home tab –> Transform data –> Transform data

In Power Query, select New Source, then select Blank Query. From the toolbar, click Advanced Editor. Delete everything in the Query window. Below is the M code for connecting to Jira and parsing the API response. You will want to update the code to replace the Url value, the Authorization value, and the jql="project = YourProject" value! You can get the URL and YourProject value from your Jira project homepage.

let
    Source = (startAt as number) as table =>
    let
        Url = "https://YOUR-COMPANY.atlassian.net/rest/api/3/search",
        WebContents = Web.Contents(Url,
            [
                Headers = [#"Authorization"="Basic aWFuQGV4YW1wbGUuY29tOkVURVRUM3hGZkdGMDA5RVRSXzNiYkE3R2tfWkx6UENBb2N2S2NBdlN6S2U1LXlzVV84Zkd3Qnh1UlNzeXg3ZWZVY3ZhVEFDT1NoM3NnSldHdGljdHZxR0JGMHlDeTNaenpLYjM5X2dIQmdZeG54akJVUlJJVE8zb2ZFV2VhX1dmNFA5WFdXbWlOQUNIT1d4VVVBN085Y3dGaEg5V082aHE0LXlBd0VuYlFVRVNPYz1BRUFBMzg3NQ==", #"Content-Type"="application/json"],
                Query=[startAt=Text.From(startAt), jql="project = YourProject"]
            ]
        ),
        ParsedJson = Json.Document(WebContents),
        Issues = ParsedJson[issues],
        TableFromJson = Table.FromList(Issues, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
        Total = ParsedJson[total],
        NextStartAt = startAt + List.Count(Issues)
    in
        if NextStartAt < Total then
            Table.Combine({TableFromJson, @Source(NextStartAt)})
        else
            TableFromJson,

    InitialCall = Source(0),
    #"Expanded Column1" = Table.ExpandRecordColumn(InitialCall, "Column1", {"expand", "id", "self", "key", "fields"}, {"Column1.expand", "Column1.id", "Column1.self", "Column1.key", "Column1.fields"}),
    #"Expanded Column1.fields" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1.fields", {"statuscategorychangedate", "customfield_10990", "fixVersions", "customfield_10991", "customfield_10992", "resolution", "customfield_10751", "customfield_10104", "customfield_10984", "customfield_10985", "customfield_10986", "customfield_10987", "customfield_10989", "lastViewed", "customfield_10462", "priority", "customfield_10980", "customfield_10100", "customfield_10463", "customfield_10101", "customfield_10465", "customfield_10102", "customfield_10103", "labels", "customfield_10973", "customfield_10974", "customfield_10732", "customfield_10733", "customfield_10975", "customfield_10976", "aggregatetimeoriginalestimate", "customfield_10977", "timeestimate", "customfield_10978", "versions", "customfield_10979", "issuelinks", "assignee", "status", "components", "customfield_10450", "customfield_10451", "customfield_10452", "customfield_10453", "customfield_10454", "customfield_10841", "customfield_10445", "customfield_10842", "customfield_10843", "customfield_10447", "customfield_10448", "customfield_10844", "customfield_10569", "customfield_10449", "aggregatetimeestimate", "customfield_10728", "creator", "subtasks", "reporter", "aggregateprogress", "customfield_10200", "customfield_10676", "customfield_10830", "customfield_10831", "customfield_10710", "customfield_10438", "progress", "votes", "issuetype", "timespent", "project", "aggregatetimespent", "customfield_11003", "customfield_10423", "customfield_10544", "customfield_10545", "customfield_10424", "customfield_10942", "customfield_10546", "customfield_10821", "customfield_10822", "customfield_10943", "customfield_10427", "customfield_10823", "customfield_10944", "customfield_10945", "customfield_10428", "resolutiondate", "customfield_10946", "customfield_10827", "customfield_10948", "workratio", "customfield_10949", "customfield_10828", "customfield_10829", "watches", "created", "customfield_10541", "customfield_10543", "customfield_10533", "customfield_10654", "customfield_10655", "customfield_10930", "customfield_10931", "customfield_10414", "customfield_10656", "customfield_10657", "customfield_10415", "customfield_10811", "customfield_10933", "customfield_10537", "customfield_10812", "customfield_10416", "customfield_10658", "customfield_10659", "customfield_10538", "customfield_10417", "customfield_10935", "customfield_10418", "customfield_10419", "customfield_10818", "customfield_10819", "updated", "timeoriginalestimate", "customfield_10492", "description", "customfield_10893", "customfield_10531", "customfield_10895", "customfield_10653", "customfield_10006", "customfield_10007", "security", "customfield_10801", "customfield_10802", "customfield_10805", "customfield_10806", "customfield_10807", "customfield_10928", "customfield_10809", "summary", "customfield_10000", "customfield_10001", "customfield_10002", "customfield_10640", "customfield_10641", "customfield_10642", "customfield_10753", "customfield_10478", "customfield_10633", "customfield_10513", "environment", "customfield_10637", "customfield_10516", "customfield_10879", "customfield_10517", "duedate", "customfield_10639"}, {"Column1.fields.statuscategorychangedate", "Column1.fields.customfield_10990", "Column1.fields.fixVersions", "Column1.fields.customfield_10991", "Column1.fields.customfield_10992", "Column1.fields.resolution", "Column1.fields.customfield_10751", "Column1.fields.customfield_10104", "Column1.fields.customfield_10984", "Column1.fields.customfield_10985", 
"Column1.fields.customfield_10986", "Column1.fields.customfield_10987", "Column1.fields.customfield_10989", "Column1.fields.lastViewed", "Column1.fields.customfield_10462", "Column1.fields.priority", "Column1.fields.customfield_10980", "Column1.fields.customfield_10100", "Column1.fields.customfield_10463", "Column1.fields.customfield_10101", "Column1.fields.customfield_10465", "Column1.fields.customfield_10102", "Column1.fields.customfield_10103", "Column1.fields.labels", "Column1.fields.customfield_10973", "Column1.fields.customfield_10974", "Column1.fields.customfield_10732", "Column1.fields.customfield_10733", "Column1.fields.customfield_10975", "Column1.fields.customfield_10976", "Column1.fields.aggregatetimeoriginalestimate", "Column1.fields.customfield_10977", "Column1.fields.timeestimate", "Column1.fields.customfield_10978", "Column1.fields.versions", "Column1.fields.customfield_10979", "Column1.fields.issuelinks", "Column1.fields.assignee", "Column1.fields.status", "Column1.fields.components", "Column1.fields.customfield_10450", "Column1.fields.customfield_10451", "Column1.fields.customfield_10452", "Column1.fields.customfield_10453", "Column1.fields.customfield_10454", "Column1.fields.customfield_10841", "Column1.fields.customfield_10445", "Column1.fields.customfield_10842", "Column1.fields.customfield_10843", "Column1.fields.customfield_10447", "Column1.fields.customfield_10448", "Column1.fields.customfield_10844", "Column1.fields.customfield_10569", "Column1.fields.customfield_10449", "Column1.fields.aggregatetimeestimate", "Column1.fields.customfield_10728", "Column1.fields.creator", "Column1.fields.subtasks", "Column1.fields.reporter", "Column1.fields.aggregateprogress", "Column1.fields.customfield_10200", "Column1.fields.customfield_10676", "Column1.fields.customfield_10830", "Column1.fields.customfield_10831", "Column1.fields.customfield_10710", "Column1.fields.customfield_10438", "Column1.fields.progress", "Column1.fields.votes", "Column1.fields.issuetype", "Column1.fields.timespent", "Column1.fields.project", "Column1.fields.aggregatetimespent", "Column1.fields.customfield_11003", "Column1.fields.customfield_10423", "Column1.fields.customfield_10544", "Column1.fields.customfield_10545", "Column1.fields.customfield_10424", "Column1.fields.customfield_10942", "Column1.fields.customfield_10546", "Column1.fields.customfield_10821", "Column1.fields.customfield_10822", "Column1.fields.customfield_10943", "Column1.fields.customfield_10427", "Column1.fields.customfield_10823", "Column1.fields.customfield_10944", "Column1.fields.customfield_10945", "Column1.fields.customfield_10428", "Column1.fields.resolutiondate", "Column1.fields.customfield_10946", "Column1.fields.customfield_10827", "Column1.fields.customfield_10948", "Column1.fields.workratio", "Column1.fields.customfield_10949", "Column1.fields.customfield_10828", "Column1.fields.customfield_10829", "Column1.fields.watches", "Column1.fields.created", "Column1.fields.customfield_10541", "Column1.fields.customfield_10543", "Column1.fields.customfield_10533", "Column1.fields.customfield_10654", "Column1.fields.customfield_10655", "Column1.fields.customfield_10930", "Column1.fields.customfield_10931", "Column1.fields.customfield_10414", "Column1.fields.customfield_10656", "Column1.fields.customfield_10657", "Column1.fields.customfield_10415", "Column1.fields.customfield_10811", "Column1.fields.customfield_10933", "Column1.fields.customfield_10537", "Column1.fields.customfield_10812", "Column1.fields.customfield_10416", 
"Column1.fields.customfield_10658", "Column1.fields.customfield_10659", "Column1.fields.customfield_10538", "Column1.fields.customfield_10417", "Column1.fields.customfield_10935", "Column1.fields.customfield_10418", "Column1.fields.customfield_10419", "Column1.fields.customfield_10818", "Column1.fields.customfield_10819", "Column1.fields.updated", "Column1.fields.timeoriginalestimate", "Column1.fields.customfield_10492", "Column1.fields.description", "Column1.fields.customfield_10893", "Column1.fields.customfield_10531", "Column1.fields.customfield_10895", "Column1.fields.customfield_10653", "Column1.fields.customfield_10006", "Column1.fields.customfield_10007", "Column1.fields.security", "Column1.fields.customfield_10801", "Column1.fields.customfield_10802", "Column1.fields.customfield_10805", "Column1.fields.customfield_10806", "Column1.fields.customfield_10807", "Column1.fields.customfield_10928", "Column1.fields.customfield_10809", "Column1.fields.summary", "Column1.fields.customfield_10000", "Column1.fields.customfield_10001", "Column1.fields.customfield_10002", "Column1.fields.customfield_10640", "Column1.fields.customfield_10641", "Column1.fields.customfield_10642", "Column1.fields.customfield_10753", "Column1.fields.customfield_10478", "Column1.fields.customfield_10633", "Column1.fields.customfield_10513", "Column1.fields.environment", "Column1.fields.customfield_10637", "Column1.fields.customfield_10516", "Column1.fields.customfield_10879", "Column1.fields.customfield_10517", "Column1.fields.duedate", "Column1.fields.customfield_10639"}),
    #"Expanded Column1.fields.status" = Table.ExpandRecordColumn(#"Expanded Column1.fields", "Column1.fields.status", {"name", "id"}, {"Column1.fields.status.name", "Column1.fields.status.id"}),
    #"Expanded Column1.fields.assignee" = Table.ExpandRecordColumn(#"Expanded Column1.fields.status", "Column1.fields.assignee", {"accountId", "emailAddress", "displayName"}, {"Column1.fields.assignee.accountId", "Column1.fields.assignee.emailAddress", "Column1.fields.assignee.displayName"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1.fields.assignee",{{"Column1.fields.updated", type datetimezone}}),
    #"Expanded Column1.fields.issuetype" = Table.ExpandRecordColumn(#"Changed Type", "Column1.fields.issuetype", {"id", "name"}, {"Column1.fields.issuetype.id", "Column1.fields.issuetype.name"}),
    #"Expanded Column1.fields.reporter" = Table.ExpandRecordColumn(#"Expanded Column1.fields.issuetype", "Column1.fields.reporter", {"accountId", "emailAddress", "displayName"}, {"Column1.fields.reporter.accountId", "Column1.fields.reporter.emailAddress", "Column1.fields.reporter.displayName"}),
    #"Expanded Column1.fields.priority" = Table.ExpandRecordColumn(#"Expanded Column1.fields.reporter", "Column1.fields.priority", {"name", "id"}, {"Column1.fields.priority.name", "Column1.fields.priority.id"})
in
    #"Expanded Column1.fields.priority"

After pasting the M code in the Query window, click Done. When the Access Web content window opens, select Anonymous; the credentials are already passed in the request's Authorization header.

Once connected, Power Query will hit the Jira API and page through the response, building a rich dataset. In the Applied Steps pane on the right, you'll see that the M code added a few extra steps. You can remove these or add steps to clean up the returned dataset.

When you are ready, click the Close & Apply button. The next couple of screenshots are focused on Power BI, but the same data will be available in Excel. Here, you can see the total number of items returned by the calls to the Jira API from Power Query.

One big thing to note: if you don’t need to pull the entire dataset down from Jira, try creating sample JQL queries and copying the URL value to Power Query. I grabbed this screenshot from the Issues page of my project. Note that adding more filters will change the URL to reflect the update. The URL contains the JQL query string.

You’d want to modify this section of the Power Query M code.
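
Before editing the M code, you can sanity-check a JQL string with a quick PowerShell call against the same search endpoint. This is a rough sketch; the URL, project name, and Base64 value are placeholders to replace with your own:

$headers  = @{ Authorization = "Basic yourBase64EncodedCredentials" }
$jql      = [uri]::EscapeDataString('project = YourProject AND status = "In Progress"')
$response = Invoke-RestMethod -Uri "https://YOUR-COMPANY.atlassian.net/rest/api/3/search?jql=$jql" -Headers $headers -Method Get
$response.total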

PowerShell Currency Conversion Using FRED

I’m working on a project that requires currency conversions between the US dollar and the Euro. In the most basic case, the project has two requirements: backfill historical Euro amounts and get current Euro amounts using a transaction date and USD value. Plugging into an API to get the current currency exchange rate is simple enough, but finding an open or free dataset with 10+ years of historical rates was another thing. Google and GPT/LLM results are littered with what appear to be free sites, but they are limited to ~100 API calls, or their dataset is not deep enough for my use case. I landed on a great tool provided by the Federal Reserve named FRED, short for Federal Reserve Economic Data. FRED is a free site with APIs sitting on a treasure trove of data. When I started this project, I simply went to the FRED site and downloaded the dataset I needed (link), but I wanted to ensure that my process was current and could handle new transactions for years to come. Using the FRED API requires signing up for a free account. You will want a FRED API key to follow along with this demo.

What I’m going to demo in this post: creating a FRED account, using PowerShell to read from an Excel file, querying an API, writing back to the Excel file

FRED account:
Visit the https://fred.stlouisfed.org/ site, click My Account (top right), and click Create New Account when the modal opens. After you’ve created an account, navigate to the My Account page and click API Keys in the left nav. On the API Keys page, click the Request API Key button, input some text in the description box, click the agreement checkbox, and then click Request API Key. Link to the API Key page: https://fredaccount.stlouisfed.org/apikeys

For this demo, I’ve created a simple Excel file with the following columns and datatypes: TransDate (date), USD (currency), ConversionDate (date), EUR (currency)

To interact with an Excel file from PowerShell, I went with the ImportExcel module. In VS Code or your IDE of choice, run this command: Install-Module ImportExcel -Scope CurrentUser

To get the ball rolling and make sure the ImportExcel module works, I will test reading from the Excel file, loop through the rows, and output their values.
$excelPath: location of the Excel file
$worksheetName: name of the worksheet/tab where the data is stored (optional)
$excelData: imported Excel data

$excelPath = "C:\code\CurrencyDemo.xlsx"
$worksheetName = "Historical"
$excelData = Import-Excel -Path $excelPath -WorksheetName $worksheetName

foreach($row in $excelData){
    Write-Output "Transction date: $($row.TransDate) Amount: $($row.USD) USD"
}

Next, I will test my connection to the FRED API, returning a sample transaction. There are two important things to note in the next script. The $series variable is bound to the USD to Euro spot exchange rate; if you need to work with a different currency, visit the Daily Rates page and filter by Geography, or use the site search if you cannot find what you are looking for there. If you type Peso in the site search, it will suggest the Mexican Peso to U.S. Dollar. Clicking on the search result will open the page for that conversion, and the page will reveal the $series value needed for the conversion. The Peso to USD series is DEXMXUS (look at the URL or the value to the right of the conversion overview). The other important variable to note is $date; its use is obvious in this example, but you can query the API for larger date ranges if needed and work with the larger API response.

# Your FRED API Key
$apiKey = "75fa2e6ce85_taco_805016ea4d764c5"

# Set the parameters
$series = "DEXUSEU"  # This is the series ID for USD to Euro exchange rate
$date = "2024-01-16"

# Construct the API URL
$url = "https://api.stlouisfed.org/fred/series/observations?series_id=$series&observation_start=$date&observation_end=$date&api_key=$apiKey&file_type=json"

# Make the API request
$response = Invoke-RestMethod -Uri $url -Method Get

# Check if we got a result
if ($response.observations.Count -gt 0) {
    $usd_to_eur_rate = [double]$response.observations[0].value
    $eur_to_usd_rate = [math]::Round(1 / $usd_to_eur_rate, 4)
    Write-Output "The USD to Euro conversion rate on $date was: $usd_to_eur_rate"
    Write-Output "The Euro to USD conversion rate on $date was: $eur_to_usd_rate"
} else {
    Write-Output "No data available for the specified date."
}

In the last script for this demo, I will combine all the parts and provide an example for dealing with input dates that are Saturday or Sunday. From what I’ve learned on this journey, currencies are not typically traded seven days a week, so if an input date falls on a weekend, there needs to be an offset to the preceding Friday. This script must be extended in a production scenario to deal with major holidays.

function CurrencyConversion {
    param (
        $convDate,
        $usdAmount
    )

    # Normalize the input (datetime or date string) to a datetime, then format it for the FRED API
    $parsedDate = [datetime]$convDate
    $apiDateValue = $parsedDate.ToString("yyyy-MM-dd")

    # Your FRED API Key
    $apiKey = "75fa2e6ce85_taco_805016ea4d764c5"
    $seriesId = "DEXUSEU"  # daily USD to Euro spot rate, same series as the earlier example

    # Construct the API URL
    $apiUrl = "https://api.stlouisfed.org/fred/series/observations?series_id=$seriesId&api_key=$apiKey&file_type=json&observation_start=$apiDateValue&observation_end=$apiDateValue"

    # Make the API call
    $response = Invoke-RestMethod -Uri $apiUrl

    # Check if there are any observations for the given date
    if ($response.observations.count -gt 0) {
        # Assuming the first observation is the one we're interested in
        $usd_to_eur_rate = [double]$response.observations[0].value
        $eur_to_usd_rate = [math]::Round(1 / $usd_to_eur_rate, 4)
    }
    else {
        Write-Host "No data found for ................................................ $parsedDate"
    }

    $convertedValue = $usdAmount * $eur_to_usd_rate
    return $convertedValue
    
}

function DateConversion {
    param (
        $conversionDate
    )

    # Check if 'conversionDate' is not null or empty
    if (-not [string]::IsNullOrWhiteSpace($conversionDate)) {
        # Parse the input date into a datetime object
        $targetDate = [datetime]::Parse($conversionDate)

        # If the date falls on a weekend, shift it back to the preceding Friday
        if ($targetDate.DayOfWeek -eq [DayOfWeek]::Saturday) {
            $conversionDate = $targetDate.AddDays(-1)
        }
        elseif ($targetDate.DayOfWeek -eq [DayOfWeek]::Sunday) {
            $conversionDate = $targetDate.AddDays(-2)
        }
        else {
            $conversionDate = $targetDate
        }
    }

    return $conversionDate
}

$excelPath = "C:\code\CurrencyDemo.xlsx"
$worksheetName = "Historical"
$excelData = Import-Excel -Path $excelPath -WorksheetName $worksheetName

foreach ($row in $excelData) {

    $transDate = $row.TransDate
    $amountUSD = $row.USD
    $submittedDate = $null

    # Get the date for the currency conversion
    if (-not [string]::IsNullOrWhiteSpace($transDate)) {
        $submittedDate = DateConversion -conversionDate $transDate
    }

    # Check if both Submitted Date and USD are not null or empty
    if (-not [string]::IsNullOrWhiteSpace($submittedDate) -and 
        -not [string]::IsNullOrWhiteSpace($amountUSD)) {
        $convertedValue = CurrencyConversion -convDate $submittedDate -usdAmount $amountUSD
    }

    Write-Output "Converted value for $($amountUSD) USD on $($submittedDate.ToShortDateString()): $convertedValue"

    #update the excel row with the output
    $row.EUR = $convertedValue
    $row.ConversionDate = $submittedDate
}

# Export the updated data to Excel
$excelData | Export-Excel -Path $excelPath -WorksheetName $worksheetName 

To streamline the script, I created two helper functions. One handles the weekend-to-Friday offset, and the other makes the API call to FRED. The script loops over all of the rows in the spreadsheet, handles the currency conversion, and then writes the output back to the target Excel file in one pass. The highlighted values denote where a weekend date was passed in and the script handled the offset to the preceding Friday.

Yes, some places in the script need improvement, but I wanted to provide a simple example for handling currency conversion with PowerShell and historical dates. As always, please don’t hesitate to reach out or leave a comment if any part of this doesn’t make sense or if there’s a more-better way of doing things.

Note:
Be mindful of the number of calls you make to the API in a given timeframe. I was testing this process, hammered on the API with ~1,000 calls, and hit an API limit error. Adding a simple pause to the script fixed the problem, e.g., after X calls, pause for Y seconds.
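
Here is a minimal sketch of that idea in PowerShell; the call count and sleep duration are arbitrary placeholders, not FRED's documented limits:

$callCount = 0
foreach ($row in $excelData) {
    # ... call CurrencyConversion / the FRED API for this row ...
    $callCount++
    if ($callCount % 100 -eq 0) {
        # pause after every 100 calls to stay under the rate limit
        Start-Sleep -Seconds 30
    }
}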

Power Automate: Sending Emails Using Excel Data

Recently, a user asked me how they could send emails using a flow, with Excel as the data source. I’m going to provide an in-depth guide that covers every step needed to accomplish this.

What’s needed to follow along:
Access to a Power Platform environment
URL: https://make.powerautomate.com/
OneDrive (but you can use SharePoint)
Excel file

Open a new Excel file and populate it with the following columns:
RowID, Employee Name, Manager Email, Email Sent
Enter data in each of the cells, but ensure that RowID has a unique value for each row.

After the data is entered, you will want to create a table encompassing the cells that were just populated. With one of the cells selected, click the Insert tab at the top of the file, then click Table.

When the Create Table popup opens, ensure it includes the rows and columns you created, and the My table has headers box should be checked. Click Ok to close the popup.

From the File tab, click Save As and save the file to a OneDrive location. You can save the file to your desktop or another location, then copy it to a folder in OneDrive. It makes no difference how the file gets there, but for this flow to work, it needs to be in OneDrive or a SharePoint location you have access to.

From a browser, navigate to https://make.powerautomate.com/. On the left side of the screen, click on My flows. After the page refreshes, click + New flow and select Instant cloud flow.

In the Flow name field, input a name for your flow. From the Choose how to trigger this flow options, select Manually trigger a flow, then click the Create button.

In the flow design canvas, click the + below the Manually trigger a flow action, and select Add an action.

When the Add an action window opens on the left side of the screen, you will notice that you have a bunch of actions to choose from. The first box can be used to search for actions; here, enter Excel list rows. Note how the actions are grouped by connector type, in our case, Excel Online. The other key thing to note here is the See more link (blue text) to the right of the Excel Online group. If you don't see the action you are looking for, always remember to click the See more link. Doing this will display all available actions for the group. Go ahead and click on the List rows present in a table action.

From the Location dropdown, select OneDrive if that's where you saved your file; otherwise, select SharePoint or wherever you saved the file. For this example, the file must be in a storage location the flow can connect to. Next, click the dropdown for the Document Library. In my example, you'll notice that I have several options to choose from; if you get the same result, you'll need to select each option and then click the File dropdown to see if you are in the correct location. Yes, it's annoying.

Once you have the correct Document Library selected, click the dropdown for the File option, navigate to where your Excel file is stored, and select it.

From the Table dropdown, select the available option. In my example, the only table from the Excel file is Table2. There is a good chance yours is named something else.

You can verify the table name by returning to your Excel file, clicking the Table Design tab, and then noting the Table Name value.

Returning to the flow design canvas, click the + Add an action below the Excel action.

From the search box, enter send email. With the list of actions narrowed down, select the Send an email (v2) option. If you are reading this in the future, the option might be (V something else), but make sure you are in the Office 365 Outlook group of actions. Do NOT use the action from the Mail group.

In the Send an email action, click in the To field. If the dynamic content menu is not visible, click Enter custom value, then the little lightning bolt icon.

When the dynamic content window opens, you will see a few options for fields you can select to populate the To value of the Send an email action. From the list of options, select Manager Email. This will pull in the manager’s email from the spreadsheet.

Click in the Subject field and enter some text; here, you will notice I input Example. After you’ve entered some text, click the lightning bolt icon again, but this time select Employee Name.

The last thing we will populate in this action is the Body of the email. Again, feel free to input some text here, and like the other fields, you can use values from the dynamic content menu to use values from the Excel file.

The completed email action will look like this.

You will notice that a For each was automatically added to the flow design canvas. Why? If you think about what the flow is doing from a process standpoint, it added the For each to loop over each row in the Excel file. For each row in the spreadsheet, do ____. In our example, it will send an email, and each email will reference the current item in the loop.

Below the Send an email action, click + Add an action.

From the add an action search box, enter excel update row, and select the Update a row action. We will use this action to update the spreadsheet for each email sent.

In the Update a row action, navigate to where your Excel file is stored. The steps you followed when connecting to the file a few steps back in this process will apply here. Once you’ve connected to the file, click in the Key Column and select RowID from the available choices.

Click into the Key Value field, open the dynamic content window, and select RowID. We are telling the action that we want to update the Excel file row corresponding to the current item in the For each loop.

Click the dropdown for the Advanced parameters field and select Email Sent. I entered Yes in the Email Sent field.

The completed flow should look like this: We trigger the flow, get the table from the Excel file, loop over each row in the table (for each), send an email, and update the spreadsheet for each item in the loop.

At the top of the screen, click Save, then Test.

Select Manually when the next window opens, click Continue, and last but not least, click Run flow.

When the flow is finished running, you should see green check marks next to each flow action.

The emails were sent to each manager with their employee's name in the subject and body of each email. To test sending emails, I like to use https://temp-mail.org/en/

Navigating back to the Excel file, the Email sent value is Yes for each row in the file.

That’s it! There are lots of steps, but I hope it covers everything you need to create a workflow that does the exact same thing.

Use Python to upload a LARGE file to SharePoint

In this post, I will quickly show how to use the Office365-REST-Python-Client library to upload a large file to a SharePoint library.

For this to work, you will need a certificate, an Azure App registration, and access to the target SharePoint site. I outlined all the necessary parts in this post: Modernizing Authentication in SharePoint Online. Note: the linked post outputs a .PFX cert, and the script below needs a .PEM cert. You can use this Python script to convert the cert:

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.serialization import pkcs12

# Load the PFX file
pfx_file = open('C:\\path_to_cert\\EXAMPLE.pfx', 'rb').read()  # replace with your pfx file path
(private_key, certificate, additional_certificates) = pkcs12.load_key_and_certificates(pfx_file, None, default_backend())

with open('NewCERT.pem', 'wb') as f:
    f.write(private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption()
    ))
    f.write(certificate.public_bytes(serialization.Encoding.PEM))

# install this library if needed
# pip install cryptography

Ok, with that out of the way, you can use this script to upload to a SharePoint library. In the script, I’ve commented out the line that would be used to upload to a folder within a library.

import os
from office365.sharepoint.client_context import ClientContext

cert_credentials = {
    "tenant": "abcxyz-1234-4567-8910-0e3d638792fb",
    "client_id": "abcddd-4444-4444-cccc-123456789111",
    "thumbprint": "7D8D8DF7D8D2F4DF8DF45D4FD8FD48DF5D8D",
    "cert_path": "RestClient\\NewCERT.pem"
}
ctx = ClientContext("https://tacoranch.sharepoint.com/sites/python").with_client_certificate(**cert_credentials)
current_web = ctx.web.get().execute_query()
print("{0}".format(current_web.url))

filename = "LargeExcel.xlsx"
folder_path = "C:\\code\py"

def print_upload_progress(offset):
    # type: (int) -> None
    file_size = os.path.getsize(local_path)
    print(
        "Uploaded '{0}' bytes from '{1}'...[{2}%]".format(
            offset, file_size, round(offset / file_size * 100, 2)
        )
    )

#upload to a folder
#target_url = "Shared Documents/folderA/folderB"

target_url = "Shared Documents"
target_folder = ctx.web.get_folder_by_server_relative_url(target_url)
size_chunk = 1000000
local_path = os.path.join(folder_path, filename)
with open(local_path, "rb") as f:
    uploaded_file = target_folder.files.create_upload_session(
        f, size_chunk, print_upload_progress, filename
    ).execute_query()

print("File {0} has been uploaded successfully".format(uploaded_file.serverRelativeUrl))

If you receive an error stating you don’t have access, double-check that you’ve added the App Registration to the target SharePoint site permissions. Again, this is noted in the blog post linked at the beginning of this post.

Consider this a workaround until MS Graph is out of its latest beta and there’s more support for easily uploading to SharePoint.

What if you need to upload a file and set a column value? When working with SharePoint via the API, you must be mindful of the column names. The column name in the UI might not be the same as the internal name, so I will use the script above as my starting point and add the following script to the end. In this example, I’m setting two fields: ReportName and ReportDate.

import datetime

#get the file that was just uploaded
file_item = uploaded_file.listItemAllFields

# Define a dictionary of field names and their new values
fields_to_update = {
    "ReportName": "My TPS Report",
    "ReportDate": datetime.datetime.now().isoformat(),
    # Add more fields here as needed
}

# Iterate over the dictionary and update each field
for field_name, new_value in fields_to_update.items():
    file_item.set_property(field_name, new_value)

# Commit the changes
file_item.update()
ctx.execute_query()

print("Report fields were updated")

How do you get a list of all the columns in a list or library? The script below will output all the column’s internal and display names.

from office365.sharepoint.client_context import ClientContext

cert_credentials = {
    "tenant": "abcxyz-1234-4567-8910-0e3d638792fb",
    "client_id": "abcddd-4444-4444-cccc-123456789111",
    "thumbprint": "7D8D8DF7D8D2F4DF8DF45D4FD8FD48DF5D8D",
    "cert_path": "RestClient\\NewCERT.pem"
}

ctx = ClientContext("https://tacoranch.sharepoint.com/sites/python").with_client_certificate(**cert_credentials)
current_web = ctx.web.get().execute_query()
print("{0}".format(current_web.url))

# Get the target list or library
list_or_library = ctx.web.lists.get_by_title('TPS-Reports')

# Load the fields
fields = list_or_library.fields.get().execute_query()

# Print the field names
for field in fields:
    print("Field internal name: {0}, Field display name: {1}".format(field.internal_name, field.title))



SharePoint Audit Using Purview

Today, I had a customer ask the following question:
“How can I pull a report showing all of the files in a library that have not been viewed?”
Typically, users request a report showing all the items or files accessed, so this request was unique.

You can run an audit search on almost everything in a cloud tenant using Microsoft Purview. Every file download, file view, list item open, edit, delete, page view, OneDrive action, and Teams action is captured, and the list goes on and on.

In Purview, click on the Audit link in the left nav, and it will open the audit search page.
Select the time range you want to target
Activities: Accessed file
Record types: SharePointFileOperation
Search name: this can be anything you want, i.e. SP Library Search
File, folder, or site: https://taco.sharepoint.com/sites/test/TheLibrary/*
Workload: SharePoint
The key items to note are the Record types and the File, folder, or site options. You can use a wildcard * to return results for everything in the target library. This will return a lot of information, so you will need to filter the results after the report is downloaded. Once you’ve populated the fields, click Search, then wait a bit for it to complete. The amount of data in your tenant and current workloads will determine how long the search takes.


The completed search will look like this:

Clicking on the report name will open a detailed view of the audit search. From the results page, click the Export button and wait a few minutes for the file to be generated. If the page gets stuck at 0%, refresh your browser, and it should trigger the download.

Next, I needed to get all the files in the SharePoint library. To do this, I used PowerShell to connect to the target site and then downloaded the file info to a CSV.

# Connect to the SharePoint site interactively
Connect-PnPOnline -Url "https://taco.sharepoint.com/sites/test" -Interactive

# Specify the target library
$libraryName = "TheLibrary"

# Get all files from the target library
$files = Get-PnPListItem -List $libraryName -Fields "FileLeafRef", "ID", "FileRef", "Created", "Modified", "UniqueId", "GUID"

# Create an empty array to store the file metadata
$fileMetadata = @()

# Loop through each file and extract the relevant metadata
foreach ($file in $files) {
    $fileMetadata += [PSCustomObject]@{
        FileName = $file["FileLeafRef"]
        ID = $file["ID"]
        GUID = $file["GUID"]
        UniqueId = $file["UniqueId"]
        URL = $file["FileRef"]
        Created = $file["Created"]
        Modified = $file["Modified"]
    }
}

# Export the file metadata to a CSV file
$fileMetadata | Export-Csv -Path "C:\code\library_audit.csv" -NoTypeInformation

If you take anything away from this post, please take note of this: Purview uses a field named ListItemUniqueId to identify a SharePoint file or list item. My first thought was to use the GUID from the SharePoint library to match up to the Purview data. This is 100% incorrect! From SharePoint, UniqueId is the field that matches the Purview field ListItemUniqueId.

Basic logic:
SELECT SharePoint.*
FROM SharePoint
INNER JOIN Purview
ON SharePoint.UniqueId = Purview.ListItemUniqueId
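
That same left-anti logic can also be done directly in PowerShell against the two CSV exports if you prefer to skip the BI tooling. A rough sketch, assuming the Purview export has already been flattened so ListItemUniqueId is its own column; the file paths are placeholders:

$spFiles = Import-Csv "C:\code\library_audit.csv"
$purview = Import-Csv "C:\code\Purview_Audit_Flattened.csv"

# Build a lookup of every ListItemUniqueId that appears in the audit data
$accessed = @{}
foreach ($row in $purview) { $accessed[$row.ListItemUniqueId.ToString().ToLower()] = $true }

# Files with no matching audit record were not accessed in the reporting window
$notAccessed = $spFiles | Where-Object { -not $accessed.ContainsKey($_.UniqueId.ToString().ToLower()) }
$notAccessed | Export-Csv "C:\code\not_accessed.csv" -NoTypeInformation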

I used Power BI to format and mash the exported Purview data with the SharePoint data. Power BI is unnecessary; you can easily use Power Query in Excel to do the same thing. Below is my M code that parses the JSON from the Purview export, counts how many times each file was accessed, and captures the last time it was accessed.

let
    Source = Csv.Document(File.Contents("C:\ian240329\Purview_Audit.csv"),[Delimiter=",", Columns=8, Encoding=1252, QuoteStyle=QuoteStyle.None]),
    #"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"RecordId", type text}, {"CreationDate", type datetime}, {"RecordType", Int64.Type}, {"Operation", type text}, {"UserId", type text}, {"AuditData", type text}, {"AssociatedAdminUnits", type text}, {"AssociatedAdminUnitsNames", type text}}),
    #"Parsed JSON" = Table.TransformColumns(#"Changed Type",{{"AuditData", Json.Document}}),
    #"Expanded AuditData" = Table.ExpandRecordColumn(#"Parsed JSON", "AuditData", {"ListItemUniqueId", "SourceFileExtension", "SourceFileName", "ObjectId"}, {"ListItemUniqueId", "SourceFileExtension", "SourceFileName", "ObjectId"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded AuditData",{"AssociatedAdminUnits", "AssociatedAdminUnitsNames", "RecordId"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"ObjectId", "File URL"}, {"SourceFileName", "File Name"}, {"SourceFileExtension", "File Extension"}}),
    #"Filtered Rows" = Table.SelectRows(#"Renamed Columns", each ([File Extension] <> "aspx")),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each true),
    #"Removed Columns1" = Table.RemoveColumns(#"Filtered Rows1",{"Operation", "RecordType"}),
    #"Grouped Rows" = Table.Group(#"Removed Columns1", {"ListItemUniqueId"}, {{"View Count", each Table.RowCount(_), Int64.Type}, {"Last Viewed", each List.Max([CreationDate]), type nullable datetime}})
in
    #"Grouped Rows"

Still working in Power Query, I created a new query to show which SharePoint files had not been accessed. My Purview license is limited to 6 months' worth of data, so this is one hindrance to painting a full picture of what has/has not been accessed.

let
    Source = Table.NestedJoin(SharePoint_Audit, {"UniqueId"}, Purview_Audit, {"ListItemUniqueId"}, "Purview_Audit", JoinKind.LeftAnti),
    #"Expanded Purview_Audit" = Table.ExpandTableColumn(Source, "Purview_Audit", {"File Name"}, {"Purview_Audit.File Name"}),
    #"Sorted Rows1" = Table.Sort(#"Expanded Purview_Audit",{{"Purview_Audit.P File Name", Order.Descending}}),
    #"Filtered Rows1" = Table.SelectRows(#"Sorted Rows1", each true)
in
    #"Filtered Rows1"

With this data, I can now report to the user what files have not been accessed in the past 6 months.


Using New-PnPSite With A Multi Geo Tenant

If you try to run the PnP PowerShell command New-PnPSite using a managed identity or App Registration in a multi-geo tenant, it will create the site in the default geo. To get around this, you can use the PreferredDataLocation parameter to set the desired location, but you’ll also need to update your MS Graph permissions.

If you run the New-PnPSite command with the -PreferredDataLocation parameter and your permissions are not correct, you will receive this error:

Code: Authorization_RequestDenied Message: The requesting principal is not authorized to set group preferred data location.

Open your App Registration and add the following MS Graph application permissions:
Group.Create, Group.ReadWrite.All, Directory.ReadWrite.All

New-PnPSite -Type TeamSite -PreferredDataLocation NAM -Title "Test" -Alias "Test0001" -Description "my test site" -Owners email@domain.com -Wait

Other people who had the same issue:
https://github.com/pnp/powershell/issues/2629
https://github.com/pnp/PnP-PowerShell/issues/2682
https://learn.microsoft.com/en-us/answers/questions/1099399/unable-to-create-modern-team-site-using-pnp-powers

Complete list of the geo codes can be found here:
https://learn.microsoft.com/en-us/microsoft-365/enterprise/multi-geo-add-group-with-pdl?view=o365-worldwide#geo-location-codes


Modernizing Authentication in SharePoint Online

Starting a year or two ago, Microsoft announced it would stop supporting, and eventually block access to, Azure Access Control Services (ACS) and the SharePoint Add-In model. This is important because ACS has been used for many years to grant app/script API access to a SharePoint site(s), and you likely have many sites where this has been used. Moving forward, Microsoft Entra ID (Azure AD) app registrations will be used in place of ACS.

Historically, you would start the permissions journey by generating the client ID and secret at this endpoint:
_layouts/15/AppRegNew.aspx
From there, you grant the newly created identity access to a tenant, sites, lists, libraries, or a combination.
_layouts/15/appinv.aspx
The other option was to create an Azure App Registration and then grant it access to the target objects. When working with SharePoint Online and AppRegNew.aspx, the App Registration is generated automatically. Depending on what is/is not configured, this can be an issue and set off alarms in Azure.

With that out of the way, how do you wire up a new connection to SharePoint Online, allowing PowerShell, Python, script, or app access to the SharePoint API or the Microsoft Graph API?

Overview of what I’m doing:
Create a self-signed cert
Add the cert to your personal cert store and upload it to Azure, creating an App Registration
Adjust permissions as needed
Grant the App Registration access to a specific SharePoint site
Use the newly created credentials to access the SharePoint site

#create cert with a password
New-PnPAzureCertificate `
    -CommonName "Demo_SP_Azure_2024" `
    -OutPfx c:\code\Demo_SP_Azure_2024.pfx `
    -OutCert c:\code\Demo_SP_Azure_2024.cer `
    -CertificatePassword (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force) `
    -ValidYears 1

#import the cert. for this to work, run as Admin.
Import-PfxCertificate `
    -Exportable `
    -CertStoreLocation Cert:\LocalMachine\My `
    -FilePath c:\code\Demo_SP_Azure_2024.pfx `
    -Password (ConvertTo-SecureString -String "Taco@Good" -AsPlainText -Force)

I highly suggest not skipping the -Interactive part of the next command. It will open a browser window where you must authenticate with an account that has adequate permissions to create a new App Registration in Azure. The most important thing to note in the script is SharePointApplicationPermissions "Sites.Selected". Why is this important? It is extremely useful if you want to limit permissions to a single SharePoint site and not every site in the tenant.

Register-PnPAzureADApp `
   -ApplicationName Demo_SP_Azure_2024 `
   -Tenant tacoranch.onmicrosoft.com `
   -Store CurrentUser `
   -SharePointApplicationPermissions "Sites.Selected" `
   -Interactive

After that runs, the output will include the Client ID and Certificate Thumbprint. Take note of them, or you can open Azure, navigate to App Registrations, and select All applications. In the left nav, click Certificates & secrets, where you’ll find the thumbprint; again, in the left nav, click Overview, and you’ll see the Application ID, aka Client ID.

In the next two commands, you will connect to the SharePoint Admin site interactively, then grant the newly created App Registration write access to the target SharePoint site.

Connect-PnPOnline -Url "https://tacoranch-admin.sharepoint.com" `
    -Interactive

#grant the App Reg write access to the site
Grant-PnPAzureADAppSitePermission `
    -AppId 'a95ddafe-6eca-488a-b26a-dc62a64d9105' `
    -DisplayName 'Demo_SP_Azure_2024' `
    -Site 'https://tacoranch.sharepoint.com/sites/cert-demo' `
    -Permissions Write
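
If you want to confirm the grant landed, you can also use Get-PnPAzureADAppSitePermission as a quick check against the same site:

Get-PnPAzureADAppSitePermission -Site 'https://tacoranch.sharepoint.com/sites/cert-demo'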


Now that the App Registration can access the SharePoint site, how do you connect to it using the certificate and thumbprint?

Connect-PnPOnline `
    -Tenant "tacoranch.onmicrosoft.com" `
    -Url "https://tacoranch.sharepoint.com/sites/cert-demo" `
    -ClientId "a95ddafe-6eca-488a-b26a-dc62a64d9105" `
    -Thumbprint "5C5891197B54B9535D171D6D9DD7D6D351039C8E" 

Get-PnPList | Select-Object Title

Using the above commands, you can create a cert, register it in Azure, and grant access to a single SharePoint site.

I’ve included a copy of the full script here:
https://www.sharepointed.com/wp-content/uploads/2024/03/Azure-App-Reg-Cert-Demo.txt

Error(s) and fixes:
Error:
Grant-PnPAzureADAppSitePermission: {"error":{"code":"accessDenied","message":"Access denied","innerError":{"date":"2024-03-21T16:29:47","request-id":"","client-request-id":""}}}
Fix:
Ensure the account running the Grant-PnPAzureADAppSitePermission command has access to the target SharePoint site.

Error:
Connect-PnPOnline: A configuration issue is preventing authentication - check the error message from the server for details. You can modify the configuration in the application registration portal. See https://aka.ms/msal-net-invalid-client for details. Original exception: AADSTS700027: The certificate with identifier used to sign the client assertion is not registered on application. [Reason - The key was not found., Thumbprint of key used by client: '6C5891197B54B9535D179D6D9DD7D6D351039D8Z', Please visit the Azure Portal, Graph Explorer or directly use MS
Graph to see configured keys for app Id 'eb7f9fbc-f4ee-4a48-a008-49d6bcdc6c40'. Review the documentation at https://docs.microsoft.com/en-us/graph/deployments to determine the corresponding service endpoint and https://docs.microsoft.com/en-us/graph/api/application-get?view=graph-rest-1.0&tabs=http to build a query request URL, such as 'https://graph.microsoft.com/beta/applications/aeb7f9fbc-f4ee-4a48-a008-49d6bcdc6c40']. Trace ID: 3e1e7ab3-60c0-4126-acb8-e2fdb6e28000 Correlation ID: 62ddc80b-aeb2-49c5-8c31-83a04e70bf6e Timestamp: 2024-04-01 12:21:59Z

Fix:
This one is easy; ensure you use the correct Client ID and thumbprint. You can get these from the App Registration page in the Azure portal.
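
A quick way to see which certificate thumbprints are actually in your local store (assuming the cert was imported into LocalMachine\My as shown earlier):

Get-ChildItem Cert:\LocalMachine\My | Select-Object Subject, Thumbprint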

How to upload a large file to SharePoint using the Microsoft Graph API

What started as a simple question from a co-worker turned into a rabbit hole exploration session that lasted a bit longer than anticipated. ‘Hey, I need to upload a report to SharePoint using Python.’

In the past, I’ve used SharePoint Add-in permissions to create credentials allowing an external service, app, or script to write to a site, library, list, or all of the above. However, the environment I’m currently working in does not allow Add-in permissions, and Microsoft has been slowly deprecating the service for a long time.

As of today (March 18, 2024), this is the only way I could find to upload a large file to SharePoint. Using the MS Graph SDK, you can upload files smaller than 4 MB, but that is useless in most cases.

For the script below, the following items are needed:
Azure App Registration:
Microsoft Graph application permissions:
Files.ReadWrite.All
Sites.ReadWrite.All
SharePoint site
SharePoint library (aka drive)
File to test with

import requests
import msal
import atexit
import os.path
import urllib.parse
import os

TENANT_ID = '19a6096e-3456-7890-abcd-19taco8cdedd'
CLIENT_ID = '0cd0453d-cdef-xyz1-1234-532burrito98'
CLIENT_SECRET  = '.i.need.tacos-and.queso'
SHAREPOINT_HOST_NAME = 'tacoranch.sharepoint.com'
SITE_NAME = 'python'
TARGET_LIBRARY = 'reports'
UPLOAD_FILE = 'C:\\code\\test files\\LargeExcel.xlsx'
UPLOAD_FILE_NAME = 'LargeExcel.xlsx'
UPLOAD_FILE_DESCRIPTION = 'A large excel file' #not required

AUTHORITY = 'https://login.microsoftonline.com/' + TENANT_ID
ENDPOINT = 'https://graph.microsoft.com/v1.0'

cache = msal.SerializableTokenCache()

if os.path.exists('token_cache.bin'):
    cache.deserialize(open('token_cache.bin', 'r').read())

atexit.register(lambda: open('token_cache.bin', 'w').write(cache.serialize()) if cache.has_state_changed else None)

SCOPES = ["https://graph.microsoft.com/.default"]

app = msal.ConfidentialClientApplication(CLIENT_ID, authority=AUTHORITY, client_credential=CLIENT_SECRET, token_cache=cache)

result = None
result = app.acquire_token_silent(SCOPES, account=None)

drive_id = None

if result is None:
    result = app.acquire_token_for_client(SCOPES)

if 'access_token' in result:
    print('Token acquired')
else:
    print(result.get('error'))
    print(result.get('error_description'))
    print(result.get('correlation_id')) 

if 'access_token' in result:
    access_token = result['access_token']
    headers={'Authorization': 'Bearer ' + access_token}

    # get the site id
    result = requests.get(f'{ENDPOINT}/sites/{SHAREPOINT_HOST_NAME}:/sites/{SITE_NAME}', headers=headers)
    result.raise_for_status()
    site_info = result.json()
    site_id = site_info['id']

    # get the drive / library id
    result = requests.get(f'{ENDPOINT}/sites/{site_id}/drives', headers=headers)
    result.raise_for_status()
    drives_info = result.json()
    
    for drive in drives_info['value']:
        if drive['name'] == TARGET_LIBRARY:
            drive_id = drive['id']
            break

    if drive_id is None:
        print(f'No drive named "{TARGET_LIBRARY}" found')

    # upload a large file to
    file_url = urllib.parse.quote(UPLOAD_FILE_NAME)
    result = requests.post(
        f'{ENDPOINT}/drives/{drive_id}/root:/{file_url}:/createUploadSession',
        headers=headers,
        json={
            '@microsoft.graph.conflictBehavior': 'replace',
            'description': UPLOAD_FILE_DESCRIPTION,
            'fileSystemInfo': {'@odata.type': 'microsoft.graph.fileSystemInfo'},
            'name': UPLOAD_FILE_NAME
        }
    )

    result.raise_for_status()
    upload_session = result.json()
    upload_url = upload_session['uploadUrl']

    st = os.stat(UPLOAD_FILE)
    size = st.st_size
    CHUNK_SIZE = 10485760
    chunks = (size + CHUNK_SIZE - 1) // CHUNK_SIZE  # number of chunks, rounded up
    with open(UPLOAD_FILE, 'rb') as fd:
        start = 0
        for chunk_num in range(chunks):
            chunk = fd.read(CHUNK_SIZE)
            bytes_read = len(chunk)
            upload_range = f'bytes {start}-{start + bytes_read - 1}/{size}'
            print(f'chunk: {chunk_num} bytes read: {bytes_read} upload range: {upload_range}')
            result = requests.put(
                upload_url,
                headers={
                    'Content-Length': str(bytes_read),
                    'Content-Range': upload_range
                },
                data=chunk
            )
            result.raise_for_status()
            start += bytes_read

else:
    raise Exception('no access token')

In the script, I’m uploading the LargeExcel file to a library named reports in the python site. It is important to note that the words drive and library are used interchangeably when working with MS Graph. If you see a script example that does not specify a target library but only uses root, it will write the files to the default Documents / Shared Documents library.
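
To make the root-versus-named-library distinction concrete, here are the two endpoint shapes side by side, sketched in PowerShell with placeholder IDs (the Python script above uses the second form):

$endpoint = 'https://graph.microsoft.com/v1.0'
$siteId   = '<site-id>'      # placeholder
$driveId  = '<drive-id>'     # placeholder

# default Documents / Shared Documents library of the site
"$endpoint/sites/$siteId/drive/root:/LargeExcel.xlsx:/createUploadSession"

# a specific library (drive), addressed by its drive id
"$endpoint/drives/$driveId/root:/LargeExcel.xlsx:/createUploadSession"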

Big thank you to Keath Milligan for providing the foundation of the script.
https://gist.github.com/keathmilligan/590a981cc629a8ea9b7c3bb64bfcb417

How to Find Your Microsoft Forms Data: Locating the Linked Excel File

This started as a simple question: where is the backend Excel file for my group Forms form stored?

By default, a group form will save responses to an Excel file in the SharePoint site associated with the group. Within that site, the file is stored in the Documents, aka Shared Documents library.

Here is a quick way to track down the file:

With the form open, click on Responses.

Click on Open in Excel.

Depending on how your SharePoint library is configured, the file will either download to your computer or open in the browser. Open the file and click on the name or click the down arrow next to it.

When the window opens, it will show exactly where the file is stored.

In this example, the file is stored in the Shared Documents library on the Testing site. Again, this example shows Shared Documents, but on the site, it’s actually named Documents.

STOP, I don’t see the window noted in the above screenshot! This more than likely means you are working with a personal form.

Where is the Excel file stored for personal forms? Historically, not anywhere worthwhile: the file was more or less saved with the form and was inaccessible other than by downloading it. Personal Forms response data is now stored in the form author’s OneDrive / SharePoint site.

What if I copy my personal form to a group? What will happen to the Excel file?
Don’t do this; just recreate the form from scratch. The copied form will retain the behavior of storing the file with the form, not in SharePoint.

How can I save form responses to a SharePoint list or Dataverse table? You would need to create a Flow to intercept the form response and then save it to the destination.

Will creating a Flow that saves form responses to another destination impact the form saving to Excel? No, the form will always use the backend Excel file as its data storage.

If I download a copy of the backend Excel file, will the downloaded copy be updated with new form submissions? No, the copy is disconnected from the source.