Microsoft Power Automate (Flow) has a hard time with files over 1 GB, and depending on the business process, this can cause a lot of headaches. Because of this, I opted to use an Azure runbook to download the files, then use a Flow to handle downstream processing. The scope of this post is to show how to download files from an SFTP server and move them to blob storage.
Tooling being used:
Azure Resource Group
Azure Automation Account
Posh-SSH Module
Azure Runbook (PowerShell) + Hybrid Worker
Azure Storage Account
Overview of what the script does:
Connect to Azure
Connect to the SFTP server
Get a list of files from the server where the size is greater than 0
Loop through the files
Create a temp folder
Download the file from SFTP to the temp folder
Create a new blob using the file from the temp folder
Check if the blob was created
If yes – remove the associated file from the SFTP server
Clean up the SFTP session
#sftp items
$port = "22"
$Password = "my password"
$User = "sftp-user"
$hostName = "sftp.sharepointed.com"
$SftpPath = '/sftp_home/out/bigfiles/'
#blob storage related variables
$container = "mycontainer"
$blobFilePath = "myfolder/newfiles"
#get connection setup
$account = Connect-AzAccount -Identity -Confirm:$false
Write-Output $account
$errors = @()
try {
    $accKey = (Get-AzStorageAccountKey -ResourceGroupName "my rg name" -Name "my value")[0].Value
    $context_storageAcct = New-AzStorageContext -StorageAccountName "my value" -StorageAccountKey $accKey
    Write-Output $context_storageAcct
}
catch {
    $errors += [PSCustomObject]@{Item = "azcontext"; Error = $_.Exception }
    Write-Output "error getting the Az storage context"
}
#sftp setup
$secpasswd = ConvertTo-SecureString $Password -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential($User, $secpasswd)
write-output "====== Starting SFTP Session on $($hostName)"
$ssh = New-SFTPSession -ComputerName $hostName -Credential $Credentials -Port $port -Force -Verbose
write-output " +++ SFTP Session started on $($hostName)"
#get files from sftp
$listOfFiles = Get-SFTPChildItem -SessionId $ssh.SessionId -Path $SftpPath
$filteredFiles = @($listOfFiles | Where-Object { $_.Attributes.Size -gt 0 })
foreach ($file in $filteredFiles) {
    try {
        $blobPath = $blobFilePath + "/" + $file.Name
        $localFolderBase = [System.IO.Path]::GetFileNameWithoutExtension($file.Name)
        $localFolder = $env:temp + "\" + $localFolderBase
        $fileToCopy = $localFolder + "\" + $file.Name
        #create new folder to hold the downloaded file
        New-Item -Path $localFolder -ItemType Directory -Force -Confirm:$false | Out-Null
        #download file to local storage
        Get-SFTPFile -SessionId $ssh.SessionId -RemoteFile $file.FullName -LocalPath $localFolder
        #move file to container
        Set-AzStorageBlobContent -File $fileToCopy -Container $container -Blob $blobPath -StandardBlobTier Cool -Context $context_storageAcct.Context -Force -Confirm:$false
        #check if file was saved to the container
        $blob = Get-AzStorageBlob -Blob $blobPath -Container $container -Context $context_storageAcct.Context -ErrorAction Ignore
        if ($blob) {
            Write-Output "Blob can be removed from SFTP"
            Remove-SFTPItem -SessionId $ssh.SessionId -Path $file.FullName -Force
        }
    }
    catch {
        Write-Output $_.Exception
    }
}
#terminate the SFTP session
Remove-SFTPSession -SessionId $ssh.SessionID
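One thing the script does not do is clean up after itself: each loop iteration creates a folder under $env:temp that is never removed, which can add up on a long-lived Hybrid Worker. A minimal cleanup sketch (my addition, not part of the original script) that could run at the end of the try block, after the blob check:

```powershell
# Hypothetical cleanup step: once the blob upload is confirmed,
# the local copy is no longer needed, so remove the temp folder.
# $localFolder is the per-file temp folder created inside the loop.
if (Test-Path -Path $localFolder) {
    Remove-Item -Path $localFolder -Recurse -Force -Confirm:$false
}
```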
The script can easily be expanded to trigger another runbook or move the files to SharePoint. One thing to note: my runbooks run on a Hybrid Worker, which helps a lot with long-running or memory-intensive jobs.
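As a sketch of the "trigger another runbook" idea, the Az.Automation module's Start-AzAutomationRunbook cmdlet can queue a downstream runbook once the transfer finishes. The resource group, automation account, runbook, and hybrid worker group names below are placeholders, and the parameter names are whatever the downstream runbook defines:

```powershell
# Hypothetical hand-off to a downstream runbook (names are placeholders).
# The -Parameters hashtable keys must match the target runbook's parameters.
$params = @{
    ContainerName = $container
    BlobFolder    = $blobFilePath
}
Start-AzAutomationRunbook -ResourceGroupName "my rg name" `
    -AutomationAccountName "my automation account" `
    -Name "Process-NewBlobs" `
    -Parameters $params `
    -RunOn "my hybrid worker group"
```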
Update / Edit –
You’ll need the Posh-SSH module to be enabled in your automation account. How-to on enabling modules: https://learn.microsoft.com/en-us/azure/automation/shared-resources/modules
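If you prefer to script the module import instead of using the portal, the New-AzAutomationModule cmdlet can pull Posh-SSH from the PowerShell Gallery (the resource group and account names below are placeholders). Note that when running on a Hybrid Worker, the module also needs to be installed on the worker machine itself:

```powershell
# Import Posh-SSH into the Automation account from the PowerShell Gallery.
# Resource group and automation account names are placeholders.
New-AzAutomationModule -ResourceGroupName "my rg name" `
    -AutomationAccountName "my automation account" `
    -Name "Posh-SSH" `
    -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/Posh-SSH"

# On a Hybrid Worker, install the module locally on the worker:
Install-Module -Name Posh-SSH -Scope AllUsers -Force
```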