March 20th, 2020


Using Azure Function Apps to process Blob Storage event triggers

Today's post is a tutorial on how to use Azure Function Apps to process blob event triggers from Azure Blob Storage using only command-line tools and JavaScript.


Oftentimes when receiving files in Azure Blob Storage it is necessary to perform some action on each file as it is uploaded. This can include things like renaming files, moving files to other locations, extracting data to load into a database, compressing, or decrypting. While products like Microsoft Power Automate (formerly Microsoft Flow) can accomplish some of the simpler tasks, most of the time you will need the flexibility of writing custom code. A very powerful, low-cost solution to this problem is an Azure Function App coupled with a blob event trigger.

For this example, we are going to be assuming that each blob that is uploaded has been PGP encrypted with a password, and we are going to decrypt it and store it on another path in the same blob container.

To complete the steps outlined below you will need to install:

- Azure CLI
- Azure Functions Core Tools
- Node.js (which includes npm)

Let's get started...

Part 1. Create the Azure resources

First we are going to use the Azure CLI to create the necessary resources. Note that Storage Account and Function App names must be globally unique so you may need to change those values when you run this.

Create a resource group

This resource group will hold all of our resources. At the end of the tutorial, we can delete this resource group to delete all associated resources.

az group create \
 --name AzureFunctionsTutorial \
 --location westus

Create a storage account

Each Function App requires a Storage Account. We need to create that before we can create the Function App. We will also use a blob container in this same Storage Account to store the encrypted files which will trigger our decryption function.

az storage account create \
  --name azurefunctionsblobs \
  --kind StorageV2 \
  --resource-group AzureFunctionsTutorial \
  --sku Standard_LRS

Create a blob container

We need to create a blob container in the Storage Account we created previously. This will be the blob container we upload our encrypted files to.

az storage container create \
  --name tutorial \
  --account-name azurefunctionsblobs

Create the Function App

Now create the Function App. Note that the Storage Account is the one created in the previous step and that the consumption plan location is in the same location as our resource group.

az functionapp create \
  --name AzureFunctionsTutorialBlog \
  --storage-account azurefunctionsblobs \
  --consumption-plan-location westus \
  --functions-version 4 \
  --runtime node \
  --resource-group AzureFunctionsTutorial

Configure app settings

This setting is going to be used by the function code later on. We are specifying the password we will use to decrypt incoming files.

az functionapp config appsettings set \
  --name AzureFunctionsTutorialBlog \
  --resource-group AzureFunctionsTutorial \
  --settings "DecryptionPassword=secretpass"
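For reference, when running the function locally later with func start, the same setting goes in the project's local.settings.json file instead of the Function App configuration. A sketch, with placeholder values:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage account connection string>",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "DecryptionPassword": "secretpass"
  }
}
```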

Part 2. Create the function project

Once all the required Azure resources are created we can begin developing the project. Create a folder for this project. All steps below assume that you are executing the commands from that folder.

Create an application

In the folder where you want to keep your project files, use the Azure Functions Core Tools CLI to initialize a Node application. The application folder will contain a package.json and host.json file, as well as a sub-folder for each function in the application.

func init --worker-runtime node

Create the function

We can now initialize the blob-triggered JavaScript function. This will create a new folder called BlobDecrypter.

func new --language javascript --name BlobDecrypter --template "Azure Blob Storage trigger"

Configure trigger settings

The BlobDecrypter/function.json file specifies the trigger settings for that function.

In the configuration below, the path property of the blobTrigger binding myBlob indicates that we only want to trigger the function when blobs are uploaded to the path inbox/ in a container called tutorial. Since we are going to write the decrypted files back to the same container, we need to restrict the trigger to a specific path to prevent the function from triggering again on the files it writes.

We will use an output binding to write the decrypted files back to the same blob container.

The connection property specifies the name of an app setting containing the connection string of the Storage Account to use. Since we are leaving it blank, it defaults to AzureWebJobsStorage, which points to the Storage Account the Function App itself uses.

{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "tutorial/inbox/{name}",
      "connection": ""
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "direction": "out",
      "path": "tutorial/decrypted/{name}",
      "connection": ""
    }
  ]
}

Add the code

The code below will load a password from the Function App configuration and use it to try to decrypt each triggered blob. The decrypted blob is then written back to the same container under the path decrypted/, using the output binding defined in function.json.

Add the function code to BlobDecrypter/index.js

const openpgp = require('openpgp');

module.exports = async function (context, blob) {
  const decryptionPassword = process.env.DecryptionPassword;

  const message = await openpgp.readMessage({
    binaryMessage: blob
  });

  const { data: decrypted } = await openpgp.decrypt({
    message: message,
    passwords: [ decryptionPassword ],
    format: 'binary'
  });

  // Write the decrypted blob to the container using the output binding.
  context.bindings.myOutputBlob = decrypted;
};
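As a side note, the {name} token from the trigger path is also exposed to the code through context.bindingData, which is handy for logging or for computing output paths. A minimal sketch, separate from the tutorial's function:

```javascript
// Illustrative handler showing trigger metadata available on the context.
const handler = async function (context, blob) {
  // For a blob uploaded to tutorial/inbox/report.pgp,
  // context.bindingData.name is "report.pgp".
  context.log(`Processing ${context.bindingData.name} (${blob.length} bytes)`);
};

module.exports = handler;
```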

Install dependencies

We need to have all dependencies in our node_modules folder so Azure Functions Core Tools can package them along with our code when we publish.

npm install --save openpgp

Publish the function

That's it for the code! Time to package and upload the function to the Function App.

func azure functionapp publish AzureFunctionsTutorialBlog

Part 3. Test the function

It's time to see the function in action. In this next part of the tutorial, we will create an encrypted file and upload it to our blob container. After about a minute we should be able to download the decrypted file.

Create an encrypted file

Create a file called create-encrypted-file.js and paste in the code below. This will create a password-encrypted file from a short text string.

const openpgp = require('openpgp');
const fs = require('fs');

(async () => {
  const message = await openpgp.createMessage({
    text: 'This is a secret message.'
  });

  const encrypted = await openpgp.encrypt({
    message: message,
    passwords: ['secretpass'],
    format: 'binary'
  });

  fs.writeFileSync('secret.txt', Buffer.from(encrypted));
})();

Now run the code using Node. An encrypted file called secret.txt will be created.

node create-encrypted-file.js

Upload the encrypted file

Use the Azure CLI to upload the file secret.txt to the inbox folder path of the blob container which our function is receiving events from.

az storage blob upload \
  --container-name tutorial \
  --account-name azurefunctionsblobs \
  --file secret.txt \
  --name inbox/secret.txt

Download the decrypted file

Now wait about a minute or so for the trigger to get processed, then download the decrypted file. At this point you can open the file and verify that it contains the string we encrypted in the previous step.

az storage blob download \
  --container-name tutorial \
  --account-name azurefunctionsblobs \
  --file secret.txt.decrypted \
  --name decrypted/secret.txt


That's it. We are done. Let's delete our resource group to clean everything up.

az group delete --name AzureFunctionsTutorial


In this blog post, you saw how to use the command-line tools provided by Azure to quickly deploy a JavaScript function that processes Blob Storage events. I hope you have enjoyed this post. Thanks for reading. If you are an Azure Blob Storage user, keep scrolling to find out how FileMage Gateway could help you.

FileMage Gateway is an FTP and SFTP server that streams transfers directly to Azure Blob Storage, Amazon S3, and Google Cloud Storage.