Automating Customer Payload Transaction Information Transfer to Azure SQL PaaS and Blob Storage using Azure Trigger Functions

Madhuvanthi Sridhar
11 min read · May 27, 2021

Whether we're shopping in a store, making an online purchase, visiting a doctor, or renting a car, the details of the transactions we make are crucial to any business, both to maintain proper records and to send us future notifications or discounts that bring us back. In this article, we'll look at a simple way to automate the retrieval of generic transaction details using Azure Service Bus (SB) Queue Trigger Functions and upload them to Azure Blob Storage and Azure SQL PaaS tables. No authentication is required for this process if Azure Portal is used to write the function app code (in C# .NET), which is what I will be demonstrating in this tutorial. If you would rather write the function code in Java or Python, you need to download the Azure Functions SDKs for those languages and authenticate with an Azure Active Directory (AAD) app in order to access the services on Portal from your program. Check out my blog Retrieval of Performance Metrics from Azure Data Factory and Integration with New Relic for the steps involved in setting up and using an AAD app to authenticate to Azure services.

Figure 1 below shows a process flow schematic that provides a high-level view of the workflow to be discussed. The sections below explain each part of this workflow in detail:

  • Setting up an Azure Function App using a Dedicated App Service Plan
  • Creating a new Azure Service Bus Queue Trigger Function and specifying an output binding to Azure Blob Storage
  • Adding C# .NET code to the Azure Function to retrieve and parse messages from the Azure Service Bus Queues in JSON format
  • Uploading data to a particular path in Azure Blob Storage
  • Creating an Azure SQL PaaS client connection object and inserting data to tables via queries written within the Function code
Figure 1. Process Flow Schematic for Automating Data Uploads to Azure Blob Storage and SQL PaaS

Check this out before reading ahead if you’re new to Azure Functions…

To create and start using Azure Functions to perform various activities, the first step is to set up an Azure Function App. A Function App can be thought of as a container for all of the different functions we develop on Azure. To create a Function App, we first need to choose a subscription, resource group, and storage account to associate with it, which allows it to be added as a standard Azure resource to our existing list of resources. We must also choose a hosting plan to pair with the Function App. Azure Functions offers three types of plans: a consumption plan, a premium plan, and a dedicated app service plan. The consumption plan is the most basic and inexpensive option, while the premium and dedicated app service plans cost more but offer lower latency and longer execution timeouts. The premium plan also comes with pre-warmed function instances so execution can begin immediately when a function is invoked. Now that we have a basic working understanding of how this service works, we can move on to see how I've set it up for the task at hand.

Setting up an Azure Function App using a Dedicated App Service Plan

Timeout and delay are major factors to consider when transferring crucial data like customer transaction information, so I have chosen the dedicated app service plan that I had already created when setting up web apps on Azure App Service. [1] explains how to set this up if you're completely new to Azure or still finding your way around the resources. Figures 2–4 demonstrate step by step how to set up an Azure Function App with a dedicated app service plan using Portal. Figure 5 shows the newly created Function App with all of its properties, including the main URL it is hosted on, the subscription/subscription ID it is associated with, and the plan it uses to run functions.

Figure 2. Setting up the Azure Function App (Step 1)
Figure 3. Setting up the Azure Function App (Step 2)
Figure 4. Setting up the Azure Function App (Step 3)
Figure 5. New Azure Function App Created on Portal

Creating a new Azure Service Bus Queue Trigger Function and specifying an output binding to Azure Blob Storage

Once the Function App has been created, we can add a new Azure Function to the app from the Functions tab of the resource. There is a list of options representing the different types of functions to choose from, including HTTP triggers, Timer triggers, and resource/event-based triggers. Since I would like to trigger my function as soon as customer transaction details have been uploaded as individual messages to an Azure Service Bus Queue, I will choose the Service Bus Queue Trigger function type. Once this type has been selected, we must also provide the connection details, via a shared access policy that is either the default (RootManageSharedAccessKey) or a custom policy on the Service Bus namespace or the particular queue we are reading from, as well as the queue name. Supplying all of these details allows us to create a new function of this type inside our Function App. Figure 6 shows how to create an SB Queue Trigger function on Portal as detailed in this section. Figure 7 shows the shared access policy that was created by default for the Service Bus namespace before it was linked to the SB Queue Trigger function in Figure 6.

Figure 6. Creating an Azure Service Bus Queue Trigger Function
Figure 7. Default Shared Access Signature (SAS) Policy for a Service Bus Namespace
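
If you're curious what the Portal generates at this point, a newly created SB Queue Trigger function starts from a small C# script (run.csx) scaffold roughly like the sketch below; the exact template varies slightly between runtime versions:

```csharp
using System;
using Microsoft.Extensions.Logging;

// Default scaffold for a Service Bus Queue Trigger function (C# script).
// The runtime invokes Run once per queue message, passing the message body
// in as a string via the "myQueueItem" binding defined in function.json.
public static void Run(string myQueueItem, ILogger log)
{
    log.LogInformation($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}
```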

After creating a function, the most important thing is to specify an output binding. Output bindings allow us to "connect," or bind, our function's output to a specific resource on Azure or another platform such as SendGrid or Twilio. To create an output binding, head over to the function's Integration tab and, on the flow schematic under Outputs, click Add Output. This presents a list of the different types of bindings that can be created. In this case, I will use Azure Blob Storage as my output binding so that important customer identity information such as full name, email, and ID is available for rewards programs and discounts, a tactic followed by most well-established organizations. When creating the binding, I specify the exact blob path (container name/blob name) and the storage account I'm writing to, so the function knows precisely where to upload the blob. Figures 8–11 below show this sequence of steps on the function in Azure Portal. Now that we've seen how to create an SB Queue Trigger function and bind its output to blob storage, the next step is retrieving and parsing information from the queue to upload to the blob.

Figure 8. Creating an Azure Functions Output Binding to Blob Storage (Step 1)
Figure 9. Creating an Azure Functions Output Binding to Blob Storage (Step 2)
Figure 10. Creating an Azure Functions Output Binding to Blob Storage (Step 3)
Figure 11. New Output Binding to Blob Storage Created on Azure Function

Adding C# .NET code to the Azure Function to retrieve and parse messages from the Azure Service Bus Queues in JSON format

When working with payload information, we'll most often be dealing with JSON (JavaScript Object Notation), a very useful key/value pair format for separating details by type. The good thing is that, as with most programming languages, C# has libraries for handling these objects, parsing values for particular keys, and so on. When a Service Bus Queue Trigger function is created, the queue item (more precisely, the body of the Service Bus message) is by default brought into the function as a string. This works fine in cases like ours where the messages are in JSON format, since the string queue item can be parsed as a JSON object to retrieve values. Figure 12 shows a code segment that accomplishes this. Here, the queue item is parsed with the JObject class, part of the Newtonsoft.Json package for working with JSON in C#. Because JSON has a defined hierarchical structure, we need to work from the outside of the message inward (in our case, from a JSON array (orders) to a JSON object, back to an array (customerInformation) and an object, and finally to a field (customerId, etc.)). This gives us exactly the information we're looking for, which we can store in variables for later use.

Figure 12. Parsing out Azure Service Bus Queue Messages to Upload Customer Information to Blob Storage
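
Since Figure 12 is an image, here is a minimal sketch of the same idea in C# script form. The overall key hierarchy (orders → customerInformation → individual fields) comes from the payload described above, but exact field names such as firstName and lastName are assumptions for illustration:

```csharp
using System;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static string Run(string myQueueItem, ILogger log)
{
    // Parse the raw queue message (a JSON string) into a JObject.
    JObject payload = JObject.Parse(myQueueItem);

    // Walk the hierarchy from the outside in: the orders array, the first
    // order object, its customerInformation array, and the first customer.
    JToken customer = payload["orders"][0]["customerInformation"][0];

    // Field names below are hypothetical placeholders for the actual keys.
    string firstName  = (string)customer["firstName"];
    string lastName   = (string)customer["lastName"];
    string email      = (string)customer["email"];
    string customerId = (string)customer["customerId"];

    log.LogInformation($"Processed customer {customerId}");

    // The returned string is handed to the blob output binding (see the
    // function.json sketch in the next section, where name is "$return").
    return $"{firstName}, {lastName}, {email}, {customerId}";
}
```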

Uploading data to a particular path in Azure Blob Storage

Keeping less confidential, frequently used data at our fingertips is something firms have been doing since the boom of the IT industry. With long-term storage services like Azure Blob Storage and Amazon S3 buckets, it's easy to keep the most crucial information for immediate use, like basic customer details or promotions/offers, in a safe place where it won't be accidentally deleted and where store managers and marketers don't need an excessive level of authorization to access it. To upload the information parsed out of the SB message as shown in Figure 12, we use the return keyword and format the message to be uploaded however we would like to store the details. For simplicity and ease of understanding, I've used a comma-separated sentence format here, giving the customer's first and last name, email, and customer ID.

Specifying the return keyword on its own will not upload the message to blob storage automatically. For that to happen, the blob storage output binding in the function.json file must have its name set to $return, as shown in Figure 13. We can see that the rest of the parameters have already been populated for us: direction (out), type (blob), path (container name/blob name), and connection (the app setting pointing to the storage account). These were defined as soon as we created the binding on the function's Integration tab and do not need to be changed in any way. While we're looking through the function.json file, we can also see the service bus trigger binding created above, defined in a similar fashion but with the opposite direction to the blob storage binding. This is also created by default when we define the trigger function and requires no further change. Figure 14 shows the blob created when the function is triggered, displaying a message in the same format we specified within our function.

Figure 13. Function.json file for the Azure Service Bus Queue Trigger Function
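
For reference, the function.json in Figure 13 would look roughly like the sketch below. The queue name, blob path, and connection setting names here are placeholders, not the exact values from my setup:

```json
{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "customer-transactions",
      "connection": "ServiceBusConnection"
    },
    {
      "name": "$return",
      "type": "blob",
      "direction": "out",
      "path": "customer-data/customerinfo.txt",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```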
Figure 14. Output Blob Created from Triggering Azure Function

Creating an Azure SQL PaaS client connection object and inserting data to tables via queries written within the Function code

While blobs are useful for storing message-format data, relational data, such as all the details of an order or payment, is best stored in database tables, for example in Azure SQL PaaS. In our case, it makes sense to combine the order and payment information for the customer into one database table, since we are dealing with a very simplified payload file. In more complex scenarios, one would scale and divide the number of tables according to the magnitude and diversity of the information at hand. Figure 15 shows the fields tracked by the orderPayments table: the order number, order date, address, city, state, zip code, payment type, cardholder name, and, last but not least, transaction amount. This ensures that all aspects of the purchase are stored in the system as records for later use.

Figure 15. Fields Present in the OrderPayments Database Table (Azure SQL PaaS)

In order to insert into the table created above, it is necessary to understand how Azure Functions interface with Azure SQL PaaS. Since the latter is a completely independent platform resource on the Microsoft Azure cloud, unlike Blob Storage, Service Bus Queues, etc., it cannot be bound to the Azure Function. Instead, a SQL connection object and a command object must be created inside the main body of the function code and given the connection string and the insert query, respectively. Figure 16 shows the remaining function code through which this is accomplished.

Figure 16. Connecting to Azure SQL Database Table to Insert Order/Payment Records
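
As a minimal sketch of the pattern in Figure 16, the snippet below shows a hypothetical helper using SqlConnection and SqlCommand from System.Data.SqlClient (depending on your Functions runtime, you may need to add a reference to this package). The parameterization is my addition for safety; the column names follow the orderPayments table described above:

```csharp
using System.Data.SqlClient;

// Hypothetical helper mirroring the insert pattern in Figure 16. The
// arguments would be the fields parsed from the payload earlier.
public static void InsertOrderPayment(string connectionString, string orderNumber,
    string orderDate, string address, string city, string state, string zipCode,
    string paymentType, string cardholderName, decimal transactionAmount)
{
    string query = "INSERT INTO orderPayments (orderNumber, orderDate, address, " +
                   "city, state, zipCode, paymentType, cardholderName, transactionAmount) " +
                   "VALUES (@orderNumber, @orderDate, @address, @city, @state, " +
                   "@zipCode, @paymentType, @cardholderName, @transactionAmount)";

    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(query, connection))
    {
        // Parameterized values guard against SQL injection.
        command.Parameters.AddWithValue("@orderNumber", orderNumber);
        command.Parameters.AddWithValue("@orderDate", orderDate);
        command.Parameters.AddWithValue("@address", address);
        command.Parameters.AddWithValue("@city", city);
        command.Parameters.AddWithValue("@state", state);
        command.Parameters.AddWithValue("@zipCode", zipCode);
        command.Parameters.AddWithValue("@paymentType", paymentType);
        command.Parameters.AddWithValue("@cardholderName", cardholderName);
        command.Parameters.AddWithValue("@transactionAmount", transactionAmount);

        connection.Open();
        command.ExecuteNonQuery();
    }
}
```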

Here, the connection string to the particular database I'm using (found under the Connection strings tab for your DB on Portal) is provided as input to a SQL connection object. This allows me to access the database from the function through my regular user account credentials for the DB. After obtaining and storing all the information needed from the payload file as variables, and formatting a basic insert query for the DB, I've created a SQL command object and passed in both the query and the connection object. Executing the command runs the insert query against the specified table in the database we connected to earlier in the code. Once the code has executed, we can open the table on the Azure SQL PaaS resource on Portal and view the newly added record, as shown in Figure 17. All the fields have been populated with their respective values from our sample payload file, indicating a successful data upload to Azure SQL PaaS from the Azure Function.

Figure 17. New Customer Order/Payment Record Created on Azure SQL Database Table

Putting it all into perspective…

Maintaining records of every transaction we or our customers make is crucial to every developing and well-established business. To the average customer, buying a loaf of bread from the grocery store or purchasing a high-end SUV may seem like just an expenditure, money spent for the goods they've bought. But from the business side, every customer is treated as a valuable source of income and promise, and the thought process is aimed at bringing the customer back to purchase more for better revenue, through sales tactics like promotions, discounts, and rewards. To aid in this workflow, businesses use cloud providers such as Microsoft Azure and their innumerable resources, like Azure Blob Storage, Service Bus, and SQL PaaS, since they can easily store this data in a pay-as-you-go model and make use of it when they need to draw a customer toward their product again. The main focus of this blog is to present a small-scale solution for automating the storage of records related to customer payload transactions for easy retrieval later on: a template that enterprises can build on when they work with larger sets of customer data.

Useful Resources

  1. https://docs.microsoft.com/en-us/azure/app-service/app-service-plan-manage


Madhuvanthi Sridhar

BSMS Mechanical Engineering grad from Georgia Tech, now working as a DevOps Developer at Warner Media. I have a passion for cars and coding alike.