Building and Scaling IoT solutions with Microsoft Azure

In this story, I will walk you through the concepts and steps involved in building and scaling an IoT solution by leveraging the power of the cloud IoT services provided by Microsoft Azure.

Karthik Kalyanaraman
Bits and Pieces

--

source: xkcd comics

If you stumbled across this story while searching the web for good resources on Azure IoT (or any cloud IoT solution), or if you have built an awesome IoT device and are looking to build a scalable backend for a web/mobile app using the data from that device, look no further: there is a very good chance you will get your questions answered here. Here is what you'll learn from this story:

  • Interface the Azure IoT Hub APIs into a physical IoT Device and transfer the device data to Azure cloud.
  • Build a highly scalable serverless backend to store and manage the data from all IoT Devices using Azure Functions and Azure CosmosDB.
  • Build a functional IoT control center client application using ReactJS and ExpressJS to interact with the data on the cloud.
  • If you’re here just for the source code, skip to the end for a link to the public GitHub repository.

Useful tip: Use Bit to share and collaborate on components across applications. Encapsulate components with all their dependencies and setup and share them to build truly modular applications with better code reuse, simpler maintenance and less overhead.

Let’s jump right in!

If you don’t have an Azure account, go ahead and create one. At the time of writing, Azure provides one year of free services and $200 in free Azure credits.

Before I explain how exactly the Azure IoT services can be put to use, let’s brainstorm how we might architect a cloud-based solution for an IoT device. For the sake of this explanation, let’s assume our IoT device is a 360-degree rotating camera sitting on a moving car, streaming high-definition street view images and GPS coordinates to the cloud.

In order to effectively capture all the images being fed into the cloud, we need an efficient ingress storage node. To achieve this, we are going to use an intelligent Queue. One useful feature of this Queue is that anybody can consume the data published on it by subscribing to updates through a built-in endpoint.

Since the Queue can only serve as temporary storage, we need a way to store the data permanently. In other words, we need a database. We also need a way to automatically copy data from the Queue to the database whenever new data arrives on the Queue.

Now, let’s say we also need a web application that displays metrics about the data being processed by our system. To build this, all we need is a frontend that reads directly from the database.

On paper, this system looks like quite a bit of work. But, let’s explore Azure and see how we can leverage the managed services to build this architecture out so that our final system looks something like this:

Final System using Azure Cloud Services

Let’s get started. Go to https://portal.azure.com and search for ‘IoT’. To start with, we need a Queue. So, let’s look for something that collects data from an IoT device. As we can see, we have IoT Hub and other services provided by Azure. Let’s click on IoT Hub.

Azure portal search bar

IoT Hub

IoT Hub home page

From the help text, it seems like this might be what we are looking for. Let’s go ahead and create it.

IoT Hub create page

Think of ‘Resource Group’ as a self-contained logical unit housing all the services we plan to use. This is useful when we want to manage all the services from one place (for example, one button to kill all services deployed for a project). For this project, I created a new one called ‘learningiot’. ‘Region’ reflects the physical location of the data centers where our service instances will be created. I generally keep it consistent across all the services I use to reduce the chances of network latency. Now, click ‘Review + Create’. Don’t worry about the other options, as they’re not really relevant at this point.

But, what about our intelligent Queue?

IoT Hub is a flavor of another Azure service called Event Hubs. And an Event Hub is, at a very basic level, an intelligent Queue.

From the official Azure docs: “Event Hubs is a fully managed, real-time data ingestion service that’s simple, trusted, and scalable. Stream millions of events per second from any source to build dynamic data pipelines and immediately respond to business challenges. Keep processing data during emergencies using the geo-disaster recovery and geo-replication features.”

Alright! We have found the first piece of our jigsaw puzzle. Let’s move on.

But how do we connect our 360 camera IoT device to this IoT Hub?

IoT Hub lets us create logical IoT device endpoints that can be used to interface with real IoT devices. Let’s jump right in and create a device on the IoT Hub.

IoT Device creation

IoT Device

‘360cameragps01' device created

Now that we have created a logical device on our IoT Hub, let’s look at how we can interface this logical end point to an actual device. For the purpose of this story, let’s create a simulated 360camera that is streaming images and GPS data.

In order to stream the data to our IoT Hub, we are going to write a lightweight Node.js script running on a server or small computer connected to the camera, streaming data as it arrives.

device.js

Lines 6, 7 and 8 show how to import the Azure IoT device SDK modules. On line 13, we have a variable called connectionString. This is the unique string that identifies the logical device we created on our IoT Hub. You can find this string by selecting the IoT device on the Azure portal.

Primary Connection String — Azure portal

On line 16, we use the .fromConnectionString() API to connect to the IoT Hub and get an instance of the device so that we can send and receive data. Now jump to line 53. Here, we use .open(callback) to open a connection to the logical device on the IoT Hub. The callback gets executed once the connection is established.

The callback function is implemented starting at line 18. To simulate the behavior of a 360 camera, we generate random unique CDN URLs representing the locations of the HD images on a CDN where the images would actually be stored. We also generate random GPS coordinates. Then, on lines 35 to 38, we pack this data into a JSON object and send it to the IoT Hub using the .sendEvent() API.
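If you can’t see the embedded gist, here is a condensed sketch of the same flow. It assumes the MQTT transport, a placeholder CDN domain (cdn.example.com), a 2-second send interval, and a connection string supplied via the IOTHUB_DEVICE_CONNECTION_STRING environment variable — adjust these to match your own setup:

```javascript
// Simulated 360 camera device: sends fake CDN URLs + GPS coordinates to IoT Hub.

// Generate one telemetry sample. The CDN domain and field names are
// illustrative placeholders for this sketch.
function makeTelemetry() {
  const id = Math.random().toString(36).slice(2, 10);
  return {
    deviceId: '360cameragps01',
    imageUrl: `https://cdn.example.com/hd/${id}.jpg`, // hypothetical CDN path
    latitude: (Math.random() * 180 - 90).toFixed(6),
    longitude: (Math.random() * 360 - 180).toFixed(6),
    timestamp: new Date().toISOString(),
  };
}

function main() {
  // Requires the azure-iot-device and azure-iot-device-mqtt npm packages.
  const { Client, Message } = require('azure-iot-device');
  const { Mqtt } = require('azure-iot-device-mqtt');

  const connectionString = process.env.IOTHUB_DEVICE_CONNECTION_STRING;
  const client = Client.fromConnectionString(connectionString, Mqtt);

  // Open the connection to the logical device endpoint on the IoT Hub.
  client.open((err) => {
    if (err) return console.error('Could not connect:', err.message);
    console.log('Connected to IoT Hub');
    setInterval(() => {
      const message = new Message(JSON.stringify(makeTelemetry()));
      client.sendEvent(message, (sendErr) => {
        if (sendErr) console.error('Send error:', sendErr.message);
        else console.log('Telemetry sent');
      });
    }, 2000);
  });
}

// Only attempt to connect when a connection string is actually configured.
if (process.env.IOTHUB_DEVICE_CONNECTION_STRING) main();
```

The telemetry generator is kept separate from the transport code so it can be tested (or swapped for a real camera feed) without touching the IoT Hub wiring.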

That’s it! Now let’s run the script and see if it does what it says:

device.js console output

Great! Our simulated 360camera device was able to successfully connect to the IoT Hub and send data. Let’s see if our hub reflects this:

IoT Hub home reflects spike in device to cloud messages

As we can see from the image above, there is a spike in device-to-cloud messages, which is exactly what we wanted. Now, let’s move on to the next piece: transferring these messages to persistent storage.

As we discussed earlier, the IoT Hub is a temporary storage Queue. We can, in fact, set data-retention policies on the Queue from the portal, as shown below:

Built-in endpoints — events page

Azure Functions and CosmosDB

In order to store the data persistently, we are going to make use of another managed service provided by Azure called CosmosDB. Think of CosmosDB as something like MongoDB, but running on the Azure cloud: a highly scalable, pay-as-you-go solution for handling large amounts of data.

We are also going to use another service called Azure Functions. Azure Functions is a Functions-as-a-Service (FaaS) offering from Azure. Think of it as a piece of code in the cloud that runs when an event happens. The number of function instances running at any point in time dynamically grows or shrinks based on the traffic in the system.

As we can see, it’s quite easy to piece these services together to complete our puzzle. Our IoT device sends data to the IoT Hub, which is basically an Event Hub. In other words, this can be viewed as events arriving at the IoT Hub. All we need is an Azure function that responds to these events by copying the data to CosmosDB.

Let’s roll! Let’s go to the portal and enable CosmosDB first.

CosmosDB — Creation page

Make sure you select Core (SQL) as the API. Other APIs are not supported by Azure Functions at the time of writing. Now, let’s create an Azure Functions app.

Azure — Function app

At the time of writing, a Functions app with the Node.js runtime has better support on Windows than on Linux. So, let’s use Windows for this project. Now, let’s create a new function.

Azure — Functions App

The VSCode option is great when you are writing a complex function; VSCode also has an Azure Functions extension that lets you deploy functions right from the editor. For this project, let’s use the In-portal option, since our function is going to be quite simple. After you select In-portal, on the next page, select ‘More templates’ and click ‘Finish and view templates’. Now search for the IoTHub template, select it, and install the extension.

Azure — IoT Hub template

The IoTHub extension automatically gives you the hooks needed to connect the IoT Hub to this Functions app.

IoT Hub selection

Select the right IoT Hub and choose the Events endpoint, as that’s where our IoT device is publishing its data. Let’s go ahead and run the Functions app to see if it can successfully pull data from the IoT Hub. Run the simulated device.js script and then launch the Functions app using the Run command.

Azure functions — console

As we can see from the screenshot above, the console shows that the Functions app has successfully processed the messages. index.js is the entry point for the Functions app; when we selected the IoT Hub template, index.js was pre-populated with the code we are running.

Okay! So far, we have an Azure functions app that is connected to our IoT Hub. Now, we need to hook our CosmosDB up to this app so that it can copy the data to the DB.

To connect the database to our functions app, we are going to make use of the Integrate option of the functions app.

Azure functions — Output trigger

Select the Azure Cosmos DB output trigger (output, because data is flowing out of the Functions app). The portal will prompt you to install the CosmosDB integration. Go ahead and install it.

Azure functions — CosmosDB Output trigger

Make sure to select the ‘If true, creates the Azure CosmosDB database…’ option so that it auto-creates the collection on the database for you. Also, click New and connect the correct instance of Cosmos DB. Now, let’s modify the index.js script to connect to CosmosDB.

Functions app entry point — index.js

Line 21 is the key change here. The context object has access to the output trigger through context.bindings. This publishes the data to our CosmosDB, which is integrated as an output trigger. Now, let’s save and run our function to see if CosmosDB is reporting any new data.

Azure CosmosDB — Data explorer

Works like a charm! As you can see from the screenshot above, CosmosDB has new documents containing the data being published to the IoT Hub by our simulated 360camera IoT device.

At this point, we have a fully functional, highly scalable IoT solution built entirely using Azure cloud services.

Let’s make this project fancier by building a client application for managing more than one IoT device from a single place. This is quite useful if you want to manage and control the provisioning and deployment of multiple devices.

IoT Control Center

We are going to make use of the IoT device digital twin interface provided by the IoT Hub to enable communication between the control center and the devices through the cloud. We could get even more creative and build client applications using the CosmosDB APIs, where the actual device data is persisted.

Let’s see how we can leverage the APIs of IoT Hub and CosmosDB to build something a user can interact with. To build this solution, we are going to use an ExpressJS server exposing REST API endpoints that talk to the cloud through the IoT Hub and CosmosDB APIs. For the frontend, we are going to write a simple ReactJS application that consumes the ExpressJS server’s REST APIs.

Device Digital Twin

As the name suggests, a Device Digital Twin is a digital copy of a device’s properties and state kept on the IoT Hub. You can see the device digital twin on the IoT device page inside the IoT Hub.

Device Digital Twin

We can use the digital twin of each device to store useful metadata about the device, such as its state, deviceId, etc.
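To make this concrete, a twin document has roughly the following shape (the values below are illustrative placeholders). The tags section is where we can park our own metadata, such as a state flag, while properties holds the desired/reported property channels:

```json
{
  "deviceId": "360cameragps01",
  "etag": "AAAAAAAAAAE=",
  "status": "enabled",
  "connectionState": "Connected",
  "tags": {
    "state": "ON"
  },
  "properties": {
    "desired": {},
    "reported": {}
  }
}
```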

ExpressJS Server

We are going to define endpoints for:

  • Get data from CosmosDb
  • Get list of devices from IoT Hub
  • Get a specific device’s digital twin
  • Update a specific device’s digital twin with a state variable (ON/OFF)

server.js

In order to query the IoT Hub and CosmosDB, we are using the azure-iothub and @azure/cosmos npm packages. Now, we just need to build a ReactJS frontend that consumes these APIs. Let’s move on to the final part of this project.
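If the server.js gist doesn’t render for you, here is a minimal sketch of the four endpoints. The database/container names, port, the state tag convention, and the environment-variable names are assumptions for this sketch — the repository linked at the end has the full version:

```javascript
// Build the tags patch used to flip a device's state between ON and OFF.
// Storing state under twin tags is a convention for this project, not an SDK field.
function buildStatePatch(state) {
  return { tags: { state: state === 'ON' ? 'ON' : 'OFF' } };
}

function startServer() {
  // Requires the express, azure-iothub and @azure/cosmos npm packages.
  const express = require('express');
  const { Registry } = require('azure-iothub');
  const { CosmosClient } = require('@azure/cosmos');

  const registry = Registry.fromConnectionString(process.env.IOTHUB_CONNECTION_STRING);
  const cosmos = new CosmosClient(process.env.COSMOS_CONNECTION_STRING);
  const container = cosmos.database('iotdata').container('messages'); // placeholder names

  const app = express();

  // List every logical device registered on the IoT Hub.
  app.get('/api/devices', (req, res) => {
    registry.list((err, devices) => {
      if (err) return res.status(500).send(err.message);
      res.json(devices);
    });
  });

  // Fetch a single device's digital twin.
  app.get('/api/devices/:id/twin', (req, res) => {
    registry.getTwin(req.params.id, (err, twin) => {
      if (err) return res.status(500).send(err.message);
      res.json(twin);
    });
  });

  // Toggle the state stored on the device's twin (ON/OFF).
  app.post('/api/devices/:id/state/:state', (req, res) => {
    registry.getTwin(req.params.id, (err, twin) => {
      if (err) return res.status(500).send(err.message);
      twin.update(buildStatePatch(req.params.state), (updErr, updated) => {
        if (updErr) return res.status(500).send(updErr.message);
        res.json(updated);
      });
    });
  });

  // Pull the persisted telemetry out of CosmosDB.
  app.get('/api/data', async (req, res) => {
    try {
      const { resources } = await container.items.query('SELECT * FROM c').fetchAll();
      res.json(resources);
    } catch (e) {
      res.status(500).send(e.message);
    }
  });

  app.listen(3001, () => console.log('Control center API listening on :3001'));
}

// Only start when the hub connection string is configured.
if (process.env.IOTHUB_CONNECTION_STRING) startServer();
```

The React frontend then only ever talks to these four routes, keeping all cloud credentials on the server side.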

ReactJS Frontend

For the client application, I used the design templates from Materialize to build a simple table-based frontend displaying the list of devices and a number of properties pulled from each device’s digital twin. There is also a toggle button that updates the state of the device in the cloud, which can be used by the IoT device to update its own state.

React based Control center updating the device state on Azure portal

To see the source code of this application, refer to the client/ directory inside my public GitHub repository here:

https://github.com/kakaly/azure_iot

That’s it! What are you waiting for? Go ahead and create that Azure account to bring your crazy IoT ideas to life!

source: giphy

Hope you enjoyed reading this story. For feedback, questions, or concerns, feel free to leave a comment in the comments section.
