In the previous article, we went through the Envisioning and Planning phases for delivering a Mixed Reality IoT application. In this article, we will start building the components of the architecture that we developed in the previous phase. Since we have a clear objective with all the plans laid out, let’s kick off the development phase.
Source Code
Before we start, I want to point you to the source code of the application that we are building, which you can find in my GitHub repository.
Inside the Unity folder, you will find the MR application that renders the device telemetry, and inside the VS folder, you will find the mock device and Web API that make up the backend of the system.
Development Phase
In this phase, we will address a few of the scenarios identified for the first phase of development. Let’s start with this user scenario: using the application, the user should be able to view a field and the sensors installed in it, and on air-tapping a sensor hologram, see the last telemetry recorded by that sensor. To realize this scenario, we will carry out the following activities in this phase.
- Build the hologram.
- Build the backend infrastructure.
- Build the Azure IoT solution.
- Build the holographic application.
- Connect the holographic app to Azure.
- Configure and stabilize the connected holographic app.
Building the hologram
In this phase, the 3D artists create assets for the field and sensors from the 3D model sketches drawn earlier. The artists would supply us with prefabs that we can import into our application. As I am not an artist (at best I can draw stick men), I will use some free assets from the Unity Asset Store. I have used a store asset to render the field landscape and a 3D capsule to represent the sensors. Here is how our assets look in Unity. We will discuss building the MR app later in this article.
Developing the Solution
Next, we are going to build an IoT solution that receives data from connected devices, stores the data in Cosmos DB, and exposes the data to the holographic application through a Web API. We will start by going through the Logical Design Diagram, which maps all the components onto the architecture design that we developed earlier.
We will use Event Hubs instead of IoT Hub for this sample because our scenario requires only a unidirectional message flow (device to cloud). However, for enterprise scenarios, I recommend that you use the IoT Hub service, which is built specifically for IoT device connectivity.
Provisioning Backend Services
We will provision the following Azure services to build the ingestion and processing components of the solution.
- Azure Event Hub
- Azure Cosmos DB
- Azure Stream Analytics
We will use the portal experience (https://portal.azure.com) to build all the services for this solution. However, for enterprise scenarios, you should invest time in developing ARM templates for provisioning the infrastructure. You should also try to consolidate all the resources in a single resource group to make managing the resources simpler. Following is a screenshot of the resources that I created in my Azure subscription.
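If you do invest in ARM templates, a minimal skeleton covering just the Event Hubs namespace might look like the following. The resource name and SKU here are placeholders of my own choosing; you would add the Cosmos DB and Stream Analytics resources alongside it.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.EventHub/namespaces",
      "apiVersion": "2017-04-01",
      "name": "EVENTHUBNAMESPACE",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard" }
    }
  ]
}
```

Deploying the template into a single resource group keeps all the related resources together, as recommended above.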
Provision Azure Event Hub
From New, select the Event Hubs services option and create a new Event Hub namespace by providing a name, resource group, and other required details.
Once the namespace is created, add a new Event Hub in the namespace by clicking on the + icon and then supplying the necessary information to provision the Event Hub.
Once the Event Hub is created, open the Event Hub instance and navigate to Settings → Shared Access Policies.
Create a policy that includes all three claims: Manage, Send, and Listen.
Note down the connection string and Event Hub name for later use.
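The namespace-level connection string follows this general format, where the angle-bracket values are placeholders for your own names:

```
Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>
```

The Event Hub name itself is supplied separately as the entity path, which is how the device simulator below uses it.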
Create Azure Cosmos DB
Cosmos DB acts as a data store for the telemetry generated by the devices. To create an instance of Azure Cosmos DB, follow these steps.
- From New, select the Azure Cosmos DB service and create a new Cosmos DB account by providing an ID, resource group, location, and your subscription. Make sure that you select SQL from the API dropdown:
- Once the Azure Cosmos DB account is created, click the “Add Collection” button to add a database and a collection to it.
- Note down the connection details, available under Settings → Keys, for later use.
Create Stream Analytics Job
The Stream Analytics job takes in data from Event Hub, processes it, and transfers it to Cosmos DB. Follow these steps to provision a Stream Analytics job.
From New, select Stream Analytics Job and enter the necessary details such as job name and resource group. Leave the number of streaming units (the compute capacity allocated to process the query) at its minimum value of 1.
Select the Input tab and add a new input of type Event Hub, pointing it to the Event Hub that you created earlier.
Select the Output tab and add a new output of type DocumentDB, applying the connection settings of the Cosmos DB account that you just created.
Select the Query tab and add a pass-through query to connect the input and output streams that you just configured.
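A minimal pass-through query simply routes every event from the input alias to the output alias. The alias names below are assumptions; substitute whatever names you gave your input and output.

```sql
SELECT
    *
INTO
    [cosmosdb-output]
FROM
    [eventhub-input]
```

In later iterations you could extend this query with windowing or filtering, but a pass-through is all that this scenario needs.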
From the Overview tab, start the job so that the Stream Analytics job begins processing data immediately.
Building the Device Simulator
Now that we have provisioned the infrastructure and created the plumbing between all the systems, we need to develop a mock device that sends telemetry to the Event Hub. Once the data lands in the Event Hub, it will be pushed to our Cosmos DB instance by the Stream Analytics job that we just provisioned.
In Visual Studio, create a new console application and add the following code in the Program file.
internal class Program
{
    private const string EhConnectionString = "Endpoint=sb://EVENTHUBNAMESPACE.servicebus.windows.net/;SharedAccessKeyName=Key";
    private const string EhEntityPath = "EVENT HUB NAME";

    private static void Main(string[] args)
    {
        while (true)
        {
            Console.Write("Number of messages? ");
            var numberOfMessages = Convert.ToInt32(Console.ReadLine());
            SendMessagesToEventHub(numberOfMessages).Wait();
        }
    }

    private static async Task SendMessagesToEventHub(int numMessagesToSend)
    {
        var connectionStringBuilder = new EventHubsConnectionStringBuilder(EhConnectionString)
        {
            EntityPath = EhEntityPath
        };
        var eventHubClient = EventHubClient.CreateFromConnectionString(connectionStringBuilder.ToString());
        var random = new Random();
        for (var i = 0; i < numMessagesToSend; i++)
        {
            try
            {
                var sensor1Record = new HumidityRecord
                {
                    Id = "1",
                    TimestampUtc = DateTime.UtcNow,
                    Value = random.NextDouble() * 100
                };
                var sensor2Record = new HumidityRecord
                {
                    Id = "2",
                    TimestampUtc = DateTime.UtcNow + TimeSpan.FromSeconds(random.Next(500)),
                    Value = random.NextDouble() * 100
                };
                var message = $"Message {i}: {Newtonsoft.Json.JsonConvert.SerializeObject(sensor1Record)} & {Newtonsoft.Json.JsonConvert.SerializeObject(sensor2Record)}";
                Console.WriteLine($"Sending message: {message}");
                await eventHubClient.SendAsync(new EventData(Encoding.UTF8.GetBytes(Newtonsoft.Json.JsonConvert.SerializeObject(sensor1Record))));
                await eventHubClient.SendAsync(new EventData(Encoding.UTF8.GetBytes(Newtonsoft.Json.JsonConvert.SerializeObject(sensor2Record))));
            }
            catch (Exception exception)
            {
                Console.WriteLine($"{DateTime.Now} > Exception: {exception.Message}");
            }
            await Task.Delay(1000);
        }
        Console.WriteLine($"{numMessagesToSend} messages sent.");
    }
}
The code has a dependency on the Event Hubs package; therefore, add the NuGet package Microsoft.Azure.EventHubs to the project. Add two constant strings to define the Event Hub connection string and Event Hub name.
private const string EhConnectionString = "Endpoint=sb://EVENTHUBNAMESPACE.servicebus.windows.net/;SharedAccessKeyName=Key";
private const string EhEntityPath = "EVENT HUB NAME";
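The simulator also serializes a HumidityRecord type that is not shown above. A minimal definition, inferred from the properties the simulator assigns (the actual class in the repository may differ), would be:

```csharp
// Telemetry record sent by the mock device. This sketch is inferred from
// the fields used in the simulator; check the repository for the real class.
public class HumidityRecord
{
    public string Id { get; set; }             // Sensor identifier
    public DateTime TimestampUtc { get; set; } // Time the reading was taken
    public double Value { get; set; }          // Humidity percentage (0-100)
}
```

The same type is reused by the Web API below to deserialize documents coming back from Cosmos DB.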
You can now execute the application and send telemetry data to the Event Hub. The Stream Analytics job will copy all the data that this device generates to the Cosmos DB collection.
Building the WebAPI
The Web API component queries the data stored in Cosmos DB and exposes a REST API that can be consumed by other applications. To create the Web API, follow these steps:
Create a WebAPI project by selecting either the Azure API App or Web API project template.
Create a controller named FieldSensorController. Add the following code to the controller to fetch data from Cosmos DB and return it as a response to the request.
public class FieldSensorController : ApiController
{
    private const string EndpointUri = "https://COSMOSDBNAME.documents.azure.com:443/";
    private const string PrimaryKey = "COSMOS DB KEY";
    private const string DatabaseName = "sensordata";
    private const string CollectionName = "data";

    // GET api/FieldSensor/5
    [SwaggerOperation("GetById")]
    [SwaggerResponse(HttpStatusCode.OK)]
    [SwaggerResponse(HttpStatusCode.NotFound)]
    public IHttpActionResult Get(int id)
    {
        var client = new DocumentClient(new Uri(EndpointUri), PrimaryKey);
        var queryOptions = new FeedOptions { MaxItemCount = -1 };

        // Find the telemetry records for the requested sensor id.
        var sensorRecords = client
            .CreateDocumentQuery<HumidityRecord>(
                UriFactory.CreateDocumentCollectionUri(DatabaseName, CollectionName),
                queryOptions)
            .Where(f => f.Id == id.ToString())
            .ToList();

        if (sensorRecords.Any())
        {
            return Ok(sensorRecords);
        }

        return NotFound();
    }
}
Publish your Web API to Azure. If you get stuck, follow the steps mentioned here.
You can test the API by sending a request to the controller as shown below.
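For example, assuming the Web API is deployed to an App Service named APPNAME (a placeholder for your own app name), fetching the telemetry for sensor 1 would look like this:

```
GET https://APPNAME.azurewebsites.net/api/FieldSensor/1
```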
Here is the format of the response that you will receive:
[
{
"Id": "1",
"TimestampUtc": "2018-03-14T03:31:14.7994193Z",
"Value": 54.227751239308972
}
]
Conclusion
In this post, we created and deployed the backend infrastructure for our IoT solution. In the next article, we will build a Mixed Reality application that will render the data that it consumes from the API. We will also discuss how we can deploy the Mixed Reality application using the various deployment mechanisms and how we can extend the application to support more usage scenarios.