Run Azure Digital Twins ADT-Explorer in Docker

This post walks you through running the ADT-Explorer tool, part of our Azure Digital Twins solution, in a docker container.

My apologies, but it’s been a while since I blogged. Life’s been getting in the way.

I’ve been doing more and more lately with our Azure Digital Twins (ADT) solution. The ADT engineering team has a nice tool for visualizing your models, twins, relationships, and query results. One of the biggest challenges for me and my customers has been getting the environment set up: the right version of Node, the right version of the code, authentication, etc.

Conveniently, the ADT team has some instructions for running the ADT-Explorer tool in a Docker container, so you don’t have to worry about the pre-reqs. It generally works great. However, there’s one downside. The ADT-Explorer tool uses the Azure Identity library (specifically DefaultAzureCredential) to do your authentication. This is usually a great thing: it automatically picks up any cached local Azure credentials you may already be logged in with (Azure CLI, VS Code, Visual Studio, environment variables, etc.) and uses them.
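
To make that concrete, here’s a minimal sketch of what DefaultAzureCredential does for a client app. This assumes the azure-identity and azure-digitaltwins-core Python packages and a made-up ADT endpoint; ADT-Explorer itself is a Node app, so this is purely to illustrate the credential chain:

from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

# DefaultAzureCredential tries environment variables, managed identity, Azure CLI,
# VS Code, etc. in turn and uses the first credential that works
credential = DefaultAzureCredential()

# hypothetical ADT endpoint - replace with your own instance URL
client = DigitalTwinsClient("https://your-adt-instance.api.wus2.digitaltwins.azure.net", credential)

# list the models in the instance, just to prove the credential worked
for model in client.list_models():
    print(model.id)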

The challenge, however, is that ADT-Explorer running in a docker container cannot leverage those local cached credentials. So, what do you do?

The easiest way to handle this is to install the Azure CLI inside the Docker container, do an ‘az login’ there, and then start ADT-Explorer. The rest of this post walks you through that. It assumes you have Docker installed on your machine with Linux containers enabled.

The first step is to clone the repo with git.


git clone https://github.com/Azure-Samples/digital-twins-explorer

After you do that, navigate into the digital-twins-explorer folder and build the adt-explorer Docker image (note the dot/period at the end of the command – don’t forget it!)


docker build -t adt-explorer .

So far this follows the instructions provided by the ADT team, but this is where we will deviate. The ADT team’s instructions have you just run the container as-is. However, if you do that, DefaultAzureCredential won’t work. Instead, we need to install the Azure CLI and log in to Azure before we start the adt-explorer app. The first thing we need to do is run the adt-explorer Docker container, but we are going to start at a bash command prompt. Run the following command:


docker run -it -p3000:3000 adt-explorer bash

Note that we added ‘bash’ to the end of the command line, to start the container but override the default entrypoint and give us a bash prompt. It should look like this:

Now that we have a prompt, we need to install the Azure CLI. To install it, run this command:


curl -sL https://aka.ms/InstallAzureCLIDeb | bash

This should install the CLI for us. After a successful install, we can now do an ‘az login’ to authenticate to Azure. You will get a ‘code’ to use, and a URL to visit in your browser as shown below

Open your browser, navigate to https://microsoft.com/devicelogin, enter the code, and then sign in with your Azure credentials as shown below

The authentication process will be slightly different for every environment based on your IT department’s setup (2-factor auth, etc.), but the result should eventually be a successful login

After a successful login, you’ll see a list of your azure subscriptions in the docker container.

Now we can start the adt-explorer app. To do so, run


npm run start

You’ll see this in your container.

You can now open up a browser, navigate to http://localhost:3000. Click on the People icon in the upper right corner, enter your ADT instance URL, and off you go.

Enjoy, and as always, hit me up in the comments if you have any issues.

Azure IoT Edge local and offline dashboards

This post covers a common customer ask: how to display, report on, or dashboard data locally from Azure IoT Edge, even when offline.

One of the main uses for IoT is to have dashboards and reports showing the latest information from IoT devices. Often that happens with data sent from the devices to the cloud. In many IoT scenarios, especially in manufacturing, those dashboards also serve the needs of local operators or supervisors in the very plants or locations that supply that IoT data. So that data often “round trips” from the device up to Azure, and then the reports come back down to screens at the plant.

But what happens if you need lower latency reporting? Or you lose your Internet connection frequently because your plant is in an area with slow or unreliable connectivity?

A common customer ask is, in addition to sending the IoT data to Azure for centralized reporting, ML model training, etc., to also have the data reported and dashboarded locally. That allows for lower-latency display, as well as continued operation in the event of an Internet failure.

To address this need, I’m happy to announce that my Azure IoT Global Black Belt counterparts have published guidance and a sample implementation for doing just this. The sample implements a common manufacturing KPI called Overall Equipment Effectiveness (OEE), but it could be adapted to many other scenarios such as retail stores, warehouses, etc.

Please check it out at our github repo -> https://github.com/AzureIoTGBB/iot-edge-offline-dashboarding

Enjoy! And, as always, hit me up in the comments if you have questions.

IoT Transformers podcast

So… I did a thing.

My co-workers, Dani Diaz and Deb Oberly, host a very cool podcast called IoT Transformers. They typically host our Azure IoT customers and partners to talk about how they are transforming their businesses with IoT. It’s a great series full of really useful information about these customers’ and partners’ digital transformation journeys.

For this most recent episode, they decided to interview me (“Insights from Busbyland” 🙂 ). As one of the founding members of our IoT Global Black Belt team, I talked with them about changes I’ve seen in the industry, cool projects, and the IoT-related tech I’m most excited about.

I’ve also created a cool circular reference. The podcast references the blog which references the podcast which references the blog which references the podcast………

Anyway, if you have interest, and want to hear my thoughts on IoT and just how horrible my electronically recorded voice is, how many times I say ‘ummm’, and a little southern drawl, check it out. And don’t forget to go back and give the entire series a listen. You won’t regret it.

-Steve

Install IoT Edge on Red Hat Enterprise Linux (RHEL) – 7.x

This post demonstrates how to get Azure IoT Edge to work on Red Hat Enterprise Linux (RHEL)

Hi all. Sorry for the lack of content lately. Between some (minor) personal stuff and the coronavirus stuff for both me and my customers, it’s been a bit of a goat-rodeo around Busby Manor lately. 🙂


Recently I needed to help a customer get IoT Edge installed on a box running Red Hat Enterprise Linux (RHEL). In this case it was version 7.5, but this should work for other 7.x-based versions too... I think. I’m about as far away from a RHEL expert as you can get.

NOTE:  credit for most of this info goes to Justin Dyer, a peer of mine on the Azure IoT pre-sales team!

First off, if you look at the “platform support” documentation for IoT Edge, you’ll notice that RHEL is a “Tier 2” supported platform. That’s a fancy way of saying that either MSFT or someone we know has gotten it working on that platform and it generally works. However, it also means that it is not a “gating” platform for us, meaning it’s not a platform that we test extensively on before every release. In other words, not working on RHEL will not block or gate a release. That’s not because we don’t like it, or don’t want to “Tier 1” support it, but rather it’s just one that we haven’t gotten around (yet) to doing all the necessary work to get fully integrated into our extensive testing platform. We love all Linux! We’ve just prioritized based on how often we run into the various platforms in the field with our customers.

Now, with all the caveats out of the way: IoT Edge on RHEL DOES work, seems to work fine, and we DO provide RPM packages for it whenever we do a release.

Pre-requisites

Ok, enough preamble, let’s jump in. For RHEL, we provide RPM packages that you can install with YUM. The actual IoT Edge install is reasonably straightforward once you get through the big pre-req, which is container-selinux.

The big issue is that the moby engine (i.e. open-source docker) underneath IoT Edge needs a newer version of container-selinux than what ships on RHEL 7.5. We need version 2:2.95 or greater. If you have it already, great, proceed. 🙂


If you don’t, you can manually download it from here and update. Updating that package is left as an exercise for the reader (remember: I’m not a RHEL expert, but hopefully you are).

If you are running your own RHEL install, you can skip the next section and jump down to the “Install IoT Edge” section.

A note about RHEL on Azure VMs

Most of the testing I did here was on RHEL running in an Azure VM built with our ready-made RHEL images.  If you are running it on your own, you can skip this section.

container-selinux is found in the “rhel-7-server-extras-rpms” repo, which our Azure RHEL VMs do not have access to. There are instructions on how to “remove the version lock and install the non-eus repos” in order to get access to it.

But if you don’t want to read all that, these are the net commands you need to run:


sudo rm /etc/yum/vars/releasever
sudo yum --disablerepo='*' remove 'rhui-azure-rhel7-eus'
sudo yum --config='https://rhelimage.blob.core.windows.net/repositories/rhui-microsoft-azure-rhel7.config' install 'rhui-azure-rhel7'
sudo yum install container-selinux

Once those are complete, you can proceed with the “Install IoT Edge” section below.

Install IoT Edge

Finding the right packages

Before we install IoT Edge, a short note about how we release IoT Edge. For all of the “non-docker-based” parts of the runtime (i.e. ignoring edgeAgent and edgeHub for the moment), there are really four major components:

  • the moby engine: basically the open-source version of docker
  • the moby CLI: gives you the ‘docker’ commands
  • libiothsm: an MSFT-provided library that implements the security abstraction layer that lets the edge runtime talk to various security hardware (like TPMs)
  • iotedged: the IoT Edge “Security Manager”, the daemon-based part of IoT Edge and really the component that ‘bootstraps’ the rest of IoT Edge

When we do a ‘release’ (in the GitHub sense of ‘release’) of IoT Edge, we only provide new packages for those components that changed with that release. So, for example, in the 1.0.8 release, we had changes in all four components and you’ll see (under “assets”) new *.deb and *.rpm packages for all of them.

But in 1.0.9, only libiothsm and iotedged changed, so you only see new packages for those two components.

Unfortunately, that complicates the edge install for us a little bit. For a given IoT Edge release, you need to spelunk a little to find the latest version of each component. For the moby engine and CLI, you can usually find the latest version on the packages.microsoft.com site; that’s the easier one. For the IoT Edge components, it unfortunately requires a little more digging: for the release you want to install, say 1.0.9, you work backwards through the releases to find the latest one in which we updated the libiothsm and iotedge components, then grab those links under ‘assets’ of that release and capture the URLs to the libiothsm and iotedge packages.

Sorry about that. The good news is, that’s the hard part.

Finally, install IoT Edge

Ok, finally, we can install IoT Edge.

The first step is to download the packages. Make a folder on your device to hold them, cd into that folder, and then run:


wget https://packages.microsoft.com/centos/7/prod/moby-cli-3.0.10%2Bazure-0.x86_64.rpm
wget https://packages.microsoft.com/centos/7/prod/moby-engine-3.0.10%2Bazure-0.x86_64.rpm
wget https://github.com/Azure/azure-iotedge/releases/download/1.0.9/libiothsm-std_1.0.9-1.el7.x86_64.rpm
wget https://github.com/Azure/azure-iotedge/releases/download/1.0.9/iotedge-1.0.9-1.el7.x86_64.rpm

Those URLs are the ‘latest’ releases of each component as of the 1.0.9 version of IoT Edge. As future versions ship, you’ll need to check whether the various components have been updated and replace the URLs appropriately.

Next, we just install the IoT Edge components with the following commands (run them one at a time, as they ask a y/n question in the middle):


sudo yum install moby-cli-3.0.10+azure-0.x86_64.rpm
sudo yum install moby-engine-3.0.10+azure-0.x86_64.rpm
sudo rpm -Uhv libiothsm-std_1.0.9-1.el7.x86_64.rpm
sudo rpm -Uhv iotedge-1.0.9-1.el7.x86_64.rpm

Obviously, if you downloaded newer packages, replace the file names accordingly.

Once those packages finish installing, all you need to do is open config.yaml (at /etc/iotedge/config.yaml), add your device connection string or DPS provisioning information, and restart iotedge with:


sudo systemctl restart iotedge

There you go.  Enjoy.  As always, if you have issues, feel free to hit me up in the comments!

Azure Device Provisioning Server over MQTT using x509 Certificates

In a previous post, I showed you how to register a device with Azure’s Device Provisioning Service (DPS) over raw MQTT. A reader/commenter asked how the process would differ if we used x.509 certificate-based authentication instead of the SAS-token-based authentication that the article was based on.

Since it’s inevitable that I’ll run across this in a customer situation, I thought I’d tackle it. Based on the knowledge from the previous article, as well as my article on DPS over the REST APIs, it was pretty straightforward. The process was nearly identical except for a few fields in the connection information, specifically the IoT Hub root cert, the device cert/key, and leaving off the SAS token. I’ll cover the details below.

Generating Certs

The steps for generating the device certificates and creating the enrollment in DPS are the same as outlined in my DPS over REST API article, specifically the sections titled “Prep work (aka – how do I generate test certs?)” and “x.509 attestation with Individual Enrollments – setup”, so I won’t repeat them here. For the screenshots below, I called my enrollment registration id ‘dpstestdev01’.

The only other thing you need is the IoT root CA cert. This is the Baltimore-based root CA cert from which all of the IoT Hub and DPS “server-side” TLS certificates are generated. The client needs this to validate that it is indeed talking to the genuine Microsoft endpoint and not a ‘man in the middle’. The easiest way to get this cert is to open this file from the Azure IoT C SDK, copy everything from (and including) line 23 to (and including) line 43, strip out the quotes at the beginning and end of each line, and strip the ‘\r\n’ off the ends. Save the file with a .pem extension. We will call that the “DPS-root-CA” cert.

Client Setup

You can leverage any MQTT 3.1.1 client to talk to DPS; however, like in previous articles, I’m going to use MQTT.fx, which is an excellent MQTT GUI tool for doing MQTT ‘manually’. It gives you a really good feel for what’s happening under the covers without writing a bunch of code.

Through a series of screenshots below, Iā€™ll show you my configuration.

The first step is to open MQTT.fx, click the little ‘gear’ button next to the Connect button, and then, in the bottom right, click the “+” button to create a new connection. You can call it anything (I called mine ‘dpscert’ in the screenshots below).

general-settings

This screenshot shows the ‘general’ settings:

  • The type is MQTT Broker
  • The broker address is the global DPS endpoint (global.azure-devices-provisioning.net)
  • The port is the MQTTS (TLS) port, 8883
  • The client ID is the ‘registration id’ from DPS, which in this case is the CN/Subject name you used for your device cert when you generated it
  • The only other change from the defaults is to explicitly choose MQTT version 3.1.1

user-credentials

This screenshot shows the user credentials. For DPS, the user name is of the form:

{idScope}/registrations/{registration_id}/api-version=2019-03-31

where {idScope} is the idScope of your DPS instance.

Note that, unlike the SAS-Token case, the password is BLANK for x.509 authentication.

ssl-tls-certs

This screenshot is the most important one and the biggest difference from the SAS-Token case.

  • Make sure you explicitly select TLS version 1.2 (we don’t support older versions)
  • In our use case, we are using self-signed certificates, so choose that option
  • For the “CA file”, use the DPS-root-CA cert we captured from GitHub earlier (the Baltimore root cert)
  • For the Client Certificate file, use the device certificate we created earlier
  • For the Client Key file, use the private key for the device cert that we generated earlier
  • Make sure to check the “PEM formatted” checkbox, as that’s the format our certs are in

All the other tabs are just left default.

Click OK to close this dialog. Click the “Connect” button to connect to DPS.

From this point on, you subscribe and publish exactly like you did in the previous article and/or as specified in the official DPS documentation here.
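
If you eventually want to move from MQTT.fx to code, the same connection settings translate pretty directly. Below is a minimal sketch using the paho-mqtt Python package (written against paho-mqtt 1.x); the id scope placeholder and the certificate file names are assumptions, so substitute your own values and the files you generated earlier:

import json
import ssl
import paho.mqtt.client as mqtt

ID_SCOPE = "[your DPS id scope]"
REGISTRATION_ID = "dpstestdev01"   # must match the CN/subject of your device cert

def on_connect(client, userdata, flags, rc):
    print("connected, rc =", rc)
    # listen for DPS responses, then kick off the registration
    client.subscribe("$dps/registrations/res/#", qos=1)
    client.publish("$dps/registrations/PUT/iotdps-register/?$rid=1",
                   json.dumps({"registrationId": REGISTRATION_ID}), qos=1)

def on_message(client, userdata, msg):
    print(msg.topic)
    print(msg.payload)

client = mqtt.Client(client_id=REGISTRATION_ID, protocol=mqtt.MQTTv311)
client.username_pw_set("%s/registrations/%s/api-version=2019-03-31" % (ID_SCOPE, REGISTRATION_ID),
                       password=None)              # no password for x.509 auth
client.tls_set(ca_certs="dps-root-ca.pem",         # the DPS-root-CA file you saved earlier (your name may differ)
               certfile="device_cert.pem",         # hypothetical file names - use your device cert/key
               keyfile="device_key.pem",
               tls_version=ssl.PROTOCOL_TLSv1_2)
client.on_connect = on_connect
client.on_message = on_message
client.connect("global.azure-devices-provisioning.net", 8883)
client.loop_forever()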

Enjoy, and as always, let me know if you run into any issues. Hit me up on Twitter (@BamaSteveB), email (steve.busby (at) microsoft.com), or in the comments below.

Azure IoT Device Provisioning Service (DPS) over MQTT

Continuing the theme of “doing things on Azure IoT without using our SDKs”, this article describes how to provision IoT devices with Azure IoT’s Device Provisioning Service over raw MQTT.

Previously, I wrote an article that describes how to leverage Azure IoT’s Device Provisioning Service over its REST API, as well as an article about connecting to IoT Hub/Edge over raw MQTT. Where possible, I do recommend using our SDKs, as they provide a nice abstraction layer over the supported transport protocols and free you from all that protocol-level detail. However, we understand there are times and reasons where it’s just a better fit to use the raw protocols.

To support this, the Azure IoT DPS engineering team has documented the necessary technical details to register your device via MQTT. That document may provide enough detail for you to figure out how to do it, but since I needed to test it for a customer anyway, I thought I’d capture a real-world example in hopes it can help others.

To keep the scenario simple, I chose to use symmetric key attestation, but this would still work with any of the attestation methods supported by DPS.

Create individual enrollment

The first step is to create the enrollment in DPS. In the Azure portal, in your DPS instance, grab your Scope ID from the upper right of the ‘Overview’ tab as shown below (I’ve blacked out part of my details, for obvious reasons).

dps-scope-id

Once you have that, copy it somewhere like Notepad; we’ll use it later. Now we can create our enrollment. On the left nav, click on “Manage Enrollments” and then “Add Individual Enrollment”. For “Mechanism”, choose Symmetric Key and enter a registration ID of your choosing (for the example further below, I used ‘my-mqtt-dev01’).

create-individual-enrollment

Click Save. Then drill back into your enrollment in the portal, copy the “Primary Key”, and save it for later use.

Generate SAS token

Once you’ve created the enrollment and gotten the device key, we need to generate a SAS token for authentication to the DPS service. A description of the SAS token, and several code samples for generating one in various languages, can be found here. Some of the inputs (discussed below) will be different for DPS versus IoT Hub, but the basic structure of the SAS token is the same.

For my purposes, I used this python code to generate mine:

————-

from base64 import b64encode, b64decode
from hashlib import sha256
from time import time
from urllib import quote_plus, urlencode
from hmac import HMAC

def generate_sas_token(uri, key, policy_name, expiry=3600000000):
    ttl = time() + expiry
    sign_key = "%s\n%d" % ((quote_plus(uri)), int(ttl))
    print(sign_key)
    signature = b64encode(HMAC(b64decode(key), sign_key, sha256).digest())

    rawtoken = {
        'sr': uri,
        'sig': signature,
        'se': str(int(ttl))
    }

    if policy_name is not None:
        rawtoken['skn'] = policy_name

    return 'SharedAccessSignature ' + urlencode(rawtoken)

uri = '[dps URI]'
key = '[device key]'
expiry = [SAS token duration]
policy = 'registration'

print(generate_sas_token(uri, key, policy, expiry))

——–

where:

  • [dps URI] is of the form [DPS scope id]/registrations/[registration id]
  • [device key] is the primary key you saved earlier
  • [SAS token duration] is the number of seconds you want the token to be valid for
  • policy is required to be ‘registration’ for DPS SAS tokens

running this code will give you a SAS token that looks something like this (changing a few random characters to protect my DPS):

SharedAccessSignature sr=0ne00055505%2Fregistrations%2Fmy-mqtt-dev01&skn=registration&sig=gMpllKo7qS1VR31vyfsT6JAcc4%2BHIu2gQSyai0Uz0KM%3D&se=1579698526
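
One note: the script above is written for Python 2 (the urllib import). If you’re on Python 3, a roughly equivalent sketch using urllib.parse would look like this (fill in the same placeholder values as above):

import base64, hashlib, hmac, time
from urllib.parse import quote_plus, urlencode

def generate_sas_token(uri, key, policy_name, expiry=3600):
    ttl = int(time.time()) + expiry
    sign_key = "%s\n%d" % (quote_plus(uri), ttl)
    # HMAC-SHA256 over the sign_key, keyed with the base64-decoded device key
    signature = base64.b64encode(
        hmac.new(base64.b64decode(key), sign_key.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    rawtoken = {"sr": uri, "sig": signature, "se": str(ttl)}
    if policy_name is not None:
        rawtoken["skn"] = policy_name
    return "SharedAccessSignature " + urlencode(rawtoken)

# same inputs as the Python 2 version above
print(generate_sas_token("[dps URI]", "[device key]", "registration", 3600))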

Now that we have our authentication credentials, we are ready to make our MQTT call.

Example call

The documentation does a decent job of showing the MQTT parameters and flow (read it first!), so I’m not going to repeat that here. What I will show is an example call with screenshots to ‘make it real’. For my testing, I used MQTT.fx, which is a pretty nice little interactive MQTT test client.

Once you download and install it, click on the little lightning bolt to switch away from localhost and allow you to create a new connection to an MQTT server.

mqtt-lightning

After that, click on the settings symbol next to the edit box to open the settings dialog that lets you edit the various connection profiles:

mqttfx-settings-icon

On the “Edit Connection Profiles” dialog, in the very bottom left-hand corner, click the “+” symbol to create a new connection profile.

Give your connection a name and choose MQTT Broker as the Profile Type

mqtt-profile-settings-general

Enter the following settings in the top half of the dialog:

  • For “Broker Address”, use ‘global.azure-devices-provisioning.net’
  • For “Broker Port”, use “8883”
  • For “Client ID”, enter the registration ID you used in the portal for your device

Click on the General ‘tab’ at the bottom. As in the screenshot above, for MQTT Version, uncheck the “Use Default” box and explicitly choose version 3.1.1. Leave the other settings on this tab alone.

Click on the “User Credentials” tab:

  • For “User Name”, enter [DPS Scope Id]/registrations/[registration id]/api-version=2019-03-31 (replacing the scope id and registration id with your values)
  • For “Password”, copy/paste in the SAS token you generated earlier

mqtt-profile-user-creds

Move to the SSL/TLS tab. Check the box for “Enable SSL/TLS” and make sure that TLSv1.2 is chosen as the protocol.

mqtt-profile-tls

Leave the Proxy and LWT tabs alone.

Click OK to save the settings and return to the main screen.

Click on the Connect button and you should get a successful connection (you can verify by looking at the “log” tab).

Once connected, navigate to the “Subscribe” tab. We will set up a subscription on the DPS ‘response’ MQTT topic to receive responses to our registration attempts from DPS. On the “Subscribe” tab, enter ‘$dps/registrations/res/#’ into the subscription box, choose “QoS 1” from the buttons on the right, and click “Subscribe”. You should see an active subscription get set up, waiting on responses.

mqtt-subscription-setup

Click back over to the “Publish” tab and we will make our registration attempt. In the publish edit box, enter $dps/registrations/PUT/iotdps-register/?$rid={request_id}

Replace {request_id} with an integer of your choosing (1 is fine to start with). This lets us correlate requests with the responses we get back from the service. For example, I entered:

$dps/registrations/PUT/iotdps-register/?$rid=1

In the big edit box beneath the publish edit box, we need to enter a ‘payload’ for the request. For DPS registration requests, the payload takes the form of a JSON document like this: {"registrationId": "<registration id>"}

For example, for my sample it’s:

{“registrationId”: “my-mqtt-dev01”}

mqtt-reg-publish

Hit the “Publish” button.

Flip back over to the Subscribe tab and you should see, on the right-hand side of the screen, that we’ve received a response from DPS. You should see something like this:

mqtt-registration-assigning

This indicates that DPS is in the process of ‘assigning’ and registering our device to an IoT Hub. This is a potentially long-running operation, so we have to query for its status. To do that, we are going to publish another MQTT message, and for that we need the ‘operationId’ from the message we just received. In the screenshot above, mine looks like this:

4.22724a0213a69c4d.9750f5e6-b4c3-4760-9b15-4e74d6120bd1

Copy that ID, as we’ll use it in the next step.

To check on the status of the operation, switch back over to the Publish tab and replace the values in the publish edit box with this

$dps/registrations/GET/iotdps-get-operationstatus/?$rid={request_id}&operationId={operationId}

Replace {request_id} with a new request id (2 in my case) and {operationId} with the operationId you just copied. For example, with my sample values and the response received above, my request looks like this:

$dps/registrations/GET/iotdps-get-operationstatus/?$rid=2&operationId=4.22724a0213a69c4d.9750f5e6-b4c3-4760-9b15-4e74d6120bd1

Delete the JSON in the payload box and click “Publish”.

Switch back over to the Subscribe tab and you should notice that you’ve received a response to your operation status query, similar to this:

mqtt-registration-status

Notice the status of “assigned”, as well as details like “assignedHub”, which give the state of the successful registration and the connection details.

If you navigate back over to the Azure portal and look at the enrollment record for your device (refresh the page; you may have to exit and re-enter), you should see something like this:

mqtt-registration-success

This indicates that our DPS registration was successful.

In the “real world”, your application will make the registration attempt and then poll the operation status until it reaches the ‘assigned’ state. There will be intermediate states while it is being assigned, but doing this manually through a GUI, I’m not fast enough to catch them. 🙂

Enjoy, and let me know in the comments if you have any questions or issues.

Connect MXChip DevKit to Azure IoT Edge

A customer of mine who is working on an IoT POC to show their management wanted to connect the MXChip DevKit to IoT Hub via IoT Edge. This turned out to be trickier than it should be, as the connection between the MXChip and the IoT Edge box is, like all IoT Hub/Edge connections, TLS-encrypted. So you have to get the MXChip to trust the “TLS server” certificate that IoT Edge returns when a client tries to connect. Thanks to some great ground-laying work by Arthur Ma, I wrote a hands-on lab walking you through this scenario. The lab can be found on my team’s GitHub site here.

Enjoy, and let me know if you have any problems.

Azure IoT Device Provisioning Service via REST–part 2

This is part 2 of a two-part post on provisioning IoT devices to Azure IoT Hub via the Azure IoT Device Provisioning Service (DPS) REST API. Part 1 described the process with x.509 certificate attestation from devices, and this part describes symmetric key attestation.

I won’t repeat all the introduction, caveats, etc. that accompanied part 1, but you may want to take a quick peek at them so you know what I will, and will not, be covering here.

If you don’t fully understand the symmetric key attestation options for DPS, I recommend you go read the docs here first and then come back...

Ok, welcome back!

So let’s just jump right in. Similar to part 1, there will be a couple of sections of ‘setup’, depending on whether you choose to go with individual enrollments or group enrollments in DPS. Once that is done, and the accompanying attestation tokens are generated, the actual API calls are identical between the two.

Therefore, for the first two sections, you can choose the one that matches your desired enrollment type and read it for the required setup (or read them both if you are the curious type), then you can just jump to the bottom section for the actual API calls.

But, before we start with setup, there’s a little prep work to do.

Prep work (aka – how do I generate the SAS tokens?)

Symmetric key attestation in DPS works, just like pretty much all the rest of Azure, on the concept of SAS tokens. In Azure IoT, these tokens are typically derived from a cryptographic key tied to a device or a service-access level. As mentioned in the overview link above (in case you didn’t read it), DPS has two options for these keys. One option is an individual key per device, as specified or auto-generated in the individual enrollment. The other option is a group enrollment key, from which you derive a device-specific key that you leverage for your SAS token generation.

Generating SAS tokens

So first, let’s talk about and prep for the generation of our SAS tokens, independent of what kind of key we use. The use and generation of SAS tokens is generally the same for both DPS and IoT Hub, so you can see the process and sample code in various languages here. For my testing, I pretty much shamelessly re-used the Python example from that page, slightly modified (to actually call the generate_sas_token method).

from base64 import b64encode, b64decode
from hashlib import sha256
from time import time
from urllib import quote_plus, urlencode
from hmac import HMAC

def generate_sas_token(uri, key, policy_name, expiry=3600):
    ttl = time() + expiry
    sign_key = "%s\n%d" % ((quote_plus(uri)), int(ttl))
    print sign_key
    signature = b64encode(HMAC(b64decode(key), sign_key, sha256).digest())

    rawtoken = {
        'sr': uri,
        'sig': signature,
        'se': str(int(ttl))
    }

    if policy_name is not None:
        rawtoken['skn'] = policy_name

    return 'SharedAccessSignature ' + urlencode(rawtoken)

uri = '[resource_uri]'
key = '[device_key]'
expiry = [expiry_in_seconds]
policy = '[policy]'

print generate_sas_token(uri, key, policy, expiry)

The parameters at the bottom of the script, which I hard-coded because I am lazy busy, are as follows:

  • [resource_uri] – this is the URI of the resource you are trying to reach with this token. For DPS, it is of the form ‘[dps_scope_id]/registrations/[dps_registration_id]’, where [dps_scope_id] is the scope id associated with your DPS instance, found on the Overview blade of your DPS instance in the Azure portal, and [dps_registration_id] is the registration id you want to use for your device. It will be whatever you specified in an individual enrollment in DPS, or it can be anything you want in a group enrollment as long as it is unique. Frequently used ideas here are combinations of serial numbers, MAC addresses, GUIDs, etc.
  • [device_key] is the device key associated with your device. This is either the one specified or auto-generated for you in an individual enrollment, or a derived key for a group enrollment, as explained a little further below
  • [expiry_in_seconds] is the validity period of this SAS token in seconds... ok, not going to insult your intelligence here
  • [policy] is the policy with which the key above is associated. For DPS device registration, this is hard-coded to ‘registration’

So an example set of inputs for a device called ‘dps-sym-key-test01’ might look like this (with the scope id and key modified to protect my DPS instance from the Russians!):

uri = ‘0ne00057505/registrations/dps-sym-key-test01’
key = ‘gPD2SOUYSOMXygVZA+pupNvWckqaS3Qnu+BUBbw7TbIZU7y2UZ5ksp4uMJfdV+nTIBayN+fZIZco4tS7oeVR/A==’
expiry = 3600000
policy=’registration’

Save the script above to a *.py file. (Obviously, you’ll need Python 2.7 installed to run the script as written; it uses the Python 2 urllib and print syntax.)

If you are only doing individual enrollments, you can skip the next section, unless you are just curious.

Generating derived keys

For group enrollments, you don’t have individual enrollment records for devices in DPS, so you don’t have individual device keys. To make this work, we take the enrollment-group-level key and cryptographically derive a device-specific key from it. This is done by essentially hashing the registration id for the device with the enrollment-group-level key. The DPS team has provided some scripts/commands for doing this for both bash and PowerShell here. I’ll repeat the bash command below just to demonstrate.

KEY=[group enrollment key]
REG_ID=[registration id]

keybytes=$(echo $KEY | base64 --decode | xxd -p -u -c 1000)
echo -n $REG_ID | openssl sha256 -mac HMAC -macopt hexkey:$keybytes -binary | base64

where [group enrollment key] is the key from your group enrollment in DPS. This will generate a cryptographic key that uniquely represents the device specified by your registration id. We can then use that key as the ‘[device_key]’ in the Python script above to generate a SAS token specific to that device within the group enrollment.
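
If you’d rather derive the device key in Python instead of bash (for example, inside a provisioning script), a small sketch that does the same HMAC-SHA256 derivation might look like this:

import base64, hashlib, hmac

def derive_device_key(group_key, registration_id):
    # HMAC-SHA256 of the registration id, keyed with the base64-decoded group enrollment key
    key_bytes = base64.b64decode(group_key)
    mac = hmac.new(key_bytes, registration_id.encode("utf-8"), hashlib.sha256)
    return base64.b64encode(mac.digest()).decode("utf-8")

print(derive_device_key("[group enrollment key]", "[registration id]"))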

Ok – enough prep, let’s get to it. The next section shows the DPS setup for an individual enrollment. Skip to the section beneath it for group enrollment.

DPS Individual Enrollment – setup

The setup for an individual device enrollment with symmetric key in DPS is pretty straightforward. Navigate to the “Manage enrollments” blade from the left nav underneath your DPS instance and click “Add Individual Enrollment”. On the ‘Add Enrollment’ blade, for Mechanism, choose “Symmetric Key”, then below, enter your desired registration id (and an optional device id for IoT Hub if you want it to be different). It should look similar to the below (click on the pic for a bigger version).

dps-symkey-individual-setup

Click Save. Once saved, drill back into the device and copy the Primary Key, and remember your registration id; we’ll need both later.

That’s it for now. You can skip to the “call DPS REST APIs” section below, or read on if you want to know how to do this with a group enrollment.

DPS Group Enrollment – setup

The setup for a group enrollment with symmetric key is only slightly more complicated than the individual case. On the portal side, it’s fairly simple. In the Azure portal, under your DPS instance, on the left nav click on ‘Manage enrollments’ and then “Add Group Enrollment”. On the Add Enrollment page, give the enrollment a meaningful name and set Attestation Type to Symmetric Key, like the screenshot below.

dps-symkey-group-setup

Once you do that, click Save, and then drill back down into the enrollment and copy the “Primary Key” that got generated. This is the group key referenced above, from which we will derive the individual device keys.

In fact, let’s do that before the next section. Recall the bash command given above for deriving the device key; below is an example using the group key from my ‘dps-test-sym-group1’ group enrollment above, and I’ll just use ‘dps-test-sym-device01’ as my registration id.

dps-symkey-derive-device-key

You can see from the picture that the script generated a device-specific key (by hashing the registration id with the group key).

Just like with the individual enrollment above, we now have the pieces we need to generate our SAS token and call the DPS registration REST APIs.

call DPS REST APIs

Now that we have everything set up and enrolled, and our device-specific keys ready, we can call the APIs. First we need to generate our SAS token to authenticate. Plug the values from your DPS instance into the Python script you saved earlier. For the [device_key] parameter, be sure to plug in either the individual device key you copied earlier or, for a group enrollment, the derived key you just created (not the group enrollment key).

Below is an example of a run with my keys, etc

dps-symkey-generate-sas

The very last line is the one we need. In my case, it was (with a couple of characters changed to protect my DPS):

SharedAccessSignature sr=0ne00052505%2Fregistrations%2Fdps-test-sym-device01&skn=registration&sig=FKOnylJndmpPYgJ5CXkw1pw3kiywt%2FcJIi9eu4xJAEY%3D&se=1568718116

So we now have the pieces we need for the API call. 

The curl command for the registration API looks like this (with the variable parts in [brackets]):

curl -L -i -X PUT -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: [sas_token]' -d '{"registrationId": "[registration_id]"}' https://global.azure-devices-provisioning.net/[dps_scope_id]/registrations/[registration_id]/register?api-version=2019-03-31

where

  • [sas_token] is the one we just generated
  • [dps_scope_id] is the one we grabbed earlier from the Azure portal
  • [registration_id] is the one we chose for our device
  • the -L tells curl to follow redirects
  • -i tells curl to output response headers so we can see them
  • -X PUT makes this a PUT command
  • -H 'Content-Type: application/json' and -H 'Content-Encoding: utf-8' are required and tell DPS we are sending utf-8 encoded JSON in the body (change the encoding to whatever matches what you are sending)

dps-symkey-registration-call_results

Above is an example of my call and the results returned.

Note two things. One is the operationId. DPS enrollment in an IoT Hub is a (potentially) long-running operation, and thus is done asynchronously. So to see the status of your IoT Hub provisioning, we’ll need to poll for status; I’ll get to that in a minute. The second thing is the “status” field, which begins as ‘assigning’.

The next API call we need to make is to get the status. You’ll basically do this in a loop until you get either a success or failure status. The valid status values for DPS are:

    • assigned – the return value from the status call will indicate which IoT Hub the device was assigned to
    • assigning
    • disabled – the device enrollment record is disabled in DPS, so we can’t assign it
    • failed – assignment failed. There will be an errorCode and errorMessage returned in a registrationState record in the returned JSON to indicate what failed
    • unassigned – ummm... no clue.

To make the aforementioned status call, you need to copy the operationId from the return above. The curl command for that call is (with the variable parts in [brackets]):

curl -L -i -X GET -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: [sas_token]' https://global.azure-devices-provisioning.net/[dps_scope_id]/registrations/[registration_id]/operations/[operation_id]?api-version=2019-03-31

Use the same sas_token and registration_id as before, and the operation_id you just copied.

A successful call looks like this:

dps-symkey-operation-status

Unfortunately, I’m not a fast enough copy/paste-r to catch it in a status other than ‘assigned’ (DPS is just too fast for me). But you can do this all programmatically or in a script.
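
For example, a rough sketch of that polling loop in Python, using the requests library and the same SAS token in the Authorization header (the bracketed placeholders are yours to fill in; this is illustrative, not the official SDK):

import time
import requests

DPS_SCOPE_ID = "[dps_scope_id]"          # from the portal
REGISTRATION_ID = "[registration_id]"    # your device's registration id
SAS_TOKEN = "[sas_token]"                # generated earlier
OPERATION_ID = "[operation_id]"          # returned by the register call

url = ("https://global.azure-devices-provisioning.net/%s/registrations/%s/operations/%s"
       % (DPS_SCOPE_ID, REGISTRATION_ID, OPERATION_ID))
headers = {"Content-Type": "application/json",
           "Content-Encoding": "utf-8",
           "Authorization": SAS_TOKEN}

status = "assigning"   # the register call starts us in this state
while status == "assigning":
    resp = requests.get(url, headers=headers, params={"api-version": "2019-03-31"})
    status = resp.json().get("status", "failed")
    print(status)
    if status == "assigning":
        time.sleep(2)   # give DPS a moment before polling again

print(resp.json())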

Voila!

That’s it. Done. You can check the status of your registration in the Azure portal and see that the device was assigned.

dps-symkey-done

Enjoy, and as always, if you have questions or even suggested topics (remember, it has to be complex, technical, and not well covered in the docs), hit me up in the comments.

Azure IoT Device Provisioning Service via REST–part 1

This will be a two-part article on how to provision IoT devices using Microsoft’s Azure IoT Device Provisioning Service, or DPS, via its REST API. DPS is part of our core IoT platform. It gives you a global-scale solution for near-zero-touch provisioning and configuration of your IoT devices. We make the device-side provisioning pretty easy with nice integration with our open-source device SDKs.

With that said, one of our core design tenets for our IoT platform is that, while the SDKs do make life easier for you in most instances, you do not have to use them to access any of our services, whether that’s core IoT Hub itself, IoT Edge, or in this case, DPS.

I recently had a customer ask about invoking DPS for device registration from their field devices over the REST API. For various reasons I won’t dive deep into, they didn’t want to use our SDKs and preferred the REST route. The REST APIs for DPS are documented, but.. well.. we’ll just say “not very well” and leave it at that...

ahem...

anyway..

So I set out to figure out how to invoke device registration via the REST APIs for my customer and thought I would document the process here.

Ok, enough salad, on to the meat of the post.

First of all, this post assumes you are already familiar with DPS concepts and have maybe even played with it a little. If not, review the docs here and come back. I’ll wait...

Secondly, in case you didn’t notice the “part 1” in the title: because of the length, this will be a two-part post. The first half will show you how to invoke DPS registration over REST for the x.509 certificate attestation use cases, both individual and group enrollments, and part 2 will cover the symmetric key attestation use cases.

NOTE: but what about TPM-based attestation, you might ask? Well, I’m glad you asked! Using a TPM chip for storing secrets on any IoT device is a best practice that we highly recommend! With that said, I’m not covering it for three reasons: 1) the process is relatively similar to the other scenarios, 2) despite it being our best practice, I don’t personally have any customers using it today (shame on us all!), and 3) I don’t have an IoT device right now that has a TPM to test with. 🙂

One final note: I’m only covering the registration part from the device side. There are also REST APIs for enrolling devices and many other ‘back end’ processes that I’m not covering. This is partially because the device side is the (slightly) harder side, but also because it’s the side that is more likely to need to be invoked over REST for constrained devices, etc.

Prep work (aka – how do I generate test certs?)

The first thing we need for x.509 certificate attestation is, you guessed it, x.509 certificates. There are several tools available out there to generate them, some easy, and some requiring an advanced degree in “x.509 Certificate Studies”. Since I don’t have a degree in x.509 Certificate Studies, I chose the easy route. To be clear, I just picked this method; any method that can generate valid x.509 CA and individual certs will work.

The tool I chose is provided by the nice people who write and maintain our Azure IoT SDK for Node.js. This tool is specifically written to work/test with DPS, so it was ideal. Like about 50 other tools, it’s a friendly wrapper around openssl. Instructions for installing the tool are provided in the readme and I won’t repeat them here. Note that you need Node.js version 9.x or greater installed for it to work.

For my ‘dev’ environment, I used Ubuntu running on Windows Subsystem for Linux (WSL) (man, that’s weird for a 21-year MSFT veteran to say), but any environment that can run node and curl will work.

Also, a final note about the cert gen scripts: these are scripts for creating test certificates... please.. pretty please.. don’t use them for production. If you do, Santa will be very unhappy with you!

In the Azure portal, I’ll assume you’ve already set up a DPS instance. At this point, note the scope id for your instance (upper right-hand side of the DPS Overview blade); we’ll need it here in a sec. From here on out, I’ll refer to that as the [dps_scope_id].

The next two sections tell you how to create and set up the certs we’ll need for either individual or group enrollments. Once the certs are properly set up, the process for making the API calls is the same, so pick the one of the next two sections that applies to you (or read them both if you are really thirsty for knowledge!).

x.509 attestation with Individual Enrollments – setup

Let’s start with the easy case, which is x.509 attestation with an individual enrollment. The first thing I want to do is generate some certs.

For my testing, I used the create_test_cert tool to create a root certificate and a device certificate for my individual enrollment, using the following two commands:

node create_test_cert.js root "dps-test-root"

node create_test_cert.js device "dps-test-device-01" "dps-test-root"

The tool creates the certificates with the right CN/subject, but when saving the files, it drops the hyphens and camel-cases the name. I have no idea why. Just roll with it. Or feel free to create a root cert and device cert using some other tool. The key is to make the subject/CN of the device cert the exact same thing you plan to make your registration id in DPS. They HAVE to match, or the whole thing doesn’t work. For the rest of the article, I will refer to the registration id as [registration_id].
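
If you want to double-check the CN before creating the enrollment, a quick sketch using the Python cryptography package (assuming the dpsTestDevice01_cert.pem file name from my run) looks like this:

from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.x509.oid import NameOID

# quick sanity check: the device cert's CN must exactly match the DPS registration id
with open("dpsTestDevice01_cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read(), default_backend())

cn = cert.subject.get_attributes_for_oid(NameOID.COMMON_NAME)[0].value
print(cn)   # expect: dps-test-device-01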

So my cert files look like this... (click on the pic to make it larger)

dps-certs-individual

Back in the Azure portal, under “Manage enrollments” on the left nav, click on “Add Individual Enrollment”. On the “Add Enrollment” blade, leave the default of “X.509” for the Mechanism. For “Primary Certificate .pem or .cer file”, click on the folder and upload/choose the device certificate you generated earlier (in my case, that’s dpsTestDevice01_cert.pem).

The top half of my “Add Enrollment” blade looks like this (everything below it is default):

dps-cert-individual-dps-setup

I chose to leave the IoT Hub Device ID field blank. If you want your device ID in IoT Hub to be something different than your registration id (which is the CN/subject name in your cert), then you can enter a different name here. Click Save to save your enrollment and we are ready to go.

x.509 attestation with Group Enrollments – setup

Ok, if you’re reading this section, you are either a curious soul, or you decided to go with the group enrollment option for x.509 attestation with DPS.

Under the umbrella of group enrollment, there are actually two different options for the ‘group’ certificate, depending on whether you want to leverage a root CA certificate or an intermediate certificate. For either option, for test purposes, we’ll go ahead and generate our test certificates; the difference is primarily which one we give to DPS. In either case, once the certificate is in place, we can authenticate and register any device that presents a client certificate signed by the certificate we gave to DPS. For details of the philosophy behind x.509 authentication/attestation for IoT devices, see this article.

For this test, I generated three certificates: a root CA cert, an intermediate CA cert (signed by the root), and an end IoT device certificate (signed by the intermediate CA certificate). I used the following commands to do so.

node create_test_cert.js root "dps-test-root"

This generated my root CA certificate.

node create_test_cert.js intermediate "dps-test-intermediate" "dps-test-root"

This generated an intermediate CA certificate signed by the root CA cert I just generated.

node ../create_test_cert.js device "dps-test-device-01" "dps-test-intermediate"

This generated a device certificate for a device called "dps-test-device-01", signed by the intermediate certificate above. So now I have a device certificate that ‘chains up’ through the intermediate to the root.

At this point, you have the option of setting up DPS with either the root CA certificate or the intermediate certificate for attesting the identity of your end devices. The setup process for each option is slightly different and described below.

Root CA certificate attestation

For root CA certificate attestation, you need to upload the root CA certificate that you generated, and then also provide proof of possession of the private key associated with that root CA certificate. That is to keep someone from impersonating the holder of the public side of the root CA cert, by making sure they have the corresponding private key as well.

The first step in root CA registration is to navigate to your DPS instance in the portal and click “Certificates” in the left-hand nav menu. Then click “Add”. In the Add Certificate blade, give your certificate a name that means something to you and click on the folder to upload your cert. Once uploaded, click Save.

At that point, you should see your certificate listed in the Certificates blade with a status of “Unverified”. This is because we have not yet verified that we have the private key for this cert.

dps-cert-group-root-unverified

The process for verifying that we have the private key for this cert involves having DPS generate a “verification code” (a cryptographic alphanumeric string). We then take that string and, using our root CA certificate, create a certificate with the verification code as the CN/Subject name, signing that certificate with the private key of the root CA cert. This proves to DPS that we possess the private key. To do this, click on your newly uploaded cert. On the Certificate Details page, click on the “Generate Verification Code” button and it will generate a verification code as shown below.

dps-cert-group-root-verification-code

Copy that code. Back on the box that you are using to generate the certs, run this command to create the verification cert:

create_test_cert.js verification [--ca root cert pem file name] [--key root cert key pem file name] [--nonce nonce]

where --ca is the path to the root CA cert you uploaded, --key is the path to its private key, and --nonce is the verification code you copied from the portal. For example, in my case:

node ../create_test_cert.js verification --ca dpsTestRoot_cert.pem --key dpsTestRoot_key.pem --nonce 07F9332E108C7D24283FB6B8A05284E6B873D43686940ACE

This will generate a cert called “verification_cert.pem”. Back in the Azure portal on the Certificate Details page, click on the folder next to the “Verification Certificate *.pem or *.cer file” box, upload this verification cert, and click the “Verify” button.

You will see that the status of your cert back on the Certificates blade now reads “Verified” with a green check (you may have to refresh the page with the Refresh button to see the status change).

Now click on “Manage enrollments” on the left nav and click on “Add enrollment group”. Give it a name that is meaningful to you, make sure that “Certificate Type” is “CA Certificate”, and choose the certificate you just verified from the drop-down box, like below.

dps-cert-group-root-setup

Click Save

Now you are ready to test your device cert. You can skip the next section and jump to the “The DPS registration REST API calls” section.

Intermediate CA certificate attestation

If you decided to go the intermediate CA certificate route, which I think will be the most common, luckily the process is a little easier than with a root CA certificate. In your DPS instance in the portal, under “Manage enrollments”, click on “Add Enrollment Group”. Make sure that Attestation Type is set to “Certificate” and give the group a meaningful name. Under “Certificate Type”, choose “Intermediate Certificate”, click on the folder next to “Primary Certificate .pem or .cer file”, and upload the intermediate certificate we generated earlier. For me, it looks like this:

dps-cert-group-intermediate-setup

Click Save and you are ready to move to the next section and register a device using a cert signed by your intermediate certificate.

The DPS registration REST API calls

Ok, so the moment you’ve all been waiting (very patiently) for...

As mentioned previously, now that the certs have all been created, uploaded, and set up properly in DPS, the process and the API calls from here on out are the same regardless of how you set up your enrollment in DPS.

For my testing, I didn’t want to get bogged down in how to make HTTP calls from various languages/platforms, so I chose the most universal and simple tool I could find, curl, which is available on both Windows and Linux.

The curl command for invoking DPS device registration for x.509 enrollments, with all the important and variable parameters in [brackets]:

curl -L -i -X PUT --cert ./[device cert].pem --key ./[device-cert-private-key].pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"registrationId": "[registration_id]"}' https://global.azure-devices-provisioning.net/[dps_scope_id]/registrations/[registration_id]/register?api-version=2019-03-31

That looks a little complicated, so let’s break each part down:

  • -L : tells curl to follow HTTP redirects
  • -i : tells curl to include protocol headers in the output. Not strictly necessary, but I like to see them
  • -X PUT : tells curl this is an HTTP PUT command. Required for this API call since we are sending a message in the body
  • --cert : the device certificate that we, as a TLS client, want to use for client authentication. This parameter, and the next one (--key), are the main things that make this an x.509-based attestation. This has to be the same cert you registered in DPS
  • --key : the private key associated with the device certificate provided above. This is necessary for the TLS handshake and to prove we own the cert
  • -H 'Content-Type: application/json' : required to tell DPS we are posting JSON content; must be 'application/json'
  • -H 'Content-Encoding: utf-8' : required to tell DPS the encoding we are using for our message body. Set it to the proper value for your OS/client (I’ve never used anything other than utf-8 here)
  • -d '{"registrationId": "[registration_id]"}' : the -d parameter is the ‘data’ or body of the message we are posting. It must be JSON, in the form of {"registrationId": "[registration_id]"}. Note that for curl, I wrapped it in single quotes, which nicely means I don’t have to escape the double quotes in the JSON
  • Finally, the last parameter is the URL we post to. For ‘regular’ (i.e. not on-premises) DPS, the global DPS endpoint is global.azure-devices-provisioning.net, so that’s where we post: https://global.azure-devices-provisioning.net/[dps_scope_id]/registrations/[registration_id]/register?api-version=2019-03-31. Note that we replace [dps_scope_id] with the one you captured earlier and [registration_id] with the one you registered

You should get a return that looks something like this:

dps-cert-individual-return-val

Note two things. One is the operationId. DPS enrollment in an IoT Hub is a (potentially) long-running operation, and thus is done asynchronously. So to see the status of your IoT Hub provisioning, we’ll need to poll for status; I’ll get to that in a minute. The second thing is the “status” field, which begins as ‘assigning’.

The next API call we need to make is to get the status. You’ll basically do this in a loop until you get either a success or failure status. The valid status values for DPS are:

    • assigned – the return value from the status call will indicate which IoT Hub the device was assigned to
    • assigning
    • disabled – the device enrollment record is disabled in DPS, so we can’t assign it
    • failed – assignment failed. There will be an errorCode and errorMessage returned in a registrationState record in the returned JSON to indicate what failed
    • unassigned – ummm... no clue.

To make the aforementioned status call, you need to copy the operationId from the return above. The curl command for that call is:

curl -L -i -X GET --cert ./dpsTestDevice01_cert.pem --key ./dpsTestDevice01_key.pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' https://global.azure-devices-provisioning.net/[dps_scope_id]/registrations/[registration_id]/operations/[operation_id]?api-version=2019-03-31

where [dps_scope_id] and [registration_id] are the same as above, and [operation_id] is the one you copied above. The return will look something like this, keeping in mind the registrationState record will change fields based on what the returned status was.

dps-cert-individual-status-return

Unfortunately, I’m not a fast enough copy/paste-r to catch it in a status other than ‘assigned’ (DPS is just too fast for me). But you can do this all programmatically or in a script.
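
If you want to do the same thing from code rather than curl, a rough Python sketch using the requests library is below. It uses the cert/key file names from my run and a placeholder scope id; treat it as illustrative rather than the official SDK approach:

import requests

DPS_SCOPE_ID = "[dps_scope_id]"
REGISTRATION_ID = "dps-test-device-01"
CERT = ("./dpsTestDevice01_cert.pem", "./dpsTestDevice01_key.pem")   # device cert + private key

base = "https://global.azure-devices-provisioning.net/%s/registrations/%s" % (DPS_SCOPE_ID, REGISTRATION_ID)
headers = {"Content-Type": "application/json", "Content-Encoding": "utf-8"}

# kick off the registration (the client cert/key handles the x.509 attestation)
resp = requests.put(base + "/register", params={"api-version": "2019-03-31"},
                    headers=headers, cert=CERT,
                    json={"registrationId": REGISTRATION_ID})
result = resp.json()
operation_id = result["operationId"]

# poll the operation status until it leaves 'assigning'
while result.get("status") == "assigning":
    resp = requests.get(base + "/operations/" + operation_id,
                        params={"api-version": "2019-03-31"}, headers=headers, cert=CERT)
    result = resp.json()

print(result)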

Ta-Da!

That’s it. You can navigate back to DPS, drill in on your device, and see the results.

dps-cert-success

Enjoy, and as always, hit me up with any questions in the comments section.

Monitor IoT Edge connectivity with Event Grid

Hi! Long time, no see. It’s been a few months since I posted here. I won’t bore you with the many “I’ve been busy” excuses (but I have!). Anyway, enough assuaging my guilt over not having shared with you guys in a while. Let’s get to why you came here (since google.com probably sent you).

One of the frequent questions we get related to any IoT device, but especially IoT Edge, is “how do I know if my IoT device/edge is up and running and talking to IoT Hub?” I was recently researching this topic for a customer and ran across some cool stuff. I even did a quick little sample/demo, involving IoT Edge and Microsoft Teams, that I’ll share at the bottom.

IoT Edge/Hub and Event Grid

As you may know, IoT Hub integrates with Azure Event Grid. Azure Event Grid is a fully managed event routing service that uses a publish-subscribe model. Event Grid integrates with a bunch of Azure services (a pic is on that link above), including IoT Hub. As shown in the list below, borrowed from that same link, IoT Hub publishes several event types to Event Grid.

  • Microsoft.Devices.DeviceCreated – Published when a device is registered to an IoT hub.
  • Microsoft.Devices.DeviceDeleted – Published when a device is deleted from an IoT hub.
  • Microsoft.Devices.DeviceConnected – Published when a device is connected to an IoT hub.
  • Microsoft.Devices.DeviceDisconnected – Published when a device is disconnected from an IoT hub.
  • Microsoft.Devices.DeviceTelemetry – Published when a device telemetry message is sent to an IoT hub

Note that DeviceConnected and DeviceDisconnected events are raised when devices connect or disconnect. That can be an explicit connection and disconnection, or a disconnect after a certain number of missed ‘heartbeats’, meaning the device or network may have committed seppuku (not to be confused with Sudoku). You can create Event Grid subscriptions to these events and then respond however you want (Logic Apps, Azure Functions, text, email, etc.).

So, what about IoT Edge?

Ok, Steve, but Google sent me here because I searched for information about monitoring IoT *Edge* connectivity. If it pleases the court, are you going to talk about that any time soon?

Yes! The reason this is relevant to IoT Edge is that the connection/disconnection events not only work for IoT end devices, they also work for IoT Edge modules! So, leveraging that, you can tell if any of your IoT Edge modules disconnect and don’t reconnect (did they crash?), or if the IoT Edge runtime itself loses connectivity (crash? network issue?), by looking for those events for the $edgeHub and $edgeAgent modules.

Ok skippy, prove it!

Fine! I will... The IoT Hub/Event Grid teams have a nice little example of using the integration between IoT Hub and Event Grid to persistently track the status of your device connectivity in a Cosmos DB database, via a Logic App. It keeps a record for all of your devices and their last connect/disconnect time. It’s a pretty nice little sample and could be the beginning of a connectivity monitoring solution: you can build a UI, have it pull the status from Cosmos DB, etc.

Check it out if you have the time or inclination.

However, I was too lazy busy to create a Cosmos DB, stored procedure, UI, etc., so I took the quicker way out. I created a channel in Microsoft Teams and, each time a device/module connects or disconnects, I just post a message to the channel.

I basically followed that article and all the pieces of it, except that everywhere it referenced the stored procedure and Cosmos DB, I instead leveraged the built-in Logic Apps connector for Microsoft Teams, which is super easy to use.

The result is shown below. I have a Raspberry Pi running IoT Edge sitting on my desk. It has a Sense HAT accessory on top of it and was running a module that I wrote that talks to it (uncreatively named “sensehat”; they don’t pay me enough for naming creativity). At about 10:55 local time, I reached over and yanked the power plug out of the Raspberry Pi. 2-3 minutes later (after some missed heartbeats), voila, I got a post to my Teams channel telling me that both my module (sensehat) and the Edge runtime ($edgeHub and $edgeAgent) had disconnected. (rpihat2 is the device name for my IoT Edge device.)

IoT_Hub_Event_Grid_Teams

(full disclosure: I made ZERO effort to make the message look pretty)

I could just as easily, via Logic Apps again, have leveraged SendGrid, Twilio, etc. to send me email alerts, texts, or whatever for these events. If you use Microsoft Teams, you can also set alerts on the channel when new events are posted. If you use the mobile app, you even get push notifications.

I didn’t write step-by-step instructions for doing this, as I hope it’s not too hard to follow the Azure IoT team’s cool instructions above and just substitute Teams/SendGrid/Twilio, etc. for the output... but feel free to let me know if I need to do a step-by-step.

Leave a comment and let me know if this, or any of my other content, is useful (or if you need/want step-by-step instructions)

Enjoy,

–Steve