While working with IoT and website deployment and testing, we often need certificates. We typically end up using OpenSSL to generate them, and plenty of blogs (mostly from Microsoft) walk through how to do that.
Recently, while deploying a website to IoT Edge containers, I was trying to secure it with a self-signed certificate. Everything worked fine locally, but strangely, once deployed, the container started and then shut down immediately.
After a bit of good old troubleshooting and isolating the problem, I realized it worked from my Windows machine but not from my Mac! To be specific, when the certificates were generated on the Mac, the containerized version of my website didn't load. Why?
My Windows machine had OpenSSL, whereas my Mac had LibreSSL by default, and the certificates it generated didn't work.
Face-palm moment.
“Ensure that you have the proper SSL libraries before you dockerize and copy certificates into your container.”
Though the blog shows that path, it is not ideal from a security standpoint. But I wanted to understand why the Kestrel-hosted website wouldn't work inside the container, so I kept isolating the problem.
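For reference, here is a minimal sketch of checking which toolkit you actually have and generating a self-signed certificate with a SAN. File names are placeholders, and the `-addext` flag assumes OpenSSL 1.1.1+ (older LibreSSL builds lack it, which is exactly the kind of gap that bit me):

```shell
# Check which TLS toolkit is on the PATH -- macOS ships LibreSSL by default
openssl version

# Generate a self-signed certificate with a SAN. The -addext flag needs
# OpenSSL 1.1.1+; file names here are placeholders for illustration.
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -keyout server.key -out server.crt \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost"
```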
It is a well-known pattern to use Azure Stream Analytics (ASA) to create data pipelines that store ingress IoT data to an output location, be it SQL Server, Event Hub, Azure Storage, etc. This matters all the more when you push ASA to the edge and use it as a data transformation and storage orchestration engine, especially with SQL Server on-premises being used like a Historian. It seems like a straightforward process, but here comes a small catch, a technical/product limitation of Azure. It took us a few precious hours to understand this aspect.
Short answer (for those who scroll, like me):
“If you want to connect ASA with SQL Server, ensure that you have a trusted CA certificate with a proper certificate chain installed on the SQL Server VM.”
For the patient ones who need the backstory 🙂 read along..
What were we doing?
We were trying to wire up an Azure IoT Edge module with SQL Server on a VM! This seemed quite easy per the documentation, but I ended up with a curious certificate error.
As a troubleshooting step, I tried creating this with ASA in the cloud and connecting to the same SQL Server VM, to rule out any Edge VM certificate issues. This should be quite simple if we follow this blog.
Set Force Protocol Encryption Client Setting to Yes
For secure connectivity, ensure that the client and server both require encryption. Also ensure that the server has a verifiable certificate, and that the TrustServerCertificate setting on the client is set to FALSE.
We created self-signed CA certificates and installed them as well. But the issue kept coming back.
Finally, we found out from the Microsoft product team that we need proper CA certificates, with a certificate chain from a well-known authority, to make ASA and SQL Server work together.
One requirement for SQL Server on a VM to work as an output is that SQL Server must be configured with an SSL certificate issued by a trusted CA. There is no workaround for this. You can't use a self-signed certificate, or use TrustServerCertificate=True and change SQL Server settings.
1- Regarding the SSL certificate – make sure to use the DNS-based FQDN for the CN. Here are the full requirements listed.
2- SSL setup in the VM – follow the steps here. If using SQL Server 2016, also put the certificate's thumbprint in the registry key mentioned in the “Wildcard Certificates” section.
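While waiting for a real CA certificate, one way to sanity-check the chain mechanics is to build a tiny local CA and verify that the server certificate actually chains to it. All file names and the `sqlvm.example.com` FQDN below are made-up placeholders, and remember that ASA itself still demands a well-known CA, so this only validates your setup, not the production requirement:

```shell
# Create a throwaway local root CA (placeholder names throughout)
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -keyout ca.key -out ca.crt -subj "/CN=Dev Root CA"

# Issue a server certificate whose CN is the DNS-based FQDN of the SQL VM
openssl req -newkey rsa:2048 -nodes -keyout sql.key -out sql.csr \
  -subj "/CN=sqlvm.example.com"
printf "subjectAltName=DNS:sqlvm.example.com\n" > san.ext
openssl x509 -req -in sql.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -days 365 -sha256 -out sql.crt -extfile san.ext

# Verify the chain -- the property a trusted-CA certificate must satisfy
openssl verify -CAfile ca.crt sql.crt
```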
Now, for someone like me who is just doing a dev setup and doesn't have the luxury of a CA certificate, the options are quite limited.
For IoT Edge, I used custom .NET code with the SQL DB client to communicate with the SQL Server VM, using the TrustServerCertificate=True flag in the connection string for dev code until I get a CA cert.
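For reference, the dev-only connection string looked roughly like this (server, database, and credentials are placeholders; do not ship this setting to production):

```
Server=sqlvm.example.com;Database=MyIotDb;User Id=myuser;Password=<password>;Encrypt=True;TrustServerCertificate=True;
```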
But for the Azure Stream Analytics PaaS service, we can't enter a connection string, so there is no way to set TrustServerCertificate=True during development. That sure seems like a restriction.
Another way is to use a service like Let's Encrypt to generate a chained certificate for temporary use, something I am yet to try. I think that should work.
If you have tried that and it worked, please let me know in the comments.
Of late, I have been working on Industrial IoT projects with Kepware Server, Azure IoT Edge, and IoT Hub. During one of those, there was an ask to use an Excel file with hand-crafted data as an input data source at the edge. While investigating options, including writing a custom module at the edge to read the Excel file and send it row by row, I realized that Kepware Server can be used along with the Advanced Simulator driver. Being a noob on Kepware, this was quite a surprise.
As I started configuring this, it took some time to get it up and running, with a few steps and caveats along the way. So I thought, why not share the screenshots and walk you through the steps and configuration needed.
Before I go ahead, thanks to my friend Akhila (firstname.lastname@example.org) for troubleshooting and figuring out some of these specific roadblocks that a noob like me was trying to puzzle out from the manuals.
Let us go ahead and deep-dive into KEPServerEX and learn the specifics of using the Advanced Simulator driver.
Let us create a new channel and set the type of the channel to be Advanced Simulator
Provide a useful/meaningful name to the channel and proceed.
Since we won’t be changing much of the default settings for this demo, let us leave the next pages as is and proceed.
Let us add the data source; this is where the CSV is used. Click on “Configure DSN” and hit the Add button on the right of the pop-up window.
We need to select the driver for the data source. We had an xls file with multiple columns; we converted it into a CSV file so we could use the Microsoft Text Driver (*.txt, *.csv).
Give a meaningful name to the data source, and then select the directory where you have stored your converted CSV files, with columns containing your simulated data. Ensure that the headings are meaningful, as they become your tag names. Also make sure there are no spaces within the text and no leading spaces; those will make tag reading erroneous. Use the “Define Format” feature to select the columns and guess the tag rows; that will help you confirm the file is being read correctly. If you have different delimiters and such, you can configure all those customizations here. We had a standard CSV.
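To illustrate the expected format, here is a small sketch that generates a simulated-data CSV with clean, space-free headers. Column names and values are invented for illustration:

```shell
# Write a simulated-data CSV: header row first, no spaces in column names,
# then one row per record. Names and values are made up for this sketch.
{
  printf "Timestamp,Temperature,Pressure\n"
  for i in 1 2 3 4 5; do
    printf "%d,%d,%d\n" "$i" "$((20 + i))" "$((100 + 2 * i))"
  done
} > simdata.csv
```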
Click on Ok and we are done with the Data Source Name creation with the files specified in the folder.
Select the data source that just got created under the “Add Channel Wizard” and click Next.
Click “Finish” to complete adding the Channel.
We need to add a device that can read the files from the data source. Right-click on the channel created previously and click “New Device” to get started.
Setting up a device
Choose a meaningful name for the device and leave the next wizard window at its defaults.
Setting “Initial Updates from Cache” to “Enabled” helped us create those tag values automatically. Leave the next wizard pages as default until you reach the Table selection. Click on the three dots and select the specific CSV for this device from the data source mapped earlier while setting up the channel.
Specify the file read rate in the record selection interval; please note that the time is in milliseconds. Enable looping so that you get a continuous stream of data once a cycle is over. Note that you need to repeat these steps for each device and its CSV, but those devices' timestamps are not necessarily synchronized; each starts counting whenever it is set up and running.

Click Finish and the device is successfully added; the driver should now be able to generate the tag names from the column names. In case the logs indicate an error as below, you may want to change the OPC configuration settings to resolve it.
Ensure the user has administrator access on Kepware so that they can access the Kepware administration application as shown below (it is accessible from the tray icon at the bottom right of Windows). Click on “Settings”, navigate to the “Runtime Process” tab, and change the selected mode to “Interactive”.
Apply the settings and re-initialize (or stop and start) the Kepware server instance.
The Tags should now be auto generated, and the simulator should be able to read the data from the files successfully.
This data can now be pushed either to the IoT Hub or to an Edge device for further processing!