The Industrial Metaverse, Digital Twins & More – Beyond the Narrative!

Much has been written about the Metaverse over the last few months. The so-called “New Internet” or “New Application Experience” has become part of our extended lexicon more than we may want, but the truth is, there are many more hurdles to clear before it becomes an indispensable part of our daily lives, like mobile apps and websites. Perhaps that is why Gartner marked it a good ten years away from becoming part of our behaviours.

From a component architecture perspective, there are a lot more moving parts to this stack. Yet there is one specific aspect of this new concept that deserves attention – the Industrial Metaverse. It is spoken of in conjunction with IIoT, Industry 4.0 and 5.0 with much fervour.

Just like me, you may wonder: we have always had digital twins, so what is the new angle here? Let us understand a few things before we peel the architectural onion of the Industrial Metaverse.

Everyone has been saying “let us build a Digital Twin”, treating it as almost synonymous with the metaverse and with plain, simple 3D model visualization of machines and plants – which is half the truth. A good-looking, interactive 3D model, the ability to show some data on top of 3D models, or a simulator with 3D models attached is not a Digital Twin! At least, I don’t subscribe to that definition.

How do I see the Industrial Metaverse forming? From an evolution of virtual models and OT technology solutions.

Product engineering teams have always had high-resolution, detailed 3D models in various formats, with companies like Dassault Systèmes providing good workbenches to import, build and simulate pre-created scenarios in order to test faults on these machines – Virtual Twins. These concepts started moving further into plants and processes with the question: why are we not using these models to monitor actual plants and their operations? But the models remained siloed, singular models inside the vendor platforms, whereas the processes definitely sat outside those platforms, in the plant ecosystem. The vendors had sophisticated simulators, but those were not software representations of the actual plants, assets and processes. Nor did they work hand in hand with other vendors’ machines and their simulators. Most importantly, real data could never stream into these workbenches.

On the other side, manufacturing execution platforms and hyperscaler platforms like Azure and AWS were working out how to help factories and plants with a real-time representation of their shop-floor data, securely and at scale – to model assets and processes, to predict events, and to prevent unwanted scenarios using Industrial Platforms. They were expanding their plant or cloud knowledge, moving from OT to IT, or from IT to OT.

Value chains were getting extended. Both sides were searching for a set of software services to bridge this chasm of real-time data and process handshakes, each partnering up or building full-stack solutions.

Note: All the while, manufacturing plants were looking for value, not expensive technologies!

Let us call the middle region the Digital Twin – a software representation of the assets, processes and behaviours that memoises the system, contextualizes data inside the system, and raises events based on the connected behaviour of system workflows, as at a point in time. I often call this the working memory of the system, with a limited playback capability. This crucial component is what gives the system in between its real-time nature. The minimal sketch below makes the idea concrete.
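To illustrate that definition, here is a small sketch in C# – entirely illustrative, with hypothetical names like MachineTwin and ThresholdBreached that come from no vendor SDK – of a twin node that keeps a bounded working memory, contextualizes each raw reading against the asset’s metadata, and raises an event from connected behaviour rather than from a single data point:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of a Digital Twin node acting as "working memory":
// remember recent state, contextualize raw data, raise events from behaviour.
public class MachineTwin
{
    private readonly Queue<(DateTime At, double TempC)> memory = new Queue<(DateTime At, double TempC)>();
    private const int PlaybackWindow = 100;      // limited playback capability

    public string MachineId { get; }
    public string Line { get; }                  // context: where the asset sits
    public double TempLimitC { get; }            // context: asset-specific limit

    public event Action<string> ThresholdBreached;

    public MachineTwin(string machineId, string line, double tempLimitC)
    {
        MachineId = machineId;
        Line = line;
        TempLimitC = tempLimitC;
    }

    // Ingest a raw reading: remember it, contextualize it, evaluate behaviour.
    public void Ingest(DateTime at, double tempC)
    {
        memory.Enqueue((at, tempC));
        if (memory.Count > PlaybackWindow) memory.Dequeue();

        // Connected behaviour: raise an event on a sustained breach,
        // not on a single noisy spike.
        var recent = memory.TakeLast(5).ToList();
        if (recent.Count == 5 && recent.All(r => r.TempC > TempLimitC))
            ThresholdBreached?.Invoke(
                $"{Line}/{MachineId}: temperature above {TempLimitC} C for 5 readings");
    }

    // A "point in time" view for the layers above, e.g. a Virtual Twin render.
    public IReadOnlyCollection<(DateTime At, double TempC)> Playback() => memory.ToArray();
}
```

A real Digital Twin would of course persist this memory, federate many such nodes, and expose them through APIs, but the essence – remember, contextualize, raise – stays the same.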

This Digital Twin is the secret sauce for having a good Virtual Twin on top and a low-latency (autonomous or condition-based) control feedback loop to the Industrial Platform below.

Paraphrasing a Siemens executive’s statement (to be attributed once I figure out the details):

A consumer metaverse may not need real-time data, but the industrial metaverse needs real-time data to succeed.

Hence, the platform in the middle, with the bindings between the layers, should bring the real-time data view of the plant / factory / industry into the virtual world seamlessly (making the metaverse possible).

You can see from the earlier rough sketch that, broadly, two interfaces are needed to make this happen. One elevates the Virtual Twin experience closer to the Metaverse experience continuum mentioned in my earlier blog. The other gets real-time information from the OT / physical ecosystem within the factory. These two are integral to changing the current status quo of plain 3D models with data into a metaverse application.

1. The Visualization Interface for Metaverse Apps

The Virtual Twin with the Digital Twin binding is what creates the entire universe of Metaverse Apps: a highly responsive rendering of the plant that enables us to seamlessly reach a particular machine in the plant and troubleshoot it remotely. That definitely needs infrastructure support like 5G and 6G, but also some kind of spatial software support for seamless collaboration and space sensing. The component architecture I can think of is Microsoft Mesh; I don’t want to repeat the Microsoft documentation here.

https://techcommunity.microsoft.com/t5/mixed-reality-blog/microsoft-mesh-a-technical-overview/ba-p/2176004

I believe the link above gives a very good idea of how that platform should evolve to deliver the seamless B2B2C experience I mentioned in my previous blog. It talks of immersive presence, spatially anchored maps, rendering acceleration, multi-user shared experiences, streaming data, security, and an out-of-the-box SDK that enables application-layer acceleration with data binding from the API layer below (the digital twin layer).

I would like to add that this experience should be as simple as listening to a call on the MS Teams mobile app and transferring it to the desktop MS Teams application with a single button click, without any latency. Also, in the context of the Industrial Metaverse, this would be much easier to set up, as you can have dedicated infrastructure like 5G, Active Directory-based secure logins inside wearables, controlled devices, a limited number of users logging in simultaneously, and the like. This ensures we can guarantee comparable infrastructure at the network, connectivity, device and security layers, with ample governance, just like today’s internet and the devices used to access information from within a manufacturing facility.

2. The Data Interface for Industrial Data

With numerous firewalls and air-gapped networks between the factory floor and the internet, this is the tougher interface. The good news: it is almost a solved problem, thanks to the many private and hybrid networks, high-security interfaces, localized secure processing on purpose-built hardware, limited data exposure, one-way communication that avoids command-and-control interfaces, and the like. This limits the exposure of OT while giving as much data and processing as needed, at speeds and latencies acceptable for populating the Digital Twins in the cloud middleware. Also, many of the hyperscalers have software runtimes and stacks that start in the cloud and extend to the edge within OT. MES systems are upping their game with stacks that start at the edge and synchronize to the cloud. This opens the opportunity to run reactive, predictive and preventive workloads across edge and cloud, with visualization and actions that can go back and forth between the physical space and the Digital Twin. A sketch of the one-way, edge-to-cloud pattern follows.
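As a small illustration of that pattern, here is a minimal sketch of an Azure IoT Edge module in C#. It assumes the standard Microsoft.Azure.Devices.Client SDK and the conventional input1/output1 route names; a real module would add retry and shutdown handling:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

// Minimal sketch of a one-way (device-to-cloud only) IoT Edge module.
// It never exposes a command-and-control path back into the OT network.
public static class TelemetryForwarder
{
    public static async Task Main()
    {
        // Created from the environment the IoT Edge runtime injects.
        ModuleClient client =
            await ModuleClient.CreateFromEnvironmentAsync(TransportType.Mqtt_Tcp_Only);
        await client.OpenAsync();

        // Forward anything arriving on "input1" upstream via "output1".
        await client.SetInputMessageHandlerAsync("input1", async (message, ctx) =>
        {
            var upstream = new Message(message.GetBytes());
            upstream.Properties.Add("source", "ot-edge"); // context for routing
            await client.SendEventAsync("output1", upstream);
            return MessageResponse.Completed;
        }, null);

        await Task.Delay(-1); // keep the module alive
    }
}
```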

With these two interfaces maturing, you could possibly have the Industrial Metaverse sooner than a Consumer Metaverse. Maybe it is already here.

There are a lot of innovative solutions from PTC, Azure, AWS, Siemens and the like, their numerous partners, and many systems integrators, showcased and deployed to clients. Many such light solutions, or wannabe-Metaverse solutions, have already been PoC-ed or used at industrial facilities; for example, check out this Build session, towards the 20th minute. This detailed Microsoft blog (image below) gives, with examples, the full-stack view of their Industrial Metaverse. Here, the digital twin layer, abstracted in my blog, holds Azure AI and autonomous systems, Synapse Analytics and Azure Digital Twins together to build the middle of the stack. Azure IoT is the data interface for the factory; Microsoft Mesh is the visualization interface for Metaverse Apps built on HoloLens-like devices. Power Platform is part of the Digital Twin layer, but its application interface, PowerApps, still sits in the Metaverse layer. I am sure an expert on another stack will have something similar.

[Image: Microsoft’s Industrial Metaverse stack]

The case studies presented there help us build simple digital representations, virtual worlds, autonomous self-correcting systems and more.

To summarize, I believe the Industrial Metaverse is already here in many forms, in its many lightweight avatars.

Our joint effort should be directed towards creating the right experience continuum within the use cases, to create tangible and usable solutions for the end personas, helping them to collaborate remotely, to improve first-time fix rates, to visualize the impact of predicted changes, to build operators’ muscle memory, and the like.

Disclaimer: The author works for Accenture and uses Microsoft services to architect and build manufacturing platforms and applications on top of them.

Media images generated by DALL·E 2

Previous blog – https://3logr.com/2022/09/27/moving-from-met-averse-to-metaverse-apps/


Working with EHealth Sensor from Libelium

I have been meaning to try out the Cooking Hacks eHealth Sensor kit for some time now. My first attempts didn’t go well, thanks to the changing supporting libraries and my impatience browsing all the forums!

e-Health Sensor Platform V2.0 for Arduino and Raspberry Pi [Biometric / Medical Applications]

https://www.cooking-hacks.com/documentation/tutorials/ehealth-biometric-sensor-platform-arduino-raspberry-pi-medical.html

Though it was interesting, I faced a lot of compilation issues, since the versions I had of the sensor, the Arduino IDE and the libraries shared by the Cooking Hacks makers were incompatible.

Of late, with the second wave of COVID, I started using the sensor again. So I thought, why not give it another try and get the data onto the internet? This time, of course, I was able to get the errors fixed with the actions below:

  1. Arduino version – I had to delete my Arduino 1.8 installation and move back to the old 1.0.6 version.
  2. Refer to the link and pick the right versions of the eHealth and PinChangeInt libraries based on your time of purchase.

With the above two steps properly done, I was able to compile and upload my code correctly for the Arduino Leonardo board.

Now comes the bad part! 😦

  1. The ReadPulseOximeter() method was not working – the interrupt trigger was not firing for some reason. I tried changing the PinChangeInt library to the latest version; somehow that didn’t help.
  2. So I moved the code into Loop() and tried calling the C++ method directly from there. While the device display showed the SpO2 and BPM accurately, the data I got from the library on the Arduino Serial Monitor was highly inaccurate! I tried changing the libraries and the delay in eHealth.cpp, but that didn’t change the accuracy of the output, even while the device kept showing proper readings on its native display. I believe it is somehow related to the segToNumber method and the delay.
[Image: Serial monitor data]

The source code is here (not much different from the existing source code), except for the direct call to the C++ library.

So what next?

-> Investigate the data accuracy issue by updating the Arduino eHealth library file, especially segToNumber and the delay.

-> Interface the shield with a Raspberry Pi, use its libraries, and see whether the accuracy is better than with the Arduino libraries.

Sources:

https://forum.arduino.cc/t/ehealth-with-mega-2560/278026/5

https://www.cooking-hacks.com/forum/viewtopic.php?f=20&t=9984

Azure Stream Analytics to SQL Server – Isn’t that simple? Or maybe not..!

It is quite a well-known pattern to use Azure Stream Analytics (ASA) to create data pipelines that store ingress IoT data to an output location, be it SQL Server, Event Hub, Azure Storage, etc. This matters all the more when you push ASA to the edge and use it as a data transformation and storage orchestration engine, especially with a local, on-premises SQL Server being used like a Historian. It seems a straightforward process, but here comes a small catch and a technical/product limitation of Azure. It took us a few precious hours to understand this aspect.

Short answer (if you have been scrolling, like me):

“If you want to connect ASA to SQL Server, ensure that you have a trusted CA certificate with a proper certificate chain installed on the SQL Server VM.”

For the patient ones who need the backstory 🙂, read along.

What were we doing?

We were trying to wire up an Azure IoT Edge module to a SQL Server on a VM! This seemed quite easy as per the documentation, but I ended up with a curious certificate error.

As a troubleshooting step, I tried creating this with ASA in the cloud, connecting to the same SQL Server on the VM, to rule out any Edge VM certificate issues. This should be quite simple if we follow this blog:

https://docs.microsoft.com/en-us/azure/stream-analytics/sql-database-output

No big deal. Or so you would think.

But I still got the famous certificate chain error.

So I started on the documented steps below:

Using Encryption Without Validation – SQL Server Native Client | Microsoft Docs

  1. Set the Force Protocol Encryption client setting to Yes.
  2. For secure connectivity, ensure that the client and server both require encryption. Also ensure that the server has a verifiable certificate, and that the TrustServerCertificate setting on the client is set to FALSE.

I created self-signed CA certificates and installed them as well. But the issue still kept coming back.

Solution

Finally, we found out from the Microsoft product team that we need proper CA certificates, with a certificate chain from a well-known authority, to make ASA and SQL Server work together.

One requirement for SQL Server on a VM to work as an output is that the SQL Server must be configured with an SSL certificate issued by a trusted CA. There is no workaround for this. You can’t use a self-signed certificate, or use TrustServerCertificate=True and change SQL Server settings.

1 – Regarding the SSL certificate: make sure to use the DNS-based FQDN for the CN. The full requirements are listed here.

2 – SSL setup in the VM: follow the steps here. If using SQL Server 2016, also put the certificate’s thumbprint in the registry key mentioned in the “Wildcard Certificates” section.

Now, for someone like me who is just doing a dev setup and doesn’t have the luxury of a client’s CA certificates, the options are quite limited.

For IoT Edge, I used custom .NET code with a SQL DB client to communicate with the SQL Server VM, using the TrustServerCertificate=True flag in the connection string as dev code, until I get a CA cert. A sketch of that workaround follows.
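For reference, the dev-only workaround looks roughly like this – a sketch using Microsoft.Data.SqlClient, with placeholder server, database, table and credentials; TrustServerCertificate=True must go away once a proper CA certificate is in place:

```csharp
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public static class SqlSink
{
    public static async Task WriteReadingAsync(string deviceId, double value)
    {
        // Dev-only: TrustServerCertificate=True skips certificate chain
        // validation. Replace with a trusted CA cert for anything real.
        const string connectionString =
            "Server=mysqlvm.contoso.local;Database=Historian;" +
            "User Id=iotwriter;Password=<secret>;" +
            "Encrypt=True;TrustServerCertificate=True;";

        using var conn = new SqlConnection(connectionString);
        await conn.OpenAsync();

        using var cmd = new SqlCommand(
            "INSERT INTO Telemetry (DeviceId, Value, At) VALUES (@d, @v, SYSUTCDATETIME())",
            conn);
        cmd.Parameters.AddWithValue("@d", deviceId);
        cmd.Parameters.AddWithValue("@v", value);
        await cmd.ExecuteNonQueryAsync();
    }
}
```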

But for the Azure Stream Analytics PaaS service, we can’t enter a connection string, so there is no way to set TrustServerCertificate=True during development. That sure seems like a restriction.

Another way is to use a service like Let’s Encrypt and generate a chained certificate for temporary use. That is something I am yet to try, but I think it should work.

If you have tried that and it worked, please let me know in the comments.

Back to Sensing, Streaming and Storing..

IoT Data Ingestion strategies (quickly pressed..)

Having worked on the data ingestion part of IoT with IoT Hub for six months, what I have understood is that a few strategies should be applied during ingress to ensure smooth and meaningful data flow.

  • Understanding the frequency of data ingress

If you look at the tiers Azure itself offers for data ingress, the total messages per day and the size of each message packet determine which of the four available tiers you choose. The more messages there are, the more difficult it is to stuff everything into a database and then parse it to make it meaningful. The better approach is to drop data that is off the charts, or to route it to specific repositories at the outset itself – for example, data from event 1 processed and moved to datastore 1, and data from event 2 moved into datastore 2. A quick back-of-the-envelope calculation, like the one below, helps with the tier choice.
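As a worked example (with illustrative numbers – check the current Azure pricing page for exact quotas): 100 devices, each sending one 2 KB reading every 10 seconds, produce 100 × 8,640 = 864,000 messages a day. Since IoT Hub meters messages in 4 KB blocks, each reading counts as one metered message, so at roughly 400,000 messages per day per S1 unit you would already need three S1 units, or a move up to S2.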

  • Annotating metadata to the data

Now that your IoT device has collected billions of data points and you have stored them in some format, would you really search through the entire tables to build relationships and history? I guess you are better off adding metadata on the run, as in the sketch below.
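For instance, with the Azure IoT device SDK you can stamp context onto each message as application properties at send time. A sketch – the property names like machineId are my own, not a convention from the SDK:

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

public static class AnnotatedSender
{
    public static async Task SendAsync(DeviceClient client, string json)
    {
        var message = new Message(Encoding.UTF8.GetBytes(json))
        {
            ContentType = "application/json",
            ContentEncoding = "utf-8",
        };

        // Metadata added "on the run": routing rules and later queries can
        // use these application properties without parsing the payload.
        message.Properties.Add("machineId", "press-07");
        message.Properties.Add("line", "assembly-2");
        message.Properties.Add("eventType", "temperature");

        await client.SendEventAsync(message);
    }
}
```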

  • Stream Analytics is a must

If you want to raise an alarm when there is a temperature spike or drop, or start another process when process 1 is continuously failing or not sending data, you want a stream analytics solution from whoever provides one in your ecosystem, be it Azure, Amazon, Google, IBM and the like. No one wants to build the analytics themselves and then wait 10 minutes for an “instant” alert.

(An article in KDnuggets helped me verify that my understanding is in tune with what industry and academia in general think about data ingress.)

Integrating Chafon RFID tags into your C#/.NET projects

A couple of months back, I started working on a simple RFID application connected to the internet, to enable some interesting gamified use cases.

Well, if you ask what use cases I had in mind, the easiest analogy I have is the Disney MagicBand.

The videos of MagicBand usage encouraged me to make a silly version of the application, where I can take some data from the swipe of a band and put it into a mobile notification to do something interesting, something worthwhile, something thought-provoking.

So I ordered a Chafon RFID reader and a few TK4100 re-programmable RFID tags.

In this first version, I set up the Chafon reader to work with a Windows 10 laptop and .NET code, using a bit of managed code wrapping the Chafon DLL.

After the setup mentioned below (also on the GitHub page), we can download and run this application.

The application scans all the ports, finds the one on which our reader is available, and gets ready to read a tag. Once a tag is scanned, it displays the details in the console.


Steps to follow:

The Chafon reference code is available for free download here.

The TK4100 specs are available here: http://www.smartcardchn.com/uploadfile/single/TK4100.pdf

  1. Download the Chafon RFID application for your corresponding LHF device.
  2. Install the Prolific driver.
  3. Follow the steps given by Chafon to update the old driver in Device Manager (the PDF is inside the zip downloadable from the Chafon site).
  4. Now use their application to read from and write to the tag.
  5. You may want to make some changes to their code like I did – adding a few decorators for the managed DLL we are going to use.
  6. Copy that DLL into your executing code’s folder so that the code can find it.

You may want to download and install the Microsoft Visual C++ 2005 Redistributable Package (x86) – https://www.microsoft.com/en-in/download/details.aspx?id=3387 – to ensure that you are able to develop on a Win 10 x64 machine.

The code uses dynamic port finding to ensure that you always get the port your device is connected to. It could be COM3, COM4 or COM8, depending on what your laptop assigns.
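Under the hood, this boils down to a WMI query over plug-and-play devices. Here is a simplified sketch of the idea in C# – the marker string is hypothetical; the real code searches for the caption of the Prolific adapter:

```csharp
using System.Management;

public static class PortFinder
{
    // Find the COM port whose Plug-and-Play caption contains the marker,
    // e.g. "Prolific" for "Prolific USB-to-Serial Comm Port (COM4)".
    public static string FindReaderPort(string marker)
    {
        var searcher = new ManagementObjectSearcher(
            "SELECT Caption FROM Win32_PnPEntity WHERE Caption LIKE '%(COM%'");

        foreach (ManagementObject device in searcher.Get())
        {
            var caption = device["Caption"] as string;
            if (caption != null && caption.Contains(marker))
            {
                // Extract "COM4" from "... (COM4)".
                int start = caption.LastIndexOf("(COM") + 1;
                return caption.Substring(start).TrimEnd(')');
            }
        }
        return null; // reader not plugged in
    }
}
```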

 

There are two or three sources that contributed to this code. The first one below is where I understood what the possibilities were and how to get it done.

  • Blog 1 from Rob and the related tools list
  • The WMI Code Generator from Microsoft
  • The initial code generated by the tool took me to this link, which gave a detailed industrial code snippet that could be reused in my case with a few modifications.

These changes ensured that, irrespective of how and where I plug in my Chafon reader, it gets picked up correctly, as long as my simple string search for “” matches.

In the next parts, I am planning to upload this data to an Azure IoT Hub and also configure a simple raw push notification channel, so that you get it as a raw message on your mobile phone. A sketch of the upload step follows.
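As a preview of that next part, sending a scanned tag to IoT Hub could look roughly like this – a sketch with the standard Azure IoT device SDK; the connection string and payload shape are placeholders:

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

public static class TagUploader
{
    // Placeholder connection string: use your device's own from the Azure portal.
    private const string ConnectionString =
        "HostName=<your-hub>.azure-devices.net;DeviceId=rfid-reader-01;SharedAccessKey=<key>";

    public static async Task UploadTagAsync(string tagId)
    {
        using var client = DeviceClient.CreateFromConnectionString(
            ConnectionString, TransportType.Mqtt);

        // Minimal payload: just the tag id; the notification channel on the
        // phone can decide what to do with it.
        var payload = $"{{\"tagId\":\"{tagId}\"}}";
        using var message = new Message(Encoding.UTF8.GetBytes(payload));
        await client.SendEventAsync(message);
    }
}
```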

Once these two steps are done, the possibilities are numerous in terms of how you want to play with this, be it a simple gamified application or a complex use case:

  • For your kid inside the house – make her run into the living room and tap it on her chair.
  • Increase your dog’s happiness quotient by fastening it on the collar (ensure the devices and tags are tested as human- and animal-friendly, and do take care of things like choking hazards).
  • An asset tracker like the Tile app: tie a TK4100 to your keychain, build a custom key box with a reader underneath [and a connected Raspberry Pi sending the data out (here the program is on a laptop, but still)], and use it to find out which key is taken out when, and at what time it is returned. Use this data to manage your activities better, or even simply to find out whether your key is in the key box or not.