In a rapidly evolving world, enterprises need insights into productivity so they can improve operations and reduce, or even eliminate, incidents on their production floors. This was traditionally an isolated effort, limited by technology and by the ability to collect data in real time. Advances in sensor capabilities, connectivity, edge computing, data collection, and machine learning have changed this landscape dramatically over the last decade.
The journey to these improvements is, however, perilous: many companies fall into the pitfalls of sensory data overload, connectivity bottlenecks, and incorrect data interpretation. To avoid these traps, a sound understanding of the end goal, the technology available, and how to connect it to your existing systems is a must.
Architecting today's connected enterprise solutions, a practice sometimes called Operational Instrumentation, spans a range of disciplines, from hardware design to networking/connectivity to database management and machine learning. To understand the complexity of designing such a system, one needs to follow the journey from a sensor to actionable insights, so let's examine it.
Traditionally, connected sensors deployed in the enterprise focused mainly on security and safety. They ranged from environmental control sensors (temperature, humidity, smoke, etc.) to video cameras. On production lines, you had the ability to measure dwell time, but it required the assets to pass through specific stations. Today, almost any sensor can be connected to the "Internet." From occupancy sensors installed on chairs, to infrared or radar people counters, to light measurement sensors, to proximity sensors in phones, to vibration sensors on motors, to water leak detection sensors, to noise and voice sensors, to object recognition cameras, just to name a few, almost any conceivable parameter, variable, or object can be measured and tracked in real time in the enterprise.
While in the past the challenge of deploying such technologies was mainly about connectivity and how far behind in time the analysis had to lag, today's challenges include selecting the proper sensor technology; deciding how much processing should be done locally (in the sensor itself, at the edge, or "in the cloud"); determining how often a sensor should report data back; sizing the pipe between the sensor and backend systems; correlating the data provided by multiple sensors to obtain the proper insight; and interconnecting the sensor databases with the enterprise ERP, personnel management, and security systems.
Before selecting a sensor, it is fundamental to understand the use case and clearly state the end goal of the application. For instance, some temperature sensors cannot detect a temperature rise fast enough and should not be used for fire detection, while an object recognition camera or a radar-based sensor may be unnecessary for a people-counting application when a simple mat can provide the same results. Sensor cost can vary from a few dollars to thousands of dollars, and this may determine whether a solution is worth deploying at all, since the return on investment may never be achieved. Performing a thorough analysis is therefore key to a successful Connected Enterprise deployment.
Network connectivity plays a significant role in the design of the system and influences the cost of the solution not only from a CAPEX but also from an OPEX perspective. Here are the various ways to connect a sensor to IoT applications:
Securing access and provisioning devices is "easy" when sensors are wired: the devices cannot join a network unless they are plugged into it. When it comes to wireless devices, the challenge is twofold: provisioning the devices into the network and securing them. Using an MDM (Mobile Device Management) solution is mandatory in most cases.
You may be able to avoid one for BLE, Zigbee, and Z-Wave networks, but you will not be able to provision devices without one on any of the other wireless protocols. The last challenge is securing the network itself. Contrary to wired solutions, in which the data remains private and local (as long as you have not exposed it to the world via the Internet), wireless networks can be eavesdropped on with inexpensive tools. Man-in-the-middle attacks are "easy" to carry out, especially on Wi-Fi, Zigbee, Z-Wave, and BLE networks in which the security keys are exposed in the clear or never rotated. Make sure you understand these trade-offs when building your solution.
Here are some of the considerations to take into account when architecting your data collection and edge computing infrastructure:
One of the examples I usually use to visualize the problem is vibration sensing. Suppose I have a motor that runs at 2,000 rpm (about 33 Hz). To capture its vibration signature, including harmonics well above the rotation frequency, I sample accelerometer data on 3 axes at 4,000 Hz per axis, i.e., 12,000 samples per second. Depending on the precision I want, each sample may need 8, 12, 16, or up to 48 bits. Let's assume 24 bits. So, I need to send 12,000 samples of 24 bits every second: 288,000 bit/s = 288 kbit/s.
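The raw bit-rate arithmetic is simple enough to sketch (3 axes at 4,000 Hz per axis with 24-bit samples yields 288,000 bit/s, i.e., 288 kbit/s); the function name below is illustrative, not from any particular library:

```python
# Illustrative sketch of the raw-bandwidth math for a multi-axis sensor.
def sensor_bitrate(axes: int, sample_rate_hz: int, bits_per_sample: int) -> int:
    """Raw bit rate in bits per second, before any compression or framing."""
    return axes * sample_rate_hz * bits_per_sample

rate = sensor_bitrate(axes=3, sample_rate_hz=4_000, bits_per_sample=24)
print(f"{rate} bit/s = {rate / 1000:.0f} kbit/s")  # 288000 bit/s = 288 kbit/s
```

Note that real deployments add protocol overhead (headers, retransmissions), so the pipe must be sized above this raw figure.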
This means that I can only use Wi-Fi or a wired connection, or wait for 5G, to transmit the raw data. What if the motor runs at 10,000 rpm? In that case, what matters is that the motor oscillates at a specific frequency: a fault translates into a deviation in the acceleration on one or more axes, which can be synthesized in the sensor itself (using a dedicated ASIC/MCU) so that only an alarm is sent to the "cloud" when a problem occurs.
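The in-sensor synthesis described above can be sketched as a frequency check: compute the dominant vibration frequency on one axis and raise an alarm only when it drifts from the motor's expected frequency. This is a minimal sketch, not a production DSP pipeline; the sample rate, expected frequency, and tolerance are assumptions for illustration.

```python
import numpy as np

SAMPLE_RATE = 4_000   # Hz, per axis (assumed, matching the example above)
EXPECTED_HZ = 33.3    # 2,000 rpm / 60
TOLERANCE_HZ = 5.0    # illustrative drift allowance

def dominant_frequency(samples: np.ndarray) -> float:
    """Return the strongest frequency component in a window of samples."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    return float(freqs[spectrum[1:].argmax() + 1])  # skip the DC bin

def needs_alarm(samples: np.ndarray) -> bool:
    """Only this boolean (not the raw samples) would leave the sensor."""
    return abs(dominant_frequency(samples) - EXPECTED_HZ) > TOLERANCE_HZ

# One second of a healthy vibration signal at the expected frequency
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
healthy = np.sin(2 * np.pi * 33.3 * t)
print(needs_alarm(healthy))  # False
```

The payoff is the bandwidth reduction: instead of streaming 288 kbit/s of raw samples, the sensor transmits a single bit (or a small summary) per analysis window.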
If you need to deploy a solution at large scale and sensor price becomes an issue, you could instead collect the data at the edge, compute the expected results on COTS hardware, and only send alarms or synthesized data to the cloud.
In a more common scenario, in which you want a more distributed infrastructure and perform processing at the edge (for instance at a site or at a production-line level), you can rely on your own microservices built on top of Kubernetes or Docker containers, or use off-the-shelf solutions such as AWS Greengrass or Azure IoT Edge. All of these applications can run on COTS hardware, which can reduce your deployment costs. You may also consider edge computing when you need to provide "instant" feedback or take immediate action (for instance, robotic systems, alarms, threat identification systems, etc.).
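The edge-processing pattern described above boils down to reducing raw readings locally and forwarding only a compact summary (plus any alarms) upstream. Here is a minimal sketch; the window contents, threshold, and payload shape are assumptions, not part of any specific product's API:

```python
# Hypothetical edge aggregator: raw readings stay local; only a per-window
# summary and an alarm flag are forwarded to the backend.
from statistics import mean

def summarize_window(readings: list[float], alarm_above: float) -> dict:
    """Reduce a window of raw readings to the payload actually sent upstream."""
    peak = max(readings)
    return {
        "mean": round(mean(readings), 2),
        "max": peak,
        "alarm": peak > alarm_above,
    }

window = [20.1, 20.3, 20.2, 27.8]  # e.g., one minute of temperature readings
print(summarize_window(window, alarm_above=25.0))
```

In a real deployment this summarization would run inside your edge microservice (or an AWS Greengrass / Azure IoT Edge module), with the upstream publish replacing the `print`.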
If you still need to process significant amounts of data at a central location, whether the data collection happens in your data center using open source data lakes such as Apache Hadoop or in the cloud with solutions such as AWS Kinesis, AWS Lake Formation, or Azure Data Lake, it is key to perform proper data engineering and ETL (Extract, Transform, Load). Otherwise, you will end up with a data swamp. You can refer to our blog "Operational Intelligence in the Next Normal" for how to develop these strategies.
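At its core, the ETL step means validating and normalizing records before they land in the lake. The sketch below assumes a made-up record shape (sensor ID plus a Fahrenheit reading) purely for illustration; the "load" step stands in for whatever sink backs your data lake:

```python
# Minimal ETL sketch: extract raw rows, transform (drop unusable readings,
# convert Fahrenheit to Celsius), then load into the downstream store.
raw_rows = [
    {"sensor": "t-01", "value_f": 68.0},
    {"sensor": "t-02", "value_f": None},   # a bad reading to filter out
    {"sensor": "t-03", "value_f": 75.2},
]

def transform(rows):
    for row in rows:
        if row["value_f"] is None:         # reject malformed records early
            continue
        yield {
            "sensor": row["sensor"],
            "value_c": round((row["value_f"] - 32) * 5 / 9, 1),
        }

loaded = list(transform(raw_rows))         # "load" placeholder for a real sink
print(loaded)
```

Skipping this cleansing step is exactly how a data lake degrades into the swamp: inconsistent units and invalid records accumulate until no query can be trusted.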
Last but not least, once you have collected and ETL'd the data, you need to analyze it so you can derive proper and valid insights.
Depending on the type of data you are analyzing, you will need to decide on the proper visualization and search tools. These can range from a homebrewed ELK stack (Elasticsearch, Logstash, Kibana) or Apache Spark to off-the-shelf solutions such as Tableau, Teradata, or Splunk. Again, remember what you need to visualize and which conditions you are monitoring. For instance, you do not need to develop a complex ELK stack or use Splunk when you only need to display a time series for a temperature sensor. If correlation between sources (e.g., data from multiple sensors) is necessary, for instance when doing predictive maintenance, you may also want to rely on machine learning libraries and build the proper algorithms to establish these dependencies.
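A first step toward establishing such dependencies is checking whether two sensor streams actually move together before investing in a model. This is a hedged sketch with made-up data, using a simple Pearson correlation as a stand-in for a fuller predictive-maintenance pipeline:

```python
import numpy as np

# Fabricated example streams: vibration amplitude and bearing temperature
# sampled over the same five intervals.
vibration = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
temperature = np.array([40.0, 41.0, 43.0, 47.0, 55.0])

# Pearson correlation between the two streams; values near +/-1 suggest a
# relationship worth modeling, values near 0 suggest independence.
r = np.corrcoef(vibration, temperature)[0, 1]
print(f"Pearson r = {r:.3f}")  # Pearson r = 1.000
```

In practice you would compute this over aligned time windows and then feed strongly correlated features into your predictive-maintenance model, rather than modeling every sensor pair blindly.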
It is important to separate the hype around new technologies such as 5G from your actual needs. As this journey highlights, there are many aspects to consider when building a Connected Enterprise solution. Before you start architecting your system, you need to ask yourself the following questions:
Answering these questions will help you understand which sensors, communication technologies, edge computing, databases/data lakes, search engines, and visualization tools you will need. It may also highlight some of your current networking and infrastructure bottlenecks and help you resolve them.
IoT at pureIntegration
pureIntegration enables companies to flourish in a digitally connected world. Our IoT expertise includes connected enterprises and homes, asset management, and tracking. Our practice covers device integration, data collection, sensory telemetry and analytics, and the development, deployment, and management of applications, portals, and systems.