Sensor Processing
Fueled by growing cloud deployments and sensor networks, the global Industrial IoT (IIoT) market, according to Datamation, reached $761.4 billion in 2020 and is expected to reach $1.39 trillion by 2026, a 13.8% compound annual growth rate (CAGR). The largest growth in recent years has come from the manufacturing, retail, and health care sectors, where IIoT technology has been deployed to automate processes, handle logistics, and track people. The pace of this growth is put in perspective by looking back at 2018, when, according to All the Research, the market was valued at $85.4 billion and was expected to reach $515.8 billion by 2026, an expected CAGR of 25.2%. Although the events of the last several years have been extraordinary, Signal Edge maintains that forecasts have consistently underestimated IIoT market performance, and that this underestimation is itself a notable trend.
Worsening Situation
The biggest issue in sensor processing, and the long-time focus of Signal Edge's principals, has been the network bottleneck. Network resources are the most limited, failure-prone, and expensive operational components of a distributed computing system. For many years, the industry relied on hardware upgrades to solve its capacity and reliability problems. Unfortunately, the exponential growth of IoT devices has pushed network data capacity beyond its limits, and the results have been escalating costs, slower response times, reduced functionality, poor scalability, and higher error rates.
Customers have attempted to solve the bottleneck problem using a variety of techniques, but none has solved the underlying problem, forcing the industry to adopt the mitigations described in the sections below. The most common technique is compression. A compression algorithm removes repeated byte patterns from the data payload. While compression may reduce an individual payload by 10% to 40%, it does nothing to reduce the number of data packets. So, “10,000 sensors still send 10,000 packets, just slightly smaller ones,” as the sketch that follows illustrates.
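A minimal sketch makes the point concrete. The payload format, field names, and sensor count below are illustrative assumptions, not a customer's actual data; only the conclusion matters here: the payload shrinks, the packet count does not.

```python
import json
import zlib

# One illustrative meter reading; the fields are assumptions for this demo,
# not a real customer payload format.
reading = {
    "meter_id": "MTR-000123",
    "timestamp": 1700000000,
    "phase_a_volts": 239.8, "phase_b_volts": 240.1, "phase_c_volts": 239.5,
    "phase_a_amps": 12.4, "phase_b_amps": 11.9, "phase_c_amps": 12.1,
    "total_kwh": 412.7,
}
payload = json.dumps(reading).encode("utf-8")
compressed = zlib.compress(payload)

saving = 100 * (1 - len(compressed) / len(payload))
print(f"raw payload:        {len(payload)} bytes")
print(f"compressed payload: {len(compressed)} bytes ({saving:.0f}% smaller)")

# The saving applies per payload; it does not change how many packets move.
SENSORS = 10_000
print(f"packets per reporting interval, compressed or not: {SENSORS:,}")
```

The exact saving varies with payload structure, but the packet count, and with it the per-packet network and protocol overhead, is untouched.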
Upgrade Capacity
Capacity upgrades have centered on the network, with many industry players putting their faith in 5G and faster routers to solve the problem. These systems, while sufficient for the bandwidth demands of applications such as streaming video, did nothing for large-scale transactional data. The sheer volume of sensor data creates a capacity problem that no simple network upgrade is going to solve. Network hardware upgrades are also very expensive and take significant time to implement.
One solution the industry has gravitated toward is edge computing: deploying a large hardware server to process sensor data at the network edge. Unfortunately, this runs against the business trend of moving all IT operations to cloud services, which allow corporations to eliminate the increasingly excessive costs of remote data centers. Customers will be highly resistant to paying the steep price of going backwards.
Downgrade Capability
Failure to solve the bottleneck problem, together with the cost of edge computing, has driven many customers to significantly limit a system's capability. For example, utility companies place severe data restrictions on their smart meters. These meters, which are capable of producing hundreds of sensor readings per second, are restricted to one or two measurements a day because millions of meters share a standard network. Utilities, like many others, suppress data production to manage IT costs, costs that would become prohibitive, far beyond any business budget, if the meters' full IoT capability were used. The back-of-envelope arithmetic below shows the scale of the gap.
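The figures in this sketch are illustrative assumptions (a one-million-meter deployment capable of 100 readings per second), not any particular utility's numbers, but they show why throttling is the only option under current network capacity:

```python
# Back-of-envelope arithmetic behind smart-meter throttling.
# All figures are illustrative assumptions, not a specific utility's numbers.
METERS = 1_000_000             # meters in a hypothetical regional deployment
FULL_RATE = 100                # readings per second a meter could produce
SECONDS_PER_DAY = 24 * 60 * 60

full_capability = METERS * FULL_RATE * SECONDS_PER_DAY  # unthrottled readings/day
as_throttled = METERS * 2                               # two readings per meter per day

print(f"at full capability: {full_capability:,} readings/day")      # 8,640,000,000,000
print(f"as throttled:       {as_throttled:,} readings/day")         # 2,000,000
print(f"suppression factor: {full_capability // as_throttled:,}x")  # 4,320,000x
```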
By throttling data in this way, much of the application's capability is lost. Smart meters are relegated to billing rather than real-time diagnostics, where meter faults and dangerous data trends, ones indicating risk to life, property, and assets, could be immediately identified and remedied. All of this functionality is lost because there is simply too much sensor data for existing IT capacity. We have seen similar issues in mining, transportation, and security, where the infrequency of measurement actually compromises functionality and projects a false sense of well-being.
While data bottlenecks have existed since the inception of computers and computer networks, the requirements of sensor processing are the straw, or rather the elephant, that broke the camel's back. Where a company may once have had 2,000 human users producing transactions every couple of minutes for 8 hours a day, it now has 10,000 sensors producing transactions every few seconds for 24 hours a day. The human users produce 480,000 transactions a day, while the sensor system produces 432,000,000, as the arithmetic below shows. Worldwide computer and network capacity cannot satisfy such an escalation now or in the near term.
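The totals above follow from reading “every couple of minutes” as one transaction per two minutes and “every few seconds” as one per two seconds; those two interval choices are our interpretation of the text, and the rest is arithmetic:

```python
# Reproducing the daily transaction counts cited above.
# Assumed intervals: one user transaction per 2 minutes and one sensor
# transaction per 2 seconds (the rates implied by the totals in the text).
USERS, USER_HOURS = 2_000, 8
SENSORS, SENSOR_HOURS = 10_000, 24

user_txns = USERS * (USER_HOURS * 60 // 2)          # 480,000 per day
sensor_txns = SENSORS * (SENSOR_HOURS * 3600 // 2)  # 432,000,000 per day

print(f"human workload:  {user_txns:,} transactions/day")
print(f"sensor workload: {sensor_txns:,} transactions/day")
print(f"escalation:      {sensor_txns // user_txns:,}x")  # 900x
```

A 900-fold increase in sustained transaction volume is what existing infrastructure is being asked to absorb.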
*The Internet of Things (IoT) is the network of devices, such as vehicles and home appliances, that contain the electronics, software, sensors, actuators, and connectivity that allow these things to connect, interact, and exchange data. The IoT involves extending Internet connectivity beyond standard devices, such as desktops, laptops, smartphones, and tablets, to any range of traditionally dumb or non-Internet-enabled physical devices and everyday objects. Embedded with technology, these devices can communicate and interact over the Internet, and they can be remotely monitored and controlled.