For several reasons, the peripheral part of IoT infrastructure presents major challenges to information and infrastructure security:
- The large number of devices and their vast distribution create a huge attack surface that is difficult to monitor;
- Some sensors and communication protocols were originally designed for closed private networks and address cybersecurity poorly, or ignore it altogether;
- Many devices are in practice small computers running stripped-down Linux distributions, whose manufacturers are difficult to reach and tend to be slow to release security updates;
- Because they do not see them as real computers, consumers (and business operators as well) do not give this category of devices the same security attention they usually reserve for computers.
It is no coincidence that some of the most devastating cyber attacks in recent years have used botnets built with malware that compromised hundreds of thousands of devices such as routers, cameras and printers, generating huge amounts of internet traffic for DDoS attacks.
IoT Data Collection And Storage
While collecting and storing one numerical value per minute from about fifty sensors in an industrial plant presents no particular technical difficulty, acquiring large volumes of real-time data in a short time, or even a modest amount of data from thousands or millions of consumer devices or sensors, presents significant challenges in terms of bandwidth, write speed and data structuring.
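One common way to ease the write-speed problem is to buffer readings and flush them in batches, so the storage backend receives a few large writes instead of a flood of tiny ones. The sketch below illustrates the idea; all names (`SensorBatcher`, `flushed_batches`) are hypothetical, and the in-memory list stands in for a real database write.

```python
import time
from collections import deque

class SensorBatcher:
    """Buffer incoming sensor readings and flush them in fixed-size batches."""

    def __init__(self, batch_size=1000):
        self.batch_size = batch_size
        self.buffer = deque()
        self.flushed_batches = []  # stands in for writes to a real datastore

    def record(self, sensor_id, value):
        # Each reading is timestamped on arrival.
        self.buffer.append((time.time(), sensor_id, value))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        # One bulk write per batch instead of one write per reading.
        if self.buffer:
            self.flushed_batches.append(list(self.buffer))
            self.buffer.clear()

batcher = SensorBatcher(batch_size=100)
for i in range(250):
    batcher.record(sensor_id=i % 50, value=20.0 + i * 0.01)
batcher.flush()  # flush the final partial batch
print(len(batcher.flushed_batches))  # -> 3 (100 + 100 + 50 readings)
```

In a real deployment the flush would also be triggered by a timeout, so that readings are never held too long when traffic is light.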
The main database and ERP vendors, as well as the cloud providers, generally offer specific solutions to acquire and process IoT data, and there are open-source solutions that integrate with the distributed file systems and NoSQL databases used for Big Data management.
In many cases, the data needed “in the field” differ, in quantity and latency, from those needed for centralized analytical processing.
Let’s take an example. On the production floor of a factory it may be necessary to have a sensor reading every second; a deviation of the values from the norm must immediately trigger an alarm or close a valve.
The same data are also used by the ERP system and by quality control in the central office. These systems, however, can make do with one reading per minute, or per hour, and can afford to receive them overnight, in time for an early-morning report distributed to the managers concerned. Perhaps the only data needed are those relating to anomalies.
In this case, transmitting the entire data stream to the central systems wastes bandwidth, computational resources and storage on data that will never be used. It also introduces a delay, so the alarm for an anomalous value risks reaching the machine operator a few seconds late, which can prove problematic.
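The reduced feed that the central systems can make do with could be produced by simple downsampling, for example averaging the per-second readings into per-minute values before transmission. A minimal sketch, with the `downsample` helper and the window size as assumptions:

```python
from statistics import mean

def downsample(readings, window=60):
    """Average consecutive windows of `window` readings (e.g. 60 per-second
    samples become one per-minute value)."""
    return [mean(readings[i:i + window]) for i in range(0, len(readings), window)]

# Two minutes of per-second readings: the first minute steady at 20.0,
# the second at 30.0.
per_second = [20.0] * 60 + [30.0] * 60
print(downsample(per_second))  # -> [20.0, 30.0]
```

The same pattern extends to other aggregates (min/max, counts of out-of-range values) depending on what the central report actually needs.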
In these and other cases, it is convenient for the data to be collected and managed initially by a small server on the production floor, which analyzes them, reacts to anomalies, and then transmits only the relevant data, or a pre-analysis of them, to the central system.
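The edge pattern just described can be sketched as follows: a small on-site process checks every reading against an acceptable range, reacts locally with no network round-trip, and forwards only the anomalies upstream. All names (`EdgeNode`, the `LOW`/`HIGH` limits) are illustrative assumptions, not part of any particular product.

```python
LOW, HIGH = 10.0, 80.0  # assumed acceptable range for the sensor

class EdgeNode:
    """On-site process: react to anomalies locally, forward only anomalies."""

    def __init__(self):
        self.outbox = []   # anomalies queued for the central system
        self.alarms = 0    # immediate local reactions (e.g. close a valve)

    def handle(self, reading):
        if not (LOW <= reading <= HIGH):
            self.alarms += 1             # local reaction, no network delay
            self.outbox.append(reading)  # only the anomaly travels upstream
        # in-range readings are used locally and never sent to the center

node = EdgeNode()
for value in [42.0, 55.5, 95.2, 60.1, 5.3]:
    node.handle(value)

print(node.alarms, len(node.outbox))  # -> 2 2
```

Of the five readings, only the two out-of-range values (95.2 and 5.3) reach the central system, while the alarm fires on the spot.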
This type of architecture is called Edge Computing, because it moves part of the processing and analytics from the data center to the edge of the network, near the sensors. Beyond the case described, edge computing is fundamental in situations where connectivity with central systems is absent or limited. For example, systems that collect and process analytical data on airplanes, ships or other means of transport, and transfer them once they arrive at their destination, can be seen as edge computing systems.
Analytics And Presentation
The IoT platforms of large IT vendors such as Oracle, SAP, IBM and Microsoft all include an analytics and data-presentation component, and all of them, some better than others, are introducing advanced technologies such as machine learning to extract from the data interpretations and information that are directly useful to the business, integrated into production and accounting processes.
Choosing the same vendor that, for example, provides the ERP or production solutions certainly favors data integration, reduces intervention times in the event of problems, and simplifies the attribution of responsibility in the event of a dispute.
However, this is not an obligatory path, especially when the volume of data collected, or the number of CPUs required to process it in time to be useful to the business, pushes license costs to unsustainable levels. In that case, separating the IoT analytics platform, perhaps built on low-cost servers and open-source cluster software, from the business management systems can be a more viable choice, although necessarily a more complicated one.