The first personal computers couldn't store data: every document had to be saved to a floppy disk. Later, the first hard drives arrived and, with them, the capacity to store 10 to 20 MB on the computer itself, depending on the model. You had to be quite selective about which documents or applications to keep. Today, years later, any cellphone has greater processing capacity than any computer of that era. And yet the Internet of Things (IoT) is confronting us with a similar challenge.
Whether it is the 50 billion connected objects by 2020 that Cisco estimates or the 26 billion that Gartner predicts, one thing is certain: our buildings, factories, data centers, cities, and even vineyards are deploying networks of intelligent, connected devices that monitor all kinds of systems, generating terabytes upon terabytes of data. Data that, taken together, allows us to optimize processes in our factories, improve city traffic, increase the productivity of our fields, reduce electricity consumption and, in many cases, such as machine and data center maintenance, anticipate possible failures. Data that can predict the future and improve people's quality of life.
How can we manage all this data? Data centers are booming, and this data is usually stored in large complexes in remote areas. What is perhaps less well known is that the large cloud data centers cannot be built at the speed demand requires. To use an analogy: we are building the highways that Big Data needs, but sometimes it is better to stay in the neighborhood, both to avoid excessive latency (a decisive factor for critical infrastructure) and to optimize resources; the data can be uploaded to the cloud later for analysis and processing. The growth in data volume (from the quality of the first video Jawed Karim uploaded to YouTube on April 23, 2005, we have already moved to 4K), the connection to the internet not just of people but of objects (clothes, cars, refrigerators...), applications in which response times are critical, and the need to contain costs are all pushing IT systems closer to the user and the data source. This phenomenon is known as edge computing.
A couple of examples illustrate the point. In a vineyard, grapes go through a complex growth process if you truly want a wine with character. In the past, growers estimated the rainfall, the hours of sun the vines had received, the temperature, and so on, and by looking at the vines and grapes they could estimate how much water and fertilizer to use. Today, instead, you can install sensors and software that help you make those decisions, and you have to decide how much data to keep for the future and how often to sample it: in some cases you will want everything, hour by hour; in others, a fortnightly summary may be enough.
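The retention tradeoff described above can be sketched as a simple downsampling step at the edge: keep the raw hourly readings only as long as needed, and ship a compact summary onward. This is an illustrative sketch only; the field names, sampling interval, and summary period are hypothetical assumptions, not taken from any real deployment.

```python
# Hypothetical sketch: collapse hourly vineyard sensor readings into one
# fortnightly summary record, so only the summary needs long-term storage.
from statistics import mean

def fortnightly_summary(hourly_readings):
    """Reduce a list of hourly readings (dicts with 'rain_mm',
    'sun_hours', 'temp_c') to a single aggregate record."""
    return {
        "total_rain_mm": sum(r["rain_mm"] for r in hourly_readings),
        "total_sun_hours": sum(r["sun_hours"] for r in hourly_readings),
        "avg_temp_c": mean(r["temp_c"] for r in hourly_readings),
        "samples": len(hourly_readings),
    }

# Two weeks of hourly data (14 days * 24 hours = 336 readings)
# shrinks to one record before upload.
hourly = [{"rain_mm": 0.1, "sun_hours": 0.3, "temp_c": 18.0}] * 336
summary = fortnightly_summary(hourly)
```

The same pattern scales in the other direction: if the crop turns out to need hour-by-hour history, the edge node simply forwards the raw readings instead of the aggregate.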
If we focus on industrial use cases, connectivity between IT and operational technology (OT) is critical to future competitiveness, given the growing pressure on flexibility, productivity, and cost reduction. This convergence requires bringing data processing as close as possible to the point where data is generated and consumed, especially in highly flexible, competitive environments where smart information management is critical.
IT infrastructure is evolving alongside the new demands for profitability, flexibility, and speed brought by the fourth industrial revolution. In vineyards, and in emblematic buildings such as the Sagrada Familia in Barcelona, the bet is on installing data centers directly inside prefabricated containers: fully modular solutions that can cover all data-processing needs within a few hours. In the case of factories, in addition to the large data centers a company may run globally, the trend is to equip each plant with its own small processing center that controls critical information locally while transferring information to the corporate cloud for more advanced analytics.
In other countries, some residential buildings already have small data centers that process information from intelligent devices such as the elevator, the heating system, and video cameras, allowing residents to enjoy a safer, more comfortable, and more sustainable home.
It is not far-fetched to think that, in the near future, the systems and machines in our industries and data centers will be able to learn from their environment and make decisions that improve their own operation, and that artificial intelligence will become commonplace. It is a leap in which edge computing and decentralized IT infrastructure will have a lot to say.
This article is a translation of an opinion piece published in a tier 1 media outlet in Spain, signed by the IT VP.
Soko Directory is a financial and markets digital portal that tracks brands, firms listed on the NSE, SMEs, and trendsetters in the markets ecosystem.