Moving data processing from the cloud to the edge

Abstract

Collecting large amounts of data from a device, processing it immediately and moving it to a centralized database introduces latency at every step. Yet much of this data is critical: it belongs to latency-sensitive applications that generate triggers, alarms or real-time responses. Such data should therefore be analyzed on the spot, where it is detected.

A decentralized computing model such as edge computing responds to this need: data is partially processed locally and only then sent to the centralized Data Center or Cloud infrastructure, where it can undergo more complex analyses – Big Data Analytics – and be stored as historical data over time.

Companies and public administration bodies are progressively adopting a new architectural model that leverages the potential of the latest generation of wireless broadband networks to improve analytical capabilities. The trend is driven by the fact that more and more digital applications require immediate analysis of data – with response times on the order of a few seconds or fractions of a second – even in environments where a pervasive, stable connection cannot be taken for granted.

According to a Gartner study, by 2024 most Cloud service platforms will provide at least some services that execute at the point of need.

This changes the data management typical of past services: it is no longer sufficient to collect records from sensors and machines and transfer them to a centralized Data Center where processing only then begins. Edge computing becomes the engine of the most innovative applications: from virtual and augmented reality to artificial intelligence and IoT, from telemedicine to structural monitoring and smart city services.

Different use cases require applications to be distributed across different sites. Edge computing, being extremely adaptable, can be deployed inside business premises, for example in industrial buildings, but also in homes and vehicles, including trains and aircraft.

An edge computing architecture usually combines several enabling technologies: wireless networks, peer-to-peer communication infrastructure, virtualization solutions and, of course, the Cloud.

Does the edge replace the Cloud?

The edge and the Cloud are not competing. Edge computing integrates closely with the Cloud to provide a highly flexible hybrid solution that meets the data collection and analysis needs of a wide variety of use cases.

For real-time gathering and analysis, the edge is the ideal solution for certain workloads, while the Cloud provides a central computing unit for large-scale analysis. Take self-driving vehicles as an example: edge computing handles the real-time management of driving safety (behavior at crossroads, the ability to avoid collisions with other vehicles), while route planning is managed in the Cloud.

The combination of edge and Cloud is very powerful: the two systems provide both historical and real-time information on performance and enable processes such as machine learning and resource performance management. In addition, data can be filtered or pre-processed at the edge so that only the necessary data is transmitted between business systems and the Cloud, as the sketch below illustrates. By moving processing resources (from gateways to multi-purpose devices to computers) and positioning them at the edge, bandwidth limits and intrinsic latency are reduced on the one hand, and system security and reliability are improved on the other.
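A minimal sketch of this filtering pattern, written in Python: a batch of sensor readings is summarized locally on the edge node, and only an aggregate plus any anomalous samples travel to the Cloud. The threshold, the reading format and the upload_to_cloud stub are hypothetical, used only to illustrate the idea.

    from statistics import mean

    ANOMALY_THRESHOLD = 80.0  # hypothetical limit, e.g. degrees Celsius

    def summarize_batch(readings):
        """Runs locally at the edge: raw data stays on site;
        only an aggregate plus anomalous samples move on."""
        anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
        return {
            "count": len(readings),
            "mean": mean(readings),
            "max": max(readings),
            "anomalies": anomalies,  # also used to raise local alarms
        }

    def upload_to_cloud(summary):
        # Stub for the actual transfer to the central Data Center
        print("sending to cloud:", summary)

    # 1,000 raw samples shrink to a single small message
    readings = [20.5 + (i % 7) for i in range(1000)]
    upload_to_cloud(summarize_batch(readings))

Besides saving bandwidth, this keeps raw data on the edge node, which supports the security and privacy benefits mentioned above: sensitive detail leaves the site only when an anomaly requires it.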

Edge computing is not new, but it is the Cloud component that makes the difference, because the Cloud makes it possible to manage and exploit edge systems as if they were centralized. This ensures real-time data processing capabilities, which are essential for innovative applications.

Companies and public administrations that intend to implement advanced, complex services must do so with solutions that combine Cloud infrastructure components and peripheral processing systems.

eHealth, Smart City and Industry 4.0: The contribution of research

The technical advantages inherent in the edge mentioned above – reduced latency, secure decentralized processing and storage, scalability with lower complexity, versatility in adapting nodes to the underlying application, easier maintenance and greater reliability – apply to very different scenarios: from eHealth to smart cities and Industry 4.0 (in the fields of supply chain, robotics and Digital Twin).

These are areas where research is actively contributing to the progress of innovation. One example is the Braine project (Big Data Processing and Artificial Intelligence at the Network Edge), a research project involving 27 partners (research institutes, universities, SMEs and companies) from 14 European countries.

The objective of the Braine project is to develop an “Edge Framework”: the design and implementation of an Edge Micro Data Center (EMDC) that is heterogeneous in terms of computing capacity and energy efficiency, together with a software system that integrates artificial intelligence technologies and can process Big Data at the edge of the network while guaranteeing its security, privacy and sovereignty. Braine builds on technologies that optimize the use of available resources, providing new methodologies for allocating workloads at the edges of the network. These methodologies take several parameters into account, including the optimization of data management and processing, in order to guarantee low latency where necessary and real-time management in the case of mission-critical applications.
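To make this kind of latency-aware allocation concrete, here is a minimal Python sketch of a placement policy. The node attributes, the scoring weights and the Workload requirements are hypothetical illustrations, not the methodology actually used by Braine.

    from dataclasses import dataclass

    @dataclass
    class Node:
        name: str
        latency_ms: float   # round-trip time to the data source
        free_cpus: int
        energy_cost: float  # relative cost of running here

    @dataclass
    class Workload:
        cpus: int
        max_latency_ms: float  # hard bound for mission-critical tasks
        mission_critical: bool

    def place(workload, nodes):
        """Pick the feasible node with the best latency/energy trade-off."""
        feasible = [
            n for n in nodes
            if n.free_cpus >= workload.cpus
            and (not workload.mission_critical
                 or n.latency_ms <= workload.max_latency_ms)
        ]
        if not feasible:
            return None  # a real scheduler would queue, preempt or fall back to the Cloud
        # Hypothetical weights: latency dominates for critical workloads
        w_lat = 10.0 if workload.mission_critical else 1.0
        return min(feasible, key=lambda n: w_lat * n.latency_ms + n.energy_cost)

    nodes = [
        Node("emdc-factory", latency_ms=2.0, free_cpus=8, energy_cost=3.0),
        Node("regional-dc", latency_ms=25.0, free_cpus=64, energy_cost=1.0),
    ]
    task = Workload(cpus=4, max_latency_ms=5.0, mission_critical=True)
    print(place(task, nodes))  # picks the on-premises EMDC despite its higher energy cost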

Within Braine, four use cases are being developed that exploit the Artificial Intelligence functionality offered by the Edge Computing Framework produced by the project.

The application scenarios are:

  • Healthcare Assisted Living
  • Hyper-connected Smart City
  • Robotics for Factory 4.0
  • Supply Chain for Industry 4.0

BRAINE project use cases

Italtel actively participates, together with other partners, in the design and implementation of the Hyper-connected Smart City use case.

The use case shows how the solutions offered by the Braine project can support the delivery of large-scale Smart City services, even for low-latency applications with stringent bandwidth requirements that must handle large quantities of input data in real time. The solution builds services on distributed audio and video analysis techniques, running on a scalable, heterogeneous and multi-tenant infrastructure. The application scenarios cover traffic analysis, active surveillance, intelligent transport and emergency response.

The aim is to demonstrate how cameras distributed throughout the city can enable different types of services of interest to the Smart City. Cameras become one of the most versatile sensors when supported by artificial intelligence techniques, since different types of information can be extracted from the analysis of audio and video streams. In this context, the Framework developed in Braine makes it possible to process large volumes of audio and video streams and to derive a wealth of heterogeneous information that can be used to implement services such as traffic management, logistics planning, urban space planning, assessment of pollution levels, active management of security and emergency response, crowd management and maintenance of city infrastructure. A schematic sketch of this edge-side pattern follows.
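The Python fragment below is a purely illustrative sketch of that pattern: an AI detector runs on video frames locally, and only compact events leave the edge. The detector, the event schema and the congestion threshold are hypothetical placeholders, not the Braine framework's API.

    import time

    def detect_objects(frame):
        """Placeholder for an AI detector running on the edge node,
        e.g. a vehicle/pedestrian model; returns labeled detections."""
        return [{"label": "vehicle"}] * 25  # stubbed result for the demo

    def publish_event(event):
        """Placeholder: forward a compact event to the city platform."""
        print("event:", event)

    def analyze_stream(frames, camera_id):
        for frame in frames:
            vehicles = [d for d in detect_objects(frame)
                        if d["label"] == "vehicle"]
            # Only a small event leaves the edge, never the raw video
            if len(vehicles) > 20:  # hypothetical congestion threshold
                publish_event({
                    "camera": camera_id,
                    "type": "congestion",
                    "vehicles": len(vehicles),
                    "ts": time.time(),
                })

    analyze_stream(frames=[None], camera_id="cam-042")  # one dummy frame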

Braine plays a key role in positioning Europe at the forefront of smart Edge Computing, thanks to funding from the ECSEL JU under Grant Agreement no. 876967. For the financing of this project, the ECSEL JU receives support from the European Union’s Horizon 2020 research and innovation programme and from the national authorities of Italy, Poland, the Netherlands, Israel, Ireland, Hungary, Germany, Switzerland, Finland and the Czech Republic.