No, we didn’t make a mistake in the title: this time it’s not exactly about clouds, but about fog computing, which happens literally right next to you. It’s the robot vacuum that sends a message to your phone saying it’s stuck in a corner, or the car that helps you navigate by detecting how many cars and pedestrians are around you.

It is predicted that in 2024 the Internet of Things market will cross the $1 trillion mark. And it cannot do without fog technologies, which are as inconspicuous as their name suggests. Let’s find out where this trend came from and what processes hide behind everyday online video viewing.

History

A fog is a cloud that is very close to the ground, hence the analogy for technology that is closer to the end user than a cloud.

In fact, fog computing grew out of the work of several employees of the American corporation Cisco. Flavio Bonomi described the concept in 2011, and his colleague Ginny Nichols suggested calling it Fog Computing. Cisco products included fog solutions in 2014, and in 2015 Helder Antunes, the company’s director of corporate strategic innovation, created the OpenFog Consortium, a group dedicated to implementing fog technologies.

The consortium was also co-founded by Dell, Intel, Microsoft, ARM, and Princeton University’s Edge Computing Laboratory, and over 50 companies (AT&T, General Electric, and Mitsubishi among them) later became its members. The main idea is to jointly develop this new type of computing.

As Cisco defines it, fog technologies are a model of distributed computing in which data is processed in a decentralized manner across end devices, involving the very equipment that generates the data. To return to atmospheric analogies: fog resembles clouds in everything except its proximity to end users, its dense geographical coverage, and its mobility.

When you need to “let the fog out”

Networks, houses, and entire cities are becoming smart. To analyze information about accidents, traffic, and sensor readings quickly, that data needs to travel as short a distance as possible. This is where fog computing helps: because the system sits closer to the data source, the data does not have to be sent all the way to the main cloud, so signal latency is comparatively lower.

For example, take the autopilot in a car. When a decision is being made, say, exactly where to turn and when to stop, the data from the motion sensors and cameras is sent not to the cloud but to fog capacity. If you’re on autopilot in your Tesla and a deer suddenly jumps onto the road, fog technology allows the obstacle to be identified quickly and a collision avoided. Updating the system and saving your trip history, however, already happen in the cloud.

To quickly recognize pedestrians, other cars, traffic lights, etc., Tesla uses fog computing

Another common field of use is medicine. Checking moles or tumors for malignancy in real time, detecting diseases early, developing an individual course of treatment: patient monitoring can be carried out remotely, through gadgets and telemedicine. For all this to happen quickly enough, so that, for example, a doctor can be notified of a patient’s heart attack within seconds and help can be called in time, minimal delay in data transmission is required. Fog computing provides this, and it also helps maintain data privacy, because on-premises devices are easier to monitor and secure.

Fog technologies can also send pre-processed information to the cloud for long-term storage and further analysis, which offloads cloud computing and improves IT efficiency. This is especially useful for streaming platforms, file sharing, AR applications, and other services that require constant data exchange every 15–20 milliseconds. The cache, created so that you can re-access data you have already viewed or download videos faster, is also more convenient to store on fog nodes.
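As a toy illustration of that caching idea, here is a minimal sketch (all names are hypothetical, not a real fog API) of a fog node that serves repeat requests from a local cache, so only the first view of a piece of content travels all the way to the cloud:

```python
# Minimal sketch of fog-node caching: repeat requests are answered at the
# edge, and only cache misses reach the central cloud. Names are
# illustrative; a real node would also handle eviction, TTLs, etc.

def fetch_from_cloud(video_id):
    # Stand-in for a slow, remote request to the central cloud.
    return f"video-bytes-for-{video_id}"

class FogNode:
    def __init__(self):
        self.cache = {}
        self.cloud_requests = 0

    def get(self, video_id):
        if video_id not in self.cache:      # miss: go to the cloud once
            self.cloud_requests += 1
            self.cache[video_id] = fetch_from_cloud(video_id)
        return self.cache[video_id]         # hit: served locally

node = FogNode()
node.get("cat-video")       # first view: fetched from the cloud
node.get("cat-video")       # re-watch: served from the fog node's cache
print(node.cloud_requests)  # 1
```

The point of the sketch is the counter: two views, one cloud round trip, which is exactly the kind of offloading the paragraph above describes.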

System architecture

Fog computing has several layers:

  • The first is things: all the sensors, mobile devices, and cameras that collect the necessary information.
  • The second is fog: the routers, gateways, hubs, and separate servers that partially process information before sending it to the cloud.
  • The third is the cloud, which accepts the pre-formatted data and can store it long-term, analyze it, and so on.
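The three layers above can be sketched as a short pipeline. In this hypothetical example, the fog layer averages batches of raw sensor readings and forwards only the summaries, so the cloud stores a fraction of the original data (the function name and numbers are illustrative):

```python
# Sketch of the things -> fog -> cloud pipeline described above.
# The fog layer aggregates raw readings and forwards only summaries;
# this is an illustration, not a real fog framework.

readings = [21.4, 21.6, 21.5, 22.0, 21.8, 21.7]  # "things": raw sensor data

def fog_layer(raw, batch_size=3):
    # Partially process data close to the source: one average per batch.
    summaries = []
    for i in range(0, len(raw), batch_size):
        batch = raw[i:i + batch_size]
        summaries.append(round(sum(batch) / len(batch), 2))
    return summaries

cloud_storage = fog_layer(readings)  # the cloud receives pre-formatted data
print(cloud_storage)                 # [21.5, 21.83]
```

Six readings leave the sensors, but only two numbers reach long-term storage, which is the division of labor between the fog and cloud layers.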

Everything seems to be clear in theory, but in practice the question arises — where are all these fog nodes located, if not in data centers?

Fog construction plan (image from the PubNub website)

Some capacity is integrated into the telecom infrastructure; this model has already been dubbed TelcoFog, and it can work in conjunction with 5G coverage. Other nodes are built into everyday street objects: bus stops, light poles, traffic lights. In some cases, smart traffic lights use cameras to detect an ambulance or fire service vehicle and switch the lights accordingly so it can pass faster.

The user device can also share its processing power with the system.

Ways to further improve the efficiency of fog computing are constantly being developed. One problem users of such systems face is the static nature of fog nodes. Sometimes the number of requests that need instant processing spikes in one area: say, a sudden traffic jam on a normally empty street, where a crowd of cars all send and receive data through a single server. In such cases, drones with fog nodes attached can be used: a drone quickly flies to the location and provides additional capacity for the system.

But according to Cisco, 99.4% of the devices that could potentially become part of the Internet of Things are still not connected to the network. So fog technologies are only at the start of their development.