1 Introduction

More than 8 billion connected devices will be in use worldwide in 2017, up 31% from 2016 [1]. The forecasts of the world’s largest research institutes agree on over 20 billion IoT devices by 2020, a growth of 140% in just three years. In 2017 more than 4 billion passengers will pass through airports, carrying an average of two connected devices each [2]. Current technology infrastructures and architectures have not been designed to process in real time the large amount of data made available by such a number of devices at such a high concentration. As one of the first technical responses to these challenges, Fog Computing is emerging as an architectural model that sits between the Cloud and the IoT, extending Cloud Computing and Services to IoT objects.

This paper is structured as follows. Section 2 introduces the Fog Computing concept and how a Fog layer can address some of the challenges of connecting the IoT to the cloud. Section 3 describes the proposed airport use case, and Sect. 4 identifies the main expected benefits. Finally, Sect. 5 concludes the paper.

2 The Fog Computing Scenario

The Internet of Things (IoT) is the set of objects equipped with electronic devices, sensors and actuators that are widely diffused and capable of communicating with the Internet and with other devices through communication protocols that do not require human intervention. By way of example, we can mention:

  • Web-cameras (for surveillance, traffic detection, pollution, etc.);

  • Appliances (refrigerators, washing machines, kettles, etc.), door openers or shutter controllers;

  • Lifts and other smart building equipment;

  • Various avionics devices such as flight recorders;

  • Card readers or RFID (for logistic applications);

  • Biomedical appliances and sensors;

  • Meters, thermostats, digital regulators for electricity, gas, water, heat;

  • Wearable devices such as bracelets and watches;

  • Environmental and territorial sensors (light and humidity sensors, air quality and parking sensors, etc.).

The number of such objects grows at a dizzying rate, and the volume, variety and speed of the data they produce grow considerably as well. According to McKinsey [5], some predictions suggest that by 2020 over 20 billion edge devices will be connected, collecting more than 1 trillion GB of data. The information made available therefore constitutes a Big Data generator of great potential value.

To take advantage of the quantity and variety of such data and to generate value and services from them, the most immediate approach was to connect the IoT directly to the Cloud. Cloud service providers (Amazon AWS, Google Compute Engine, Microsoft Azure) today enable customers to quickly deploy a myriad of private and corporate services, making them a competitive alternative to buying and maintaining their own infrastructure.

Although this may be feasible in some cases, a more careful analysis has immediately highlighted that the direct “Cloudification” of the IoT is generally problematic, since transferring all data from the device to the datacenter generates considerable latency and imposes a large computational and storage load at non-negligible economic cost. There are also regulatory constraints that limit the use of personal data in the cloud, as well as infrastructure complications such as the need for dedicated communication gateways between the IoT and the cloud.

In summary, the “Cloudification” of the IoT makes unwise use of the precious and expensive resources of the Cloud computing system, namely transmission capacity, data storage and processing capabilities. There are also a large number of services that require a nearly instant response upon reception of IoT data, which is not compatible with transferring and processing the huge amount of data produced by IoT devices in the cloud.

A first technical response to these issues was given by Cisco, which introduced the Fog Computing concept as an architectural model that, sitting between the Cloud and the IoT, extends Cloud Computing and Services to IoT objects at the ends of the network [3]. Like the Cloud, the Fog provides data, computing, storage, and end-user application services, but it supports a dense geographic distribution, aimed at getting close to IoT devices and providing support for object mobility. In this way the Fog reduces service latency, which can be decisive for critical services, improving their quality and the overall experience of end users.

The great interest in this new Fog architecture has led to the establishment of the OpenFog Consortium [4], consisting of research and industry giants such as ARM, Cisco, Dell, Intel, Microsoft and Princeton University, which recently released the first Reference Architecture. Fog Computing is defined there as a horizontal, system-level architecture that distributes computing, storage, control, and networking functions closer to end users along the continuum from cloud computing to IoT objects.

The architecture allows deployments to reside on multiple layers, while retaining all the benefits of cloud computing, such as containerization, virtualization, orchestration, and resource-efficient management. Processes are moved from the cloud to the edges of the network, near the IoT sensors and actuators, onto elements called Fog Nodes, which provide autonomous processing, storage and IP communication capabilities. These computing elements can be deployed anywhere, such as along a railway, on lighting poles or in a car, and can also operate while in motion.

Fig. 1. OpenFog Reference Architecture for Fog Computing (from [4])

The peculiarity of the OpenFog architecture is that it enables cloud-to-cloud and cloud-to-fog interfaces and communication flows, thus offering, with respect to other approaches, particular advantages captured by the SCALE acronym:

  • Security: data generated by IoT devices must be protected both in transit and in storage to ensure secure and trusted transactions; the integrity and availability of infrastructure and data must not be compromised;

  • Cognition: awareness of the goals of end users and of the surrounding environment; the architecture can also adapt connections and computing resources even when some of them are unavailable, so that it is autonomous and adaptive, starting with the objects at the extremes of the network;

  • Agility: rapid innovation and scalability within a common infrastructure, where the choice of the most suitable node depends on various factors such as the required speed of decision making; for example, for instant responses the processing node may be the generating device itself, while in other cases processing may be transferred to a fog layer or to the cloud;

  • Latency: support for real-time processing and cyber-physical control systems; data analysis is performed close to the device that generated the data, enabling an immediate response;

  • Efficiency: dynamic pooling of local resources left unused by participating end-user devices, so that computational resources are orchestrated and optimized.

Applications will need to be redesigned, starting from the gathering of elementary information from sensors, to support a new distribution of functions, previously conceived only for the cloud and now split between the Cloud and Fog levels of the environment. The most time-critical data will be processed in the first fog layer, while less critical data may be transferred to higher aggregation layers for analysis and treatment. The least critical data will then be brought to the cloud for historical analysis, big data analytics, and long-term storage.
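As an illustration of this redistribution of functions, the following minimal Python sketch routes each sensor reading either to the first fog layer or to the cloud according to how time-critical it is. All names (`Reading`, `FOG_LATENCY_BUDGET_MS`, the routing rule itself) are illustrative assumptions, not part of any fog specification.

```python
from dataclasses import dataclass

# Hypothetical latency budget (ms): readings that must be acted upon faster
# than this are kept in the first fog layer instead of going to the cloud.
FOG_LATENCY_BUDGET_MS = 100

@dataclass
class Reading:
    sensor_id: str
    value: float
    max_reaction_ms: int   # how quickly a decision based on this reading is needed

def route_reading(reading: Reading) -> str:
    """Decide where a sensor reading should be processed.

    Time-critical readings stay in the first fog layer; everything else is
    forwarded to higher aggregation layers / the cloud for batch analytics
    and long-term storage.
    """
    if reading.max_reaction_ms <= FOG_LATENCY_BUDGET_MS:
        return "fog"       # immediate, local processing
    return "cloud"         # historical analysis, big data analytics, storage

# Example: a smoke alarm needs a reaction within 50 ms, a footfall counter does not.
print(route_reading(Reading("smoke-01", 0.87, max_reaction_ms=50)))      # -> fog
print(route_reading(Reading("count-12", 143.0, max_reaction_ms=60000)))  # -> cloud
```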

The Fog Computing model is not a compulsory choice in all situations. Various scenarios can be better managed with Cloud Computing alone, but many others will be better implemented with fog extensions. The Cloud backend remains a key part of the architecture even with the introduction of Fog Computing layers. The distribution of tasks between Fog and Cloud depends on the specific application; it can be planned in advance, but also adapted dynamically to the status of key resources such as processing load, communication link saturation, storage capacity, detected security threats, unavailability of resources, battery consumption, cost targets, etc.
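A minimal sketch of such a dynamic adaptation is shown below. The `NodeStatus` fields and the thresholds are purely illustrative assumptions, not parameters defined by the paper or by any fog platform.

```python
from dataclasses import dataclass

@dataclass
class NodeStatus:
    cpu_load: float         # 0.0 - 1.0 fraction of CPU in use on the fog node
    link_saturation: float  # 0.0 - 1.0 fraction of uplink bandwidth in use
    storage_free_gb: float
    battery_pct: float      # 100.0 for mains-powered nodes

def place_task(status: NodeStatus, task_storage_gb: float) -> str:
    """Decide whether a task should run on the fog node or be offloaded.

    Purely illustrative policy: offload to the cloud when the fog node is
    overloaded, low on storage or battery; otherwise keep the task local,
    unless the uplink is already saturated (then local is the only option).
    """
    if status.link_saturation > 0.9:
        return "fog"    # uplink congested: offloading would only add latency
    if (status.cpu_load > 0.8
            or status.storage_free_gb < task_storage_gb
            or status.battery_pct < 20.0):
        return "cloud"  # fog node under pressure: hand the task to the backend
    return "fog"

print(place_task(NodeStatus(0.35, 0.20, 64.0, 100.0), task_storage_gb=2.0))  # -> fog
print(place_task(NodeStatus(0.95, 0.20, 64.0, 100.0), task_storage_gb=2.0))  # -> cloud
```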

Recent business research, notably by McKinsey [5], has examined the economic impact that IoT-based applications can bring, analyzing the potential benefits, including productivity improvements, time savings and improved asset utilization, as well as the value coming from reduced diseases, accidents, and deaths. The outcome stressed the importance of analyzing applications in the context of settings, the physical environments where these systems are deployed. The most relevant findings include how much IoT value is being created in business-to-business versus consumer markets, and which players in the value chain will capture the most value from IoT applications. Among the most relevant findings:

  • Interoperability among IoT systems plays a major role: most of the expected value can be unlocked only if multiple IoT systems work together and the data from the various systems is integrated and analyzed;

  • Most of the IoT data generated today is merely collected but neither stored nor used, so the potential value contained in these data is not exploited. They therefore represent a key source of big data, and of open data, that can be leveraged to capture value in several scenarios;

  • B2B applications of IoT have greater economic potential than consumer applications. While consumer uses of the IoT have received a lot of attention and show a tremendous potential for creating value, there is even greater potential value in business-to-business applications. A great deal of additional value can be created when consumer IoT systems, such as connected consumer health-care products, are linked to B2B systems, such as services provided by health-care providers and payers. This happens more frequently in environments with a high concentration of IoT devices.

Consequently, the typical scenarios to be considered with particular attention are:

  • smart IoT objects in mobility (cars, ships, drones, airplanes, smartphones)

  • the directions on which smart objects move (roads, railways, nautical routes, airways)

  • aggregation points for smart objects (ports, airports, railway/metro/bus stations, malls, parking areas, factories, hospitals, schools/universities, building/houses)

The adoption of the Fog offers the following benefits:

  • a globally distributed network improves fault tolerance and resilience, minimizing downtime,

  • better interconnection and balancing of processing loads,

  • better system scalability also guaranteed by virtualized and containerized systems,

  • better use of the network bandwidth, reducing transfers and avoiding congestion and bottlenecks,

  • reduced latency, which also results in better service quality,

  • optimized operating costs, by streamlining the use of processing, storage, and network resources,

  • more efficient security, by encoding data at the source and reducing transfers, thus reducing risk exposure,

  • better flexibility and agility of business models.

3 The Smart Fog-Hub Service (SFHS)

The EC Horizon 2020 program has recently funded a new research initiative (mF2C) bringing together relevant industry and academic players in the cloud arena, aimed at designing an open, secure, decentralized, multi-stakeholder management framework for Fog-to-Cloud (F2C) computing, including novel programming models, privacy and security, data storage techniques, service creation, brokerage solutions, SLA policies, and resource orchestration methods [6].

There is an increasing demand for evaluating and identifying new market sectors and opportunities, and a growing interest in the IoT evolution as a potential arena where the current Cloud offering could be enriched and differentiated. In this perspective, a relevant focus is suggested on setting up hubs in public environments (e.g. airports, train stations, hospitals, malls and related parking areas), capable of tracking the presence of people and other objects in the field and of developing value-added services on top, for proximity marketing, prediction of consumer paths and behavior, and real-time decision making.

The foreseen hubs could also be used as a planning tool for determining the number and distribution of people who use, or could potentially use, various services such as public transport. Such a hub can easily be considered a fog device that embeds cloud connectivity, either to process large amounts of data or to request extra data, perhaps coming from other nearby fogs.

As an additional opportunity to be evaluated, different fogs located at nearby sites (e.g. the airport and the train, bus or harbor stations) could interact, sharing the gathered data and customer behavior to improve the effectiveness of marketing proposals, provided that the identity of objects/customers is protected.

This scenario has been named the Smart Fog-Hub Service (SFHS). The use case is experimental and extends the concept of a “cloud hub” to a new concept of “fog hub”, driven by real market needs. In this scenario Tiscali believes that value is generated at the business services level, particularly in spaces with recurring concentrations of people and objects that can communicate and interact. Tiscali is therefore interested in setting up Fog Hubs in such scenarios to interact with all the objects within the coverage area.

Fig. 2. Airport scenario

Tiscali believes that the IoT will be driven by the business market rather than the consumer market, and that SFHS would be the best way to aggregate business users, design new business scenarios and create value. There is no doubt that the capabilities provided by mF2C will enable the distribution of data processing, reducing both the traffic load between cloud and hub and the latency of interactive services.

The envisioned Smart Fog-Hub Service should be set up in crowded public environments, so a preliminary version will be tested within the Tiscali Campus in Cagliari, and a final version will be deployed at the Cagliari Elmas airport. With this approach the whole infrastructure will be tested and validated in a real scenario, giving Tiscali the possibility to exploit the marketing potential of the developed services.

In the specific context of airports there is a growing number of objects related to passengers, partners, and people who work in this environment. The field includes the check-in area, the security control area, and the departure gates. The check-in and departure gate areas host several shops and other frequented places. The foreseen services, provided through a web portal, are oriented to tracking and engaging all people in the field, offering information and suggestions on the best way to use the available services: e.g. suggesting to departing passengers the moment with the shortest waiting time at Security Control, inviting them to move close to the gate or notifying the final call, or recommending relevant proposals and offers in shops close to the user (proximity marketing). All these suggestions can be refined according to the behavior and choices of the passengers.

The technological scenario will include the following elements:

  • Edge sensors, which can include smartphones, laptops, tablets and any other IoT device with a Wi-Fi connection; most of them will be data generators, while some could have computing power of their own and could potentially offer/share data and possibly also computing resources

  • Edge Fog, basically composed of the Fog-Hub, which will act as data collector and provide the processing power for the fog layer, and will include the following features:

    • a computing element with enough computing power to run the defined applications, analytics and management functions/tools,

    • a Wi-Fi AP to collect data from the detected objects within the covered field,

    • enough local storage to retain local and temporary processed data,

    • fast link interface with the cloud,

    • (optionally) Bluetooth LE beacons could be added, with the edge fog component running their management functions

  • Link connection between Edge Fog and Cloud,

  • Cloud, connected to the Edge Fog, which will be based on an OpenStack instance that will provide scalable computing power for massive data processing.

    Fig. 3. Airport mF2C topological scenario of Use Case 3

The resulting infrastructure will be based on standard components and protocols, will be sized according to the expected data volumes, and will be open to different devices made available by the project partners.
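The elements listed above could be captured in a simple deployment descriptor along the lines of the following sketch. All class and field names (`FogHub`, `wifi_ssid`, `cloud_endpoint`, etc.) and all values are illustrative assumptions, not part of the mF2C specification.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EdgeSensor:
    device_id: str
    kind: str                 # "smartphone", "laptop", "tablet", ...
    can_share_compute: bool = False

@dataclass
class FogHub:
    cpu_cores: int            # computing element for applications and analytics
    ram_gb: int
    storage_gb: int           # local storage for temporary processed data
    wifi_ssid: str            # Wi-Fi AP covering the field
    cloud_endpoint: str       # fast link interface with the cloud
    ble_beacons: Optional[int] = None  # optional Bluetooth LE beacons

@dataclass
class Deployment:
    hub: FogHub
    sensors: List[EdgeSensor] = field(default_factory=list)

# Example deployment for the airport scenario (all values are placeholders).
airport = Deployment(
    hub=FogHub(cpu_cores=16, ram_gb=64, storage_gb=2000,
               wifi_ssid="sfhs-airport", cloud_endpoint="https://cloud.example/api",
               ble_beacons=12),
    sensors=[EdgeSensor("phone-0001", "smartphone"),
             EdgeSensor("laptop-0042", "laptop", can_share_compute=True)],
)
print(len(airport.sensors), "edge devices registered on", airport.hub.wifi_ssid)
```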

The data collected by the edge devices include some device-specific and personal data, detailed position tracking, preferences inferred from portal navigation, and the different paths followed. These data have to be protected in the communication between the fog and the edge.

The additional workload on the networking elements will be managed with SDN/NFV to provide bandwidth optimization and low latency, while from a security and privacy perspective Fog and Cloud should be able to apply different policies, with anonymization of data when required. The edge fog element will be configured with some resiliency capabilities, at least for stored data and for fast reboot/recovery.
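A minimal sketch of the kind of anonymization that could be applied before data leave the fog is shown below. The salted-hash pseudonymization and the field names are assumptions made for illustration, not the mechanism actually adopted by the project.

```python
import hashlib
import os

# Salt kept only on the fog node and never sent to the cloud, so that
# cloud-side records cannot be linked back to a real device identifier.
NODE_SALT = os.urandom(16)

def pseudonymize(device_id: str) -> str:
    """Replace a device identifier (e.g. a MAC address) with a salted hash."""
    return hashlib.sha256(NODE_SALT + device_id.encode()).hexdigest()[:16]

def prepare_for_cloud(record: dict) -> dict:
    """Strip personal fields and pseudonymize the identifier before upload."""
    return {
        "device": pseudonymize(record["mac"]),
        "area": record["area"],          # coarse location only
        "timestamp": record["timestamp"],
        # name, exact position and portal preferences stay at the fog level
    }

sample = {"mac": "aa:bb:cc:dd:ee:ff", "name": "Jane Doe",
          "area": "gate-B3", "timestamp": 1500000000, "position": (39.25, 9.06)}
print(prepare_for_cloud(sample))
```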

In this scenario the described Fog-Hub cannot be a “cloud hub”, because the amount of data to be processed and managed would exceed the network capabilities; part of the computation should therefore be performed at the Fog level, so the hub is better described as a Fog-enabling hub for ISPs, namely the Smart Fog-Hub Service.

4 Benefits

With this kind of fog hub, proximity marketing and social aggregation would be enabled, with the possibility of collecting a large amount of information on the objects moving within the covered environment, offering connectivity, customized advertising and interactive applications, and giving connected users the chance to share or offer some of their resources. This could require new billing/revenue-sharing models and tools that also take into account the correct use of users’ personal information.

These are some of the Expected Benefits:

Proximity marketing and enhanced user engagement: the close interaction with a huge number of users enables much more effective, customized offers and advertising, differentiating between B2B and B2C customers based on user preferences and behavior, with the chance to measure the effectiveness of the proposition in terms of purchased products/services. The proposition can be continuously refined through geo-fencing with a predefined set of boundaries, so that the recipient of the message receives real value rather than a mere communication. It becomes possible to organize customized promotional initiatives, ex ante or ex post with respect to the presence in the area, and to prepare campaigns targeted at categorized users, e.g. airport workers, identified as such because they always pass through between eight and nine o’clock, while of course respecting privacy laws.
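In its simplest form, the geo-fencing mentioned above reduces to a distance check between the tracked device and a shop’s boundary. The following sketch uses the haversine formula with hypothetical coordinates, purely as an illustration of the idea.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_geofence(user_pos, fence_center, fence_radius_m):
    """True when the user is within the predefined boundary of a shop or area."""
    return haversine_m(*user_pos, *fence_center) <= fence_radius_m

# Hypothetical coordinates of a duty-free shop inside the terminal.
shop = (39.2515, 9.0543)
if inside_geofence((39.2516, 9.0544), shop, fence_radius_m=30):
    print("Push a personalized offer for the nearby shop")
```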

According to a recent report from the Politecnico di Milano [7], 80% of users (among those who usually browse the Internet) declare that the web is their first source of information on a product or service to buy, 77% compare prices on the Internet, and one user in three chooses what to buy by looking up information on a mobile device, typically a smartphone. The smartphone is therefore an appealing channel for engaging customers.

Data collection and analysis: collecting a lot of data from objects on the move enables running advanced machine learning algorithms to extract user profiles and demands, but also to trace trends or identify newly required services. Most IoT data are currently neither used nor stored [5]; current use is mostly limited to anomaly detection and real-time control, so a great deal of additional value remains to be captured by using more data and by deploying more sophisticated applications, such as analyzing workflows to optimize operating efficiency. Moving from Descriptive Analytics (what has happened) to Predictive Analytics (what will happen) to Prescriptive Analytics (what can be done) can drive the analysis of the data and generate value.

In relation to the multi-SFHS scenario (airport, train station, bus station, etc.), it will be possible to answer several questions, such as: which are the most popular areas? Where do users stop the most? What are the average times of stay in each area?
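Questions like these can be answered with simple aggregations over the tracking events collected by the hubs. The sketch below assumes a hypothetical event format of `(device, area, enter_ts, exit_ts)` and computes visit counts and average dwell times per area; it is an illustration, not the project’s analytics pipeline.

```python
from collections import defaultdict

# Assumed event format: (pseudonymized device id, area name, enter ts, exit ts), times in seconds.
events = [
    ("dev-a1", "check-in", 1000, 1600),
    ("dev-a1", "gate-B3",  2000, 4400),
    ("dev-b2", "check-in", 1100, 1400),
    ("dev-b2", "shops",    1500, 3300),
]

visits = defaultdict(int)
dwell_seconds = defaultdict(float)

for device, area, enter_ts, exit_ts in events:
    visits[area] += 1
    dwell_seconds[area] += exit_ts - enter_ts

# Most popular areas and average time of stay per area.
for area in sorted(visits, key=visits.get, reverse=True):
    avg_min = dwell_seconds[area] / visits[area] / 60.0
    print(f"{area}: {visits[area]} visits, avg stay {avg_min:.1f} min")
```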

Social integration: offering connectivity and interactive applications provides a way for connected users to share or offer some of their resources, under user-defined access rules.

New revenue models: sharing users’ resources may drive new billing models and SLA policies, not only between users and traditional providers but also among users themselves. Business-to-Business (B2B) applications can create more value than pure consumer applications, and new business models for users and companies are emerging. The Internet of Things will enable, and in some cases force, new business models. The new “as-a-service” approach can give the supplier of services a more intimate tie with customers that competitors would find difficult to disrupt. The IoT will speed up this evolution because it produces huge quantities of a type of asset that can be sold or exchanged: data. The ability to identify facts and hidden relations in the data available to organizations not only allows processes to be optimized and competitiveness to be increased, but can also open new opportunities for value creation. Data monetization is the process of generating new revenues through the sale or exchange of data in the organization’s possession and through their exploitation to generate new products and services.

Improved data privacy and security: the management of users’ personal data is done at the edge level, separated from the cloud; storage encryption and anonymization techniques are applied before moving data to the cloud, thus reducing the risk of data disclosure and preserving the confidentiality of data owners.

Optimized use of resources and services in the airport: the engagement and continuous tracking of people moving through the airport makes it possible to offer suggestions oriented to an optimal use of the available resources and services, and to an improved and pleasant quality of experience for passengers and partners. At the same time, all dealers and service providers on the airport site will be helped in their marketing proposals and offers by the (anonymized) data collected from all the people in the field.

5 Conclusions

This paper began by highlighting the Fog Computing concept as an architectural model that gives a better answer than the direct “Cloudification” approach. Fog computing, acting as the glue between the Cloud and the IoT, extends Cloud Computing and Services to IoT objects at the ends of the network.

An evaluation of current business trends in the IoT was then developed, noting the importance of the physical environments where such systems are deployed, and observing that a relevant share of IoT value is expected to be created in business-to-business rather than consumer markets, and that specific players in the value chain will capture most of the value from IoT applications. This happens mostly in environments with high concentrations of IoT devices.

The experimental use case on the Smart Fog-Hub Service (SFHS) was then described, detailing its main objectives of exploring and analyzing proximity marketing and new revenue models through data collection and advanced analytics, and of foreseeing new business models. Notably, data processing has to be distributed between the cloud and fog layers, because the amount of data to be managed can exceed the network, storage and computing capabilities of the fog layer alone.

Finally, the paper introduced the main expected benefits. Proximity marketing and social aggregation would be enabled, allowing the collection of relevant information on the objects moving within the covered environment and offering connectivity, customized advertising and interactive applications that give connected users the chance to share or offer some of their resources, under controlled policies on the use of users’ personal information.