We therefore expect the rapid emergence of fog applications, defined as applications that build a seamless interactive continuum between the physical and digital worlds, blending atoms and bits. Besides the obvious use of these technologies for entertainment, numerous applications are also expected in domains as diverse as live event coverage, health care, engineering, real estate, the military, retail, and education.
Following the ever-increasing importance of mobile devices and infrastructures, end users will expect fog computing applications to be mobile. However, such applications will require massive computation and storage capacities to remain continuously available and responsive while processing potentially very large volumes of data.
Yet, although large quantities of computing resources are readily available in cloud platforms, traditional cloud infrastructures are ill-equipped to fully address the challenges of future fog computing applications. Because they rely on a handful of very large data centers, public cloud providers are often located very far from their end users, which does not match the requirements of fog applications.
To address these challenges, the mobile networking industry is heavily investing in fog computing platforms located at the edge of the networks, in immediate proximity to the end users [14, 15, 16]. Instead of treating the mobile operator's network as a high-latency dumb pipe between the end users and the external service providers, fog platforms aim at deploying cloud functionalities in immediate proximity to the end users, inside or close to the mobile access points. Doing so will deliver added value to content providers and end users by enabling new types of user experience. Simultaneously, it will generate extra revenue streams for the mobile network operators, by allowing them to position themselves as fog computing operators and to rent their already-deployed infrastructure to content and application providers.
Achieving this vision will require an active fog computing technology and innovation ecosystem as well as a highly skilled workforce capable of designing future fog computing platforms and exploiting them to their fullest extent. The FogGuru project will foster the emergence of a new generation of researchers and professionals, able to work at the edge between science and innovation, effectively designing the technologies needed to deploy and operate scalable fog computing infrastructures and developing innovative fog computing applications. The FogGuru research will be carried out by eight talented Early-Stage Researchers (ESRs) who will jointly develop the missing technologies while training to become fog computing gurus.
The ESRs’ work will be organized along the following major Research Objectives (RO) which are poorly addressed by current research efforts:
RO1: To Manage Resources and Applications in Scalable Fog Platforms. While traditional clouds are composed of many powerful machines located in a handful of data centers and interconnected by very high-speed networks, fog computing platforms are composed of a very large number of points-of-presence, each hosting a few weak and potentially unreliable servers, interconnected with each other by commodity long-distance networks. This broad geographical distribution creates difficult challenges such as optimizing the usage of resources whose distribution may not always match the distribution of user demands, migrating computations and data in the presence of end-user mobility, and automatically detecting and correcting anomalies.
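To make the placement challenge in RO1 concrete, the following sketch (not part of the proposal; all point-of-presence names, latencies, and capacities are hypothetical) assigns user requests greedily to the lowest-latency point-of-presence that still has free capacity, illustrating how demand can overflow to a more distant site when the nearest one is saturated:

```python
# Illustrative sketch: greedy latency-aware placement of application
# instances onto edge points-of-presence (PoPs). PoP names, latencies,
# and capacities below are hypothetical examples.

def place(requests, pops):
    """Assign each user request to the lowest-latency PoP with free capacity.

    requests: list of (user, {pop_name: latency_ms}) pairs
    pops:     dict mapping pop_name -> remaining capacity (instance slots)
    Returns a dict user -> pop_name (None if no PoP has capacity left).
    """
    placement = {}
    for user, latencies in requests:
        # Try candidate PoPs from lowest to highest latency.
        for pop in sorted(latencies, key=latencies.get):
            if pops.get(pop, 0) > 0:
                pops[pop] -= 1
                placement[user] = pop
                break
        else:
            placement[user] = None  # demand exceeds local capacity
    return placement

requests = [("alice", {"pop-a": 5, "pop-b": 20}),
            ("bob",   {"pop-a": 7, "pop-b": 12}),
            ("carol", {"pop-a": 9, "pop-b": 15})]
pops = {"pop-a": 2, "pop-b": 1}
assignment = place(requests, pops)
print(assignment)  # carol overflows to pop-b once pop-a is full
```

A production placement service would of course also account for migration on user mobility and server failures, which a one-shot greedy pass cannot capture; the sketch only shows the core mismatch between resource and demand distributions.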
RO2: To Adapt Stream-Processing Middleware Systems for Fog Applications. To enable the development of innovative applications which fully exploit the specificities of fog computing platforms, new programming abstractions will be necessary. We strongly believe that the Real-Time Stream Processing model, which was initially developed by the Data Analytics community, is also extremely well-suited for fog computing applications: it provides developers with an easy-to-understand development environment, while harnessing the full capacity of fog infrastructures to achieve extremely high performance. Designing an application as a workflow of operators offers a simple yet powerful abstraction which facilitates application deployment and run-time management in complex distributed environments. This research objective aims at designing and developing the missing functionalities to adapt stream-processing middleware systems to fog computing environments.
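The "workflow of operators" abstraction underlying RO2 can be sketched as follows (an illustrative toy, not part of the proposal: operator names are invented, and a real deployment would run on a stream-processing middleware rather than plain Python generators):

```python
# Illustrative sketch: an application expressed as a workflow of stream
# operators (source -> filter -> map -> sink), in the spirit of the
# real-time stream processing model described in RO2.

def source(readings):
    """Emit raw sensor readings as (sensor_id, value) tuples."""
    yield from readings

def op_filter(stream, threshold):
    """Keep only readings whose value exceeds a threshold."""
    for sensor, value in stream:
        if value > threshold:
            yield sensor, value

def op_map(stream):
    """Convert raw values into normalized alert records."""
    for sensor, value in stream:
        yield {"sensor": sensor, "alert_level": value / 100.0}

def sink(stream):
    """Collect final records (stand-in for an actuator or dashboard)."""
    return list(stream)

# Wire the operators into a dataflow and run it on sample input.
readings = [("s1", 42), ("s2", 87), ("s3", 95)]
alerts = sink(op_map(op_filter(source(readings), threshold=80)))
print(alerts)  # [{'sensor': 's2', 'alert_level': 0.87}, {'sensor': 's3', 'alert_level': 0.95}]
```

Because each operator only consumes and produces a stream, a middleware is free to place individual operators on different fog points-of-presence and to restart or migrate them independently, which is precisely what makes the abstraction attractive for distributed fog environments.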
RO3: To Develop Blueprints for Innovative Fog Applications. Fog computing enables a whole new range of IoT-driven applications: (i) latency-critical applications which require client-server latencies on the order of milliseconds to ensure a smooth user experience; and (ii) context-aware and geo-distributed applications where processing should be moved closer to the data source in order to reduce network traffic and enhance scalability. The first class includes, e.g., virtual reality gaming applications, hyper-interactive shopping apps, and remote treatment applications in healthcare. The second class spans from traffic-aware adaptive traffic light control systems to IoT big data analytics services, and from decentralized analysis of security video streams to the monitoring of wind farms. This research objective will deliver blueprints for both types of applications, in the form of templates running on top of the stream-processing middleware developed in RO2 and further verticalized for experimentation in the smart city of Valencia.
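The benefit of the second application class (moving processing closer to the data source) can be illustrated with a minimal sketch (not part of the proposal; site names and readings are hypothetical): traffic-sensor readings are aggregated at each edge site, so only compact per-site summaries, rather than every raw reading, cross the long-distance network to the central cloud:

```python
# Illustrative sketch: geo-local aggregation of traffic-sensor readings
# at the edge, forwarding only one summary per site to reduce the volume
# of data sent over the long-distance network.

from collections import defaultdict

def local_aggregate(readings):
    """Reduce (site, vehicle_count) readings to one average per site."""
    totals = defaultdict(lambda: {"count": 0, "total": 0})
    for site, vehicles in readings:
        totals[site]["count"] += 1
        totals[site]["total"] += vehicles
    # Only these compact per-site summaries leave the edge site.
    return {site: s["total"] / s["count"] for site, s in totals.items()}

readings = [("crossing-1", 12), ("crossing-1", 18), ("crossing-2", 5)]
summary = local_aggregate(readings)
print(summary)  # {'crossing-1': 15.0, 'crossing-2': 5.0}
```

In a blueprint built on the RO2 middleware, such an aggregation would be one windowed operator pinned to the edge site, with only its output stream routed toward the cloud.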