Fog Computing

by James G. Barr

Docid: 00018045

Publication Date: 2212

Publication Type: TUTORIAL

Preview

With the advent of the Internet and high-speed telecommunications,
traditional “on-premises” computing systems and services were supplemented
by systems and services made available via “the cloud,” a large and
loosely-coupled set of commercial data centers providing software,
hardware, and computing platforms to enterprise clients on a pay-as-you-go
basis. While the cloud computing model is enormously successful – and may,
in time, replace nearly all on-premises offerings – a number of recent
developments like the proliferation of smartphones and the continuing
evolution of the Internet of Things (IoT) have prompted computer planners
to consider the suddenly device-rich space between on-premises and cloud
environments. Invoking another meteorological metaphor, this region
between on-premises and cloud has been dubbed “the Fog,” and the data
processing within that region, “fog computing,” a term coined by Cisco.

Report Contents:

Executive Summary


With the advent of the Internet and high-speed telecommunications,
traditional “on-premises” computing systems and services were supplemented
by systems and services made available via “the cloud,” a large and
loosely-coupled set of commercial data centers providing software,
hardware, and computing platforms to enterprise clients on a pay-as-you-go
basis.

Related Faulkner Reports:

  • Cloud Computing Concepts Tutorial
  • Edge Computing Tutorial

While the cloud computing model is enormously successful – and may, in
time, replace nearly all on-premises offerings – a number of recent
developments, like the proliferation of smartphones and the continuing
evolution of the Internet of Things (IoT), have prompted computer planners
to consider the suddenly device-rich space between on-premises and cloud
environments.

Invoking another meteorological metaphor, this region between on-premises
and cloud has been dubbed “the Fog,” and the data processing within that
region, “fog computing,” a term coined by Cisco.

As observed by analyst Ramya Mohanakrishnan, the goal of fog computing is
to provide short-term, device-level computing and analytics, thus
reserving cloud infrastructure for “long-term [data storage] and
resource-intensive analytics.”1

Offering a more formal description, the US National Institute of
Standards and Technology (NIST) declares that “Fog computing is a layered
model for enabling ubiquitous access to a shared continuum of scalable
computing resources. The model facilitates the deployment of distributed,
latency-aware applications and services, and consists of fog nodes
(physical or virtual), residing between smart end-devices and centralized
(cloud) services.

“The fog nodes are context aware and support a common data management and
communication system. They can be organized in clusters – either
vertically (to support isolation), horizontally (to support federation),
or relative to fog nodes’ latency-distance to the smart end-devices. Fog
computing minimizes the request-response time from/to supported
applications, and provides, for the end-devices, local computing resources
and, when needed, network connectivity to centralized services.”2
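The latency-distance organization NIST describes can be sketched in a few lines of Python. This is a minimal illustration, not any particular vendor's API; the node names and latency figures are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class FogNode:
    name: str
    latency_ms: float   # measured latency-distance to the smart end-device

def nearest_node(nodes):
    """Select the fog node with the smallest latency-distance,
    minimizing request-response time for the end-device."""
    return min(nodes, key=lambda n: n.latency_ms)

cluster = [FogNode("gateway-a", 4.0),
           FogNode("router-b", 12.5),
           FogNode("cloudlet-c", 7.2)]
print(nearest_node(cluster).name)  # gateway-a
```

In practice a fog orchestrator would weigh load and capability alongside latency, but the latency-distance criterion above is the one NIST singles out.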

Figure 1 illustrates the relationship between cloud computing (top
layer), fog computing (middle layer), and smart end-devices computing
(bottom layer).

Figure 1. Cloud – Fog – Smart End-Devices Ecosystem

Source: NIST3

Fog Node

The fog node is the core component of the fog computing
architecture.  A fog node may be composed of:

  • Physical elements (like gateways, switches, routers, servers, etc.) or
  • Virtual elements (like virtual switches, virtual machines,
    “cloudlets”, etc.)

Note: A cloudlet is a small-scale data center designed to
provide cloud computing services to mobile devices.

Fog nodes are tightly coupled with smart end-devices and provide
computing resources to these devices.4

Similar to cloud deployment models, a fog node may be:

Private – “A fog node that
is provisioned for exclusive use by a single organization comprising
multiple consumers (e.g., business units).”

Community – “A fog node that
is provisioned for exclusive use by a specific community of consumers from
organizations that have shared concerns (e.g., mission, security
requirements, policy, and compliance considerations.)”

Public – “A fog node that is
provisioned for open use by the general public.”

Hybrid – “A complex fog node
that is a composition of two or more distinct fog nodes (private,
community, or public) that remain unique entities, but are bound together
by standardized or proprietary technology that enables data and
application portability.”5
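The four deployment models can be represented as a small data model. The sketch below is illustrative only; the class and node names are invented, and a hybrid node is modeled, per the NIST definition, as a composition of distinct member nodes.

```python
from dataclasses import dataclass
from enum import Enum

class DeploymentModel(Enum):
    PRIVATE = "private"      # exclusive use by a single organization
    COMMUNITY = "community"  # shared by organizations with common concerns
    PUBLIC = "public"        # open use by the general public

@dataclass
class Node:
    name: str
    model: DeploymentModel

@dataclass
class HybridNode:
    """Two or more distinct fog nodes that remain unique entities
    but are bound together for data and application portability."""
    members: list

    def models(self):
        return {n.model for n in self.members}

hybrid = HybridNode([Node("plant-gw", DeploymentModel.PRIVATE),
                     Node("city-edge", DeploymentModel.PUBLIC)])
print(sorted(m.value for m in hybrid.models()))  # ['private', 'public']
```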

Related Concepts

Fog computing is often associated with, and sometimes confused with, two
complementary computing models: “mist computing” and “edge computing.”

Mist Computing

Mist computing is a “rudimentary form of fog computing that resides …
at the edge of the network fabric, bringing [fog computing] closer to the
smart end-devices.”  Mist computing “is not viewed as a mandatory
[element] of fog computing.”6

Edge Computing

As the term implies, “edge computing” is computing at the network edge.

Edge computing involves the positioning of compute, storage, and
networking resources proximate to the end users they serve and the various
devices these end users employ.  The impetus behind edge computing
was the realization that some data, like data generated by autonomous
automobile sensors, must be processed immediately.  It cannot be
shuttled to a cloud repository, processed by a backend analytics package,
and returned to an automotive steering system, at least not in time to
prevent an accident.  The data must be processed on the spot, or “at
the network edge.”  The situation is analogous to a paramedic
attending an accident victim.  The paramedic can relay the patient’s
vital signs to the hospital, but must act immediately to stop any major
bleeding.

Even with local processing, “edge computing can [still] lead to large
volumes of data being transferred directly to the cloud.  This can
affect system capacity, efficiency, and security.  Fog computing
addresses this problem by inserting a processing layer between the edge
and the cloud.”7
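The fog layer's role as a filter between edge and cloud can be sketched as a simple aggregation step. This is a toy illustration under assumed names; in the example, raw edge readings are condensed at the fog node so that only a summary and out-of-range samples travel upstream.

```python
def fog_summarize(readings, threshold):
    """Run at the fog layer: condense raw edge readings into a compact
    summary, forwarding only aggregates and out-of-range samples upstream."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": round(sum(readings) / len(readings), 2),
        "anomalies": anomalies,   # only these need cloud-side attention
    }

# Toy example: temperature samples collected from edge sensors
raw = [20.1, 20.3, 55.0, 20.2]
payload = fog_summarize(raw, threshold=40.0)
print(payload)  # {'count': 4, 'mean': 28.9, 'anomalies': [55.0]}
```

The cloud receives one small payload instead of four raw samples; scaled to terabytes of sensor traffic, that reduction is the capacity and efficiency gain the passage above describes.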

The Market


As reported by Research and Markets, the global fog computing market,
valued at $151.7 million in 2021, is expected to reach $322.7 million by
2027, exhibiting a compound annual growth rate (CAGR) of 12.7 percent
during the 2022-2027 forecast period.
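Forecasts like this can be sanity-checked with the standard CAGR formula. Compounding the 2021 valuation over the six years to 2027 yields roughly 13.4 percent, slightly above the quoted 12.7 percent; the difference presumably reflects rounding or a different base value for the 2022-2027 window in the source.

```python
def cagr(begin_value, end_value, periods):
    """Compound annual growth rate: (end / begin) ** (1 / periods) - 1."""
    return (end_value / begin_value) ** (1 / periods) - 1

# $151.7M in 2021 growing to $322.7M in 2027 spans six annual periods
rate = cagr(151.7, 322.7, 6)
print(f"{rate:.1%}")  # 13.4%
```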

Market Drivers

The rapid adoption of fog computing is directly attributable to the
growth of the Internet of Things (IoT), as fog computing “facilitates
computing, storage, control, and networking services between [smart
end-devices] and … data centers.”

In addition, “the emerging trend of connected and modern vehicles is
positively influencing the deployment of fog computing displays for
car-to-car connectivity.  Moreover, governments of several countries
are investing in developing smart cities, which is creating a positive
market outlook.”

Market Players

Key players in the fog computing space include:

  • ADLINK Technology
  • Cisco
  • Ericsson
  • Dell Technologies
  • Johnson Controls
  • Fujitsu
  • General Electric
  • Hitachi Vantara
  • Huawei Technologies
  • IBM
  • Oracle
  • Toshiba8

OpenFog Consortium

Helping stimulate interest in fog computing was the OpenFog Consortium. As
described by the OPC Foundation, “The OpenFog Consortium (OpenFog) is a
public-private ecosystem formed to accelerate the adoption of fog
computing in order to solve the bandwidth, latency and communications
challenges associated with the Internet of Things (IoT), Artificial
Intelligence, Robotics, the Tactile Internet and other advanced concepts
in the digitized world.  It was founded by ARM, Cisco, Dell, Intel,
Microsoft and [the] Princeton University Edge Computing Laboratory in
November 2015.”  In 2019, OpenFog merged with the Industrial Internet
Consortium, which continues its work.

Use Cases


Although cloud computing is a well-established data processing model, fog
computing is still an emergent field intended to address the various
latency issues surrounding IoT and Industrial IoT (IIoT) devices. At
present, prominent use cases include smart buildings, smart cities, video
surveillance, and healthcare.

Smart Buildings

A smart building is any structure that utilizes automated procedures to
regulate building operations and improve building sustainability. 
These operations can include heating, ventilation, and air conditioning
(HVAC), lighting, access control and security, elevators, fire safety, air
and water quality, digital signage, and energy management.  Fog
computing is a principal enabling technology class for smart buildings.

Smart Cities

Smart buildings are the indispensable “building blocks” of “smart
cities,” a concept that combines smart physical facilities with smart
infrastructure, smart services, and, of course, smart citizens.  As
with smart buildings, smart cities rely heavily on fog computing
capabilities.

Video Surveillance

Analyst Mohanakrishnan reveals that “The most prevalent example of fog
computing is perhaps video surveillance, given that continuous streams of
videos are large and cumbersome to transfer across networks.  The
nature of the involved data results in latency problems and network
challenges.  Costs also tend to be high for storing media
content.  Video surveillance is used in malls and other large public
areas and has also been implemented in the streets of numerous
communities.  Fog nodes can detect anomalies in crowd patterns and
automatically alert authorities if they notice violence in the footage.”9
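The fog-side surveillance pattern described above can be sketched as a local inspection loop. This is a deliberately simplified stand-in: the anomaly detector and alert channel are placeholders, and a real deployment would run a vision model on actual video frames rather than scalar scores.

```python
def monitor(frames, is_anomalous, alert):
    """Fog-side loop: inspect each frame locally, alert immediately on
    anomalies, and forward only flagged frame indices upstream."""
    flagged = []
    for i, frame in enumerate(frames):
        if is_anomalous(frame):
            alert(i)            # e.g., notify authorities from the fog node
            flagged.append(i)
    return flagged

# Toy stand-in: a "frame" is reduced to a crowd-disturbance score
frames = [0.2, 0.3, 0.95, 0.4]
alerts = []
flagged = monitor(frames, is_anomalous=lambda f: f > 0.9, alert=alerts.append)
print(flagged)  # [2]
```

The key property is that the continuous video stream never leaves the fog node; only alerts and flagged indices do, which is what sidesteps the latency and storage costs Mohanakrishnan notes.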

Healthcare

Modern healthcare delivery is improving through the use of sophisticated
medical sensors, tools that detect specific biological, chemical, or
physical processes and then transmit the data for analysis.  The data
collected can be used by a clinician to detect and diagnose a medical
condition, or by a health management specialist to gain insight into how
the human body functions.  New sensors are being developed to help
arrest the aging process; enable remote monitoring of aging or at-risk
individuals; and – in their current commercial application – to improve
physical fitness and performance.  Since data delays can be critical
to patient outcomes, fog computing helps mitigate the latency issues
inherent in cloud-only processing.

Bottom Line

In summing up the utility of fog computing, analyst Mohanakrishnan
observes that “Fog computing enhances business agility while improving
[quality of service].  The faster the information is processed, the
better the experience for users.  This also means that employees do
not need to operate on a choked-up network, and companies need not pay
insane amounts for extended cloud storage.  Considering the
many positives and accelerants of fog computing, companies need to
consider this system as naturally as they consider cloud computing while
building their infrastructure.”10

Recommendations


When to Consider Fog

Analyst Amakiri Welekwe suggests that enterprise planners should consider
fog computing options if:

  • “You have IoT-based systems with geographically dispersed end devices
    generating data in the order of terabytes, and where connectivity to
    the cloud is irregular or not feasible.
  • “Massive amounts of data are constantly being collected from data
    sources such as connected cars, vehicles, ships, factory floors,
    roadways, farmlands, railways, etc., and transmitted to the cloud.
  • “You have to regularly analyze and respond to time-sensitive generated
    data in the order of seconds or milliseconds.”11
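Welekwe's three criteria lend themselves to a simple screening function. The sketch below is a planning aid only; the thresholds (one terabyte, one second) are illustrative assumptions, not figures from the source.

```python
def consider_fog(data_volume_tb, cloud_link_reliable, response_deadline_s):
    """Rough screen based on the three criteria above; thresholds are
    illustrative, not prescriptive."""
    reasons = []
    if data_volume_tb >= 1 and not cloud_link_reliable:
        reasons.append("terabyte-scale data with irregular cloud connectivity")
    if data_volume_tb >= 1:
        reasons.append("massive volumes constantly transmitted to the cloud")
    if response_deadline_s <= 1:
        reasons.append("responses needed in seconds or milliseconds")
    return reasons

# A fleet of connected vehicles: 5 TB/day, spotty links, 50 ms deadlines
print(len(consider_fog(5, False, 0.05)))  # 3
```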

Fog, Edge, Cloud Planning

Before pursuing a fog or edge computing solution, enterprise planners
should develop an overall strategy, one that examines how each element in
their information technology portfolio – especially IoT elements – relates
to the others and to the whole of their information infrastructure.
The result should be a comprehensive plan for implementing fog, edge,
and cloud computing, and for expanding the enterprise’s “off-premises”
presence.

Creating a Fog Console

As a practical matter, analyst Mohanakrishnan asserts that
“Administrators must track all deployed fog nodes within the system and
decommission them when required.  A central view of this
decentralized infrastructure can keep things in order and eliminate
vulnerabilities that arise out of zombie fog devices.  Besides a
management console, a robust reporting and logging engine makes compliance
audits easier to handle since fog components are bound by the same
mandates as cloud-based services.”12
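The node-tracking console Mohanakrishnan describes can be sketched as a small registry. This is a minimal illustration under assumed names, not a real management product: it records registrations and decommissions in an audit log and surfaces "zombie" nodes that stay marked active without reporting in.

```python
from datetime import datetime, timezone

class FogConsole:
    """Minimal sketch of a central view over decentralized fog nodes:
    registration, decommissioning, and an audit log for compliance."""
    def __init__(self):
        self.nodes = {}        # node_id -> "active" | "decommissioned"
        self.audit_log = []    # (ISO timestamp, event) tuples

    def _log(self, event):
        self.audit_log.append((datetime.now(timezone.utc).isoformat(), event))

    def register(self, node_id):
        self.nodes[node_id] = "active"
        self._log(f"registered {node_id}")

    def decommission(self, node_id):
        self.nodes[node_id] = "decommissioned"
        self._log(f"decommissioned {node_id}")

    def zombies(self, reporting_ids):
        """Nodes still marked active that no longer report in --
        candidates for decommissioning before they become a vulnerability."""
        return [n for n, s in self.nodes.items()
                if s == "active" and n not in reporting_ids]

console = FogConsole()
console.register("gw-01")
console.register("gw-02")
console.decommission("gw-01")
print(console.zombies(reporting_ids={"gw-02"}))  # []
```

Every state change lands in `audit_log`, which is the property that makes compliance audits tractable when fog components fall under the same mandates as cloud services.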

Fog Security and Continuity

Not surprisingly, fog computing environments are vulnerable to common
security and privacy threats, including tampering, jamming, eavesdropping,
denial of service (DoS), man-in-the-middle attacks, and session
hijacking.  Any fog computing implementation plan should include a
set of robust and reliable fog security measures.13

Similarly, as fog computing becomes more integral to enterprise
operations, the enterprise business continuity and incident response plans
should provide for fog recovery in the event of fog service interruptions.


References

About the Author


James G. Barr is a leading business continuity analyst
and business writer with more than 40 years’ IT experience. A member of
“Who’s Who in Finance and Industry,” Mr. Barr has designed, developed, and
deployed business continuity plans for a number of Fortune 500 firms. He
is the author of several books, including How to Succeed in Business BY
Really Trying, a member of Faulkner’s Advisory Panel, and a
senior editor for Faulkner’s Security Management Practices.
Mr. Barr can be reached via e-mail at jgbarr@faulkner.com.
