Considering the features and attributes of different technologies, it is
clear that Fog Computing is the most suitable one for hosting Nano
Datacenters. As mentioned before, the energy consumption and heat generation
of these devices are very low, which makes them more eco-friendly. The
relatively low cost of Nano Datacenters directly benefits scalability, as it
facilitates the expansion of the operational space. The distributed nature of
these nodes, along with device heterogeneity, imposes a burden on system
designers. Nor is this the only problem, because privacy, security, and
resource allocation are also affected. A Fog Nano Datacenter may not be able
to move by itself, but due to its small size and low weight, it can be
mounted on moving objects.
5) QoS in Fog Computing
The main reason for designing and developing Fog Computing is to provide
quality of service for a variety of applications. Fog Nano Datacenters are
deployed at the edge of the network, and this kind of architecture has
considerable advantages. Studies show that Fog Computing can affect multiple
parameters and metrics. In the following, we list them and explain how Fog
Computing affects each one.
Delay: Cloud Computing is a powerful technology that offers and guarantees
many services, but low delay is certainly not one of them. Cloud servers are
typically far from requesters. Along the way, there is congestion, packet
loss, retransmission, and queuing, which causes requests to reach Cloud
servers with greater delay. Fog nodes respond to requests locally, so in this
architecture delays are far less likely. The lower delay makes it possible
for Fog Computing to function well in real-time applications.
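The delay argument above can be illustrated with a back-of-the-envelope model in which one-way delay is the sum of per-hop processing and queuing delays. All numbers and the hop counts below are hypothetical, chosen only to show the trend, not measurements from any real deployment.

```python
# Illustrative model: end-to-end delay as the sum of per-hop delays.
# Per-hop timings and hop counts are assumptions for illustration only.

def end_to_end_delay_ms(hops, per_hop_ms=5.0, queuing_ms=2.0):
    """Approximate one-way delay: processing plus queuing at each hop."""
    return hops * (per_hop_ms + queuing_ms)

cloud_delay = end_to_end_delay_ms(hops=12)  # many hops to a distant Cloud server
fog_delay = end_to_end_delay_ms(hops=1)     # one hop to a Fog Nano Datacenter

print(cloud_delay)  # 84.0
print(fog_delay)    # 7.0
```

Even with identical per-hop costs, the one-hop Fog path is an order of magnitude faster, which is why real-time applications favor edge placement.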
Jitter: Jitter is the variation in the delay of received requests. In Cloud
Computing, due to problems such as congestion and queuing delays, requests
may be delivered via different routes. This flaw can completely disrupt
services such as VoIP and video streaming: requests arrive unevenly at
different time intervals, requiring various mechanisms to reorder them, which
causes additional delay and subsequently increases response time. In Fog
Computing, Nano Datacenters are one hop away from IoT devices, so they can
help address this problem.
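One simple way to quantify the jitter described above is the mean absolute difference between consecutive one-way delays. The sketch below uses hypothetical delay samples (in milliseconds) for a multi-route Cloud path versus a one-hop Fog path; the numbers are assumptions for illustration.

```python
# Minimal sketch: jitter as the mean absolute difference between
# consecutive delay samples. Delay values (ms) are hypothetical.

def jitter_ms(delays):
    """Mean absolute difference of consecutive one-way delays."""
    diffs = [abs(b - a) for a, b in zip(delays, delays[1:])]
    return sum(diffs) / len(diffs)

cloud_delays = [80, 95, 70, 110, 85]  # multi-route Cloud path: uneven arrivals
fog_delays = [5, 6, 5, 7, 6]          # one-hop Fog path: nearly constant

print(jitter_ms(cloud_delays))  # 26.25
print(jitter_ms(fog_delays))    # 1.25
```

A receiver seeing the Cloud-like series needs a playout buffer to reorder and smooth arrivals, which is exactly the extra delay the text mentions; the Fog-like series barely needs one.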
Packet Loss: The paths to Cloud servers are not always open and available. To
reach these servers, requests may pass through multiple devices or routers.
Bad cabling and connections, congestion, sudden outages, software and
hardware problems, high latency, and inconsistent jitter are some of the
problems that make the internet an unreliable environment. Although there are
solutions for these problems, those mechanisms impose additional overhead and
delay, which is unwelcome in real-time applications. Using Fog Computing and
Nano Datacenters can reduce these problems.
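The retransmission overhead mentioned above can be sketched with a standard result: if each transmission is lost independently with probability p, the expected number of transmissions per delivered packet is 1 / (1 - p). The loss probabilities below are assumed values for illustration, not measurements.

```python
# Sketch of retransmission overhead under independent packet loss.
# Expected transmissions per delivered packet: 1 / (1 - p).
# The loss probabilities are hypothetical.

def expected_transmissions(loss_prob):
    """Expected sends needed to deliver one packet with loss probability p."""
    return 1.0 / (1.0 - loss_prob)

print(expected_transmissions(0.20))  # lossy multi-hop Cloud path: 1.25
print(expected_transmissions(0.01))  # short local Fog link: ~1.01
```

Each extra expected transmission adds at least one round-trip of delay, which is why loss-recovery overhead is so costly for real-time traffic.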
Reliability: This feature is really crucial for real-time or transactional
tasks. In Cloud Computing, we cannot guarantee the quality of service for
anything beyond the provider's own servers and services. Nano Datacenters are
tiny devices and can fail, but the distributed nature of Fog Computing allows
Fog nodes to take on each other's tasks when needed. In the outer
environment, communication usually happens locally; therefore, it is easier
to guarantee QoS and diagnose faults.
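The idea that distributed Fog nodes take on each other's tasks can be sketched as a simple failover dispatcher. The node names, health flags, and task label below are all hypothetical; this is a minimal sketch of the failover pattern, not any specific Fog scheduler.

```python
# Hypothetical failover sketch: route a task to the first healthy Fog node.
# Node names and health states are assumptions for illustration.

def dispatch(task, nodes):
    """Send the task to the first healthy node; fail only if all are down."""
    for name, healthy in nodes:
        if healthy:
            return f"{task} -> {name}"
    raise RuntimeError("no Fog node available")

nodes = [("fog-1", False), ("fog-2", True), ("fog-3", True)]
print(dispatch("sensor-read", nodes))  # fog-1 is down, fog-2 takes the task
```

Because any surviving node can accept the task, a single Nano Datacenter failure degrades capacity rather than availability.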
Bandwidth: Multiple hops may be required to send requests to the Cloud
Computing environment and have them processed by servers. This means that
several communication links are used to route each request; therefore, more
bandwidth is consumed in total. Occupying more bandwidth affects parameters
such as bandwidth cost, router load, and congestion. The Fog Computing
architecture places Nano Datacenters at the edge, which reduces the number of
hops required to reach servers.
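The aggregate-bandwidth point above follows from a simple observation: the total link capacity a request consumes is roughly its size multiplied by the number of links it traverses. The request size and hop counts below are illustrative assumptions.

```python
# Back-of-the-envelope: total link bytes consumed by one request is
# roughly request size times the number of links traversed.
# Sizes and hop counts are illustrative assumptions.

def total_link_bytes(request_bytes, hops):
    """Aggregate bytes carried across all links on the path."""
    return request_bytes * hops

request = 1500  # one MTU-sized request, in bytes
print(total_link_bytes(request, hops=12))  # Cloud path: 18000 bytes of link use
print(total_link_bytes(request, hops=1))   # Fog path:    1500 bytes of link use
```

The same request thus loads the network twelvefold on the long path, which is where the bandwidth-cost, router-load, and congestion effects come from.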