EDGE COMPUTING
For example, often-cited applications such as autonomous
vehicles, with their massive sensor arrays and ultra-low
latency requirements, need their intelligence to be local. It
needs to be in-vehicle in order to facilitate an instant decision
– braking to avoid a pedestrian. If remote processing were
required, you would need a remote data centre presence
every tenth of a mile, and even that would risk an ill-timed
loss of connection. If there’s an edge, it’s in the autonomous
vehicle itself, not in some nearby data centre at the closest
5G small cell site. Categorise the IoT applications that are
truly latency-sensitive and most, if not all, turn out to require
substantial local intelligence.
Moving forward, the role of the data centre will be in
handling this new and varied complexity – joining data from
multiple sources and providing the compute for in-depth
analysis and decision-making. The Tesla on the motorway
makes simple decisions in the car, while a high-compute,
highly connected data centre out of town receives data
from the vehicle and hundreds of others, analysing that data
and making complex decisions for all of those cars.
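A rough worked example makes the in-vehicle latency argument concrete. All figures below are illustrative assumptions, not measurements; the sketch simply shows how far a car travels while it waits on a decision.

```python
def metres_travelled(speed_mps, latency_ms):
    """Distance a vehicle covers while waiting latency_ms for a decision."""
    return speed_mps * latency_ms / 1000.0

# Assumed figures for illustration only
motorway_speed = 31.0  # m/s, roughly 70 mph

in_vehicle = metres_travelled(motorway_speed, 10.0)   # assumed local inference budget
remote = metres_travelled(motorway_speed, 100.0)      # assumed network round trip plus remote compute

print(f"in-vehicle: {in_vehicle:.2f} m, remote: {remote:.2f} m")
```

Under these assumptions the remote path costs roughly three metres of travel before braking can even begin – and that is before an ill-timed loss of connection is considered.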
Micro data centres remain vital
There will be instances where micro data centres become
vital. For example, when a fire occurs, firefighters arriving
at the scene must create an instant intelligent edge – the
at-scene command centre. The intelligent decision support
must be at the scene, at the edge.
Data from body-worn sensors and other
items of equipment must be integrated
into a common operational picture for
both commanders on-site and operators
at the command centre. Since there is
no advance knowledge of where an
incident will occur, the edge must be
created instantly. Adding to the challenge,
on-site network availability is unknown,
so relying on a centralised location for
decision support is impractical. The ultra-
latency-sensitive data must be processed
at the edge for immediate support. On-
scene data will also be sent to the central
location for after-action analysis, ML and
AI processing.
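The split described above – act locally on time-critical readings, queue everything else for the central site – can be sketched as a simple triage rule. All names here are hypothetical:

```python
# Hypothetical sketch of edge-first triage for on-scene incident data
edge_alerts = []    # handled immediately at the scene
central_queue = []  # uploaded later for after-action analysis and ML/AI training

def route_reading(reading, latency_sensitive):
    """Process time-critical data at the edge; queue the rest for the centre."""
    if latency_sensitive:
        # ultra-latency-sensitive: immediate decision support on scene
        edge_alerts.append(reading)
        return f"edge-alert:{reading}"
    # not time-critical: send upstream when connectivity allows
    central_queue.append(reading)
    return None

route_reading("co-spike-sensor-7", latency_sensitive=True)
route_reading("gps-track-unit-2", latency_sensitive=False)
```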
Another reason often put forward for
micro data centres at the edge is data pre-
processing. With a multitude of sensors
sending so much data upstream, the
argument goes, you need a location with
the intelligence to pre-process and
compress that data, saving on bandwidth
to the main corporate data centre. The
issue with this is that the expensive,
scarce bandwidth is on the link from the
sensor to the edge data centre. Once at
the edge data centre, bandwidth upstream
to your primary centralised metropolitan
data centre should be plentiful and
inexpensive. The cost savings from this
approach should, therefore, be minimal.
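A toy cost model illustrates why the savings are marginal; every rate and price below is an assumption chosen for illustration. Compression at the edge only shrinks traffic on the leg where bandwidth is already cheap, while the full raw stream still crosses the expensive sensor-to-edge link.

```python
# Toy model; all figures are illustrative assumptions
daily_raw_gb = 500.0        # raw sensor data arriving at the edge site each day
compression_ratio = 10.0    # assumed reduction from edge pre-processing

cost_sensor_to_edge = 0.50  # per GB on the scarce, expensive link
cost_edge_to_core = 0.01    # per GB on the plentiful metro backhaul

# The full raw stream always crosses the expensive sensor-to-edge leg
expensive_leg = daily_raw_gb * cost_sensor_to_edge

# Pre-processing only reduces cost on the cheap upstream leg
upstream_saving = daily_raw_gb * (1 - 1 / compression_ratio) * cost_edge_to_core

print(f"expensive leg: ${expensive_leg:.2f}/day, upstream saving: ${upstream_saving:.2f}/day")
```

At these assumed rates, pre-processing saves $4.50 a day against a $250.00 daily bill on the leg it cannot touch.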
We estimate that only 10% of IoT
applications and their supporting
workloads require a physical presence at the edge. The
remaining 90% can be adequately served from existing
metropolitan data centres and co-location facilities. Unless
you’re clearly in the 10%, you need to take a complete view
of the cost-versus-performance trade-offs when contemplating
an intelligent IoT edge strategy.
In order to make 2019 the year that 5G, AI and autonomous
vehicles begin to realise their potential, data centre providers
need to focus not on creating a new edge, but on making
connectivity, compute and interconnection more seamless
and more available. The task for data centres in helping
their customers respond rapidly to these changes is not to
re-engineer the system, but to deliver more connectivity and
compute to make those complex decisions ever faster.
www.networkseuropemagazine.com