Colocation &
Outsourcing
Changing requirements
However, the rapidly declining
cost of fibre connectivity has
changed all that. It was the catalyst
that spurred a few pioneering data
centre operators, NGD among them,
to establish large purpose-built
facilities well outside London,
offering lower cost and arguably
more naturally secure alternatives
to the capital.
The introduction of cloud
computing has been called
a disruptive technology and
has had a dramatic impact on
colocation. Over recent years,
cloud computing and the various
‘as a service’ subscription
models, such as IaaS, SaaS and
PaaS, have significantly changed
the original colocation concept.
As a result, the onus on colo
operators is increasing
exponentially, because of the
additional demands being made
on data centre technical
infrastructure, available power
and connectivity, all of which
are prerequisites for efficiently
scaling the cloud-centric
business models of most
organisations, not to mention the
growing Big Data, IoT and HPC
requirements of others.
These factors, combined
with increasing user and service
provider requirements for greater
resilience, energy efficiency and
security, plus assurances over
data privacy and, in some cases,
sovereignty, are evolving data
centre colocation into something
altogether more complex than
version 1.0 ever was. As a result,
older, smaller and increasingly
power-strapped facilities are
already finding it challenging
to compete for the colocation
business of many of today’s
increasingly sophisticated buyers.
Of all the drivers for change
that are gradually moving the goal
posts, the huge impact of the cloud
and emergence of both hyperscale
and high performance computing
(HPC) are particularly noteworthy.
Cloud
Companies are quickly realising
that they need many different
types of cloud services to meet a
growing list of user and customer
needs. The Cloud Industry
Forum’s research earlier this
year showed some 88% of UK
businesses now using the cloud,
with over half of these favouring
the hybrid approach, whereby
data is processed and stored across
a combined public and private
cloud infrastructure. In addition,
some legacy IT resources which
are vital to an organisation cannot
be migrated to a private or public
cloud infrastructure.
For the best of both worlds,
hybrid cloud combines a private
cloud with the use of public
cloud services, which together
can create a unified, automated
and well-managed computing
environment.
This said, hybrid cloud
environments are only as good as
their weakest link: the public cloud’s
connection to the data centre. This
increasingly calls for colocation
data centres that can bypass the
internet with cloud gateways,
allowing faster, more secure private
network connections directly
into global public cloud network
infrastructures, for example via
Microsoft’s Azure ExpressRoute.
Developers will also need
to be aware that moving large
amounts of data between private
and public clouds introduces
latency, and sometimes their
delivery models will need to be
redesigned purely to overcome
this problem.
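As a back-of-the-envelope illustration of why this matters (the function and figures below are hypothetical, not from the article), bulk-transfer time scales with data volume and effective link bandwidth, so the available private connection bandwidth can determine whether a hybrid delivery model is feasible at all:

```python
def transfer_time_seconds(data_gb: float, bandwidth_gbps: float,
                          efficiency: float = 0.7) -> float:
    """Estimate bulk-transfer time for data_gb gigabytes over a link of
    bandwidth_gbps gigabits/s, derated by a protocol-efficiency factor
    (the 0.7 default is an illustrative assumption)."""
    data_gbits = data_gb * 8  # gigabytes -> gigabits
    return data_gbits / (bandwidth_gbps * efficiency)

# Hypothetical scenario: moving 10 TB between private and public cloud.
hours_1g = transfer_time_seconds(10_000, 1.0) / 3600    # shared 1 Gbit/s link
hours_10g = transfer_time_seconds(10_000, 10.0) / 3600  # 10 Gbit/s private link
print(f"1 Gbit/s: {hours_1g:.1f} h, 10 Gbit/s: {hours_10g:.1f} h")
```

On these assumed numbers the same dataset takes roughly 32 hours over a 1 Gbit/s link but a little over three hours over a dedicated 10 Gbit/s connection, which is the kind of gap that forces the delivery-model redesigns mentioned above.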
Having the flexibility to carry
out pre-production testing in the
data centre is therefore highly
beneficial for ensuring everything
works as it should prior to
launching hybrid applications.
Aside from this and the
requisite level of scalable power
to rack, the other key factor to
consider is a data centre’s level
of engineering competence,
necessary not only for configuring
and interconnecting these complex
environments, but also for helping
businesses bring their legacy IT
into the equation.
September 2017 | 25