
FEATURE

Sam Prudhomme, Vice President of Sales & Marketing, Subzero Engineering

For many years, the industry has been in a deep discussion about the concept of Edge Computing. Yet the definition varies from vendor to vendor, creating confusion in the market, especially where end-users are concerned. In fact, within more traditional or conservative sectors, some customers are yet to truly understand how the Edge relates to them, meaning the discussion needs to change, and fast.

According to Gartner, “the Edge is the physical location where things and people connect with the networked, digital world, and by 2022, more than 50% of enterprise-generated data will be created and processed outside the data centre or cloud.” All of this data invariably needs a home and, depending on the type of data that is secured, whether it’s business or mission-critical, the design and location of its home will vary.
Autonomous vehicles are but one example of an automated, low-latency and data-dependent application. The real-time control data required to operate the vehicle is created, processed and stored via two-way communications at a number of local and roadside levels. On a city-wide basis, the data produced by each autonomous vehicle will be processed, analysed, stored and transmitted in real time in order to safely direct the vehicle and manage the traffic. Yet on a national level, the data produced by millions of AVs could be used to shape transport infrastructure policy and redefine the automotive landscape globally.
Each of these processing, analysis and storage locations requires a different type of facility to support its demand. Right now, data centres designed to meet the needs of standard or enterprise business applications are plentiful. However, data centres designed for dynamic, real-time data delivery, provisioning, processing and storage are in short supply.
That’s partly because of the uncertainty over which applications will demand such infrastructure and, importantly, over what sort of timeframe. However, there’s also the question of flexibility. Many of the existing micro data centre solutions are unable to meet the demands of Edge, or, more accurately, localised, low-latency applications, which also require high levels of agility and scalability. This is due to their pre-determined or specified approach to design and infrastructure components.
Traditionally, the market has been served with small-scale Edge applications deployed in pre-populated, containerised solutions. A customer is often required to conform to a standard shape or size, and there’s no flexibility in terms of their modularity, components or make-up. So how do we change the thinking?
A flexible Edge
Standardisation has, in many respects, been crucial to our industry. It offers a number of key benefits, including the ability to replicate systems predictably across multiple locations. But when it comes to the Edge, some standardised systems aren’t built for the customer – they’re a product of vendor collaboration: one that’s also accompanied by high costs and long lead times.
On the one hand, having a box with everything in it can undoubtedly solve some pain points, especially where integration is concerned. But what happens if the customer has its own alliances, or doesn’t need all the components? What happens if they run out of capacity in one site? Those original promises of scalability or flexibility disappear, leaving the customer with just one option – to buy another container. One might argue that such rigidity, when it comes to ‘standardisation’, can often be detrimental to the customer.
There is, however, the possibility that modular, customisable and scalable micro data centre architectures can meet the end-user’s requirements perfectly, allowing end-users to truly define and embrace their Edge.
Is there a simpler way?
Today, forecasting growth is a key challenge for customers. With demands increasing to support a rapidly developing digital landscape, many will have a reasonable idea of what capacity is required today. But predicting how it will grow over time is far more difficult, and this is where modularity is key.
For example, pre-pandemic, a content delivery network with capacity located near large user groups may have found itself swamped with demand in the days of lockdown. Today, it may be considering how to scale up local data centre capacity quickly and incrementally to meet customer expectations, without deploying additional infrastructure across more sites.
There is also the potential of 5G-enabled applications. So, how does one define what’s truly needed to optimise and protect the infrastructure in a manufacturing environment? Should an end-user purchase a containerised micro data centre because that’s what’s positioned as the ideal solution? Or should they customise and engineer a solution that can grow incrementally with demands? Or would it be more beneficial to deploy a single room that offers a secure, high-strength and walkable roof that can host production equipment?
The point here is that when it comes to micro data centres, a one-size-fits-all approach does not work. End-users need the ability to choose their infrastructure based on their business demands – whether they be in industrial