AI and machine learning face challenges in the availability and preparation of data. A business cannot become data-driven if it doesn't understand the information it holds, and the concept of 'garbage in, garbage out' is especially true of the data used for AI. With many organisations still on the starting blocks, or not yet having finished their journey to become data-driven, there appears to be a misplaced assumption that they can leap quickly and easily from preparing their data to implementing AI and ML; realistically, that won't work. To step successfully into the world of AI, businesses must first ensure the data they are using is good enough.

AI in the data centre

Over the coming years, we are going to see tremendous investment in large-scale and High-Performance Computing (HPC) systems installed within organisations to support data analytics and AI. At the same time, there will be an onus on data centre providers to deliver these systems without necessarily understanding the infrastructure required to run them, or the software and business outcomes needed to get value from them. We saw this in the realm of big data, when everyone tried to throw together some kind of big data solution and it was very easy to say 'we'll use Hadoop to build this giant system'. If we're not careful, the same could happen with AI.

There have been many conversations about the fact that, if we were to peel back the layers of many AI solutions, we would find a lot of people still investing a great deal of hard work in them; when it comes to automating processes, we aren't quite in that space yet. AI solutions are currently very resource-heavy.

There's no denying that the majority of data centres are now being asked how they provide AI solutions and how they can assist organisations on their AI journey. Organisations might assume that data centres have everything to do with AI tied up, but is this really the case? Yes, there is a realisation of the benefits of AI, but how it is best implemented, and by whom, to get the right results has not yet been fully decided.

Solutions to improve the performance of large-scale application systems are being created, whether through better processes, better hardware, or reducing the cost of running them with improved cooling or heat-exchange systems. But data centre providers have to be able to combine these infrastructure elements with a deeper understanding of business processes. This is something very few providers, as well as Managed Service Providers (MSPs) and Cloud Service Providers (CSPs), are currently doing.

It's great to have the kit and to use submerged cooling systems and advanced power mechanisms, but what does that give the customer? How can providers help customers understand what more can be done with their data systems? How do providers differentiate themselves, and how can they say they harness these new technologies to do something different? It's easy to go down the route of promoting 'we can save you X, Y, Z', but it means more to be able to say 'what we can achieve with AI is X, Y, Z'. Data centre providers need to move away from trying to win customers over based solely on monetary terms.