While hyperscalers already use AI to improve operations, most other data centers aren’t integrated enough to make it work.

The work of data center management is changing quickly. There are hybrid and multi-cloud environments to deal with, edge computing, and a constant onslaught of rapidly evolving cybersecurity threats.

AI promises to come to the rescue of IT warriors any day now, handing them the silver bullet for all the complexities they struggle against: self-learning systems that adapt on their own to fast-evolving environments, protect against known and unknown threats, respond instantaneously with superhuman accuracy, and do it all on the cheap.

In theory, anyway; in practice, not so much. Not yet, and probably not for a long time, due to siloed systems and a lack of integrated management platforms.

Data center complexity has been increasing exponentially, said Amr Ahmed, managing director at EY Consulting Services. In the past, a company might have had one mainframe. Then, with client-server, the environment grew to tens, hundreds, or thousands of machines, he said. “The distributed environment – hundreds of thousands; virtualization – millions; cloud – tens of millions.” That’s beyond human ability to manage. “AI is essential,” he told DCK. “There is no way to work around it. It is not a choice. It is not optional.”

The biggest cloud providers, the hyperscalers, have been applying machine learning (a type of AI) to this problem of scale for a while. “Predictions of failure, moving workload around automatically – these things are not stuff that’s going to happen in the next ten years,” he said. “It already exists. The cloud services providers are already using this in their cloud environments. That is how they can offer their services at scale.”
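Ahmed's point about failure prediction can be made concrete with a small sketch. The following is a minimal, hypothetical example, not any cloud provider's actual pipeline: it trains a scikit-learn classifier on historical server telemetry to estimate failure risk, then flags hosts whose risk crosses a threshold so their workloads could be moved preemptively. The CSV file, column names, and threshold are all assumptions for illustration.

# Minimal sketch: predict server failure from telemetry so workloads can be
# migrated before an outage. The CSV and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

FEATURES = ["cpu_temp_c", "fan_rpm", "disk_reallocated_sectors",
            "correctable_ecc_errors", "psu_voltage_ripple"]  # assumed telemetry fields

df = pd.read_csv("server_telemetry.csv")      # hypothetical historical data
X, y = df[FEATURES], df["failed_within_7d"]   # label: failed within 7 days

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# Flag hosts whose predicted failure risk crosses a (tuned) threshold,
# so an orchestrator could drain them and migrate their workloads.
risk = model.predict_proba(X_test)[:, 1]
at_risk = X_test[risk > 0.8]
print(f"{len(at_risk)} hosts flagged for preemptive workload migration")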

Particularly in the area of data center power and cooling, advanced analytics have been used to reduce energy costs for years. “There are many tools that analyze this data and make decisions,” Ahmed said.
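To make the power-and-cooling analytics idea concrete, here is a deliberately simplified sketch, assuming PUE (power usage effectiveness) as the efficiency metric and a cooling setpoint kept within the ASHRAE-recommended 18 to 27°C range. Real tools are far more sophisticated; every name and threshold below is illustrative.

# Illustrative sketch of analytics-driven cooling: compute PUE from power
# readings and nudge a cooling setpoint within safe bounds. The target,
# step size, and limits are assumptions, not a real vendor's control logic.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power."""
    return total_facility_kw / it_load_kw

def next_setpoint_c(current_c: float, pue_now: float,
                    target_pue: float = 1.4,
                    lo: float = 18.0, hi: float = 27.0) -> float:
    """Raise the setpoint slightly when PUE is above target (cooling is
    overspending energy), lower it when PUE is comfortably below."""
    step = 0.5 if pue_now > target_pue else -0.5
    return min(hi, max(lo, current_c + step))

# Example reading: 1200 kW facility draw against 800 kW of IT load.
p = pue(1200.0, 800.0)  # 1.5, above the 1.4 target
print(f"PUE={p:.2f}, new setpoint={next_setpoint_c(22.0, p):.1f} C")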

When AI can help improve a data center's uptime, the benefit is obvious, and it's a big area of focus for large data center operators. AI and ML can be used to predict failure of critical tasks and avoid unexpected system and service failures or data center outages, said Dan Simion, VP of AI and analytics at Capgemini. "This approach creates a self-healing mechanism," he told DCK.
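A self-healing mechanism of the kind Simion describes boils down to a control loop: poll health, score failure risk, and remediate before the predicted failure lands. Below is a toy sketch of that loop; check_health, predict_risk, and drain_and_restart are hypothetical stand-ins, not a real orchestrator's API.

# Toy "self-healing" loop: detect rising failure risk, act before the outage.
import random
import time

def check_health(host: str) -> dict:
    # Stand-in for pulling live telemetry; returns fake metrics for the demo.
    return {"cpu_temp_c": random.uniform(40, 95)}

def predict_risk(metrics: dict) -> float:
    # Stand-in for a trained model (e.g., the classifier sketched earlier):
    # here, risk simply rises with CPU temperature.
    return min(1.0, max(0.0, (metrics["cpu_temp_c"] - 60) / 30))

def drain_and_restart(host: str) -> None:
    # Stand-in for real remediation: migrate workloads, then recycle the host.
    print(f"remediating {host}: draining workloads and restarting")

def self_heal(hosts, risk_threshold=0.8, rounds=3):
    # The core loop: score each host, remediate any that cross the threshold.
    for _ in range(rounds):
        for host in hosts:
            if predict_risk(check_health(host)) > risk_threshold:
                drain_and_restart(host)
        time.sleep(1)

self_heal(["rack1-node07", "rack3-node12"])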

While the larger data center providers are taking the lead here, high-tech companies may also build these kinds of AI systems from scratch if it's in their wheelhouse, he added.

The most digitally mature companies are already seeing value from their AI investments, he said, as are companies with large data centers.

AI Hopes Crash into Silo Walls

For smaller data centers, the easiest way to start deploying AI is to rely on technology vendors. However, there are limits to this approach, chiefly the difficulty of dealing with interdependency and business context. […]

Read more: www.datacenterknowledge.com