Copyright by www.forbes.com
It seems like every week there’s a new survey out detailing the ever-increasing amount of focus that IT shops of all sizes put on AI technology. If it’s true that data is the new currency, then it’s AI that mines that data for value. Your C-suite understands that, and it’s why they continually push to build AI capabilities.
Nowhere is AI more impactful than in the world of government and government contractors. It’s not just the usual suspects of defense and intelligence who demand these capabilities—AI is fast becoming a fact of life across the spectrum of government agencies. If you’re a government contractor, then you’re already seeing AI in an increasing number of RFP/RFQs.
I’m a storage analyst. I don’t like to think about AI. I like to think about data. I advise my clients on how storage systems and data architecture must evolve to meet the needs of emerging and disruptive technologies. These days, those technologies all seem to be some variation of containerized deployments, hybrid-cloud infrastructure and enterprise AI. There’s no question that Artificial Intelligence is the most disruptive.
High-power GPUs dominate AI. Depending on the problem you’re trying to solve, that may be one GPU in a data scientist’s workstation, or it may be a cluster of hundreds of GPUs. It’s also a certainty that your deployment will scale over time in ways that you can’t predict today.
That uncertainty forces you to architect your data center to support the unknown. That could mean deploying storage systems that have scalable multi-dimensional performance that can keep the GPUs fed, or simply ensuring that your data lakes are designed to reduce redundancies and serve the needs of all that data’s consumers.
These aren’t problems of implementing AI, but rather of designing an infrastructure that can support it. Most of us aren’t AI experts. We manage storage, servers, software or networking. These are all things that will be disrupted by AI in the data center.
The single best way to prepare for the impacts of AI in the data center is to become educated on what it is and how it’s used. The dominant force in GPU technology for AI is NVIDIA. Thankfully, NVIDIA has a conference to help us all out.
NVIDIA’s GPU technology conference for AI
Every spring NVIDIA hosts its massive GPU Technology Conference (GTC) near its headquarters in Silicon Valley. It’s there where 6,000+ attendees gather to hear about all aspects of what NVIDIA’s GPUs can do. This ranges from graphics for gaming and visualization, to inference at the edge, to AI in the enterprise. It’s one of my favorite events each year (read my recap of the most recent GTC here, if interested). […]