Google LLC this week announced the beta availability of a new cloud service that provides environments optimized for deploying and testing applications powered by deep learning, a subset of machine learning that tries to mimic the way the human brain tackles problems.
The service, called Deep Learning Containers, can be run both in the cloud and on-premises. It consists of numerous performance-optimized Docker containers that come prepackaged with the tools necessary to run deep learning algorithms.
Those tools include preconfigured Jupyter Notebooks, which are interactive documents used to work with and share code, equations, visualizations and text, and Google Kubernetes Engine clusters, which are used to orchestrate multiple container deployments.
The service also provides hardware acceleration with Nvidia Corp.’s graphics processing units and Intel Corp.’s central processing units. Nvidia’s CUDA, cuDNN and NCCL libraries are thrown in as well.
In a blog post, Google software engineer Mike Cheng explained that Deep Learning Containers are designed to provide all of the dependencies needed to get applications up and running in the fastest possible time. The service also integrates with various Google services, such as BigQuery for analytics, Dataproc for Apache Hadoop and Apache Spark, and Dataflow for batch processing and streaming data using Apache Beam.
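Per Google's beta documentation at the time of the announcement, getting started locally amounts to pulling one of the prebuilt images and running it with Docker. The repository path, image name and port below follow Google's published examples and may have changed since; treat this as a sketch rather than a definitive workflow.

```shell
# List the available Deep Learning Container images (framework variants
# such as TensorFlow and PyTorch, in CPU and GPU flavors). Requires the
# Google Cloud SDK; guarded so the sketch degrades gracefully without it.
if command -v gcloud >/dev/null 2>&1; then
  gcloud container images list --repository="gcr.io/deeplearning-platform-release"
fi

# Run the TensorFlow CPU variant locally. The image serves JupyterLab on
# port 8080; mounting the current directory keeps notebooks outside the
# container. Image tag is from the beta docs and may have changed.
IMAGE="gcr.io/deeplearning-platform-release/tf-cpu"
if command -v docker >/dev/null 2>&1; then
  docker run -d -p 8080:8080 -v "$PWD:/home" "$IMAGE"
else
  echo "docker not found; would have started $IMAGE"
fi
```

Once the container is up, browsing to localhost:8080 should reach the bundled JupyterLab instance.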
The service supports all of the major deep learning frameworks, including PyTorch and TensorFlow, Cheng said.
Besides running Deep Learning Containers on-premises, users also have the option to host them on Google’s Compute Engine and Kubernetes Engine services, or on the Google AI Platform, which was introduced in April as a specialized cloud service for building, testing and deploying machine learning models.
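For the Kubernetes Engine option, the same container image can be deployed to a cluster with standard kubectl commands. The deployment name below is illustrative and this assumes kubectl is already authenticated against a GKE cluster; it is a sketch of the general pattern, not a workflow from Google's documentation.

```shell
# Deploy a Deep Learning Container image to a Kubernetes cluster and
# expose its JupyterLab port (8080) through a load balancer.
# "dl-notebook" is a hypothetical deployment name.
IMAGE="gcr.io/deeplearning-platform-release/tf-cpu"
if command -v kubectl >/dev/null 2>&1; then
  kubectl create deployment dl-notebook --image="$IMAGE"
  kubectl expose deployment dl-notebook --type=LoadBalancer \
    --port=80 --target-port=8080
else
  echo "kubectl not found; would have deployed $IMAGE"
fi
```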
Analyst Holger Mueller of Constellation Research Inc. said software containers provide two key benefits for enterprises, namely the ability to scale workloads and to standardize environments and make them more accessible. With Deep Learning Containers, Google is making those environments easier to set up and faster to access.