Google LLC this week announced the beta availability of a new cloud service that provides environments optimized for deploying and testing applications powered by deep learning, a subset of machine learning that tries to mimic the way the human brain tackles problems.
The service, called Deep Learning Containers, can be run both in the cloud and on-premises. It consists of numerous performance-optimized Docker containers that come pre-packaged with the tools necessary to run deep learning algorithms.
Those tools include preconfigured Jupyter Notebooks, which are interactive tools used to work with and share code, equations, visualizations and text, and Google Kubernetes Engine clusters, which are used to orchestrate multiple container deployments.
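To give a sense of how those pieces fit together, the sketch below uses the Docker SDK for Python to pull one of the published container images and start it locally, exposing the bundled Jupyter server. The image name, tag and port mapping are assumptions drawn for illustration, not details confirmed in this article.

    # Sketch: start a Deep Learning Container locally with the Docker SDK for Python.
    # The image path (gcr.io/deeplearning-platform-release/tf-cpu) is an assumed example.
    import docker

    IMAGE = "gcr.io/deeplearning-platform-release/tf-cpu"

    client = docker.from_env()
    client.images.pull(IMAGE, tag="latest")

    # Run the container in the background and map the preconfigured Jupyter
    # server (port 8080 inside the container) to the same port on the host.
    container = client.containers.run(
        f"{IMAGE}:latest",
        detach=True,
        ports={"8080/tcp": 8080},
    )
    print(f"Jupyter should be reachable at http://localhost:8080 (container {container.short_id})")

Once the container is running, the notebook environment and framework libraries described above are already in place, which is the point of the pre-packaged images.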
The service also provides hardware acceleration with Nvidia Corp.’s graphics processing units and Intel Corp.’s central processing units, with Nvidia’s CUDA, cuDNN and NCCL libraries included in the GPU-enabled containers.
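Inside a GPU-enabled container, a quick way to confirm that the CUDA stack is wired up is to ask the preinstalled framework which accelerators it can see. The snippet below is an illustration that assumes a container image shipping a recent TensorFlow 2.x build; it is not part of Google’s announcement.

    # Sketch: verify GPU visibility from inside a GPU-enabled Deep Learning Container.
    # Assumes a TensorFlow 2.x image; older 1.x images expose tf.test.is_gpu_available() instead.
    import tensorflow as tf

    gpus = tf.config.list_physical_devices("GPU")
    if gpus:
        print(f"CUDA-backed GPUs visible to TensorFlow: {[g.name for g in gpus]}")
    else:
        print("No GPUs detected; running on CPU only.")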
In a blog post, Google software engineer Mike Cheng explained that Deep Learning Containers are designed to provide all of the dependencies needed to get applications up and running as quickly as possible. The service also integrates with various Google Cloud services.
The service supports all of the major machine learning frameworks, such as TensorFlow and PyTorch.
Besides running Deep Learning Containers on-premises, users also have the option to host them on Google’s Compute Engine and Kubernetes Engine services, or on the Google Cloud AI Platform.
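As an illustration of the Kubernetes Engine route, the sketch below uses the official Kubernetes Python client to schedule a Deep Learning Container image onto an existing cluster. The deployment name, labels and image tag are placeholders, and the cluster credentials are assumed to already be in the local kubeconfig.

    # Sketch: deploy a Deep Learning Container image to an existing GKE cluster
    # using the Kubernetes Python client. Names and the image tag are placeholders.
    from kubernetes import client, config

    config.load_kube_config()  # assumes cluster credentials were fetched beforehand

    container = client.V1Container(
        name="dl-container",
        image="gcr.io/deeplearning-platform-release/tf-cpu",  # assumed example image
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "dl-container"}),
        spec=client.V1PodSpec(containers=[container]),
    )
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="dl-container-demo"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "dl-container"}),
            template=template,
        ),
    )

    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
    print("Deployment submitted; pods will pull the image and start the container.")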
Analyst Holger Mueller of Constellation Research Inc. said software containers provide two key benefits for enterprises, namely the ability to scale workloads and to standardize on environments and make them more accessible. With Deep Learning Containers, Google is making those benefits available for machine learning workloads.