
14 open source tools to make the most of machine learning

Tap the predictive power of machine learning with these diverse, easy-to-implement libraries and frameworks

Spam filtering, face recognition, recommendation engines — when you have a large data set on which you’d like to perform predictive analysis or pattern recognition, machine learning is the way to go.

The proliferation of free open source software has made machine learning easier to implement both on single machines and at scale, and in most popular programming languages. These open source tools include libraries for the likes of Python, R, C++, Java, Scala, Clojure, JavaScript, and Go.

Apache Mahout
Apache Mahout provides a way to build environments for hosting machine learning applications that can be scaled quickly and efficiently to meet demand.

Mahout works mainly with another well-known Apache project, Spark. It was originally devised to work with Hadoop for the sake of running distributed applications, but has since been extended to work with other distributed back ends such as Flink and H2O.

Mahout uses a domain-specific language written in Scala. Version 0.14, a major internal refactor of the project, uses Apache Spark 2.4.3 as its default back end.

Compose
Compose, by Innovation Labs, targets a common issue with machine learning models: labelling raw data, a slow and tedious process without which a model can’t deliver useful results.
Compose lets you write a set of labelling functions for your data in Python, so labelling can be done as programmatically as possible. You can set various transformations and thresholds on your data to make labelling easier, such as placing data in bins based on discrete values or quantiles.
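The idea behind programmatic labelling can be sketched in plain Python, without the library itself. The functions, thresholds, and bin edges below are hypothetical illustrations, not Compose’s actual API:

```python
# A minimal sketch of programmatic labelling in plain Python, without
# Compose itself. The labelling functions, threshold, and bin edges
# here are hypothetical; they illustrate turning raw numeric values
# into discrete labels.

def label_by_threshold(value, threshold=100.0):
    """Binary label: did the value exceed a chosen threshold?"""
    return value > threshold

def label_by_bins(value, edges=(50, 100, 200)):
    """Discrete label: place the value into one of len(edges) + 1 bins."""
    for i, edge in enumerate(edges):
        if value < edge:
            return i
    return len(edges)

raw = [30.0, 75.0, 150.0, 250.0]
binary = [label_by_threshold(v) for v in raw]   # [False, False, True, True]
bins = [label_by_bins(v) for v in raw]          # [0, 1, 2, 3]
```

Once labelling is expressed as functions like these, it can be re-run automatically whenever new raw data arrives, rather than being done by hand.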

Core ML Tools
Apple’s Core ML framework lets you integrate machine learning models into apps, but uses its own distinct learning model format. The good news is you don’t have to pre-train models in the Core ML format to use them; you can convert models from just about every commonly used machine learning framework into Core ML with Core ML Tools.

Core ML Tools runs as a Python package, so it integrates with the wealth of Python machine learning libraries and tools. Models from TensorFlow, PyTorch, Keras, Caffe, ONNX, Scikit-learn, LibSVM, and XGBoost can all be converted. Neural network models can also be optimised for size by using post-training quantisation (e.g., to a small bit depth that’s still accurate).
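The idea behind post-training quantisation can be shown in a few lines of self-contained Python. This is an illustration of the technique only — Core ML Tools performs the real conversion internally, and the helper functions below are hypothetical:

```python
# A minimal, self-contained sketch of post-training quantisation:
# linearly mapping 32-bit float weights onto signed integers of a
# smaller bit depth. Illustration only -- not Core ML Tools' API.

def quantise(weights, bits=8):
    """Map float weights to signed integers of the given bit depth."""
    levels = 2 ** (bits - 1) - 1              # e.g. 127 for 8 bits
    scale = max(abs(w) for w in weights) / levels
    return [round(w / scale) for w in weights], scale

def dequantise(q_weights, scale):
    """Recover approximate float weights from the quantised values."""
    return [q * scale for q in q_weights]

weights = [0.52, -1.27, 0.04, 0.98]
q, scale = quantise(weights)                  # small ints plus one scale factor
approx = dequantise(q, scale)
# Each recovered weight is within one quantisation step of the original,
# which is why a small bit depth can remain accurate in practice.
assert all(abs(a - w) <= scale for a, w in zip(approx, weights))
```

The storage win comes from keeping one byte per weight plus a single shared scale factor, instead of four bytes per weight.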

Cortex
Cortex provides a convenient way to serve predictions from machine learning models written in Python, with support for TensorFlow, PyTorch, Scikit-learn, and other model formats. Most Cortex packages consist of only a few files: your core Python logic, a cortex.yaml file that describes which models to use and what kinds of compute resources to allocate, and a requirements.txt file to install any needed Python dependencies.

The whole package is deployed as a Docker container to AWS or another Docker-compatible hosting system. Compute resources are allocated in a way that echoes the resource definitions used in Kubernetes, and you can use GPUs or Amazon Inferentia ASICs to speed up serving. […]
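A cortex.yaml of the shape described above might look roughly like this. The exact schema varies across Cortex versions, and the API name and file paths here are illustrative:

```yaml
# Illustrative cortex.yaml -- exact schema varies by Cortex version.
- name: sentiment-classifier      # hypothetical API name
  predictor:
    type: python
    path: predictor.py            # your core Python logic
  compute:
    cpu: 1                        # Kubernetes-style resource requests
    mem: 2G
    gpu: 1                        # optional accelerator for faster serving
```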

Read more: www.reseller.co.nz
