The rising environmental and monetary costs of deep learning are catching enterprises’ attention, as are new AI techniques like graph neural networks and contrastive learning.

AI adoption is accelerating across industries, driven by a combination of concrete results, high expectations and a lot of money. Among the many new AI concepts and techniques launching almost daily, 10 AI tech trends in particular grab data scientists’ attention.

1. MLOps

Machine learning operations (MLOps) isn’t a new concept, but it is a relatively young “Ops” practice that operationalizes machine learning models. MLOps seeks to understand what does and doesn’t work in a model in order to build more reliable models in the future.

It’s the last mile of machine learning model building, and a practice that historically hasn’t been given much attention, said Lee Rehwinkel, VP of science at B2B pricing and sales software company Zilliant.

“It’s one of the reasons a lot of models never see the light of day, but it’s super important [because] you build a model but how do you know the uptime of that model? How fast is it going to make predictions? Does it need to be trained or retrained?” he said.
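The questions Rehwinkel raises (Is the model up? How fast are its predictions? Does it need retraining?) can be sketched as a minimal monitoring harness. This is a hypothetical illustration, not Zilliant’s tooling; the class name, latency budget, and drift threshold are all made up for the example.

```python
import time
import statistics

class ModelMonitor:
    """Hypothetical sketch of basic MLOps checks: prediction latency
    and a crude input-drift signal for retraining decisions."""

    def __init__(self, latency_budget_ms=50.0, drift_threshold=0.25):
        self.latency_budget_ms = latency_budget_ms  # illustrative budget
        self.drift_threshold = drift_threshold      # illustrative threshold
        self.latencies_ms = []

    def timed_predict(self, model_fn, features):
        # Wrap each prediction to record how long it took.
        start = time.perf_counter()
        prediction = model_fn(features)
        self.latencies_ms.append((time.perf_counter() - start) * 1000)
        return prediction

    def latency_ok(self):
        # Is the model fast enough, on average, in practice?
        return statistics.mean(self.latencies_ms) <= self.latency_budget_ms

    def needs_retraining(self, train_mean, live_values):
        # Crude drift check: has the mean of a live input feature
        # shifted far from what the model saw during training?
        live_mean = statistics.mean(live_values)
        return abs(live_mean - train_mean) > self.drift_threshold * abs(train_mean)

# Tiny demo with a stand-in "model"
monitor = ModelMonitor()
monitor.timed_predict(lambda x: x * 2, 10)
drifted = monitor.needs_retraining(train_mean=100.0,
                                   live_values=[150.0, 160.0, 155.0])
```

In a real deployment these signals would feed dashboards and alerting rather than in-process checks, but the idea is the same: the model’s runtime behavior, not just its offline accuracy, gets measured.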

2. Contrastive learning

Contrastive learning is a machine learning technique that finds similar and dissimilar items in a data set without labels. It can be used on an image database, for example, to find images that resemble one another.

“Contrastive learning is becoming the new paradigm in unsupervised learning. The reason unsupervised learning is so useful is that the internet is a treasure trove of unlabeled data of text and pictures,” said Cameron Fen, head of research at A.I. Capital Management. 

“Typically, you could do this with transfer learning, but what makes contrastive learning so exciting is that you can do this with data that are too expensive to label and with a much larger data set than fine-tuning a prebuilt image classifier on ImageNet,” he said. […]
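The core idea Fen describes can be sketched with an InfoNCE-style contrastive loss: each item is paired with an augmented view of itself (the “positive”), and every other item in the batch serves as a negative, so no labels are needed. This is a minimal NumPy illustration; the embedding dimensions, perturbation, and temperature are arbitrary stand-ins, not any specific published recipe.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Minimal InfoNCE-style contrastive loss.

    anchors[i] and positives[i] are embeddings of two views of the
    same item; all other rows act as negatives. Similarity within
    the batch supplies the supervision -- no labels required.
    """
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)

    logits = a @ p.T / temperature  # (N, N) similarity matrix
    # Row i's correct "class" is column i (its own positive view)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
base = rng.normal(size=(4, 8))
views = base + 0.01 * rng.normal(size=(4, 8))  # lightly perturbed views
unrelated = rng.normal(size=(4, 8))            # random mismatched pairs

aligned_loss = info_nce_loss(base, views)
random_loss = info_nce_loss(base, unrelated)
```

Training an encoder to minimize this loss pulls embeddings of matching views together and pushes unrelated items apart, which is why matched pairs yield a much lower loss than random pairings.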
