History will look back on this time as the beginning of the artificial intelligence revolution. Artificial intelligence is beating us at Go, inventing its own languages, writing for us, and composing music. Machine learning is tackling complex, concrete challenges like image compression and solving classification problems like image recognition. Even the security and IT industries are benefiting from machine learning.
The problem with any complex new technology — apart from actually inventing and building it — is figuring out how to explain it to the public. How does it work? And how is it valuable? Unfortunately, a lot of messaging either emphasizes trivial details or is too vague to give any insight. Let’s cut through the noise by focusing on what’s most important in machine learning: how well it can keep up with ever-changing conditions.
It’s not the algorithm, it’s the data
Everyone bends over backward to say they use “machine learning” and “neural networks.” They’re using these terms because they sound cool and make it seem as though they have unlocked the secrets of the artificial brain or something equally mysterious.
But neural networks — although powerful — are not magic, and they are only one of several approaches to building a good machine learning solution. They’re not the sort of thing you want to spend all of your time and effort talking about. That’d be like saying, “All of our engineers wear non-polyester pants because they’re more comfortable and they can write better code as a result.” Well, that may be true, but what good does it do to talk about it? How do cotton pants solve the problem? Instead, success in machine learning depends on the training data and the problem you’re trying to solve.
The point is, there are more important things to talk about than the neural networks we’re using. The nitty-gritty details of how code is written are less important than the results the code produces. In the same way, any machine learning algorithm is good if it produces useful, efficient models. Focusing on the algorithm just because it sounds cool leads nowhere.
The real secret sauce is making a useful model with quality data. You can have machine learning without sophisticated algorithms, but not without good data. For example, I work in the malware detection space, where it’s vital to have a diverse and representative set of both malicious and benign files users are likely to encounter.
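The point that data beats algorithmic sophistication can be illustrated with a toy sketch. The example below is entirely hypothetical — the "malware families," two-dimensional features, and nearest-centroid classifier are stand-ins invented for illustration, not anything from a real detection pipeline. It trains the same simple algorithm twice: once on a data set missing an entire malware family, and once on a diverse one, then scores both on a test set that reflects what users actually encounter.

```python
import random

random.seed(0)

def sample(kind, n):
    """Generate n toy (features, label) pairs for a hypothetical file kind."""
    out = []
    for _ in range(n):
        if kind == "malware_a":      # one malware family clusters near (1, 0)
            out.append(([random.gauss(1, 0.1), random.gauss(0, 0.1)], 1))
        elif kind == "malware_b":    # a second family clusters near (0, 1)
            out.append(([random.gauss(0, 0.1), random.gauss(1, 0.1)], 1))
        else:                        # benign files cluster near the origin
            out.append(([random.gauss(0, 0.1), random.gauss(0, 0.1)], 0))
    return out

def centroid(points):
    return [sum(p[i] for p in points) / len(points) for i in range(2)]

def train(data):
    """Nearest-centroid 'model': one centroid per class, nothing fancier."""
    mal = centroid([x for x, y in data if y == 1])
    ben = centroid([x for x, y in data if y == 0])
    return mal, ben

def accuracy(model, data):
    mal, ben = model
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    correct = sum(
        (1 if dist(x, mal) < dist(x, ben) else 0) == y for x, y in data
    )
    return correct / len(data)

# Same algorithm, different data: one training set misses family B entirely.
biased = sample("malware_a", 50) + sample("benign", 50)
diverse = sample("malware_a", 25) + sample("malware_b", 25) + sample("benign", 50)

# The test set reflects reality: both malware families appear in the wild.
test = sample("malware_a", 50) + sample("malware_b", 50) + sample("benign", 100)

print("trained on biased data: ", accuracy(train(biased), test))
print("trained on diverse data:", accuracy(train(diverse), test))
```

The biased model misses most of family B because its notion of "malicious" was learned from family A alone, while the identical algorithm trained on diverse data catches both — the algorithm never changed, only the data did.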
This is easier said than done. It takes a lot of time, effort, and ingenuity to expand and curate data sets to be as large, varied, and relevant as possible. Detecting malware is an arms race between cybersecurity teams and hackers, which means change is constant. Instead of talking about neural networks or machine learning, we should be discussing how effectively the models we build react to new data sets and, eventually, the real world. […]