Automated is going through an accelerator program right now, focused on cybersecurity. But that’s not the only thing the company is thinking about.
Copyright www.govtech.com
Artificial intelligence (AI) — of the “learning algorithms” variety, not the Skynet kind — is everywhere in the tech world right now. That’s because of the concept’s many possibilities: object recognition in pictures and videos, anticipating cybersecurity threats, finding specific kinds of people amid thousands or millions in a data set.
There’s also a fundamental requirement shared by all of these algorithms: training. They all need to be run on data to learn what it is they’re looking for. That’s how they “learn.”
What if there isn’t enough data to make the algorithms as good as they need to be? Or what if it takes too long to collect and prepare that data?
An early stage startup that just entered a northern Virginia cybersecurity accelerator thinks it has the solution: fake data.
Or, as Automated’s CEO Jeff Schilling puts it, synthetic data. His company has built up the capability to produce data — a lot of data — based on historical examples. It’s not real, but it mimics real data closely enough that algorithms can use it. And it’s realistic enough that it could be real.
It’s acceleration, basically. “We want the people to get there better, faster,” Schilling said.

He described the process of creating fake data in terms of how humans learn to speak: they begin with sounds, then form approximate words, then learn to put those words in sequence via grammar. And just as humans use grammatical rules to put words together into novel sentences, Automated uses “grammar” to create novel data.

There are a few potential upsides to the concept. First, if learning algorithms hit the real world with more training under their belts, theoretically […]
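To make the grammar analogy concrete, here is a minimal, purely illustrative sketch of grammar-driven synthetic data generation. This is not Automated’s actual method, which the article does not describe; the rules, symbols, and record shapes below (fake security-log lines) are entirely hypothetical.

```python
import random

# A toy "grammar" for synthetic log lines. Each non-terminal (in angle
# brackets) maps to a list of possible expansions; anything not in the
# table is a terminal and is emitted as-is. All values here are invented
# for illustration only.
GRAMMAR = {
    "<record>": [["<user>", " ", "<action>", " ", "<resource>"]],
    "<user>": [["alice"], ["bob"], ["carol"]],
    "<action>": [["LOGIN"], ["READ"], ["WRITE"]],
    "<resource>": [["/srv/db"], ["/srv/files"], ["/admin"]],
}

def generate(symbol="<record>", rng=random):
    """Expand a symbol into a string by recursively picking random rules."""
    if symbol not in GRAMMAR:  # terminal: emit as-is
        return symbol
    expansion = rng.choice(GRAMMAR[symbol])
    return "".join(generate(s, rng) for s in expansion)

# Produce a small batch of synthetic records, e.g. to augment training data.
synthetic_data = [generate() for _ in range(5)]
```

Because every record is assembled from rules rather than copied from history, the generator can emit novel combinations it has never seen, which is the property that makes grammar-style generation useful for padding out scarce training sets.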