Toronto needs to become a city with more edge. Edge AI, that is. Why AI, why edge, and why Toronto?
Copyright by www.thestar.com
I believe the most important and disruptive technology over the next decade is artificial intelligence, or AI. This is not general AI — computers that are as smart as or smarter than humans — but a subset called machine learning. This technology, which has been in warp drive since 2012, is narrow but enormously useful: vast amounts of data are used to train a system, producing an algorithm that can detect patterns; inferences can then be made on fresh data using that algorithm.
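To make the train-then-infer pattern concrete, here is a minimal sketch using a toy nearest-centroid classifier. All names and the tiny dataset are illustrative assumptions, not anything from a real system; production machine learning uses vastly more data and far richer models, but the shape — train on labelled data, then run inference on fresh data — is the same.

```python
# Toy illustration of machine learning's two phases:
# training produces a model; inference applies it to fresh data.

def train(samples):
    """Learn one centroid (mean point) per label from (features, label) pairs."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def infer(model, features):
    """Assign fresh data to the label of the nearest learned centroid."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda label: sq_dist(model[label]))

# "Training": two clusters of labelled 2-D points (made-up data).
model = train([([1.0, 1.0], "low"), ([1.2, 0.8], "low"),
               ([9.0, 9.0], "high"), ([8.8, 9.2], "high")])

# "Inference": classify a fresh, unlabelled point.
print(infer(model, [1.1, 0.9]))  # → low
```

The heavy lifting happens in `train`; `infer` is comparatively cheap — which is precisely why inference, unlike training, is a plausible fit for small, low-power chips at the edge.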
Gartner expects that AI will create $3.3 trillion (U.S.) in business value by 2021 (that’s roughly twice the size of Canada’s economy) while a Deloitte survey of our clients found that 81 per cent say AI will be either critically important or very important for their business by 2020.
Until recently, almost all of the training and inference for AI was done in large data centres distant from the devices that consumers and enterprises use: this is called the network core, while the devices are said to be at the edge. There are a few of these AI data centres in Toronto, but most of them (99 per cent plus) are elsewhere (mainly in the U.S.), and the chips used in those data centres (about $4 billion in sales) are not made here. Not only would it be difficult for a Toronto startup to enter this market, it may not even be the best target: the data centre AI chip business is down about 10 per cent compared to last year.
These data centre chips in the core of the network are found on racks: one high-powered version costs $400,000, weighs 350 pounds, and consumes 10,000 watts of power. Which is perfectly fine inside a data centre with more cooling capacity than a mall, hooked up to power lines sufficient for a small town, with reinforced floors, and with thousands or millions of customers who can spread that cost among them.
But you can’t put that rack on a smartphone, in a battery-powered warehouse robot, or in a camera or sensor. In those cases, what is needed is a small, cheap, low-power chip that can do at least some of the AI tasks that until now have been done deep in the core of the network.
Doing the AI processing on the device is called edge computing, and an upcoming report of mine predicts that the market for edge AI hardware will be more than a billion dollars globally by 2024, and growing more than 50 per cent per year between now and then. That’s an exciting tiger whose tail we should grab onto — but what makes me think Toronto can capture more than our fair share of this market? […]