By any measure, we’ve entered the age of machine and artificial intelligence. The confluence of massive data, cheap storage, elastic compute, and algorithmic advances, particularly in deep learning, has given rise to applications that previously were confined to the pages of science fiction novels.
Machines now surpass humans in complex strategy games, to say nothing of image recognition, transcription, and other advances that begin to complicate our assumptions about what is and isn’t uniquely human. AI-based personal assistants are commonplace, and fully autonomous vehicles seem just around the bend.
Given these recent advances, much of the dialogue around AI/ML has centered disproportionately, albeit understandably, on breakthroughs in algorithms and their applications. Notably absent from the discussion has been any mention of the infrastructure underlying these intelligent systems.
Just as in the earliest days of computing, when one needed to be expert in assembly language, compilers, and operating systems to develop a simple application, so today you need an army of statistics and distributed-systems PhDs to build and deploy AI/ML at scale. The abstractions and tooling necessary to make AI/ML usable are the missing link. The upshot is that AI/ML remains a limited and expensive discipline reserved for only a few elite engineering organizations.
Ultimately, this comes down to a lag in the evolution of infrastructure, which to date has been far outpaced by innovation in machine learning techniques. Put simply, the systems and tooling that helped usher in the current era of practical machine learning are ill-suited to power future generations of the intelligent applications they spawned.
Going forward, an entirely new toolchain is necessary to unlock the potential of AI/ML, to make it operational and usable — let alone approachable — for developers and enterprises. It stands to reason, then, that the next great opportunity in infrastructure will be to provide the building blocks for systems of intelligence. […]