Collaboration rather than command-and-control is key to creating culturally and ethically positive systems.
Copyright: theguardian.com – “AI can shape society for the better – but humans and machines must work together”
One of the first images of AI I encountered was a white, spectral, hostile, disembodied head. It was in the computer game Neuromancer, programmed by Troy Miles and based on William Gibson’s cyberpunk novel. Other people may have first encountered HAL 9000 from Stanley Kubrick’s 2001: A Space Odyssey or Samantha from Spike Jonze’s Her.
Images from pop culture influence people’s impressions of AI, but culture has an even more profound relationship to it. If there’s one thing to take away from this article, it is the idea that AI systems are not objective machines, but instead based in human culture: our values, norms, preferences, and behaviours in society. These aspects of our culture are reflected in how systems are engineered. So instead of trying to decide whether AI systems are objectively good or bad for society, we need to design them to reflect the ethically positive culture we truly want.
Here’s an example: Roger Dannenberg, a professor at Carnegie Mellon University in Pittsburgh, has created an AI system that plays music with people. It accompanies performers based on ideas of pitch, scale, tempo and so on, that could be called facets of western music theory. In contrast, the composer and scholar George E Lewis, coming from a tradition based in the African diaspora – jazz and other traditions, as nurtured by Chicago’s Association for the Advancement of Creative Musicians – has created a system called Voyager that is a “nonhierarchical, interactive musical environment that privileges improvisation”.[…]
Read more: www.theguardian.com