Bringing artificial intelligence into the classroom, research lab, and beyond

 

Artificial intelligence is reshaping how we live, learn, and work, and this past fall, MIT undergraduates got to explore and build on some of the tools coming out of research labs at MIT. Through the Undergraduate Research Opportunities Program (UROP), students worked with researchers at the MIT Quest for Intelligence and elsewhere on projects to improve AI literacy and K-12 education, understand face recognition and how the brain forms new memories, and speed up tedious tasks like cataloging new library material. Six projects are featured below.

Programming Jibo to forge an emotional bond with kids

Nicole Thumma met her first robot when she was 5, at a museum. “It was incredible that I could have a conversation, even a simple conversation, with this machine,” she says. “It made me think robots are the most complicated manmade thing, which made me want to learn more about them.”

Now a senior at MIT, Thumma spent last fall writing dialogue for the social robot Jibo, the brainchild of MIT Media Lab Associate Professor Cynthia Breazeal. In a UROP project co-advised by Breazeal and researcher Hae Won Park, Thumma scripted mood-appropriate dialogue to help Jibo bond with students as they work through learning exercises together.

Because emotions are complicated, Thumma riffed on a set of basic feelings in her dialogue — happy/sad, energized/tired, curious/bored. If Jibo was feeling sad but energetic and curious, she might program it to say, “I’m feeling blue today, but something that always cheers me up is talking with my friends, so I’m glad I’m playing with you.” A tired, sad, and bored Jibo might say, with a tilt of its head, “I don’t feel very good. It’s like my wires are all mixed up today. I think this activity will help me feel better.”
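For illustration only, here is a minimal Python sketch of that idea: a scripted line is chosen based on three binary feeling dimensions. The class, dialogue table, and function names below are hypothetical and are not part of the actual Jibo software.

    # Hypothetical sketch, not the actual Jibo software: pick a scripted line
    # based on three basic feeling dimensions (happy/sad, energized/tired,
    # curious/bored).
    import random
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Mood:
        happy: bool      # happy vs. sad
        energized: bool  # energized vs. tired
        curious: bool    # curious vs. bored

    # Small illustrative pools; the real project scripted about 80 lines by hand.
    DIALOGUE = {
        (False, True, True): [
            "I'm feeling blue today, but something that always cheers me up is "
            "talking with my friends, so I'm glad I'm playing with you.",
        ],
        (False, False, False): [
            "I don't feel very good. It's like my wires are all mixed up today. "
            "I think this activity will help me feel better.",
        ],
    }

    def pick_line(mood: Mood) -> str:
        """Return a mood-appropriate line, with a neutral fallback."""
        options = DIALOGUE.get((mood.happy, mood.energized, mood.curious),
                               ["I'm glad we get to play together today."])
        return random.choice(options)

    print(pick_line(Mood(happy=False, energized=True, curious=True)))

Keying each scripted line to a handful of binary mood dimensions keeps the dialogue set small while letting the robot sound consistent with its current state.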

In these brief interactions, Jibo models its vulnerable side and teaches kids how to express their emotions. At the end of an interaction, kids can give Jibo a virtual token to pick up its mood or energy level. “They can see what impact they have on others,” says Thumma. In all, she wrote 80 lines of dialogue, an experience that led her to stay on at MIT for an MEng in robotics. The Jibos she helped build are now in kindergarten classrooms in Georgia, offering emotional and intellectual support as they read stories and play word games with their human companions.

Understanding why familiar faces stand out

With a quick glance, the faces of friends and acquaintances jump out from those of strangers. How does the brain do it? Nancy Kanwisher’s lab in the Department of Brain and Cognitive Sciences (BCS) is building computational models to understand the face-recognition process. Two key findings: the brain starts to register the gender and age of a face before recognizing its identity, and face perception is more robust for familiar faces.


This fall, second-year student Joanne Yuan worked with postdoc Katharina Dobs to understand why this is so. In earlier experiments, subjects were shown multiple photographs of familiar faces of American celebrities and unfamiliar faces of German celebrities while their brain activity was measured with magnetoencephalography. Dobs found that subjects processed age and gender before the celebrities’ identity regardless of whether the face was familiar. But they were much better at unpacking the gender and identity of faces they knew, such as Scarlett Johansson’s. Dobs suggests that the improved gender and identity recognition for familiar faces is due to a feed-forward mechanism rather than top-down retrieval of information from memory. […]

Read more – news.mit.edu