
Link: From Baby Talk to Baby A.I.

Tammy Kwan and Brenden Lake delivered blackberries from a bowl into the mouth of their twenty-one-month-old daughter, Luna, who was dressed in pink leggings and a pink tutu, with a silicone bib around her neck and a soft pink hat on her head. A lightweight GoPro-type camera was attached to the front of the hat. “Babooga,” she said, pointing a round finger at the berries. Dr. Kwan gave her the rest, and Dr. Lake looked at the empty bowl, amused. “That’s like $10,” he said. A light on the camera blinked.

For an hour each week over the past 11 months, Dr. Lake, a psychologist at New York University whose research focuses on human and artificial intelligence, has been attaching a camera to Luna and recording things from her point of view as she plays. His goal is to use the videos to train a language model on the same sensory input that a toddler is exposed to: a LunaBot, so to speak. By doing so, he hopes to create better tools for understanding both A.I. and ourselves. “We see this research as finally making that link, between those two areas of study,” Dr. Lake said. “You can finally put them in dialogue with each other.”

--

Yoooo, this is a quick note on a link that made me go “WTF?” Find all past links here.