Luca Nicola · 21 Jun 2018 · < 1 min read

The first artificial intelligence that knows how to listen like a human being

Researchers at MIT have created the first artificial neural network that can recognize sounds the way a human ear does. The scientists focused on two auditory tasks: speech and music. Their model was “trained” on thousands of two-second clips containing either words spoken by a person or musical notes. After many thousands of examples, the artificial intelligence had learned to perform the tasks with the same accuracy as a human listener.

The study, which appeared in the journal “Neuron” in April, also offers evidence that the human auditory cortex is organized hierarchically, much like the visual cortex: basic sensory information is processed first, while more advanced functions, such as extracting the meaning of words, are handled at later stages.
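To make the idea of hierarchical processing more concrete, here is a minimal sketch of a network with shared early layers (basic acoustic processing) that split into two task-specific branches (words and musical notes). It assumes PyTorch; the class name, layer sizes and the placeholder numbers of word and note categories are purely illustrative and are not taken from the MIT study.

```python
import torch
import torch.nn as nn

class BranchedAudioNet(nn.Module):
    """Illustrative sketch: shared early layers handle basic acoustic
    features; later, task-specific branches handle the two tasks."""
    def __init__(self, n_words=100, n_notes=50):  # placeholder category counts
        super().__init__()
        # Shared "early" stages, loosely analogous to early auditory processing
        self.shared = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
        )
        # Later, task-specific stages: one branch per auditory task
        self.word_branch = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, n_words)
        )
        self.note_branch = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, n_notes)
        )

    def forward(self, spectrogram):
        h = self.shared(spectrogram)          # shared low-level representation
        return self.word_branch(h), self.note_branch(h)

# A two-second clip represented as a (batch, channel, frequency, time) spectrogram
clip = torch.randn(1, 1, 128, 200)
word_logits, note_logits = BranchedAudioNet()(clip)
print(word_logits.shape, note_logits.shape)
```

Training such a network on thousands of labelled clips, as described above, would amount to adjusting the shared and branch weights until the predicted word or note matches the label.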


Luca Nicola

A copywriter since 1988, he began his career at De Agostini and then chose to continue as a freelancer. A graduate in Philosophy, he is currently also a professor of Web Marketing at the Federlegno Training Center. As a communication consultant, he has worked for many years with numerous clients, including some large international groups. In 2012 he opened the personal blog “Mela N”, where he deals with topics related to Writing, Communication, Content Marketing and Storytelling.
