The Pentagon’s research wing is funding efforts to build AI language systems that learn more like people and less like machines.
The latest artificial intelligence project at the Pentagon’s research office is shedding new light on the phrase “mean what you say.”

The Defense Advanced Research Projects Agency on Thursday announced it would begin funding research to reshape the way AI language systems like Alexa and Siri learn to speak. Instead of crunching gargantuan datasets to learn the ins and outs of language, the agency wants the tech to teach itself by observing the world the way human babies do.
Using this approach, the Grounded Artificial Intelligence Language Acquisition, or GAILA, program aims to build AI tools that understand the meaning of what they’re saying instead of stringing together words based on statistics.
“Children learn to decipher which aspects of an observed scenario relate to the different words in the message from a tiny fraction of the examples that [machine-learning] systems require,” DARPA officials wrote in the solicitation. “ML technology is also brittle, incapable of dealing with new data sources, topics, media, and vocabularies. These weaknesses of ML as applied to natural language are due to exclusive reliance on the statistical aspects of language, with no regard for its meaning.”
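To make that contrast concrete, consider a purely statistical language model. The toy bigram sketch below (an illustration of the approach DARPA is critiquing, not anything from GAILA itself) predicts each next word solely from how often words co-occur in its training text, with no grounding in what any of the words refer to.

```python
# Minimal sketch of a purely statistical language model: a toy bigram model
# that "strings together words based on statistics" with no notion of meaning.
from collections import Counter, defaultdict
import random

corpus = "the dog chased the ball the dog caught the ball".split()

# Count how often each word follows another.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(word):
    """Pick the next word in proportion to how often it followed `word`."""
    counts = bigrams[word]
    if not counts:
        return random.choice(corpus)
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate text: statistically plausible, but the model has no idea what a
# "dog" or a "ball" is -- the brittleness GAILA aims to address by grounding
# words in observed scenes instead.
word = "the"
output = [word]
for _ in range(8):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```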