Simulated Language Acquisition in a Biologically Realistic Model of the Brain
By: Daniel Mitropolsky, Christos Papadimitriou
Potential Business Impact:
Enables computers to learn language the way a baby does.
Despite tremendous progress in neuroscience, we do not have a compelling narrative for the precise way in which the spiking of neurons in our brain results in high-level cognitive phenomena such as planning and language. We introduce a simple mathematical formulation of six basic and broadly accepted principles of neuroscience: excitatory neurons, brain areas, random synapses, Hebbian plasticity, local inhibition, and inter-area inhibition. We implement a simulated neuromorphic system based on this formalism, which is capable of basic language acquisition: Starting from a tabula rasa, the system learns, in any language, the semantics of words, their syntactic role (verb versus noun), and the word order of the language, including the ability to generate novel sentences, through exposure to a modest number of grounded sentences in the same language. We discuss several possible extensions and implications of this result.
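To make the six principles concrete, here is a minimal sketch of a single projection step in an assembly-calculus-style model: a source brain area fires into a target area through sparse random synapses, local inhibition caps firing at the top-k target neurons, and Hebbian plasticity strengthens synapses between co-firing neurons. This is an illustrative toy, not the authors' implementation; the function name `project` and all parameter values are assumptions for the example.

```python
import numpy as np

def project(x, W, k, beta):
    """One projection step from a source area into a target area.

    x    : binary firing vector of the source area, shape (n_src,)
    W    : synaptic weight matrix, source -> target, shape (n_src, n_tgt);
           sparse random 0/1 entries model random synapses
    k    : cap size -- local inhibition lets only the top-k
           target neurons fire (winner-take-all)
    beta : Hebbian plasticity rate -- synapses from firing source
           neurons onto firing target neurons are scaled up by (1 + beta)
    """
    inputs = x @ W                       # total excitatory input per target neuron
    winners = np.argsort(inputs)[-k:]    # top-k neurons survive local inhibition
    y = np.zeros(W.shape[1])
    y[winners] = 1.0
    # Hebbian update: strengthen synapses between co-firing neurons
    W[np.ix_(x > 0, winners)] *= (1.0 + beta)
    return y, W

# Toy usage: a random stimulus projected into an area of 1000 neurons.
rng = np.random.default_rng(0)
n, k, beta, p = 1000, 50, 0.1, 0.05
W = (rng.random((n, n)) < p).astype(float)   # random sparse synapses
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = 1.0
y, W = project(x, W, k, beta)
```

Repeating this step while the stimulus stays active makes the set of winners stabilize into an "assembly", the model's neural representation of the stimulus; the paper's word, syntax, and word-order learning are built from such operations.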
Similar Papers
Hebbian learning the local structure of language
Computation and Language
Teaches computers to learn language the way brains do.
Language Arithmetics: Towards Systematic Language Neuron Identification and Manipulation
Computation and Language
Teaches computers to switch languages easily.
Bridging Brains and Machines: A Unified Frontier in Neuroscience, Artificial Intelligence, and Neuromorphic Systems
Neurons and Cognition
Brain-like computers learn like people, faster.