Learning nouns activates separate brain region from learning verbs

Another MRI study, this time investigating how we learn parts of speech:

The test consisted of working out the meaning of a new term based on the context provided in two sentences. For example, from the sentences “The girl got a jat for Christmas” and “The best man was so nervous he forgot the jat,” the noun jat means “ring.” Similarly, from “The student is nising noodles for breakfast” and “The man nised a delicious meal for her,” the hidden verb is “cook.”

“This task simulates, at an experimental level, how we acquire part of our vocabulary over the course of our lives, by discovering the meaning of new words in written contexts,” explains Rodríguez-Fornells. “This kind of vocabulary acquisition based on verbal contexts is one of the most important mechanisms for learning new words during childhood and later as adults, because we are constantly learning new terms.”

The participants had to learn 80 new nouns and 80 new verbs. Brain imaging showed that learning new nouns primarily activated the left fusiform gyrus (the underside of the temporal lobe, associated with visual and object processing), while learning new verbs activated part of the left posterior medial temporal gyrus (associated with semantic and conceptual aspects) and the left inferior frontal gyrus (involved in processing grammar).

This last bit was unexpected, at first. I would have guessed that verbs would be learned in regions of the brain associated with motor action. But according to this study, verbs seem to be learned only as grammatical concepts. In other words, knowing what it means to run is quite different from knowing how to run. Which makes sense if verb meaning is accessed by representational memory rather than declarative memory.

Brains that can't say words can sing them instead

Teaching stroke patients to sing “rewires” their brains, helping them recover their speech, say scientists.

By singing, patients use a different area of the brain from the area involved in speech. If a person’s “speech centre” is damaged by a stroke, they can learn to use their “singing centre” instead.

During the therapy sessions, patients are taught to put their words to simple melodies. Professor Schlaug said that after a single session, a stroke patient who was not able to form any intelligible words learned to say the phrase “I am thirsty” by combining each syllable with the note of a melody.

The article doesn’t say whether patients can ever go back to talking without singing. I can only hope that as their lives begin to sound like an opera, the corresponding drama, murder and intrigue doesn’t follow.


Touch affects cognition

In a mock haggling scenario, those seated on soft chairs were more flexible in agreeing on a price. The team also found candidates whose CVs were held on a heavy clipboard were seen as better qualified than those whose CVs were on a light one… Overall, through a series of experiments, they found that the weight, texture, and hardness of inanimate objects unconsciously influence judgments about unrelated events and situations. It suggests that physical touch, which is the first sense to develop, may be a scaffold upon which people build social judgments and decisions.

Handedness affects abstract associations

Despite the common association of “right” with life, correctness, positivity and good things, and “left” with death, clumsiness, negativity and bad things, recent research shows that most left-handed people hold the opposite association. Left-handers are thus an interesting case in which conceptual associations grounded in sensory-motor experience contradict those that rely on linguistic and cultural norms.


I was inspired to research the words “organ” and “organized” after I read a statement made by Merleau-Ponty scholar Lawrence Hass that “perceptions are organized (organ-ized) information.” He included the hyphen to emphasize a very interesting point: it may be that our ability to organize our thoughts is rooted in a concrete aspect of embodiment. We have specialized organs and neural pathways for particular ranges of wave frequencies (light for the eyes, sound for the ears, vibration for the skin). So, it’s plausible that organization of thought may have its roots in the configuration of our sense organs. Astounding!

Here’s a typical definition of organize:

  • v. arrange in an orderly way
  • v. to make into a whole with unified and coherent relationships (yourdictionary.com)

These definitions aren’t satisfying. What makes an organization orderly, unified, and coherent? The definition Hass implies is much more illuminating: to be organized is to be divided according to the sense organs of a perceiver. Now we’re getting somewhere!

But moving in a slightly different direction, what the hell are we doing playing a musical instrument called an “organ”? And what does all this mean for Edgard Varèse’s famous definition of music as “organized sound”?


Compare two dictionary definitions of “organ”:

  • n. from the Greek organon meaning “implement”, “musical instrument”, “organ of the body”, literally, “that with which one works” (Online Etymology Dictionary)
  • n. an instrument or means, as of action or performance

Substituting these senses of “organ” into Varèse’s famous definition, the word “music” means:

  • music is sound with which one works
  • music is sound that is a means of action or performance

For the first time I understand what Varèse meant when he said music is “organized sound.” We use the word music to mean sound that is utilized by someone to work or perform. Nothing more, nothing less.

Put yourself in my position

…so you can understand how I feel:

“Our language is full of spatial metaphors, particularly when we attempt to explain or understand how other people think or feel. We often talk about putting ourselves in others’ shoes, seeing something from someone else’s point of view, or figuratively looking over someone’s shoulder,” Sohee Park, report co-author and professor of psychology, said. “Although future work is needed to elucidate the nature of the relationship between empathy, spatial abilities and their potentially overlapping neural underpinnings, this work provides initial evidence that empathy might be, in part, spatially represented.”

“We use spatial manipulations of mental representations all the time as we move through the physical world. As a result, we have readily available cognitive resources to deploy in our attempts to understand what we see. This may extend to our understanding of others’ mental states,” Katharine N. Thakkar, a psychology graduate student at Vanderbilt and the report’s lead author, said. “Separate lines of neuroimaging research have noted involvement of the same brain area, the parietal cortex, during tasks involving visuo-spatial processes and empathy.”

Hand-observing robot understands human goals

You cannot make human-robot interaction more natural unless you understand what ‘natural’ actually means. But few studies have investigated the cognitive mechanisms that are the basis of joint activity (i.e. where two people are working together to achieve a common goal)…

By observing how its human partner grasped a tool or model part, for example, the robot was able to predict how its partner intended to use it. Clues like these helped the robot to anticipate what its partner might need next. “Anticipation permits fluid interaction,” says Erlhagen. “The robot does not have to see the outcome of the action before it is able to select the next item.”