
Merleau-Ponty's philosophy


“Yes or no: do we have a body—that is, not a permanent object of thought, but a flesh that suffers when it is wounded, hands that touch?” — The Visible and the Invisible

Merleau-Ponty’s Philosophy by Lawrence Hass was the first full book I read on the great phenomenologist. If you’re fascinated by sensation, perception, synesthesia, metaphor, and flesh (and frankly, who isn’t?), please read it! It offers many wonderful revelations. I’ll briefly review the following topics from the book:

  • Sensation/perception is a false dichotomy.
  • Perception is “contact with otherness.”
  • Synesthesia is a constant feature of experience.
  • The concepts of “reversibility” and “flesh.”



Enactive perception


I just finished Action in Perception by Alva Noë. It’s a very readable introduction to the enactive view of perceptual consciousness, which argues that perception neither happens in us nor to us; rather, it’s something we do with our bodies, situated in the physical world, over time. Our knowledge of the way in which sensory stimulation varies as we control our bodies is what brings experience about. Without sensorimotor skill, a stimulus cannot constitute a percept. Noë presents empirical evidence for his claim, drawing on the phenomenology of change blindness as well as tactile vision substitution systems. I highly recommend the book.

The emphasis on embodied experience leads to the use of touch as a model for perception, rather than the traditional vision-based approach. Here’s an excerpt:

Touch acquires spatial content—comes to represent spatial qualities—as a result of the ways touch is linked to movement and to our implicit understanding of the relevant tactile-motor dependencies governing our interaction with objects. [Philosopher George Berkeley] is right that touch is, in fact, a kind of movement. When a blind person explores a room by walking about in it and probing with his or her hands, he or she is perceiving by touch. Crucially, it is not only the use of the hands, but also the movement in and through the space in which the tactile activity consists. Very fine movements of the fingers and very gross wanderings across a landscape can each constitute exercises of the sense of touch. Touch, in all such cases, is movement. (At the very least, it is movement of something relative to the perceiver.) These Berkeleyan ideas form a theme, more recently, in the work of [Brian O’Shaughnessy, in his book Consciousness and the World]. He writes: “touch is in a certain respect the most important and certainly the most primordial of the senses. The reason is, that it is scarcely to be distinguished from the having of a body that can act in physical space”…

But why hold that touch is the only active sense modality? As we have stressed, the visual world is not given all at once, as in a picture. The presence of detail consists not in its representation now in consciousness, but in our implicit knowledge now that we can represent it in consciousness if we want, by moving the eyes or by turning the head. Our perceptual contact with the world consists, in large part, in our access to the world thanks to our possession of sensorimotor knowledge.

Here, no less than in the case of touch, spatial properties are available due to links to movement. In the domain of vision, as in that of touch, spatial properties present themselves to us as “permanent possibilities of movement.” As you move around the rectangular object, its visible profile deforms and transforms itself. These deformations and transformations are reversible. Moreover, the rules governing the transformation are familiar, at least to someone who has learned the relevant laws of visuomotor contingency. How the item looks varies systematically as a function of your movements. Your experience of it as cubical consists in your implicit understanding of the fact that the relevant regularity is being observed.
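To make that last point concrete, here is a rough sketch of my own (not from the book): a cube’s projected profile computed as a function of the observer’s position around it. The geometry, names, and parameters are illustrative assumptions, but the lawfulness the excerpt describes shows up directly: the profile deforms systematically with movement, and moving back restores it.

```python
# A rough illustration (mine, not Noë's) of a visuomotor contingency: the cube's
# projected profile is a lawful, reversible function of where the observer stands.
import numpy as np

# Unit cube centered at the origin: 8 vertices.
CUBE = np.array([[x, y, z]
                 for x in (-0.5, 0.5)
                 for y in (-0.5, 0.5)
                 for z in (-0.5, 0.5)])

def profile(yaw_deg, distance=3.0):
    """Project the cube's vertices onto the image plane of an observer who has
    walked yaw_deg degrees around it. Returns an (8, 2) array of image points."""
    t = np.deg2rad(yaw_deg)
    # Walking around the object is equivalent, in the observer's frame, to
    # rotating the object about the vertical axis.
    rot = np.array([[ np.cos(t), 0.0, np.sin(t)],
                    [ 0.0,       1.0, 0.0      ],
                    [-np.sin(t), 0.0, np.cos(t)]])
    pts = CUBE @ rot.T
    z = pts[:, 2] + distance            # keep the cube in front of the camera
    return pts[:, :2] / z[:, None]      # simple pinhole projection

# The profile deforms systematically as you move around the object,
# and the deformation is reversible: returning to the starting point restores it.
assert not np.allclose(profile(0.0), profile(30.0))     # appearance changed
assert np.allclose(profile(30.0 - 30.0), profile(0.0))  # and is recoverable
```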

Virtual and augmented reality interface design has already begun to demonstrate these concepts. Head-mounted augmented reality displays sense the user’s eye and body movements to construct virtual percepts. Head-related transfer functions (HRTFs) synthesize sound as it would be heard by an organism with particular physical characteristics in a particular environment. It seems to me that an important implication for enactive interface design is that haptic sensory patterns can lead to perceptual experience in all sensory modes (vision, hearing, touch). Thus, all human-computer interaction and user experience design can be viewed in a haptic context.
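As a companion sketch (again my own, using a deliberately crude interaural model rather than a measured HRTF), the fragment below shows the enactive point in an audio interface: what reaches each ear depends on the listener’s own head movement, so the percept is tied to sensorimotor activity rather than to the source signal alone. The function name render_binaural and the constants are illustrative assumptions.

```python
# A minimal sketch, not a real HRTF implementation: a mono source is rendered
# binaurally, and the interaural time and level differences are recomputed from
# the listener's head yaw. Turning the head changes what each ear receives,
# which is the sensorimotor dependency an enactive interface exploits.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, a rough average head radius

def render_binaural(mono, sample_rate, source_azimuth_deg, head_yaw_deg):
    """Return (left, right) channels for a source at source_azimuth_deg heard by
    a listener whose head is turned head_yaw_deg (0 = straight ahead)."""
    # The percept depends on the *relative* angle: moving the head changes it.
    rel = np.deg2rad(source_azimuth_deg - head_yaw_deg)

    # Crude Woodworth-style interaural time difference.
    itd = HEAD_RADIUS * (np.sin(rel) + rel) / SPEED_OF_SOUND
    delay = int(round(abs(itd) * sample_rate))

    # Crude interaural level difference: attenuate the ear facing away.
    near_gain, far_gain = 1.0, 1.0 - 0.6 * abs(np.sin(rel))

    delayed = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    if itd >= 0:   # source toward the listener's right ear
        return far_gain * delayed, near_gain * mono
    return near_gain * mono, far_gain * delayed

# Example: the same tone yields different ear signals before and after a head turn.
sr = 44_100
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
left_a, right_a = render_binaural(tone, sr, source_azimuth_deg=45, head_yaw_deg=0)
left_b, right_b = render_binaural(tone, sr, source_azimuth_deg=45, head_yaw_deg=45)
```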
