Fingerprint ridge width is coupled to Pacinian resonance

French scientist Georges Debregeas has published a finding that the width of our fingerprint ridges just happens to be optimized for vibrating our nerve endings at the frequency they sense best:

The latest evidence suggests that fingerprints process vibrations in the skin to make them easier for nerves to pick up. They may seem little more than digital decoration, but researchers in biomechanics have long known that fingerprints have at least one use: they increase friction, thereby improving grip…

In fact the role that fingerprints play in touch is far more important and subtle than anyone imagined.

…Biologists have known for some time that Pacinian corpuscles are most sensitive to vibrations at 250Hz. So how do fingers generate this kind of vibration? Biologists have always assumed that humans can control the frequency of vibrations in the skin by changing the speed at which a finger moves across a surface. But there’s little evidence that people actually do this and the Paris team’s discovery should make this view obsolete.

…They say that fingerprints resonate at certain frequencies and so tend to filter mechanical vibrations. It turns out that their resonant frequency is around 250 Hz. What an astonishing coincidence!

That means that fingerprints act like signal processors, conditioning the mechanical vibrations so that the Pacinian corpuscles can best interpret them…
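To get a feel for the numbers, here is a rough back-of-the-envelope sketch (my own, not the Paris team's model): treat the ridged skin as a simple resonator tuned near 250 Hz and see which texture scales a sliding finger converts into vibrations that make it through. Every parameter value below is an assumption for illustration.

```python
import numpy as np

# Band-pass response of a second-order resonator, normalized so the gain is 1
# at the resonant frequency. F0 and Q are assumed, illustrative values.
F0 = 250.0   # assumed resonant frequency of the ridge/skin system, Hz
Q = 3.0      # assumed quality factor (how sharply the resonance is tuned)

def gain(f, f0=F0, q=Q):
    """Relative gain of a second-order band-pass resonator driven at f (Hz)."""
    return (f * f0 / q) / np.sqrt((f0**2 - f**2)**2 + (f * f0 / q)**2)

# A sliding finger turns spatial texture into temporal vibration: a texture
# wavelength lam scanned at speed v shakes the skin at roughly f = v / lam.
v = 0.12  # scanning speed in m/s (a typical exploratory speed, assumed)
for lam in (0.2e-3, 0.5e-3, 2e-3, 10e-3):  # texture scales in metres
    f = v / lam
    print(f"texture {lam*1e3:4.1f} mm -> {f:5.0f} Hz, relative gain {gain(f):.2f}")

# Vibrations near 250 Hz pass with little loss; much slower or faster ones are
# attenuated, which is the signal-conditioning role described above.
```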

The article also notes that in robotics this is called morphological computation; that is, computation carried out by the physical form of the body itself as it interacts with the world, rather than by a processor.

Hand prosthesis with sensitive fingertips

This is the first device of its kind that sends signals back to the brain, allowing the user to have feeling in their fingers and hand. The Smart Hand takes advantage of phantom limb syndrome, which is the sensation amputees have that their missing body part is still attached… By connecting sensors in the hand to the nerve endings in the stump of the arm, patients can feel and control the Smart Hand.
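For a very rough picture of the loop being described (fingertip sensors in, nerve stimulation out), here is a toy sketch. The function names, signal ranges, and the linear mapping are all my own assumptions; none of this comes from the Smart Hand's actual electronics.

```python
# Toy feedback loop: fingertip force in, stimulation amplitude out.
def pressure_to_stimulation(force_n, max_force_n=20.0, max_amplitude_ma=2.0):
    """Map a fingertip force reading (N) to a nerve-stimulation amplitude (mA)."""
    clipped = min(max(force_n, 0.0), max_force_n)
    return max_amplitude_ma * clipped / max_force_n

# One cycle of the loop: read each fingertip sensor and drive the matching
# nerve-interface channel so the wearer feels how hard each finger is pressing.
fingertip_forces_n = {"thumb": 4.2, "index": 3.8, "middle": 1.1}  # made-up readings
for finger, force in fingertip_forces_n.items():
    print(f"{finger:6s} -> {pressure_to_stimulation(force):.2f} mA")
```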

The test patient underwent a complicated, experimental surgical procedure to wire the nerve endings in his stump to an electronic interface. His personal risk will advance science and potentially help millions of people. Thank you, Robin Af Ekenstam.

In the next version I hope they make the Smart Hand’s fingertips get a little bit more sensitive after you clip its fingernails.

(via Engadget)

Printed strain sensors = "sense of touch"

In the future, the robot could find its own way. A sensor will endow it with a sense of touch and help it to detect its undersea environment autonomously.

“One component in this tactile capability is a strain gauge,” says Marcus Maiwald…“If the robot encounters an obstacle,” he explains, “the strain gauge is distorted and the electrical resistance changes. The special feature of our strain gauge is that it is not glued but printed on – which means we can apply the sensor to curved surfaces of the robot.”
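The gauge math behind that sentence is standard: the relative resistance change is proportional to strain through a gauge factor. Here is a minimal sketch of turning a resistance reading into a contact decision; the values are made up rather than taken from Fraunhofer's firmware.

```python
# Minimal sketch of how a strain-gauge reading becomes a "touch" signal,
# using the usual gauge-factor model:  dR / R = GF * strain
GAUGE_FACTOR = 2.0          # typical for metal-film gauges (assumed)
R_NOMINAL = 350.0           # unstrained resistance in ohms (assumed)
CONTACT_THRESHOLD = 200e-6  # strain above which we call it a contact (assumed)

def strain_from_resistance(r_measured, r_nominal=R_NOMINAL, gf=GAUGE_FACTOR):
    """Infer mechanical strain from the measured gauge resistance."""
    return (r_measured - r_nominal) / (r_nominal * gf)

def contact_detected(r_measured):
    return strain_from_resistance(r_measured) > CONTACT_THRESHOLD

# Bumping an obstacle deforms the printed gauge and shifts its resistance:
for r in (350.00, 350.05, 350.35):
    eps = strain_from_resistance(r)
    print(f"R = {r:6.2f} ohm -> strain = {eps*1e6:6.1f} microstrain, "
          f"contact = {contact_detected(r)}")
```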

The sensor system on this robot is not all that complex; strain gauges are literally a dime a dozen (or less). But the configuration of the sensors reminds us of an animal body, and that’s what intrigues us. Since the strain sensors are printed along the surface of the robot in a continuous way (rather than being attached at some specific point), we’re reminded of how touch receptors are embedded throughout the skin, bringing to mind the phrase “sense of touch.” The Roomba has a mechanical sensor that is technically similar to the ones in this new robot, but we don’t talk about the Roomba having a sense of touch because the sensor is in a discrete place. To have a sense of touch you need to be able to sense contact (almost) anywhere on the surface of the body.
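As a toy illustration of the difference, here is a sketch of reading a distributed array of printed gauges and roughly localizing a contact along the hull. The array layout and readings are invented; a single bump switch could only say "hit something," not "hit here."

```python
import numpy as np

# Gauges printed along the hull every 5 cm, with one simulated contact event.
sensor_positions_cm = np.linspace(0, 50, 11)
readings_microstrain = np.array([2, 3, 1, 4, 180, 320, 150, 5, 2, 1, 3])
THRESHOLD = 100  # microstrain, assumed contact threshold

active = readings_microstrain > THRESHOLD
if active.any():
    # Weight sensor positions by strain to estimate where the hull was touched.
    location = np.average(sensor_positions_cm[active],
                          weights=readings_microstrain[active])
    print(f"contact around {location:.1f} cm along the hull")
else:
    print("no contact")
```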

Hand-observing robot understands human goals

You cannot make human-robot interaction more natural unless you understand what ‘natural’ actually means. But few studies have investigated the cognitive mechanisms that are the basis of joint activity (i.e. where two people are working together to achieve a common goal)…

By observing how its human partner grasped a tool or model part, for example, the robot was able to predict how its partner intended to use it. Clues like these helped the robot to anticipate what its partner might need next. “Anticipation permits fluid interaction,” says Erlhagen. “The robot does not have to see the outcome of the action before it is able to select the next item.”
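A toy version of that anticipation step might look like the sketch below: map the observed grasp to a likely intention, then to the item the partner will probably need next. The grasp categories and assembly steps are invented for illustration, not taken from the study.

```python
# Observed grasp -> inferred intention -> item to fetch next (all names invented).
GRASP_TO_INTENTION = {
    "power_grasp_screwdriver": "drive_screw",
    "precision_grasp_screw": "position_screw",
    "flat_hand_on_panel": "hold_panel",
}

NEXT_ITEM_FOR_INTENTION = {
    "position_screw": "screwdriver",  # after placing a screw, the tool comes next
    "drive_screw": "next_screw",
    "hold_panel": "bracket",
}

def anticipate(observed_grasp):
    """Predict the item the human partner will need next, or None if unknown."""
    intention = GRASP_TO_INTENTION.get(observed_grasp)
    return NEXT_ITEM_FOR_INTENTION.get(intention)

print(anticipate("precision_grasp_screw"))  # -> screwdriver
```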