Research Indicates Rodents Use Whiskers to Navigate Surroundings

BitFlow Axion-CL frame grabber helps scientists track high-speed micromotions of rodent whiskers

WOBURN, MA, NOVEMBER 4, 2024 -- Rodents use their whiskers to explore their environment and discover objects in their immediate vicinity. While the whiskers themselves do not contain nerves, the active, rhythmic back-and-forth sweeping of whiskers over surfaces produces a neural representation of complex tactile scenes. This neural process allows rodents to navigate their native environment of tunnels and dark burrows, find and identify items, and guide their movement along walls.


Recently, researchers at the University of Illinois Urbana-Champaign investigated how high-frequency energy bursts produced by whisker movements are transformed into distinct "bar codes" carrying a significant amount of information. Their findings may shed light on neural coding in other whisker-like sensory organs across the animal world, as well as on texture perception by primate skin.

Video Synchronized with Acoustics
The researchers from the University of Illinois employed high-speed videography in tandem with sensitive acoustical measurements to carry out their studies in detail. By observing the whiskers' micromotions, they discovered a systematic sequence of vibrational modes with frequencies up to 10 kHz. From there, they hypothesized that a rodent's whiskers are essentially pre-neural processors that transform the micromotions into a temporal code with ultra-high, kilobit-per-second bandwidth. Temporal coding, in this instance, refers to a neural code that uses the precise timing or high-frequency fluctuations of neural firing rates to carry information.

For the video portion of the tests, whisker movements across various objects were recorded simultaneously in two orthogonal planes. Video in the xy plane was captured using a Mikrotron EoSens 3.0MCL three-megapixel camera, mounted overhead, configured at a resolution of 656 × 600 pixels and a rate of 1000 to 1500 frames per second (fps). The Mikrotron camera was equipped with a 0.36× telecentric lens that produced a 25 mm × 23 mm field of view (FOV). Video in the yz plane was captured using a side camera set to 659 × 494 pixels at 120 fps and fitted with a 16 mm lens that produced a 34 mm × 23 mm FOV.
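As a rough check on spatial sampling, the per-pixel resolution implied by these settings can be worked out from the stated FOV and pixel counts. The short Python sketch below does exactly that; it is illustrative only and uses the values quoted above, not figures from the published study.

    # Approximate spatial sampling implied by the stated FOV and pixel counts.
    # Illustrative back-of-the-envelope check only.

    def microns_per_pixel(fov_mm, pixels):
        """Return spatial sampling in micrometers per pixel."""
        return fov_mm * 1000.0 / pixels

    # Overhead (xy-plane) camera: 25 mm x 23 mm FOV at 656 x 600 pixels
    print(microns_per_pixel(25, 656), microns_per_pixel(23, 600))   # ~38 x ~38 um/px

    # Side (yz-plane) camera: 34 mm x 23 mm FOV at 659 x 494 pixels
    print(microns_per_pixel(34, 659), microns_per_pixel(23, 494))   # ~52 x ~47 um/px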

Video streams from both cameras were simultaneously acquired by a BitFlow Axion-CL two-channel Camera Link frame grabber and streamed to a PC running NorPix StreamPix multi-camera software. Each frame was triggered by the Axion-CL frame grabber, and time stamps were generated with less than 1 ms jitter. The BitFlow frame grabber benefits from a PCIe Gen 2 interface and a DMA engine optimized for heavily loaded computers. The setup was illuminated by an overhead LED light focused with a 40 mm focal-length aspheric condenser.
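A minimal sketch of how per-frame time stamps could be checked for jitter offline is shown below. The file name, array, and nominal frame rate are hypothetical placeholders, not part of the actual recording pipeline.

    import numpy as np

    # Hypothetical per-frame timestamps (in seconds) exported from the recording,
    # used here only to illustrate an offline jitter check.
    ts_overhead = np.loadtxt("overhead_timestamps.txt")  # assumed file name
    nominal_fps = 1000.0

    intervals = np.diff(ts_overhead)            # inter-frame intervals
    jitter = intervals - 1.0 / nominal_fps      # deviation from the nominal period

    print("max |jitter|: %.3f ms" % (1000 * np.abs(jitter).max()))
    print("std jitter:   %.3f ms" % (1000 * jitter.std()))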

During the experiment, rodent whiskers were swept over an object, such as a pole, grating, or sandpaper, to mimic interaction with the natural environment. Within each sweeping period, the video frame that captured the whisker "first touch" event was manually identified and then time-aligned to the onset of a large voltage peak in the microphone recording. A background frame, recorded while the whisker was outside the camera's field of view, was subtracted from each original frame, resulting in a high-contrast processed frame.
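The two processing steps described above can be sketched in a few lines of Python. The arrays, threshold, and function names below are hypothetical and serve only to illustrate the idea of aligning the first-touch frame to the microphone peak and background-subtracting the video.

    import numpy as np

    def align_first_touch(mic, fs, frame_times, threshold):
        """Index of the video frame closest to the first large microphone peak."""
        onset_sample = np.argmax(np.abs(mic) > threshold)   # first sample above threshold
        onset_time = onset_sample / fs
        return int(np.argmin(np.abs(frame_times - onset_time)))

    def subtract_background(frames, background):
        """Background-subtract each frame to obtain high-contrast whisker images."""
        diff = frames.astype(np.int16) - background.astype(np.int16)
        return np.clip(diff, 0, 255).astype(np.uint8)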

Collisions of moving whiskers against objects create time-varying forces at the whisker follicle, producing a barrage of neural discharges that leads to "haptic perception," that is, the sensory process of gaining information about objects through touch. The scientists discovered that a single micro-collision of a whisker with a surface generates vibrational modes spanning frequencies up to 10 kHz. While propagating along the whisker, these high-frequency modes can carry up to 80% of the shockwave energy, exhibit a 100× smaller damping ratio, and arrive at the follicle 10× faster than low-frequency components.
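For readers who want to explore similar signals, the fraction of a recorded micro-collision's energy that lies above a given frequency can be estimated with a simple FFT power split, as in the sketch below. The signal, sampling rate, and cutoff are hypothetical; this is not the analysis used in the published study.

    import numpy as np

    def high_frequency_energy_fraction(signal, fs, cutoff_hz=1000.0):
        """Fraction of total spectral energy at or above cutoff_hz."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2            # power spectrum
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)       # frequency bins
        return spectrum[freqs >= cutoff_hz].sum() / spectrum.sum()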

1. Ding, Y., Vlasov, Y. Pre-neuronal processing of haptic sensory cues via dispersive high-frequency vibrational modes. Sci Rep 13, 14370 (2023).
