Posted by Okachinepa on 04/07/2025


Image courtesy of SynEvol. Credit: John A. Rogers / Northwestern University
Most haptic technologies today are limited to producing simple vibrations, yet human skin contains many kinds of sensors that detect pressure, stretching, vibration, and other tactile signals.
Now, engineers at Northwestern University have invented a new technology that generates precise, controlled movements to mimic these complex sensations.
The research was recently published in the journal Science.
This small, lightweight, wireless device rests on the skin and applies force in different directions to produce a variety of sensations, such as vibration, pressure, stretching, sliding, and twisting. It can also combine these effects and vary their speed to create a more realistic and nuanced sense of touch.
The device runs on a small rechargeable battery and connects wirelessly to smartphones and virtual-reality headsets via Bluetooth. Its compact, versatile design allows it to be placed anywhere on the body, used in combination with other actuators, or integrated into existing wearable technology.
The researchers see potential for the device to enhance virtual reality, help visually impaired people navigate their surroundings, simulate textures on flat screens for online shopping, provide tactile feedback during virtual medical appointments, and enable hearing-impaired users to "experience" music.
Image courtesy of SynEvol. Credit: John A. Rogers / Northwestern University
“Nearly all haptic actuators merely poke at the skin,” said John A. Rogers of Northwestern, who led the device's design. “But skin is receptive to far more sophisticated sensations of touch. We wanted to design a device that could apply forces in any direction, not just poking but also pushing, twisting, and sliding. We created a small actuator that can move the skin in any direction and in any combination of directions. With it, we can precisely control the complex feeling of touch in a fully programmable way.”
A trailblazer in bioelectronics, Rogers holds the title of Louis A. Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering, and Neurological Surgery, with positions in the McCormick School of Engineering and the Feinberg School of Medicine at Northwestern University. He additionally oversees the Querrey Simpson Institute for Bioelectronics. Rogers collaborated on the project with Yonggang Huang from Northwestern, the Jan and Marcia Achenbach Professor in Mechanical Engineering and a professor of civil and environmental engineering at McCormick. Kyoung-Ho Ha, Jaeyoung Yoo, and Shupeng Li from Northwestern are the co-first authors of the study.
The research expands upon earlier projects from Rogers' and Huang's laboratories, where they created a customizable array of small vibrating actuators to provide a tactile sensation.
In recent years, visual and audio technologies have advanced dramatically, offering unmatched immersion through devices such as high-fidelity surround-sound speakers and fully immersive virtual-reality headsets. Haptic technologies, by contrast, have largely stagnated: even state-of-the-art systems offer little more than buzzing vibration patterns.
This developmental divide arises mainly from the remarkable intricacy of human touch. The sensation of touch encompasses various kinds of mechanoreceptors (or sensors) — each possessing unique sensitivities and response traits — situated at different depths within the skin. When these mechanoreceptors are activated, they transmit signals to the brain, which interprets them as touch.
Image courtesy of SynEvol. Credit: John A. Rogers / Northwestern University
Achieving that level of sophistication and nuance demands precise control over the type, intensity, and timing of stimuli applied to the skin. That is a significant challenge that existing technologies have struggled, and failed, to meet.
“One reason haptic technology lags behind video and audio in its richness and realism is the mechanical complexity of skin deformation,” said J. Edward Colgate, a haptics pioneer and co-author of the study from Northwestern. “Skin can be pressed in or stretched sideways. Skin stretching can happen slowly or quickly, and it can occur in complex patterns across a whole area, such as the entire palm of the hand.”
To reproduce that complexity, the Northwestern team created the first actuator with full freedom of motion (FOM). This means the actuator is not limited to a single type of motion or a restricted set of motions; instead, it can move and exert forces in any direction along the skin. These dynamic forces engage all of the skin's mechanoreceptors, either individually or in combination.
“It's a major step toward managing the complexity of the sense of touch,” said Colgate, the Walter P. Murphy Professor of Mechanical Engineering at McCormick. “The FOM actuator is the first small, compact haptic device that can poke or stretch the skin, operate slowly or quickly, and be used in arrays. As a result, it can produce an astonishing range of tactile sensations.”
Measuring just a few millimeters across, the device uses a small magnet and a set of wire coils arranged in a nested configuration. When electric current flows through the coils, they generate a magnetic field; that field interacts with the magnet to produce a force strong enough to move, push, pull, or twist it. Grouped into arrays, the actuators can mimic sensations such as pinching, stretching, squeezing, and tapping.
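As a rough illustration of how such a coil-and-magnet arrangement might be driven (this is not the team's actual control scheme; the matrix values and names below are invented), the force on the magnet can be treated as approximately linear in the coil currents, so the currents needed for a desired force can be found by solving a small linear system:

```python
import numpy as np

# Hypothetical linear actuation model: each coil current contributes a force
# on the magnet. Columns of A give the force produced by 1 A in each of three
# nested coils; the numbers are made up for illustration.
A = np.array([
    [0.8, 0.1, 0.0],   # force along x (N per A): lateral "slide"
    [0.1, 0.9, 0.0],   # force along y (N per A): lateral "stretch"
    [0.0, 0.0, 0.6],   # force along z (N per A): normal "poke" or pull
])

def coil_currents(desired_force):
    """Solve A @ i = f for coil currents that best approximate a desired force."""
    return np.linalg.lstsq(A, np.asarray(desired_force, dtype=float), rcond=None)[0]

# Example: a gentle slide toward +x combined with a light press into the skin.
print(coil_currents([0.2, 0.0, -0.05]))
```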
“Accomplishing both a compact design and significant force output is essential,” stated Huang, who oversaw the theoretical research. “Our team created computational and analytical models to determine ideal designs, ensuring every mode produces its highest force component while reducing undesired forces or torques.”
Transforming the digital realm into reality
On the other side of the device, the team incorporated an accelerometer, allowing it to track its position in space. With this information, the system can deliver haptic feedback tailored to the user's context. If the actuator sits on a hand, for instance, the accelerometer can sense whether the hand is facing palm up or palm down. The accelerometer can also track the actuator's motion, providing details on its speed, acceleration, and rotation.
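For illustration only, here is a minimal sketch of how an accelerometer reading could distinguish palm up from palm down; the axis convention and thresholds are assumptions, not details from the study:

```python
import math

def palm_orientation(accel_xyz):
    """Infer rough hand orientation from a resting accelerometer reading.

    accel_xyz is (ax, ay, az) in m/s^2, with z assumed normal to the back of
    the hand. When the device is roughly still, gravity dominates the reading,
    so the sign of az indicates which way the palm faces.
    """
    ax, ay, az = accel_xyz
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    if abs(magnitude - 9.81) > 2.0:      # device is moving; orientation unreliable
        return "in motion"
    return "palm down" if az > 0 else "palm up"

print(palm_orientation((0.2, -0.1, 9.7)))   # -> "palm down"
print(palm_orientation((0.1, 0.3, -9.6)))   # -> "palm up"
```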
Rogers said this motion-tracking capability is particularly useful for navigating physical spaces or interacting with different textures on a flat screen.
"If you glide your finger over a silk fabric, it will experience less friction and move more swiftly than when coming into contact with corduroy or burlap," he stated. "You might picture buying clothing or materials online and wishing to touch the fabric."
In addition to mirroring everyday tactile sensations, the platform can also convey information through the skin. By varying the frequency, intensity, and rhythm of haptic feedback, the team converted the auditory experience of music into a tactile one, for instance. They could also convey different tones simply by changing the direction of the vibrations. By feeling these vibrations, users could distinguish between different instruments.
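One hypothetical way to implement such a mapping, sketched below with assumed parameters rather than the team's actual pipeline, is to take the dominant spectral peak of each audio frame as the vibration frequency and the frame's loudness as the intensity:

```python
import numpy as np

def music_to_haptics(samples, sample_rate=44_100, frame=2_048):
    """Map one audio frame to a (vibration frequency, intensity) haptic command.

    The dominant spectral peak sets the vibration frequency (clamped to a
    skin-friendly band) and the frame's RMS level sets the intensity.
    """
    window = samples[:frame] * np.hanning(frame)
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(frame, d=1.0 / sample_rate)
    dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]      # skip the DC bin
    vibration_hz = float(np.clip(dominant_hz, 20, 300))   # assumed sensitive range
    intensity = float(np.sqrt(np.mean(window**2)))        # RMS loudness
    return vibration_hz, intensity

# Example: a 220 Hz tone maps to a ~220 Hz vibration whose strength tracks loudness.
t = np.arange(2_048) / 44_100
print(music_to_haptics(0.5 * np.sin(2 * np.pi * 220 * t)))
```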
“We were able to break down all the aspects of music and translate them into haptic sensations without losing the subtle details associated with particular instruments,” Rogers said. “It's just one example of how the sense of touch could enrich another sensory experience. We think our system could help narrow the gap between the digital and physical worlds. Adding a genuine sense of touch can make digital interactions feel more immersive and engaging.”