Mark Zuckerberg, co-founder of Facebook and now CEO of Meta, said on Monday that a new touch sensor and a plastic material could work together to support the development of a so-called metaverse.

Meta artificial intelligence researchers collaborated with Carnegie Mellon University scientists to develop a deformable plastic “skin” less than 3 millimeters thick. The technology, known as ReSkin, can detect forces as low as 0.1 newtons from objects as small as 1 millimeter across.

The relatively inexpensive material contains magnetic particles that generate a magnetic field.

The magnetic field from the embedded particles changes when the skin comes into contact with another surface. The sensor detects these changes in magnetic flux and relays the readings to AI software, which infers the force or touch that was applied.
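Meta has not published the details of that inference pipeline in this article. As a rough illustration only, the Python sketch below shows how a small learned model might map raw magnetometer readings to an estimated contact force; the sensor counts, the synthetic data, and the linear-ish flux-to-force relation are all hypothetical stand-ins, not Meta's actual method.

```python
# Minimal sketch: learning a mapping from magnetometer flux readings
# to contact force. All names and numbers are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

N_MAGNETOMETERS = 5   # hypothetical magnetometer count under the skin
N_AXES = 3            # each magnetometer reports (Bx, By, Bz)
N_SAMPLES = 500       # synthetic training touches

# Synthetic stand-in data: flux deltas relative to the resting field,
# paired with the ground-truth force (newtons) that produced them.
flux_deltas = rng.normal(size=(N_SAMPLES, N_MAGNETOMETERS * N_AXES))
true_force = np.abs(flux_deltas).sum(axis=1) * 0.01  # fake relation

# A small neural network learns the flux -> force mapping.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=0)
model.fit(flux_deltas, true_force)

# At run time, a fresh reading is converted into a force estimate.
new_reading = rng.normal(size=(1, N_MAGNETOMETERS * N_AXES))
print(f"estimated force: {model.predict(new_reading)[0]:.3f} N")
```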

“We designed a high-resolution touch sensor and collaborated with Carnegie Mellon to develop a thin robot skin,” Zuckerberg wrote on Facebook on Monday. “We’re one step closer to having realistic virtual objects and physical interactions in the metaverse.” The skin was tested on robots handling soft fruit, including grapes and blueberries. It was also placed inside a rubber glove while a human hand shaped a bao bun.

To give the AI system enough data to learn how changes in the magnetic field correspond to touch, it had to be trained on 100 human touches.
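The article does not say how those 100 touches are used. One plausible reading, sketched below with hypothetical names and synthetic data, is a quick per-skin calibration: a model pre-trained on readings from earlier skins is nudged with a small set of labeled presses so it copes with manufacturing variation in a new skin. This is a guess at the workflow, not Meta's documented procedure.

```python
# Minimal sketch of per-skin calibration with ~100 labeled touches.
# Hypothetical: a model pre-trained on other skins is fine-tuned
# with a handful of incremental passes over the new skin's data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Pretend pre-training on a large corpus from previously made skins.
base_X = rng.normal(size=(5000, 15))
base_y = np.abs(base_X).sum(axis=1) * 0.01
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                     random_state=1)
model.fit(base_X, base_y)

# A new skin reads slightly differently (particle mix, thickness),
# so 100 human touches with known forces recalibrate the mapping.
calib_X = rng.normal(size=(100, 15)) * 1.1 + 0.05  # shifted distribution
calib_y = np.abs(calib_X).sum(axis=1) * 0.01
for _ in range(20):                 # a few light incremental passes
    model.partial_fit(calib_X, calib_y)

print(f"calibrated prediction: {model.predict(calib_X[:1])[0]:.3f} N")
```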

The work is scheduled for publication in an academic journal later this month, but it has not yet been peer-reviewed.

Touch has been largely ignored by AI researchers because touch sensors have been too expensive or too flimsy to produce reliable data, Abhinav Gupta, a research scientist at Meta, said during a media call on Friday. “If you think of how humans or babies learn, rich multimodal data is quite critical for developing an understanding of the world,” Gupta said. “We are learning from pixels, sound, touch, taste, smell, and so on.”

“However, if you look at how AI has advanced in the last decade, you will notice that we have made huge advances in pixels (computer vision)…. We’ve also made strides in sound: audio, speech, and so on. However, despite its importance, touch has been omitted from this advancement.”

Giving machines and robot assistants the ability to feel will allow them to understand what humans are doing, according to Gupta.

“We can try to have a better understanding of the physics behind objects for the first time,” Gupta said, adding that this will aid Meta’s quest to build a metaverse. The metaverse is either the next evolution of the internet or the latest corporate buzzword designed to pique investors’ interest in some hazy innovation that may not even materialize in the next decade.

In any case, tech companies, particularly Meta, are increasingly promoting the concept of the metaverse, a virtual world in which you can live, work, and play. If you’ve seen the film “Ready Player One,” you’ll have a good idea of what the metaverse is: Put on a pair of computerized glasses and you’ll be transported to a digital world where anything is possible. If Meta’s metaverse ambitions are realized, it may be possible to interact with virtual objects and receive some sort of physical response from hardware.

“When wearing a Meta headset, you want some haptics to be provided so users can feel even richer experiences,” Gupta explained.

“How can you provide haptic feedback if you don’t know what kind of touch humans feel or what the material properties are?”