MIT CSAIL engineers have developed an intelligent carpet that can accurately assess a person’s movements or posture without a camera. The system could be useful for training feedback, monitoring falls, or tracking for VR and games.
The prototype mat measures 3.3 m² and contains over 9,000 sensors, each made from a pressure-sensitive film and conductive thread. In essence, as weight is applied to different parts of the carpet, different electrical signals are produced depending on how much pressure is applied, where on the mat it lands, and the relative positions of the pressure points.
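To make the sensing idea concrete, here is a minimal sketch of how raw grid readings could be turned into a normalized pressure map. The grid size, baseline subtraction, and normalization are assumptions for illustration (the article does not describe the prototype's readout); a 96×96 grid gives roughly the "over 9,000 sensors" figure.

```python
import numpy as np

GRID = 96  # assumed grid size: 96 * 96 = 9,216 taxels

def to_pressure_map(raw, baseline):
    """Subtract the unloaded baseline and normalize readings to [0, 1]."""
    frame = np.clip(raw.astype(float) - baseline, 0.0, None)
    peak = frame.max()
    return frame / peak if peak > 0 else frame

# Example: a single foot-shaped load on an otherwise empty mat.
baseline = np.zeros((GRID, GRID))
raw = np.zeros((GRID, GRID))
raw[40:52, 30:36] = 5.0  # stronger signal under one foot
pmap = to_pressure_map(raw, baseline)
print(pmap.max(), pmap.min())  # → 1.0 0.0
```

The normalized map is what a downstream model would consume, making the system robust to the absolute signal level of any one sensor.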
The system was initially trained on a synchronized combination of tactile input on the mat and corresponding video of people performing various actions such as walking, sit-ups, push-ups, yoga poses, sitting, lying, rolling or standing on their tiptoes.
The pressure maps of these actions were then matched to virtual models of a person performing them, which allowed the system to estimate a person's pose from the pressure data alone. Upper-body movements can also be derived fairly precisely – the system can, for example, use weight shifts to detect whether a person is leaning to the left or right.
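The pairing of pressure maps with recorded poses can be illustrated with a toy example. This is not the authors' model – they train a neural network – but a nearest-neighbor lookup over synthetic data shows the basic idea of estimating a pose from tactile input alone; the joint count and data shapes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_JOINTS = 21  # assumed skeleton size, for illustration only

# Fake "training set": flattened pressure maps paired with 3D joint positions
# that would have come from the synchronized camera.
train_maps = rng.random((50, 96 * 96))
train_poses = rng.random((50, N_JOINTS, 3))

def estimate_pose(pressure_map):
    """Return the 3D pose paired with the most similar training map."""
    dists = np.linalg.norm(train_maps - pressure_map.ravel(), axis=1)
    return train_poses[np.argmin(dists)]

pose = estimate_pose(train_maps[7].reshape(96, 96))
print(pose.shape)  # → (21, 3)
```

Once trained this way, the camera is no longer needed at inference time, which is what gives the system its privacy advantage.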
All in all, this means that the Smart Carpet can learn what a lunge looks like, for example, without camera input. The CSAIL team says the system was 97 percent accurate in identifying certain actions and was able to predict a person’s pose with an accuracy of 10 cm (3.9 inches).
“You could imagine using the carpet for training purposes,” says Yunzhu Li, co-author of the study. “Based on tactile information alone, it can recognize activity, count the number of repetitions and calculate the amount of calories burned.”
Other possible uses include monitoring the elderly for falls, assisting the injured in rehab, or tracking a player’s movements in VR or video games. It could be less cumbersome than portable trackers, easier to set up than infrared sensors, and more private than using cameras.
The team says the carpet is also easily scalable and relatively inexpensive – the prototype was built for less than $100. Next, the researchers want to find ways to extract more information from the signals, such as a user's height or weight, and to adapt the system to multiple users at the same time.
The research was published in the Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. The team demonstrates the smart carpet in the video below.
Smart Carpet: Estimating a person’s 3D pose using only tactile sensors