A new way to bring personal items to mixed reality
Think of your most prized belongings. In an increasingly virtual world, wouldn’t it be great to save a copy of that precious item and all the memories it holds?
In mixed-reality settings, you can create a digital twin of a physical item, such as an old doll. But it’s hard to replicate interactive elements, like the way it moves or the sounds it makes — the sorts of unique interactive features that made the toy distinct in the first place.
Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) sought to change that, and they have a potential solution. Their “InteRecon” program enables users to recapture real-world objects in a mobile app, and then animate them in mixed-reality environments.
The prototype can recreate an item’s physical-world interactions, such as the head motion of your favorite bobblehead, or playing a classic video on a digital version of your vintage TV. It makes digital surroundings more lifelike and personal while preserving a memory.
InteRecon’s ability to reconstruct the interactive experience of different items could make it a useful tool for teachers explaining important concepts, like demonstrating how gravity pulls an object down. It could also add a new visual component to museum exhibits, such as animating a painting or bringing a historical mannequin to life (without the scares of characters from “Night at the Museum”). Eventually, InteRecon may be able to teach a doctor’s apprentice organ surgery or a cosmetic procedure by visualizing each motion needed to complete the task.
The exciting potential of InteRecon comes from its ability to add motions or interactive functions to many different objects, according to CSAIL visiting researcher Zisu Li, lead author of a paper introducing the tool.
“While taking a picture or video is a great way to preserve a memory, those digital copies are static,” says Li, who is also a PhD student at the Hong Kong University of Science and Technology. “We found that users wanted to reconstruct personal items while preserving their interactivity to enrich their memories. With the power of mixed reality, InteRecon can make these memories live longer in virtual settings as interactive digital items.”
Li and her colleagues will present InteRecon at the 2025 ACM CHI Conference on Human Factors in Computing Systems.
Making a virtual world more realistic
To make digital interactivity possible, the team first developed an iPhone app. Using the phone’s camera, you scan the item three times, all the way around, to ensure it’s fully captured. The 3D model can then be imported into the InteRecon mixed-reality interface, where you mark (“segment”) individual areas to select which parts of the model will be interactive (like a doll’s arms, head, torso, and legs). Alternatively, InteRecon can segment the model automatically.
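The paper’s segmentation code isn’t reproduced here, but the marking step can be pictured as labeling each mesh vertex by the part the user tapped. Below is a minimal, hypothetical Python sketch (numpy only) using nearest-seed assignment; the function name `segment_by_seeds` and the seed positions are illustrative, not InteRecon’s actual API.

```python
import numpy as np

def segment_by_seeds(vertices: np.ndarray, seeds: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    """Assign every mesh vertex to the nearest user-marked seed point.

    vertices: (N, 3) array of mesh vertex positions.
    seeds:    part name -> (3,) position the user tapped on the model.
    Returns a mapping from part name to an array of vertex indices.
    """
    names = list(seeds)
    seed_pts = np.stack([seeds[n] for n in names])              # (K, 3)
    # Distance from every vertex to every seed, then pick the closest seed.
    d = np.linalg.norm(vertices[:, None, :] - seed_pts[None, :, :], axis=-1)
    owner = d.argmin(axis=1)                                    # (N,)
    return {n: np.where(owner == i)[0] for i, n in enumerate(names)}

# Example: label a scanned doll's vertices by four tapped points.
rng = np.random.default_rng(0)
verts = rng.normal(size=(5000, 3))                              # stand-in scan
parts = segment_by_seeds(verts, {
    "head":  np.array([0.0,  1.5, 0.0]),
    "torso": np.array([0.0,  0.0, 0.0]),
    "arms":  np.array([1.0,  0.5, 0.0]),
    "legs":  np.array([0.0, -1.5, 0.0]),
})
print({name: idx.size for name, idx in parts.items()})
```

Real systems typically segment along mesh connectivity or learned part boundaries rather than raw distance; nearest-seed labeling is just the simplest version of the idea.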
The InteRecon interface runs on a mixed-reality headset (such as the HoloLens 2 or Quest). Once your model is segmented, it lets you choose a programmable motion for each part of the item you want to animate.
Movement options are presented as motion demonstrations, allowing you to play around with them before deciding on one — say, a flopping motion that emulates how a bunny doll’s ears move. You can even pinch a specific part and explore different ways to animate it, like sliding, dangling, and pendulum-like turns.
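The article doesn’t say how these motion options are implemented internally. As a rough illustration of the “pendulum-like” choice, here is a damped small-angle pendulum driving a rotation of a segmented part in Python with numpy; the functions and the stand-in ear geometry are assumptions for the sketch, not the authors’ code.

```python
import numpy as np

def pendulum_angles(theta0_deg: float, length_m: float, t: np.ndarray,
                    damping: float = 0.3) -> np.ndarray:
    """Damped small-angle pendulum: theta(t) = theta0 * exp(-damping*t) * cos(omega*t)."""
    omega = np.sqrt(9.81 / length_m)      # natural frequency, rad/s
    theta0 = np.radians(theta0_deg)
    return theta0 * np.exp(-damping * t) * np.cos(omega * t)

def swing_part(verts: np.ndarray, pivot: np.ndarray, angle: float) -> np.ndarray:
    """Rotate a part's vertices about its pivot in the x-y plane by `angle` radians."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return (verts - pivot) @ R.T + pivot

# Preview two seconds of a bunny-ear swing at 30 fps before committing to it.
ear = np.array([[0.0, 0.0, 0.0], [0.0, 0.3, 0.0]])   # two-point stand-in for an ear
pivot = ear[0]                                        # ear attaches to the head here
t = np.linspace(0.0, 2.0, 60)
for angle in pendulum_angles(theta0_deg=25.0, length_m=0.12, t=t):
    frame = swing_part(ear, pivot, angle)             # posed vertices for this frame
```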
Your old iPod, digitized
The team showed that InteRecon can also recapture the interface of physical electronic devices, like a vintage TV. After making a digital copy of the item, you can customize the 3D model with different interfaces.
Users can play with example widgets from different interfaces before choosing one: a screen (either a TV display or a camera’s viewfinder), a rotating knob (for, say, adjusting the volume), an “on/off”-style button, and a slider (for changing settings on something like a DJ booth).
Li and colleagues demonstrated an application that recreates the interactivity of a vintage TV: they added virtual widgets such as an “on/off” button, a screen, and a channel switch to a TV model, and embedded old videos in it, bringing the model to life. You could also upload MP3 files and add a “play” button to a 3D model of an iPod to listen to your favorite songs in mixed reality.
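One way to picture the widget-to-media binding is as simple state plus callbacks. The following minimal Python sketch models a reconstructed TV whose power button and channel knob select which embedded clip the screen widget shows; the class and method names are hypothetical and do not come from InteRecon.

```python
from dataclasses import dataclass

@dataclass
class VirtualTV:
    """Toy state model of a reconstructed TV: an on/off button, a channel
    knob, and a list of embedded video files (one per channel)."""
    videos: list[str]
    powered: bool = False
    channel: int = 0

    def press_power(self) -> None:
        self.powered = not self.powered

    def turn_channel_knob(self, step: int) -> None:
        self.channel = (self.channel + step) % len(self.videos)

    def screen_source(self) -> str | None:
        # The screen widget plays the current channel's clip only when powered on.
        return self.videos[self.channel] if self.powered else None

tv = VirtualTV(videos=["birthday_1994.mp4", "road_trip_1998.mp4"])
tv.press_power()
tv.turn_channel_knob(+1)
print(tv.screen_source())   # -> road_trip_1998.mp4
```

An iPod-style “play” button would follow the same pattern, toggling playback state and pointing the screen widget at an uploaded MP3 instead of a video.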
The researchers believe InteRecon opens up intriguing new avenues in designing lifelike virtual environments. A user study confirmed that people from different fields share this enthusiasm, viewing it as easy to learn and diverse in its ability to express the richness of users’ memories.
“One thing I really appreciate is that the items that users remember are imperfect,” says Faraz Faruqi SM ’22, another author on the paper who is also a CSAIL affiliate and MIT PhD student in electrical engineering and computer science. “InteRecon brings those imperfections into mixed reality, accurately recreating what made a personal item like a teddy bear missing a few buttons so special.”
In a related study, users imagined how this technology could be applied to professional scenarios, from teaching medical students how to perform surgeries to helping travelers and researchers log their trips, and even assisting fashion designers in experimenting with materials.
Before InteRecon is used in more advanced settings, though, the team would like to upgrade its physics simulation engine to something more precise. This would enable applications such as helping a doctor’s apprentice learn the pinpoint accuracy needed for certain surgical maneuvers.
Li and Faruqi may also incorporate large language models and generative models that can recreate lost personal items as 3D models from language descriptions, and that can explain an interface’s features.
As for the researchers’ next steps, Li is working toward a more automatic and powerful pipeline that can make interactivity-preserved digital twins of larger physical environments in mixed reality for end users, such as a virtual office space. Faruqi is looking to build an approach that can physically recreate lost items via 3D printers.
“InteRecon represents an exciting new frontier in the field of mixed reality, going beyond mere visual replication to capture the unique interactivity of physical objects,” says Hanwang Zhang, an associate professor at Nanyang Technological University's College of Computing and Data Science, who wasn’t involved in the research. “This technology has the potential to revolutionize education, health care, and cultural exhibitions by bringing a new level of immersion and personal connection to virtual environments.”
Li and Faruqi wrote the paper with Hong Kong University of Science and Technology (HKUST) master’s student Jiawei Li, PhD student Shumeng Zhang, Associate Professor Xiaojuan Ma, and assistant professors Mingming Fan and Chen Liang, all of HKUST; ETH Zurich PhD student Zeyu Xiong; and Stefanie Mueller, the TIBCO Career Development Associate Professor in the MIT departments of Electrical Engineering and Computer Science and Mechanical Engineering, and leader of the HCI Engineering Group. Their work was supported by the APEX Lab of The Hong Kong University of Science and Technology (Guangzhou) in collaboration with the HCI Engineering Group.