Beyond pick-and-place, X-OOHRI exposes abstract robot actions via a radial menu after selecting a real-world object. Users then manipulate virtual twins to specify missing spatial parameters.
This can also support remote teleoperation 🎮
3/4
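For readers who like code: a minimal, hypothetical sketch of that interaction flow (not the paper's actual API) — select a real object, pick an abstract action from a radial menu, then place a virtual twin to fill in the action's missing spatial parameters. All names here are illustrative stand-ins.

```python
# Conceptual sketch only: hypothetical names, not X-OOHRI's real interface.
from dataclasses import dataclass, field

@dataclass
class Pose:
    position: tuple  # (x, y, z) in meters
    rotation: tuple  # quaternion (x, y, z, w)

@dataclass
class AbstractAction:
    name: str                                        # e.g. "place", "hand over"
    missing_params: list = field(default_factory=list)  # spatial params the user must still specify

def show_radial_menu(actions):
    """Stand-in for the AR radial menu; here we simply take the first action."""
    return actions[0]

def place_virtual_twin(object_id):
    """Stand-in for the user dragging the object's virtual twin to a target pose."""
    return Pose(position=(0.5, 0.0, 0.8), rotation=(0.0, 0.0, 0.0, 1.0))

def specify_action(object_id, capabilities):
    """Pick an abstract action, then resolve its missing spatial parameters via the twin."""
    action = show_radial_menu(capabilities)
    params = {name: place_virtual_twin(object_id) for name in action.missing_params}
    return action.name, params

if __name__ == "__main__":
    caps = [AbstractAction("place", missing_params=["target_pose"]),
            AbstractAction("hand over")]
    print(specify_action("mug_01", caps))
```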
Check out “Explainable OOHRI: Communicating Robot Capabilities and Limitations as AR Affordances” at #HRI2026 for more details!
🔗 Project page: https://xoohri.github.io
📄 Paper: https://arxiv.org/abs/2601.14587
Led by Lauren Wang, in collaboration with Mo Kari
@hciPrinceton HCI and
@PrincetonCS.
#HCI #AR #Robotics #HRI
4/4