Omidshafiei and Agha-mohammadi visualize robot thoughts


November 12, 2014

In a darkened, hangar-like space inside MIT’s Building 41, a small, Roomba-like robot is trying to make up its mind. Standing in its path is an obstacle — a human pedestrian who’s pacing back and forth. As the robot considers its options, its “thoughts” are projected on the ground: A large pink dot appears to follow the pedestrian — a symbol of the robot’s perception of the pedestrian’s position in space. Lines, each representing a possible route for the robot to take, radiate across the room in meandering patterns and colors, with a green line signifying the optimal route. The lines and dots shift and adjust as the pedestrian and the robot move.

This new visualization system combines ceiling-mounted projectors with motion-capture technology and animation software to project a robot's intentions in real time. The researchers have dubbed the system “measurable virtual reality” (MVR), a spin on conventional virtual reality that is designed to visualize a robot's “perceptions and understanding of the world,” says Ali-akbar Agha-mohammadi, a postdoc in MIT's Aerospace Controls Lab. The system was developed by Shayegan Omidshafiei, a graduate student, and Agha-mohammadi. Read more on MIT News.
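To make the description above concrete, here is a minimal Python sketch of the kind of loop such a system might run: take the pedestrian's pose from motion capture, form the robot's noisy estimate of it (the pink dot), score a handful of candidate routes, and hand the best one to the projector overlay. All function names, the straight-line-with-a-bow candidate paths, and the cost weights are illustrative assumptions, not the authors' actual MVR code.

    # Hypothetical sketch of an MVR-style projection loop; not the authors' implementation.
    import math
    import random
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float
        y: float

    def perceived_pedestrian(true_pose: Pose, noise_std: float = 0.05) -> Pose:
        """Robot's noisy estimate of the pedestrian's position (the 'pink dot')."""
        return Pose(true_pose.x + random.gauss(0, noise_std),
                    true_pose.y + random.gauss(0, noise_std))

    def candidate_paths(start: Pose, goal: Pose, n: int = 5) -> list[list[Pose]]:
        """Generate simple bowed candidate routes from start to goal."""
        paths = []
        for k in range(n):
            offset = (k - n // 2) * 0.5          # lateral bow of this candidate
            pts = []
            for t in (i / 20 for i in range(21)):
                x = start.x + t * (goal.x - start.x)
                y = start.y + t * (goal.y - start.y) + offset * math.sin(math.pi * t)
                pts.append(Pose(x, y))
            paths.append(pts)
        return paths

    def path_cost(path: list[Pose], obstacle: Pose, clearance: float = 0.6) -> float:
        """Penalize path length plus proximity to the perceived pedestrian."""
        length = sum(math.dist((a.x, a.y), (b.x, b.y)) for a, b in zip(path, path[1:]))
        penalty = sum(max(0.0, clearance - math.dist((p.x, p.y), (obstacle.x, obstacle.y)))
                      for p in path)
        return length + 10.0 * penalty

    def render_overlay(belief: Pose, paths: list[list[Pose]], best_idx: int) -> None:
        """Stand-in for sending the floor overlay to the ceiling projector."""
        print(f"pink dot at ({belief.x:.2f}, {belief.y:.2f})")
        for i in range(len(paths)):
            color = "green" if i == best_idx else "gray"
            print(f"path {i}: drawn in {color}")

    if __name__ == "__main__":
        robot, goal = Pose(0, 0), Pose(5, 0)
        pedestrian = Pose(2.5, 0.2)                 # pose reported by motion capture
        belief = perceived_pedestrian(pedestrian)   # robot's estimate of that pose
        paths = candidate_paths(robot, goal)
        costs = [path_cost(p, belief) for p in paths]
        best = min(range(len(paths)), key=costs.__getitem__)
        render_overlay(belief, paths, best)

In a real deployment the overlay would be redrawn every frame as the motion-capture poses update, which is what makes the projected dots and lines appear to follow the pedestrian and the robot.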
