Imagine trying to develop an autonomous robot that kept making the same mistake, but you couldn't tell why. Ultimately, you might end up reviewing every line of code in its programming, hoping that the error would simply jump out at you. To avoid such scenarios, MIT's Ali-akbar Agha-mohammadi and Shayegan Omidshafiei have created a system known as measurable virtual reality (MVR). It projects a robot's perceptions and intentions onto the floor, so that designers can see what it's thinking.
The system incorporates a ceiling-mounted projection system and multiple motion-capture cameras. Those cameras can simultaneously track multiple small robots in three-dimensional space, using markers placed on the robots.
As the robots encounter things like obstacles or targets, software that duplicates the robots' programming determines the possible routes available to them. It then renders those choices as animation, which is projected onto the floor in real time, at the corresponding physical locations.
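The core of that projection step is a mapping from the robot's world coordinates onto projector pixels. The sketch below is a minimal illustration of how such a mapping might work, assuming a simple pre-calibrated scale-and-offset relationship between the floor and the projector image; the function names and calibration values are illustrative assumptions, not details of the MIT implementation.

```python
# Illustrative sketch: mapping a robot's candidate route from floor
# coordinates (metres) to projector pixel coordinates, so the route can
# be drawn on the floor beneath the robot. Calibration values are
# assumed, not taken from the actual MVR system.

def world_to_projector(x, y, scale=200.0, offset=(960, 540)):
    """Map a floor position in metres to a projector pixel.

    Assumes the projector is calibrated so that 1 m on the floor spans
    `scale` pixels and the room origin lands at pixel `offset`.
    """
    px = offset[0] + x * scale
    py = offset[1] - y * scale  # pixel y grows downward
    return (round(px), round(py))

def render_path(waypoints):
    """Convert one candidate route (world coords) into a projected polyline."""
    return [world_to_projector(x, y) for x, y in waypoints]

# One candidate route the planner might consider: straight ahead,
# then veering around an obstacle.
route = [(0.0, 0.0), (1.0, 0.0), (1.5, 0.5)]
print(render_path(route))  # [(960, 540), (1160, 540), (1260, 440)]
```

A real system would also need to correct for the projector's perspective (typically with a homography rather than a pure scale-and-offset), but the principle is the same: every quantity the robot reasons about in world coordinates is re-expressed in projector coordinates and drawn at the matching spot on the floor.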
Along with possible routes, the system also displays variables such as the perceived locations of objects and the proximity of other robots. By seeing the robot's perception of the environment superimposed over the real thing, designers can make sense of behaviors that previously seemed nonsensical, and then correct the underlying problem more quickly and easily.
Additionally, MVR could allow for the testing of drones when safety regulations forbid such testing in the real world. Already, the scientists have used the system to evaluate forest-fire-detecting quadcopters, by having them fly over a projected aerial view of a forest while looking for computer-generated fires. They also believe that it could prove useful for developing delivery drones, allowing the actual aircraft to fly over projected urban environments.
The technology can be seen in use in the following video.