Last week, Orbital Sciences' second commercial resupply mission delivered two Project Tango Google smartphones to the International Space Station. The sensor-filled phones will be used to create a detailed 3D map of the spacecraft, which will then help two soccer ball-sized, free-flying satellites autonomously navigate through some very tight spots.
Back in 2006, three Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES) free-flying robots boarded the ISS to perform experiments on satellite servicing, vehicle assembly and flying formations. Each satellite is only about the size of a soccer ball, but it is still able to autonomously and precisely navigate its surroundings in micro-g environments.
Autonomous navigation is important when you consider that relaying instructions from the ground involves delays on the order of several seconds. So NASA is using these robots as a testbed for visually guided autonomous robotic exploration, hoping that its future robots will be able to make good use of the SPHERES technology.
Crucially, each robot is also able to interface with a wide range of electronics, from stereoscopic goggles to smartphones. For instance, in 2011 a Nexus S smartphone was connected to one of the satellites to test vision-based navigation on a spacecraft of such a small size.
With its latest resupply mission, NASA has now sent the astronauts two Project Tango smartphones that will put the two remaining SPHERES to good use. The prototype phone under development by Google will interface with the robots and expand their capabilities through the addition of cameras, special-purpose computer vision processors, and depth and motion tracking sensors.
The experiment will take place in two separate phases. First, the astronauts aboard the ISS will manually carry the Project Tango smartphones around the station, creating a detailed 3D map of the interior of the spacecraft thanks to the advanced sensors embedded in the phones.
Then, the astronauts will connect the smartphones to the SPHERES themselves, turning them into what NASA calls "Smart SPHERES." The microsatellites will therefore be able to use the 3D map to autonomously navigate their environment, providing situational awareness to crewmembers inside the station and flight controllers in mission control.
If all goes well, these devices may one day be employed aboard the ISS to perform "housekeeping" tasks, using their sensors to monitor air quality and noise levels, carrying out other automated safety checks, and relaying the data to the flight computers via a Wi-Fi connection, which would save the astronauts some precious time.
"NASA uses robots for research and mission operations; just think about the rovers on Mars or the robotic arm on the ISS or space shuttle," says Smart SPHERES project lead Chris Provencher. "Inside the ISS space is limited, so it’s really exciting to see technology has advanced enough for us to demonstrate the use of small, mobile robots to enhance future exploration missions."