A far cry in terms of both size and capability from the “bricks” of just over a decade ago, the smartphones of today are virtual offices and entertainment arcades that fit in your pocket. As we reported last month, America’s Department of Homeland Security is examining whether the ability to detect dangerous airborne chemicals should be the next function that mobile phones add to their ever-expanding utility belts. Researchers at the University of California, San Diego (UCSD) have now begun work on a prototype sensor that could help map airborne toxins in real time.
Working in collaboration with San Diego startup Rhevision, Inc., the UCSD research team, headed by Michael Sailor, professor of chemistry and biochemistry, has already completed the first phase of development of the toxin-detecting sensor: a porous flake of silicon that changes color when it interacts with specific chemicals. By manipulating the shape of the pores, the researchers can tune individual spots on the silicon flake to respond to specific chemical traits. A megapixel camera smaller than the head of a pencil eraser captures the image from the array of nanopores in Sailor's chip.
"It works a little like our nose," Sailor said. "We have a set of sensory cells that detect specific chemical properties. It's the pattern of activation across the array of sensors that the brain recognizes as a particular smell. In the same way, the pattern of color changes across the surface of the chip will reveal the identity of the chemical."
Already their chips can distinguish between methyl salicylate, a compound used to simulate the chemical warfare agent mustard gas, and toluene, a common additive in gasoline. Potentially, they could discriminate among hundreds of different compounds and recognize which might be harmful.
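The identification step Sailor describes – matching the pattern of color changes against known compounds – can be sketched as a simple nearest-pattern lookup. The response vectors below are made-up placeholders; a real system would use calibrated fingerprints measured from the chip.

```python
import math

# Hypothetical reference "fingerprints": normalized color-shift responses
# at a handful of tuned spots on the porous-silicon chip. These values are
# illustrative only, not measured data.
REFERENCE_PATTERNS = {
    "methyl salicylate": [0.9, 0.1, 0.7, 0.2],
    "toluene":           [0.2, 0.8, 0.1, 0.6],
}

def identify(observed):
    """Return the compound whose reference pattern is nearest (Euclidean)."""
    def distance(ref):
        return math.sqrt(sum((o - r) ** 2 for o, r in zip(observed, ref)))
    return min(REFERENCE_PATTERNS, key=lambda name: distance(REFERENCE_PATTERNS[name]))

# A reading resembling the methyl salicylate fingerprint:
print(identify([0.85, 0.15, 0.65, 0.25]))  # methyl salicylate
```

In practice, discriminating among hundreds of compounds would call for a larger pattern library and a more robust classifier, but the principle – the whole pattern, not any single spot, names the chemical – is the same.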
To focus on the fine-scale detail in their optical array, the team uses a new kind of supermacro lens that works more like an animal's eye than a camera lens – similar technology to the FluidFocus lens unveiled by Philips at CeBIT back in 2004. The new lens, developed by Rhevision, uses fluid rather than bulky moving parts to change its shape, and therefore its focus.
"The beauty of this technology is that the number of sensors contained in one of our arrays is determined by the pixel resolution of the cell phone camera. With the megapixel resolution found in cell phone cameras today, we can easily probe a million different spots on our silicon sensor simultaneously. So we don't need to wire up a million individual sensors," Sailor said. "We only need one. This greatly simplifies the manufacturing process because it allows us to piggyback on all the technology development that has gone into making cell phone cameras lighter, smaller, and cheaper."
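Sailor's point about piggybacking on the camera can be illustrated with a toy sketch: treat the camera frame as a grid of color values, with each pixel doubling as one sensor readout, and difference it against a baseline frame taken in clean air. The grid representation here is an assumption for illustration; a real readout would work on full-resolution color images.

```python
def color_shift(baseline, frame):
    """Per-pixel change between a clean-air baseline frame and the current
    frame. One image read-out yields every sensor spot's response at once,
    with no individual wiring per spot."""
    return [
        [frame[y][x] - baseline[y][x] for x in range(len(frame[0]))]
        for y in range(len(frame))
    ]

baseline = [[10, 10], [10, 10]]      # toy 2x2 stand-in for a megapixel frame
exposed  = [[14, 10], [10, 25]]      # two spots responded to a vapor
print(color_shift(baseline, exposed))  # [[4, 0], [0, 15]]
```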
The team is working on sensitivity to additional chemicals. One of the top priorities for emergency responders is carbon monoxide, a deadly gas that firefighters can't smell in the midst of a sooty fire. Sensors on their masks could let them know when to switch to self-contained breathing devices, Sailor said. Similar sensors might warn miners of the buildup of explosive gases.