
Sony announces camera sensor chips with built-in AI processing

[Image captions: Sony's new AI-enabled camera sensors will be able to analyze footage in real time, making surveillance and security systems faster, more efficient and vastly less bandwidth-intensive · Real-time tracking of supermarket transactions · The AI-equipped sensors can strip the important parts out of an image and send only those on for further processing · The IMX500 and IMX501 chips have AI deep learning and processing capabilities built into the back of the chip. Credit: Sony]

Sony has shown off what it's calling "the world's first image sensors to be equipped with AI processing functionality." These new sensors handle AI image analysis on board, so only the necessary data can be sent for further cloud processing.

Artificial intelligence is a natural fit for digital video cameras. They take in monstrous amounts of data, the vast majority of which is of no interest to anybody, particularly when you're talking about things like security cameras.

As automation continues to escalate, we're going to need AI to keep an eye on more and more camera feeds. Checkout-free stores in the vein of Amazon's minimally staffed Go shops must constantly monitor dozens, if not hundreds, of cameras to figure out who's picking up what, and what they're doing with it. Current technology puts the camera sensor in one place and the AI computer in another. Sometimes that's in the same box, sometimes it's in the same building, and sometimes it's in the marvelous cloud, where lots of distributed computers can all work on it at once.

Sony has decided to put it right there on the back of the camera sensor. Its new IMX500 and IMX501 sensors are 12.3-megapixel, 4K/60-fps image sensors with a built-in AI logic chip and processing memory on the flip side right next to the image sensor operation circuits.

The sensor will be able to analyze footage as it's captured, in as little as 3.1 milliseconds (less than the duration of a single video frame), spitting out only the important bits to send further up the chain. Customer X picked up product Y and put it in a basket – plus cropped-in captures of the item, the basket and the customer if necessary. These can be wrapped up into tiny parcels of data and sent onward using just a tiny percentage of the bandwidth you'd need to move a full 4K/60-fps video stream.

Thus, your key systems get just the information they need, and they get it faster, and the whole operation uses less energy because the data is pre-processed right where it's captured. You can load whatever AI models you like into the chip's onboard memory.
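To get a feel for the bandwidth savings, here's a rough back-of-the-envelope sketch comparing an uncompressed 4K/60-fps stream with the kind of small metadata events an on-sensor AI chip might emit instead. Everything here – the event format, field names, and event rate – is an illustrative assumption, not Sony's actual output format:

```python
import json

def raw_stream_bytes_per_sec(width=3840, height=2160, fps=60, bytes_per_pixel=1.5):
    """Uncompressed 4:2:0 video: roughly 1.5 bytes per pixel per frame."""
    return int(width * height * bytes_per_pixel * fps)

def metadata_bytes_per_sec(events_per_sec=10):
    """Hypothetical: the AI chip emits small JSON events instead of pixels."""
    sample_event = {
        "label": "person_picked_item",   # illustrative event type
        "item_id": "Y",
        "customer_id": "X",
        "bbox": [512, 300, 640, 480],    # cropped region of interest
        "confidence": 0.93,
    }
    return len(json.dumps(sample_event).encode()) * events_per_sec

raw = raw_stream_bytes_per_sec()   # ~746 MB/s uncompressed
meta = metadata_bytes_per_sec()    # on the order of 1 KB/s of event metadata
print(f"metadata uses {meta / raw:.8f} of the raw-stream bandwidth")
```

Even a generous metadata rate is several orders of magnitude smaller than the raw pixel stream, which is the whole argument for processing at the sensor.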

This kind of sensor is unlikely to replace the one in your mobile phone, or get rolled into the next Alpha-series camera body. It's much more geared towards commercial use, where intelligent cameras can be used to count attendance, or watch traffic, or work out who's wearing COVID masks, or whatever the use case might be. Check out a video below.

Source: Sony
