Tiltrotortech
If they link three or more chips together and use only the most common output, they should be able to compensate for most of the 'mistakes' while still being more efficient than a single exact chip (if their efficiency numbers are right).
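A minimal sketch of that majority-vote idea, assuming the redundant inexact chips can be modeled as a function that occasionally returns a slightly wrong answer (the 5% error rate and the squaring operation here are made up for illustration):

```python
# Several inexact "chips" run the same computation; we keep whichever
# output occurs most often, masking isolated mistakes.
import random
from collections import Counter

def inexact_chip(x):
    """Hypothetical inexact multiplier: usually right, occasionally off by one."""
    result = x * x
    if random.random() < 0.05:   # made-up 5% error rate
        result += random.choice([-1, 1])
    return result

def voted_result(x, copies=3):
    """Run the same input through several inexact chips and keep the
    most common output."""
    outputs = [inexact_chip(x) for _ in range(copies)]
    value, _count = Counter(outputs).most_common(1)[0]
    return value

print(voted_result(12))   # almost always prints 144
```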
Imran Sheikh
So, finally a turbo for the processor :) Doesn't this mean that inexact processing (15x faster) could be achieved on a regular processor by patching it at the BIOS level and blocking off its less-used sections? The benefit would be that it's reversible.
Dave B13
It seemed like a horrible idea at first, then I ran into the use for photos above. Photos seemed like a good idea until I studied the pictures. I'm very surprised that I can tell the 0.54 percent error image apart from the no-error picture, and the difference in appearance at 7.58 percent error is not an acceptable degradation. From this I can imagine using it for video, and I think it would be great for robot vision, but I can't think of any other application where I'd want a computer part dumbed down for "efficiency".
PrometheusGoneWild.com
Now if they could just make a regular chip that can decide to prune itself on request. This would allow the programmer to decide when to dumb things down for specific processes... For example, when monitoring a security camera, the chip could dumb itself down until it sees movement and then kick into full-accuracy, high-energy-use mode.
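A rough sketch of that adaptive scheme, assuming frames arrive as flat lists of pixel brightness values and that a cheap sampled comparison stands in for the pruned, low-power mode (the threshold and frame format are invented):

```python
# Stay in a cheap, low-power mode and only switch to the exact,
# high-energy path when something actually changes.
def frame_changed(prev, curr, threshold=10):
    """Cheap motion check: compare only a handful of sampled pixels,
    the kind of sloppy test a pruned chip could run constantly."""
    step = max(1, len(curr) // 16)
    return sum(abs(curr[i] - prev[i]) for i in range(0, len(curr), step)) > threshold

def analyze_exact(frame):
    """Stand-in for the full-accuracy, power-hungry analysis."""
    print("movement detected, running exact analysis")

def monitor(frames):
    prev = frames[0]
    for curr in frames[1:]:
        if frame_changed(prev, curr):
            analyze_exact(curr)
        # otherwise remain in the dumbed-down mode and do nothing costly
        prev = curr

# toy frames: flat lists of pixel brightness values
monitor([[0] * 64, [0] * 64, [0] * 32 + [200] * 32])
```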
Iván Imhof
Well, I'm not impressed. The third image is terrible, and even at first glance the middle one has less detail, is less sharp, and has more washed-out colors.
IT in general is already loaded with bugs; you can't find any device or software that does its job properly and flawlessly. What is more frustrating is that usually nobody knows exactly what causes the bugs or why things don't work the way they should, and there are only trial-and-error ways of fixing them.
Why would we want more errors in exchange for some more speed? We have already accepted that there is no guarantee for any software (read the EULA), and I have no doubt that companies will raise their "error tolerance" level to make use of such chips, so we will experience more bugs, but at least they will come faster. :) Great perspective!
cachurro
I think you guys are not getting the picture. Today our computers have many 'processors', so adding a couple of inexact cores to a CPU could let it increase efficiency on demand by shutting down some of the exact ones, while keeping the OS and any program that requires it running on an exact core. So don't panic. Another task for this kind of inexact computing is solving non-polynomial problems that need heuristic approaches, like finding the best route for mail delivery, managing air traffic, inferring phylogeny, etc. These approaches usually consist of many educated trials and errors while keeping the best solution found so far, so one mistake will not be significant, and you can always re-check the best solutions. Oh, and these problems use a huge amount of computing power, so this is not a minor breakthrough!
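A minimal sketch of that "many cheap trials, exact re-check" pattern on a toy route-finding problem, where a noisy scoring function stands in for an inexact core (the city coordinates and 5% error figure are made up):

```python
import random

CITIES = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]

def exact_length(route):
    """Exact total distance of the closed route (what an exact core computes)."""
    return sum(((CITIES[a][0] - CITIES[b][0]) ** 2 +
                (CITIES[a][1] - CITIES[b][1]) ** 2) ** 0.5
               for a, b in zip(route, route[1:] + route[:1]))

def noisy_length(route, error=0.05):
    """Same distance, but with up to ~5% error, like a pruned chip might give."""
    return exact_length(route) * random.uniform(1 - error, 1 + error)

def heuristic_search(trials=2000, keep=5):
    """Score many random routes with the cheap noisy function, then
    re-check only the few best candidates exactly."""
    candidates = [(noisy_length(r), r)
                  for r in (random.sample(range(len(CITIES)), len(CITIES))
                            for _ in range(trials))]
    best_few = sorted(candidates)[:keep]
    return min((r for _score, r in best_few), key=exact_length)

best = heuristic_search()
print(best, exact_length(best))
```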
Joel Detrow
Dave, those are frames rendered from video, so yeah, that's the idea. Obviously photos wouldn't necessarily be allowed to make so many mistakes. One thing I wonder about is the resolution of the video; I'm pretty sure the 7.58% error would be greatly muted in higher-resolution video such as 4K or 8K.
I can imagine graphics processing exploding once again with the use of this kind of pruning: enthusiasts who really care about image quality can set it so every pixel is exactly as it should be, but folks who don't care if some pixels are a few hair-shades off could play at substantially higher resolutions and framerates. I bet games of the future will have an option to set the percent error in addition to all the other graphics options we get now. Very cool!
Dan Stillings
Somehow, I am reminded of Rosie on the Jetsons... :-)
Gregg Eshelman
To err is human. To really screw things up requires a computer.
And here we have the proof: processing chips designed to make errors.
VoiceofReason
@Dave B13 I can think of many applications. Like Dennis said, a device could have several chips inside and only use the high-powered chip when it needs to. There could also be settings on the device. Some people would gladly accept a 7.58 percent loss if it meant charging the device once a month instead of every few days.
Reread the article. You fixated on the most heavily pruned chips. You mean to tell us that you wouldn't accept a device that only varies by a quarter of a percent in exchange for 3.5 times less power usage?