Finally, video-gamers whose parents told them that playing games would never help them get a job can point to research that proves them wrong – well, sort of, anyway. A new study by a University of Warwick researcher has demonstrated that scientists trying to model a range of processes could ‘borrow’ an Xbox chip to get all the power and capabilities they need, saving thousands of dollars on parallel-processing hardware and countless man-hours.

Dr Simon Scarle, a researcher in the University of Warwick’s WMG Digital Laboratory, is studying abnormal electrical activity in the heart which can lead to heart attacks. He needed to conduct simulations of how electrical excitations in the heart moved around damaged cardiac cells in order to investigate or even predict cardiac arrhythmias. However, to conduct these simulations using traditional CPU-based processing, scientists would normally need to book time on a dedicated parallel processing computer or spend thousands of dollars on a parallel network of PCs.
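Simulations of this kind typically treat heart tissue as an excitable medium described by a reaction-diffusion model, solved step by step on a grid – the sort of uniform, per-cell arithmetic that parallel hardware handles well. As a rough illustration only (this is not Scarle's code; the Barkley model and all parameters here are assumptions chosen for simplicity), a minimal excitable-media simulation might look like this:

```python
import numpy as np

def laplacian(f):
    """Five-point Laplacian with periodic boundaries, unit grid spacing."""
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0)
            + np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)

def step(u, v, a=0.75, b=0.06, eps=0.06, dt=0.01, D=1.0):
    """One explicit-Euler step of the Barkley excitable-media model:
    fast activator u (electrical excitation), slow variable v (recovery)."""
    du = u * (1.0 - u) * (u - (v + b) / a) / eps + D * laplacian(u)
    dv = u - v
    return u + dt * du, v + dt * dv

# Quiescent 64x64 sheet of "tissue" with a small excited patch as a stimulus
n = 64
u, v = np.zeros((n, n)), np.zeros((n, n))
seed = np.zeros((n, n), dtype=bool)
seed[28:36, 28:36] = True
u[seed] = 1.0

spread = False
for _ in range(500):
    u, v = step(u, v)
    if (u[~seed] > 0.5).any():   # excitation has reached tissue outside the seed
        spread = True

print("wave propagated beyond the stimulus:", spread)
```

Every grid cell updates from only its immediate neighbours, which is why the same computation maps naturally onto the hundreds of parallel shader units in a GPU.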

But Dr Scarle put his background as a software engineer at the Warwickshire firm Rare Ltd, part of Microsoft Games Studios, to good use. His time there made him wise to the parallel processing power of the Graphics Processing Unit (GPU) in the Xbox 360. For the cost of a few hundred pounds, he was able to conduct much the same scientific modeling as several thousand pounds' worth of networked PCs.

The results of his work have just been published in the journal Computational Biology and Chemistry under the title of ‘Implications of the Turing completeness of reaction-diffusion models, informed by GPGPU simulations on an Xbox 360: Cardiac arrhythmias, re-entry and the Halting problem’ (phew, I’ve written shorter essays than that title).

“This is a highly effective way of carrying out high-end parallel computing on ‘domestic’ hardware for cardiac simulations. Although major reworking of any previous code framework is required, the Xbox 360 is a very easy platform to develop for and this cost can easily be outweighed by the benefits in gained computational power and speed, as well as the relative ease of visualization of the system,” Scarle said.

However, his research does have some bad news for a group of cardiac researchers. Scarle's study demonstrates that no one can, in general, predict the onset of certain dangerous arrhythmias, as he has shown that cardiac cell models are subject to a fundamental limitation of computational systems known as the Halting problem.
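The intuition is that if a model of excitable tissue is Turing complete, then asking “will this pattern of activity eventually die out?” is equivalent to asking whether a program halts – undecidable in general, so the only universal strategy is to run the simulation and watch. A toy illustration of that “run it to find out” character (again, not the model from the paper – this is a standard Greenberg–Hastings excitable automaton, chosen here purely for illustration):

```python
import numpy as np

def gh_step(grid, n_states=3):
    """One step of a Greenberg-Hastings excitable automaton.
    State 0 = resting, 1 = excited, 2 = refractory (periodic boundaries)."""
    excited_nbr = sum(np.roll(grid == 1, s, ax)
                      for s in (1, -1) for ax in (0, 1)) > 0
    nxt = np.where(grid > 0, (grid + 1) % n_states, 0)  # excited -> refractory -> rest
    return np.where((grid == 0) & excited_nbr, 1, nxt)  # resting cells next to excitement fire

grid = np.zeros((40, 40), dtype=int)
grid[20, 20] = 1                        # a single stimulated cell

# There is no shortcut for answering "does the activity ever stop?" --
# we simply iterate and check at every step.
died_at = None
for t in range(200):
    grid = gh_step(grid)
    if not (grid == 1).any():
        died_at = t + 1
        break

print("activity extinguished at step:", died_at)
```

For this particular seed the wave sweeps the grid and annihilates itself, but for other initial patterns activity can circulate indefinitely – and the Halting problem says no general-purpose test can sort one case from the other without simulating.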
