During the summer months, a PC running in a small game room or office can quickly turn a warm day into something far more uncomfortable. Power draw produces heat, and oddly enough it's mobile CPUs that may end up reducing the heat output of your desktop chip.
With Develop running in Brighton this week, PCGamesN have reported that Boyd Multerer, lead engineer of Xbox Live and Xbox One, spoke about how mobile phone processors are now the main focus of hardware engineers, with the results of that research slowly being incorporated into desktop PCs.
"Mobile phones drive the research of CPUs... the market is phones," said Multerer during his Develop talk. "Who cares about a couple of hundred million PCs anymore? This is about billions of phones." He raises a valid point. Most hardware engineers, he claims, are chasing a competitive edge by developing chips that draw less power for battery-starved mobile phones, and that focus is having a knock-on effect on servers - typically known for serious cooling needs driven by power draw - and on optimised CPU instructions.
"The phone industry says I can have 16 CPUs so that when you want to run a complicated app I can go into high power mode and I'll have all of them working at the same time and work really fast. As soon as that app goes to the background, 14 of those CPUs get turned off and sit there drawing zero power while the next app is in front."
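The power-gating scheme Multerer describes can be sketched as a toy model. The core counts match his example, but the wattage figures below are entirely made up for illustration; real per-core draw varies by chip and clock state.

```python
# Toy model of per-core power gating: a foreground app wakes all
# cores; backgrounding it parks most of them, and parked cores draw
# (approximately) zero power. Wattages are hypothetical.
TOTAL_CORES = 16
ACTIVE_WATTS = 1.5   # assumed draw per active core (illustrative)
PARKED_WATTS = 0.0   # gated cores sit there drawing zero power

def power_draw(active_cores):
    """Total package draw for a given number of active cores."""
    parked = TOTAL_CORES - active_cores
    return active_cores * ACTIVE_WATTS + parked * PARKED_WATTS

print(power_draw(16))  # foreground app: all 16 cores working
print(power_draw(2))   # app backgrounded: 14 cores turned off
```

The point of the model is the asymmetry: peak performance is available on demand, but the idle-state cost collapses to almost nothing, which is exactly what a battery-powered phone needs.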
Speaking about multi-core processing specifically, Multerer notes that Adobe "have done the work to get it running across all the cores and getting them all going at 100%", while video game developers have rarely put in the same time and effort, commonly leading to massive under-utilisation of current chips and a CPU that runs far less efficiently than it could.
Since the dawn of multi-core chips we've been led to understand that software rarely gets the most out of our chips despite yearly iterations gaining higher clock-speeds, cores and threads. Will the mobile market ever cool down enough to allow better desktop CPU utilization? Multerer says "it's not going to happen in the next ten years."