An idea strikes me - currently, we (as a species) do a lot of our heating - of houses &c. - by directly converting electricity into heat.
It would be possible to add an intermediate stage - a computational one. We do this a lot already, of course; the energy used to power your PC ultimately ends up as heat. Each PC is, in effect, a - oh, let's call it 250W - electric heater.
But suppose we used computers as our heaters instead? Instead of having an electric fire on the wall, imagine a bunch of processors, some rudimentary architecture (no video or human-computer interface, a small amount of local storage), and a network connection, all bundled up into a small box. Allow that box to connect to things like, say, SETI@home, or the Beeb's climate change predictor thing - any of the big distributed-computing projects - and we'd get both surplus computational power and heat, rather than just converting the energy straight to heat and nothing else.
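For the curious, here's roughly what I mean as a toy Python sketch - everything in it (the sensor read, the "work unit") is a stand-in for whatever the real box and a SETI@home-style client would actually provide. The box crunches numbers whenever the room is below the thermostat setpoint, and idles otherwise, so heat output tracks demand:

# Toy sketch: compute-as-heating. The box burns CPU (and so makes heat)
# only while the room is below the setpoint.
import random
import time

SETPOINT_C = 20.0       # target room temperature
CHECK_EVERY_S = 30      # how often to consult the "thermostat"

def read_room_temp():
    # Stand-in for a real temperature sensor on the box.
    return 18.0 + random.uniform(-1.0, 3.0)

def crunch_work_unit(seconds):
    # Stand-in for a real work unit fetched from a distributed-computing
    # project; here it just spins the CPU for roughly `seconds`.
    end = time.time() + seconds
    x = 1
    while time.time() < end:
        x = (x * 6364136223846793005 + 1442695040888963407) % 2**64
    return x

def run_heater_box():
    while True:
        if read_room_temp() < SETPOINT_C:
            crunch_work_unit(CHECK_EVERY_S)   # too cold: compute flat out
        else:
            time.sleep(CHECK_EVERY_S)         # warm enough: sit idle

if __name__ == "__main__":
    run_heater_box()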
Drawbacks, as far as I can see: you'd need some way of connecting each box to the home's internet connection. You'd need to do some work to make sure the major projects could talk to each box. You'd have security issues in ensuring that only "legit" projects could control the boxes. And the initial cost would be higher than for a wasteful electric heater (though perhaps not hugely so, especially if they were being mass-produced).
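On the "legit projects only" point, one way it might be handled - purely a sketch, with a made-up project name and the third-party Python `cryptography` package doing the signature maths - is for each box to ship with an allow-list of trusted project public keys and refuse any work unit that isn't signed by one of them:

# Sketch: a box only runs work units signed by a project on its allow-list.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def work_unit_is_trusted(trusted_keys, project, payload, signature):
    # Accept a work unit only if it verifies against a known project's key.
    key = trusted_keys.get(project)
    if key is None:
        return False
    try:
        key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    # Throwaway key pair standing in for a real project's signing key.
    project_private = Ed25519PrivateKey.generate()
    trusted = {"climateprediction": project_private.public_key()}
    payload = b"work unit #42: crunch these numbers"
    sig = project_private.sign(payload)
    print(work_unit_is_trusted(trusted, "climateprediction", payload, sig))        # True
    print(work_unit_is_trusted(trusted, "climateprediction", payload, bytes(64)))  # False

A real design would also need key rotation, revocation, and some story for how new projects get onto the list, but the basic shape seems workable.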
Chip-manufacturers get a big win, spare computing capacity goes through the roof, and the world, if not exactly saved, isn't particularly worse off than it will be anyway.
Anyone got any thoughts, plusses I haven't seen or more crippling obstacles?
(or a link to whichever bastard has got there first!)