Thursday, August 03, 2006

Crystal Ball

Moonbase is making some prognostications about the future of programming over the next ten years. At the top of the list is that energy consumption will come to dominate computing. Here's the energy list:

  • The notions of optimizing for speed and optimizing for power consumption will begin to converge.
  • Energy efficiency will become a selling point for all “consumer” computing devices (not just battery-dependent ones).
  • We will see the beginnings of hardware and software infrastructure for energy accounting similar to that available for CPU time. This includes profilers.
  • Fine-grained billing for use of computer resources will make a little bit of a comeback. People will still prefer flat rates.
  • “Ubiquitous” computing will become popular in developed regions, but energy economies of scale (and a desire to avoid contact burns from high-powered portable devices) will be a selection pressure towards a network of just-dumb-enough nodes and centralized computers.
  • Many things will turn out to be cheaper to do than to simulate.
Lots of interesting points here, particularly programming for power consumption. As a reasonably handy programmer myself, I wondered how this could actually happen: how would one write code that is more energy efficient? At first glance it seemed like overkill, but then I came across the green wifi project and saw how it could be useful. Green wifi aims to build solar-powered wireless nodes, and part of the work is programming the router to minimize power consumption. This makes obvious sense; since the router runs on solar power, it has to be very judicious about when it is on and off. There could be some interesting logic here, perhaps coupled with a neural network to determine optimal on/off times. I could see some strange behavior emerging, like dropping connected users to conserve power for later, when the router predicts more users will be connected. Exciting stuff.
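To make the idea concrete, here is a minimal sketch of what that kind of power-aware on/off logic might look like. Everything here is an assumption for illustration: the function name, thresholds, and demand forecast are all hypothetical, not anything from the actual green wifi project.

```python
# Hypothetical sketch of a duty-cycle policy for a solar-powered wifi node.
# All names and thresholds are illustrative assumptions, not green wifi's code.

def should_power_radio(battery_pct, predicted_users,
                       min_battery_pct=20, peak_reserve_pct=40):
    """Decide whether to keep the radio on for the current time slot.

    battery_pct     -- current charge level (0-100)
    predicted_users -- forecast number of users for this slot
    """
    if battery_pct < min_battery_pct:
        return False   # protect the battery: always sleep below the floor
    if predicted_users == 0:
        return False   # nobody expected: sleep to save charge
    if battery_pct < peak_reserve_pct and predicted_users < 3:
        return False   # charge is low: reserve energy for busier slots
    return True        # enough charge and enough demand: stay on

# Example: plan a day's on/off schedule from hourly demand forecasts.
forecast = [0, 0, 0, 0, 0, 1, 3, 8, 10, 9, 6, 4,
            5, 6, 7, 9, 11, 12, 10, 7, 4, 2, 1, 0]
battery = 55
schedule = [should_power_radio(battery, users) for users in forecast]
```

The third rule is the "drop users now to serve more users later" trade-off from the paragraph above: when charge dips below the reserve threshold, the node sacrifices low-demand slots so it still has power when the forecast peaks.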