
In days of old, when students were bold, computers were both more impressive and less powerful. They also generated a tremendous amount of heat, so that data centers invested almost as much in cooling equipment as they did in the computers themselves.

Yesterday, I was walking down an office hall when I overheard a co-worker discussing how he “underclocked” his home PC’s graphics processor (ran it at a speed slower than its maximum) so that it would run cooler. Cost and scale might change, but the basic considerations remain pretty much the same.

Two thoughts regarding central computing on campuses:

  1. Your central server room(s) can house a few large servers or a lot of little ones. With modern “virtualization” software, the few big boxes can look and act as if they were many smaller machines, so in terms of how people and systems use the servers, it really doesn’t matter. However, fewer, larger servers cost less to buy, cost less to operate (lower energy requirements), can decrease software licensing costs, and generate less heat (see the back-of-envelope sketch after this list). If you don’t spend money generating needless heat, you don’t have to spend additional money dissipating it.
  2. What heat your server room(s) do generate can, at least potentially, be put to good use. During the summer, you might want to exhaust that heat as efficiently as you can without overburdening your cooling systems. But during heating season, it only makes sense to take advantage of the heat from computing equipment by channeling it into campus buildings, thereby reducing the load on (and the fossil fuel consumed by) the actual heating system.
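For readers who like numbers, here is a minimal sketch of the arithmetic behind the first point. Every figure in it (server counts, per-server wattage, electricity price, and the cooling-overhead factor) is a hypothetical assumption chosen purely for illustration, not a measurement from any campus.

# Back-of-envelope comparison: many lightly loaded small servers
# versus a few well-utilized virtualization hosts.
# All numbers below are illustrative assumptions.

HOURS_PER_YEAR = 8760
COST_PER_KWH = 0.12   # assumed electricity price, dollars per kWh
PUE = 1.8             # assumed power usage effectiveness: cooling and other overhead

def annual_cost(num_servers, avg_watts_each):
    """Yearly electricity cost, including the cooling overhead implied by PUE."""
    kwh = num_servers * avg_watts_each * HOURS_PER_YEAR / 1000
    return kwh * PUE * COST_PER_KWH

many_small = annual_cost(num_servers=40, avg_watts_each=300)  # 40 lightly loaded boxes
few_large = annual_cost(num_servers=4, avg_watts_each=900)    # 4 consolidated hosts

print(f"40 small servers: ${many_small:,.0f} per year")
print(f" 4 large hosts:   ${few_large:,.0f} per year")

With these made-up numbers, consolidation cuts the electricity bill by roughly two thirds, and the cooling share of that bill (the PUE overhead) shrinks along with it, which is the point of the item above: heat you never generate is heat you never pay to remove.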

Back in the bad old days, the color of computing was (big) blue. Now computing needs to be (smaller and) green. The alternative choices are black and red. As in carbon and ink, respectively.
