All practical computers made today operate electronically. Moving electrons—electricity—are the media of their thoughts. Electrical pulses course from one circuit to another, switched off or on in an instant by logic chips. Circuits combine the electrical pulses together to make logical decisions and send out other pulses to control peripherals. The computer's signals stay electrical until electrons colliding with phosphors in the monitor tube push out photons toward your eyes or generate the fields that snap your printer into action.

Of course, your computer needs a source for the electricity that runs it. The power does not arise spontaneously in its circuits but rather must be derived from an outside source. Conveniently, nearly every home in America is equipped with its own electrical supply that the computer can tap into. Such is the wonder of civilization.

But the delicate solid-state semiconductor circuits of today's computers cannot directly use the electricity supplied by your favorite utility company. Commercial power is an electrical brute, designed to have the strength and stamina to withstand the miles of travel between generator and your home. Your computer's circuits want a steady, carefully controlled trickle of power. Raw utility power would fry and melt computer circuits in a quick flash of miniature lightning.

For economic reasons, commercial electrical power is transmitted between you and the utility company as alternating current, the familiar AC found everywhere. AC is preferred by power companies because it is easy to generate and adapts readily between voltages (including to very high voltages that make long distance transmission efficient). It's called alternating because it reverses polarity—swapping positive for negative—many times a second: the line frequency is nominally 60 Hz in North America and 50 Hz in Europe, and because each cycle contains two reversals, a 60 Hz line swaps polarity 120 times every second.
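The alternating waveform described above can be sketched in a few lines of Python. This is an illustrative model, not anything from the text: it assumes the common sinusoidal line voltage, a 120-volt RMS supply, and a 60 Hz frequency (the "120 V" a utility quotes is the RMS value; the actual peak is higher by a factor of the square root of two).

```python
import math

def ac_voltage(t, v_rms=120.0, freq=60.0):
    """Instantaneous AC line voltage at time t (in seconds).

    Assumes a sinusoidal waveform. The peak voltage is sqrt(2)
    times the RMS value utilities quote (about 170 V on a
    nominal 120 V line).
    """
    v_peak = math.sqrt(2) * v_rms
    return v_peak * math.sin(2 * math.pi * freq * t)

# Polarity flips twice per cycle: positive for the first half
# of each 1/60-second period, negative for the second half.
print(round(ac_voltage(1 / 240), 1))   # quarter cycle: positive peak
print(round(ac_voltage(3 / 240), 1))   # three-quarter cycle: negative peak
```

Running the sketch shows the voltage swinging from roughly +170 V to -170 V within a single sixtieth of a second, which is exactly the reversal the text describes.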

The changing or oscillating nature of AC enables transformers to increase or decrease voltage (the measure of driving force of electricity), because transformers only react to electrical changes. Electrical power travels better at higher voltages because the waste (heat generated by current flowing through the resistance of the long-distance transmission wires) depends on the square of the current: for a given amount of power delivered, raising the voltage proportionally lowers the current, so the resistive loss falls with the square of the voltage. Transformers permit the high voltages used in transmitting commercial power—sometimes hundreds of thousands of volts—to be reduced to a safe level (nominally 117 volts) before it is led into your home.
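The square-law saving can be made concrete with a short calculation. The figures here are hypothetical, chosen only for illustration: a line delivering 10 megawatts through wires with 5 ohms of total resistance, compared at two transmission voltages.

```python
def line_loss(power_w, volts, resistance_ohms):
    """Resistive loss (I^2 * R) in a transmission line
    delivering power_w watts at the given voltage."""
    current = power_w / volts            # I = P / V
    return current ** 2 * resistance_ohms

# Hypothetical line: 10 MW delivered through 5 ohms of wire.
low = line_loss(10e6, 10_000, 5)         # at 10 kV: I = 1000 A
high = line_loss(10e6, 100_000, 5)       # at 100 kV: I = 100 A
print(low / high)                        # ten times the voltage,
                                         # one hundredth the loss: 100.0
```

At 10 kV the wires dissipate 5 megawatts, half the power sent; at 100 kV they dissipate only 50 kilowatts, which is why utilities step the voltage up for the long haul and back down at your neighborhood.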

As wonderful as AC is to power companies, it's anathema to computer circuits. These circuits form their pulses by switching the flow of electricity tapped from a constant supply. Although computers could be designed to use AC, the constant voltage reversal would complicate the design so that juggling knives while blindfolded and riding a roller coaster would seem tame in comparison. Computers (and most electronic gear) use direct current (DC) instead. Direct current is the kind of power that comes directly from a primary source—a battery—a single voltage that stays at a constant level (at least as constant as the battery's reserves allow). Moreover, even the relatively low voltage that powers your lights and vacuum cleaner would be fatal to semiconductor circuits. Tiny distances separate the elements inside solid-state circuits, and high voltages can flash across those distances like lightning, burning and destroying the silicon along the way.
