Why Do Computers Use So Much Energy: Where Does It Go?

The amount of energy used by a computer isn’t much compared to some household appliances, but it is significant.

Standard computers use the bulk of the energy they consume to operate the main processor and the motherboard chipsets. That energy is consumed as the bits in these devices constantly switch state between 1 and 0. With every switch, a little energy is released as heat, which the PC's cooling system dissipates.

You may be surprised to learn how much energy a computer can consume for seemingly doing nothing. Even though there’s no movement, those little black chips are working hard.

What uses so much energy in a computer?

The process that consumes so much energy in a computer has been the bane of computer engineers for years. As transistor density in chips goes up, the amount of energy consumed in an ever-shrinking space becomes more and more of a challenge to deal with.

Every time a bit changes state, a small amount of electricity is transferred between two points: charge flows in to create a one, or drains off to become a zero. As this electricity moves, it travels through conductors and semiconductors that have resistance.

That means they consume a tiny amount of electricity and generate a tiny amount of heat on every change. This tiny amount of heat on its own isn’t a problem. The problem arises when you pack millions of transistors close together and have them change state millions of times per second.
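The switching energy described above is often modeled with the classic CMOS dynamic-power formula, P ≈ α·C·V²·f: activity factor times switched capacitance times voltage squared times clock frequency. The sketch below uses illustrative assumed values, not measurements of any particular chip:

```python
# Rough estimate of dynamic switching power using the classic
# CMOS model P = alpha * C * V^2 * f. All figures below are
# illustrative assumptions, not data for a specific processor.

def dynamic_power(alpha, capacitance_f, voltage_v, frequency_hz):
    """Dynamic power in watts: activity factor * C * V^2 * f."""
    return alpha * capacitance_f * voltage_v ** 2 * frequency_hz

alpha = 0.1          # assumed fraction of transistors switching per cycle
capacitance = 1e-7   # assumed total switched capacitance, in farads
voltage = 1.2        # assumed core supply voltage, in volts
frequency = 3.5e9    # assumed clock frequency, in hertz

watts = dynamic_power(alpha, capacitance, voltage, frequency)
print(f"Estimated dynamic power: {watts:.1f} W")
```

Note how the voltage term is squared: this is why lowering the core voltage even slightly is one of the most effective ways chip makers reduce heat.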

Thankfully, computer engineers have found solutions to this problem, and our modern electronics have continuously improved.

Regardless, the rapid state changes of individual bits inside your computer's chips are what consume electrical energy. It's an unavoidable part of computing and the topic of many research efforts. Superconducting materials, for example, can conduct electricity with zero or near-zero losses.

If you have ever heard of a superconducting computer, a computer without resistive heat losses, that's the problem it's trying to tackle. Likewise, light-based computers attack the same problem from a different angle: by using light instead of electricity, engineers can avoid the losses associated with electrical circuits at that level.

What uses the most electricity in a PC?

The most electricity in a PC is used by the chip with the highest density of transistors: the main processor, especially if it has multiple cores. The second-largest consumer after the processor is the chipset on the motherboard, which supports things like onboard graphics, onboard sound, and the BIOS.

After that, the next-largest consumer is your hard drive or long-term storage. These devices are either mechanical in nature or contain a substantial number of transistors of their own. Last but not least is the memory, or RAM. Although the data in memory changes, it often loads and holds many of the same values, so its consumption is lower.

A standard processor with a few cores consumes around 66 Watts. Once a chip maker increases the number of cores on a given processor, the transistor count can really explode. So much so that the power consumption can be as high as 140 Watts. That’s a lot for a single part just over one square inch in size.

It takes excellent cooling and energy management to keep a chip like that from damaging itself. A good way to think about it is to consider how much heat is given off by a single 100 Watt light bulb. Now squeeze that down into the reduced surface area of a small black chip. That’s a lot of heat concentrated in a small area.
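To put numbers on the light-bulb comparison, here is a back-of-the-envelope heat-flux calculation; the surface areas used are rough assumptions for illustration only:

```python
# Compare heat flux (watts per square centimeter) of a high-end CPU
# against an incandescent bulb. Areas are rough assumptions.

cpu_watts = 140
cpu_area_cm2 = 6.5      # roughly one square inch of chip surface

bulb_watts = 100
bulb_area_cm2 = 120.0   # assumed surface area of a standard bulb

cpu_flux = cpu_watts / cpu_area_cm2
bulb_flux = bulb_watts / bulb_area_cm2
print(f"CPU: {cpu_flux:.1f} W/cm^2, bulb: {bulb_flux:.2f} W/cm^2")
```

Under these assumptions the chip dissipates heat more than twenty times as intensely per unit area as the bulb, which is why a dedicated heatsink and fan are essential.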

This all assumes you don't have a large aftermarket GPU, or graphics processing unit. These add-on cards can consume as much power as, or even more than, the rest of the computer combined. The density of cores and transistors on a GPU can easily dwarf that of the largest processor.

That's because GPUs often contain hundreds or even thousands of cores specialized for high-speed graphics rendering. Some GPUs draw so much power that when you add one to your computer, you may also need to upgrade your power supply to run it.

Typical Standard PC      Power Used in Watts
CPU                      66
Motherboard Chipset      19
Hard Drive               10
Memory (RAM)             3
Total                    98
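The table above can be tallied with a quick sketch; the component names and watt figures are taken directly from the table:

```python
# Per-component power draw for a typical standard PC,
# using the figures from the table above (in watts).
components = {
    "CPU": 66,
    "Motherboard Chipset": 19,
    "Hard Drive": 10,
    "Memory (RAM)": 3,
}

total = sum(components.values())
print(f"Total: {total} W")  # matches the 98 W total in the table
```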

What type of energy do computers use?

At the board level, computers run on multiple DC (direct current) voltage levels. To make them convenient to use, however, they are built to plug into standard 120V AC power. To operate correctly, they need high-quality, stable AC power; without it, your computer may not work properly or could even be damaged.

On a desktop computer, the AC power provided by the mains at 120V is delivered to the power supply, where the different DC voltages with specific current capability are created. Then, the power supply distributes the power to where it’s required through a wiring harness. It can be quite messy, but it does a good job of delivering the power where it needs to be.
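For reference, a desktop ATX power supply produces a few standard DC rails. The sketch below lists the common positive rails; the voltages follow the ATX convention, but the current limits shown are hypothetical examples, since actual ratings vary by unit:

```python
# Common DC rails produced by a desktop ATX power supply.
# Voltages follow the ATX convention; the current limits are
# hypothetical examples, as real ratings vary by supply.
rails = {
    "+12V":  {"volts": 12.0, "max_amps": 30.0},  # CPU, GPU, drive motors
    "+5V":   {"volts": 5.0,  "max_amps": 20.0},  # older drives, USB ports
    "+3.3V": {"volts": 3.3,  "max_amps": 20.0},  # memory, chipset logic
}

# Maximum power each rail could deliver (P = V * I):
for name, rail in rails.items():
    print(f"{name}: up to {rail['volts'] * rail['max_amps']:.0f} W")
```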

On a laptop, power comes from the battery, which already supplies DC electrical power. A charger connects the laptop to the 120V AC mains to recharge it. Dedicated power-supply chips built onto the mainboard then convert the battery voltage to the voltages needed by all the computer functions within the laptop.

Tablet and smartphone devices have a similar arrangement to laptops but with the added flexibility of charging from a USB port. That allows users to charge their device from any USB port or an adapter that creates USB-level DC power from 120V AC mains power.
