Does a computer use more electricity when charging USB devices?

67

16

Something I've always wondered. If I constantly hook up cellphones, hard drives and the like via USB to my computer, will it eat up more on the electricity bill? Or are the USB ports using up electricity by just being enabled anyway, thus not affecting the power usage?

arnehehe

Posted 2013-06-06T09:14:39.323

Reputation: 759

1Probably not enough to measure, without lab-quality meters. – Daniel R Hicks – 2013-06-06T10:34:33.353

15@DanielRHicks If he plugs in five devices at 0.5A each, that makes 16W (with an 80% efficiency). That may not be relevant for the electricity bill, but it's easily measurable with a $15 watt-meter. – zakinster – 2013-06-06T11:52:43.510

@zakinster: However, the additional current drawn from the mains by just one charging USB device is about 10 mA. That could easily be within the error of an inexpensive multimeter measuring up to 2 A. Moreover, the power consumption of a pc could very well vary by 16 W in a few seconds, regardless of whether USB devices are plugged in. – Marcks Thomas – 2013-06-06T13:31:37.010

I can measure the difference with an EUR 25 (roughly US $30) 'kill-a-watt' meter. At least some of these things are both cheap and sensitive enough. (Power usage is stable at 154 Watt when my host is idle. No applications running and monitoring for a few minutes to make sure I get a good reading rather than spiked values). – Hennes – 2013-06-06T14:03:25.930

7

Randall Munroe briefly discusses your question here: http://what-if.xkcd.com/35/

– Eric Lippert – 2013-06-06T15:19:04.953

6No. You can start making a profit by pumping energy from USB sockets. – Val – 2013-06-06T16:29:03.763

7My UPS has a power meter, and when I suspend my computer with no USB devices plugged in, the power usage measures 0 watts. If I plug in a tablet and two phones for charging (the USB ports are always powered while the computer is suspended), the power usage reads 7 watts. I don't know how accurate the UPS power meter is, but there's definitely measurable power used. I haven't checked USB power usage while the computer is powered on, but the computer hovers around 80W while idle, so I'm assuming that USB charging would push it to around 87W. – Johnny – 2013-06-06T17:18:13.670

2Good question. ~Is putting an extra item in your fridge making it use more electricity? – tymtam – 2013-06-07T07:39:48.310

@Tymek, a fridge obviously warms up when you put a warm thing into it, and it has to pump more heat out to cool that thing down. So the right question is: does your microwave oven reduce its consumption when there is no food in it? Or when there is a solid piece of metal in it, which becomes white-hot instantly? – Val – 2013-06-09T11:28:04.273

@Val What if the inside of the fridge just gets warmer and the fridge cooler doesn't start because it's still below a threshold? I think you're too quick to dismiss my question, but I like your microwave example. – tymtam – 2013-06-09T23:50:02.000

Answers

94

Short answer:

Does a computer use more electricity when charging USB devices?

Generally yes, but not necessarily as much as you would expect; it won't be free power, but it might be obtained more efficiently than from a wall charger. It really depends on the particular power supply's efficiency curve and the point on that curve at which you are operating it (which in turn depends on what the computer is doing):

  • If your computer power supply is under-loaded (e.g. the computer is idle), adding more load will slightly increase the power efficiency of the whole system.
  • If your computer power supply is correctly loaded, it will be near its peak efficiency, which is generally much better than that of a USB wall charger.
  • If your computer power supply is already overloaded (which should never happen), you have more pressing issues than USB power efficiency.

Long answer:

A USB port can supply at most 500mA (USB1&2) or 900mA (USB3) at 5V, which gives maximums of 2.5W (USB1&2) and 4.5W (USB3).

USB ports don't consume power by themselves. With nothing plugged in, they are just open circuits.

Now, if you draw 1A (5W) from a USB port, it will usually increase the overall power consumption by roughly 6W (depending on your power supply's efficiency), which would be an increase of 2% to 5% of your computer's power consumption.
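
As a rough sanity check, here's a minimal Python sketch of that arithmetic; the 83% wall-to-DC efficiency is an assumed example figure, not a measurement:

# Rough sketch: extra wall draw caused by a USB load, assuming a fixed
# PSU efficiency at this operating point (an illustrative value only).
usb_voltage = 5.0          # volts
usb_current = 1.0          # amps drawn from the port
psu_efficiency = 0.83      # assumed wall-to-DC efficiency

usb_power = usb_voltage * usb_current            # 5.0 W delivered to the device
extra_wall_power = usb_power / psu_efficiency    # ~6.0 W drawn from the outlet
print(f"USB load: {usb_power:.1f} W, extra wall draw: {extra_wall_power:.1f} W")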

But, in some cases, it may be different.

If you take a look at a PSU efficiency curve (this one is from AnandTech):

(Image: Cooler Master UCP 900W efficiency curve)

You'll see that the efficiency is not a constant value; it varies a lot depending on the load applied to the PSU. On that 900W PSU, at low power (50W to 200W) the curve is so steep that an increase in load entails a substantial increase in efficiency.

If the increase in efficiency is high enough, it would mean that in some cases, your computer may not need to actually draw an extra 5W from the wall socket when you're drawing an extra 5W from a USB port.

Let's take the example of a computer drawing 200W from a PSU with an actual efficiency of 80% at 200W:

Computer power consumption : 200W
USB device power consumption : 5W
PSU efficiency at 200W  : 80.0%
Wall power consumption without USB : 200W / 80.0% = 250.00W

Now, depending on the efficiency curve of the PSU between 200W and 205W, the relative power consumption of the USB device may be completely different:


<Case 1>
PSU efficiency at 205W  : 80.0%
Wall power consumption with USB : 205W / 80.0% = 256.25W
Wall power consumption of the USB device : 6.25W

This is the usual simplified case, where the efficiency is the same, hence the power consumption of the USB device is equivalent to 5W / 80.0% = 6.25W


<Case 2>
PSU efficiency at 205W  : 80.5%
Wall power consumption with USB : 205W / 80.5% = 254.66W
Wall power consumption of the USB device : 4.66W

In this case, the PSU efficiency increases between 200W and 205W, so you can't deduce the relative power consumption of the USB device without taking the whole computer's power consumption into account, and the relative increase at the wall socket may actually be lower than 5W.

This behavior only happens because, in that case, the PSU is under-loaded, so it's not the usual case, but it's still a practical possibility.


<Case 3>
PSU efficiency at 205W : 82%
Wall power consumption with USB : 205W / 82% = 250.00W
Wall power consumption of the USB device : 0W

In this case, the PSU draws the same power from the wall socket whatever load it receives. This is the behavior of a zener regulator, where all unneeded power is dissipated as heat. It's a behavior that can be observed in some kinds of low-end PSUs at very small loads.


<Case 4>
PSU efficiency at 205W : 84%
Wall power consumption with USB : 205W / 84% = 244.05W
Wall power consumption of the USB device : -5.95W

That last case is purely hypothetical: the PSU would actually consume less power at a higher load. As @MarcksThomas said, this is not something you will observe in a practical power supply, but it's still theoretically conceivable and shows that the instinctive TANSTAAFL rule cannot always be applied that easily.
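
To make the comparison concrete, here is a small Python sketch that reproduces the baseline and the four cases above. The efficiency figures are the hypothetical values used in the examples, not measurements of any real PSU:

# Reproduces the hypothetical cases above; efficiencies are illustrative
# values from the text, not data from a real power supply.
base_load = 200.0      # W drawn from the PSU by the rest of the computer
usb_load = 5.0         # W drawn from the USB port
eff_at_200 = 0.800     # assumed PSU efficiency at 200W

wall_without_usb = base_load / eff_at_200       # 250.00 W

cases = {
    "Case 1 (flat efficiency)":    0.800,
    "Case 2 (slightly rising)":    0.805,
    "Case 3 (constant wall draw)": 0.820,
    "Case 4 (hypothetical only)":  0.840,
}

for name, eff_at_205 in cases.items():
    wall_with_usb = (base_load + usb_load) / eff_at_205
    delta = wall_with_usb - wall_without_usb
    print(f"{name}: wall draw {wall_with_usb:.2f} W, USB costs {delta:+.2f} W")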


Conclusion:

If you need to charge a lot of 5V devices, it's generally better to do it from an already-running computer than from multiple wall chargers. It won't be free, but it will usually be more efficient.

Also note that you may need USB ports with higher current capability (e.g. USB 3 or dedicated charging ports) in order to get the same charging speed.

zakinster

Posted 2013-06-06T09:14:39.323

Reputation: 2 209

6I don't think any practical power supplies have a sufficiently steep efficiency curve to actually reduce consumption under increased load, but +1 for making the very relevant point that a computer can be more efficient than a wall charger. – Marcks Thomas – 2013-06-06T11:38:58.650

4@MarcksThomas I don't think so either, but it's theoretically possible and it would be easy to build a dummy, inefficient PSU that behaves this way. I was just making the point that the simple TANSTAAFL reasoning only works if you don't take into account the fact that the computer PSU may already be drawing power that you're not using. The overall consumption obviously won't decrease, but I wouldn't be surprised if it doesn't increase as much as expected. – zakinster – 2013-06-06T12:06:48.750

1If you start using an extra 5 Watt to charge a device, your computer is likely to draw an extra 6 Watt from the wall socket. (That is 5 Watt plus the inefficiency of the PSU, where an average PSU is roughly 80% efficient.) A separate charger might be less efficient, especially if it is left plugged in 24/7. This is because even when not in use, a charger tends to consume some power. Not much, but 24 hours per day times not much adds up. (Not that the OP asked for that much detail. :) ). – Hennes – 2013-06-06T13:52:48.660

@Hennes It's indeed likely that an extra 5W on the +5V entails an extra 6W at the wall socket. But if you take a PC drawing 200W with a PSU efficiency of 0.8 at 200W and 0.805 at 205W, the extra 5W would actually draw *205W / 0.805 - 200W / 0.8 = 4.65W* from the wall socket. And that's not a very unlikely efficiency curve for a powerful PSU. – zakinster – 2013-06-06T14:43:25.623

2@zakinster If a PC draws 200W with 80% efficiency, it will draw 250W from the wall (since 20% is lost in PSU conversion). Adding 5W to the PC's draw amount gives 205W drawn, and at 80% efficiency this gives 256.25W drawn from the wall (or an additional 6.25W). – Breakthrough – 2013-06-06T15:27:56.367

3@Breakthrough Completely true if the efficiency is a constant 80% at both 200W and 205W, but I specified in my example that the PSU efficiency was actually 80.5% at 205W. – zakinster – 2013-06-06T15:35:40.373

@zakinster, you need to change your second example so that your theoretical PSU is drawing 200 @ 80% in the first scenario; right now, seeing the wall consumption decrease as a result of going from 205W @ 82% to 205W @ 84% is entirely unremarkable. :-) – Hellion – 2013-06-06T17:05:25.530

@Hellion actually they're two variants of the first scenario, I'll edit to clarify. – zakinster – 2013-06-06T17:09:58.893

@zakinster, ah, I see what you were going for now, thanks. – Hellion – 2013-06-06T17:36:26.967

1This is totally misleading. It does not matter if you need 4 or 6 Watts of power input, the answer to the question is always yes. There is not a single power supply where the efficiency jumps by orders of magnitude if 5 Watts are additionally drawn. Look at your own graph: over a range of 1000 Watts, a mere 17% change in efficiency has been measured. – Alexander – 2013-06-08T13:16:36.010

Ah sorry, I didn't see that you had changed the efficiency value... However, I wanted to illustrate that indeed the computer would use more electricity, even in that example I highlighted (even given the efficiency change, you will still draw more power from the wall - and indeed, a 5W change is negligible [even given a 250W supply, that's only a 2% change] enough that the efficiency wouldn't increase enough to outweigh the increased load - and that already assumes you're on the lower half of the efficiency curve). I agree with @Alexander that this is misleading, for these reasons. – Breakthrough – 2013-06-09T06:41:28.480

The long answer did indeed explain the concerns in my last comment; however, the previous short answer did seem to be somewhat misleading. I edited the answer to better reflect these concerns, and to better indicate the full implications of the "long answer" (specifically, there's no free power here). – Breakthrough – 2013-06-09T06:53:20.380

What a fascinating answer. I really appreciate you guys going the extra mile on this – arnehehe – 2013-06-12T08:07:20.387

45

TANSTAAFL also applies here.

You don't get power for nothing. Otherwise we could just use the USB ports to power another computer, and use the other computer to power the first. It is a fun idea, but it does not work.

The energy used for charging is rather small, though. USB 1 or 2 supplies 100 to 500 mA at 5 volts. That is a maximum of 2½ Watt. Compared to the normal idle power drain of a PC, that is rather small. (Normal: 50 Watt for an idle office PC to 150 Watt idle for a high-end PC, and roughly three times that when gaming, compiling, etc.)
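
As a back-of-the-envelope illustration (the idle figures are the rough estimates above, not measurements), that works out to a few percent at most:

# Back-of-the-envelope: one USB 2.0 device at full draw vs. a PC's idle power.
usb_max = 0.5 * 5.0                  # 2.5 W, the USB 2.0 maximum
for idle_watts in (50, 150):         # rough idle figures quoted above
    share = usb_max / idle_watts * 100
    print(f"At {idle_watts} W idle, one USB device adds at most {share:.0f}% extra")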

Hennes

Posted 2013-06-06T09:14:39.323

Reputation: 60 739

Figured as much, but nice to know for sure. Thanks – arnehehe – 2013-06-06T09:53:29.193

120 to 100 mWatt at 5 volts equals 5 Watt ? I must have missed something. – zakinster – 2013-06-06T10:19:46.490

2Oops. Math fixed. Actually, not just watt. volt x watt = Watt was a brainfart. That should have been in amperage. – Hennes – 2013-06-06T10:29:15.493

17@Hennes You can't apply the free lunch rule as easily, the computer power supply may already be wasting the energy needed by the USB devices and may be able to power these devices without even increasing the load on the wall socket. That may not be the usual case, but it's a common behavior for a seriously under-loaded PSU. – zakinster – 2013-06-06T15:52:07.657

True if it is seriously under-loaded. But that really is an edge case which you will only encounter with a PSU running at roughly 10% of its maximum rated power. In which case you really have a home-built computer with the wrong PSU, or you have a machine built for high-end gaming or similar and you are not gaming on it at that time. I assume it is rare enough that I do not need to worry about it. – Hennes – 2013-06-06T16:08:24.490

@Hennes What if the computer is in sleep mode while you're charging your devices ? That's a pretty common case of under-loaded PSU. – zakinster – 2013-06-06T16:19:09.963

That one never crossed my mind. I still power down all my desktops and laptops rather than suspend or sleep them. I am also not quite sure how USB will react to an attempt to charge when powered down. On some systems (e.g. a Dell E6500) it will not charge a device at all, though there is a BIOS option to bypass this. If anyone has any experience with this I will be glad to learn about it. – Hennes – 2013-06-06T16:36:32.423

5TANSTAAFL is also known as the principle of "conservation of energy." – wchargin – 2013-06-06T19:04:47.897

17This answer is ill-founded. The conservation of energy principle alone does not guarantee us that a charging device uses more energy while charging and less while not charging. A charging device could consume the same energy regardless of whether or not it is charging, by wasting power when it's not charging. You can get something for nothing when you utilize that which is otherwise wasted. (Thus it is necessary to argue that this does not happen in a computer with USB ports.) – Kaz – 2013-06-06T21:24:26.240

1-1 What zakinster said – tymtam – 2013-06-07T07:37:53.070

5-1 for not reading the question carefully. The question didn't ask whether USB ports provide magical free power. The question asked whether they always use power, or only when they're charging something. – Kyralessa – 2013-06-07T08:24:18.207

11

Yes. It's a basic rule of physics; if something's taking power away from your computer, your computer must get that power from somewhere. USB ports don't consume power just by being enabled*, any more than a power outlet would consume power just by having the switch "on" with nothing plugged in.

* Alright, there is a minimal amount of power consumed by the USB controller chip monitoring to see if something's plugged in, but that's a tiny amount of power.

Stu

Posted 2013-06-06T09:14:39.323

Reputation: 633

And that controller chip power is used regardless of whether a flash drive is plugged in or not, so it doesn't even factor in :) – Thomas – 2013-06-06T14:41:04.907

Sure, but if you disable the ports (some laptops have the option), I'd expect it to power-down the controller too. – Stu – 2013-06-06T18:42:42.840

4It is not a basic rule of physics. – Kaz – 2013-06-06T21:26:40.640

2I would -1 - it is not 'basic rule of physics'. – tymtam – 2013-06-07T07:39:08.527

9

Yes, you are using more electricity, but not in amounts that will make a huge difference to your bill at the end of the month.

NickW

Posted 2013-06-06T09:14:39.323

Reputation: 1 029

1nice and simple answer :) – Joe DF – 2013-06-06T17:48:28.833

But if your computer is a laptop, it will make a difference to your battery life. – 200_success – 2013-06-09T02:33:09.447

Agreed, but he asked about his electricity bill :) – NickW – 2013-06-10T08:26:29.173

4

Short Answer:

YES; you'll always pay for the USB power with at least that much more power from the wall. Not only is this required by the laws of thermodynamics, it's also inherent in the way power supplies work.


Longer Answer:

We'll take the whole system of the computer, its internal power supply, its operating circuits and the USB port circuitry to be one big, black box called the Supply. For the purposes of this illustration, the whole computer is one oversized USB charger, with two outputs: the computer operating power, which we will call Pc, and the output USB power, which we will call Pu.

Converting power from one form, (voltage, current, frequency), to another, and conducting power from one part of a circuit to another, are all physical processes which are less than perfect. Even in an ideal world, with superconductors and yet-to-be-invented components, the circuit can be no better than perfect. (The importance of this subtle message will turn out to be the key to this answer). If you want 1W out of a circuit, you must put in at least 1W, and in all practical cases a bit more than 1W. That bit more is the power lost in the conversion and is called loss. We will call the loss power Pl, and it is directly related to the amount of power delivered by the supply. Loss is almost always evident as heat, and is why electronic circuits which carry large power levels must be ventilated.

There is some mathematical function, (an equation), which describes how the loss varies with output power. This function will involve the square of the output voltage or current where power is lost in resistance, and a frequency multiplied by the output voltage or current where power is lost in switching. But we don't need to dwell on that; we can wrap all that irrelevant detail into one symbol, which we will call f(Po), where Po is the total output power, and which relates output power to loss by the equation Pl = f(Pc+Pu).

A power supply is a circuit which requires power to operate, even if it is delivering no output power at all. Electronics engineers call this the quiescent power, and we'll refer to it as Pq. Quiescent power is constant, and is absolutely unaffected by how hard the power supply is working to deliver the output power. In this example, where the computer is performing other functions besides powering the USB charger, we include the operating power of the other computer functions in Pq.

All this power comes from the wall outlet, and we will call the input power, Pw, (Pi looks confusingly like Pl, so I switched to Pw for wall-power).

So now we are ready to put the above together and get a description of how these power contributions are related. Well firstly we know that every microwatt of power output, or loss, comes from the wall. So:

Pw = Pq + Pl + Pc + Pu

And we know that Pl = f(Pc+Pu), so:

Pw = Pq + f(Pc+Pu) + Pc + Pu

Now we can test the hypothesis that taking power from the USB output increases the wall power by less than the USB power. We can formalise this hypothesis, see where it leads, and see whether it predicts something absurd, (in which case the hypothesis is false), or predicts something realistic, (in which case the hypothesis remains plausible).

We can write the hypothesis first as:

(Wall power with USB load) - (Wall power without USB load) < (USB power)

and mathematically as:

[ Pq + f(Pc+Pu) + Pc + Pu ] - [ Pq + f(Pc) + Pc ] < Pu

Now we can simplify this by eliminating the same terms on both sides of the minus sign and removing the brackets:

f(Pc+Pu) + Pu - f(Pc) < Pu

then by subtracting Pu from both sides of the inequality (< sign):

f(Pc+Pu) - f(Pc) < 0

Here is our absurdity. What this result means in plain English is:

The extra loss involved in taking more power from the supply is negative

This means negative resistors, negative voltages dropped across semiconductor junctions, or power magically appearing from the cores of inductors. All of this is nonsense, fairy tales, wishful thinking of perpetual-motion machines, and is absolutely impossible.
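
The same conclusion can be checked numerically. The short Python sketch below uses an arbitrary non-decreasing loss function (an illustrative assumption, not a model of any real supply) and confirms that the extra wall draw never falls below Pu:

# Numerical check of the argument above: with any non-decreasing loss
# function f, the extra wall draw is never less than Pu.
# The loss function and constants are illustrative assumptions.

def loss(po):
    # f(Po): resistive (quadratic) plus switching (linear) losses
    return 1e-4 * po**2 + 1e-2 * po

def wall_power(pq, pc, pu):
    # Pw = Pq + f(Pc + Pu) + Pc + Pu
    return pq + loss(pc + pu) + pc + pu

pq, pu = 30.0, 5.0                      # quiescent power, USB power (W)
for pc in (50.0, 200.0, 500.0):         # a few computer operating points
    delta = wall_power(pq, pc, pu) - wall_power(pq, pc, 0.0)
    print(f"Pc = {pc:5.1f} W: extra wall draw = {delta:.2f} W (>= {pu} W)")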


Conclusion:

It is not physically possible, theoretically or otherwise, to get power out of a computer's USB port with less than the same amount of extra power coming from the wall outlet.


What did @zakinster miss?

With the greatest respect to @zakinster, he has misunderstood the nature of efficiency. Efficiency is a consequence of the relationship between input power, loss and output power, and not a physical quantity for which input power, loss and output power are consequences.

To illustrate, let's take the case of a power supply with a maximum output power of 900W, losses given by Pl = APo² + BPo where A = 10^-4 and B = 10^-2, and Pq = 30W. Modelling the efficiency (Po/Pw) of such a power supply in Excel and graphing it on a scale similar to the AnandTech curve gives:

(Image: modelled efficiency curve of the example power supply, plotted on a scale similar to the AnandTech chart)

This model has a very steep initial curve, like the AnandTech supply, but is modelled entirely according to the analysis above, which rules out free power.

Let's take this model and look at the examples @zakinster gives in Case 2 and Case 3. If we change Pq to 50W and make the supply perfect, with zero conversion loss, then we get 80% efficiency at a 200W load. But even in this perfect situation, the best we can get at 205W is 80.39% efficiency. Reaching the 80.5% that @zakinster suggests is a practical possibility would require a negative loss function, which is impossible, and achieving 82% efficiency is further out of reach still.
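
Those two figures are easy to reproduce; here is a minimal Python sketch of the lossless, Pq = 50W case described above (an illustration of this argument, not a model of a real supply):

# Efficiency of the "perfect" supply above: zero conversion loss but a
# fixed 50 W quiescent draw, so efficiency = Po / (Po + Pq).
pq = 50.0
for po in (200.0, 205.0):
    efficiency = po / (po + pq)
    print(f"Po = {po:.0f} W -> efficiency = {efficiency * 100:.2f}%")
# Prints 80.00% at 200 W and 80.39% at 205 W: even a perfect supply cannot
# reach the 80.5% assumed in Case 2.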

For summary, please refer to Short Answer above.

Billysugger

Posted 2013-06-06T09:14:39.323

Reputation: 51

Great answer, but I disagree with your conclusion; the loss function does not need to increase everywhere. In fact, it is trivial to design, for the sake of argument, a power supply that reduces loss under load, although this feature would not be useful. This answer very thoroughly shows unlikelihood, not impossibility. – Marcks Thomas – 2013-07-01T11:26:55.543

The OP was referring to charging from a practical computer. While I have no doubt that one could artificially add dissipative elements which switch in under certain circumstances to prove a point, that would constitute increased load, not increased loss. But if there's a reasonable and practical power supply design that exhibits a negative loss function, and is not improved measurably in terms of cost or performance by eliminating that negative loss function, then I'd like to see it. – Billysugger – 2013-07-01T13:55:35.697

3

It's possible that a computer could draw the same power while charging devices as when not charging them (all else being equal, e.g. the CPU load). Laws of physics, like the principle of conservation of energy, do not provide any guarantee that this cannot happen.

For that to happen, the computer would have to be wasting power when the devices are not plugged in, such that when they are plugged in, the otherwise wasted power is then redirected into them and thereby utilized.

Electronic designers would have to go out of their way to contrive such a wasteful design, but it is possible. A circuit that draws exactly the same amount of power, whether or not it is charging one or more batteries, is harder to design than one which draws power in proportion to the charging work, and the result is a wasteful device that nobody wants.

In reality, designers reach for off-the-shelf voltage regulators to power the components of the motherboard. Voltage regulators have the property that the less loaded they are, the less power they draw overall, and the less they waste internally. (Linear regulators waste more, switching ones less, but both consume less when less loaded.)

Anything in the system that is powered down contributes something to net energy saving: powered down ethernet port, powered down Wi-Fi transmitter, spun down disk, sleeping CPU, or USB port not delivering current. The saving is two-fold: firstly, the subsystem itself doesn't use energy, and secondly, less energy is wasted upstream as heat dissipation in the power supply chain.

Kaz

Posted 2013-06-06T09:14:39.323

Reputation: 2 277

1Actually, power supply circuits which draw a relatively constant amount of power regardless of how much power is needed used to be somewhat common, and I wouldn't be surprised if they're still used in some applications. If a mains-powered device never needs more than 1mA, a 100K resistor, "ordinary" diode, a zener, and a cap can pretty cheaply convert the AC120 to an unregulated voltage which is low enough to feed into a cheap regulator. Such a device would probably draw about 1/8 watt continuously, independent of how much was used, but could likely be cheaper than any practical alternative. – supercat – 2013-06-06T23:59:45.083

1

Yes. It's basic physics (thermodynamics). In the same way, charging your phone in your car uses a little more petrol. Another example is kinetic watches: you have to eat a tiny bit more food because you wear a kinetic watch! It is probably immeasurable, but the law of conservation of energy demands it. Energy cannot be created or destroyed.

Dermot

Posted 2013-06-06T09:14:39.323

Reputation: 111