Power adapters vs ATX PSU

2

I have a home server based on an old Pentium III 933 PC running at 500 MHz, with 512 MB RAM, a 60 GB hard drive, and two Gigabit NICs, running Ubuntu Server 9.04, and there are three network devices "supporting" it:

  • Cable modem
  • Wi-Fi router, currently used only as a wireless access point
  • 5-port Gigabit switch

The server itself consumes around 35 watts, so the 300-watt PSU inside has some power to spare.

I've been thinking - would it be possible to get rid of the power adapters my external networking devices use and connect them to the ATX PSU instead?

I know a single power adapter can draw 5-10 watts even when idle (with no device connected to it), and quite a bit of energy is wasted in the 220 V to 5 V conversion, so why not use the already-converted power from the PSU?

I could configure a USB or PCI wireless adapter as an access point, but internal cable modems are pretty hard to get and I don't know if my cable provider would allow one, so I'm not sure I could pull it off that way. For now, then, I'm only considering getting rid of the power adapters.

Has anyone tried that?

macbirdie

Posted 2009-08-13T11:21:42.100

Reputation: 1 661

Oh, Improving Girlfriend Acceptance Factor for home server aesthetics is also crucial. ;) – macbirdie – 2009-08-13T11:27:52.383

Answers

1

What you will probably find is that the external power adaptors all have slightly different voltages and current ratings. An ATX power supply only outputs 12 V and 5 V (and 3.3 V on newer ones). So you are unlikely to get much joy from trying to wire things up directly.

Your best bet may be to find USB-powered devices, as these don't need their own power bricks. But there is a limit to how much power you can draw from a USB port.
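As a rough sanity check on that limit: a USB 2.0 high-power port supplies at most 500 mA at 5 V, i.e. 2.5 W. A quick sketch (the device wattages below are made-up examples; check the labels on your own gear):

```python
# Rough USB power-budget check. A USB 2.0 high-power port
# allows up to 500 mA at 5 V, i.e. 2.5 W per port.
USB_VOLTAGE = 5.0        # volts
USB_MAX_CURRENT = 0.5    # amps
port_budget_w = USB_VOLTAGE * USB_MAX_CURRENT  # 2.5 W

# Hypothetical device draws in watts -- illustrative numbers only.
devices = {"wifi_dongle": 2.0, "gigabit_switch": 6.0}
for name, draw in devices.items():
    verdict = "fits" if draw <= port_budget_w else "exceeds"
    print(f"{name}: {draw} W -> {verdict} one USB port ({port_budget_w} W)")
```

A typical Wi-Fi dongle fits; a multi-port Gigabit switch almost certainly does not, which is why switches don't come USB-powered.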

You could look at getting a cable modem with Gigabit Ethernet and Wi-Fi built in; that would reduce some of the clutter (do you really need Gigabit at home?).

Alternatively, if aesthetics is an issue, then hide the whole lot in the closet.

Jeremy French

Posted 2009-08-13T11:21:42.100

Reputation: 870

Yeah, I've just looked - the wireless router uses 3.3 V and the cable modem 12 V, so those could at least theoretically do fine, but the switch takes 7.5 V. Having an all-in-one server/router/NAS/IPv6 gateway/video streamer would have the advantage of much less cable clutter, which I dislike myself. Otherwise it's "that's great honey, but you've simply hidden all your mess in the closet" - like they don't do that all the time. ;) – macbirdie – 2009-08-13T17:09:26.197

0

I have done stuff like this before, but many years ago. I recall being able to make 7 volts DC by using the +12 V lead as the positive and the +5 V lead as the negative.

The difference between these potentials (voltage is potential difference) is 7 V. That should probably be enough to run your switch.

If not, you could also try using the +12 V lead as the positive and the +3.3 V lead as the negative - that would give you 8.7 V to run the switch.
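The rail arithmetic above can be sketched quickly (rail voltages are standard ATX nominals; the 7.5 V target comes from the question's switch):

```python
# Nominal ATX rail voltages.
rails = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}

# Using one rail as positive and another as the return gives
# their potential difference, as described above.
combos = {
    "+12V over +5V": rails["+12V"] - rails["+5V"],      # 7.0 V
    "+12V over +3.3V": rails["+12V"] - rails["+3.3V"],  # ~8.7 V
}

target = 7.5  # the switch's adapter voltage, per the question
for name, volts in combos.items():
    print(f"{name}: {volts:.1f} V (off by {abs(volts - target):.1f} V from {target} V)")
```

Whether a 0.5 V shortfall or a 1.2 V excess is acceptable depends entirely on the switch's input tolerance, which its adapter label won't tell you.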

Newer power supplies may or may not like you trying this. It may be wise to get a spare power supply to play around with first.

eleven81

Posted 2009-08-13T11:21:42.100

Reputation: 12 423

This is somewhat dangerous. It will only work if there is enough consumption on the 5 V or 3.3 V rail inside the PC to sink the return current of the device you are hotwiring in. If not, it will try to force current back into the power supply, which could cause nothing, a failure, or in the worst case an explosion. Personally I wouldn't risk it. – davr – 2009-09-10T17:24:36.063