Uses for old graphics cards?

1

I have a bunch of old graphics cards lying around. They are all 10 to 15 years old and have no purpose except taking up space. How do I harness their computing power without connecting them to a PC? Can I power up a GPU with a Raspberry Pi using a PCIe x16 to USB converter, or by soldering wires to the board and connecting them to the GPIO of the Pi? Can I actually do something useful with these cards, or are they totally worthless? Time and efficiency are no problem here, because I'm only doing this for experimentation and to learn how GPUs really work.

Dylan Missuwe

Posted 2018-03-28T16:31:05.967

Reputation: 11

Question was closed 2018-03-29T03:22:51.643

3

I usually cut them up and make key-rings or melt the components off and make notebooks. – Digital Lightcraft – 2018-03-28T16:33:52.587

4

I'm not sure if a 15-year-old graphics card has any computing power worth using – that's when GPUs with arbitrarily programmable shaders had only started to appear.

– user1686 – 2018-03-28T16:37:05.887

Answers

4

Drivers are your problem. A graphics card is a small, self-contained computer built specifically around its onboard components, following a design called a "reference card" — you may have heard that term used for the most basic retail version of a third-party video card. The board itself has firmware that manages the onboard components, much like your motherboard's BIOS, and the drivers installed into the operating system tell the OS how to use the board. Without the interface slot and drivers for your system, that system can't take advantage of the card, and it's hit-and-miss whether a generic "default video driver" will work.

My rules on keeping video cards are as follows:

If the card is PCI, keep it or try to sell it. Some motherboards capable of classic PCI still exist out there for HTPC setups using ARM chips and similar nonstandard CPUs. PCI video cards are hard to find because they only serve this niche demand, and many are not as powerful as the on-die AMD Radeon graphics that those processors support.

If the card is PCI-E, keep it. If you can't use it and it's new enough, sell it. If you can use it, but don't need it, keep one as a spare in case your video card ever goes out and you need to plug in another to get by while you buy a new one.

If the card is final-generation AGP, like the early GeForce cards or the Voodoo4 and Voodoo5 cards, you might be able to sell it to enthusiasts. If it's an earlier AGP card, recycle it.

CDove

Posted 2018-03-28T16:31:05.967

Reputation: 1 155

2

Totally agree. It's always a good idea to keep one or more as a spare for testing purposes. While there are people who are interested in older cards, they tend to be, as you said, "enthusiasts." However, they tend to look for very specific makes and models. It can't hurt to try to sell them, but I wouldn't put much effort into it. Another use for an old card, depending on driver, connector and bus type, is putting it in your computer and adding a monitor or two for more desktop space. – Keltari – 2018-03-28T17:10:37.103

0

https://www.amazon.com/Mailiya-6-Pack-Powered-Adapter-Extension/dp/B077W8BDTR/ref=sr_1_1_sspa?ie=UTF8&qid=1522293602&sr=8-1-spons&keywords=pcie+to+usb&psc=1

https://www.newegg.com/Product/Product.aspx?Item=N82E16815158165&cm_re=pci_to_pcie_adapter--15-158-165--Product

I am going to set aside the physical connection aspect; it can probably be done via USB with adapters like the ones above. If your project is computationally heavy rather than data heavy, the weak bandwidth of USB won't hold you back much.

If the card doesn't support OpenCL (vendor-neutral), AMD Stream (AMD), or CUDA (NVIDIA), there is no easy way to exploit its performance.

Only the newest of your cards, if any, are likely to support OpenCL.

https://en.wikipedia.org/wiki/CUDA

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

https://en.wikipedia.org/wiki/OpenCL

If you can find software that supports one of the three methods above, you can simply install the drivers and the software, then run it; it will detect compatible hardware and use it automatically.
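
For example, here is a minimal sketch of that detection step, assuming Python with the third-party pyopencl package installed and an OpenCL driver (ICD) for the card — if the card doesn't show up in this list, the easy route is closed:

    # Minimal sketch: list every OpenCL platform and device the drivers expose.
    # Requires the third-party pyopencl package and a working OpenCL driver (ICD);
    # a card with no OpenCL support simply won't appear here.
    import pyopencl as cl

    for platform in cl.get_platforms():
        print("Platform:", platform.name)
        for device in platform.get_devices():
            print("  Device:", device.name)
            print("    Type:", cl.device_type.to_string(device.type))
            print("    Compute units:", device.max_compute_units)
            print("    Global memory (MB):", device.global_mem_size // (1024 * 1024))
            print("    OpenCL version:", device.version)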

Otherwise you have to learn how to program for all three standards to be sure. Then you can write your own code.
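
If you do go the do-it-yourself route, a first OpenCL program looks roughly like the sketch below — again assuming the third-party pyopencl and numpy packages and a card whose driver exposes OpenCL; the kernel name add and the array size are arbitrary choices for illustration:

    # Sketch of a minimal OpenCL program: add two float arrays on whatever
    # OpenCL device pyopencl picks (a GPU if one is available).
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # pick an OpenCL platform/device
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1024).astype(np.float32)
    b = np.random.rand(1024).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel itself is C-like OpenCL code; each work item handles one index.
    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)

CUDA and AMD's Stream/APP SDK follow the same pattern — copy data to the card, run a kernel over it, copy the result back — just with vendor-specific APIs.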

Even then, cards that old will likely only have 30–80 cores and will not produce life-changing improvements. In comparison, modern cards with 1,000–4,000 cores can reduce a video rendering project from 2 hours to minutes.
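
To put rough numbers on that "hours to minutes" claim: GPU cores are individually much weaker than CPU cores and real speedup depends on how parallel the workload is, so treat the following back-of-the-envelope arithmetic as an idealized estimate built on made-up example figures, not a measurement:

    # Idealized back-of-the-envelope arithmetic only; the core counts and the
    # per-core weighting are illustrative assumptions, not benchmark results.
    cpu_cores = 14                  # a 14-core CPU, as in the comparison below
    modern_gpu_cores = 2048         # somewhere in the 1,000-4,000 range
    gpu_core_relative_speed = 0.05  # assume a GPU core does ~5% of a CPU core's work

    cpu_render_hours = 2.0          # the 2-hour CPU render mentioned above

    # If the workload were perfectly parallel, the speedup would be roughly the
    # ratio of total throughput between the two processors.
    speedup = (modern_gpu_cores * gpu_core_relative_speed) / cpu_cores
    print(f"Estimated speedup: {speedup:.1f}x")
    print(f"Estimated render time: {cpu_render_hours * 60 / speedup:.0f} minutes")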

[Image: CPU vs GPU benchmark comparison]

In that comparison, even 14 CPU cores working at once complete only 7.537, versus what the GPU completes in the same amount of time.

cybernard

Posted 2018-03-28T16:31:05.967

Reputation: 11 200

You blew my mind with the 1,000–4,000 cores. Are they individual CPUs, or does it work differently? – Dylan Missuwe – 2018-03-30T18:34:25.840

@DylanMissuwe They have a more limited instruction set than a CPU, but when you have 4,000 of them it makes up the difference. Note the Fury X2 has 8,192. – cybernard – 2018-03-30T21:59:04.500