Drivers are your problem. A graphics card is a small, self-contained computer built around its onboard components, following what's called a "reference design" — you may have heard the related term "reference card" used in retail for the most basic version of a third-party video card. The board carries its own firmware, which manages the onboard components much like your motherboard's BIOS does, and the drivers install into the operating system to tell the OS how to use the board. Without a matching slot and drivers for your system, that system can't take advantage of the card, and whether a generic "default video driver" will work is hit-and-miss.
My rules on keeping video cards are as follows:
If the card is PCI, keep it or try to sell it. Some motherboards with classic PCI slots still exist out there for HTPC setups built around ARM chips and similar nonstandard CPUs. PCI video cards are hard to find because they serve only this niche demand, though many are less powerful than the integrated graphics those processors already provide.
If the card is PCI-E, keep it. If you can't use it and it's new enough, sell it. If you can use it but don't need it, keep one as a spare, so that if your video card ever dies you can plug in the spare and get by while you shop for a replacement.
If the card is final-generation AGP, like the early GeForce cards or the Voodoo4 and Voodoo5 cards, you might be able to sell it to enthusiasts. If it's an earlier AGP card, recycle it.
Comments:

I usually cut them up and make key-rings, or melt the components off and make notebooks. – Digital Lightcraft – 2018-03-28

I'm not sure a 15-year-old graphics card has any computing power worth using – that's when GPUs with arbitrarily programmable shaders had only started to appear. – user1686 – 2018-03-28