Copper is copper and gold is gold; the conductor metal is the expensive part of any interconnect, and it's the same metal no matter whose name is on the jacket.
What makes a good high-speed cable and connector is the insulation quality (the dielectric material) and tolerance control (thickness and conductor spacing), because those are what set the characteristic impedance. The same applies to the PCB at both ends. There's some cost adder here.
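To illustrate why the dielectric and the mechanical tolerances matter, here's a small Python sketch (my own illustrative numbers, nothing from a standard) using the textbook coaxial-line formula to show how a few percent of variation in conductor diameter or dielectric constant pulls the characteristic impedance off its nominal value:

    import math

    def coax_impedance(D, d, er):
        """Characteristic impedance of a coaxial line, in ohms.

        D  : inner diameter of the shield (mm)
        d  : outer diameter of the center conductor (mm)
        er : relative permittivity (dielectric constant) of the insulation
        """
        return (138.0 / math.sqrt(er)) * math.log10(D / d)

    # Illustrative 75-ohm geometry with solid polyethylene (er ~ 2.25).
    D_nom, d_nom, er_nom = 4.7, 0.72, 2.25
    print(f"nominal: {coax_impedance(D_nom, d_nom, er_nom):.1f} ohm")

    # A 5% error in center-conductor diameter, or a cheaper dielectric
    # with 10% higher er, shifts the impedance and creates reflections.
    print(f"d +5%:   {coax_impedance(D_nom, d_nom * 1.05, er_nom):.1f} ohm")
    print(f"er +10%: {coax_impedance(D_nom, d_nom, er_nom * 1.10):.1f} ohm")

A couple of ohms either way doesn't sound like much, but every impedance step along the path reflects part of the signal, and at multi-gigabit rates those reflections eat directly into the eye opening.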
Then come the transmitter and receiver circuits: how accurately those chips present the exact termination impedance per spec, and how well they hold the correct impedance during the bit transitions (what an eye measurement shows). This is temperature dependent and silicon-process dependent, so the silicon manufacturer needs to put auto-tuning or calibration in place to compensate. Hitting 1 GHz is somewhat easy with any telephone twisted pair; 10 GHz is a real pain.
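To put rough numbers on "exact impedance matching" (my own illustrative values, not from any transceiver datasheet), here's a small sketch of how a termination error turns into a reflection coefficient and return loss, assuming the 100-ohm differential impedance used by Ethernet twisted pair:

    import math

    def reflection(z_load, z0=100.0):
        """Reflection coefficient and return loss for a mismatched termination.

        z0     : nominal differential impedance (100 ohm for Ethernet twisted pair)
        z_load : actual termination presented by the transceiver
        """
        gamma = (z_load - z0) / (z_load + z0)
        return_loss_db = -20.0 * math.log10(abs(gamma)) if gamma else float("inf")
        return gamma, return_loss_db

    for z in (100.0, 105.0, 110.0, 120.0):
        gamma, rl = reflection(z)
        print(f"termination {z:5.1f} ohm -> |gamma| = {abs(gamma):.3f}, return loss = {rl:5.1f} dB")

That termination accuracy is exactly what the on-chip auto-tuning has to hold constant over temperature and process.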
Just yesterday I swapped the two 50' lengths of Cat 5e (running gigabit Ethernet) between two NAS boxes, and now I get no more transmission errors. That proves the receiver makes a difference.
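For anyone who wants to watch the same thing on their own boxes, here's a minimal sketch, assuming a Linux-based NAS and an interface name like eth0 (adjust for your system), that reads the kernel's per-interface error counters out of sysfs:

    from pathlib import Path

    # Counters every Linux NIC exposes under sysfs; a climbing
    # rx_errors / rx_crc_errors count is the usual sign of a bad cable
    # or a marginal receiver.
    COUNTERS = ("rx_errors", "rx_crc_errors", "rx_dropped", "tx_errors")

    def link_errors(iface="eth0"):
        stats = Path("/sys/class/net") / iface / "statistics"
        return {name: int((stats / name).read_text()) for name in COUNTERS}

    if __name__ == "__main__":
        for name, value in link_errors("eth0").items():
            print(f"{name:14s} {value}")

Watch rx_crc_errors in particular: if it climbs while traffic is flowing, the cable/receiver combination is marginal even if the link stays up.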
On a similar topic, we tested a super-expensive Monster Cable video RCA cable against a cheap General Electric one, using a 35 GHz Tektronix network analyzer, and found that the Monster cable didn't even make 35 MHz of bandwidth. The GE cable was clean out to 1000 MHz. Bandwidth required for the analog video signal: >7 MHz, preferably 35 MHz.
Some issues also come from newer features like the "green" switches, which change transmit amplitude (drive strength) depending on cable length and link speed. Sometimes they get jammed at the wrong level, and the data rate gets stuck at 100 Mbit/s.
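A quick way to catch a link that has quietly fallen back (again a sketch, assuming Linux; the interface name and expected speed are placeholders):

    from pathlib import Path

    def check_link(iface="eth0", expected_mbit=1000):
        """Warn if the negotiated speed is below what the cable should support."""
        # sysfs reports the negotiated speed in Mbit/s (-1 if the link is down).
        speed = int((Path("/sys/class/net") / iface / "speed").read_text())
        if speed < expected_mbit:
            print(f"{iface}: link stuck at {speed} Mbit/s (expected {expected_mbit})")
        else:
            print(f"{iface}: {speed} Mbit/s, OK")

    check_link("eth0")

If a gigabit port keeps coming up at 100, re-seat or replace the cable before blaming the switch; ethtool eth0 will also show what was advertised and what was actually negotiated.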
The ultimate test is physical measurement, which isn't something most IT centers bother with. The other option would be a controlled breakout box that attenuates (disrupts) the signal on some path and checks for margin. I haven't seen such a box yet.
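To show what such a box would be measuring, here's a toy simulation (entirely hypothetical numbers, pure standard library): attenuate a unit-amplitude NRZ signal in steps against a fixed receiver noise floor and watch where bit errors start appearing. The attenuation you can add before errors show up is your margin.

    import random

    def bit_errors(attenuation_db, n_bits=200_000, noise_rms=0.08, seed=1):
        """Count decision errors for a unit-amplitude NRZ signal after attenuation."""
        rng = random.Random(seed)
        amplitude = 10 ** (-attenuation_db / 20.0)   # voltage scaling from dB
        errors = 0
        for _ in range(n_bits):
            bit = rng.getrandbits(1)
            tx = amplitude if bit else -amplitude
            rx = tx + rng.gauss(0.0, noise_rms)      # additive receiver noise
            if (rx > 0.0) != bool(bit):              # simple threshold decision
                errors += 1
        return errors

    for att in (0, 6, 12, 18, 24):
        print(f"{att:2d} dB attenuation -> {bit_errors(att)} errors in 200k bits")

On this made-up setup the link runs error-free through 6 dB of added loss, starts throwing errors around 12 dB, and is unusable by 18 dB; a real margin tester would do the same sweep with the actual cable, connectors, and transceivers in the loop.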