I need to build a (cheap) computer that might serve to:
- mine digital currency
- render 3D animations
- solve SETI problems
- ...etc
Basically I am just using the GPUs to solve math problems. I need very little live throughput to/from the cards.
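For context, the compute stack only needs the kernel/driver to expose the card; it does not care how the card is physically attached. Here is a minimal sketch (assuming the pyopencl package and at least one vendor OpenCL driver are installed) that just lists whatever compute devices the host can see:

```python
# List every OpenCL-visible compute device on the host.
# Assumes pyopencl plus a working OpenCL driver (Mesa, ROCm, CUDA, etc.).
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(f"{platform.name}: {device.name}, "
              f"{device.max_compute_units} compute units, "
              f"{device.global_mem_size // (1024 ** 2)} MiB global memory")
```

If the card shows up in a listing like this, the compute frameworks I care about (mining, Cycles, BOINC) can use it.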
My Question
Is there a way to:
- mount video cards through USB, Thunderbolt, or some other chainable protocol
- without writing custom drivers
- on a Linux variant
There are some motherboards that support up to 6 PCIe slots, but it would be so much nicer if I could mount as many cards as the system resources could handle.
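As a sanity check on however the cards end up attached, the Linux kernel already reports how many GPU-class devices it has enumerated on the PCI bus. A rough sketch, assuming a standard sysfs layout (class codes beginning 0x03 are display/3D controllers):

```python
# Count GPU-class devices the Linux kernel has enumerated on the PCI bus.
# Works the same whether the card sits in a physical slot, a riser, or a splitter.
from pathlib import Path

gpus = []
for dev in Path("/sys/bus/pci/devices").iterdir():
    pci_class = (dev / "class").read_text().strip()
    if pci_class.startswith("0x03"):          # 0x03xxxx = VGA / 3D / display controller
        vendor = (dev / "vendor").read_text().strip()
        device_id = (dev / "device").read_text().strip()
        gpus.append((dev.name, vendor, device_id))

print(f"{len(gpus)} GPU-class PCI device(s) visible:")
for addr, vendor, device_id in gpus:
    print(f"  {addr}  vendor={vendor}  device={device_id}")
```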
Not my question
- You would need to power them some other way. Got it. They all need external power.
- USB (and maybe even Thunderbolt) doesn't have the bandwidth for full-rate video output. Got it. I am not using these as video cards per se. See: PCIE to USB/Thunderbolt for Graphics Card
Other, possibly interesting answers
- There is this clustering solution that... (likes GPUs?)
- There are these other processors that might be better suited... (ASICs?)
Discoveries made since asking the question
- A cluster of motherboards so cheap as to be irrelevant compared to the price of the GPUs (see this very interesting dissertation project video). Alas, Raspberry Pis and Arduinos don't seem to have PCIe slots. The HummingBoard-Pro does, but it is $55; my number needs to be under $25 each to be cost effective. Others: Gateworks (price unknown), Intel Galileo with mPCIe ($45 each).
- Samuel Cozennat gives us a gorgeous (but expensive) example using Intel NUCs. He includes the hardware build and provisioning setup. Very nice, Sam! Thanks.
- PCIe can be split somewhat like USB and Thunderbolt... who knew? Here are a couple of limited splitters: Amfeltec, C0C0C3. The PCIe spec indicates that it could theoretically support 32 x1 devices (see the link-width sketch after this list).
- Thunderbolt has the capability (especially for low / non-video data rates), but existing BIOS / mainboard / driver support is not generally developed. There are some existing products that target laptops.
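If a splitter does hand each card only an x1 link, the negotiated width and speed are visible per device in sysfs, which makes it easy to confirm the card still enumerates and to estimate the (modest) bandwidth available. A sketch under the same sysfs assumptions as above; the link attribute files are expected on PCIe endpoints such as GPUs:

```python
# Report the negotiated PCIe link width/speed for each GPU-class device.
# On an x1 splitter port you would expect current_link_width to read "1".
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    if not (dev / "class").read_text().startswith("0x03"):
        continue
    width = (dev / "current_link_width").read_text().strip()
    speed = (dev / "current_link_speed").read_text().strip()
    print(f"{dev.name}: x{width} @ {speed}")
```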
I don't know, but I appreciate your "Not my question" section, as well as your formatting. – trueCamelType – 2017-04-01T04:35:01.373
@trueCamelType, you, me and Johannes would have been bad-asses in 1440... I guess he still is / was ;) – Sy Moen – 2017-04-01T04:52:52.677
So you want to possibly spend several hundred currency units on a graphics card but your budget for all other components is $25? In addition, "rendering 3D animations" isn't just math. As for your question, just look for "external GPU" and you will have various hits. Though there probably is only a very small selection, as even if you're saying you won't need a lot of throughput, you will access RAM frequently. In addition, there might be better solutions (digital mining) than graphics cards if you want to do "serious" work. – Seth – 2017-04-03T07:30:10.213
In the case of a cluster where each CPU supports a single GPU, yes, $25 is my limit per CPU/micro computer. Re 3D animations vs "math", I understand, but it is a very limited compute set for which GPUs are optimized. If I could find Blender ASICs I would use them ;) – Sy Moen – 2017-04-03T07:35:21.733
Could you include more information about the scope/requirements/size of your planned deployment? Running a dual GPU solution is going to be wildly different from running a farm of hundreds of GPUs, and actually specifying some kind of performance requirement would help as well. Right now one could suggest APUs or a farm of very old GPUs as well. – Seth – 2017-04-03T08:06:06.950
Well, the main question is less about the GPUs (or APUs) and more about the controlling hardware. Another way of framing the question is, "Can I reduce the hardware cost by eliminating the need for actual PCIe slots?" However, the scope is tens at least and {n} theoretically. Indeed, not less than 25 in any case. – Sy Moen – 2017-04-03T08:57:42.517
I will have to add that there is a cost element in management which I may be overlooking. Maybe a $45 Intel Galileo to run each video card would actually be cost effective since the software provisioning would be so much simpler. – Sy Moen – 2017-04-03T09:06:10.153