Is it possible to combine processing power of 2 computers?

14

4

Here are a few questions I wish you could enlighten me on.

  1. Is it possible to combine processing power of 2 computers?
  2. How do I do it?

Wern Ancheta

Posted 2010-03-21T22:36:13.263

Reputation: 5 822

7

  • 1) Certainly. 2) With enormous difficulty. – Daniel R Hicks – 2013-08-05T00:27:54.600

    Answers

    11

    Not transparently, where a running program can somehow use the second machine to execute code: the two machines are logically separate, with no way for one CPU to communicate with the other or access its memory.

    That doesn't mean you can't combine processing power:

    1. Specific software might have components that can execute on other machines, e.g. protein folding, SETI@home. These tend to be specialized, i.e. you can't start up Excel and tell it to use another computer for computation.
    2. If you are doing processor intensive tasks, you could use the secondary machine to run them, e.g. encoding/recoding a video stream.

    If you are looking to harness the secondary computer in any way, being able to control it remotely is crucial. Two ways to do this are some sort of remote access (RDP, VNC) or alternatively something like Synergy+.
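    The "components that can execute on other machines" idea from point 1 can be sketched as a worker that accepts sub-tasks over the network and sends results back. In this self-contained sketch both ends run in one process (the worker in a background thread on localhost); on real hardware the worker script would run on the second computer and HOST would be its address. The host, port, and the squaring task are illustrative choices, not part of any existing tool.

```python
# Sketch: a network "worker" computes sub-tasks shipped to it by a
# master. Worker runs in a background thread here purely so the
# example is self-contained; normally it lives on the other machine.
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 5590  # assumption: a free local port
ready = threading.Event()

def worker():
    """Accept one connection, compute squares of the numbers sent."""
    with socket.create_server((HOST, PORT)) as srv:
        ready.set()                 # signal that the worker is listening
        conn, _ = srv.accept()
        with conn:
            data = json.loads(conn.recv(4096).decode())
            result = [n * n for n in data]      # the CPU-heavy part
            conn.sendall(json.dumps(result).encode())

threading.Thread(target=worker, daemon=True).start()
ready.wait()

# "Master" side: ship a chunk of work to the worker, collect results.
with socket.create_connection((HOST, PORT)) as c:
    c.sendall(json.dumps([1, 2, 3, 4]).encode())
    squares = json.loads(c.recv(4096).decode())

print(squares)  # [1, 4, 9, 16]
```

    This is exactly the pattern SETI@home-style software uses, just with real scheduling, fault tolerance, and much bigger work units.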

    mindless.panda

    Posted 2010-03-21T22:36:13.263

    Reputation: 6 642

    1

    Say, for example, I have 4GB of RAM on my laptop and 4GB on my PC; will RDP allow me to virtually run a program with 8GB of RAM? – TGamer – 2017-10-02T00:51:59.907

    5

    One of my most used lines - Yes and No!

    Yes, it is possible - for certain applications that are designed to work this way (commonly known as a cluster).

    No, it is not possible (at least as far as I know) to take two off-the-shelf computers, "tie" them together, and get the combined memory, processing power, and everything else.

    William Hilsum

    Posted 2010-03-21T22:36:13.263

    Reputation: 111 572

    1

    @quack yes, and I said for certain applications... As the question was tagged windows-7, I assumed he meant an everyday computer with off-the-shelf programs... Anyway - don't Beowulf clusters require specially written applications? I am no expert and have never used one, but I quickly read http://www.beowulf.org/overview/index.html (especially the last two paragraphs)

    – William Hilsum – 2010-03-21T23:02:41.593

    sorry, deleted my earlier comment after rereading and noticing you'd already linked to the clustering concept. beowulfs are designed around off-the-shelf components, and are one way of "tying" multiple systems together, but you're right that they don't really work for programs that aren't specifically designed for them. – quack quixote – 2010-03-21T23:48:47.473

    4

    It is very possible! But judging by the simplicity of your question, I assume you would like to simply run a program which will magically make your computer twice as fast, which is not possible.

    You need to understand that when a program runs, it maintains its state by directing the CPU to move data between the HDD, RAM, and CPU registers, as well as addresses on various components (such as video cards or network cards). The trouble with using a CPU from another computer to help you is that it needs access to the same memory, and maintaining a mirror image of your computer's memory on another computer requires so much overhead that it easily defeats the purpose of adding another computer to gain performance :)

    But the kinds of things that can be split among multiple computers are tasks like image rendering or mathematical calculations whose pieces can run independently.

    Nippysaurus

    Posted 2010-03-21T22:36:13.263

    Reputation: 1 223

    3

    If what you are looking for is a method of combining the processing power of two PCs into one, the "easiest" way to do it is to configure both of them as virtual machine hosts using software like VMware ESXi (be forewarned: this requires compatible hardware), create a resource group or cluster, and create a virtual machine that uses the resources of both computers. This is NOT going to get you a full 2x speed (you'll lose resources to virtualization), and it is a limited solution due to the likely compatibility requirements, but it is the most "correct" answer to your question. The virtual machine will act like a single PC with the processing power of both hosts, minus the overhead required to sustain virtualization.

    George Spiceland

    Posted 2010-03-21T22:36:13.263

    Reputation: 391

    I'm afraid communication between hosts will be the bottleneck. – gronostaj – 2013-12-17T21:21:36.690

    1

    Yes, communication between hosts can be a bottleneck; this is part of the overhead involved. Non-gigabit networks would be the prime culprit, but given the cheap and prolific availability of gigabit I don't feel this is especially penalizing. It is true of all network-distributed processing, however, and a VM is a far more usable scenario with significantly less complication than remote procedure calls and custom-written applications dedicated to (what would have to be anyway) network-distributed computing. – George Spiceland – 2013-12-17T21:53:25.750

    2

    I agree with the other answers:

    • If you have an enormous, multi-sheet Excel workbook, and you want to be able to run Excel twice as fast (updating formulas and scenarios, running macros, etc.), you’re out of luck.
    • If you have a custom application that is easily partitioned, such as finding the square root of every integer from 1 to 1,000,000, it should be easy for you to break the problem into pieces and distribute them.
    • If you have a custom application like calculating the first 1,000,000 digits of π (pi), you may be able to do it, if you understand the problem space well enough.

    If you are talking about developing software to run in a distributed (multi-computer) environment, here are a couple of suggestions:

    • Use remote procedure calls (RPCs).  Just as you can make a host a file server or a web server, RPCs allow you to make a machine, essentially, a CPU server.  Conceptually, you would have one master machine, which would be an RPC client, and it would call library functions that would transparently be executed on the server.  In its simplest form, this architecture would not give you any performance benefit, since only one CPU would be executing at any instant.  However, in the asynchronous model, the client could start a remote procedure on the server and then do other things while the server is running.
    • Use a language designed for parallel processing, such as Unified Parallel C (UPC).  This is an extension of the C language with facilities for distributed data and simultaneous execution.
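    The RPC approach in the first bullet can be sketched with Python's standard-library xmlrpc modules: the server exposes a CPU-bound function, and the client calls it as if it were local. For a self-contained sketch, both ends run in one process on localhost (the server in a background thread); on real hardware the server would run on the second computer and the proxy URL would point at its address. The port number and the heavy_sum function are illustrative, not part of any existing library.

```python
# Sketch of the RPC "CPU server" idea: a client calls a function
# that transparently executes on the server machine. Server runs in
# a background thread here only so the sketch is self-contained.
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

def heavy_sum(n):
    """Stand-in for a CPU-intensive library function."""
    return sum(i * i for i in range(n))

server = SimpleXMLRPCServer(("127.0.0.1", 8765), logRequests=False)
server.register_function(heavy_sum)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the call looks local, but the work runs on the server.
proxy = ServerProxy("http://127.0.0.1:8765")
result = proxy.heavy_sum(100)
print(result)  # 328350
server.shutdown()
```

    In the asynchronous variant described above, the client would fire the call off (e.g. from a thread) and keep doing its own work while the server computes.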

    Scott

    Posted 2010-03-21T22:36:13.263

    Reputation: 17 653