Disconnecting PuTTY with Amazon Web Services EC2


I am using PuTTY to submit a bash file to Amazon Web Services EC2 to run R files. The cursor in PuTTY disappears when I submit the bash file and does not reappear until all of the R files have finished running (currently ~40 minutes later, but eventually 10-20 hours later).

I have never seen this before when using clusters at various universities and NGOs. Historically the cursor has always reappeared immediately.

If I shut my laptop computer down before all of the R files have finished running will this cause the R files to crash or otherwise not run to completion?

Is there a way I can get the cursor to reappear immediately in PuTTY?

Mark Miller

Posted 2019-12-01T11:36:52.470


Answers


I have never seen this before when using clusters at various universities and NGOs. Historically the cursor has always reappeared immediately.

From what I know, the difference is that most clusters tell you to use special commands like qsub for batch job submission. These commands return instantly because they don't run the script immediately – they just add it to the queue and a separate system decides where and when to run your script.

On AWS, however, you don't have any of that – it's a completely plain Linux system. If you directly run a Bash script via ./myscript.sh or bash myscript.sh, then it does run immediately right there in your terminal. And like with any other program that you run, the shell prompt won't reappear until the program is finished.

If I shut my laptop computer down before all of the R files have finished running will this cause the R files to crash or otherwise not run to completion?

Yes: when a terminal is closed (i.e. when the connection is "hung up"), all processes started from it receive the SIGHUP ("hangup") signal telling them to exit. A program can choose to ignore it, but most exit immediately.
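You can see this behaviour directly by sending the hangup signal to a backgrounded process yourself; a minimal sketch, with sleep standing in for a real long-running job:

```shell
# Background a job, then send it SIGHUP by hand to mimic what happens
# when the SSH connection is closed.
sleep 300 &
pid=$!
kill -HUP "$pid"          # same signal the terminal sends on disconnect
wait "$pid"
echo "exit status: $?"    # 129 = 128 + 1 (SIGHUP): killed by the hangup signal
```
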

Is there a way I can get the cursor to reappear immediately in PuTTY?

Here are the most common options for long-running tasks on Linux:

  1. Use Bash's job control, i.e. append & to run a command in the background. For example:

    ./slowscript.sh &
    

    If you forgot to add the &, you can press Ctrl+Z (to suspend the program) followed by the bg command to put the already-running program in the background.

    Note that programs under job control will still get terminated when you close the SSH connection. To avoid this, you also need to disown each job – you will no longer be able to see its status, but it will also no longer be killed on hangup.

  2. Use a "terminal multiplexer" such as tmux or Screen. Instead of backgrounding individual jobs, they let you put a whole terminal session in the background.

    So you can first run tmux to create a "detachable" terminal, then start your R script from there, and while it's running press Ctrl+B followed by D to detach it.

    (Even if you forget to detach, it will happen automatically when disconnecting – the programs will remain running inside tmux.) Later you can use tmux at[tach] to reattach.

  3. In some Linux distributions you can also use systemd to start background jobs. This feature first needs to be enabled with sudo loginctl enable-linger $USER.

    Once enabled, you can start background tasks using systemd-run:

    systemd-run --user --same-dir ./myscript.sh
    

    Later you can use systemctl --user … to check the job's status.
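A minimal sketch of option 1 (sleep is a hypothetical stand-in for the long-running script):

```shell
# Start the job in the background; the prompt returns immediately.
sleep 300 &
pid=$!          # $! holds the PID of the most recent background job
disown %+       # remove the newest job from the shell's job table,
                # so it won't receive SIGHUP when the session closes
kill "$pid"     # cleanup only for this demo
```

After disown, the job no longer appears in the output of the jobs builtin, which is exactly why the shell can no longer report its status.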
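Option 2 might look like this in practice (the session name rjob and the script name are placeholders):

```shell
# Create a detached session already running the script, confirm it
# exists, and reattach to it later.
tmux new-session -d -s rjob './run_all_models.sh'   # hypothetical script name
tmux ls                                             # list running sessions
tmux attach -t rjob                                 # reattach; Ctrl+B then D detaches again
```
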
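And a sketch of option 3, assuming a working systemd user session (not available in every container); the unit name r-batch is hypothetical, and --unit just makes the transient job easy to find afterwards:

```shell
# Run the script as a named transient user unit.
systemd-run --user --same-dir --unit=r-batch ./myscript.sh
systemctl --user status r-batch.service    # check whether it's still running
journalctl --user -u r-batch.service       # read the script's output/logs
```
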

user1686

Posted 2019-12-01T11:36:52.470


This is very helpful. Can you tell me how to disown a job? I can try to look it up. Or do your Steps 2 & 3 describe how to disown? – Mark Miller – 2019-12-01T12:57:24.543

Literally use the command disown, after putting the process in the background. (Also, these are not steps 2 & 3, these are alternative methods 2 & 3.) – user1686 – 2019-12-01T12:59:20.033

Other pages might also suggest using nohup to run the script; I keep forgetting it. It is very similar in effect to the first method described above ('script&' + 'disown'). – user1686 – 2019-12-03T13:36:47.550