
I have some autonomous scripts that run commands on remote machines over SSH. These scripts rely on getting the stdout, stderr, and return code of each command they run. I want to be able to monitor the progress of the scripts on each target machine, so that I can see if something has hung and intervene if necessary.

My initial idea was to have the scripts run commands in a screen session, so that the person monitoring could simply attach to the session with `screen -x`. However, that is hard to do from a script, since screen is an interactive program. I can send a command to the screen session with `screen -S session -X stuff "command^M"`, but then I don't get back the output and return code that I need.

My second idea was to put `script /path/to/log` in `~/.bash_profile` and log the entire session to a file. Then the monitoring person could simply `tail` the log file. However, this doesn't provide the interactivity I was looking for.

Any ideas on how to solve this problem?
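For concreteness, here is a minimal sketch of what each run needs to capture. `sh -c` is a local stand-in for the real `ssh user@address '<command>'` call, and `/tmp/cmd.err` is just an example path:

```shell
#!/bin/sh
# Sketch of the per-command capture the scripts depend on.
# 'sh -c ...' stands in for: ssh user@address '<command>'
out=$(sh -c 'echo listing; echo warning >&2' 2>/tmp/cmd.err)
rc=$?
err=$(cat /tmp/cmd.err)
echo "stdout: $out"
echo "stderr: $err"
echo "rc: $rc"
```

Whatever the solution is, it has to preserve all three of these while still letting a human watch the session live.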

Michael

5 Answers


I use PHP for a shell script that manages my Minecraft server, and I use the server log to get my replies. Here is the code I use to list who is online: I `tail` the log to get the last few lines and parse them for a timestamp, to make sure the response came after I sent the command.

#!/usr/bin/php
<?php
// CRLF is used for line endings below; define it so the constant exists.
define('CRLF', "\r\n");
function send_cmd( $command )
{
        exec('screen -S minecraft -X stuff "`printf "\\r' . $command . '\\r"`"');
}

function who()
{
        // is_running() (defined elsewhere in this script) checks the screen session
        if (!is_running())
        {
                echo 'Server is not running.' . CRLF;
                return 4;
        }

        // Get the current time and send the command
        $before = time() - 1;
        send_cmd('list');

        // Wait for the server to provide a response
        while(time() < $before + 5) {
                usleep(250000); // sleep() only takes whole seconds; usleep() for a 0.25 s poll
                $result = exec('tail ' . __DIR__ . '/server.log | grep "\[INFO\] Connected players"');
                $stamp = strtotime(substr($result, 0, 19));
                if ($before <= $stamp)
                        break;
                unset($result);
        }

        if (isset($result))
        {
                echo $result . CRLF;
                echo 'Server responded in ' . ($stamp - $before) . ' seconds.' . CRLF;
                return 0;
        }
        else
        {
                echo 'Server did not respond.' . CRLF;
                return 4;
        }
}
Josh Brown

That's a lot of human interaction. I'd suggest the following:

  1. Deploy Check_MK and set up its log monitoring
  2. Pipe the script output to log files and have Check_MK monitor for errors

In this scenario a person gets alerted only when something goes wrong. They can review the log messages via a web interface, acknowledge them so that other people on the team know things are being handled, and decide when to take action based on the nature of the problem.
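A rough sketch of step 2, with a placeholder command standing in for the real `ssh user@address ...` call, `/tmp/remote-script.log` as an example log path, and `ERROR:` as whatever pattern you configure Check_MK's log monitoring to match:

```shell
#!/bin/sh
# Wrap each command so its output and exit status land in the log.
LOG=/tmp/remote-script.log
: > "$LOG"

run_logged() {
    # Append stdout+stderr to the log; flag non-zero exit codes with a
    # line the logwatch rules can match on.
    "$@" >>"$LOG" 2>&1
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "ERROR: '$*' exited with status $rc" >>"$LOG"
    fi
    return $rc
}

run_logged true            # clean run: nothing flagged
run_logged false || true   # failing run: ERROR line written (don't abort the sketch)
```

The wrapper preserves each command's return code for the calling script while leaving a greppable trail for the monitoring side.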

ztron

I'd suggest using `nohup` - it writes stdout to `nohup.out`. You can background the script with nohup and safely log out, and it will continue to run. See `man nohup` for more details.

Typical usage: `nohup /usr/local/bin/myscript.sh &`

It runs the script in the background and doesn't require you to stay logged in. `nohup.out` will be in the working directory where the command was run, or `$HOME/nohup.out` if permissions there don't allow it. You can then check the contents of `nohup.out` for any issues you need to address and deal with them accordingly.

To address the interactivity requirement, you can run the script using both screen and nohup; that way you can bring the process back into the foreground if it has hung waiting on your input.
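A quick sketch of the nohup half, using a short inline job as a stand-in for the real script and an explicit redirect so the log location is predictable; the screen combination is shown as a comment since the session name is up to you:

```shell
#!/bin/sh
# Run a stand-in job under nohup in the background; redirect output
# explicitly instead of relying on the default ./nohup.out location.
cd /tmp
rm -f job.log
nohup sh -c 'echo step 1; echo step 2' > job.log 2>&1 &
wait $!        # in real use you'd log out; 'tail -f job.log' to watch progress

cat job.log

# Combining with screen (hypothetical session name) keeps it attachable:
#   nohup screen -d -m -S scriptsession /usr/local/bin/myscript.sh &
```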

Hope this helps

sandroid
  • Thanks for the response. It's a good thought but I don't think it will work for my specific scenario (sorry if the question was unclear). The script runs commands like `ssh user@address 'ls -al'`, no script is actually run on the target machines. I want those remote commands to be executed in a screen session so that as an administrator, I could ssh to the target machine, attach to the screen session, and watch the commands as they are being executed by the script. I can't do something like `ssh -t user@address 'screen -S scriptsession'` because my script can't interact with the screen program – Michael Nov 08 '10 at 20:41
  • I could interact with the screen program with Expect, but I don't think that would get me the output I need. – Michael Nov 08 '10 at 20:46

You should look at Fabric - it does just what you describe. You'll need to do some Python-ish scripting, and you'll be able to run your existing scripts, though you might find it more effective to execute commands from Fabric directly. You'll see what I mean when you run the tutorial: http://docs.fabfile.org/0.9.2/

IAPaddler

If you want to use screen, you could try something like this:

ssh user@address screen -d -m <command>

The -d -m argument combination makes screen start a new session without attaching to it.

I use a variant of this command to run "daemons" that don't daemonize (like Minecraft) on startup and it works nicely. I can connect to the screen session later and see the full output and issue server commands like normal. Below is the startup script variant, which runs the screen session as someone other than root (for /etc/rc.local):

sudo -u <someuser> -i screen -d -m <command>
shiftycow