I'm running a ZMQ 'push' interface via Ratchet on an Apache server. It works very well on the command line, interacting with my server exactly the way I'd like. Here's the code:
//script1.php
<?php
echo exec('php script2.php');
?>
//script2.php
<?php
$entryData = array(
    'category' => 'modelLmdap',
    'job_id'   => '1234',
    'text'     => '',
    'status'   => ''
);
$context = new ZMQContext();
$socket = $context->getSocket(ZMQ::SOCKET_PUSH, 'my pusher');
$socket->connect("tcp://localhost:5555");
$socket->send(json_encode($entryData));
?>
When I execute script1.php from the command line, I see a popup in my server window saying 'request logged'. However, if I run it via a browser I get a blank page and no output on the server.
I looked around and found that this may be due to Apache's user not being allowed to run commands at the command line, and that a suggested solution is to add the following to my sudoers file (using sudo visudo):
www-data ALL=NOPASSWD: ALL
Still nothing! Just a blank page. I switched the command to exec() and now I get this error:
Fatal error: Class 'ZMQContext' not found in /home/username/server/qap/v2/tools/push.php on line 8
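In case it's relevant, here's a quick sanity check I'm planning to run from the browser to see which PHP binary exec() is actually picking up and whether the zmq extension is loaded there. The file name is just something I made up for testing; the commands themselves are standard CLI calls:
//check.php
<?php
// Which php binary does the web server's exec() find on its PATH?
echo exec('which php') . "\n";
// Is the zmq extension loaded in that CLI binary?
echo exec('php -r "var_dump(extension_loaded(\'zmq\'));"') . "\n";
// And is it loaded in the Apache SAPI serving this very page?
var_dump(extension_loaded('zmq'));
?>
My hunch is that the php found by exec() isn't the same one Apache loads the zmq extension for, but I haven't confirmed that.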
Any ideas?
EDIT: I've been asked why I want to do all this, which is a solid question.
TL;DR: I can't think of a way to execute another PHP script as a service – curl isn't working locally and include doesn't cut it.
I'm performing a number of complex statistical calculations which require a lot of resources and take far too long to produce any output. This results in peculiar script execution problems, including blank pages for no apparent reason (even when I've extended the max script execution time) and AJAX errors despite timeouts being maxed. So I'm farming out the calculations on the server side to a number of different background processes, all of which are PHP scripts. This also allows me to update the user while the long calculation runs.
The call in question executes a 'controller' script which controls the calculation farm (is that the right terminology? Idk). That script then communicates with a push server (via ZMQ), which pushes the data out to the clients.
The original code was AJAX heavy and I really didn't fancy updating ALL of it to use websockets, so I've settled for a 'push' style using Ratchet as the socket server. This means I can make my original AJAX call to start the whole process and the page then receives updates from the server, one-way. This has a lot of advantages; my favourite is that the communication is inherently one-way, which saves me having to implement lots of security features on the server's end.
Unfortunately, for all this to work the Ajax-ed script needs to be able to shell_exec the 'farm' scripts, as they need to run as background processes, which is how I ran into my problem (a rough sketch of the kind of background launch I mean is below). I've tried curl(), but it just grabs the bare text, and include() just results in the other scripts being run as part of the Ajax-ed script, which defeats the whole point.
I guess long-term I could modify the 'push' server to be fully duplex, allowing the initial data to be transmitted from the web page straight to an asynchronous 'farm server' (I know ReactPHP is capable of this, as is node.js), but as a n00b to socket server programming I thought it best to avoid that until I have a working model.
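For reference, the kind of background launch I have in mind from the Ajax-ed script looks roughly like this. The binary path and the controller script name are placeholders for my setup, not something I've confirmed works through Apache:
//launch_farm.php
<?php
// Full path to the CLI php binary, so I don't depend on Apache's PATH
$php = '/usr/bin/php';
// The 'farm' controller script to run in the background (hypothetical path)
$script = '/home/username/server/qap/v2/tools/controller.php';
// Redirect output and detach with & so the Ajax request returns immediately
shell_exec($php . ' ' . escapeshellarg($script) . ' > /dev/null 2>&1 &');
?>
My understanding is that without the output redirection and the trailing &, shell_exec would block until the farm script finished, which is exactly what I'm trying to avoid.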
If there's anything I've missed then let me know; I hate insecure websites just as much as you do!