
I have a Python script that blocks while it waits for data on a Redis list.
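
The script is essentially a loop that blocks on the Redis list, something along these lines (simplified; the list name and handler are just placeholders):

import os
import redis

def process(message):
    # real handling goes here
    print(message)

r = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"))

while True:
    # blpop blocks until an item is pushed onto the list
    _key, message = r.blpop("message_queue")
    process(message)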

It runs fine in upstart using the following:

description "stage message consumer"
author "Nilesh Ashra"

start on started mountall
stop on shutdown

respawn

exec sudo -u user REDIS_HOST=0.0.0.0 ENVIRONMENT=my_env /usr/bin/python /path/to/message_consumer.py

My question is: can I use upstart to spin up, say, 12 of these?

If not, can you recommend a way to do this?

2 Answers


I could be off on this, as it's been a while since I worked on interpreted-language daemons, but I think the "right" way to do this is to architect your program to listen on the port, then fork another process to handle a bundle of requests/jobs while the parent continues to listen for more connections.

You might want to look for code samples on how to create simple web servers to see how to do something like this. Mailq is right that you can't have multiple processes listen to the same IP and port pair at the same time.

So: fork worker processes that do their jobs, then sync back up with a control process.
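
A minimal sketch of that listen-then-fork shape in Python, assuming a plain TCP socket (the port and the reply are only illustrative, and error handling is omitted):

import os
import socket

# Parent listens; each accepted connection is handed to a forked child.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 8000))
listener.listen(5)

while True:
    conn, _addr = listener.accept()        # parent waits for a connection
    if os.fork() == 0:                     # child: handle it, then exit
        listener.close()
        conn.sendall(b"handled by pid %d\n" % os.getpid())
        conn.close()
        os._exit(0)
    conn.close()                           # parent: drop its copy, keep listening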

Bart Silverstrim

No, you can't. You cannot listen multiple times on the same port, so if the first instance already occupies the port (say, 80), then no other instance can listen on it.

But even if it is not a TCP/IP-listening daemon, it is still a strange requirement. Why would one start a command that does exactly the same thing multiple times in parallel?
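
You can see the port limitation directly: a second bind() to an address/port pair that is already in use fails (the port number here is arbitrary):

import socket

a = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
a.bind(("127.0.0.1", 8000))
a.listen(1)

b = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    b.bind(("127.0.0.1", 8000))
except OSError as e:
    print(e)    # typically "Address already in use"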

mailq
  • Not necessarily true. If you use a superserver like xinetd you can listen on the port in xinetd and have it accept the socket, then fork the Python processes as children; that way they can inherit the file descriptor. – Matthew Ife Nov 27 '11 at 21:08
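
For reference, with a classic xinetd stream service in the default nowait mode, each forked child gets the accepted connection wired to stdin/stdout, so the Python side needs no socket code at all (an illustrative sketch, not the asker's script):

import sys

# xinetd has already accepted the connection; it is attached to stdin/stdout.
line = sys.stdin.readline()
sys.stdout.write("got: %s" % line)
sys.stdout.flush()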