Python Forum

2 or more processes on the write end of the same pipe
I want to start 2 or more processes with the stdout of each writing to the write end of the same pipe. My code will then read from the read end of that pipe (possibly many lines) and eventually kill or signal all the processes (so it also needs to get all of their process IDs). Of course, my code decides what executable (and args) to run in each.

There are tools to run processes, but they create a separate pipe for each process. I want all the output combined into one single pipe, so that the next read gets output from whichever process wrote first.
When the pipe is created, the ends are unique. The only way to get multiple things writing into it is to fork them from the same process. I don't know how you would do that via subprocess.
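That said, Popen does accept an existing file descriptor for stdout, so on POSIX a sketch like this (the echo commands are just placeholders) might get several children writing into one pipe:

# Sketch: two subprocess children sharing one os.pipe() write end (POSIX).
import os
import subprocess as sp

r, w = os.pipe()
p1 = sp.Popen(['echo', 'hello from p1'], stdout=w)   # both children inherit w
p2 = sp.Popen(['echo', 'hello from p2'], stdout=w)
os.close(w)                  # close the parent's copy so the reader can see EOF
with os.fdopen(r, 'rb') as pipe:
    for line in pipe:        # lines from both children, in arrival order
        print('Read:', line)
p1.wait()
p2.wait()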

I would expect a process to instead put all the receiving ends into a select() and then read from whichever one is ready. But I don't know that you can select() on pipes on Windows, and I'm not sure what the best way is on that platform.
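For the one-pipe-per-child layout, the select() loop might look something like this sketch (POSIX only, since on Windows select() only accepts sockets; the commands are placeholders):

# Sketch: one pipe per child, multiplexed with select() (POSIX only).
import os
import select
import subprocess as sp

cmds = [['echo', 'one'], ['echo', 'two']]       # placeholder commands
procs = [sp.Popen(c, stdout=sp.PIPE) for c in cmds]
readers = {p.stdout: p for p in procs}
while readers:
    ready, _, _ = select.select(list(readers), [], [])
    for f in ready:
        chunk = os.read(f.fileno(), 1024)       # won't block once select() fired
        if chunk:
            print('Read:', chunk)
        else:                                   # EOF: that child has exited
            f.close()
            del readers[f]
for p in procs:
    p.wait()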
I've done this before in C; it only needs what's in the os module, I believe. I was just curious whether there was something provided that does more of this. Otherwise, I'll do it the way I did it in C, though that way could still use subprocess. What I did in C was fork a child process over the pipe, which forked all the others except the last one, which it just ran execve(2) on.

Or it could fork them all and exit() silently.
Here is a solution using socket pairs instead of pipes. The advantage is that there are fewer blocking issues: avoiding blocking with pipes may involve using threads and queues to read from the pipe, so I like the option to make sockets non-blocking.

The example below shows a program main.py that starts two subprocesses, a.py and b.py, both sending their output to the same socket.

# main.py
import subprocess as sp
import socket

def main():
    print('Starting')
    r, w = socket.socketpair()
    r.settimeout(0.2)               # recv() raises socket.timeout instead of blocking forever
    r.shutdown(socket.SHUT_WR)      # r is used only for reading,
    w.shutdown(socket.SHUT_RD)      # w only for writing
    a = sp.Popen(['python3', 'a.py'], stdout=w)   # both children write to w
    b = sp.Popen(['python3', 'b.py'], stdout=w)
    procs = [a, b]
    while procs:
        try:
            x = r.recv(1024)
            print('Read:', repr(x))
        except socket.timeout:
            print('Timed out!')
            procs = [p for p in procs if p.poll() is None]   # drop finished children
    r.close()
    w.close()
    print('Bye')

if __name__ == '__main__':
    main()
# a.py
from time import sleep
import sys

for i in range(5):
    print(i)
    sys.stdout.flush()
    sleep(0.5)
# b.py
from time import sleep
import sys

for i in 'abcdef':
    print(i)
    sys.stdout.flush()
    sleep(0.3)
Output:
λ python3 main.py
Starting
Read: b'a\n'
Read: b'0\n'
Timed out!
Read: b'b\n'
Timed out!
Read: b'1\n'
Read: b'c\n'
Timed out!
Read: b'd\n'
Read: b'2\n'
Timed out!
Read: b'e\n'
Timed out!
Read: b'3\nf\n'
Timed out!
Timed out!
Read: b'4\n'
Timed out!
Timed out!
Timed out!
Bye
You could do something similar with r, w = os.pipe() instead of r, w = socket.socketpair(), but then you have to manage the blocking problem when you try to read from r.
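For instance, a sketch of that os.pipe() variant using os.set_blocking(), so the read raises instead of hanging (reusing the same a.py and b.py):

# Sketch: the same loop with os.pipe() and a non-blocking read end.
import os
import subprocess as sp
import time

r, w = os.pipe()
os.set_blocking(r, False)    # os.read() now raises BlockingIOError when empty
a = sp.Popen(['python3', 'a.py'], stdout=w)
b = sp.Popen(['python3', 'b.py'], stdout=w)
os.close(w)                  # parent keeps only the read end
procs = [a, b]
while True:
    try:
        data = os.read(r, 1024)
        if not data:         # EOF: every writer has closed its end
            break
        print('Read:', data)
    except BlockingIOError:  # pipe is empty right now
        procs = [p for p in procs if p.poll() is None]   # reap finished children
        time.sleep(0.1)
os.close(r)
print('Bye')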
I got it working using os.pipe() and os.fork(). I omitted any effort to track processes for os.wait(), because I am leaving them all running until I kill them all and exit. Every try with subprocess and multiprocessing failed (exceptions kept happening in the child processes), but I wasn't trying it the way you did.

These child processes all run various ping commands. The parent reads the pipe to get all of the ping return events. Next I will add code to accumulate stats in real time, with output to files and/or stdout. One planned example is a one-minute sliding-window average comparing loss to multiple IPs in parallel. Roughly, the shape of what I have so far is sketched below.
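This is a reconstruction with placeholder IPs rather than the exact code:

# Sketch: one forked child per ping, all writing into one pipe (POSIX).
import os
import signal

ips = ['192.0.2.1', '192.0.2.2']        # placeholder targets
r, w = os.pipe()
pids = []
for ip in ips:
    pid = os.fork()
    if pid == 0:                        # child: wire stdout to the pipe, exec ping
        os.close(r)
        os.dup2(w, 1)
        os.close(w)
        try:
            os.execvp('ping', ['ping', ip])
        finally:
            os._exit(127)               # only reached if the exec failed
    pids.append(pid)                    # parent keeps every PID for the later kill
os.close(w)                             # parent only reads

try:
    with os.fdopen(r, 'rb') as pipe:
        for line in pipe:               # one ping event per line, from any child
            print(line.decode().rstrip())
except KeyboardInterrupt:
    pass
finally:
    for pid in pids:
        os.kill(pid, signal.SIGTERM)    # signal all the children on the way out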