Python Forum
2 or more processes on the write end of the same pipe
#1
I want to start 2 or more processes with the stdout of each writing to the write end of the same single pipe. My code will then read from the read end of that pipe (possibly many lines) and eventually kill or signal all of the processes (so it also needs all of their process ID numbers). Of course my code decides what executable (and args) to run in each.

There are tools to run processes, but they give each process its own separate pipe. I want all of the output combined into one single pipe so that each read gets output from whichever process wrote next.
#2
When the pipe is created, the ends are unique. The only way to get multiple things writing into it is to fork them from the same process. I don't know how you would do that via subprocess.

I would expect a process to instead put all of the receiving ends into a select() and then read from whichever one is ready. But I don't know whether you can select() on pipes on Windows; I'm not sure what the best approach is on that platform.
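
On a POSIX platform, a rough sketch of that multi-pipe approach using the selectors module (a thin wrapper over select()/epoll()) might look like this; the echo commands are just placeholders:

# multi_pipe_select.py -- sketch: one pipe per child, multiplexed with selectors (POSIX only)
import os
import selectors
import subprocess as sp

sel = selectors.DefaultSelector()
cmds = [['echo', 'hello from A'], ['echo', 'hello from B']]   # placeholder commands
procs = []
for cmd in cmds:
    p = sp.Popen(cmd, stdout=sp.PIPE)
    procs.append(p)
    sel.register(p.stdout, selectors.EVENT_READ, p)           # one read end per child

while sel.get_map():                       # loop until every pipe has hit EOF
    for key, _ in sel.select():
        data = os.read(key.fd, 1024)       # raw read; sidesteps buffered-readline pitfalls
        if data:
            print('pid', key.data.pid, 'sent', repr(data))
        else:                              # EOF: that child closed its write end
            sel.unregister(key.fileobj)
            key.fileobj.close()

for p in procs:
    p.wait()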
#3
I've done this before in C. It only needs what's in the os module, I believe; I was just curious whether something was provided that does more of this. Otherwise I'll do it the way I did in C, though that way could still use subprocess. What I did in C was fork a child process over the pipe; that child forked all the others except the last, which it simply ran with execve(2).

Or it could fork all of them and then exit() silently.
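
For reference, a minimal sketch of how the same write end could be shared through subprocess (POSIX assumed, and the echo commands are placeholders): since Popen accepts an existing file descriptor for stdout, the same os.pipe() write end can simply be handed to every child.

# shared_pipe.py -- sketch: several subprocess children writing into one os.pipe()
import os
import subprocess as sp

r, w = os.pipe()
cmds = [['echo', 'hello from A'], ['echo', 'hello from B']]   # placeholder commands
procs = [sp.Popen(cmd, stdout=w) for cmd in cmds]             # every child shares the same write end
os.close(w)                 # close the parent's copy so EOF arrives once all children exit

with os.fdopen(r) as reader:
    for line in reader:                    # interleaved lines from all of the children
        print('got:', line.rstrip())

for p in procs:
    p.wait()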
#4
Here is a solution using a socket pair instead of pipes. The advantage is that there are fewer blocking issues: avoiding blocking with pipes may involve using threads and queues to read from the pipe, so I like having the option of setting a timeout or making the socket non-blocking.

The example below shows a program main.py that starts two subprocesses, a.py and b.py, both sending their output to the same socket.

# main.py
import subprocess as sp
import socket

def main():
    print('Starting')
    r, w = socket.socketpair()
    r.settimeout(0.2)               # recv() raises socket.timeout instead of blocking forever
    r.shutdown(socket.SHUT_WR)      # the parent only reads from r ...
    w.shutdown(socket.SHUT_RD)      # ... and the children only write to w
    a = sp.Popen(['python3', 'a.py'], stdout=w)
    b = sp.Popen(['python3', 'b.py'], stdout=w)
    procs = [a, b]
    while procs:
        try:
            x = r.recv(1024)
            print('Read:', repr(x))
        except socket.timeout:
            print('Timed out!')
            # drop children that have exited; the loop ends once none remain
            procs = [p for p in procs if p.poll() is None]
    r.close()
    w.close()
    print('Bye')

if __name__ == '__main__':
    main()
# a.py
from time import sleep
import sys

for i in range(5):
    print(i)
    sys.stdout.flush()
    sleep(0.5)
# b.py
from time import sleep
import sys

for i in 'abcdef':
    print(i)
    sys.stdout.flush()
    sleep(0.3)
Output:
λ python3 main.py
Starting
Read: b'a\n'
Read: b'0\n'
Timed out!
Read: b'b\n'
Timed out!
Read: b'1\n'
Read: b'c\n'
Timed out!
Read: b'd\n'
Read: b'2\n'
Timed out!
Read: b'e\n'
Timed out!
Read: b'3\nf\n'
Timed out!
Timed out!
Read: b'4\n'
Timed out!
Timed out!
Timed out!
Bye
You could do something similar by using r, w = os.pipe() instead of r, w = socket.socketpair(), but then you have to manage the blocking problem when you try to read from r.
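
For completeness, a rough sketch of that os.pipe() variant, using selectors to get a timeout on the read end instead of threads; this assumes a POSIX platform where pipes can be polled, and it reuses the a.py and b.py above:

# main_pipe.py -- sketch of the same idea with os.pipe() and a polled read end
import os
import selectors
import subprocess as sp

def main():
    print('Starting')
    r, w = os.pipe()
    a = sp.Popen(['python3', 'a.py'], stdout=w)
    b = sp.Popen(['python3', 'b.py'], stdout=w)
    os.close(w)                          # keep only the children's copies of the write end
    sel = selectors.DefaultSelector()
    sel.register(r, selectors.EVENT_READ)
    while True:
        if sel.select(timeout=0.2):      # plays the role of the socket timeout above
            x = os.read(r, 1024)
            if not x:                    # EOF: every writer has exited
                break
            print('Read:', repr(x))
        else:
            print('Timed out!')
    sel.close()
    os.close(r)
    print('Bye')

if __name__ == '__main__':
    main()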
#5
I got it working using os.pipe() and os.fork(). I omitted any effort to track the processes for os.wait() because I leave them all running until I kill them all and exit. Every attempt with subprocess and multiprocessing failed (exceptions kept happening in the child processes), but I wasn't trying it the way you did.

These child processes all run various ping commands. The parent reads the pipe to get all of the ping reply events. Next I will add code to accumulate statistics in real time, with output to files and/or stdout. One planned example is a one-minute sliding-window average comparing loss to multiple IPs in parallel.
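
For the sliding-window part, one possible sketch of a per-IP one-minute reply counter; the function names and the one-reply-per-second assumption are illustrative, not taken from the actual code:

# sliding_window.py -- sketch: per-IP one-minute reply counts for a loss estimate
import time
from collections import defaultdict, deque

WINDOW = 60.0                                   # sliding window length in seconds

replies = defaultdict(deque)                    # ip -> deque of reply timestamps

def record_reply(ip, now=None):
    """Call once for each ping reply line read from the pipe."""
    replies[ip].append(time.monotonic() if now is None else now)

def replies_last_minute(ip, now=None):
    now = time.monotonic() if now is None else now
    q = replies[ip]
    while q and now - q[0] > WINDOW:            # discard entries older than the window
        q.popleft()
    return len(q)

def loss_percent(ip, interval=1.0):
    """Rough loss estimate assuming one ping per `interval` seconds.
    Overestimates loss during the first minute, before the window fills."""
    expected = WINDOW / interval
    return max(0.0, 100.0 * (1.0 - replies_last_minute(ip) / expected))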

