Python Forum
faster socket with multiprocessing.Pipe
#1
I have a socket receiving a TCP stream that was getting bottlenecked by some of the processing.  I fixed the problem by giving the socket its own dedicated process with multiprocessing.Process and having it pass its received data to other processes via multiprocessing.Pipe.  It works, but it's really ugly and just feels wrong.  Is there a better way to do this?
from multiprocessing import Pipe, Process
from multiprocessing.managers import SyncManager
import socket, signal

def mySock(HOST,PORT,pipe_tx):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((HOST, PORT))
    try:
        while True:
            data_chunk = sock.recv(4096)
            if not data_chunk:  # recv() returns b'' when the peer closes the connection
                break
            pipe_tx.send(data_chunk)
    except KeyboardInterrupt:
        print("mySock received KeyboardInterrupt")
    finally:
        sock.close()


def worker(pipe_rx):
    '''this worker receives data from pipe_rx and processes it'''
    pass

def init_mgr():
    '''initialize the process manager'''
    signal.signal(signal.SIGINT, signal.SIG_IGN)

if __name__ == "__main__":
    HOST, PORT = "localhost", 12345
    processes = []

    manager = SyncManager()
    manager.start(init_mgr)

    # simplex pipe: pipe_rx is receive-only, pipe_tx is send-only
    pipe_rx, pipe_tx = Pipe(duplex=False)

    try:
        # Socket process
        p  = Process(target=mySock, args=(HOST, PORT, pipe_tx))
        p.daemon = True
        p.start()
        processes.append(p)

        # worker process
        p = Process(target=worker, args=(pipe_rx,))
        p.daemon = True
        p.start()
        processes.append(p)
     
        try:
            for process in processes:
                process.join()
        except KeyboardInterrupt:
            print("KeyboardInterrupt in __main__")

    finally:
        manager.shutdown()
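For comparison, here is the shape of a queue-based variant I've been considering: multiprocessing.Queue handles its own locking, so several workers can pull from the same socket-reader process without wiring up a Pipe end per worker. This is just a sketch; the reader function is a hypothetical stand-in for the real socket loop, and the byte strings and the .upper() "processing" are placeholders.

```python
from multiprocessing import Process, Queue

def reader(q):
    # Hypothetical stand-in for the socket recv() loop:
    # push received chunks onto the shared queue.
    for chunk in (b"chunk1", b"chunk2", b"chunk3"):
        q.put(chunk)
    q.put(None)  # sentinel marks end of stream

def worker(q, results):
    # Pull chunks until the sentinel arrives, "process" them,
    # and report results back on a second queue.
    while True:
        chunk = q.get()
        if chunk is None:
            results.put(None)  # pass the sentinel along
            break
        results.put(chunk.upper())  # placeholder processing

if __name__ == "__main__":
    q, results = Queue(), Queue()
    for target, args in ((reader, (q,)), (worker, (q, results))):
        Process(target=target, args=args, daemon=True).start()
    while True:
        item = results.get()
        if item is None:
            break
        print(item)
```

With more workers, each would need its own sentinel (or a shared shutdown flag), since any one worker consumes the single None.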
#2
I'd be interested in seeing the original code, before you resorted to multiprocessing. Unless it's a lot of data on the line, I don't think you should have needed to do that.
#3
(Dec-06-2016, 07:55 PM)nilamo Wrote: I'd be interested in seeing the original code, before you resorted to multiprocessing.  Unless it's a lot of data on the line, I don't think you should have needed to do that.
I have a lot of code.  Let's pretend my code is already about as efficient as it can be and really does require the 10 post-processing processes.  Is there an ideal way to read variable-length block data from a socket and get it to the post-processors?
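For context, the framing scheme I've been experimenting with is a length prefix: each block is sent as a fixed 4-byte length header followed by the payload, so the receiver always knows where one block ends and the next begins. A minimal sketch below; send_msg, recv_msg, and recv_exactly are names I made up for this example, not stdlib functions.

```python
import socket
import struct

HEADER = struct.Struct("!I")  # 4-byte big-endian unsigned length prefix

def send_msg(sock, payload):
    """Prefix each message with its length so the reader knows where it ends."""
    sock.sendall(HEADER.pack(len(payload)) + payload)

def recv_exactly(sock, n):
    """Loop on recv() until exactly n bytes have arrived (recv may return less)."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return bytes(buf)

def recv_msg(sock):
    """Read one complete length-prefixed message."""
    (length,) = HEADER.unpack(recv_exactly(sock, HEADER.size))
    return recv_exactly(sock, length)
```

The socket process could then call recv_msg() in its loop and hand whole blocks to the post-processors, instead of forwarding raw 4096-byte recv() chunks that may split a block in the middle.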
