Python Forum
How to use multiprocessing with an array that is continually being appended - Printable Version




How to use multiprocessing with an array that is continually being appended - Apretext - Jun-25-2019

I have a (long!) script that I want to speed up considerably by splitting it into 3 modules and running them in parallel rather than sequentially, as they currently run. Each module will need to read from several arrays which are constantly being appended to, and I'm a little confused as to which multiprocessing method is best for that, since each module will potentially be reading the information at different times (i.e. process a could be writing value 100 to the array while process b is reading value 50 and process c value 30).

Would a pipe have to send every value in the array I want to pass all at once, or can I keep appending to it as I go? If it helps, the information only has to travel one way.


RE: How to use multiprocessing with an array that is continually being appended - nilamo - Jun-25-2019

I think a Queue is what you're looking for. One process can add things to it, another can take them out. If each of your tasks/processes has an "in" and an "out" queue, you can chain them together into a pipeline that works on the data.

https://docs.python.org/3/library/multiprocessing.html#multiprocessing.Queue
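
As a rough sketch (the worker function and the None sentinel here are just placeholders, not anything from your actual script), one process can put() values while another get()s them:

from multiprocessing import Process, Queue

def worker(in_queue):
    while True:
        item = in_queue.get()      # blocks until something is available
        if item is None:           # sentinel: the producer is finished
            break
        print("processing", item)  # stand-in for the real work

if __name__ == "__main__":
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()

    for value in range(5):         # the "continually appended" data
        q.put(value)
    q.put(None)                    # tell the worker we're done

    p.join()

Putting None at the end is just one simple way to tell the worker nothing else is coming.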


RE: How to use multiprocessing with an array that is continually being appended - Apretext - Jun-26-2019

Perfect, thank you. Do you know if you can have more than one queue per process? (i.e. in my case, RGB values plus x, y coordinates)

Edit: Or, alternatively, can I use a queue to send a numpy array?


RE: How to use multiprocessing with an array that is continually being appended - nilamo - Jun-26-2019

Queues can hold any serializable (picklable) Python object, which is pretty much anything except things like an open file handle or a socket. And there's no limit to the number of arguments, queues included, that you can pass to a process.
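
numpy arrays pickle fine, so as a rough sketch (all the names here are made up) you could put an array and its coordinates on a single queue as one tuple:

from multiprocessing import Process, Queue

import numpy as np

def consumer(q):
    rgb, x, y = q.get()            # unpack the tuple we put() below
    print(rgb.shape, "at", x, y)

if __name__ == "__main__":
    q = Queue()
    p = Process(target=consumer, args=(q,))
    p.start()

    rgb = np.zeros((64, 64, 3), dtype=np.uint8)   # made-up example data
    q.put((rgb, 10, 20))           # the array plus its x, y coordinates, as one item

    p.join()

That way each item the other process receives already carries everything it needs.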


RE: How to use multiprocessing with an array that is continually being appended - woooee - Jun-27-2019

You can also use a Manager list or dictionary: https://pymotw.com/3/multiprocessing/communication.html#managing-shared-state
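
Something like this quick sketch (the appender function is just an illustration): a Manager list behaves like a normal list that any process can append to or read from.

from multiprocessing import Manager, Process

def appender(shared):
    for i in range(5):
        shared.append(i)           # appends are visible to every other process

if __name__ == "__main__":
    with Manager() as manager:
        shared = manager.list()
        p = Process(target=appender, args=(shared,))
        p.start()
        p.join()
        print(shared[:])           # copy out the contents: [0, 1, 2, 3, 4]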


RE: How to use multiprocessing with an array that is continually being appended - Apretext - Jun-27-2019

So is put essentially the same as appending? (ish)


RE: How to use multiprocessing with an array that is continually being appended - nilamo - Jun-27-2019

Yes. Instead of appending, you put() things into the queue, and then call get() to remove an item from it. get() will block the calling process until there's actually something to take out of the queue.

The JoinableQueue is also pretty useful, as it allows you to just call .join() on the queue when you're done adding things to it, and it'll block until the queue's been fully processed. https://docs.python.org/3/library/multiprocessing.html#multiprocessing.JoinableQueue

Something like:
from multiprocessing import JoinableQueue as JQueue, Process

# imagine this is the worker process
def processor(in_queue):
    while True:
        item = in_queue.get()
        print("processing", item)  # or whatever you actually do with each item
        # now that we're done with this item, let the queue know we've finished it
        in_queue.task_done()

if __name__ == "__main__":
    queue = JQueue()
    # create the worker here, passing (queue,) as args so it has access to the queue;
    # daemon=True means it won't keep the program alive after the main process exits
    worker = Process(target=processor, args=(queue,), daemon=True)
    worker.start()

    # now we can add things to the queue, and the worker will process them
    queue.put("test thing")
    queue.put("something else")

    # then we just wait for the worker to finish processing everything we sent it
    queue.join()

    # at this point we're back in the main process, the queue is empty,
    # and everything that was in it has been fully processed



RE: How to use multiprocessing with an array that is continually being appended - Apretext - Jun-27-2019

Excellent, thank you for your help.