Python Forum
sharing memory(read-write) between multiple processes - Printable Version

+- Python Forum (https://python-forum.io)
+-- Forum: Python Coding (https://python-forum.io/forum-7.html)
+--- Forum: General Coding Help (https://python-forum.io/forum-8.html)
+--- Thread: sharing memory(read-write) between multiple processes (/thread-22137.html)



sharing memory(read-write) between multiple processes - mike000 - Oct-31-2019

I need multiple processes to update (append data to) an array in memory, so it will grow large.
the code I'm posting here doesn't do the desired...

import multiprocessing
import ctypes
import numpy as np
shared_array = None


shared_array_base = multiprocessing.Array(ctypes.c_int64, 1 * 10)
shared_array = np.ctypeslib.as_array(shared_array_base.get_obj())
shared_array = shared_array.reshape(1, 10)
# Parallel processing
def my_func(i):
    '''
    Any time a process runs this code, the shared array should be stacked
    with the contents of arr, so after running it e.g. ten times and all
    processes return, shared_array.shape == (11, 10).
    Any pointers in the right direction on how to do that?
    '''
    global shared_array
    arr = np.array([44, 44, 44, 44, 44, 44, 44, 44, 44, 44])
    shared_array = np.vstack((shared_array, arr))
    #print (shared_array)


if __name__ == '__main__':

    pool = multiprocessing.Pool(processes=2)
    pool.map(my_func, range(10))

    print(shared_array)  # gives [[0 0 0 0 0 0 0 0 0 0]]



RE: sharing memory(read-write) between multiple processes - woooee - Oct-31-2019

Quote:the code I'm posting here doesn't do the desired...
What in the heck does that mean?

To pass data to/from/between Processes, you use a Manager object.
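A minimal sketch of that suggestion, assuming a Manager-backed list of rows is acceptable (the argument packing and variable names below are illustrative, not from the thread). Pool workers don't see changes to the parent's globals, so the list proxy is passed to each worker explicitly:

```python
import multiprocessing

import numpy as np


def my_func(args):
    """Append one row of 44s to the Manager-backed list.

    The list proxy is passed in as an argument because pool workers run
    in separate processes and do not share the parent's globals.
    """
    shared_rows, i = args
    shared_rows.append([44] * 10)


if __name__ == '__main__':
    with multiprocessing.Manager() as manager:
        # start with one row of zeros, mirroring the (1, 10) array above
        shared_rows = manager.list([[0] * 10])
        with multiprocessing.Pool(processes=2) as pool:
            pool.map(my_func, [(shared_rows, i) for i in range(10)])
        # copy the proxied data back into a regular NumPy array
        result = np.array(list(shared_rows))
        print(result.shape)  # (11, 10)
```

The Manager runs a server process that owns the list; every append goes through a proxy, which is slower than true shared memory but safe for concurrent appends from several workers. For large arrays where that overhead matters, `multiprocessing.shared_memory` (Python 3.8+) is the lower-level, faster alternative.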