Dec-07-2019, 03:10 PM
I'm now having trouble using multiprocessing on another part.
This part of my code creates multiple images. I wanted to make it more efficient by creating several at once, so the work finishes quicker. I thought it would be as simple as integrating the code I used before, but it doesn't work.
This time, rather than writing to a file, I am creating multiple images, each of which is a large numpy array. I pass these arrays into the queue, so it now holds the image data. The next step would be to read the queue and save all of the images; however, the process(es) seem to hang when I pass these arrays.
I printed some debug lines and this is the output:
Output:
create_making_process
processes [<Process(Process-2, started)>]
while loop
create_making_process
processes [<Process(Process-2, started)>, <Process(Process-3, started)>]
while loop
out of loop
This is the correct output; however, after the last line 'out of loop', I would expect to see a numpy array being printed (because I call queue.get()), but nothing happens and it just hangs.
It seems when I send a KeyboardInterrupt, sometimes this is the line it is hanging on:
lock.acquire(block, timeout)
but sometimes it seems to be n = write(self.??, buf)
(I can't remember the full line, and I couldn't get it to appear.)
This is the code that creates the processes (it's literally what I had last time):
n_frames = Image.open(options['to_create']).n_frames # gets amount of frames in gif
queue = multiprocessing.Queue()
while (len(processes) < n_frames): # checks for current processes alive being less than all the files that need to be worked on
    if (len(processes) - len([p for p in processes if not p.is_alive()]) < 3): # if the current amount of live processes is less than 3 - limiting at 3 no matter what because it uses a lot of ram
        def create_making_process(processes, frame, q): # create some more processes
            print("create_making_process")
            p = multiprocessing.Process(target = MakeImage.create_image_gif, args=(frame, q)) # create a new process
            processes.append(p) # add it to array
            p.start() # start it
        create_making_process(processes, index, queue)
        index += 1
        print("processes", processes)
    print("while loop")
print("out of loop")
In create_image_gif, I work on the image and then add it to the queue as a numpy array like this:
q.put([new_img])
What is causing this problem?