Python Forum

Full Version: Multiprocessing Module Running An Infinite Child Process Even After Completion
I have sample code that uses the joblib and multiprocessing modules. The code works fine when run from the command line, but when I package it as an executable with PyInstaller, multiple processes spawn as new instances infinitely (creating new child process IDs in the background).

After scouring a lot of sources, I was advised to call multiprocessing.freeze_support() at the beginning of my if __name__ == "__main__": block.

This resolved the infinite invocation of my main script, but after the code completed there was still a leftover child process running in the background, invoked with a new process ID each time and always with the process name below:

Output:
user 2741 1 28 14:25 pts/5 00:00:00 /home/user/ParallelProcessing -E -s -B -S -c from multiprocessing.semaphore_tracker import main;main(5)
After a few seconds (the process ID changes):

Output:
user 2745 1 28 14:25 pts/5 00:00:00 /home/user/ParallelProcessing -E -s -B -S -c from multiprocessing.semaphore_tracker import main;main(5)
I would just like to know what I am missing in my code to avoid this issue.

Code:
from joblib import Parallel, delayed
import multiprocessing
import time

inputs = range(10)


def processInput(i):
    print("Started : ", i)
    time.sleep(i)
    print("Completed : ", i)
    return i * i


def main():
    num_cores = multiprocessing.cpu_count()
    backend = 'threading'
    print(num_cores)
    results = Parallel(n_jobs=num_cores, backend=backend)(delayed(processInput)(i) for i in inputs)
    print(results)


if __name__ == "__main__":
    multiprocessing.freeze_support()
    main()
I have even tried passing the backend='threading' argument to the Parallel() call, but to no avail.
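To isolate whether joblib itself starts the tracker process, the same threading-style workload can be expressed with only the standard library's thread pool. This is just a sketch for comparison (with the sleep shortened so it finishes quickly); since all work stays inside one process, no semaphore tracker should appear:

```python
from concurrent.futures import ThreadPoolExecutor
import multiprocessing
import time

inputs = range(10)

def processInput(i):
    # shortened sleep so the example finishes quickly
    time.sleep(i * 0.01)
    return i * i

def main():
    num_cores = multiprocessing.cpu_count()
    # Threads share the parent process, so no child processes (and no
    # semaphore/resource tracker) should be spawned for this workload.
    with ThreadPoolExecutor(max_workers=num_cores) as pool:
        return list(pool.map(processInput, inputs))

if __name__ == "__main__":
    multiprocessing.freeze_support()
    print(main())
```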