Python Forum

How to collate Multiprocessing-Process results?
Consider the following code, where I calculate the factorial of four really large numbers and save each output to a separate .txt file (out_mp_{idx}.txt). I use multiprocessing (4 processes) to reduce the computation time. This works fine, but I want all four results in one file. Of course I could open each of the four files the code creates and append them to a new file, but that is not what I want (the code below is just a simplified version of mine; in practice I have far too many files to handle that way, which defeats the purpose of saving time with multiprocessing). Is there a better way to automate this so that the results from all the processes are dumped/appended to a single file?

I also tried the Pool.imap route, but that was not as computationally efficient as the code below.

from multiprocessing import Process
import os
import time

tic = time.time()

def factorial(n, idx):  # function to calculate the factorial

    num = 1
    while n >= 1:
        num *= n
        n = n - 1

    with open(f'out_mp_{idx}.txt', 'w') as f0:  # saving output to a separate file
        f0.write(str(num))

def My_prog():

    jobs = []
    N = [10000, 20000, 40000, 50000]  # numbers for which factorial is desired
    n_procs = 4
    
    # executing multiple processes
    for i in range(n_procs):  
        p = Process(target=factorial, args=(N[i], i))
        jobs.append(p)

    for j in jobs:
        j.start()

    for j in jobs:
        j.join()

    print(f'Exec. Time:{time.time()-tic} [s]')

if __name__ == '__main__':
    My_prog()
The python.org write-up on multiprocessing is quite good.

It's worth a read if you haven't done so already.
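For what it's worth, here is a minimal sketch of one way to collate the results (this is not the original poster's code, and out_mp_all.txt is just a placeholder name): have each worker return its factorial instead of writing its own file, let a multiprocessing.Pool collect the results in the parent, and write them all to a single file there.

from multiprocessing import Pool
import time

def factorial(n, idx):  # same computation as above, but return the result instead of writing a file
    num = 1
    while n >= 1:
        num *= n
        n -= 1
    return idx, num

def my_prog():
    tic = time.time()
    N = [10000, 20000, 40000, 50000]  # numbers for which factorial is desired

    # starmap blocks until all workers finish and returns the
    # results in the same order as the inputs
    with Pool(processes=4) as pool:
        results = pool.starmap(factorial, [(n, i) for i, n in enumerate(N)])

    # one file, all results
    with open('out_mp_all.txt', 'w') as f0:
        for idx, num in results:
            f0.write(f'{idx}: {num}\n')

    print(f'Exec. Time: {time.time() - tic} [s]')

if __name__ == '__main__':
    my_prog()

Returning the value does mean it gets pickled back to the parent, but that is normally cheap compared to computing factorials of this size. If you prefer to keep explicit Process objects, passing a multiprocessing.Queue to each worker and reading the results back in the parent works in much the same way.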