The attached image goes into detail about what I'm trying to do. This is the specific queue passed in:
```python
p = multiprocessing.Process(target=worker, name=str(i), args=(qList[i],))
```

I would like to extract that queue, and perform a .put() on it, from a different worker instance that is running in parallel.
Also if there's a different way to get the current process id in the worker function, to be able to associate it with the current 'individual' command passed in the queue, this would be appreciated. I would prefer not to use os.getpid().
This is the code, with some pseudocode:
```python
import multiprocessing
import queue

def worker(individualCMDfromASubQue):
    cmdToExecute = individualCMDfromASubQue.get(block=False)
    workerPID = qPIDs.get()
    # ~if a certain type of command, assign workerPID:
    setPID(individualCMDfromASubQue, workerPID)
    # The problem with this approach is that it relies on the multiprocessing
    # to execute in a specific order.
    # ~if a certain condition is met, send the present individualCMDfromASubQue
    # ~to a different subqueue, using the PID as a reference number
    for idx, i in enumerate(jobs):
        if i.pid == getRecordedPID_ofArecentCMD_stillExecuting(var1):
            if not subQueue[idx].full():  # order that list is appended
                subQueue[idx].put(cmdToExecute)
                # again relies on order; what if the subqueue is later deleted?

subQueue = [multiprocessing.Queue(maxsize=2) for i in range(2)]
qPIDs = queue.Queue()
jobs = []

if __name__ == '__main__':
    # pseudocode: pre-fill the two subqueues
    subQueue[0] = [<command 1>, <command 2>, <command 3>]
    subQueue[1] = [<command 4>, <command 5>, <command 6>]
    for i in range(len(subQueue)):
        p = multiprocessing.Process(target=worker, args=(subQueue[i],))
        jobs.append(p)
    for p in jobs:
        p.start()
        qPIDs.put(p.pid)  # Generated after starting
```
Edit: I was able to use the memory address of the queue as a reference point, to decide which queue to put said command in.
```python
multiprocessing.current_process().pid
```

was helpful, but it would require me to write more code. I was able to add more items to each worker's queue and have the workers continue on to the next while True: iteration if their qsize() is not 0.

IMO, relying on the memory address isn't the ideal solution for getting the args= queue that was passed as an argument.
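Since the processes above are created with name=str(i), one alternative to both os.getpid() and memory addresses is to read multiprocessing.current_process().name inside the worker and treat it as the worker's index. A small sketch assuming that naming scheme (the results queue here is hypothetical, added only to show what each worker saw):

```python
import multiprocessing

def worker(q, results):
    # The parent set name=str(i) when creating this process, so the worker
    # can recover its own index without os.getpid() or a separate PID queue.
    me = multiprocessing.current_process()
    idx = int(me.name)            # the index the parent assigned via name=
    cmd = q.get()
    results.put((idx, cmd))       # report back for demonstration purposes

if __name__ == '__main__':
    qList = [multiprocessing.Queue() for _ in range(2)]
    results = multiprocessing.Queue()
    for i, q in enumerate(qList):
        q.put("command %d" % i)
    jobs = [multiprocessing.Process(target=worker, name=str(i),
                                    args=(qList[i], results))
            for i in range(len(qList))]
    for p in jobs:
        p.start()
    # Drain results before join() to avoid the documented queue/join deadlock.
    collected = sorted(results.get() for _ in range(2))
    for p in jobs:
        p.join()
    print(collected)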