Here's an idea for a little project: a command that takes two (or more, if you want to implement that) commands as arguments (or elsewhere, whatever you decide), runs them in parallel in separate processes, and waits for all of them to finish before exiting.
I have done this in C, but a Python version would be nice example code to have around.
Sure. How's this?
import shlex
import subprocess
import sys

def SpinTask(task):
    """Split the command string into arguments and start it as a child process."""
    task = shlex.split(task)
    # stdout is piped but never read, so the children's output is swallowed
    proc = subprocess.Popen(task, stdout=subprocess.PIPE)
    return proc

help_text = '''
Feed me some commands!
I go do them, and return when they're all done!
ANY OUTPUT IS CONSUMED! MY HUNGER IS INSATIABLE!
'''

if __name__ == '__main__':
    programs = sys.argv[1:]
    if not programs:
        print(help_text)
    else:
        # start everything running first...
        processes = [SpinTask(program) for program in programs]
        # ...then wait for each child to finish
        for process in processes:
            process.wait()
asyncio can also spawn subprocesses, so that could be a cool way for someone to do it.
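Here's a minimal sketch of that asyncio approach, assuming Python 3.7+ for asyncio.run(); it uses create_subprocess_shell so each argument is parsed by the shell rather than shlex:

```python
import asyncio

async def run_all(commands):
    # Start each command as a child process; the shell does the word-splitting
    procs = [await asyncio.create_subprocess_shell(
                 cmd, stdout=asyncio.subprocess.DEVNULL)
             for cmd in commands]
    # Wait for every child concurrently; gather preserves the input order
    return await asyncio.gather(*(proc.wait() for proc in procs))

# Run two shell commands in parallel and collect their exit statuses
exit_codes = asyncio.run(run_all(["sleep 0.2", "sleep 0.1"]))
```

Both sleeps run at the same time, so the whole thing takes about as long as the slowest command, not the sum.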
@nilamo looks good to me
My C version had a process pool size limit and would substitute each line of input (stdin) for a marker character in the command pattern; an option could select a different marker character. I never thought to try it in bash, but now I would do it in Python. I plan to convert lots of things I have done in either C or bash to Python.
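A Python conversion of that could look something like the sketch below. The function name, the default pool size of 4, and "%" as the default marker are all my own choices, not anything from the C version:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_pool(pattern, lines, pool_size=4, marker="%"):
    """Substitute each input line for `marker` in `pattern` and run the
    resulting commands, at most `pool_size` at a time."""
    def run_one(line):
        command = pattern.replace(marker, line)
        # shell=True so the substituted string is parsed like a shell command
        return subprocess.run(command, shell=True).returncode
    # The executor caps how many commands run concurrently
    with ThreadPoolExecutor(max_workers=pool_size) as pool:
        return list(pool.map(run_one, lines))

# xargs-style usage, feeding one filename per stdin line into the pattern:
#   import sys
#   run_pool("gzip -k %", [line.rstrip("\n") for line in sys.stdin])
```

Threads are enough here because each worker just blocks in subprocess.run(); the real parallelism comes from the child processes themselves.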