Python Forum

Full Version: building a long variable-length command-pipeline
there are many tools that allow executing system commands, and some can provide input and/or capture output.  but suppose you have a series of commands, with the number of commands known only at run time, that need to be run as a pipeline (output of each command goes to input of the next).  you may also need to provide input to the first and/or capture output from the last.  how do you do this?
sys.stdout.write() # pipe out
sys.stdin.read()  # pipe in
(Dec-12-2017, 05:24 AM)wavic Wrote: sys.stdout.write() # pipe out
sys.stdin.read()  # pipe in

that builds a pipeline?
You read from standard input and write to standard output, right?

#!/usr/bin/env python3

import sys

pipe_in = sys.stdin.read().strip()

if len(sys.argv) > 1:
    sys.stdout.write('{} {}\n'.format(pipe_in, ' '.join(sys.argv[1:])))
else:
    sys.stdout.write('{}\n'.format(pipe_in))
You have to be careful with the quotes: if you use double quotes for the strings you can get an EOL error because of bash quoting. It took me some time to get this example right. I had to test it before publishing it Confused  

victor@jerry:/tmp$ echo hello | ./piped.py world
hello world

victor@jerry:/tmp$ echo hello | ./piped.py
hello

victor@jerry:/tmp$ echo hello | ./piped.py | ./piped.py
hello

victor@jerry:/tmp$ echo hello | ./piped.py beautiful | ./piped.py world
hello beautiful world

victor@jerry:/tmp$ echo hello | ./piped.py beautiful | ./piped.py beautiful world
hello beautiful beautiful world
shells like bash can take a particular syntax and understand a series of commands to be put into a pipeline, when a command string with that syntax is given to them.  the shell makes a number of system calls to construct that pipeline, including the 2 you showed, many times.  what i am looking for is code in Python that carries out a similar setup, execution, and completion.  i am not looking for a few example syscalls.  i am looking for how Python coders would carry out those steps.  part of the reason is to see what Python library functions might get used, how they would be used, and how they fit into the steps.  i am less interested in how a command pipeline string would be parsed.  you could just assume the commands to be pipelined are in a list ... a list of commands ... a list of lists of strings (a list of lists of parsed command tokens).  this could be defined as a function run_pipeline(list_of_lists_of_tokens).   fyi, like so many things, i've done this, before, in C.
subprocess has PIPE built in.
shell=False is the default, and you should not use shell=True just so you can pass in the whole shell pipe command as one string, for security reasons.

So look at a command like this: grep -o "hello" foo.txt | wc -l
It searches for hello in foo.txt; split it at the |.
stdout=subprocess.PIPE tells subprocess to send that output to the respective file handle.
import subprocess

# grep -o "hello" foo.txt | wc -l
grep_proc = subprocess.Popen(['grep', '-o', 'hello', 'foo.txt'], stdout=subprocess.PIPE)
wc_proc = subprocess.Popen(['wc', '-l'], stdin=grep_proc.stdout, stdout=subprocess.PIPE)
grep_proc.stdout.close()  # let grep receive SIGPIPE if wc exits first
result, err = wc_proc.communicate()  # communicate() returns (stdout, stderr)
print(result.decode())
can you show how to connect TWO commands with that built-in PIPE in subprocess?  can you then expand that to N commands from a list of commands?

btw, the .communicate() method is bad in some cases because it waits for the command to complete before you get any data.  using it in a pipeline would preclude processing the output in parallel with the commands.
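one way to address both points: connect N commands from a list, then stream the last command's output line by line as it arrives instead of calling .communicate(). this is a sketch, not the forum poster's code; the commands are portable stand-ins that use the Python interpreter itself, purely for illustration:

```python
import subprocess
import sys

# a 2-stage pipeline: emit two lines, then uppercase them
# (hypothetical stand-ins for shell tools like grep | tr)
cmds = [
    [sys.executable, '-c', "print('one'); print('two')"],
    [sys.executable, '-c', "import sys; sys.stdout.write(sys.stdin.read().upper())"],
]

procs = []
prev_out = None
for argv in cmds:
    p = subprocess.Popen(argv, stdin=prev_out, stdout=subprocess.PIPE)
    if prev_out is not None:
        prev_out.close()  # parent drops its handle so EOF/SIGPIPE propagate
    prev_out = p.stdout
    procs.append(p)

# read the final output as it is produced; the earlier
# stages keep running in parallel while we consume it
lines = []
for line in procs[-1].stdout:
    lines.append(line)
    print(line.decode(), end='')

for p in procs:
    p.wait()
```

the same loop over cmds works for any number of commands; only the first and last processes need special treatment (feeding input and reading output, respectively).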