Python Forum

Full Version: Windows/Linux shell emulation
Hello.
I'm looking for a way to run commands and execute programs on both Windows and Linux systems, to make my servers more accessible than juggling multiple SSH clients.
I am using three scripts. The first is the server code, which sends and receives commands.
The server receives a command in the form "ID task command".
Multiple clients connect to the server and wait for a command to be sent to them.
Finally, the controller sends the commands along with the ID of the system to be controlled.

The part I need help with is getting the output (or lack of one) back to the server without crashing or timing out.

I understand I need to use subprocess, but I'm only sending strings to the clients, and they are system-specific commands with an arbitrary number of arguments.
For example, I could send "1 command ls -lh" to a Linux client.
This breaks down to: client 1, run command "ls -lh", which should return its output.
However, some commands won't return any output, like launching an exe on Windows.
Is there a stable method to run commands across multiple platforms?
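A minimal sketch of unpacking the "ID task command" string described above, assuming the first two whitespace-separated tokens are the client ID and the task type (the function name `parse_message` is hypothetical, not from the original post):

```python
import shlex

def parse_message(message):
    """Split a raw "ID task command" string into its three parts.

    The first two tokens are the client ID and the task type;
    everything after them is the command line itself, which may
    contain spaces.
    """
    client_id, task, rest = message.split(maxsplit=2)
    # shlex.split respects shell-style quoting, so 'echo "a b"'
    # stays a single argument instead of being split on the space.
    argv = shlex.split(rest)
    return client_id, task, argv

print(parse_message("1 command ls -lh"))
# → ('1', 'command', ['ls', '-lh'])
```

Using `shlex.split` rather than a plain `str.split` keeps quoted arguments intact, which matters once the commands are arbitrary.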
You could do it by building an RPC server and clients. A library capable of this is zerorpc; there is an example on its GitHub page. Or mprpc, which is based on msgpack-rpc, which in turn uses MessagePack.

So unpack the message (the command string) and pass it to subprocess:

import subprocess

command = message.split()
subprocess.call(command)
It should be simple.
(Nov-25-2017, 09:02 PM)wavic Wrote: You could do it by building an RPC server and clients. [...] It should be simple.

Running the shell commands is not really a massive issue; the problem comes when trying to get their output.
Currently I'm using
subprocess.check_output(command). However, if the command is "C:\\exe\\putty.exe" then there will never be any output, and the system falls apart as it waits for the expected output or for the command to complete (which will never happen).

So my question is: without defining every single command and creating my own syntax system, is there a way I can launch both executables and arbitrary commands with output, and set a response timeout (without closing the socket)?
For example, if I set a 5-second timeout and the client has not sent a response within that time, simply stop waiting for a response and let me send a new command through.
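One possible answer to the question above, sketched with the standard library only: `subprocess.run` accepts a `timeout` argument and raises `subprocess.TimeoutExpired` (killing the child) when it fires, while `subprocess.Popen` launches a program like putty.exe without waiting for it at all. The helper name `run_with_timeout` and the status string are my own, not from the thread:

```python
import subprocess
import sys

def run_with_timeout(argv, timeout=5):
    """Run a command and return its output, giving up after `timeout` seconds.

    Returns captured stdout+stderr on success, or a short status string
    if the command produced nothing in time.
    """
    try:
        result = subprocess.run(
            argv, capture_output=True, text=True, timeout=timeout
        )
        return result.stdout + result.stderr
    except subprocess.TimeoutExpired:
        # subprocess.run kills the child before re-raising, so nothing
        # is left hanging on the client.
        return "no output within {}s".format(timeout)

# For a GUI program you may not want to wait at all: Popen starts the
# process and returns immediately (path here is illustrative only).
# subprocess.Popen([r"C:\exe\putty.exe"])

# Demonstration with a command that sleeps longer than the timeout:
print(run_with_timeout([sys.executable, "-c", "import time; time.sleep(10)"],
                       timeout=1))
```

The client can send whatever `run_with_timeout` returns back over the socket, so the server always gets *some* reply within the timeout window and the connection stays open.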
One way is to create aliases for all commands and use them on all machines instead of the system commands.
Also, it's not so hard to create multiple SSH connections using threading or a similar approach. I say it's not so hard, but I'd have to push my head to figure it out; this is not what I do for a living.

If you decide to do it in Python, there will be servers on all machines, because all of them are listening for a command, and the distribution of the commands has to be handled as well. PyZMQ is a good one for distributed messaging. I hope someone whose job is in this area joins the discussion.

Looks like a project. You're not going to achieve it in two days.