python - Paramiko - Running commands in "background"
I've implemented Paramiko using exec_command; however, the command I'm running on the remote machine(s) can take several minutes to complete.
During this time the Python script has to wait for the remote command to complete and receive stdout.
My goal is to let the remote machine "run in the background" and allow the local Python script to continue once it sends the command via exec_command.
I'm not concerned with stdout at this point; I'm only interested in bypassing the wait for stdout so the script can carry on while the command runs on the remote machine.
Any suggestions?
Current script:
    import paramiko

    def function():
        ssh_object = paramiko.SSHClient()
        ssh_object.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh_object.connect(address, port=22, username='un', password='pw')
        command = 'command to run'
        try:
            stdin, stdout, stderr = ssh_object.exec_command(command)
            stdout.readlines()  # blocks until the remote command finishes
        except Exception:
            pass
Thank you!
Use a separate thread to run the command. Threads should be cleaned up with a join (the exception being daemon threads, which you expect to run until your program exits). Exactly how you do this depends on what else your program is doing. An example is:
    import threading

    def ssh_exec_thread(ssh_object, command):
        stdin, stdout, stderr = ssh_object.exec_command(command)
        stdout.readlines()

    def function():
        ssh_object = paramiko.SSHClient()
        ssh_object.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh_object.connect(address, port=22, username='un', password='pw')
        command = 'command to run'
        thread = threading.Thread(target=ssh_exec_thread, args=(ssh_object, command))
        thread.start()
        # ...do something else...
        thread.join()
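If the output genuinely doesn't matter and you only want to fire the command off, the daemon-thread exception mentioned above can be used instead of joining. This is a minimal sketch (my addition, not part of the original answer), assuming you accept that the thread and its SSH session are dropped when the main program exits:

    # Reuses ssh_object and command from the example above (assumed already set up).
    import threading

    def ssh_exec_thread(ssh_object, command):
        stdin, stdout, stderr = ssh_object.exec_command(command)
        stdout.readlines()

    thread = threading.Thread(target=ssh_exec_thread, args=(ssh_object, command))
    thread.daemon = True  # daemon threads are not joined; they end with the program
    thread.start()
    # The script continues immediately; no join() is needed here.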
You can make this fancier by passing a Queue to ssh_exec_thread and putting the result on the queue for processing by the program later, as shown in the sketch below.
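A minimal sketch of that queue-based variant (the queue.Queue plumbing and the result handling are my additions, not the answerer's code; address, 'un', and 'pw' are placeholders as in the question):

    import queue
    import threading

    import paramiko

    def ssh_exec_thread(ssh_object, command, result_queue):
        # Run the remote command and hand its output lines back via the queue.
        stdin, stdout, stderr = ssh_object.exec_command(command)
        result_queue.put(stdout.readlines())

    def function():
        ssh_object = paramiko.SSHClient()
        ssh_object.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh_object.connect(address, port=22, username='un', password='pw')
        command = 'command to run'

        result_queue = queue.Queue()
        thread = threading.Thread(target=ssh_exec_thread,
                                  args=(ssh_object, command, result_queue))
        thread.start()
        # ...do something else while the command runs remotely...
        thread.join()
        output = result_queue.get()  # stdout lines collected by the worker thread
        return output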