I'm SSHing into a server and starting a Python script that will take approximately 24 hours to complete. What if my internet connection dies in the middle? Will that stop the command?
Is there any way to run my long-running command so that local disconnects won't affect it, and so that I can continue to see its output after I log in via SSH again?
The best way is to use `screen` (on the server) to start a session, run the command inside it, and then detach from the session so it keeps running while you do other things or disconnect from the server entirely. The other option is to use `nohup` in combination with `&`, so you would have `nohup <command> &`.
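A minimal runnable sketch of the `nohup` route (with `sleep 30` standing in for the 24-hour Python script, and `job.log`/`job.pid` as assumed file names):

```shell
# Run the job immune to hangups (SIGHUP), log its output, and background it.
nohup sleep 30 > job.log 2>&1 &
echo "$!" > job.pid              # save the PID so we can check on it later

# After logging back in:
kill -0 "$(cat job.pid)" && echo "still running"
# tail -f job.log                # follow the output; Ctrl-C stops tail, not the job
```

With `screen` instead, you would run `screen -S job`, start the script inside, detach with `Ctrl-a d`, and reattach later with `screen -r job`.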
The existing answers can work well, but I needed something for BusyBox (a shell and set of tools for minimal hardware such as home routers). My system does not have `screen`, `dtach`, `at`, `disown`, or even `nohup`! So, thanks to tbc0 on SO (link), I found this gem: close the command's stdin, stdout, and stderr and background it. The shell returns immediately, but the server process continues to run; if multiple commands are needed, group them in a subshell first.
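Concretely (with `sleep 30` standing in for the real command so the sketch is runnable, and `demo.pid` an assumed file name):

```shell
# Close stdout, stderr, and stdin, then background; the prompt returns
# immediately and the command survives logout.
sleep 30 >&- 2>&- <&- &
echo "$!" > demo.pid

# If multiple commands are needed, group them in a subshell:
( sleep 1 ; sleep 1 ) >&- 2>&- <&- &
```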
Explanation:

- `>&-` closes stdout
- `2>&-` closes stderr
- `<&-` closes stdin
- `&` puts the process in the background

This uses no external programs and should work across ksh, ash, Bourne shell, bash, etc.
You can also use `disown` if you've already started the process without `screen` or `nohup`.
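A sketch of that workflow (`disown` is a bash/zsh builtin; `sleep 30` and `disown.pid` are stand-ins). For a job currently in the foreground you would first press `Ctrl-Z` to suspend it and run `bg` to resume it in the background:

```shell
sleep 30 &               # the job is now running in the background as %1
echo "$!" > disown.pid
disown %1                # remove it from the job table; the shell won't HUP it on exit
jobs                     # prints nothing: the job no longer belongs to this shell
```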
If you background a process and close your session, the process will be adopted by init (PID 1).
If I background a process in one session, close that session, and then check on the process from another session, it is still running, and its parent process is now 1 (init).
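This can be sketched as follows (simulating the closed session with a subshell that exits immediately; `sleep 30` is the stand-in process and `orphan.pid` an assumed file name):

```shell
# The subshell plays the role of the login shell that goes away.
pid=$( (sleep 30 >/dev/null 2>&1 & echo "$!") )
echo "$pid" > orphan.pid
sleep 1                       # give the orphan a moment to be reparented
ps -o ppid= -p "$pid"         # typically prints 1 (init); on systemd desktops
                              # it may be a user session manager instead
```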
Alternatively, you could set up your script the way you would a daemon. If you want to take that approach, a quick search turns up this seemingly useful link: http://onlamp.com/python/pythoncook2/solution.csp?day=1
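Short of writing a full daemon, a lightweight step in that direction is `setsid` (assuming it is available, as in util-linux; `sleep 30` and `daemon.log` are placeholders for your script and log file):

```shell
# Start the command in a brand-new session, detached from the controlling
# terminal, with all standard streams redirected.
setsid sleep 30 < /dev/null > daemon.log 2>&1 &
```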