$ mkfifo ~/.jobqueue
$ parallel :::: ~/.jobqueue
Now, from another terminal: echo ls > ~/.jobqueue
Tada!

For a remote job queue, use nc instead of a fifo.
For a job queue with a buffer, use a file and run $ parallel :::: <(tail -f thefile). You can even truncate the file from time to time.
Of course, this doesn't capture the context you are in, such as your current directory or the value of shell variables.
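If parallel isn't handy, the fifo-as-queue idea above can be sketched with a plain while-read consumer; note how the submitted job has to carry its own context (an embedded cd here), since the consumer shell doesn't inherit it. Paths and names below are made up:

```shell
# A fifo-backed job queue without parallel: one consumer, eval'ing lines.
dir=$(mktemp -d)
q="$dir/jobqueue"
mkfifo "$q"

# Consumer (normally in its own terminal): runs each submitted line.
# It is a separate shell, so the submitter's cwd/variables do not carry over.
( while IFS= read -r job; do eval "$job"; done < "$q" ) > "$dir/out" &

# Producer (normally another terminal): embed the context explicitly.
echo "cd $PWD && pwd" > "$q"

wait                      # consumer exits once the writer closes the fifo
result=$(cat "$dir/out")
```

The consumer stops at EOF when the writer closes the fifo; keep it in a loop if you want it to survive between submissions.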
tty1$ mkfifo /tmp/jobqueue
tty1$ eval $(tail -n 1 /tmp/jobqueue)
And then, in another terminal: tty2$ echo ls > /tmp/jobqueue

sleep 2; echo "A" && sleep 2 ; echo "B" && sleep 2 ; echo "C"
# Quickly Ctrl-Z
fg && sleep 2 ; echo "D"
and you will see A\nB\nC\nD\n slowly printed out in your terminal.

> building several targets of a Makefile
As opposed to `make && sudo make install` ?
> downloading multiple files one at a time
As opposed to `(wget url1 &) ; (wget url2 &)` ?
> or simply as a glorified `nohup`
As opposed to `nohup` ?
For example:
nq ./my_awesome_analysis /data/complicated_file_with_long_name.dat
# um... what's the next file?
ls /data
nq ./my_awesome_analysis /data/even_more_complicated_path.dat
man rsync
nq rsync -avh --progress out* somewhere_else.net:

The shells have had queueing for, well, a very, very long time....
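For what it's worth, the run-one-after-another behavior of the nq commands above can be roughed out in plain shell with flock(1). This is only a sketch of the idea, not how nq actually works, and flock does not strictly guarantee FIFO wakeup order:

```shell
# Poor man's nq: jobs go to the background immediately, but a shared
# lock file serializes their execution.
dir=$(mktemp -d)
lock="$dir/queue.lock"
out="$dir/out"

q() { flock "$lock" "$@" & }     # hypothetical helper, not real nq

q sh -c "sleep 1; echo first >> '$out'"
sleep 0.2                        # let the first job grab the lock
q sh -c "echo second >> '$out'"

wait
result=$(tr '\n' ' ' < "$out")
```

Unlike nq, this gives you no per-job log files and no way to inspect the queue; it only captures the "enqueue and return immediately" feel.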
Queues up 20 concurrent downloads with wget, for example.
tail -n+0 -f list.txt | xargs -n 1 -P 20 wget
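The same pattern can be tried offline by swapping wget for echo and dropping -f (with -f, tail follows the file forever and the pipeline never exits; filenames below are made up):

```shell
# xargs as a worker pool: -n 1 gives each line its own process,
# -P 4 keeps up to four of them running at once.
dir=$(mktemp -d)
printf '%s\n' url1 url2 url3 > "$dir/list.txt"

tail -n +0 "$dir/list.txt" | xargs -n 1 -P 4 echo fetched > "$dir/out"

# Completion order is not guaranteed under -P, so sort before comparing.
result=$(sort "$dir/out" | tr '\n' ' ')
```

With -f kept, as in the original one-liner, appending lines to list.txt later feeds new jobs to the pool, which is what makes it a live queue.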