So just today I was wondering if there's a CLI tool (or maybe a clever use of existing tools...) that could watch the output of one command for a certain string, parse bits of it out, and then execute another command with that parsed bit as input. For example, I have a command that spits out a log line with a URL on it; I usually have to manually copy that URL and paste it as an arg to my other command. Other times I simply want to wait for something to start up (you'll usually get a line like "Dev server started on port 8080") and then execute another command.
I know that I could obviously grep the output of the first command, and then use sed or awk to whittle the line down to just the URL, but I'm not sure about the best way to handle the rest. In addition, I usually want to see all the output of the first command (it's not done executing; it continues to run after printing the URL), so maybe there's a way to do that with tee? But I ALSO don't want to intermix the two commands in the same shell, i.e. I don't want one big series of pipes. Ideally I could run the two commands separately in their own terminals, and the second command, the one that needs the URL, would effectively block until it received that URL from the first command. I have a feeling you could do this with named pipes or something, but that's pretty far out of my league... would love to hear if this is something other folks have done or have a need for.
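For the simpler "wait for it to start" case, a single awk pipeline can do the trick, at the cost of mixing the two commands in one shell. This is just a sketch: dev-server and other-command below are placeholders for your actual commands.

$ dev-server | awk '/Dev server started/ && !seen { seen=1; system("other-command &") } { print }'

awk echoes every line (so you still see all the output) and, the first time the trigger line appears, launches other-command in the background. If the server block-buffers its output when piped, starting it as "stdbuf -oL dev-server" usually helps.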
A named pipe sounds like a good way to fulfill your requirement of having the commands run in separate shells. In the first terminal, shove the output of command A into the named pipe. In the second terminal, have a loop that reads from the named pipe line by line and invokes command B with the appropriate arguments.

You can create a named pipe with "mkfifo", which creates a pipe "file" with the given name. Then you can tell your programs to read and write to the pipe the same way you'd tell them to read and write to a normal file: use "<" and ">" to redirect stdin/stdout, or pass the file name directly if the program expects a file name.

In one terminal, create the fifo and run a loop that waits for the port number:

$ mkfifo myfifo
$ while true; do sed -rune 's/^Dev server started on port (.*)/\1/p' myfifo | xargs -n1 -I{} echo "Execute other command here with argument {}"; done

(The flags are GNU sed: -r for extended regexes, -u for unbuffered output, -n to print only what the p command matches.)

In the other terminal, run your server and tee the output to the fifo you just created:
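Something like this, where ./start-dev-server stands in for whatever actually starts your server (the thread doesn't name it):

$ ./start-dev-server | tee myfifo

tee keeps the output on your screen while also feeding the fifo. One caveat: opening a fifo for writing blocks until something opens the read end, so start the loop in the other terminal first.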
1. Run one command with output redirected to a file, possibly in the background. Since you want to watch the output, run "tail --follow=name filename.log".
2. In a second terminal, run a second tail --follow on the same log file, pipe the output through a command sequence that finds and extracts the URL, and pipe that into a shell while loop: something like "while read -r url; do do-thing-with "$url"; done" (see the sketch after this list).
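Putting that together, a sketch: ./start-dev-server again stands in for the real command, the URL regex is a guess at what the log line looks like, and do-thing-with is your second command.

$ ./start-dev-server > server.log 2>&1 &
$ tail --follow=name server.log

and in the second terminal:

$ tail --follow=name server.log | grep --line-buffered -Eo 'https?://[^[:space:]]+' | while read -r url; do do-thing-with "$url"; done

grep's --line-buffered keeps matches flowing immediately instead of waiting for a full buffer, and -o prints just the matched URL rather than the whole line.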