Thursday, December 01, 2005

Pipelines

You can add as many elements to a pipeline as you want.

ls -l /dev/
ls -l /dev/ | grep -v ,
ls -l /dev/ | grep -v , | grep -v -- '->'
ls -l /dev/ | grep -v , | grep -v -- '->' | grep -v ' 0'
This is an everyday, and brilliant, use of command history: exploratory programming.

Issue a command, look at the output. Filter that output through something else to get closer to what you want. Look at the result of that, and filter it again. When you finally get what you want, redirect the output to save it.
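For the /dev exploration above, that last step is just one more addition to the history line (the output filename is whatever you like):
ls -l /dev/ | grep -v , | grep -v -- '->' | grep -v ' 0' > devices.txt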

The metaphor of pipes and filters suffuses shell programming. To make your own programs (including your shell scripts) fit in, all you have to do is design them to take input from standard in, and spit output to standard out.

Unless there are multiple inputs or outputs, don't write to files or read from them; use redirection for that.
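A filter like that can be almost nothing. Here is a toy one that numbers its input lines; notice that it never opens a file:
#!/bin/sh
# numberlines: read lines from standard in, write them to standard out
# with a line number in front.
awk '{ printf "%6d  %s\n", NR, $0 }'
Make it executable, put it somewhere in your path, and it drops into any pipeline:
ls -l /dev/ | numberlines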

Suppose, for example, you write a program that collects all the #ifdef variables from a C file, one per line. One design choice is to have it called like this:
ifdefvars CFILE VARFILE
Worse, you could even design it to be called like this:
ifdefvars CFILE
and have it automatically generate CFILE.VARS.
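To see why those designs fight the shell, try running the two-argument version over a whole source tree. You end up inventing and cleaning up temporary files by hand (the temp name here is made up, and this still falls over on filenames with spaces):
for f in $(find . -name '*.[ch]')
do
    ifdefvars "$f" /tmp/vars.$$
    cat /tmp/vars.$$
done
rm -f /tmp/vars.$$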

If, instead, you have it read from stdin and write to stdout, so that it's called like this:
ifdefvars < CFILE > VARS
you can pop together a pipeline, at a moment's notice, to scan your code for mistyped variables:
find . -name '*.[ch]' | xargs cat | ifdefvars | sort | uniq -c | awk '$1 == 1'
(Don't know what all these commands do? Type the first one in and run it. Next, use command history to recall it, add the next stage in the pipeline and watch what changes.)
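And in case you're wondering, ifdefvars itself can be a one-line filter. Here is one way it might look; it only handles the plain #ifdef NAME spelling, and the real thing could be smarter, but it reads nothing except standard in and writes nothing except standard out:
#!/bin/sh
# ifdefvars (a sketch): read C source on standard in, write the name
# tested by each #ifdef or #ifndef, one per line, to standard out.
awk '$1 == "#ifdef" || $1 == "#ifndef" { print $2 }'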
