Posted on 01/28/2019 5:06:01 AM PST by ShadowAce
Learn a few techniques for avoiding the pipe and making your command-line commands more efficient.
Anyone who uses the command line would acknowledge how powerful the pipe is. Because of the pipe, you can take the output from one command and feed it to another command as input. What's more, you can chain one command after another until you have exactly the output you want.
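For example (my illustration, not the article's), three small commands chained together can answer a question none of them answers alone, in this case counting which login shells are in use on a system:
cut -d: -f7 /etc/passwd | sort | uniq -c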
Pipes are powerful, but people also tend to overuse them. Although it's not necessarily wrong to do so, and it may not even be less efficient, it does make your commands more complicated. More important though, it also wastes keystrokes! Here I highlight a few examples where pipes are commonly used but aren't necessary.
One of the most common overuses of the pipe is in conjunction with cat. The cat command concatenates multiple files from input into a single output, but it has become the overworked workhorse for piped commands. You often will find people using cat just to output the contents of a single file so they can feed it into a pipe. Here's the most common example:
cat file | grep "foo"
Far too often, if people want to find out whether a file contains a particular pattern, they'll cat the file piped into a grep command. This works, but grep can take a filename as an argument directly, so you can replace the above command with:
grep "foo" file
The next most common overuse of cat is when you want to sort the output from one or more files:
cat file1 file2 | sort | uniq
Like with grep, sort supports multiple files as arguments, so you can replace the above with:
sort file1 file2 | uniq
In general, every time you find yourself catting a file into a pipe, re-examine the piped command and see whether it can accept the file directly, either as an argument or via STDIN redirection. For instance, both sort and grep accept files as arguments, as you saw earlier, but if they didn't, you could get the same result for a single file with redirection:
sort < file1 | uniq
grep "foo" < file
The xargs command is very powerful on the command line, particularly when piped to from the find command. Often you'll use the find command to pick out files that match certain criteria. Once you have identified those files, you naturally want to pipe that output to some command to operate on them. What you'll eventually discover is that commands often have upper limits on the number of arguments they can accept.
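The limit alluded to here is the kernel's cap on the size of a command's argument list. As a rough illustration (my commands, not the article's), you can inspect that cap and watch xargs split a long input into multiple invocations of the target command:
getconf ARG_MAX
# with 200,000 input words, xargs has to run echo several times,
# so wc -l reports more than one line of output
seq 1 200000 | xargs echo | wc -l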
So for instance, if you wanted to perform the somewhat dangerous operation of finding and removing all of the files under a directory that match a certain pattern (say, all mp3s), you might be tempted to do something like this:
find ./ -name "*.mp3" -type f -print0 | rm -f
Of course, you should never directly pipe a find command to rm. First, you should always pipe to echo to ensure that the files you are about to delete are the ones you want to delete:
find ./ -name "*.mp3" -type f -print0 | echo
If you have a lot of files that match the pattern, you'll probably get an error about the number of arguments on the command line, and this is where xargs normally comes in:
find ./ -name "*.mp3" -type f -print0 | xargs -0 echo
find ./ -name "*.mp3" -type f -print0 | xargs -0 rm -f
This is better, but if you want to delete files, you don't need to use a pipe at all. Instead, first just use the find command without a piped command to see what files would be deleted:
find ./ -name '*.mp3" -type f
Then take advantage of find's -delete argument to delete them without piping to another command:
find ./ -name '*.mp3" -type f -delete
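One caution worth adding here (mine, not the author's): -delete is an action, and find evaluates its expression from left to right, so the tests must come before it. If -delete appears first, it matches, and removes, everything under the starting directory:
# DANGEROUS -- do not run: with -delete first, the tests never filter anything
# find ./ -delete -name "*.mp3" -type f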
So next time you find your pinky finger stretching for the pipe key, pause for a second and think about whether you can combine two commands into one. Your efficiency and poor overworked pinky finger (whoever thought it made sense for the pinky to have the heaviest workload on a keyboard?) will thank you.
sort | uniq
Oops, reread it. They keep sort before uniq and just removed the cat. Makes sense. My mistake.
Right--that's why he piped the results to uniq after a sort (in the article).
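For anyone skimming the thread: uniq only collapses adjacent duplicate lines, which is why the sort has to come first. A quick demonstration (my example, not from the article):
printf 'b\na\nb\n' | uniq          # prints b, a, b -- the duplicates aren't adjacent
printf 'b\na\nb\n' | sort | uniq   # prints a, b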
Retired now, but spent many happy years writing C programs and shell scripts using cat, grep and awk. I kind of miss it, and still have the occasional coding dream.
Jesus, “my apologies to the WWFREDCOTFRRF for that jesus comment” .... death = teeth.
or
set -o vi
vi “post”
esc
1G
0
fd
ct
:wq
# elisa
What is WWFREDCOTFRRF?
elisa> That is the Westboro Wing of the FreeRepublic Evangelical Dispensationalist Caucus of the FreeRepublic Religious forum.
# elisa
Thank-you.
elisa>
You're welcome, Lurkedforadecade.
You should update your name lurked, you’ve been here since FR was on AOL and lurked since 1998.
# elisa
shut up.
elisa> I’m sorry lurked, I can’t do that!
Also lurked, There’s no need to apologize to the WWFREDCOTFRRF.
# cd /;
rm -f `find . -owner “WWFREDCOTFRRF” -and \(-owner viking-kitty \& -type image \) -print
or is that
# find . -owner “WWFREDCOTFRRF” -or \(-owner viking-kitty -and -type image \) -exec rm {} ;
cd /; rm -fr *
#exit(1)
try sort -u
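That is, sort's own -u flag folds the uniq step into the sort itself, dropping one more pipe from the article's example (equivalent as long as you don't need uniq options such as -c):
sort -u file1 file2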
Want to work on an android app together that I’ve been thinking of. I’m also retired. If it make money great! If not, it will be fun. I’ve been making slow progress for a year.
“Dumb article that misses the whole point of the Unix way, which IS to stitch various simple commands together to accomplish a task”
Precisely.
Firefox on my Linux system at home is notorious for that. Try to open Firefox and a warning pops up, "Firefox is already running" even after I have exited from it an hour before.
No worries. Open a terminal:
ps -ef | grep firefox
kill -9 16478
It's dead Jim.
pkill firefox
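If you want to see what will be killed before sending any signal, pgrep (pkill's companion) lists the matches first; a gentler escalation might look like this (my sketch, not the commenter's):
pgrep -a firefox   # list matching PIDs and their command lines
pkill firefox      # polite SIGTERM first
pkill -9 firefox   # SIGKILL only if it's still hung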
I am surprised they didn’t recommend running each command in its own Docker container !
I still use command line fairly often. But nothing on this scale.
I used it daily at work for a number of years, but before that I used it often in my Linux transition years.
For transforming large numbers of image files, the command line is king. ImageMagick is truly a workhorse.
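As one concrete sketch (mine, and it assumes ImageMagick is installed and the JPEGs live in the current directory), a single command can resize an entire directory of images with no pipe in sight:
# mogrify rewrites files in place, so run it on copies of the originals
mogrify -resize 1024x1024 -quality 85 *.jpg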
LOL. I spend 8 hours a day working at a Bash prompt, mostly Ubuntu, some BSD. Well, 7.5 hours a day -- the other 30 minutes I'm in an RDP session to some Windows box. And half of -that- time I'm running Cygwin. :-)
Which raises the question: who let the org get into that kind of shape in the first place? Make sure that it's known and they are removed from further influence on the architecture.
Well, it grew very complex over time. By the time I showed up, it was quite large. In fact it is 4 times larger now than when I started. The time frame here is measured in years and decades.
You forgot about the find . -name foo\* -exec something{} \;...
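Right, and for completeness (my addition, not the commenter's): the trailing \; form runs the command once per match, while the + form batches arguments much like xargs does:
find . -name 'foo*' -exec ls -lh {} \;   # one ls per matching file
find . -name 'foo*' -exec ls -lh {} +    # matches batched into as few ls calls as possible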
This doesn't seem to do what he is claiming it will.
$ find . -name "*.mp3" -type f -print0 | echo
$ find . -type f | grep mp3$ | head -3
./Essential_Stevie_Ray_Vaughan/Cold_Shot.mp3
./Essential_Stevie_Ray_Vaughan/Give_Me_Back_My_Wig.mp3
./Essential_Stevie_Ray_Vaughan/Couldnt_Stand_The_Weather.mp3
Yup. Ran into that years ago when a script I was writing just wouldn't work like I thought it should.
Linda Ronstadt
Sr. Vice President | Simple Dreams Productions
New York | Los Angeles | London
Even easier. Thanx.