Master Essential Linux Shell Tools: find, grep, awk, sed & More
This guide introduces the most commonly used Linux shell utilities for text processing—including find, grep, xargs, sort, uniq, tr, cut, paste, wc, sed, and awk—explaining their key options, practical examples, and best practices to help you efficiently manipulate files and data from the command line.
The Linux shell is a fundamental skill; mastering its core utilities greatly improves text processing and system navigation.
1. find – File Search
Common examples:
<code>find . \( -name "*.txt" -o -name "*.pdf" \) -print</code>
<code>find . -regex ".*\(\.txt\|\.pdf\)$"</code>
<code>find . ! -name "*.txt" -print</code>
<code>find . -maxdepth 1 -type f</code>
You can also search by type, time, size, permissions, owner, and more. Example – files accessed within the last 7 days:
<code>find . -atime -7 -type f -print</code>
Delete all swap files:
<code>find . -type f -name "*.swp" -delete</code>
Execute actions with -exec:
<code>find . -type f -user root -exec chown weber {} \;</code>
Here {} is replaced by each matched filename.
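The <code>\;</code> terminator runs the command once per matched file. GNU and BSD find also accept a <code>+</code> terminator, which batches many filenames into a single invocation, much like xargs. A minimal sketch (the directory and file names are invented for the demo):

```shell
# Scratch directory with a few files (names are illustrative).
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt" "$dir/c.log"

# One command invocation per matched file:
find "$dir" -type f -name "*.txt" -exec echo "one:" {} \;

# One invocation for the whole batch of matches ({} must come last):
find "$dir" -type f -name "*.txt" -exec echo "batch:" {} +

rm -r "$dir"
```

The batched form avoids the per-file process startup cost, which matters when a search matches thousands of files.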
2. grep – Text Search
Basic usage and useful options:
-o: output only matching parts
-v: invert match
-c: count matches
-n: show line numbers
-i: ignore case
-l: list matching file names
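These flags compose. A quick sketch on a scratch file (path and contents invented for the demo):

```shell
# Sample input (contents are made up).
printf 'class Foo\nint x;\nCLASS Bar\n' > /tmp/grep_demo.txt

grep -n "class" /tmp/grep_demo.txt    # 1:class Foo (line-numbered)
grep -in "class" /tmp/grep_demo.txt   # also matches 3:CLASS Bar (ignore case)
grep -ic "class" /tmp/grep_demo.txt   # 2 (count, case-insensitive)
grep -o "class" /tmp/grep_demo.txt    # only the matched text, not whole lines
```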
<code>grep -c "text" filename</code>
<code>grep "class" . -R -n</code>
<code>grep -e "class" -e "virtual" file</code>
<code>grep "test" file* -lZ | xargs -0 rm</code>
3. xargs – Build Command Lines
Convert input into command arguments, often combined with grep or find:
<code>cat file.txt | xargs</code>
<code>cat single.txt | xargs -n 3</code>
Key options:
-d: define delimiter (default space, \n for lines)
-n: number of arguments per command line
-I {}: replace placeholder with input
-0: use \0 as delimiter
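For instance, given a file containing three whitespace-separated words (file name and contents invented for the demo), the options behave as follows:

```shell
printf 'a b c\n' > /tmp/xargs_demo.txt

# Default: all input tokens become arguments of one command line.
cat /tmp/xargs_demo.txt | xargs echo            # a b c

# -n 1: one argument per invocation, so echo runs three times.
cat /tmp/xargs_demo.txt | xargs -n 1 echo

# -I {}: substitute each input line into an argument template.
cat /tmp/xargs_demo.txt | xargs -I {} echo "got: {}"   # got: a b c
```

Note that -I switches xargs to line-at-a-time mode, so the whole line is substituted for {}.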
<code>cat file.txt | xargs -I {} ./command.sh -p {} -1</code>
<code>find source_dir/ -type f -name "*.cpp" -print0 | xargs -0 wc -l</code>
4. sort – Sorting
Options:
-n: numeric sort
-d: dictionary order
-r: reverse
-k N: sort by column N
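Combining -k and -n, for example on a two-column scratch file (data invented for the demo):

```shell
# Fabricated "name score" data.
printf 'bob 3\nann 10\ncid 2\n' > /tmp/sort_demo.txt

# Numeric sort on column 2 (ascending): cid 2 / bob 3 / ann 10.
sort -k 2 -n /tmp/sort_demo.txt

# Same key, descending: ann 10 / bob 3 / cid 2.
sort -k 2 -nr /tmp/sort_demo.txt
```

Without -n, column 2 would sort lexically and "10" would come before "2".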
<code>sort -nrk 1 data.txt</code>
<code>sort -bd data # ignore leading blanks</code>
5. uniq – Remove Duplicate Lines
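uniq only collapses adjacent duplicate lines, which is why the examples below pipe through sort first. A minimal sketch (file name and contents invented for the demo):

```shell
printf 'b\na\nb\n' > /tmp/uniq_demo.txt

# Without sorting, the two "b" lines are not adjacent, so both survive.
uniq /tmp/uniq_demo.txt            # b / a / b  (3 lines)

# Sorted first, duplicates become adjacent and collapse.
sort /tmp/uniq_demo.txt | uniq     # a / b      (2 lines)
```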
<code>sort unsort.txt | uniq</code>
<code>sort unsort.txt | uniq -c # count occurrences</code>
<code>sort unsort.txt | uniq -d # show duplicates only</code>
6. tr – Translate Characters
Common uses:
<code>echo 12345 | tr '0-9' '9876543210'</code>
<code>cat text | tr '\t' ' '</code>
<code>cat file | tr -d '0-9' # delete digits</code>
<code>cat file | tr -cd '0-9' # keep only digits</code>
<code>cat file | tr -s ' ' # squeeze spaces</code>
7. cut – Column Extraction
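Assuming a small tab-separated scratch file (path and fields invented for the demo), the flags below behave as follows:

```shell
# Tab-separated sample data.
printf 'id\tname\tcity\n1\tann\tberlin\n' > /tmp/cut_demo.txt

cut -f2 /tmp/cut_demo.txt       # second column: name / ann
cut -f1,3 /tmp/cut_demo.txt     # columns 1 and 3: id+city

# Everything except field 2 (--complement is a GNU cut extension).
cut -f2 --complement /tmp/cut_demo.txt
```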
<code>cut -f2,4 filename</code>
<code>cut -f3 --complement filename</code>
<code>cut -f2 -d ";" filename</code>
8. paste – Merge Columns
Combine files side‑by‑side; the default delimiter is a tab. Use -d to change it:
<code>paste file1 file2 -d ","</code>
9. wc – Count Lines, Words, Bytes
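For a file with known contents, the three counters give (file path and text invented for the demo):

```shell
printf 'one two\nthree\n' > /tmp/wc_demo.txt

wc -l /tmp/wc_demo.txt   # 2  (newline count)
wc -w /tmp/wc_demo.txt   # 3  (whitespace-separated words)
wc -c /tmp/wc_demo.txt   # 14 (bytes, including the two newlines)
```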
<code>wc -l file # lines</code>
<code>wc -w file # words</code>
<code>wc -c file # bytes</code>
10. sed – Stream Editing
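The substitution forms listed below are easiest to verify on a scratch file (path and contents invented for the demo); sed writes the edited stream to stdout unless -i is given, which rewrites the file itself:

```shell
printf 'text text\ntext\n\nend\n' > /tmp/sed_demo.txt

sed 's/text/X/' /tmp/sed_demo.txt    # first match per line: "X text" / "X" ...
sed 's/text/X/g' /tmp/sed_demo.txt   # every match:          "X X"    / "X" ...
sed '/^$/d' /tmp/sed_demo.txt        # drops the empty line

# In-place edit, keeping a backup; -i.bak works in both GNU and BSD sed.
sed -i.bak 's/text/X/g' /tmp/sed_demo.txt
```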
Replace first occurrence:
<code>sed 's/text/replace_text/' file</code>
Global replacement:
<code>sed 's/text/replace_text/g' file</code>
In‑place edit:
<code>sed -i 's/text/replace_text/g' file</code>
Delete empty lines:
<code>sed '/^$/d' file</code>
11. awk – Powerful Text Processor
Basic script structure:
<code>awk 'BEGIN{...} {...} END{...}' file</code>
Print the current line:
<code>awk '{print}' file</code>
Print specific fields:
<code>awk '{print $2, $3}' file</code>
Count lines:
<code>awk 'END{print NR}' file</code>
Sum a column:
<code>awk '{sum+=$1} END{print sum}' file</code>
Set the field separator:
<code>awk -F: '{print $NF}' /etc/passwd</code>
Read command output (in BEGIN, so awk does not wait for stdin):
<code>awk 'BEGIN{"grep root /etc/passwd" | getline cmd; print cmd}'</code>
12. Iterating Over Files
Line‑by‑line loop:
<code>while IFS= read -r line; do echo "$line"; done < file.txt</code>
Word iteration:
<code>for word in $line; do echo "$word"; done</code>
Character iteration (bash substring):
<code>for ((i=0; i<${#word}; i++)); do echo "${word:i:1}"; done</code>
Source: 大CC, http://www.cnblogs.com/me115/p/3427319.html
Efficient Ops
This public account is maintained by Xiaotianguo and friends and regularly publishes original technical articles. We focus on operations transformation and hope to accompany you through your operations career as we grow together.