
🐧 The Ultimate Guide to Linux "All" Filter Commands: Short, Sweet & Spicy


Published on March 26, 2026

Ever feel lost in the Linux terminal? Yeah, us too. But here's the tea ☕: mastering filter commands is like having a superpower. Let's dive in!

What Are Filter Commands Anyway?

Think of them as bouncers at a club. They let in what you want and kick out the rest. Simple as that.

1. grep – The Ultimate Bouncer 🛡️

The OG of filtering. Find text patterns like you're Sherlock Holmes.

# Find lines containing "error" in a file
grep "error" logfile.txt

# Case-insensitive search
grep -i "ERROR" logfile.txt

# Count matching lines
grep -c "error" logfile.txt

# Show line numbers
grep -n "error" logfile.txt

# Invert match (show lines WITHOUT the word)
grep -v "success" logfile.txt

Pro tip: grep is your BFF. Seriously.
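
Two more flags worth keeping in your pocket (a quick sketch; logfile.txt and the src/ tree below are made-up stand-ins so the commands actually run):

```shell
# Sample data so the commands below have something to chew on
printf 'error: one\nwarning: two\ninfo: three\n' > logfile.txt
mkdir -p src && printf '# TODO: fix this\n' > src/main.py

# Extended regex: match "error" OR "warning" in one pass
grep -E "error|warning" logfile.txt

# Recursive search through a whole directory tree
grep -r "TODO" src/
```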


2. awk – The Data Slicer 🔪

For when you need to parse columnar data like a boss.

# Print specific columns (space-separated)
awk '{print $1, $3}' data.txt

# Filter lines where a column meets a condition
awk '$2 > 100 {print $0}' numbers.txt

# Count lines in a file
awk 'END {print NR}' file.txt

# Sum a column
awk '{sum += $1} END {print sum}' numbers.txt

Fun fact: awk was named after its creators' initials (Aho, Weinberger, Kernighan). How cool is that? 😎
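
The examples above assume whitespace-separated fields. For anything else, -F sets the field separator (a sketch using a made-up mini passwd file, not the real /etc/passwd):

```shell
# A tiny colon-separated sample
printf 'root:x:0\ndaemon:x:1\n' > users.txt

# -F: splits on colons; print the name and the ID
awk -F: '{print $1, $3}' users.txt
```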

3. sed – The Stream Editor 📝

Replace, delete, or transform text like a wizard.

# Replace first occurrence on each line
sed 's/old/new/' file.txt

# Replace all occurrences
sed 's/old/new/g' file.txt

# Delete lines containing a pattern
sed '/pattern/d' file.txt

# Delete specific line numbers
sed '5d' file.txt

# Print only lines matching a pattern
sed -n '/pattern/p' file.txt

Warning: sed can be spicy. Always back up before you sed. 🌶️
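
One easy way to take your own advice: give -i a suffix and sed saves the original before editing in place (works in GNU and BSD sed; demo.txt is a throwaway example):

```shell
printf 'old line\n' > demo.txt

# In-place edit, but write the original to demo.txt.bak first
sed -i.bak 's/old/new/' demo.txt

cat demo.txt      # the edited file
cat demo.txt.bak  # the untouched backup
```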


4. cut – The Minimalist 💇

Extract columns or characters. Dead simple.

# Extract the first field (colon-delimited)
cut -d: -f1 /etc/passwd

# Extract character range
cut -c1-5 file.txt

# Extract by byte
cut -b1-10 file.txt

Vibe: If awk is the chef, cut is the sushi knife. Precise. Clean. Elegant.


5. sort – The Organizer 📊

Put things in order because chaos is overrated.

# Sort alphabetically
sort file.txt

# Sort numerically
sort -n numbers.txt

# Reverse sort
sort -r file.txt

# Sort by specific column
sort -k2 -n data.txt

# Remove duplicates while sorting
sort -u file.txt

6. uniq – The Duplicate Detective 🔍

Find or remove adjacent duplicate lines. It's that simple.

# Collapse adjacent duplicate lines
uniq file.txt

# Count occurrences
uniq -c file.txt

# Show only duplicated lines
uniq -d file.txt

# Show only unique lines (no repeats)
uniq -u file.txt

# Case-insensitive
uniq -i file.txt

Remember: uniq works best on sorted files. They're like a couple: better together! 👫
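
To see why the pairing matters: uniq only collapses adjacent duplicates, so unsorted input slips right past it (a throwaway demo):

```shell
printf 'a\nb\na\n' > letters.txt

# The two "a"s aren't adjacent, so uniq lets both through
uniq letters.txt

# Sort first and uniq can actually do its job
sort letters.txt | uniq
```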


7. wc – The Counter 🔢

Count words, lines, and bytes. Sounds boring but super useful.

# Count lines
wc -l file.txt

# Count words
wc -w file.txt

# Count bytes
wc -c file.txt

# Count characters
wc -m file.txt

# All the stats
wc file.txt

8. tr – The Transformer 🔄

Translate, delete, or squeeze characters.

# Convert lowercase to uppercase
tr a-z A-Z < file.txt

# Delete specific characters
tr -d '0-9' < file.txt

# Squeeze repeated characters
tr -s ' ' < file.txt

# Replace one character with another
tr ':' ',' < file.txt

Cool factor: tr only reads from stdin (no filename arguments). It's like the minimalist of the group. ✨
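
tr also understands POSIX character classes, which beat spelling out ranges by hand (a quick sketch):

```shell
# [:lower:] and [:upper:] are locale-aware character classes
printf 'hello\n' | tr '[:lower:]' '[:upper:]'

# Strip every digit from the stream
printf 'call me maybe 42\n' | tr -d '[:digit:]'
```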


9. head & tail – The Boundary Guards 👀

Look at the beginning or end of files.

# First 10 lines
head file.txt

# First 5 lines
head -n 5 file.txt

# Last 10 lines
tail file.txt

# Last 5 lines
tail -n 5 file.txt

# Live monitoring (follow mode)
tail -f logfile.txt

Hack: tail -f is perfect for watching logs in real-time. You're welcome. 🎁
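
Bonus combo: chain the two to pull an arbitrary line range out of a file (sketch using a generated number list):

```shell
seq 1 10 > nums.txt

# Lines 4-6: head keeps the first 6, tail keeps the last 3 of those
head -n 6 nums.txt | tail -n 3
```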


10. tee – The Splitter 🌳

Split output to both stdout AND a file.

# Write to file AND display
command | tee output.txt

# Append to file
command | tee -a output.txt

# Chain it
echo "hello" | tee file1.txt | tee file2.txt
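
tee really shines in the middle of a pipeline: snapshot the raw stream to a file while grep keeps filtering downstream (a throwaway sketch):

```shell
# Keep everything in one file while grep narrows the stream
printf 'error\nok\nerror\n' | tee everything.txt | grep error > errors.txt

wc -l everything.txt  # 3 lines: the full stream
wc -l errors.txt      # 2 lines: just the errors
```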

11. xargs – The Argument Builder 🏗️

Convert stdin into command arguments. Powerful stuff.

# Pass a file list to a command (fine when names have no spaces)
ls *.txt | xargs wc -l

# Execute with null delimiter
find . -name "*.tmp" -print0 | xargs -0 rm

# Limit arguments per invocation
echo -e "file1\nfile2\nfile3" | xargs -n 1 echo
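
When the argument needs to land somewhere other than the end of the command, -I names a placeholder (quick sketch):

```shell
# -I {} substitutes each input line wherever {} appears,
# one command invocation per line
printf 'a\nb\n' | xargs -I {} echo "item: {} done"
```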

🎯 Pro Tips: Chaining Magic

The real power? Pipes. Combine these bad boys:

# Find all errors, count occurrences, show top 5
grep "error" app.log | sort | uniq -c | sort -rn | head -5

# Extract usernames, sort, remove duplicates
cut -d: -f1 /etc/passwd | sort | uniq

# Find large files, list details
find . -type f -size +10M | xargs ls -lh

🚀 One-Liners to Impress Your Friends

# Count total lines in all .txt files
find . -name "*.txt" -exec wc -l {} + | tail -1

# Find most frequent words in a file
tr ' ' '\n' < file.txt | sort | uniq -c | sort -rn | head -10

# Extract unique IP addresses from logs
grep -oE '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' log.txt | sort -u

# Replace tabs with spaces in all Python files
find . -name "*.py" -exec sed -i 's/\t/    /g' {} +

🎓 The Golden Rules

  1. Know your data – Understand what you're filtering.

  2. Test first – Always test on a copy before using -i (in-place edit).

  3. Pipe wisely – Each pipe adds overhead, but readability > micro-optimization.

  4. Combine tools – One tool does one thing well. Mix them!

  5. RTFM – Seriously, man grep is your friend.


🏁 Final Thoughts

Filter commands are the spice rack of Linux. Master them, and you'll cook up command-line magic every day. Start small, combine fearlessly, and soon you'll be writing one-liners that make senior devs jealous. 😎

Now go forth and filter like a boss! 🚀


Happy filtering,
Your Friendly Neighborhood Linux Enthusiast 🐧