10 Practical Bash Commands to Simplify Complex Tasks

Bash one-liners are a game-changer for anyone who wants to work efficiently in the terminal. Below are ten examples of “complex” operations that still fit in a single command.

1. Finding and Processing Log Files

This command searches recursively for .log files under /var/log and prints lines containing “ERROR”:

find /var/log -type f -name "*.log" -exec grep -H "ERROR" {} \;

Breakdown:

  • find locates files by name and type.
  • -exec … {} \; executes grep on each file found (the backslash keeps the shell from interpreting the semicolon).
  • grep -H prints matching lines with filenames.
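
The trailing \; runs one grep per file; ending the -exec with + instead batches many files into each grep invocation, which is noticeably faster on large trees. A minimal sketch you can run in a scratch directory (the directory and log contents below are made up for illustration):

```shell
#!/bin/sh
# Scratch-directory demo of the batched `+` form of -exec.
mkdir -p /tmp/logdemo && cd /tmp/logdemo
printf 'ok\nERROR: disk full\n' > app.log
printf 'all good\n' > quiet.log
find . -type f -name "*.log" -exec grep -H "ERROR" {} +
# -> ./app.log:ERROR: disk full
```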

2. Comparing Sorted Contents of Two Files

This command compares two files after sorting them, using process substitution:

diff <(sort file1.txt) <(sort file2.txt)

Breakdown:

  • Process substitution <(command) treats the output of sort as if it were a file.
  • diff then compares these sorted outputs.
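
A self-contained demo (the file contents are invented for illustration; process substitution is a bash feature, so this needs bash rather than plain sh):

```shell
#!/bin/bash
# Demo data: each file holds three unsorted lines.
printf 'b\na\nc\n' > /tmp/file1.txt
printf 'c\nb\nd\n' > /tmp/file2.txt
# diff exits 1 when the inputs differ, so || true keeps the demo's
# exit status clean.
diff <(sort /tmp/file1.txt) <(sort /tmp/file2.txt) || true
```

Here diff reports that "a" appears only in file1.txt and "d" only in file2.txt, regardless of the original line order.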

3. Calculating the Sum of Numbers from a File

Using awk, this one-liner reads a file with numbers and prints their sum:

awk '{sum += $1} END {print sum}' numbers.txt

Breakdown:

  • awk processes each line, adding the first field to sum.
  • END {print sum} prints the total after reading all lines.
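
For example, with a throwaway numbers file (the path is illustrative):

```shell
#!/bin/sh
# Build a small input file and sum its first column.
printf '10\n20\n12\n' > /tmp/numbers.txt
awk '{sum += $1} END {print sum}' /tmp/numbers.txt
# -> 42
```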

4. Renaming Files by Replacing Spaces with Underscores

This command loops over all .txt files, renaming them by substituting spaces with underscores:

for file in *.txt; do mv "$file" "${file// /_}"; done

Breakdown:

  • for loop iterates over each text file.
  • Parameter expansion ${file// /_} replaces all spaces with underscores.
  • mv renames the file accordingly.
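
Before running a bulk rename it is worth doing a dry run: replace mv with echo and inspect the planned renames first. A sketch with invented file names (the ${…// /_} expansion requires bash):

```shell
#!/bin/bash
# Dry run: print each rename instead of performing it.
mkdir -p /tmp/renamedemo && cd /tmp/renamedemo
touch "my notes.txt" "todo list.txt"
for file in *.txt; do echo mv "$file" "${file// /_}"; done
# -> mv my notes.txt my_notes.txt
# -> mv todo list.txt todo_list.txt
```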

5. Parallel Processing with xargs

Suppose you need to process multiple files with a command concurrently. For instance, this command echoes each file name, processing up to 4 at a time:

ls *.txt | xargs -n1 -P4 -I{} sh -c 'echo "Processing file: {}" && sleep 1'

Breakdown:

  • ls *.txt lists text files (fine for simple names, though parsing ls output breaks on filenames with spaces or newlines; find -print0 piped to xargs -0 is more robust).
  • xargs -n1 -P4 runs one file per command with a maximum of 4 processes in parallel.
  • -I{} replaces occurrences of {} with the current filename.
  • sh -c allows running multiple commands in sequence.
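
If filenames might contain spaces or newlines, a more defensive variant feeds NUL-separated names from find and passes each one as a positional argument instead of interpolating it into the command string. A sketch with invented file names:

```shell
#!/bin/sh
# NUL-separated names survive spaces; passing the name as $1 avoids
# splicing it into the sh -c command string.
mkdir -p /tmp/xargsdemo && cd /tmp/xargsdemo
touch "a.txt" "b c.txt"
find . -maxdepth 1 -name '*.txt' -print0 |
  xargs -0 -n1 -P4 sh -c 'echo "Processing file: $1"' _
```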


6. Archiving Recently Modified Files

This command finds all files modified in the last 7 days under a specified directory and archives them into a compressed tarball:

find /path/to/directory -type f -mtime -7 -print0 | tar --null -czvf backup.tar.gz --files-from=-

Breakdown:

  • find /path/to/directory -type f -mtime -7 -print0: Locates files modified within the last 7 days, using a null character as the delimiter.
  • tar --null -czvf backup.tar.gz --files-from=-: Reads the null-separated file list from standard input and creates a gzipped archive.
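
A scratch-directory sketch you can verify with tar -t (the --null and --files-from flags assume GNU tar; all paths are illustrative):

```shell
#!/bin/sh
# Archive files modified in the last 7 days, then list the archive.
mkdir -p /tmp/archdemo && cd /tmp/archdemo
printf 'fresh\n' > recent.txt
find . -type f -mtime -7 -print0 |
  tar --null -czf /tmp/backup.tar.gz --files-from=-
tar -tzf /tmp/backup.tar.gz
# -> ./recent.txt
```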

7. Inline Text Transformation with sed

This one-liner removes duplicate spaces in a file, replacing them with a single space, and edits the file in place:

sed -i 's/  */ /g' file.txt

Breakdown:

  • sed -i: Invokes sed with in-place editing.
  • 's/  */ /g': A substitution regex that matches one or more spaces and replaces them with a single space.
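
Note that -i without a suffix is GNU sed; the portable middle ground is -i.bak, which works on both GNU and BSD/macOS sed and keeps a backup copy. A demo on a throwaway file:

```shell
#!/bin/sh
# Squeeze runs of spaces to one; the original is kept as file.txt.bak.
printf 'too   many    spaces\n' > /tmp/file.txt
sed -i.bak 's/  */ /g' /tmp/file.txt
cat /tmp/file.txt
# -> too many spaces
```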

8. Monitoring System Resources

Continuously log the current date and memory usage every 10 seconds into a log file:

while true; do date; free -h; sleep 10; done >> system_usage.log

Breakdown:

  • while true; do …; done: Creates an infinite loop.
  • date; free -h: Prints the current date and human-readable memory usage.
  • sleep 10: Waits 10 seconds before repeating.
  • >> system_usage.log: Appends the output to a log file.
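
The loop above runs forever; for a quick test, a bounded variant that prefixes each sample with a timestamp is handy (free is Linux-specific, and the awk field numbers assume its usual procps output layout):

```shell
#!/bin/sh
# Three one-second samples: "<timestamp> mem used: <used> / <total>".
for i in 1 2 3; do
  printf '%s ' "$(date '+%F %T')"
  free -h | awk 'NR==2 {print "mem used: " $3 " / " $2}'
  sleep 1
done >> /tmp/system_usage.log
```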

9. Concurrently Downloading URLs

Using xargs, this command reads URLs from a file and downloads them concurrently (up to 5 at a time):

xargs -n 1 -P 5 wget < urls.txt

Breakdown:

  • xargs -n 1 -P 5 wget: Runs wget for each URL, processing one URL per command with up to 5 processes running in parallel.
  • < urls.txt: Feeds the list of URLs from the file.
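
To preview what would run without touching the network, substitute echo for wget (the URLs below are placeholders):

```shell
#!/bin/sh
# Dry run: print the wget commands instead of executing them.
printf 'https://example.com/a\nhttps://example.com/b\n' > /tmp/urls.txt
xargs -n 1 -P 5 echo wget < /tmp/urls.txt
```

With -P 5 the lines may appear in any order, since up to five processes print concurrently.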

10. Finding Duplicate Files by Checksum

Compute MD5 checksums for all files in the current directory, sort them, and then use awk to print pairs of files that share the same checksum:

find . -type f -exec md5sum {} + | sort | awk 'BEGIN{lasthash=""; lastfile=""} {if($1==lasthash) print lastfile "\n" $2; lasthash=$1; lastfile=$2}'

Breakdown:

  • find . -type f -exec md5sum {} +: Computes MD5 checksums for each file.
  • sort: Orders the checksums so identical ones are adjacent.
  • awk: Compares consecutive checksum values and prints out filenames that match (note that $2 only captures a filename up to its first space).
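
Because awk's $2 stops at the first space, the one-liner above mis-handles filenames containing spaces. A variant using substr sidesteps that: md5sum prints a 32-character hash followed by two spaces, so the filename starts at column 35 (this assumes GNU coreutils md5sum). A scratch-directory demo:

```shell
#!/bin/sh
# Two files share content (and therefore a hash); the third does not.
mkdir -p /tmp/dupdemo && cd /tmp/dupdemo
printf 'same\n' > a.txt
printf 'same\n' > "b copy.txt"
printf 'other\n' > c.txt
find . -type f -exec md5sum {} + | sort |
  awk '{name=substr($0,35); if($1==last) print prev " == " name; last=$1; prev=name}'
# -> ./a.txt == ./b copy.txt
```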
