Pipes in Linux

A pipe in Linux is a mechanism for inter-process communication that allows the output of one command to be used as the input to another command. It is represented by the | symbol and is extensively used in command-line operations for creating efficient workflows by chaining commands.


How Pipes Work

  1. A pipe takes the stdout (Standard Output) of one command and passes it as the stdin (Standard Input) to another command.
  2. Commands connected by a pipe run concurrently: the second command starts processing data as soon as the first produces it, rather than waiting for the first command to finish.
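
A quick way to see this concurrency in action (a minimal demonstration, safe to run as-is):

yes | head -n 3

  • yes: Prints an endless stream of "y" lines.
  • head -n 3: Reads only the first three lines and exits; the pipe then closes and yes stops, so the endless command does not run forever.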

Basic Syntax

command1 | command2

  • command1: Generates the output.
  • command2: Processes the output from command1.

For example:

ls -l | grep ".txt"

  • ls -l: Lists files in the current directory.
  • grep ".txt": Filters the list to show only files with a .txt extension.

Practical Examples

1. Filter Output

ps aux | grep "apache"

  • ps aux: Lists all running processes.
  • grep "apache": Filters the list to show only processes containing "apache".

2. Count Lines in Output

ls | wc -l

  • ls: Lists files in the current directory.
  • wc -l: Counts the number of lines (files listed).

3. Combine Multiple Commands

cat file.txt | sort | uniq

  • cat file.txt: Displays the content of file.txt.
  • sort: Sorts the lines.
  • uniq: Removes adjacent duplicate lines (which is why the lines are sorted first).
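
As a shorthand, sort can also drop duplicates itself, producing the same result in one step:

sort -u file.txt

  • sort -u: Sorts the lines and keeps only unique ones, replacing the sort | uniq pair.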

4. Extract Specific Columns

cat /etc/passwd | cut -d ":" -f 1

  • cat /etc/passwd: Displays the content of the /etc/passwd file.
  • cut -d ":" -f 1: Extracts the first field (the username) from each line, using : as the delimiter.

5. Paginate Long Output

ls -l /usr/bin | less

  • ls -l /usr/bin: Lists files in /usr/bin.
  • less: Allows scrolling through the long output.

Advantages of Pipes

  1. Efficiency: Commands in a pipeline execute concurrently.
  2. Flexibility: Enables the combination of simple commands to perform complex tasks.
  3. No Temporary Files: Eliminates the need to create intermediate files for storing command outputs.
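
To illustrate the third point, the two approaches below produce the same result (names.txt is just a placeholder file name for this sketch):

sort names.txt > sorted.tmp
uniq sorted.tmp > unique.txt
rm sorted.tmp

sort names.txt | uniq > unique.txt

  • The first version needs an intermediate file (sorted.tmp) that must be created and then cleaned up.
  • The piped version streams the sorted output straight into uniq, with no temporary file.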

Combining Pipes with Redirection

Pipes can be combined with redirection for more complex tasks.

1. Save Final Output to a File

ls -l | grep ".txt" > output.txt

  • Filters .txt files and saves the result in output.txt.

2. Redirect Errors While Using Pipes

find / -name "*.conf" 2> errors.log | grep "apache"

  • Redirects error messages (e.g., permission denied) to errors.log, while the normal output continues through the pipe to grep.
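
If the error messages themselves should go through the pipe, stderr can first be merged into stdout with 2>&1 (shown here purely as an illustration):

find / -name "*.conf" 2>&1 | grep "denied"

  • 2>&1: Redirects stderr into stdout, so both streams flow into the pipe.
  • grep "denied": Filters for permission-denied messages instead of file names.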

Limitations of Pipes

  1. Sequential Dependency: Each command depends on the output of the previous one, so a failure or bottleneck early in the pipeline affects everything after it.
  2. Text-Based: A pipe carries a raw byte stream, but most standard utilities in a pipeline expect line-oriented text, so binary data needs tools designed to handle it.
  3. Single Direction: Data flows in one direction only, from the first command to the last.

Advanced Usage with xargs

Pipes can be enhanced using xargs to handle arguments for commands.

Example:

find . -name "*.txt" | xargs rm

  • find . -name "*.txt": Finds all .txt files.
  • xargs rm: Deletes all files found by find.
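
Note that plain xargs splits its input on whitespace, so file names containing spaces can be mishandled. A common safer pattern (using the -print0 and -0 options available in GNU find and xargs) is:

find . -name "*.txt" -print0 | xargs -0 rm

  • -print0 / -0: Separate file names with NUL characters instead of whitespace, so names containing spaces or newlines are passed to rm intact.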

Using Named Pipes (FIFOs)

For communication between processes that are not part of the same pipeline, use named pipes (FIFOs), which exist as special files on the filesystem.

  1. Create a Named Pipe:

mkfifo mypipe

  2. Write Data to the Pipe:

echo "Hello, Pipe!" > mypipe

  3. Read Data from the Pipe:

cat < mypipe
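
Because a FIFO blocks until it has both a writer and a reader, the write and read steps are normally run in two separate terminals. A single-terminal sketch (assuming the mypipe FIFO created above) looks like this:

cat < mypipe &
echo "Hello, Pipe!" > mypipe
rm mypipe

  • cat < mypipe &: Starts the reader in the background so the shell is not blocked.
  • echo "Hello, Pipe!" > mypipe: The writer delivers the message, which the background cat prints.
  • rm mypipe: Removes the FIFO when it is no longer needed.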


Common Commands with Pipes

  • grep: Filters output.
  • awk: Processes and formats text.
  • cut: Extracts specific fields.
  • sort: Sorts data.
  • uniq: Removes adjacent duplicate lines.
  • wc: Counts lines, words, and characters.
  • less: Paginates output.
  • xargs: Builds and executes commands from standard input.
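
Several of these commands are often chained in a single pipeline. As an illustrative example, the following counts how many users share each login shell listed in /etc/passwd:

awk -F ":" '{print $7}' /etc/passwd | sort | uniq -c | sort -rn

  • awk -F ":" '{print $7}': Prints the seventh field (the login shell) of each line.
  • sort | uniq -c: Groups identical shells and counts them.
  • sort -rn: Orders the counts from highest to lowest.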

Conclusion

Pipes are a powerful feature of Linux that streamline command-line operations, enabling the efficient processing of data without creating intermediate files. By mastering pipes and combining them with redirection and advanced commands, users can construct robust workflows for complex tasks.