Bash shell basics — pipes, redirection, and coprocesses

Paul Guerin
Jun 18, 2023


Compose pipes and pipelines

Bash shell pipes are awesome. Pipe the output of one command into the input of another. The piped commands create a pipeline.

A ‘pipeline’ is a sequence of one or more commands separated by one of
the control operators ‘|’ or ‘|&’.

A pipeline can be a shell application in its own right, and you can pipe a number of pipelines together to create another shell application.

Then you can conceptually take the same elements and combine them a different way to create another shell application.

A simple shell application could start with a list of something, then pipe into a filter or two, then pipe into a sort to finish up.

# List directories (use {} to select multiple items)
# then filter by column, then sort.
ls -ld vimfiles/{colors,plugin} | awk '{print $9}' | sort

# determine the exit status of each element in the pipeline eg 0 0 0
# (PIPESTATUS is overwritten by the next command, so check it straight away)
echo ${PIPESTATUS[*]}

Another example:

# List files, then filter by row, then filter by column, then sort.
ls -l | grep drw | awk '{print $9}' | sort

# determine the pipestatus for each element in the pipeline eg 0 0 0 0
echo ${PIPESTATUS[*]}
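
PIPESTATUS really earns its keep when an element of the pipeline fails. As a rough sketch (the exact status values vary between implementations), a pipeline with a failing first command reports something like this:

# ls fails on a nonexistent path, grep finds no match, sort succeeds
ls /nonexistent | grep foo | sort

# eg 2 1 0 with GNU coreutils
echo ${PIPESTATUS[*]}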

So the Bash shell makes it easy to pipe the output of one command into the input of another.

But the concept of feeding the output of a command (or pipeline) into the input of another doesn’t stop there….

Named pipes

FIFOs (also called named pipes) permit independent processes to communicate.

One process opens the FIFO file for writing, and another for reading, after which data can flow as with the usual anonymous pipe in shells or elsewhere.

Here is how to do it.

# FIFOs (ie named pipes)
mkfifo /tmp/testpipe

# should show the named pipe file as 'p'
ls -alhtr /tmp/testpipe

# alternative - check if the file is a named pipe
[[ -p /tmp/testpipe ]] && echo 'named pipe'

# put a text stream into the background while waiting to be read
echo 'ls go1*' >/tmp/testpipe &

# now read the pipe in ssh
ssh -i ~/.ssh/id_rsa -tt me@server </tmp/testpipe

# you should now see a list of the directory after login

# remove the FIFO
unlink /tmp/testpipe

# confirm the FIFO is gone (this should now fail)
ls -alhtr /tmp/testpipe
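
The ssh example needs a real server to log in to. For a self-contained sketch of the same writer/reader idea on one machine (the name /tmp/demopipe is just for illustration):

# create the FIFO
mkfifo /tmp/demopipe

# writer: put a line into the FIFO in the background, waiting to be read
echo 'hello via FIFO' >/tmp/demopipe &

# reader: any command that reads standard input can drain the FIFO
cat </tmp/demopipe

# remove the FIFO
unlink /tmp/demopipe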

Standard input and standard output

A command can operate without arguments. The most basic is the cat command.

# Copy from standard input (ie keyboard) to standard output (ie screen).
# Press Ctrl-D on an empty line to end the input.
cat

The basic cat command copies standard input to standard output. This means that when you execute the command, what you type on the keyboard (standard input) is echoed to the screen (standard output).

So type in ‘hello’ from the keyboard and ‘hello’ is echoed back to the screen.

But if you give an argument (ie a filename) to cat, then the command will not read standard input. Instead cat reads the named file, and the contents of that file are echoed to the screen (standard output).

# Now the cat command will accept a filename argument
# (eg file.txt) as its input instead of reading standard input.
cat file.txt

Note: Output will still be to standard output (ie screen).

But not all commands have this built-in ability to read from the keyboard or from a file named as an argument.

For commands that only accept hard-coded arguments, there are other ways to supply an input.
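
For instance, the ‘<’ operator redirects a file into a command’s standard input, so the command never needs to handle a filename itself:

# Redirect file.txt into cat's standard input.
# Same result as 'cat file.txt', but cat never sees a filename argument.
cat <file.txt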

More standard input and standard output

Let’s examine the ls command. It doesn’t natively take standard input from the keyboard.

# List files of a directory.
ls

But ls does take arguments that are hard coded.

# List the files of a directory, and also take an argument as an input.
ls -l

Note: Output will still be to standard output (ie screen).

So in the example above, the ls command expects the first argument to be input as an option. The option is specified as ‘-l’, and is effectively hard coded.

It’s easy to add more arguments as an input, and they are also hard coded.

# list the files of a directory, and also take 2 arguments for input
ls -l Downloads

Note: Output will still be to standard output (ie screen).

Any argument you specify for this command is typically hard coded as an input.

But what if you don’t want to hard code the argument, and instead want the argument to come from the output of another command or pipeline?

This is where input redirection with the ‘<’ operator, and its close relative process substitution with ‘<( )’, come in.

Input redirection

So examine a simple pipeline, where the output of ls is piped into the input of grep.

We can just list the contents of a directory, and filter to get the sub-directories only.

# Grep takes the pattern 'drw' as its argument,
# and reads its data from the output of ls via the pipe.
ls -l | grep drw

Note: Output will still be to standard output (ie screen).

Now let’s do a similar command, but this time with process substitution. The ‘<( )’ operator signifies that the input is not a hard-coded filename, but the output of a pipeline/command presented as a file.

# In fact, grep can take arguments, pipes, and process substitutions as inputs.
grep drw <(ls -l)

Note: Output will still be to standard output (ie screen).
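
The two operators compose, too. A minimal sketch of the distinction: ‘<( )’ hands grep a filename (eg /dev/fd/63) as an argument, while putting ‘<’ in front redirects that file into grep’s standard input instead:

# redirection plus process substitution: grep reads from standard input,
# not from a filename argument
grep drw < <(ls -l)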

Redirection can be used with any command, but let’s re-examine the cat command again.

This time we’ll use the cat command with a redirected input, plus a redirected output using the ‘>’ operator.

# In fact, as inputs, the cat command will also take a process substitution.
cat <(ls -l | grep drw) >test.txt

So the output is redirected away from standard output, to the file test.txt instead of to the screen. We can then use cat to echo the contents of the file to the screen:
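
# echo the redirected output back to the screen
cat test.txt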

Another input redirection

This time let’s pipe the output of a pipeline into a text editor. eg gvim.

One way to do this is just to tell gvim to use the standard input as an input. ie with the dash ‘-’ operator.

# Another pipe, but this time as an input to gvim.
# Note: dash '-' tells gvim to read standard input as the input
ls -l | grep drw | gvim -

But we can also use process substitution with ‘<( )’ to feed the output of a pipeline into gvim as a file argument.

# Another process substitution, and this time the output of a pipeline
# becomes an argument of gvim
gvim <(ls -l | grep drw)

When the gvim editor opens, we see that the output of the pipeline became its input.

Coprocesses

But it doesn’t stop there: Bash also gives you pipes with redirection, but without a pipe file on disk.

It’s a coprocess.

A coprocess is a shell command preceded by the coproc reserved word. A coprocess is executed asynchronously in a subshell, as if the command had been terminated with the ‘&’ control operator, with a two-way pipe established between the executing shell and the coprocess.

Here is a basic one, with a coprocess set up around an ls command.

# setup a coprocess to execute an ls command, and then send an output back
coproc { echo $(ls -alhtr); }

# the coprocess will be sent to the background so view like this:
jobs

# now read the output from the coprocess, and echo it
read -r out1 <& "${COPROC[0]}"
echo $out1

Unfortunately, the newlines are lost in the output, because the unquoted command substitution collapses them into spaces, but this is only for the basic example.
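
A sketch of one way to keep the newlines: skip the command substitution so the coprocess writes line by line, duplicate the read end onto another file descriptor (3 here is an arbitrary choice), and read in a loop:

# setup a coprocess that writes its listing line by line
coproc { ls -alhtr; }

# duplicate the read end, so it survives after the coprocess exits
exec 3<&"${COPROC[0]}"

# read and echo each line until end of output
while IFS= read -r line <&3; do echo "$line"; done

# close the duplicate descriptor
exec 3<&-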

A more complex example is as follows, where the command is sent after the coprocess is created.

# setup a coprocess to execute a command that will be sent later
# the coprocess is named testcoproc
coproc testcoproc { bash; }

# the coprocess will be sent to the background so view like this:
jobs

# now send a command to the coprocess for execution
echo 'echo hello | sed "s/hello/goodbye/"' >&"${testcoproc[1]}"

# now read the output from the coprocess, and echo it
read output <&"${testcoproc[0]}"
echo $output

Another more complex example is to send multiple commands to the coprocess.

# setup a coprocess to execute a command that will be sent later
coproc testcoproc { ssh -i ~/.ssh/id_rsa -T me@server; }

# the coprocess will be sent to the background so view like this:
jobs

# send a command to the coprocess for execution
echo -e 'ls -alhtr\n' >&"${testcoproc[1]}"

# send another command to the coprocess for execution, ie exit
echo -e 'exit\n' >&"${testcoproc[1]}"

# confirm that the coprocess is stopped
jobs

# but we can still read the output from the coprocess
while read output <&"${testcoproc[0]}"; do echo "$output"; done

One limitation:

There may be only one active coprocess at a time.

There you are — use pipelines, redirection, and coprocesses to make your next shell application the best one yet!
