BASH - Files - Read a file
Read a file (data stream, variable) line-by-line (and/or field-by-field).
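The canonical pattern discussed throughout this page looks like this (a minimal sketch, assuming the filename is in the variable file):

while IFS= read -r line; do
  printf '%s\n' "$line"
done < "$file"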
Field splitting, white-space trimming, and other input processing
When to use the -r option:
- -r: Prevents backslash interpretation (usually used as a backslash newline pair, to continue over multiple lines or to escape the delimiters).
- Without this option, any unescaped backslashes in the input will be discarded.
- You should almost always use the -r option with read.
- The most common exception to this rule is when -e is used, which uses Readline to obtain the line from an interactive shell.
- In that case, tab completion will add backslashes to escape spaces and such, and you do not want them to be literally included in the variable.
- This would never be used when reading anything line-by-line, though, and -r should always be used when doing so (a short demonstration follows this list).
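A minimal demonstration of the difference (the variable names are only illustrative):

# Input containing a backslash sequence:
input='a\tb'

# Without -r, the backslash is treated as an escape character and discarded:
read noraw <<< "$input"
printf '%s\n' "$noraw"    # atb

# With -r, the backslash is preserved literally:
read -r raw <<< "$input"
printf '%s\n' "$raw"      # a\tb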
Input source selection
The redirection < "$file" tells the while loop to read from the file whose name is in the variable file. If you would prefer to use a literal pathname instead of a variable, you may do that as well. If your input source is the script's standard input, then you don't need any redirection at all.
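For example (the literal pathname below is only illustrative):

# A literal pathname instead of a variable:
while IFS= read -r line; do
  printf '%s\n' "$line"
done < /etc/hosts

# The script's own standard input (no redirection needed):
while IFS= read -r line; do
  printf '%s\n' "$line"
done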
If your input source is the contents of a variable/parameter, bash can iterate over its lines using a here string:
while IFS= read -r line; do
  printf '%s\n' "$line"
done <<< "$var"
The same can be done in any Bourne-type shell by using a “here document” (although read -r is POSIX, not Bourne):
while IFS= read -r line; do
  printf '%s\n' "$line"
done <<EOF
$var
EOF
Read from a command instead of a regular file
some command | while IFS= read -r line; do
  printf '%s\n' "$line"
done
This method is especially useful for processing the output of find with a block of commands:
find . -type f -print0 | while IFS= read -r -d '' file; do
  mv "$file" "${file// /_}"
done
NOTE: This reads one filename at a time from the find command and renames the file, replacing spaces with underscores.
- -print0: uses NUL bytes as filename delimiters.
- -d '': Instructs read to read all text into the file variable until it finds a NUL byte.
- By default, find and read delimit their input with newlines; however, since filenames can potentially contain newlines themselves, this default behavior will split up those filenames at the newlines and cause the loop body to fail.
- IFS= : Set to an empty string, because otherwise read would still strip leading and trailing whitespace.
- |: Pipes the output from the find command into the while loop.
- This places the loop in a "subshell", which means any state changes you make (changing variables, cd, opening and closing files, etc.) will be lost when the loop finishes; a demonstration follows the next example.
- To avoid that, you may use a ProcessSubstitution:
while IFS= read -r line; do
  printf '%s\n' "$line"
done < <(some command)
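To see the difference the subshell makes, here is a small sketch (the counter variable is only illustrative): with the pipe, increments made inside the loop are lost; with process substitution, they persist.

count=0
printf 'a\nb\nc\n' | while IFS= read -r line; do
  count=$((count + 1))
done
echo "$count"    # prints 0: the loop ran in a subshell

count=0
while IFS= read -r line; do
  count=$((count + 1))
done < <(printf 'a\nb\nc\n')
echo "$count"    # prints 3: the loop ran in the current shell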
My text files are broken! They lack their final newlines!
If there are some characters after the last line in the file (or to put it differently, if the last line is not terminated by a newline character), then read will read it but return false, leaving the broken partial line in the read variable(s). You can process this after the loop:
# Emulate cat
while IFS= read -r line; do
  printf '%s\n' "$line"
done < "$file"
[[ -n $line ]] && printf %s "$line"
or:
# This does not work:
printf 'line 1\ntruncated line 2' | while read -r line; do echo $line; done

# This does not work either:
printf 'line 1\ntruncated line 2' | while read -r line; do echo "$line"; done; [[ $line ]] && echo -n "$line"

# This works:
printf 'line 1\ntruncated line 2' | { while read -r line; do echo "$line"; done; [[ $line ]] && echo "$line"; }
The first example, beyond missing the after-loop test, is also missing quotes. See Quotes or Arguments for an explanation why. The Arguments page is an especially important read.
For a discussion of why the second example above does not work as expected, see FAQ #24.
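Returning to the quoting issue: a quick sketch of why the missing quotes matter (the value here is only illustrative):

line='  *  '
echo $line              # unquoted: leading/trailing whitespace is lost and * may glob to filenames
printf '%s\n' "$line"   # quoted: the value is printed intact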
Alternatively, you can simply add a logical OR to the while test:
while IFS= read -r line || [[ -n $line ]]; do
  printf '%s\n' "$line"
done < "$file"

printf 'line 1\ntruncated line 2' | while read -r line || [[ -n $line ]]; do echo "$line"; done
How to keep other commands from "eating" the input
Some commands greedily eat up all available data on standard input. The examples above do not take precautions against such programs. For example,
while read -r line; do
  cat > ignoredfile
  printf '%s\n' "$line"
done < "$file"
will only print the contents of the first line, with the remaining contents going to “ignoredfile”, as cat slurps up all available input.
One workaround is to use a numeric FileDescriptor rather than standard input:
# Bash
while IFS= read -r -u 9 line; do
  cat > ignoredfile
  printf '%s\n' "$line"
done 9< "$file"

# Note that read -u is not portable to every shell.
# Use a redirect to ensure it works in any POSIX compliant shell:
while IFS= read -r line <&9; do
  cat > ignoredfile
  printf '%s\n' "$line"
done 9< "$file"
or:
exec 9< "$file"
while IFS= read -r line <&9; do
  cat > ignoredfile
  printf '%s\n' "$line"
done
exec 9<&-
This example will wait for the user to type something into the file ignoredfile at each iteration instead of eating up the loop input.
You might need this, for example, with mencoder which will accept user input if there is any, but will continue silently if there isn't. Other commands that act this way include ssh and ffmpeg. Additional workarounds for this are discussed in FAQ #89.
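One such per-command workaround is to detach the greedy command's standard input, assuming the command does not actually need it (the host list filename below is only illustrative; ssh also has a -n option for the same purpose):

while IFS= read -r host; do
  ssh "$host" uptime < /dev/null    # < /dev/null keeps ssh from consuming the loop's input
done < "$hostfile"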