The following is the GNU All-permissive License as recommended in https://www.gnu.org/licenses/license-recommendations.en.html
Copyright (C) 2024 Free Software Foundation sysadmin@fsf.org
Copying and distribution of this file, with or without modification, are permitted in any medium without royalty provided the copyright notice and this notice are preserved. This file is offered as-is, without any warranty.
FSF Tech Team Bash style guide
Contributions are welcome
(git svn clone --stdlayout svn://svn.savannah.gnu.org/administration/ ;
see https://savannah.gnu.org/maintenance/HowToAdminThisWiki/).
Send patches to
https://lists.fsf.org/mailman/listinfo/tech-volunteer-meeting. Info on
volunteering with the FSF tech team:
https://libreplanet.org/wiki/Group:FSF:Tech_Team_Volunteers
This is a general style guide, but it is focused on the FSF tech team and the volunteers we work with, administering about 130 GNU/Linux systems.
Please donate at https://www.fsf.org/ to help advance computer user freedom.
About other documentation
Useful:
info bash (95% of that is also in man bash).
Stack Overflow is good, but often has answers that are not robust, and posting to the site may require running JavaScript. For a common example, see https://mywiki.wooledge.org/ParsingLs
About "portability": If you read something online that says, you should do it this way because is "more portable", you often want the exact opposite. It often means that bash or GNU coreutils can do it in a more readable or otherwise better way. We all deserve to use systems where we can install and use GNU software, don't hamstring yourself for the sake of catering to less capable systems unless you really have some good reason to do so. It is usually easier to wait and port your software if you need it.
Don't be like Google and publish a bash style guide under a proprietary license.
Things mostly relevant to sharing
Always use bash, never /bin/sh, etc.
Use common sense and be consistent. If you change script style, and it is in a git repo, try to make a commit which only changes the style.
Scripts must be executable, preferably with no extension, but .sh is okay. Names should be lowercase-dash-separated. Scripts meant to be sourced (libraries) are more okay to have a .sh extension.
If a script is getting published, follow https://www.gnu.org/licenses/license-recommendations.en.html.
Shellcheck
The shebang should not have arguments, so that bash script-name works the same as running the script directly. Use the set command instead.
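For example (a minimal sketch):
#!/bin/bash
# Putting options in a set command, not the shebang, keeps `bash script-name`
# and ./script-name behaving the same.
set -eE -o pipefail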
Before using any script, it should have no output from:
shellcheck -x -e 2046,2068,2086,2206,2029,2033,2054,2164,2254,2317 SCRIPT_FILE
Reasons for the exceptions
The numbers above are the shellcheck rule numbers being excluded; the reasons for each follow.
The following exceptions are for quoting variables. Shellcheck complains that unquoted variables will undergo splitting and globbing. You should simply know this and account for it in your code. Most variables are things that we know don't have spaces, and it would be too verbose to quote them all the time. If the variable could contain characters such as spaces, *, ?, [, etc., then definitely quote it unless you want globbing. If the variable is set from one of the program's CLI args, always quote it, because the person running the script can easily make errors. For scripts which are expected to take un-trusted input, or process files or variables which could have spaces, run shellcheck with no exceptions to verify quoting. A short example of this guidance follows the quoting-related codes below.
# 2046 = unquoted $(cmd)
# 2068 = Double quote array expansions to avoid re-splitting elements.
# 2086 = unquoted $var
# 2206 = unquoted $var in array=($var)
# 2254 = unquoted $var in case x in $var)
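For example, applying the quoting guidance above (a minimal sketch; variable names are illustrative):
log_dir=/var/log/myscript   # we know this has no spaces or glob characters
mkdir -p $log_dir           # so quoting is optional
user_path="$1"              # CLI argument: always quote
rm -v -- "$user_path"       # and quote it when expanding, too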
Miscellaneous exceptions:
# 2029 = Using un-escaped variable in ssh call. Reason: This never
#        works, warning is generally a false positive.
# 2032 = Warning about calling shell function over ssh. There are ways
#        for this to be a false positive, and we generally don't need a
#        warning that shellcheck doesn't know if a command exists.
# 2033 = Shell function as argument to another shell. Reason: This never
#        works, warning is generally a false positive.
# 2054 = Array declaration contains a comma. Reason: This is warning of a
#        mistake based on how other languages work; but commas are not
#        special in bash, and even unquoted, they are legitimate array
#        elements.
# 2164 = `cd` without condition for failure. Reason: We use automatic
#        error handling.
# 2317 = Unreachable code. Too many false positives.
This is based on shellcheck version: 0.8.0 from Trisquel 11. A newer shellcheck probably has new rules worth ignoring.
Other shellcheck issues
If there is a need for something shellcheck warns about, use a directive to override the warning, eg:
# shellcheck disable=SC2035 # justification
If the shellcheck you are using is too old for -x, get a new one using package pinning or Haskell stack.
For shellcheck SC2207/2206, how to avoid x=( $(cmd) ) and y=( $var ), since it unnecessarily expands globs:
For splitting a variable on spaces without glob expansion:
tmpstr=$(cmd)
IFS=" " read -r -a array <<<"$tmpstr"
And for splitting on lines:
var=$(cmd)
mapfile -t array <<<"$var"
Documentation
Add comments describing functions that are not short and obvious.
Document function arguments. Preferably document global variables used and nonzero returns. See https://savannah.nongnu.org/git/?group=bash-bear-trap for examples.
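For example, a sketch of a documented function (the name and layout here are illustrative, not taken from that repo):
# Print the first IP address of a host.
# Arguments: $1 = hostname
# Globals: none
# Returns: nonzero if the lookup fails (with pipefail set)
lookup-ip() {
  getent hosts "$1" | awk '{print $1; exit}'
}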
Comment any tricky or important parts of code.
Use TODO comments, eg:
# TODO iank: add x functionality
Document whenever a script is not idempotent. Try to make scripts be as idempotent as feasible. Idempotent means running the same script twice results in the same state, either repeating some steps or skipping steps that were already done, and it handles the case of a previous run which exited mid-way through.
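For example, a couple of idempotent steps (the paths and config line are illustrative); each can be re-run safely after a partial earlier run:
mkdir -p /etc/myservice
line='PermitRootLogin no'
if ! grep -qxF "$line" /etc/ssh/sshd_config; then
  echo "$line" >>/etc/ssh/sshd_config
fi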
Code style
Indent 2 spaces.
Try to wrap lines longer than ~80 chars. Use \ followed by newline where appropriate.
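For example (the command and paths are illustrative):
rsync -av --delete --exclude-from=/etc/backup-excludes \
  /srv/important-data/ backup-host:/srv/backups/important-data/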
Use here documents to print multiple lines instead of multiple echo lines.
cat <<END
I have lots
of stuff to say.
END
Use one line for if ..; then and for ..; do:
if x; then
  true
fi

for x; do
  echo 'here $x is $1, then $2, up until $# - 1'
done
Variables:
Environment variables are all caps. Other vars are lowercase. Underscore separates words. Declare constants and environment variables at the top of the file when possible.
Never quote literal integers.
Prefer single-quoting strings that have no substitution (nothing that starts with $).
Command arguments and variable assignments can optionally be quoted, eg: x='oo'.
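A short sketch of these conventions (names are illustrative):
declare -xr BACKUP_HOST=backup1.example.org  # environment variable: all caps
readonly max_retries=3                       # constant: lowercase, literal integer unquoted
retry_count=0                                # ordinary variable, underscore-separated
greeting='hello'                             # no substitution: single quotes preferred
msg="$greeting from $BACKUP_HOST"            # substitution: double quotes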
Functions:
Put all functions together near the top of the file, after includes, constants, and set statements.
Error handling
There are 2 options: automatic and manual. Pick one and use it consistently throughout your script.
Automatically exit on failed command:
For scripts with no functions add:
if ! test "$BASH_VERSION"; then echo "error: shell is not bash" >&2; exit 1; fi
shopt -s inherit_errexit 2>/dev/null ||: # ignore fail in bash < 4.4
set -eE -o pipefail
trap 'echo "$0:$LINENO:error: \"$BASH_COMMAND\" returned $?" >&2' ERR
For scripts with functions, copy/paste or source the bash-bear script (https://savannah.nongnu.org/git/?group=bash-bear-trap), which will exit on error and print a stack trace.
When you want to manually handle some error, use a variable, for example:
set -e; . /path/to/bash-bear; set +e
test-func() {
  result=true
  echo do stuff
  result=false
}
test-func
$result || echo handling false result
echo this prints too
To allow a specific command to fail, make it part of a conditional, eg:
iptables -D rule || [[ $? == 1 ]] # allow exit code 1 when rule doesn't exist
Gotchas in automatic error handling:
Functions in conditionals
Don't put functions in conditionals generally, because the trap is ignored in that function call. Instead, use a result variable. However, it is okay for very short functions when only the last line can fail. eg:
afunc() { echo ok; "$@"; }
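For anything longer, call the function on its own line and communicate through a result variable, as in the manual-handling example above (check-disk is an illustrative function that sets result=true or result=false):
# Avoid: errexit and the ERR trap are suppressed inside check-disk during this call.
if check-disk; then
  echo disk ok
fi
# Instead:
check-disk
if $result; then
  echo disk ok
fi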
Process Substitution
Don't use process substitution when the command inside could fail, because the trap won't catch it. Eg, do NOT do this:
while read -r l; do something; done < <(command-that-could-fail)
Instead, do this:
tmp=$(mktemp)
command-that-could-fail >$tmp
while read -r l; do something; done <$tmp
Command substitution
This won't be caught by error trap:
echo hello $(command-that-could-fail)
Instead, do this:
var=$(command-that-could-fail)
echo hello "$var"
Same deal for all of these:
for f in $(command-that-could-fail); do true; done
if [[ $(command-that-could-fail) != x ]]; do true; done
local var=$(command-that-could-fail)
declare var=$(command-that-could-fail)
readonly var=$(command-that-could-fail)
Declare, then assign:
local var
var=$(command-that-could-fail)
Arithmetic expressions
If there is a syntax error in an arithmetic conditional expression (( expression )), the condition is false-ish, but it will not cause the program to exit nor traps to fire. This is generally the case for conditionals (eg: if statements), but it is a common, yet avoidable, source of bugs.
# Avoid this
x=/s
if (( x )); then
  do something
fi

# Instead, pick one of the following:

# option a:
x=/s
# This will fail due to syntax error.
tmp=$(( x ))
if (( tmp )); then
  do something
fi

# option b:
x=/s
if (( x )); then
  do something
else
  echo some error happened >&2
  exit 1
fi
Manual error checking
Explicit handling should be done for almost all commands that are not bash builtins.
Add this to the top of your script. It is like automatic exit on error, but just prints when a command failed. It will help identify if your manual error checking was incomplete.
set -E -o pipefail
trap 'echo "$0:$LINENO:error: \"$BASH_COMMAND\" returned $?" >&2' ERR
Or for scripts with functions, you may want to manually print a stack trace, see https://savannah.nongnu.org/projects/bash-bear-trap/.
Manual error handling example:
ret=0
iptables -D $rule || ret=$?
if [[ $ret != [01] ]]; then
  err-exit exiting due to iptables failure with exit code: $ret
fi
Additional error handling notes:
In some circumstances pipefail may be too blunt. In that case, check ${PIPESTATUS[@]}.
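For example, a sketch assuming automatic error handling is on (command names are illustrative): run the pipeline inside if so a harmless "no match" from grep does not abort the script, then inspect each command's status.
if journalctl -b | grep -i oom-killer >/tmp/oom.log; then
  pstatus=("${PIPESTATUS[@]}")
else
  pstatus=("${PIPESTATUS[@]}")
fi
# grep exiting 1 only means no match; fail on anything else.
if (( pstatus[0] != 0 || pstatus[1] > 1 )); then
  echo "$0: error: pipeline failed: ${pstatus[*]}" >&2
  exit 1
fi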
Good practices
Make variables set at the beginning of a script constant when possible.
# Constant
readonly conf_dir=/some/path
# Constant and environment
declare -xr conf_dir=/some/path
# A constant set based on condition needs to be set readonly after it's set.
verbose=false
if [[ $1 ]]; then
  verbose=true
fi
readonly verbose
Use local to properly scope variables as "function-local". This adds readability and avoids errors from using variables in unintended scopes, eg: local y=5.
Always use return 0, never return, because you can return nonzero by accident otherwise.
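For example (the function and file name are illustrative):
check-config() {
  [[ -e /etc/myservice.conf ]] && echo 'config found'
  # Without the explicit return 0, a missing file would leave the && list's
  # status of 1 as the function's return value, and under automatic error
  # handling the caller would treat that as a failure.
  return 0
}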
Prefer the use of built-ins such as the Parameter Expansion functions (see info '(bash)Shell Parameter Expansion') as they are more robust.
# Prefer this:
addition=$((x + y))
substitution="${string/#foo/bar}"
# Instead of this:
addition="$(expr $x + $y)"
substitution="$(echo "${string}" | sed -e 's/^foo/bar/')"
Avoid eval.
Avoid $_. It is easy to alter the environment such that it is broken. So far, we have seen this happen only in the interactive shell.
When a var might have spaces, double quote it.
When input is generally un-trusted, protect against input that starts with dash.
rm -v -- "$var"
rm -v ./* # a file could be named -rf
Check for empty & unset variables with [[ $var ]].
Optionally use set -u when using automatic error handling. It can prevent errors in somewhat complicated scripts where you set variables within conditionals. It also makes code verbose, because unset/empty variables are often useful, and so then you need to set variables to empty, eg x=, complicating your code and potentially also leading to bugs. Avoid relying on set -u. If you see a situation where you think set -u might catch an error, add an explicit test for an empty variable.
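For example (the variable name is illustrative):
if [[ ! $dest_host ]]; then
  echo "$0: error: dest_host is unset or empty" >&2
  exit 1
fi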
Prefer $var, except when ${var} is necessary or adds clarity, like "comb${x}ining strings".
Never use [. [[ is special syntax in which variable expansions are not subject to word splitting or globbing, which avoids edge cases.
[[ $filename == *.png ]] # right-hand globs over left-hand - matches filename with .png extension
# Quote only the right-hand side when globbing is not desired.
[[ $filename == "*.png" ]] # matches filename with literal asterisk "*.png"
Use double-equal for string equality, eg: [[ $x == 2 ]].
Use getopt to parse options when the script supports 2 or more arguments. The below snippet is a good template. Reference info on most distros is at /usr/share/doc/util-linux/examples/.
usage() {
  cat <<EOF
Usage: ${0##*/} [OPTIONS] ARG1 ARG2
One line description
-b                    Set this boolean option.
-l|--long-opt OPTION  Set this long option.
-h|--help             Print help and exit.
Note: Uses GNU getopt options parsing style
EOF
  exit ${1:-0}
}

##### begin command line parsing ########
bool_opt=false
long_opt=foo
temp=$(getopt -l help,long-opt: hbl: "$@") || usage 1
eval set -- "$temp"
while true; do
  case $1 in
    -b) bool_opt=true ;;
    -l|--long-opt) long_opt="$2"; shift ;;
    -h|--help) usage ;;
    --) shift; break ;;
    *) echo "$0: Internal error! unexpected args: $*" >&2; exit 1 ;;
  esac
  shift
done
if (( $# != 2 )); then
  echo "$0: error: expected 2 arguments, got $#." >&2
  exit 1
fi
arg1="$1"
arg2="$2"
##### end command line parsing ########
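For example, a script built from this template (the script name here is illustrative) could be invoked as:
./myscript -b --long-opt bar file1 file2
./myscript --help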
Very common code snippets
Print script name & command, then execute command:
m() { printf '%s: %s\n' "${0##*/}" "$*"; "$@"; }
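For example (the rsync command and hosts are illustrative):
m rsync -a /srv/data/ backup-host:/srv/data/
# prints "script-name: rsync -a /srv/data/ backup-host:/srv/data/", then runs it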
Make sure the script is run as root:
[[ $EUID == 0 ]] || exec sudo -E "${BASH_SOURCE[0]}" "$@"
cd to the directory of the script file, for example when data files are stored in that directory:
this_file="$(readlink -f -- "${BASH_SOURCE[0]}")"
readonly this_file this_dir="${this_file%/*}"
cd "$this_dir"
Globbing files succinctly requires shopt -s nullglob. Once you realize you need nullglob, add it at the top of your script. It is very easy to forget, so it is a good idea to put it at the top of all your scripts.
shopt -s nullglob
for f in dir/*; do
  some-command "$f"
done