Hey all, I found this blog that lists some interesting tips for bash script writing. I'm not experienced enough to vet all of these, but a few of them I can definitely agree with.
I'm not sure about #13 and the whole magic variable thing, though. What does it accomplish?
Anyway, does anyone agree/disagree with the list?
1. Use long options (logger --priority vs logger -p). If you're on the CLI, abbreviations make sense for efficiency, but when you're writing reusable scripts a few extra keystrokes will pay off in readability and save you (or your collaborators) future trips to the man pages.
2. Use set -o errexit (a.k.a. set -e) to make your script exit when a command fails.
3. Then add || true to commands that you allow to fail.
4. Use set -o nounset (a.k.a. set -u) to exit when your script tries to use undeclared variables.
5. Use set -o xtrace (a.k.a set -x) to trace what gets executed. Useful for debugging.
6. Use set -o pipefail in scripts to catch mysqldump failures in e.g. mysqldump | gzip. The exit status of the last command that exited with a non-zero code is returned.
7. #!/usr/bin/env bash is more portable than #!/bin/bash.
8. Avoid using #!/usr/bin/env bash -e (vs set -e), because when someone runs your script as bash ./script.sh, the exit on error will be ignored.
9. Surround your variables with {}. Otherwise bash will try to access the $ENVIRONMENT_app variable in /srv/$ENVIRONMENT_app, whereas you probably intended /srv/${ENVIRONMENT}_app.
10. You don't need two equal signs when checking if [ "${NAME}" = "Kevin" ].
11. Surround your variable with " in if [ "${NAME}" = "Kevin" ], because if $NAME isn't declared, bash will throw a syntax error (also see nounset).
12. Use :- if you want to test variables that could be undeclared. For instance: if [ "${NAME:-}" = "Kevin" ] will set $NAME to be empty if it's not declared. You can also set it to noname, like so: if [ "${NAME:-noname}" = "Kevin" ] (a quick demo of this and a few other tips follows the list).
13. Set magic variables for current file, basename, and directory at the top of your script for convenience.
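To see what a few of these look like in practice (#3, #6, #9 and #12), here's a quick throwaway demo I put together; mydb and the ENVIRONMENT variable are just made-up names, not anything from the list itself:
#!/usr/bin/env bash
# Quick demo of tips #3, #6, #9 and #12; mydb and ENVIRONMENT are made up.
set -o errexit
set -o nounset
set -o pipefail
# #3: this grep may find nothing; || true keeps errexit from killing the script.
grep -q "example" /etc/hostname || true
# #6: with pipefail, a failing mysqldump is not masked by gzip exiting 0.
# mysqldump mydb | gzip > mydb.sql.gz
# #9: braces stop bash from looking for a variable named ENVIRONMENT_app.
ENVIRONMENT="prod"
echo "/srv/${ENVIRONMENT}_app"
# #12: ${NAME:-noname} substitutes a default, so nounset doesn't abort on an unset NAME.
if [ "${NAME:-noname}" = "Kevin" ]; then
    echo "Hello Kevin"
fi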
Summarizing, why not start your next bash script like this:
#!/usr/bin/env bash
# Bash3 Boilerplate. Copyright (c) 2014, kvz.io
set -o errexit
set -o pipefail
set -o nounset
# set -o xtrace
# Set magic variables for current file & dir
__dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
__file="${__dir}/$(basename "${BASH_SOURCE[0]}")"
__base="$(basename "${__file}" .sh)"
__root="$(cd "$(dirname "${__dir}")" && pwd)" # <-- change this as it depends on your app
arg1="${1:-}"
Here's the link to the original author: Bash- Github List
Last edited by Horizon_Brave (2016-03-26 19:48:38)
"I have not failed, I have found 10,000 ways that will not work" -Edison
Offline
I agree with some of the list, but there are a few points that I don't like:
#2: sometimes you want to catch errors and try to *repair* the behavior of the script. See https://github.com/olmokramer/.dotfiles … ue2ogg#L63 for example, where I convert the audio file to a flac file before splitting if the original codec isn't recognized. The errexit option would cause the script to fail immediately if shnsplit doesn't recognize the codec of the original file, although ffmpeg may very well recognize the filetype and may happily convert it for you.
(In case you don't know the commands: the script I linked to takes an audio file and a .cue file (which it tries to determine from the name of the audio file in case it isn't passed) and splits it into the separate tracks, after which it converts the split files to .ogg and moves them to the current $PWD. The important part is that it tries to convert the input to a known file type when it isn't recognized (although the detection could be better, of course). This all requires the shnsplit program to fail the first time)
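Roughly, the pattern looks like this (just a sketch of the idea, not the actual script; the filenames and the exact shnsplit flags are assumptions, check shntool's man page):
#!/usr/bin/env bash
# Let shnsplit fail, then repair: convert with ffmpeg and retry.
audio="$1"
cue="${2:-${audio%.*}.cue}"    # guess the cue file from the audio name if not passed
if ! shnsplit -f "$cue" -o flac "$audio"; then
    # shnsplit didn't recognize the codec, so convert to flac first and retry
    ffmpeg -i "$audio" "${audio%.*}.flac"
    shnsplit -f "$cue" -o flac "${audio%.*}.flac"
fi
Note that this only works because the script is allowed to survive the first failure, which errexit would prevent.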
#4 can cause some confusion when the script author wants to test if the user passed some argument to the script in $1, $2, ... so be sure that's really what you want. A workaround is to check that before you enable the "set -o nounset" option (see the sketch below), but then it goes against the advice to put the option at the top of your script.
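For example, something like this (sketch only, argument names made up):
#!/usr/bin/env bash
# Check the positional parameters *before* turning on nounset,
# or use the ${1:-} form so a missing $1 doesn't abort the script.
if [ $# -lt 1 ]; then
    echo "usage: $0 <inputfile>" >&2
    exit 1
fi
set -o nounset
input="$1"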
#5 adds a lot (and I mean a lot) of output the end user isn't interested in. It's nice for debugging, but be sure to take it out before shipping the production code, or put it behind a switch as sketched below.
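One way to do that (DEBUG is just a made-up variable name here):
# Only enable xtrace when the caller explicitly asks for it.
if [ -n "${DEBUG:-}" ]; then
    set -o xtrace
fi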
#10 is just confusing IMO, because most programming languages use == for equality tests and a single = for assignment.
I believe that #12 is just plain wrong; at least when I run
A="";
if [ "${A:-test}" = "Kevin" ]; then
echo yes
else
echo no
fi
echo ${A}
it just prints "no" and an empty line, indicating that $A isn't set to the value passed after the ":-". All the :- does is substitute that fallback value in the comparison when $A is unset or empty; the value of $A itself is never changed by the expansion.
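If you actually want the variable to be assigned, that's what the := form does (quick sketch, not something from the original list):
A=""
echo "${A:-test}"   # prints "test", but A stays empty
echo "A is '${A}'"  # A is ''
echo "${A:=test}"   # := assigns: prints "test" and sets A to "test"
echo "A is '${A}'"  # A is 'test'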
#13 I think refers to the variables starting with the double underscores in the snippet at the end of your post. Setting them at the start of your script lets you use them anywhere else in the script, e.g. to build paths relative to the script itself rather than to wherever it happens to be called from.
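For example (config.sh and logs/ are made-up names, just to show the idea):
# The __ variables come from the boilerplate in the first post.
source "${__dir}/config.sh"            # a file that sits next to the script
log="${__root}/logs/${__base}.log"     # e.g. <app root>/logs/<script name>.log
echo "running ${__file}" >> "$log"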
So although there are definitely some useful things in this list, you should think about what you want to achieve and use the options in accordance with that.
Hope this helps
Offline
Ha, thanks for the reply to this, Olm; I actually forgot about this thread... was hoping it'd take off with others contributing their experience as well. You bring up some really good points. I suppose #2 is definitely a case-by-case scenario, where the failure is required for another set of actions to occur. That hadn't occurred to me.
Thanks for your input, and please feel free to share your own 'guidelines' when writing a script.
"I have not failed, I have found 10,000 ways that will not work" -Edison
Offline
I'll just drop this and am on my way again.
That..in every literal sense of the word, is a wall of text.
"I have not failed, I have found 10,000 ways that will not work" -Edison
Offline
While not strictly scripting ... CLI Magic is a nice collection of useful cli tips - http://climagic.org/
Offline
I'll just drop this and am on my way again.
tl;dr: The Bourne-Again SHell is the only shell.
I can't fault this policy, as bash is on practically every GNU/Linux installation everywhere.
Be excellent to each other, and...party on, dudes!
BunsenLabs Forum Rules
Tending and defending the Flame since 2009
Offline
Some notes from the actual "project" I've been working on lately.
I've been using this
file=$(readlink -f "$1")
baseext=$(basename "${1}") # file.ext
base="${baseext%.*}" # file
inside the while loop and this
# script name into "$me"
me=$(basename "${BASH_SOURCE[0]}")
for noncritical echo "$me" messages (which is useful if one runs multiple scripts in parallel or calls scripts from scripts)
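So inside the loop it ends up looking roughly like this (a rough sketch, not the actual loop):
me=$(basename "${BASH_SOURCE[0]}")
for input in "$@"; do
    file=$(readlink -f "$input")
    baseext=$(basename "$input")    # file.ext
    base="${baseext%.*}"            # file
    echo "$me: processing $base"
done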
Also been thinking about a more modular approach to things, to avoid long spaghetti code; for example, this one runs a bunch of others (separate cfg, separate audio encoder, separate hasVideo short script, etc.).
I could not figure out how to communicate stuff between multiple scripts calling each other while still keeping each script able to run nicely on its own, so ugly stuff like this appears:
# indirect? (if called from another script then also overwrite $out)
# example call: thisScript scriptcall "/some/writable/dir" inputfile
if [ "$1" == "scriptcall" ]; then # this is called from another script
# now check 2nd parameter (new $out) which should be writable and directory
if [ -d "$2" ] && [ -w "$2" ] ; then
echo "$2 is writable directory"
out="$2" # new writable output directory is the one that was passed by calling script (its $tmp)
shift ; shift # remove 1st two, so only files are left for further loops
else
echo "$2 is not directory or writable"; exit 1
fi
else
echo "direct call"
fi
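The caller side then looks roughly like this (a sketch, names as in the comments above):
# The parent script makes a scratch dir, passes it as the new $out,
# then hands over the remaining input files.
tmp=$(mktemp -d)
./thisScript scriptcall "$tmp" inputfile1 inputfile2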
At the end of the day, if it works... (At this point I'm well over 1000 encodes, so mostly happy with the stuff.)
Also, a bunch of errors may occur, but in the end one should really just care whether something was produced at the output, and exit if not, like:
test -f "$out/$baseout.mp4" && echo "$me output $out/$baseout.mp4" || { echo "$me No output file found, error, exiting" ; exit 1; }
(Unlikely that some sort of null output would happen due to previous steps)
Generally speaking, if you prepare lots of input checks (in this case hasVideo, hasAudio), you also cover a lot of unhappy scenarios, but it appears hard to cover them all.
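e.g. something like this (hasVideo/hasAudio are my own helper scripts; the exact interface is simplified here):
./hasVideo "$file" || { echo "$me no video stream in $file, exiting"; exit 1; }
./hasAudio "$file" || { echo "$me no audio stream in $file, exiting"; exit 1; }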
p.s. So far everything evil was found by enabling set -x when a script misbehaved.
p.s.2. About the filesystem: in this case I have a virtual machine which has a read-only ~/input mapped to the host directory where the input media files are stored, and a read-write ~/output mapped to a single dir. This allows for a lot of experimenting, knowing it would be hard to harm the original filesystem or files. fstab looks like this:
# virtual box shared folders, source is readonly, tmp is readwrite.
source /home/ticho/input vboxsf uid=1000,gid=100,ro,dmode=700,fmode=600,comment=systemd.automount 0 0
tmp /home/ticho/output vboxsf uid=1000,gid=100,rw,dmode=700,fmode=600,comment=systemd.automount 0 0
This:
So although there are definitely some useful things in this list, you should think about what you want to achieve and use the options in accordance with that.
Last edited by brontosaurusrex (2016-06-10 05:31:17)
Offline
dot|not wrote: I'll just drop this and am on my way again.
That..in every literal sense of the word, is a wall of text.
you have to enable javascript for that page.
Offline
Horizon_Brave wrote: dot|not wrote: I'll just drop this and am on my way again.
That..in every literal sense of the word, is a wall of text.
you have to enable javascript for that page.
Hehe, you're right. I had my often over-aggressive NoScript running. I love it, but it sometimes breaks everything without me even realizing.
"I have not failed, I have found 10,000 ways that will not work" -Edison
Offline
you have to enable javascript for that page.
I read through it without javascript -- I just presumed that the author disliked whitespace...
Offline