
Linux.Bash.Reference

#linux #bash #ksh #sh #reference #quickreference #script

#Shells and modes
#Interactive mode
#Non-interactive mode
#Exit codes
#Comments
#Variables
#Local variables
#Environment variables
#Positional parameters
#Shell expansions
#Brace expansion
#Command substitution
#Arithmetic expansion
#Double and single quotes
#Arrays
#Array declaration
#Array expansion
#Array slice
#Adding elements into an array
#Deleting elements from an array
#Streams, pipes and lists
#Streams
#Pipes
#Lists of commands
#Conditional statements
#Primary and combining expressions
#Using an if statement
#Using a case statement
#Loops
#for loop
#while loop
#until loop
#select loop
#Loop control
#Functions
#Debugging
#Afterword
#Want to learn more?
#Other resources
#License


# Shells and modes
# The bash shell can work in two modes: interactive and non-interactive.
# 
# Interactive mode
# If you are working on Ubuntu, you have seven virtual terminals available to you. The desktop environment takes place in the seventh virtual terminal, so you can return to a friendly GUI using the Ctrl-Alt-F7 keybinding.
# 
# You can open the shell using the Ctrl-Alt-F1 keybinding. After that, the familiar GUI will disappear and one of the virtual terminals will be shown.
# 
# If you see something like this, then you are working in interactive mode:
# 
# user@host:~$
# Here you can enter a variety of Unix commands, such as ls, grep, cd, mkdir, rm and see the result of their execution.
# 
# We call this shell interactive because it interacts directly with the user.
# 
# Using a virtual terminal is not really convenient. For example, if you want to edit a document and execute another command at the same time, you are better off using virtual terminal emulators like:
# 
# GNOME Terminal
# Terminator
# iTerm2
# ConEmu
# Non-interactive mode
# In non-interactive mode, the shell reads commands from a file or a pipe and executes them. When the interpreter reaches the end of the file, the shell process terminates the session and returns to the parent process.
# 
# Use the following commands for running the shell in non-interactive mode:

sh /path/to/script.sh
bash /path/to/script.sh

#In the example above, script.sh is just a regular text file that consists of commands the shell interpreter can evaluate and sh or bash is the shell's interpreter program. You can create script.sh using your preferred text editor (e.g. vim, nano, Sublime Text, Atom, etc).
#
#You can also simplify invoking the script by making it an executable file using the chmod command:

chmod +x /path/to/script.sh

# Additionally, the first line in the script must indicate which program it should use to run the file, like so:

#!/bin/bash
echo "Hello, world!"

#Or if you prefer to use sh instead of bash, change #!/bin/bash to #!/bin/sh. This #! character sequence is known as the shebang. Now you can run the script like this:

/path/to/script.sh

#A handy trick we used above is using echo to print text to the terminal screen.
#
#Another way to use the shebang line is as follows:

#!/usr/bin/env bash
echo "Hello, world!"

# The advantage of this shebang line is it will search for the program (in this case bash) based on the PATH environment variable. This is often preferred over the first method shown above, as the location of a program on a filesystem cannot always be assumed. This is also useful if the PATH variable on a system has been configured to point to an alternate version of the program. For instance, one might install a newer version of bash while preserving the original version and insert the location of the newer version into the PATH variable. The use of #!/bin/bash would result in using the original bash, while #!/usr/bin/env bash would make use of the newer version.

# Exit codes
# Every command returns an exit code (return status or exit status). A successful command always returns 0 (zero-code), and a command that has failed returns a non-zero value (error code). Failure codes must be positive integers between 1 and 255.

# Another handy command we can use when writing a script is exit. This command is used to terminate the current execution and deliver an exit code to the shell. Running exit without any arguments will terminate the running script and return the exit code of the last command executed before exit.

# When a program terminates, the shell assigns its exit code to the special variable $?. The $? variable is how we usually test whether a script has succeeded or not in its execution.

# In the same way we can use exit to terminate a script, we can use the return command to exit a function and return an exit code to the caller. You can use exit inside a function too and this will exit the function and terminate the program.
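
# For illustration, a minimal sketch (missing_file is just a placeholder name that is assumed not to exist):

ls missing_file  # this command fails because the file does not exist
echo $?          # non-zero exit code, e.g. 2
echo "done"      # this command succeeds
echo $?          # 0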

# Comments
# Scripts may contain comments. Comments are special statements ignored by the shell interpreter. They begin with a # symbol and continue on to the end of the line.

# For example:

#!/bin/bash
# This script will print your username.
whoami

# Tip: Use comments to explain what your script does and why.

# Variables
# Like in most programming languages, you can also create variables in bash.

# Bash has no data types. Variables can contain only numbers or a string of one or more characters. There are three kinds of variables you can create: local variables, environment variables and positional parameters.

# Local variables
# Local variables are variables that exist only within a single script. They are inaccessible to other programs and scripts.

# A local variable can be declared using the = sign (there must not be any spaces between the variable's name, the = and its value) and its value can be retrieved using the $ sign. For example:

username="denysdovhan"  # declare variable
echo $username          # display value
unset username          # delete variable


# We can also declare a variable local to a single function using the local keyword. Doing so causes the variable to disappear when the function exits.

local local_var="I'm a local value"

# Environment variables
# Environment variables are variables accessible to any program or script running in the current shell session. They are created just like local variables, but using the keyword export.

export GLOBAL_VAR="I'm a global variable"

# There are a lot of global variables in bash. You will meet these variables fairly often, so here is a quick lookup table with the most practical ones:

# Variable	Description
# $HOME	The current user's home directory.
# $PATH	A colon-separated list of directories in which the shell looks for commands.
# $PWD	The current working directory.
# $RANDOM	Random integer between 0 and 32767.
# $UID	The numeric, real user ID of the current user.
# $PS1	The primary prompt string.
# $PS2	The secondary prompt string.
# An extended list of environment variables can be found in the Bash manual.

# Positional parameters
# Positional parameters are variables that hold the arguments passed to a script or function, in the order they were given. The following table lists positional parameter variables and other special variables and their meanings when you are inside a function or a script.

# Parameter	Description
# $0	Script's name.
# $1 … $9	The parameter list elements from 1 to 9.
# ${10} … ${N}	The parameter list elements from 10 to N.
# $* or $@	All positional parameters except $0.
# $#	The number of parameters, not counting $0.
# $FUNCNAME	The function name (has a value only inside a function).
# In the example below, the positional parameters will be $0='./script.sh', $1='foo' and $2='bar':

./script.sh foo bar

#Variables may also have default values. We can define them using the following syntax:

 # if variables are empty, assign them default values
: ${VAR:='default'}
: ${1:='first'}
# or
FOO=${FOO:-'default'}
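
# Putting it together, a minimal sketch of a script (the file name args.sh is hypothetical) that uses positional parameters and a default value:

#!/bin/bash
echo "Script name: $0"
echo "First argument: ${1:-none}"   # 'none' is used if $1 is empty
echo "Number of arguments: $#"
echo "All arguments: $@"

# running it:
# ./args.sh foo bar
# Script name: ./args.sh
# First argument: foo
# Number of arguments: 2
# All arguments: foo bar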

# Shell expansions
# Expansions are performed on the command line after it has been split into tokens. In other words, these expansions are a mechanism to calculate arithmetical operations, to save results of commands' executions and so on.

# If you are interested, you can read more about shell expansions in the Bash manual.

# Brace expansion
# Brace expansion allows us to generate arbitrary strings. It's similar to filename expansion. For example:

echo beg{i,a,u}n # begin began begun

# Brace expansions may also be used for creating ranges, which can be iterated over in loops.

echo {0..5} # 0 1 2 3 4 5
echo {00..8..2} # 00 02 04 06 08


# Command substitution
# Command substitution allows us to evaluate a command and substitute its value into another command or variable assignment. Command substitution is performed when a command is enclosed by `` or $(). For example, we can use it as follows:

now=`date +%T`
# or
now=$(date +%T)

echo $now # 19:08:26


# Arithmetic expansion
# In bash we are free to do any arithmetical operations. But the expression must be enclosed in $(( )). The format for arithmetic expansion is:

result=$(( ((10 + 5*3) - 7) / 2 ))
echo $result # 9


# Within arithmetic expansions, variables should generally be used without a $ prefix:

x=4
y=7
echo $(( x + y ))     # 11
echo $(( ++x + y++ )) # 12
echo $(( x + y ))     # 13


# Double and single quotes
# There is an important difference between double and single quotes. Inside double quotes variables or command substitutions are expanded. Inside single quotes they are not. For example:

echo "Your home: $HOME" # Your home: /Users/<username>
echo 'Your home: $HOME' # Your home: $HOME


# Take care to expand local variables and environment variables within quotes if they could contain whitespace. As an innocuous example, consider using echo to print some user input:

INPUT="A string  with   strange    whitespace."
echo $INPUT   # A string with strange whitespace.
echo "$INPUT" # A string  with   strange    whitespace.


# The first echo is invoked with 5 separate arguments — $INPUT is split into separate words, echo prints a single space character between each. In the second case, echo is invoked with a single argument (the entire $INPUT value, including whitespace).

# Now consider a more serious example:

FILE="Favorite Things.txt"
cat $FILE   # attempts to print 2 files: `Favorite` and `Things.txt`
cat "$FILE" # prints 1 file: `Favorite Things.txt`


# While the issue in this example could be resolved by renaming FILE to Favorite-Things.txt, consider input coming from an environment variable, a positional parameter, or the output of another command (find, cat, etc). If the input might contain whitespace, take care to wrap the expansion in quotes.

# Arrays
# Like in other programming languages, an array in bash is a variable that allows you to refer to multiple values. In bash, arrays are also zero-based, that is, the first element in an array has index 0.

# When dealing with arrays, we should be aware of the special environment variable IFS. IFS, or Input Field Separator, is the set of characters the shell uses to split input into words. The default value is a space, a tab and a newline.
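
# For example, a small sketch that splits a colon-separated string into an array by changing IFS for a single read (the variable name parts is arbitrary):

IFS=':' read -r -a parts <<< "/usr/bin:/bin:/usr/local/bin"
echo "${parts[1]}" # /bin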

# Array declaration
# In bash you create an array by simply assigning a value to an index in the array variable:

fruits[0]=Apple
fruits[1]=Pear
fruits[2]=Plum

# Array variables can also be created using compound assignments such as:

fruits=(Apple Pear Plum)

# Array expansion
# Individual array elements are expanded similarly to other variables:

echo ${fruits[1]} # Pear

# The entire array can be expanded by using * or @ in place of the numeric index:

echo ${fruits[*]} # Apple Pear Plum
echo ${fruits[@]} # Apple Pear Plum

# There is an important (and subtle) difference between the two lines above: consider an array element containing whitespace:

fruits[0]=Apple
fruits[1]="Desert fig"
fruits[2]=Plum

# We want to print each element of the array on a separate line, so we try to use the printf builtin:

printf "+ %s\n" ${fruits[*]}
# + Apple
# + Desert
# + fig
# + Plum

# Why were Desert and fig printed on separate lines? Let's try to use quoting:

printf "+ %s\n" "${fruits[*]}"
# + Apple Desert fig Plum

# Now everything is on one line — that's not what we wanted! Here's where ${fruits[@]} comes into play:

printf "+ %s\n" "${fruits[@]}"
# + Apple
# + Desert fig
# + Plum

# Within double quotes, ${fruits[@]} expands to a separate argument for each element in the array; whitespace in the array elements is preserved.
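
# The same quoting rule applies when looping over an array; a small sketch:

for fruit in "${fruits[@]}"; do
  echo "$fruit"   # "Desert fig" is handled as a single element
done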

# Array slice
# We can also extract a slice of an array using the slice operator:

echo ${fruits[@]:0:2} # Apple Desert fig

# In the example above, ${fruits[@]} expands to the entire contents of the array, and :0:2 extracts the slice of length 2 that starts at index 0.

# Adding elements into an array
# Adding elements into an array is quite simple too. Compound assignments are especially useful in this case. We can use them like this:

fruits=(Orange "${fruits[@]}" Banana Cherry)
echo ${fruits[@]} # Orange Apple Desert fig Plum Banana Cherry

# In the example above, ${fruits[@]} expands to the entire contents of the array and substitutes it into the compound assignment, which then assigns the new value to the fruits array, replacing its original value.

# Deleting elements from an array
# To delete an element from an array, use the unset command:

unset fruits[0]
echo ${fruits[@]} # Apple Desert fig Plum Banana Cherry


# Streams, pipes and lists
# Bash has powerful tools for working with other programs and their outputs. Using streams we can send the output of a program into another program or file and thereby write logs or whatever we want.

# Pipes give us the opportunity to create pipelines and control the execution of commands.

# It is paramount we understand how to use this powerful and sophisticated tool.

# Streams
# Bash receives input and sends output as sequences or streams of characters. These streams may be redirected into files or one into another.

# There are three descriptors:

# Code	Descriptor	Description
# 0	stdin	The standard input.
# 1	stdout	The standard output.
# 2	stderr	The errors output.
# Redirection makes it possible to control where the output of a command goes to, and where the input of a command comes from. For redirecting streams these operators are used:

# Operator	Description
# >	Redirecting output
# &>	Redirecting output and error output
# &>>	Appending redirected output and error output
# <	Redirecting input
# <<	Here documents syntax
# <<<	Here strings
# Here are a few examples of using redirection:

# output of ls will be written to list.txt
ls -l > list.txt

# append output to list.txt
ls -a >> list.txt

# all errors will be written to errors.txt
grep da * 2> errors.txt

# read from errors.txt
less < errors.txt
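
# The here document (<<) and here string (<<<) operators from the table can be used like this (a short sketch; the strings are arbitrary):

# here document: feed several lines into a command's stdin
cat << EOF
line one
line two
EOF

# here string: feed a single string into a command's stdin
grep "needle" <<< "a haystack with a needle in it"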


# Pipes
# We can redirect standard streams not only to files, but also to other programs. Pipes let us use the output of a program as the input of another.

# In the example below, command1 sends its output to command2, which then passes it on to the input of command3:

# command1 | command2 | command3
# Constructions like this are called pipelines.

# In practice, this can be used to process data through several programs. For example, here the output of ls -l is sent to the grep program, which prints only files with a .md extension, and this output is finally sent to the less program:

ls -l | grep '\.md$' | less

# The exit status of a pipeline is normally the exit status of the last command in the pipeline. The shell will not return a status until all the commands in the pipeline have completed. If you want your pipelines to be considered a failure if any of the commands in the pipeline fail, you should set the pipefail option with:

set -o pipefail
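
# A quick sketch of the difference (false always fails, true always succeeds):

false | true
echo $? # 0: without pipefail, the pipeline reports the status of the last command

set -o pipefail
false | true
echo $? # 1: with pipefail, the failing command makes the whole pipeline fail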


# Lists of commands
# A list of commands is a sequence of one or more pipelines separated by the ;, &, && or || operators.

# If a command is terminated by the control operator &, the shell executes the command asynchronously in a subshell. In other words, this command will be executed in the background.
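
# For example (sleep is used here only as a stand-in for a long-running command):

sleep 30 &   # runs in the background; the prompt returns immediately
jobs         # lists background jobs
wait         # waits until all background jobs have finished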

# Commands separated by a ; are executed sequentially: one after another. The shell waits for each command to finish.

# command2 will be executed after command1
command1 ; command2

# which is the same as
command1
command2

# Lists separated by && and || are called AND and OR lists, respectively.

# The AND-list looks like this:

# command2 will be executed if, and only if, command1 finishes successfully (returns 0 exit status)
command1 && command2

# The OR-list has the form:

# command2 will be executed if, and only if, command1 finishes unsuccessfully (returns a non-zero exit status)
command1 || command2

# The return code of an AND or OR list is the exit status of the last executed command.

# Conditional statements
# Like in other languages, Bash conditionals let us decide to perform an action or not. The result is determined by evaluating an expression, which should be enclosed in [[ ]].

# A conditional expression may contain the && and || operators, which are AND and OR respectively. Besides this, there are many other handy expressions.

# There are two different conditional statements: if statement and case statement.

# Primary and combining expressions
# Expressions enclosed inside [[ ]] (or [ ] for sh) are called test commands or primaries. These expressions help us to indicate the result of a conditional. In the tables below, we are using [ ], because it works for sh too. Note that [[ ]] is a bash extension with extra features, while [ ] is the portable POSIX test command.

# Working with the file system:

# Primary	Meaning
# [ -e FILE ]	True if FILE exists.
# [ -f FILE ]	True if FILE exists and is a regular file.
# [ -d FILE ]	True if FILE exists and is a directory.
# [ -s FILE ]	True if FILE exists and is not empty (size greater than 0).
# [ -r FILE ]	True if FILE exists and is readable.
# [ -w FILE ]	True if FILE exists and is writable.
# [ -x FILE ]	True if FILE exists and is executable.
# [ -L FILE ]	True if FILE exists and is a symbolic link.
# [ FILE1 -nt FILE2 ]	FILE1 is newer than FILE2.
# [ FILE1 -ot FILE2 ]	FILE1 is older than FILE2.
# Working with strings:

# Primary	Meaning
# [ -z STR ]	STR is empty (the length is zero).
# [ -n STR ]	STR is not empty (the length is non-zero).
# [ STR1 == STR2 ]	STR1 and STR2 are equal.
# [ STR1 != STR2 ]	STR1 and STR2 are not equal.
# Arithmetic binary operators:

# Primary	Meaning
# [ ARG1 -eq ARG2 ]	ARG1 is equal to ARG2.
# [ ARG1 -ne ARG2 ]	ARG1 is not equal to ARG2.
# [ ARG1 -lt ARG2 ]	ARG1 is less than ARG2.
# [ ARG1 -le ARG2 ]	ARG1 is less than or equal to ARG2.
# [ ARG1 -gt ARG2 ]	ARG1 is greater than ARG2.
# [ ARG1 -ge ARG2 ]	ARG1 is greater than or equal to ARG2.
# Conditions may be combined using these combining expressions:

# Operation	Effect
# [ ! EXPR ]	True if EXPR is false.
# [ (EXPR) ]	Returns the value of EXPR.
# [ EXPR1 -a EXPR2 ]	Logical AND. True if EXPR1 and EXPR2 are true.
# [ EXPR1 -o EXPR2 ]	Logical OR. True if EXPR1 or EXPR2 are true.
# There are more useful primaries; you can easily find them in the Bash man pages.
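
# For illustration, a short sketch combining primaries (script.sh is just a placeholder file name):

if [ -f script.sh -a -x script.sh ]; then
  echo "script.sh exists and is executable"
fi

# the same condition using bash's [[ ]] syntax
if [[ -f script.sh && -x script.sh ]]; then
  echo "script.sh exists and is executable"
fi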

# Using an if statement
# if statements work the same as in other programming languages. If the expression within the brackets is true, the code between then and fi is executed. fi indicates the end of the conditionally executed code.

# Single-line
if [[ 1 -eq 1 ]]; then echo "true"; fi

# Multi-line
if [[ 1 -eq 1 ]]; then
  echo "true"
fi

# Likewise, we could use an if..else statement such as:

# Single-line
if [[ 2 -ne 1 ]]; then echo "true"; else echo "false"; fi

# Multi-line
if [[ 2 -ne 1 ]]; then
  echo "true"
else
  echo "false"
fi

# Sometimes if..else statements are not enough to do what we want to do. In this case we shouldn't forget about the existence of if..elif..else statements, which always come in handy.

# Look at the example below:

if [[ `whoami` == "Adam" ]]; then
  echo "Do not eat an apple!"
elif [[ `whoami` == "Eva" ]]; then
  echo "Do not take an apple!"
else
  echo "Apples are delicious!"
fi

# Using a case statement
# If you are confronted with a couple of different possible actions to take, then using a case statement may be more useful than nested if statements. For more complex conditions use case like below:

case "$extension" in
  "jpg"|"jpeg")
    echo "It's an image with a jpeg extension."
  ;;
  "png")
    echo "It's an image with a png extension."
  ;;
  "gif")
    echo "Oh, it's a giphy!"
  ;;
  *)
    echo "Whoops! It's not an image!"
  ;;
esac

# Each case is an expression matching a pattern. The | sign is used for separating multiple patterns, and the ) operator terminates a pattern list. The commands for the first match are executed. * is the pattern for anything else that doesn't match the defined patterns. Each block of commands must be terminated with the ;; operator.

# Loops
# Here we won't be surprised. As in any programming language, a loop in bash is a block of code that iterates as long as the control conditional is true.

# There are four types of loops in Bash: for, while, until and select.

# for loop
# The for loop is very similar to its sibling in C. It looks like this:

for arg in elem1 elem2 ... elemN
do
  # statements
done

# During each pass through the loop, arg takes on the values elem1 through elemN in turn. Values may also be wildcards or brace expansions.

# We can also write a for loop on one line, but in this case there needs to be a semicolon before do, like below:

for i in {1..5}; do echo $i; done

# By the way, if for..in..do seems a little bit weird to you, you can also write for in C-like style such as:

for (( i = 0; i < 10; i++ )); do
  echo $i
done

# for is handy when we want to do the same operation over each file in a directory. For example, if we need to move all .bash files into the scripts folder and then give them execute permissions, our script would look like this:

#!/bin/bash

for FILE in "$HOME"/*.bash; do
  mv "$FILE" "${HOME}/scripts"
  chmod +x "${HOME}/scripts/${FILE##*/}" # ${FILE##*/} strips the directory part, leaving just the file name
done

# while loop
# The while loop tests a condition and loops over a sequence of commands so long as that condition is true. A condition is nothing more than a primary as used in if..then conditions. So a while loop looks like this:

while [[ condition ]]
do
  # statements
done

# Just like in the case of the for loop, if we want to write do and the condition on the same line, then we must use a semicolon before do.
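
# For example, on a single line (a trivial sketch):

x=0; while [[ $x -lt 3 ]]; do echo $x; x=$(( x + 1 )); done # prints 0 1 2, one number per line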

# A working example might look like this:

#!/bin/bash

# Squares of numbers from 0 through 9
x=0
while [[ $x -lt 10 ]]; do # value of x is less than 10
  echo $(( x * x ))
  x=$(( x + 1 )) # increase x
done
# until loop

# The until loop is the exact opposite of the while loop. Like a while it checks a test condition, but it keeps looping as long as this condition is false:

until [[ condition ]]; do
  #statements
done
# select loop

# The select loop helps us to organize a user menu. It has almost the same syntax as the for loop:

select answer in elem1 elem2 ... elemN
do
  # statements
done

# The select prints all elem1..elemN on the screen with their sequence numbers and then prompts the user. The prompt usually looks like #? (it is controlled by the PS3 variable). The answer will be saved in answer. If answer is a number between 1..N, then the statements will be executed and select will prompt again; to stop the loop we use the break statement.

# A working example might look like this:

#!/bin/bash

PS3="Choose the package manager: "
select ITEM in bower npm gem pip
do
  echo -n "Enter the package name: " && read PACKAGE
  case $ITEM in
    bower) bower install $PACKAGE ;;
    npm)   npm   install $PACKAGE ;;
    gem)   gem   install $PACKAGE ;;
    pip)   pip   install $PACKAGE ;;
  esac
  break # avoid infinite loop
done

# This example asks the user which package manager they would like to use. Then it asks what package we want to install and finally proceeds to install it.

# If we run this, we will get:

$ ./my_script
1) bower
2) npm
3) gem
4) pip

# Choose the package manager: 2
# Enter the package name: bash-handbook
# <installing bash-handbook>
# Loop control
# There are situations when we need to stop a loop before its normal ending or step over an iteration. In these cases, we can use the shell built-in break and continue statements. Both of these work with every kind of loop.

# The break statement is used to exit the current loop before its ending. We have already seen it above.
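
# For example, this small sketch stops searching as soon as the first .txt file in the current directory is found:

for FILE in *; do
  if [[ $FILE == *.txt ]]; then
    echo "Found: $FILE"
    break # leave the loop after the first match
  fi
done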

# The continue statement steps over one iteration. We can use it as such:

for (( i = 0; i < 10; i++ )); do
  if [[ $(( i % 2 )) -eq 0 ]]; then continue; fi
  echo $i
done

# If we run the example above, it will print all odd numbers from 0 through 9.

# Functions
# In scripts we have the ability to define and call functions. As in any programming language, functions in bash are chunks of code, but there are differences.

# In bash, functions are a sequence of commands grouped under a single name, that is the name of the function. Calling a function is the same as calling any other program: you just write the name and the function will be invoked.

# We can declare our own function this way:

my_func () {
  # statements
}

my_func # call my_func

# We must declare functions before we can invoke them.

# Functions can take arguments and return a result (an exit code). Arguments, within functions, are treated in the same manner as arguments given to the script in non-interactive mode, i.e. using positional parameters. A result code can be returned using the return command.

# Below is a function that takes a name and returns 0, indicating successful execution.

# function with params
greeting () {
  if [[ -n $1 ]]; then
    echo "Hello, $1!"
  else
    echo "Hello, unknown!"
  fi
  return 0
}

greeting Denys  # Hello, Denys!
greeting        # Hello, unknown!

# We already discussed exit codes. The return command without any arguments returns the exit code of the last executed command. Above, return 0 returns a successful exit code of 0.
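
# A short sketch showing how the caller can check that exit code through $? (it reuses the greeting function defined above):

greeting Denys
if [[ $? -eq 0 ]]; then
  echo "greeting returned success"
fi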

# Debugging
# The shell gives us tools for debugging scripts. If we want to run a script in debug mode, we use a special option in our script's shebang:

#!/bin/bash options

# These options are settings that change shell behavior. The following table is a list of options which might be useful to you:

# Short	Name	Description
# -f	noglob	Disable filename expansion (globbing).
# -i	interactive	Script runs in interactive mode.
# -n	noexec	Read commands, but don't execute them (syntax check).
# —	pipefail	Make pipelines fail if any command fails, not just if the final command fails.
# -t	—	Exit after first command.
# -v	verbose	Print each command to stderr before executing it.
# -x	xtrace	Print each command and its expanded arguments to stderr before executing it.
# For example, here is a script with the -x option:

#!/bin/bash -x

for (( i = 0; i < 3; i++ )); do
  echo $i
done

# This will print the values of the variables to the terminal, along with other useful trace information:

$ ./my_script
+ (( i = 0 ))
+ (( i < 3 ))
+ echo 0
0
+ (( i++  ))
+ (( i < 3 ))
+ echo 1
1
+ (( i++  ))
+ (( i < 3 ))
+ echo 2
2
+ (( i++  ))
+ (( i < 3 ))

# Sometimes we need to debug a part of a script. In this case using the set command is convenient. This command can enable and disable options. Options are turned on using - and turned off using +:

#!/bin/bash

echo "xtrace is turned off"
set -x
echo "xtrace is enabled"
set +x
echo "xtrace is turned off again"




#!/bin/bash
##############################################################################
# SHORTCUTS
##############################################################################


CTRL+A  # move to beginning of line
CTRL+B  # moves backward one character
CTRL+C  # halts the current command
CTRL+D  # deletes the character under the cursor, or logs out of the current session if the line is empty (similar to exit)
CTRL+E  # moves to end of line
CTRL+F  # moves forward one character
CTRL+G  # aborts the current editing command and rings the terminal bell
CTRL+J  # same as RETURN
CTRL+K  # deletes (kill) forward to end of line
CTRL+L  # clears screen and redisplay the line
CTRL+M  # same as RETURN
CTRL+N  # next line in command history
CTRL+O  # same as RETURN, then displays next line in history file
CTRL+P  # previous line in command history
CTRL+R  # searches backward
CTRL+S  # searches forward
CTRL+T  # transposes two characters
CTRL+U  # kills backward from point to the beginning of line
CTRL+V  # makes the next character typed verbatim
CTRL+W  # kills the word behind the cursor
CTRL+X  # lists the possible filename completions of the current word
CTRL+Y  # retrieves (yank) last item killed
CTRL+Z  # stops the current command, resume with fg in the foreground or bg in the background

ALT+B   # moves backward one word
ALT+D   # deletes next word
ALT+F   # moves forward one word

DELETE  # deletes one character backward
!!      # repeats the last command
exit    # logs out of current session


##############################################################################
# BASH BASICS
##############################################################################

env                 # displays all environment variables

echo $SHELL         # displays the shell you're using
echo $BASH_VERSION  # displays bash version

bash                # if you want to use bash (type exit to go back to your previously opened shell)
whereis bash        # finds out where bash is on your system
which bash          # finds out which program is executed as 'bash' (default: /bin/bash, can change across environments)

clear               # clears content on window (hide displayed lines)


##############################################################################
# FILE COMMANDS
##############################################################################


ls                            # lists your files in current directory, ls <dir> to print files in a specific directory
ls -l                         # lists your files in 'long format', which contains the exact size of the file, who owns the file and who has the right to look at it, and when it was last modified
ls -a                         # lists all files, including hidden files (name beginning with '.')
ln -s <filename> <link>       # creates symbolic link to file
touch <filename>              # creates an empty file or updates its access/modification timestamps
cat <filename>                # prints file raw content (will not be interpreted)
any_command > <filename>      # '>' is used to perform redirections, it will set any_command's stdout to file instead of "real stdout" (generally /dev/stdout)
more <filename>               # shows the first part of a file (move with space and type q to quit)
head <filename>               # outputs the first lines of file (default: 10 lines)
tail <filename>               # outputs the last lines of file (useful with -f option) (default: 10 lines)
vim <filename>                # opens a file in VIM (VI iMproved) text editor, will create it if it doesn't exist
mv <filename1> <dest>         # moves a file to destination, behavior will change based on 'dest' type (dir: file is placed into dir; file: file will replace dest (tip: useful for renaming))
cp <filename1> <dest>         # copies a file
rm <filename>                 # removes a file
diff <filename1> <filename2>  # compares files, and shows where they differ
wc <filename>                 # tells you how many lines, words and characters there are in a file. Use -l, -w or -c to output only one of those counts
chmod -options <filename>     # lets you change the read, write, and execute permissions on your files (see also: SUID, SGID)
gzip <filename>               # compresses files using gzip algorithm
gunzip <filename>             # uncompresses files compressed by gzip
gzcat <filename>              # lets you look at gzipped file without actually having to gunzip it
lpr <filename>                # prints the file
lpq                           # checks out the printer queue
lprm <jobnumber>              # removes something from the printer queue
genscript                     # converts plain text files into postscript for printing and gives you some options for formatting
dvips <filename>              # prints .dvi files (i.e. files produced by LaTeX)
grep <pattern> <filenames>    # looks for the string in the files
grep -r <pattern> <dir>       # search recursively for pattern in directory


##############################################################################
# DIRECTORY COMMANDS
##############################################################################


mkdir <dirname>  # makes a new directory
cd               # changes to home
cd <dirname>     # changes directory
pwd              # tells you where you currently are


##############################################################################
# SSH, SYSTEM INFO & NETWORK COMMANDS
##############################################################################


ssh user@host            # connects to host as user
ssh -p <port> user@host  # connects to host on specified port as user
ssh-copy-id user@host    # adds your ssh key to host for user to enable a keyed or passwordless login

whoami                   # returns your username
passwd                   # lets you change your password
quota -v                 # shows what your disk quota is
date                     # shows the current date and time
cal                      # shows the month's calendar
uptime                   # shows current uptime
w                        # displays who is online
finger <user>            # displays information about user
uname -a                 # shows kernel information
man <command>            # shows the manual for specified command
df                       # shows disk usage
du <filename>            # shows the disk usage of the files and directories in filename (du -s gives only a total)
last <yourUsername>      # lists your last logins
ps -u yourusername       # lists your processes
kill <PID>               # kills the processes with the ID you gave
killall <processname>    # kills all processes with that name
top                      # displays your currently active processes
bg                       # lists stopped or background jobs; resumes a stopped job in the background
fg                       # brings the most recent job into the foreground
fg <job>                 # brings job to the foreground

ping <host>              # pings host and outputs results
whois <domain>           # gets whois information for domain
dig <domain>             # gets DNS information for domain
dig -x <host>            # performs a reverse DNS lookup on host
wget <file>              # downloads file


##############################################################################
# VARIABLES
##############################################################################


varname=value                # defines a variable
varname=value command        # defines a variable to be in the environment of a particular subprocess
echo $varname                # checks a variable's value
echo $$                      # prints process ID of the current shell
echo $!                      # prints process ID of the most recently invoked background job
echo $?                      # displays the exit status of the last command
export VARNAME=value         # defines an environment variable (will be available in subprocesses)
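
# A minimal sketch of the scoping rules above; GREETING is just an illustrative name:

GREETING=hello                          # shell variable, visible in this shell only
GREETING=hi bash -c 'echo $GREETING'    # 'hi' exists only in that one child's environment
echo "$GREETING"                        # still 'hello' in the current shell
export GREETING                         # now subsequent child processes inherit it
bash -c 'echo $GREETING'                # prints 'hello'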

array[0]=valA                # how to define an array
array[1]=valB
array[2]=valC
array=([2]=valC [0]=valA [1]=valB)  # another way
array=(valA valB valC)              # and another

${array[i]}                  # displays the array's value at index i. If no index is supplied, element 0 is assumed
${#array[i]}                 # finds out the length of the element at index i
${#array[@]}                 # to find out how many values there are in the array
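
# A minimal sketch tying the array syntax above together; 'fruits' is just an illustrative name:

fruits=(apple banana cherry)   # declare and initialise an array
fruits[3]=date                 # add an element by assigning to the next index
echo "${fruits[1]}"            # banana
echo "${#fruits[1]}"           # 6  (length of the element at index 1)
echo "${#fruits[@]}"           # 4  (number of elements)
echo "${fruits[@]}"            # apple banana cherry date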

declare -a                   # the variables are treated as arrays
declare -f                   # uses function names only
declare -F                   # displays function names without definitions
declare -i                   # the variables are treated as integers
declare -r                   # makes the variables read-only
declare -x                   # marks the variables for export via the environment

${varname:-word}             # if varname exists and isn't null, return its value; otherwise return word
${varname:=word}             # if varname exists and isn't null, return its value; otherwise set it to word and then return its value
${varname:?message}          # if varname exists and isn't null, return its value; otherwise print varname, followed by message and abort the current command or script
${varname:+word}             # if varname exists and isn't null, return word; otherwise return null
${varname:offset:length}     # performs substring expansion. It returns the substring of $varname starting at offset and up to length characters

${variable#pattern}          # if the pattern matches the beginning of the variable's value, delete the shortest part that matches and return the rest
${variable##pattern}         # if the pattern matches the beginning of the variable's value, delete the longest part that matches and return the rest
${variable%pattern}          # if the pattern matches the end of the variable's value, delete the shortest part that matches and return the rest
${variable%%pattern}         # if the pattern matches the end of the variable's value, delete the longest part that matches and return the rest
${variable/pattern/string}   # the longest match to pattern in variable is replaced by string. Only the first match is replaced
${variable//pattern/string}  # the longest match to pattern in variable is replaced by string. All matches are replaced

${#varname}                  # returns the length of the value of the variable as a character string
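
# A minimal sketch of the expansions above; the variable and pattern values are illustrative:

file="/var/log/syslog.1.gz"
echo "${file#*/}"             # var/log/syslog.1.gz   (shortest prefix match removed)
echo "${file##*/}"            # syslog.1.gz           (longest prefix match removed)
echo "${file%.*}"             # /var/log/syslog.1     (shortest suffix match removed)
echo "${file%%.*}"            # /var/log/syslog       (longest suffix match removed)
echo "${file/log/LOG}"        # /var/LOG/syslog.1.gz  (first match replaced)
echo "${file//log/LOG}"       # /var/LOG/sysLOG.1.gz  (all matches replaced)
echo "${#file}"               # 20                    (length of the value)
echo "${unset_var:-default}"  # default               (fallback when unset or null)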

*(patternlist)               # matches zero or more occurrences of the given patterns
+(patternlist)               # matches one or more occurrences of the given patterns
?(patternlist)               # matches zero or one occurrence of the given patterns
@(patternlist)               # matches exactly one of the given patterns
!(patternlist)               # matches anything except one of the given patterns
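
# In bash the extended patterns above require the extglob option to be enabled; a minimal sketch with illustrative filenames:

shopt -s extglob        # enable ksh-style extended pattern matching
ls !(*.log)             # list everything except files ending in .log
ls @(*.tar|*.tar.gz)    # list files matching exactly one of the given patterns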

$(UNIX command)              # command substitution: runs the command and returns standard output


##############################################################################
# FUNCTIONS
##############################################################################


# The function refers to passed arguments by position (as if they were positional parameters), that is, $1, $2, and so forth.
# $@ is equal to "$1" "$2"... "$N", where N is the number of positional parameters. $# holds the number of positional parameters.


function functname() {
  shell commands
}

unset -f functname  # deletes a function definition
declare -f          # displays all defined functions in your login session
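
# A minimal sketch of a function using positional parameters; 'greet' is just an illustrative name:

function greet() {
  echo "Hello, $1! You passed $# argument(s): $@"
}

greet world again   # prints: Hello, world! You passed 2 argument(s): world again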


##############################################################################
# FLOW CONTROLS
##############################################################################


statement1 && statement2  # and operator
statement1 || statement2  # or operator

-a                        # and operator inside a test conditional expression
-o                        # or operator inside a test conditional expression

# STRINGS

str1 == str2               # str1 matches str2
str1 != str2               # str1 does not match str2
str1 < str2                # str1 is less than str2 (alphabetically)
str1 > str2                # str1 is greater than str2 (alphabetically)
-n str1                    # str1 is not null (has length greater than 0)
-z str1                    # str1 is null (has length 0)

# FILES

-a file                   # file exists
-d file                   # file exists and is a directory
-e file                   # file exists; same as -a
-f file                   # file exists and is a regular file (i.e., not a directory or other special type of file)
-r file                   # you have read permission
-s file                   # file exists and is not empty
-w file                   # you have write permission
-x file                   # you have execute permission on file, or directory search permission if it is a directory
-N file                   # file was modified since it was last read
-O file                   # you own file
-G file                   # file's group ID matches yours (or one of yours, if you are in multiple groups)
file1 -nt file2           # file1 is newer than file2
file1 -ot file2           # file1 is older than file2

# NUMBERS

-lt                       # less than
-le                       # less than or equal
-eq                       # equal
-ge                       # greater than or equal
-gt                       # greater than
-ne                       # not equal
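
# A minimal sketch combining the test operators above; the file name and numbers are illustrative:

file="/etc/passwd"
count=3

if [ -f "$file" ] && [ -r "$file" ]; then
  echo "$file exists and is readable"
fi

if [ "$count" -ge 1 -a "$count" -lt 10 ]; then
  echo "count is a single digit"
fi

if [ -n "$file" ]; then
  echo "file variable is not empty"
fi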

if condition
then
  statements
[elif condition
  then statements...]
[else
  statements]
fi

for x in {1..10}
do
  statements
done

for name [in list]
do
  statements that can use $name
done

for (( initialisation ; ending condition ; update ))
do
  statements...
done

case expression in
  pattern1 )
    statements ;;
  pattern2 )
    statements ;;
esac

select name [in list]
do
  statements that can use $name
done

while condition; do
  statements
done

until condition; do
  statements
done
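
# A minimal sketch of the flow-control constructs above; variable names and patterns are illustrative:

for i in {1..3}; do
  echo "iteration $i"
done

n=0
while [ "$n" -lt 3 ]; do
  echo "n is $n"
  n=$((n + 1))
done

read -r -p "Continue? (y/n) " answer
case "$answer" in
  y|Y ) echo "continuing" ;;
  n|N ) echo "stopping" ;;
  *   ) echo "unrecognised answer" ;;
esac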

##############################################################################
# COMMAND-LINE PROCESSING CYCLE
##############################################################################


# The default order for command lookup is functions, followed by built-ins, with scripts and executables last.
# There are three built-ins that you can use to override this order: `command`, `builtin` and `enable`.

command  # removes alias and function lookup. Only built-ins and commands found in the search path are executed
builtin  # looks up only built-in commands, ignoring functions and commands found in PATH
enable   # enables and disables shell built-ins

eval     # takes its arguments and runs them through the command-line processing steps all over again
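
# A minimal sketch of overriding the lookup order; shadowing ls with a function is deliberate here:

function ls() { echo "this is the function, not the command"; }

ls               # runs the function defined above
command ls       # skips the function and runs the ls found in PATH
builtin echo hi  # runs the echo built-in even if an echo function existed
unset -f ls      # remove the shadowing function again

cmd='echo hello from eval'
eval "$cmd"      # re-runs the string through command-line processing: prints 'hello from eval'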


##############################################################################
# INPUT/OUTPUT REDIRECTORS
##############################################################################


cmd1|cmd2  # pipe; takes standard output of cmd1 as standard input to cmd2
< file     # takes standard input from file
> file     # directs standard output to file
>> file    # directs standard output to file; append to file if it already exists
>|file     # forces standard output to file even if noclobber is set
n>|file    # forces output to file from file descriptor n even if noclobber is set
<> file    # uses file as both standard input and standard output
n<>file    # uses file as both input and output for file descriptor n
n>file     # directs file descriptor n to file
n<file     # takes file descriptor n from file
n>>file    # directs file descriptor n to file; append to file if it already exists
n>&        # duplicates standard output to file descriptor n
n<&        # duplicates standard input from file descriptor n
n>&m       # file descriptor n is made to be a copy of the output file descriptor
n<&m       # file descriptor n is made to be a copy of the input file descriptor
&>file     # directs standard output and standard error to file
<&-        # closes the standard input
>&-        # closes the standard output
n>&-       # closes the output from file descriptor n
n<&-       # closes the input from file descriptor n
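
# A minimal sketch combining the redirections above; file names are illustrative:

ls /etc /no_such_dir >  out.log 2> err.log  # stdout to out.log, stderr to err.log
ls /etc /no_such_dir &> all.log             # stdout and stderr both to all.log
ls /etc /no_such_dir >> out.log 2>&1        # append both to out.log (2>&1 makes stderr a copy of stdout)
sort < out.log                              # take standard input from a file
grep conf < out.log > matches.log           # combine input and output redirection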


##############################################################################
# PROCESS HANDLING
##############################################################################


# To suspend a job, type CTRL+Z while it is running. You can also suspend a job with CTRL+Y.
# This is slightly different from CTRL+Z in that the process is only stopped when it attempts to read input from terminal.
# Of course, to interrupt a job, type CTRL+C.

myCommand &  # runs job in the background and prompts back the shell

jobs         # lists all jobs (use with -l to see associated PID)

fg           # brings a background job into the foreground
fg %+        # brings the most recently invoked background job to the foreground
fg %-        # brings the second most recently invoked background job to the foreground
fg %N        # brings job number N to the foreground
fg %string   # brings the job whose command begins with string to the foreground
fg %?string  # brings the job whose command contains string to the foreground

kill -l      # returns a list of all signals on the system, by name and number
kill PID     # terminates process with specified PID

ps           # prints a line of information about the current running login shell and any processes running under it
ps -a        # selects all processes with a tty except session leaders

trap cmd sig1 sig2  # executes a command when a signal is received by the script
trap "" sig1 sig2   # ignores that signals
trap - sig1 sig2    # resets the action taken when the signal is received to the default

disown <PID|JID>    # removes the process from the list of jobs

wait                # waits until all background jobs have finished
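
# A minimal sketch of job control; sleep stands in for any long-running command:

sleep 60 &     # start a long-running job in the background
jobs -l        # list jobs with their PIDs
fg %1          # bring job 1 into the foreground (suspend it again with CTRL+Z)
bg %1          # resume the suspended job in the background
kill %1        # terminate it by job number
wait           # block until all remaining background jobs have finished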


##############################################################################
# TIPS & TRICKS
##############################################################################


# set an alias
cd; nano .bash_profile
> alias gentlenode='ssh admin@gentlenode.com -p 3404'  # add your alias in .bash_profile

# to quickly go to a specific directory
cd; nano .bashrc
> shopt -s cdable_vars
> export websites="/Users/mac/Documents/websites"

source .bashrc
cd websites         # with cdable_vars enabled, the variable name alone works (plain cd $websites works without it)


##############################################################################
# DEBUGGING SHELL PROGRAMS
##############################################################################


bash -n scriptname  # don't run commands; check for syntax errors only
set -o noexec       # alternative (set option in script)

bash -v scriptname  # echo commands before running them
set -o verbose      # alternative (set option in script)

bash -x scriptname  # echo commands after command-line processing
set -o xtrace       # alternative (set option in script)

trap 'echo $varname' EXIT  # useful when you want to print out the values of variables at the point that your script exits

function errtrap {
  es=$?
  echo "ERROR line $1: Command exited with status $es."
}

trap 'errtrap $LINENO' ERR  # is run whenever a command in the surrounding script or function exits with non-zero status

function dbgtrap {
  echo "badvar is $badvar"
}

trap dbgtrap DEBUG  # causes the trap code to be executed before every statement in a function or script
# ...section of code in which the problem occurs...
trap - DEBUG  # turn off the DEBUG trap

function returntrap {
  echo "A return occurred"
}

trap returntrap RETURN  # is executed each time a shell function or a script executed with the . or source commands finishes executing