
Why don’t we make the backup process as simple as possible? By issuing one command, the website’s contents and database should be backed up both remotely and locally. Let’s see how we can achieve this.

Intro:

Backup vault

If we think about it, we’ll see that we need two scripts. One of them stays on the website server (remote) and has a single job: taking care of the backup process. The other script lives on our computer (local) and has two jobs: first, telling the remote script to start the backup process, and second, downloading those backups to our computer once they have been completed.

We also want to organize our backups by storing them in sub-directories named after the date on which each backup was taken. We might also want to keep a copy of these backups on our website server as a second copy. So the goal is to achieve all of that by executing one simple command. Sounds interesting? Great, let’s see how to do it.

GitLab repository:

The repository can be found at unix.cafe/website-backup, or you can clone it by executing the following in a terminal:

git clone https://gitlab.com/unix.cafe/website-backup.git

On the remote side (website server):

Requirements:

  • SSH: is needed to securely communicate with the website’s server.
  • Rsync: is needed to transfer the backups to the local machine.


Installing the requirements:

The following commands need to be executed with super-user permissions. So, either log in as root with ‘su -’ or prefix each command with ‘sudo’ if you are a sudoer.
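
The exact commands depend on the server’s package manager. As a rough sketch (package names may differ slightly on your distro):

# Debian / Ubuntu
apt install openssh-server rsync

# Fedora / RHEL-based
dnf install openssh-server rsync

# FreeBSD (sshd is already part of the base system)
pkg install rsync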

Enabling & starting the SSH service:

  • On systemd Linux distros: to enable and start the SSH service, type: systemctl enable --now sshd.
  • On OpenRC Linux distros: to enable the SSH service, type: rc-update add sshd, and to start it, type: service sshd start.
  • On FreeBSD: to enable the SSH service, type: sysrc sshd_enable=YES, and to start it, type: service sshd start.
  • On macOS: to enable SSH login from remote connections, type: systemsetup -setremotelogin on.
  • If you’re using another service manager, have a look at: HowTo: Manage a service in systemd, SysVinit, Upstart, runit and OpenRC
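
Once the service is running, it’s worth confirming that you can actually reach the server over SSH from your computer. Assuming the username and host used later in this article (deadsoul and unix.cafe), something like this should print a short message and exit:

[deadsoul@local ~]$ ssh deadsoul@unix.cafe 'echo SSH is working'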

Preparation for our backup script:

First, we need to create the backup script file. It’s up to you to name it whatever you want and store it wherever you like, but please DO NOT store it inside your web-server root directory (usually called public_html), so that no one but you can access it. In this example I’m going to name it bkp and put it in ~/bin/, so its full path is ~/bin/bkp. Let’s do that:

Let’s create the directory ~/bin:

[deadsoul@remote ~]$ mkdir ~/bin

Now, let’s create the file bkp:

[deadsoul@remote ~]$ touch ~/bin/bkp

Now, let’s make ~/bin/bkp executable by us as its owner:

[deadsoul@remote ~]$ chmod u+x ~/bin/bkp

Backup script (source code):

Great, now let’s copy the following source and paste it inside ~/bin/bkp. You may use nano, vim or whatever editor you like.

#!/bin/bash

# configuration
DSTDIR="/path/to/where/to/store/backups"
SRCDIR="/path/to/what/to/backup"
DBNAME='db-name'
DBUSER='db-user'
DBPASS='db-pass'

# ------------------------------------------------------------------------------------------
# You don't have to modify anything else below this line, unless you know what you're doing
# ------------------------------------------------------------------------------------------

# prepare sub-directory name
DATE=$(date +'%Y%m%d')
TIME=$(date +'%H%M%S')
SUBDIR="${DSTDIR}/${DATE}"
DIRNAME=$(basename "$SRCDIR")

# create the sub-directory
[[ -e "${SUBDIR}" ]] && SUBDIR="${SUBDIR}-${TIME}"
mkdir -p "${SUBDIR}"

# backing up files
echo -e '\n * Backing up files …'; sleep 2
tar -zcpf "${SUBDIR}/${DIRNAME}.tar.gz" "$SRCDIR"

# backing up db
echo -e '\n * Backing up database …\n'; sleep 2
mysqldump ${DBNAME} -u ${DBUSER} -p${DBPASS} > "${DBNAME}.sql"
tar -zcpf "${SUBDIR}/${DBNAME}.sql.tar.gz" "${DBNAME}.sql"
rm -f "${DBNAME}.sql"

What does it do? (Explanation):

# configuration
DSTDIR="/path/to/where/to/store/backups"
SRCDIR="/path/to/what/to/backup"
DBNAME='db-name'
DBUSER='db-user'
DBPASS='db-pass'
  • DSTDIR: is a variable that holds the location where all backups should be stored. (without a trailing slash)
  • SRCDIR: is a variable that holds the location of the folder we should back up. (without a trailing slash)
  • DBNAME: is a variable that holds the database name.
  • DBUSER: is a variable that holds the database user.
  • DBPASS: is a variable that holds the database password.
# prepare sub-directory name
DATE=$(date +'%Y%m%d')
TIME=$(date +'%H%M%S')
SUBDIR="${DSTDIR}/${DATE}"
DIRNAME=$(basename "$SRCDIR")
  • DATE: is a variable that holds the current date as YearMonthDay. (e.g. 20200603)
  • TIME: is a variable that holds the current time as HourMinuteSecond. (e.g. 193520)
  • SUBDIR: is a variable that holds the value of $DSTDIR, followed by a /, followed by the value of $DATE.
  • DIRNAME: is a variable that holds the name of the folder we want to back up, which is public_html in our example.
# create the sub-directory
[[ -e "${SUBDIR}" ]] && SUBDIR="${SUBDIR}-${TIME}"
mkdir -p "${SUBDIR}"
  • [[ -e "${SUBDIR}" ]] && SUBDIR="${SUBDIR}-${TIME}" checks for the existence of the generated sub-directory ($SUBDIR). If it already exists, we append a dash - and the value of $TIME to the generated sub-directory’s name in order to make it unique (so a second backup taken on 2020-06-03 would go into 20200603-193520 instead of 20200603).
  • After that we create that directory with the command mkdir -p "${SUBDIR}". (We used the flag -p to also create any folder in the path of $SUBDIR that doesn’t exist, which is helpful in case we’re using the script for the first time and forgot to create $DSTDIR.)
# backing up files
echo -e '\n * Backing up files …'; sleep 2
tar -zcpf "${SUBDIR}/${DIRNAME}.tar.gz" "$SRCDIR"
  • echo -e '\n * Backing up files …'; We’re printing what we’re about to do. And sleep 2 means wait for 2 seconds before jumping to the next BASH statement.
  • tar -zcpf "${SUBDIR}/${DIRNAME}.tar.gz" "$SRCDIR" We’re creating a compressed tarball of public_html, preserving its ownership and permissions (that’s what the -p flag is for).
# backing up db
echo -e '\n * Backing up database …\n'; sleep 2
mysqldump ${DBNAME} -u ${DBUSER} -p${DBPASS} > "${DBNAME}.sql"
tar -zcpf "${SUBDIR}/${DBNAME}.sql.tar.gz" "${DBNAME}.sql"
rm -f "${DBNAME}.sql"
  • echo -e '\n * Backing up database …\n'; Again, we’re printing what we’re about to do. And sleep 2 means wait for 2 seconds before jumping to the next BASH statement.
  • mysqldump ${DBNAME} -u ${DBUSER} -p${DBPASS} > "${DBNAME}.sql" We’re dumping the database into a file named after the value of $DBNAME followed by .sql.
  • tar -zcpf "${SUBDIR}/${DBNAME}.sql.tar.gz" "${DBNAME}.sql" Then, we’re compressing that dumped file.
  • rm -f "${DBNAME}.sql" Finally, we remove that .sql file, since we already have a compressed version of it.
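
As a side note, restoring from one of these backups is simply the reverse of what the script does. Here is a minimal sketch, assuming the example values used in the Configuration section below (public_html, the database wp_blog, its user deadsoul, and a backup taken on 20200603):

# extract the files backup (tar stored the paths without the leading slash)
tar -zxpf 20200603/public_html.tar.gz

# extract the database dump, then import it
tar -zxf 20200603/wp_blog.sql.tar.gz
mysql -u deadsoul -p wp_blog < wp_blog.sql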

Configuration:

Now it’s time to modify a few lines in it (from line #4 to line #8). According to your setup, adjust the following values:

DSTDIR="/path/to/where/to/store/backups"
SRCDIR="/path/to/what/to/backup"
DBNAME='db-name'
DBUSER='db-user'
DBPASS='db-pass'

In this example we will assume that the folder in which to store the taken backups is ~/backups, and the folder we want to back up is ~/public_html, so we will modify those lines as follows:

DSTDIR="$HOME/backups"
SRCDIR="$HOME/public_html"

Let’s assume our database credentials are the following:

DBNAME='wp_blog'
DBUSER='deadsoul'
DBPASS='password'
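
Before moving on, it’s worth giving the script a quick test run on the server. With the values above, a dated sub-directory containing the two tarballs should appear under ~/backups (the directory name will of course be the current date rather than the example 20200603):

[deadsoul@remote ~]$ ~/bin/bkp
[deadsoul@remote ~]$ ls ~/backups/20200603/
public_html.tar.gz  wp_blog.sql.tar.gz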

And that’s it for the remote side (website server). Let’s see now what should be done on the local side (our computer).


On local side (your computer):

Requirements:

  • SSH: is needed to securely communicate with the website’s server.
  • Rsync: is needed to transfer the backups to the local machine.
  • Expect: optional. Needed only if you SSH with a password (or a passphrase-protected key) and don’t want to enter it more than once for every backup.

Installation:

The following commands need to be executed with super-user permissions. So, either log in as root with ‘su -’ or prefix each command with ‘sudo’ if you are a sudoer.
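
The exact package names depend on your system. As a rough sketch, on a Debian/Ubuntu-based distro it could look like this (rsync and the OpenSSH client are usually preinstalled, and expect is only needed for the Expect version below):

apt install openssh-client rsync expect

On Fedora the client package is called openssh-clients, and on FreeBSD the ssh client ships with the base system, so pkg install rsync expect would be enough.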

Preparation for our backup script:

First, we need to create the backup script file. It’s up to you to name it whatever you want and store it wherever you like. In this example I’m going to name it bkp and put it in ~/bin/, so its full path is ~/bin/bkp. Let’s do that:

Let’s create the directory ~/bin:

[deadsoul@local ~]$ mkdir ~/bin

Now, let’s create the file bkp:

[deadsoul@local ~]$ touch ~/bin/bkp

Now, let’s make ~/bin/bkp executable by us as its owner:

[deadsoul@local ~]$ chmod u+x ~/bin/bkp

Let’s make sure that ~/bin is in the $PATH variable, by editing ~/.bashrc and adding the following to the end:

if ! [[ "$PATH" =~ "$HOME/bin" ]]; then
    PATH="$HOME/bin:$PATH"
    export PATH
fi

Then just save the file and exit. To apply the change to your current terminal session, type source ~/.bashrc.
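
To double-check that the shell now finds the script (assuming your home directory is /home/deadsoul), ask it where bkp resolves to:

[deadsoul@local ~]$ command -v bkp
/home/deadsoul/bin/bkp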

Backup script Notes:

  • If you’re SSH-ing into your server with a password or a key passphrase and you don’t want to enter it more than once, use the Expect version by copying its content into the bkp file.
  • If you don’t mind entering your password more than once, or you’re using a password-less key, then use the BASH version by copying its content into the bkp file. (A quick key-setup sketch follows this list.)
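
If you’d rather avoid typing passwords altogether, setting up an SSH key is the usual approach. A minimal sketch, using the example user and host from this article: generate a key, then copy its public part to the server:

[deadsoul@local ~]$ ssh-keygen -t ed25519
[deadsoul@local ~]$ ssh-copy-id deadsoul@unix.cafe

If you protect the key with a passphrase, ssh-agent (or the Expect version below) will save you from retyping it.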

Backup script (Expect version):

#!/usr/bin/expect

set timeout -1

# grab the password
stty -echo
puts " "
send_user -- "Enter your ssh password: "
expect_user -re "(.*)\n"
send_user "\n"
stty echo
set SSHPASS $expect_out(1,string)

# configuration
set SRV_USER your_username
set SRV_HOST your_server_host
set SRV_BKP_SCRIPT path_to_bkp_on_server
set SRV_BACKUPS_DIR path_to_backups_on_server_with_trailing_slash
set LOCAL_ARCHIVE_DIR path_to_local_archive_with_trailing_slash

# ask bkp on server to take a backup
puts " "
puts "Connecting to website.."
spawn ssh -t $SRV_USER@$SRV_HOST "$SRV_BKP_SCRIPT"
expect {
    "Enter passphrase for key" { send -- "$SSHPASS\r" }
    "password" { send -- "$SSHPASS\r" }
}
interact

# download the taken backup
puts " "
puts "Downloading the backups.."
spawn rsync -avhPe 'ssh' $SRV_USER@$SRV_HOST:"$SRV_BACKUPS_DIR" "$LOCAL_ARCHIVE_DIR"
expect {
    "Enter passphrase for key" { send -- "$SSHPASS\r" }
    "password" { send -- "$SSHPASS\r" }
}
interact

What does it do? (Explanation):

set timeout -1

# grab the password
stty -echo
puts " "
send_user -- "Enter your ssh password: "
expect_user -re "(.*)\n"
send_user "\n"
stty echo
set SSHPASS $expect_out(1,string)
  • set timeout -1: Disables the timeout in our Expect script.
  • stty -echo: Disables echoing of terminal input.
  • puts " ": Prints an empty line.
  • send_user -- "Enter your ssh password: ": Prints a password prompt.
  • expect_user -re "(.*)\n": Reads our input.
  • send_user "\n": Prints a new line.
  • stty echo: Re-enables echoing of terminal input.
  • set SSHPASS $expect_out(1,string): Assigns the entered password to the variable SSHPASS.
# configuration
set SRV_USER your_username
set SRV_HOST your_server_host
set SRV_BKP_SCRIPT path_to_bkp_on_server
set SRV_BACKUPS_DIR path_to_backups_on_server_with_trailing_slash
set LOCAL_ARCHIVE_DIR path_to_local_archive_with_trailing_slash
  • SRV_USER: is a variable that holds the remote SSH user.
  • SRV_HOST: is a variable that holds the remote SSH host.
  • SRV_BKP_SCRIPT: is a variable that holds the path to the bkp script on the server side.
  • SRV_BACKUPS_DIR: is a variable that holds the path of the stored backups on the server side. It should be identical to the value of $DSTDIR in the bkp script that lives on the server. (with a trailing slash)
  • LOCAL_ARCHIVE_DIR: is a variable that holds the path to a local folder in which we would like to store all the remotely taken backups. (with a trailing slash)
# ask bkp on server to take a backup
puts " "
puts "Connecting to website.."
spawn ssh -t $SRV_USER@$SRV_HOST "$SRV_BKP_SCRIPT"
expect {
    "Enter passphrase for key" { send -- "$SSHPASS\r" }
    "password" { send -- "$SSHPASS\r" }
}
interact
  • puts " ": Prints an empty line.
  • puts "Connecting to website..": Prints a connecting message.
  • spawn ssh -t $SRV_USER@$SRV_HOST "$SRV_BKP_SCRIPT": Executes the bkp script on the server side via ssh.
  • expect {: Handles the pre-defined situations between expect { … and … }.
  • "Enter passphrase for key" { send -- "$SSHPASS\r" }: If the server responds with something that contains Enter passphrase for key, we reply with the value of $SSHPASS followed by Enter.
  • "password" { send -- "$SSHPASS\r" }: Again, if the server responds with something that contains password, we reply with the value of $SSHPASS followed by Enter.
  • interact: Hands control of the spawned process to us until it finishes.
# download the taken backup
puts " "
puts "Downloading the backups.."
spawn rsync -avhPe 'ssh' $SRV_USER@$SRV_HOST:"$SRV_BACKUPS_DIR" "$LOCAL_ARCHIVE_DIR"
expect {
    "Enter passphrase for key" { send -- "$SSHPASS\r" }
    "password" { send -- "$SSHPASS\r" }
}
interact
  • spawn rsync -avhPe 'ssh' $SRV_USER@$SRV_HOST:"$SRV_BACKUPS_DIR" "$LOCAL_ARCHIVE_DIR": Executes the rsync command to download all existing backups.
  • The rest of the lines were already explained above.

Configuration:

Now it’s time to configure it (from line #15 to line #19). According to your setup, adjust the following values:

  • We will assume that our username is deadsoul and our website’s host is unix.cafe.
  • We will also assume that the local folder we’re going to use to store the backups is /mnt/website_backups/.
  • Keep in mind that SRV_BACKUPS_DIR should be identical to the value of the variable $DSTDIR in the server-side bkp script (which is ~/backups), plus a trailing slash.
  • Remember that we already assumed the path of bkp on the server side is ~/bin/bkp.

That means the following is our configuration:

# configuration
set SRV_USER deadsoul
set SRV_HOST unix.cafe
set SRV_BKP_SCRIPT /home/deadsoul/bin/bkp
set SRV_BACKUPS_DIR /home/deadsoul/backups/
set LOCAL_ARCHIVE_DIR /mnt/website_backups/

Backup script (BASH version):

#!/bin/bash

# configuration
SRV_USER='your_username'
SRV_HOST='your_server_host'
SRV_BKP_SCRIPT='path_to_bkp_on_server'
SRV_BACKUPS_DIR='path_to_backups_on_server' # must end with a trailing slash
LOCAL_ARCHIVE_DIR='path_to_local_archive' # must end with a trailing slash

# ask bkp on server to take a backup
ssh -t $SRV_USER@$SRV_HOST "$SRV_BKP_SCRIPT"

# download the taken backup
rsync -avhPe 'ssh' $SRV_USER@$SRV_HOST:"$SRV_BACKUPS_DIR" "$LOCAL_ARCHIVE_DIR"

What does it do? (Explanation):

# configuration
SRV_USER='your_username'
SRV_HOST='your_server_host'
SRV_BKP_SCRIPT='path_to_bkp_on_server'
SRV_BACKUPS_DIR='path_to_backups_on_server' # must end with a trailing slash
LOCAL_ARCHIVE_DIR='path_to_local_archive' # must end with a trailing slash
  • The configuration was already explained in the Expect version above; check it there.
# ask bkp on server to take a backup
ssh -t $SRV_USER@$SRV_HOST "$SRV_BKP_SCRIPT"

# download the taken backup
rsync -avhPe 'ssh' $SRV_USER@$SRV_HOST:"$SRV_BACKUPS_DIR" "$LOCAL_ARCHIVE_DIR"
  • ssh -t $SRV_USER@$SRV_HOST "$SRV_BKP_SCRIPT": Executes the bkp script on the server side via ssh.
  • rsync -avhPe 'ssh' $SRV_USER@$SRV_HOST:"$SRV_BACKUPS_DIR" "$LOCAL_ARCHIVE_DIR": Executes the rsync command to download all existing backups.

Configuration:

We already talked about this configuration above in the Expect version, so we will use it here too.

# configuration
SRV_USER='deadsoul'
SRV_HOST='unix.cafe'
SRV_BKP_SCRIPT='/home/deadsoul/bin/bkp'
SRV_BACKUPS_DIR='/home/deadsoul/backups/' # must end with a trailing slash
LOCAL_ARCHIVE_DIR='/mnt/website_backups/' # must end with a trailing slash

Using it:

Just open up a terminal on your local computer, and type: bkp. That’s it.
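
With the example configuration above, a run looks roughly like this (with 20200603 standing in for whatever the current date is): the remote bkp script runs over SSH, then rsync mirrors the server’s backups folder into the local archive, so a new dated folder appears under /mnt/website_backups/:

[deadsoul@local ~]$ bkp
[deadsoul@local ~]$ ls /mnt/website_backups/
20200603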

By DeaDSouL

A big fan of UNIX & Linux.. Who adores programming..
