# Site backup

### Backup strategy

For our backup strategy, we choose offsite backup, as it is the best way to save and recover data in case of an EC2 failure. We will use Amazon S3 (Simple Storage Service) as our backup data store, with incremental backups daily and full backups weekly and monthly.

We choose incremental backup for the daily backup because we have to manage a large data set that will keep growing over time, and a full backup every day would consume considerable storage and network bandwidth. Full backups can be done weekly and monthly.
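The bandwidth argument can be made concrete with some back-of-envelope arithmetic (the sizes below are purely illustrative assumptions, not measurements of our site):

```shell
# Illustrative monthly transfer, assuming a 10 GB site that changes
# by about 1 GB per day (both numbers are assumptions).
FULL_GB=10
DELTA_GB=1
MONTH_FULLS=$(( 30 * FULL_GB ))            # 30 daily full backups
MONTH_INCR=$(( FULL_GB + 29 * DELTA_GB ))  # 1 full + 29 incrementals
echo "30 daily full backups:    ${MONTH_FULLS} GB transferred"
echo "1 full + 29 incrementals: ${MONTH_INCR} GB transferred"
```

Under these assumptions, daily incrementals move roughly an order of magnitude less data per month than daily fulls.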

Daily backups will be made with the Amazon CloudWatch service, and weekly/monthly backups with the Drupal module "Backup and Migrate".

With **Amazon CloudWatch**, we can schedule daily snapshot creation of our whole instance. Periodic snapshots of a volume are incremental (as described here: <https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-creating-snapshot.html>). One advantage of using snapshots for backup is that a snapshot covers the whole instance volume: in case of restoration from a snapshot, all our configuration, packages, files and databases are restored. Snapshot creation is also asynchronous.

The Drupal module **"Backup and Migrate"** manages backups and data migration, and comes with built-in support for Amazon S3. We will use it to back up the whole site (code, files and databases) to Amazon S3.

### Daily backup

As our daily backup will be a snapshot, we log in to the Amazon AWS console and go to **Services -> Management Tools -> CloudWatch**. In the left tab, we click on **Rules** to create the rule that will create our daily snapshot, then click the **"Create rule"** button.

Under **"Event Source"**, we select the **"Schedule"** option and set a fixed rate of 24 hours.

To the right of **"Event Source"** is the **"Targets"** panel. We add a target, choose **"EC2 CreateSnapshot API Call"**, enter the Volume ID of our EC2 instance's volume and click **"Configure details"**. In step 2, we give the rule a name and a description. Name: Create-Instance-snapshot. Description: A rule to create a daily snapshot of the instance.

Once the rule is created, we have a daily backup of our EC2 instance.
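For reference, the scheduled rule can also be created from the AWS CLI; this is a sketch assuming the CLI is installed and configured with suitable permissions (the EC2 CreateSnapshot target itself is easiest to attach in the console, as described above):

```shell
# Create the scheduled CloudWatch Events rule (same name and schedule
# as in the console walkthrough above).
aws events put-rule \
    --name Create-Instance-snapshot \
    --schedule-expression "rate(1 day)" \
    --description "A rule to create daily snapshot of the instance"
```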

### Weekly and monthly backup

#### Installation and configuration of Drupal module "Backup and Migrate"

First, we download, install and enable the module from this link: <https://ftp.drupal.org/files/projects/backup_migrate-7.x-3.5.tar.gz>

#### Configure Backup Migrate with a new destination

On <https://task.woezzon.com/?q=admin/config/system/backup_migrate/export/advanced>, we create a new destination for our backup data. In the BACKUP DESTINATION tab, we click on "Create new destination" and choose "Amazon S3 Bucket".

Next, we download the Amazon S3 PHP library (<https://github.com/tpyo/amazon-s3-php-class/tarball/master>), which Drupal needs in order to use S3, and place it in **sites/all/libraries/s3-php5-curl**:

```
# Download and unpack the S3 PHP library into sites/all/libraries/s3-php5-curl
$ cd /var/www/drupal/sites/all/libraries
$ sudo mkdir s3-php5-curl
$ sudo chown -R www-data:www-data s3-php5-curl/
$ wget https://github.com/tpyo/amazon-s3-php-class/tarball/master
$ sudo mv master s3_php.tar.gz
$ sudo chown www-data:www-data s3_php.tar.gz
$ sudo tar -xzvf s3_php.tar.gz
$ sudo chown www-data:www-data tpyo-amazon-s3-php-class-9cf2eec/
$ cd tpyo-amazon-s3-php-class-9cf2eec/
$ sudo cp * ../s3-php5-curl/
$ sudo chown -R www-data:www-data s3-php5-curl/
```

Now we go back to our AWS console to:

#### Create our backup S3 bucket

Console -> Services -> S3 -> "Create bucket"

Bucket name : "task-for-junior-devops"

We use the default US region (US Standard): according to some reports as of April 2015, the stable version of the PHP-S3 library does not support other regions as well (<https://www.drupal.org/node/2465951>).
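The same bucket can also be created from the AWS CLI (a sketch, assuming the CLI is configured; `us-east-1` is the US Standard region):

```shell
# Create the backup bucket in the default US region.
aws s3 mb s3://task-for-junior-devops --region us-east-1
```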

#### Create a backup user on IAM

For security reasons, it is strongly advised not to use admin credentials in the backup configuration. Instead, we create a user account with limited access. Console -> Services -> Security, Identity, & Compliance -> IAM -> Users

We click on *"Add user"*

Name : drupal\_backup

Access Type : Programmatic access

In the next step, we choose "Attach existing policies directly", then "Create policy".

Policy name : drupal\_backup\_policy

In the JSON tab, we put:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowUserToFindBucket",
            "Action": [
                "s3:ListAllMyBuckets",
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:ListObjects"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Sid": "AllowUserToStoreFiles",
            "Action": [
                "s3:DeleteObject",
                "s3:GetBucketAcl",
                "s3:GetObject",
                "s3:GetObjectAcl",
                "s3:ListAllMyBuckets",
                "s3:ListBucket",
                "s3:ListObjects",
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::task-for-junior-devops/*"
        }
	]
}
```

Then we attach the newly created policy **"drupal\_backup\_policy"** to the user. Once the user is created, we get the credentials:

```
Access Key ID : AKIAIZUFPBEN5WS3KCSQ
Secret access key : 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
```
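For reference, the same user, policy and access key can be created from the AWS CLI; a sketch assuming admin credentials are configured and the policy JSON above is saved as `drupal_backup_policy.json` (the account ID in the policy ARN below is a placeholder, substitute the ARN printed by `create-policy`):

```shell
# Create the backup user (programmatic access only).
aws iam create-user --user-name drupal_backup

# Create the policy from the JSON document above.
aws iam create-policy --policy-name drupal_backup_policy \
    --policy-document file://drupal_backup_policy.json

# Attach the policy to the user (placeholder account ID).
aws iam attach-user-policy --user-name drupal_backup \
    --policy-arn arn:aws:iam::123456789012:policy/drupal_backup_policy

# Generate the Access Key ID / Secret Access Key pair.
aws iam create-access-key --user-name drupal_backup
```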

We return to the Drupal console and fill in the details for the Amazon S3 destination:

| Field             | Value                                    |
| ----------------- | ---------------------------------------- |
| Destination name  | Amazon S3 bucket                         |
| S3 Bucket         | task-for-junior-devops                   |
| Access Key ID     | AKIAIZUFPBEN5WS3KCSQ                     |
| Secret Access Key | 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 |

We can test that everything was properly set by going to <https://task.woezzon.com/?q=admin/config/system/backup_migrate/settings/destination> and choosing "Destinations" in the SETTINGS tab. In the "Amazon S3 bucket" destination we try "List files". If we get no error, everything is OK.

To test our configuration, we can make a Quick Backup in the "Backup" tab: we select "Default Database" and **"Amazon S3 bucket"** as the destination, and click **"Backup Now"**.

The backup succeeds and we get the message ***"Default Database backed up successfully to TaskforJuniorDevopsSysadmin-2018-11-25T19-52-06 (216.31 KB) in destination Amazon S3 bucket in 433.9 ms"***
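We can also confirm from outside Drupal that the backup archive landed in the bucket (assuming the AWS CLI is configured with credentials allowed to list the bucket):

```shell
# List the bucket contents; the newly created backup archive
# should appear in the listing.
aws s3 ls s3://task-for-junior-devops/
```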

### Backing up CiviCRM

The Default Database is our Drupal database **drupal\_db**, running on port 3307. To back up CiviCRM's database too, we have to create a second data source. In the **"SETTINGS"** tab we click on **"Create a new source"**, select "MySQL Database" and fill in the form:

| Field         | Value            |
| ------------- | ---------------- |
| Source name   | CiviCRM Database |
| Host          | 127.0.0.1:3308   |
| Database name | civicrm\_db      |
| Username      | civicrm          |
| Password      | civicrm\_pswd    |

After this operation, we have two data sources: one for Drupal and a second for CiviCRM. For a quick backup we can choose either the Default Database (Drupal) or the CiviCRM Database.

### Schedule backups

To automate backup with the Drupal module, we have to create a **Schedule**.

In the Schedule tab, we click on "Add Schedule" and set:

| Setting            | Value                                                                       |
| ------------------ | --------------------------------------------------------------------------- |
| Schedule Name      | Weekly backup Entire Site                                                   |
| Backup Source      | Entire Site (code, files & DB)                                              |
| Frequency          | Backup every 1 Weeks (with "Enabled" and "Run using Drupal's cron" checked) |
| Backup Destination | Amazon S3 Bucket                                                            |

And we click on **"Save schedule"**

After saving this schedule, we do the same to create a **"Monthly backup Entire Site"** schedule with a frequency of "Backup every 4 Weeks".

When backing up the entire site, only the Drupal database is backed up, not CiviCRM's. To also back up the CiviCRM database, we create dedicated schedules: **"Weekly CiviCRM database backup"** and **"Monthly CiviCRM database backup"**, with **"CiviCRM Database"** as the backup source.
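Schedules set to "Run using Drupal's cron" only fire when Drupal's cron itself runs. If cron is not already automated on the server, a system crontab entry can trigger it; this is a sketch for Drupal 7, where the cron key and site URL below are placeholders to be replaced with the site's own values:

```shell
# Hypothetical crontab line: trigger Drupal cron hourly so scheduled
# backups run on time (the cron key is found under admin/reports/status).
0 * * * * wget -O /dev/null -q https://task.woezzon.com/cron.php?cron_key=CRON_KEY_PLACEHOLDER
```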

[**https://task.woezzon.com/#overlay=%3Fq%3Dadmin/config/system/backup\_migrate/schedule**](https://task.woezzon.com/#overlay=%3Fq%3Dadmin/config/system/backup_migrate/schedule)

{% embed url="<https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-creating-snapshot.html>" %}

{% embed url="<https://www.drupal.org/node/2465951>" %}

{% embed url="<https://github.com/tpyo/amazon-s3-php-class/tarball/master>" %}

{% embed url="<https://www.drupal.org/docs/7/modules/backup-and-migrate/backup-and-migrate>" %}

