
Database Backups to S3 Storage

Overview

Automated Database Backups

Back up your databases directly to S3 storage with automated scripts. The examples below cover PostgreSQL, MySQL, and MongoDB; the same approach works for other popular databases.

Prerequisites

Before You Start

  • S3 storage configured with valid credentials
  • AWS CLI installed and configured (see the connectivity check after this list)
  • Database access credentials
  • Sufficient storage space for backups
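
A quick way to confirm the AWS CLI can reach your S3 storage before relying on the scripts below. The endpoint URL matches the one used throughout this guide; replace it and the placeholder credentials with your own:

# Store credentials for the CLI (the scripts below export them as environment variables instead)
aws configure set aws_access_key_id YOUR_ACCESS_KEY
aws configure set aws_secret_access_key YOUR_SECRET_KEY

# Verify connectivity by listing the backup bucket
aws s3 ls s3://database-backups/ --endpoint-url https://eu-west-1.euronodes.com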

PostgreSQL Backups

Basic PostgreSQL Backup

Simple PostgreSQL Backup Script

Create postgres_backup.sh:

#!/bin/bash

# Configuration
DB_NAME="your_database"
DB_USER="postgres"
DB_HOST="localhost"
DB_PORT="5432"
S3_BUCKET="database-backups"
BACKUP_DIR="/tmp/db_backups"

# S3 Configuration
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"

# Create backup directory
mkdir -p $BACKUP_DIR

# Create backup filename with timestamp
BACKUP_FILE="postgres_${DB_NAME}_$(date +%Y%m%d_%H%M%S).sql.gz"

# Create backup
echo "Creating PostgreSQL backup..."
pg_dump -h $DB_HOST -p $DB_PORT -U $DB_USER $DB_NAME | gzip > "$BACKUP_DIR/$BACKUP_FILE"

# Upload to S3
echo "Uploading to S3..."
aws s3 cp "$BACKUP_DIR/$BACKUP_FILE" "s3://$S3_BUCKET/postgresql/" \
    --endpoint-url https://eu-west-1.euronodes.com

# Clean up local backup
rm "$BACKUP_DIR/$BACKUP_FILE"

echo "Backup completed: $BACKUP_FILE"

Advanced PostgreSQL Backup

Production-Ready PostgreSQL Backup

Create postgres_backup_advanced.sh:

#!/bin/bash

# Configuration
DB_NAME="your_database"
DB_USER="postgres"
DB_HOST="localhost"
DB_PORT="5432"
S3_BUCKET="database-backups"
BACKUP_DIR="/tmp/db_backups"
RETENTION_DAYS=30

# S3 Configuration
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"

# Create backup directory
mkdir -p $BACKUP_DIR

# Create backup filename with timestamp
BACKUP_FILE="postgres_${DB_NAME}_$(date +%Y%m%d_%H%M%S).sql.gz"

# Create backup with custom format for faster restore
echo "Creating PostgreSQL backup..."
pg_dump -h $DB_HOST -p $DB_PORT -U $DB_USER -Fc $DB_NAME | gzip > "$BACKUP_DIR/$BACKUP_FILE"

# Verify backup was created successfully
if [ $? -eq 0 ]; then
    echo "Backup created successfully"

    # Upload to S3
    echo "Uploading to S3..."
    aws s3 cp "$BACKUP_DIR/$BACKUP_FILE" "s3://$S3_BUCKET/postgresql/" \
        --endpoint-url https://eu-west-1.euronodes.com

    if [ $? -eq 0 ]; then
        echo "Upload successful"

        # Clean up old backups (keep last 30 days)
        echo "Cleaning up old backups..."
        aws s3 ls "s3://$S3_BUCKET/postgresql/" --endpoint-url https://eu-west-1.euronodes.com | \
        while read -r line; do
            createDate=$(echo $line | awk '{print $1" "$2}')
            createDate=$(date -d "$createDate" +%s)
            olderThan=$(date -d "$RETENTION_DAYS days ago" +%s)
            if [[ $createDate -lt $olderThan ]]; then
                fileName=$(echo $line | awk '{print $4}')
                if [[ $fileName != "" ]]; then
                    aws s3 rm "s3://$S3_BUCKET/postgresql/$fileName" \
                        --endpoint-url https://eu-west-1.euronodes.com
                    echo "Deleted old backup: $fileName"
                fi
            fi
        done
    else
        echo "Upload failed"
        exit 1
    fi
else
    echo "Backup failed"
    exit 1
fi

# Clean up local backup
rm "$BACKUP_DIR/$BACKUP_FILE"

echo "Backup completed: $BACKUP_FILE"

PostgreSQL Restore

Restore from S3 Backup

#!/bin/bash

# Configuration
DB_NAME="your_database"
DB_USER="postgres"
S3_BUCKET="database-backups"
BACKUP_FILE="postgres_your_database_20241201_120000.sql.gz"

# Download backup from S3
aws s3 cp "s3://$S3_BUCKET/postgresql/$BACKUP_FILE" ./ \
    --endpoint-url https://eu-west-1.euronodes.com

# Restore database
gunzip -c $BACKUP_FILE | psql -h localhost -U $DB_USER $DB_NAME

# Clean up
rm $BACKUP_FILE
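
If you used the advanced script, the backup is a custom-format .dump file and must be restored with pg_restore rather than psql. A minimal sketch, using a file name that follows the advanced script's naming pattern:

# Download the custom-format dump from S3
aws s3 cp "s3://database-backups/postgresql/postgres_your_database_20241201_120000.dump" ./ \
    --endpoint-url https://eu-west-1.euronodes.com

# Restore with pg_restore (--clean --if-exists drops existing objects before recreating them)
pg_restore -h localhost -U postgres -d your_database --clean --if-exists \
    postgres_your_database_20241201_120000.dump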

MySQL Backups

Basic MySQL Backup

Simple MySQL Backup Script

Create mysql_backup.sh:

#!/bin/bash

# Configuration
DB_NAME="your_database"
DB_USER="root"
DB_PASSWORD="your_password"
DB_HOST="localhost"
S3_BUCKET="database-backups"
BACKUP_DIR="/tmp/db_backups"

# S3 Configuration
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"

# Create backup directory
mkdir -p $BACKUP_DIR

# Create backup filename with timestamp
BACKUP_FILE="mysql_${DB_NAME}_$(date +%Y%m%d_%H%M%S).sql.gz"

# Create backup
echo "Creating MySQL backup..."
mysqldump -h $DB_HOST -u $DB_USER -p$DB_PASSWORD $DB_NAME | gzip > "$BACKUP_DIR/$BACKUP_FILE"

# Upload to S3
echo "Uploading to S3..."
aws s3 cp "$BACKUP_DIR/$BACKUP_FILE" "s3://$S3_BUCKET/mysql/" \
    --endpoint-url https://eu-west-1.euronodes.com

# Clean up local backup
rm "$BACKUP_DIR/$BACKUP_FILE"

echo "Backup completed: $BACKUP_FILE"

MySQL All Databases Backup

Backup All MySQL Databases

Create mysql_all_backup.sh:

#!/bin/bash

# Configuration
DB_USER="root"
DB_PASSWORD="your_password"
DB_HOST="localhost"
S3_BUCKET="database-backups"
BACKUP_DIR="/tmp/db_backups"

# S3 Configuration
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"

# Create backup directory
mkdir -p $BACKUP_DIR

# Create backup filename with timestamp
BACKUP_FILE="mysql_all_databases_$(date +%Y%m%d_%H%M%S).sql.gz"

# Create backup of all databases
echo "Creating MySQL backup of all databases..."
mysqldump -h $DB_HOST -u $DB_USER -p$DB_PASSWORD --all-databases | gzip > "$BACKUP_DIR/$BACKUP_FILE"

# Upload to S3
echo "Uploading to S3..."
aws s3 cp "$BACKUP_DIR/$BACKUP_FILE" "s3://$S3_BUCKET/mysql/" \
    --endpoint-url https://eu-west-1.euronodes.com

# Clean up local backup
rm "$BACKUP_DIR/$BACKUP_FILE"

echo "Backup completed: $BACKUP_FILE"

MongoDB Backups

MongoDB Backup Script

MongoDB Backup to S3

Create mongodb_backup.sh:

#!/bin/bash

# Configuration
DB_NAME="your_database"
MONGO_HOST="localhost"
MONGO_PORT="27017"
MONGO_USER="admin"
MONGO_PASSWORD="your_password"
S3_BUCKET="database-backups"
BACKUP_DIR="/tmp/db_backups"

# S3 Configuration
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"

# Create backup directory
mkdir -p $BACKUP_DIR

# Create backup filename with timestamp
BACKUP_FILE="mongodb_${DB_NAME}_$(date +%Y%m%d_%H%M%S)"

# Create backup
echo "Creating MongoDB backup..."
mongodump --host $MONGO_HOST:$MONGO_PORT \
          --username $MONGO_USER \
          --password $MONGO_PASSWORD \
          --authenticationDatabase admin \
          --db $DB_NAME \
          --out "$BACKUP_DIR/$BACKUP_FILE"

# Compress backup
tar -czf "$BACKUP_DIR/$BACKUP_FILE.tar.gz" -C "$BACKUP_DIR" "$BACKUP_FILE"

# Upload to S3
echo "Uploading to S3..."
aws s3 cp "$BACKUP_DIR/$BACKUP_FILE.tar.gz" "s3://$S3_BUCKET/mongodb/" \
    --endpoint-url https://eu-west-1.euronodes.com

# Clean up local backup
rm -rf "$BACKUP_DIR/$BACKUP_FILE"
rm "$BACKUP_DIR/$BACKUP_FILE.tar.gz"

echo "Backup completed: $BACKUP_FILE.tar.gz"

Automated Scheduling

Cron Jobs

Schedule Database Backups

# Edit crontab
crontab -e

# Daily PostgreSQL backup at 2 AM
0 2 * * * /path/to/postgres_backup.sh >> /var/log/postgres_backup.log 2>&1

# Daily MySQL backup at 3 AM
0 3 * * * /path/to/mysql_backup.sh >> /var/log/mysql_backup.log 2>&1

# Weekly MongoDB backup on Sundays at 4 AM
0 4 * * 0 /path/to/mongodb_backup.sh >> /var/log/mongodb_backup.log 2>&1

Systemd Timers (Linux)

Using Systemd for Scheduling

Create /etc/systemd/system/db-backup.service:

[Unit]
Description=Database Backup Service

[Service]
Type=oneshot
ExecStart=/path/to/postgres_backup.sh
User=postgres

Create /etc/systemd/system/db-backup.timer:

[Unit]
Description=Run database backup daily
Requires=db-backup.service

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target

Enable and start:

sudo systemctl enable db-backup.timer
sudo systemctl start db-backup.timer
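
Confirm the timer is active and see its next scheduled run:

systemctl list-timers db-backup.timer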

Monitoring and Alerts

Backup Verification

Verify Your Backups

#!/bin/bash

# Check that a backup object for today exists in S3
aws s3 ls "s3://database-backups/postgresql/" \
    --endpoint-url https://eu-west-1.euronodes.com | grep "$(date +%Y%m%d)"

if [ $? -eq 0 ]; then
    echo "Backup found for today"
else
    echo "ERROR: No backup found for today"
    # Send alert email or notification
fi
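
For a stricter check, also confirm that the newest object for today is not empty. A minimal sketch that parses the size column of the aws s3 ls output:

# Size (bytes) of the most recent backup uploaded today
LATEST_SIZE=$(aws s3 ls "s3://database-backups/postgresql/" \
    --endpoint-url https://eu-west-1.euronodes.com | \
    grep "$(date +%Y%m%d)" | sort | tail -n 1 | awk '{print $3}')

if [ -n "$LATEST_SIZE" ] && [ "$LATEST_SIZE" -gt 0 ]; then
    echo "Latest backup size: $LATEST_SIZE bytes"
else
    echo "ERROR: Latest backup is missing or empty"
fi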

Email Notifications

Send Backup Status Emails

# Add immediately after the aws s3 cp upload command in your backup script
if [ $? -eq 0 ]; then
    echo "Database backup completed successfully" | \
    mail -s "Backup Success - $(date)" [email protected]
else
    echo "Database backup failed" | \
    mail -s "Backup FAILED - $(date)" [email protected]
fi

FAQ

How often should I back up my database?

For production databases, daily backups are the recommended minimum. Critical databases may need hourly backups.

How long should I keep database backups?

Common retention policies: 7 daily, 4 weekly, 12 monthly backups. Adjust based on your recovery requirements.

Can I back up large databases efficiently?

Yes, use compression and incremental backups where possible. Consider using database-specific tools for large datasets.
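
For PostgreSQL, for example, the directory format with parallel jobs compresses per table and dumps several tables at once; a minimal sketch (the job count of 4 is an illustrative value to tune to your hardware):

# Parallel, compressed dump in directory format
pg_dump -h localhost -U postgres -Fd -j 4 -f /tmp/db_backups/your_database_dir your_database

# Package the directory and upload it as a single object
tar -czf /tmp/db_backups/your_database_dir.tar.gz -C /tmp/db_backups your_database_dir
aws s3 cp /tmp/db_backups/your_database_dir.tar.gz "s3://database-backups/postgresql/" \
    --endpoint-url https://eu-west-1.euronodes.com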

How do I test my backup restoration?

Regularly test restoring backups to a test environment to ensure they're valid and complete.

What about database encryption?

Encrypt sensitive data before backup or use database-native encryption features.
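
As an illustration of client-side encryption before upload, you can pipe the dump through gpg with a symmetric passphrase. A sketch assuming GnuPG 2.1+ and a passphrase file at /root/.backup_passphrase (keep it outside the backup path and readable only by the backup user):

# Encrypt the compressed dump before uploading
pg_dump -h localhost -U postgres your_database | gzip | \
    gpg --batch --pinentry-mode loopback --passphrase-file /root/.backup_passphrase \
    --symmetric > /tmp/db_backups/backup.sql.gz.gpg

# Decrypt when restoring
gpg --batch --pinentry-mode loopback --passphrase-file /root/.backup_passphrase \
    --decrypt backup.sql.gz.gpg | gunzip | psql -h localhost -U postgres your_database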

Contact Support

Need Help?

  • Backup Issues: Open support ticket through client portal
  • S3 Configuration: Include database type and error messages
  • Performance Problems: Specify database size and backup duration

For S3 setup, see S3 Configuration | For file backups, see Restic Backups