S3 Storage Configuration
Access Your S3 Credentials
S3 Access Keys + URL
- Login to Client Panel - Access your Euronodes client portal
- Navigate to Storage - Find the S3 storage section
- Enable S3 Storage - Click the green "Enable S3 Storage" button
- Copy Keys - Click "Access Keys" in the top right corner (see the sketch below for storing them)
- Endpoint - https://eu-west-1.euronodes.com
- Region - Not required (leave empty or use eu-west-1)
- Bucket Naming - Standard S3 bucket naming rules apply (lowercase, no spaces, etc.)
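Once you have copied the keys, one simple way to hold them for command-line use is the standard AWS environment variables, which the AWS CLI and most SDKs pick up automatically. This is a minimal sketch; the values are placeholders for the keys generated in your client portal.
# Placeholder values - substitute the keys copied from the client portal
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
# Quick check that the credentials and endpoint work
aws s3 ls --endpoint-url https://eu-west-1.euronodes.com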
Basic Configuration
Creating Your First Bucket
Buckets can be created either through a GUI client (see "Using S3 Browser Tools" below) or with the AWS CLI.
Using AWS CLI
AWS CLI Setup
# Install AWS CLI (if not already installed)
# Mac: brew install awscli
# Linux: sudo apt install awscli
# Windows: Download from AWS website
# Configure AWS CLI
aws configure set aws_access_key_id YOUR_ACCESS_KEY
aws configure set aws_secret_access_key YOUR_SECRET_KEY
aws configure set default.region eu-west-1
# Create a bucket
aws s3 mb s3://my-backup-bucket --endpoint-url https://eu-west-1.euronodes.com
# List buckets
aws s3 ls --endpoint-url https://eu-west-1.euronodes.com
# Upload a file
aws s3 cp myfile.txt s3://my-backup-bucket/ --endpoint-url https://eu-west-1.euronodes.com
# Download a file
aws s3 cp s3://my-backup-bucket/myfile.txt ./ --endpoint-url https://eu-west-1.euronodes.com
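Typing --endpoint-url on every command gets repetitive. One optional convenience, sketched below rather than required by the service, is a small shell function in your ~/.bashrc or ~/.zshrc that appends the endpoint for you:
# Optional wrapper - forwards any "aws s3" subcommand with the Euronodes endpoint appended
s3e() {
  aws s3 "$@" --endpoint-url https://eu-west-1.euronodes.com
}
# Example usage
s3e ls
s3e cp myfile.txt s3://my-backup-bucket/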
Using S3 Browser Tools
GUI Applications
Popular S3 browser applications that work with Euronodes S3:
- S3 Browser (Windows) - Free S3 client
- Cyberduck (Mac/Windows) - Free FTP/S3 client
- CloudBerry Explorer (Windows/Mac) - Professional S3 client
- S3 Organizer (Web-based) - Browser-based interface
- Transmit (Mac) - Premium file transfer app
Configuration for GUI Tools
Common Settings
When configuring GUI tools, use these settings (a quick connectivity check follows the list):
- Server/Endpoint: eu-west-1.euronodes.com
- Protocol: HTTPS
- Port: 443
- Access Key: Your generated access key
- Secret Key: Your generated secret key
- Region: eu-west-1 (if required)
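Before digging into a GUI tool's own error messages, it can help to confirm the endpoint is reachable over HTTPS on port 443 from your machine. A minimal check with curl is sketched below; the exact status line the service returns may differ, but anything other than a connection error means the endpoint is reachable.
# Confirm the endpoint answers over HTTPS on port 443
curl -sI https://eu-west-1.euronodes.com:443 | head -n 1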
Common S3 Operations
File Management
Basic File Operations
# Upload files
aws s3 cp file.txt s3://my-bucket/ --endpoint-url https://eu-west-1.euronodes.com
aws s3 cp folder/ s3://my-bucket/folder/ --recursive --endpoint-url https://eu-west-1.euronodes.com
# Download files
aws s3 cp s3://my-bucket/file.txt ./ --endpoint-url https://eu-west-1.euronodes.com
aws s3 cp s3://my-bucket/folder/ ./folder/ --recursive --endpoint-url https://eu-west-1.euronodes.com
# List files
aws s3 ls s3://my-bucket/ --endpoint-url https://eu-west-1.euronodes.com
aws s3 ls s3://my-bucket/folder/ --recursive --endpoint-url https://eu-west-1.euronodes.com
# Delete files
aws s3 rm s3://my-bucket/file.txt --endpoint-url https://eu-west-1.euronodes.com
aws s3 rm s3://my-bucket/folder/ --recursive --endpoint-url https://eu-west-1.euronodes.com
Directory Synchronization
Sync Local and S3 Directories
# Sync local directory to S3 (upload changes)
aws s3 sync ~/Documents s3://my-documents/ \
--endpoint-url https://eu-west-1.euronodes.com
# Sync S3 to local directory (download changes)
aws s3 sync s3://my-documents/ ~/Documents \
--endpoint-url https://eu-west-1.euronodes.com
# Sync with delete (removes files not in source)
aws s3 sync ~/Photos s3://photo-backup/ \
--endpoint-url https://eu-west-1.euronodes.com \
--delete
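Because --delete removes destination files that no longer exist in the source, it is worth previewing a sync first. aws s3 sync supports a --dryrun flag that prints the actions it would take without performing them:
# Preview what a sync with --delete would do before running it for real
aws s3 sync ~/Photos s3://photo-backup/ \
--endpoint-url https://eu-west-1.euronodes.com \
--delete --dryrun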
Quick Examples
Common S3 Operations
File Upload/Download
# Upload single file
aws s3 cp myfile.txt s3://my-bucket/ --endpoint-url https://eu-west-1.euronodes.com
# Upload directory
aws s3 cp mydir/ s3://my-bucket/mydir/ --recursive --endpoint-url https://eu-west-1.euronodes.com
# Download file
aws s3 cp s3://my-bucket/myfile.txt ./ --endpoint-url https://eu-west-1.euronodes.com
Bucket Management
# Create bucket
aws s3 mb s3://my-new-bucket --endpoint-url https://eu-west-1.euronodes.com
# List bucket contents
aws s3 ls s3://my-bucket --endpoint-url https://eu-west-1.euronodes.com
# Delete bucket (must be empty)
aws s3 rb s3://my-bucket --endpoint-url https://eu-west-1.euronodes.com
Storage Management
Monitor Usage
Check Storage Usage
# Check bucket size and file count
aws s3 ls s3://my-bucket --recursive --human-readable --summarize \
--endpoint-url https://eu-west-1.euronodes.com
# List all buckets
aws s3 ls --endpoint-url https://eu-west-1.euronodes.com
# Check specific folder size
aws s3 ls s3://my-bucket/folder/ --recursive --human-readable --summarize \
--endpoint-url https://eu-west-1.euronodes.com
Lifecycle Management
Organize and Clean Up
# List files older than 30 days (date -d requires GNU date; on macOS install coreutils and use gdate)
aws s3 ls s3://my-bucket --recursive --endpoint-url https://eu-west-1.euronodes.com | \
awk '$1 < "'$(date -d '30 days ago' '+%Y-%m-%d')'"'
# Archive old files to different bucket
aws s3 sync s3://my-bucket/ s3://my-archive/ \
--endpoint-url https://eu-west-1.euronodes.com \
--exclude "*" --include "*2023*"
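The two commands above can be combined into a small loop that actually moves objects older than the cutoff into an archive bucket. This is a sketch only: it assumes GNU date and object keys without spaces, and my-archive is a placeholder bucket name.
# Move objects older than 30 days to an archive bucket (sketch; keys containing spaces are not handled)
cutoff=$(date -d '30 days ago' '+%Y-%m-%d')
aws s3 ls s3://my-bucket --recursive --endpoint-url https://eu-west-1.euronodes.com | \
awk -v cutoff="$cutoff" '$1 < cutoff {print $4}' | \
while read -r key; do
  aws s3 mv "s3://my-bucket/$key" "s3://my-archive/$key" \
  --endpoint-url https://eu-west-1.euronodes.com
done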
Security and Best Practices
Access Control
Secure Your S3 Storage
- Strong Credentials - Use complex access keys and rotate them regularly
- Least Privilege - Only grant necessary permissions
- Secure Storage - Store credentials securely, for example in environment variables or a credential file with restricted permissions (see the sketch after this list)
- Monitor Access - Regularly review access logs
- Backup Credentials - Keep secure backup of access keys
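One way to follow the "Secure Storage" point above is a dedicated credentials file readable only by your user. The sketch below writes a named profile and tightens the file permissions; the profile name euronodes and the key values are placeholders.
# Store keys in a named profile rather than in shell history or scripts
aws configure set aws_access_key_id YOUR_ACCESS_KEY --profile euronodes
aws configure set aws_secret_access_key YOUR_SECRET_KEY --profile euronodes
# Restrict the credentials file to your user only
chmod 600 ~/.aws/credentials
# Use the profile explicitly
aws s3 ls --profile euronodes --endpoint-url https://eu-west-1.euronodes.com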
Data Protection
Protect Your Data
# Encrypt files before upload
gpg --cipher-algo AES256 --compress-algo 1 --symmetric myfile.txt
aws s3 cp myfile.txt.gpg s3://my-bucket/ --endpoint-url https://eu-west-1.euronodes.com
# Use server-side encryption (if supported)
aws s3 cp myfile.txt s3://my-bucket/ \
--endpoint-url https://eu-west-1.euronodes.com \
--sse AES256
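For completeness, restoring a file encrypted this way is the reverse: download the .gpg object and decrypt it with the passphrase used above. A short sketch using the same file names:
# Download and decrypt a file that was encrypted before upload
aws s3 cp s3://my-bucket/myfile.txt.gpg ./ --endpoint-url https://eu-west-1.euronodes.com
gpg --output myfile.txt --decrypt myfile.txt.gpg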
Backup Strategies
Backup Best Practices
- 3-2-1 Rule - 3 copies, 2 different media types, 1 offsite
- Regular Testing - Test restore procedures regularly
- Version Control - Keep multiple versions of important files
- Documentation - Document your backup procedures
- Automation - Automate backups to ensure consistency (see the cron sketch after this list)
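A common way to automate the sync commands shown earlier is a cron entry. The sketch below is an assumption about your environment rather than a required setup: the schedule, paths, and bucket name are placeholders.
# Example crontab entry (edit with: crontab -e)
# Runs every night at 02:00 and appends output to a log file
# Use the full path to the aws binary if cron's PATH does not include it
0 2 * * * aws s3 sync /home/user/Documents s3://my-documents/ --endpoint-url https://eu-west-1.euronodes.com >> /home/user/s3-backup.log 2>&1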
Integration Examples
Programming Languages
SDK Examples
Python (boto3)
import boto3
# Configure S3 client
s3 = boto3.client('s3',
    endpoint_url='https://eu-west-1.euronodes.com',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)
# Upload file
s3.upload_file('local_file.txt', 'my-bucket', 'remote_file.txt')
# Download file
s3.download_file('my-bucket', 'remote_file.txt', 'downloaded_file.txt')
# List objects
response = s3.list_objects_v2(Bucket='my-bucket')
for obj in response.get('Contents', []):
    print(obj['Key'])
Node.js (AWS SDK)
const AWS = require('aws-sdk');
// Configure S3
const s3 = new AWS.S3({
  endpoint: 'https://eu-west-1.euronodes.com',
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY',
  s3ForcePathStyle: true
});
// Upload file
const uploadParams = {
  Bucket: 'my-bucket',
  Key: 'file.txt',
  Body: 'Hello World!'
};
s3.upload(uploadParams, (err, data) => {
  if (err) console.log(err);
  else console.log('Upload successful:', data.Location);
});
Backup and Sync Solutions
Specialized Backup Guides
- Restic Backups - Modern backup program with deduplication (Mac/Linux/Windows)
- Database Backups - PostgreSQL, MySQL, MongoDB backup scripts
- Website Backups - WordPress, static sites, and application backups
- File Sync & Archive - Directory sync, media archives, and log management
Other Compatible Tools
Additional S3-Compatible Tools
- Duplicity - Encrypted bandwidth-efficient backup
- Rclone - Command line program to sync files and directories (see the configuration sketch after this list)
- Borg Backup - Deduplicating backup program (S3 access typically via rclone or a similar layer)
- Duplicati - Free backup software with web interface
- CloudBerry Backup - Professional backup solution
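As an example of pointing one of these tools at Euronodes S3, here is a sketch of a non-interactive rclone remote definition. The remote name euronodes is arbitrary and the key values are placeholders; running rclone config walks through the same settings interactively.
# Create an rclone remote for the S3-compatible endpoint (sketch)
rclone config create euronodes s3 \
provider Other \
endpoint https://eu-west-1.euronodes.com \
access_key_id YOUR_ACCESS_KEY \
secret_access_key YOUR_SECRET_KEY
# List a bucket through the new remote
rclone ls euronodes:my-bucket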
Troubleshooting
Common Issues
Connection Problems
- Endpoint URL - Always include --endpoint-url https://eu-west-1.euronodes.com
- Credentials - Verify access key and secret key are correct
- Bucket Names - Use lowercase, no spaces, follow S3 naming rules
- Network - Check firewall and proxy settings
- SSL/TLS - Ensure HTTPS is used for secure connections
Error Resolution
Common Solutions
# Test S3 connection
aws s3 ls --endpoint-url https://eu-west-1.euronodes.com
# Verify credentials (note: not every S3-compatible endpoint implements STS)
aws sts get-caller-identity --endpoint-url https://eu-west-1.euronodes.com
# Test bucket access
aws s3 ls s3://my-bucket --endpoint-url https://eu-west-1.euronodes.com
# Check AWS CLI configuration
aws configure list
Performance Issues
Optimize Performance
# Use multipart upload for large files
aws configure set default.s3.multipart_threshold 64MB
aws configure set default.s3.multipart_chunksize 16MB
# Increase concurrent requests
aws configure set default.s3.max_concurrent_requests 20
# Upload a large file (the multipart settings above apply automatically; disable the read timeout for slow links)
aws s3 cp large-file.zip s3://my-bucket/ \
--endpoint-url https://eu-west-1.euronodes.com \
--cli-read-timeout 0
FAQ
What's the difference between S3 and regular file storage?
S3 is object storage accessed through an API and is designed for backups, archives, and large unstructured data sets, while regular file storage behaves like a local file system for files you actively edit in place. S3 is typically more cost-effective for large amounts of data.
Can I use any S3-compatible tool?
Yes, our S3 storage is fully compatible with the S3 API, so any tool that works with Amazon S3 will work with Euronodes S3.
How secure is my data in S3 storage?
Data is encrypted in transit and at rest. You can add additional encryption by encrypting files before upload.
Are there any bandwidth limits?
Check your service plan for specific bandwidth allocations. Most plans include generous bandwidth allowances.
Can I access my S3 storage from multiple locations?
Yes, S3 storage can be accessed from anywhere with internet connectivity using your access credentials.
What are the bucket naming requirements?
Bucket names must be lowercase, 3-63 characters, start/end with letter or number, and contain only letters, numbers, and hyphens.
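If you want to sanity-check a name before creating a bucket, the rules above can be expressed as a simple pattern. This is a rough sketch covering only the basic rules listed here, not every edge case (for example, names that look like IP addresses).
# Rough check of a bucket name against the basic naming rules (sketch)
bucket="my-backup-bucket"
if [[ "$bucket" =~ ^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$ ]]; then
  echo "looks valid"
else
  echo "invalid bucket name"
fi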
Contact Support
Need Help?
- Configuration Issues: Open support ticket through client portal
- Access Problems: Include bucket name and error messages in your ticket
- Performance Questions: Specify your use case and current performance metrics
For backup solutions using S3, see Restic Backups