# Compression

Reduce backup file sizes with Gzip or Brotli compression.

## Overview
Compression significantly reduces:
- Storage costs - Smaller files = less storage
- Transfer time - Less data to upload
- Network bandwidth - Lower bandwidth usage
Typical compression ratios for SQL dumps:
| Content Type | Compression Ratio |
|---|---|
| SQL dumps (text) | 60-80% smaller |
| Binary data (BLOBs) | 10-30% smaller |
| Already compressed | 0-5% smaller |
## Compression Algorithms

### Gzip

Best for: general use, fast compression
| Aspect | Rating |
|---|---|
| Speed | ⭐⭐⭐⭐⭐ Fast |
| Compression | ⭐⭐⭐ Good |
| CPU Usage | Low |
| Compatibility | Universal |
### Brotli

Best for: maximum compression when slower backups are acceptable
| Aspect | Rating |
|---|---|
| Speed | ⭐⭐⭐ Moderate |
| Compression | ⭐⭐⭐⭐⭐ Excellent |
| CPU Usage | Medium |
| Compatibility | Modern |
### Comparison

For a 100 MB SQL dump:
| Algorithm | Compressed Size | Time |
|---|---|---|
| None | 100 MB | 0s |
| Gzip | 25 MB (75% reduction) | ~5s |
| Brotli | 20 MB (80% reduction) | ~15s |
## Enabling Compression

### On Job Creation

1. Create or edit a backup job
2. In the Processing section, enable Compression
3. Select an algorithm: Gzip or Brotli
4. Save
### File Extensions

Compressed backups have extensions:

- Gzip: `backup.sql.gz`
- Brotli: `backup.sql.br`

With encryption:

- Gzip: `backup.sql.gz.enc`
- Brotli: `backup.sql.br.enc`
## Pipeline Order

Compression happens before encryption:

```
Database → Dump → Compress → Encrypt → Upload
                     ↑          ↑
                Gzip/Brotli  AES-256
```

This order is optimal because:
- Encrypted data doesn't compress well
- Compression works best on text data
- Smaller encrypted file to upload
## Streaming Architecture

DBackup uses streaming compression:

```js
dumpStream
  .pipe(compressionStream)
  .pipe(encryptionStream)
  .pipe(uploadStream)
```

Benefits:
- Low memory: never loads the entire file into memory
- Fast: pipeline stages process chunks concurrently
- Scalable: works with databases of any size
## When to Use

### Use Gzip When
- Backup speed is critical
- CPU resources are limited
- Compatibility is important
- Good balance needed
### Use Brotli When
- Storage costs are high
- Maximum compression wanted
- Time is not critical
- Modern systems only
### Skip Compression When

- Database contains mostly binary BLOBs
- Uploading uncompressed is faster than the time spent compressing
- Backups are needed immediately
## Storage Savings

### Example: 1 GB Database
| Setup | Size | Monthly Cost* |
|---|---|---|
| No compression | 30 GB (30 daily backups) | $0.69 |
| Gzip | 7.5 GB | $0.17 |
| Brotli | 6 GB | $0.14 |
*S3 Standard pricing ($0.023/GB/month)
### Yearly Savings
For 10 databases with Smart retention:
- No compression: ~300 GB = $83/year
- With Gzip: ~75 GB = $21/year
- Savings: $62/year per 10 databases
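The figures above follow directly from the quoted S3 rate; a sketch of the arithmetic:

```javascript
// Reproduces the savings arithmetic (S3 Standard at $0.023 per GB-month).
const PRICE_PER_GB_MONTH = 0.023;
const yearlyCost = (gb) => gb * PRICE_PER_GB_MONTH * 12;

const uncompressed = yearlyCost(300); // ~300 GB retained without compression
const withGzip = yearlyCost(75);      // ~75 GB with Gzip (75% smaller)

console.log(`no compression: $${uncompressed.toFixed(0)}/year`);
console.log(`with gzip:      $${withGzip.toFixed(0)}/year`);
console.log(`savings:        $${(uncompressed - withGzip).toFixed(0)}/year`);
```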
## Restore and Download

### Automatic Decompression
When restoring or downloading:
- DBackup reads metadata
- Detects compression algorithm
- Decompresses automatically
### Manual Decompression

If needed outside DBackup:

```bash
# Gzip
gunzip backup.sql.gz

# Brotli
brotli -d backup.sql.br
```

## Performance Tuning
### CPU Considerations
Compression uses CPU. Monitor during backups:
- High CPU: Consider Gzip over Brotli
- Multiple jobs: Stagger schedules
### Memory Usage
Streaming keeps memory low, but:
- Brotli uses more memory than Gzip
- Large databases may need more resources
### Disk I/O
Compression reduces disk writes:
- Smaller temp files
- Faster uploads
- Less disk wear
## Metadata Storage

Compression info is stored in `.meta.json`:

```json
{
  "compression": "GZIP",
  "encryption": {
    "enabled": false
  },
  "originalSize": 104857600,
  "compressedSize": 26214400
}
```

## Troubleshooting
### Compression Slow
Causes:
- Large database
- Brotli algorithm
- CPU constraints
Solutions:
- Switch to Gzip
- Schedule backups during low-usage periods
- Check CPU availability
### Decompression Fails
Causes:
- Corrupted file
- Wrong algorithm detected
- Incomplete download
Solutions:
- Re-download from storage
- Check `.meta.json` for the correct algorithm
- Verify file integrity
### File Larger After Compression
Cause: Already compressed data (images, PDFs)
Solution:
- Consider skipping compression
- Or accept minimal overhead
## Best Practices
- Start with Gzip - Good balance for most cases
- Monitor backup times - Switch if too slow
- Compare sizes - Test both algorithms
- Enable for all jobs - Storage savings add up
- Combine with encryption - Compress then encrypt
- Test restores - Verify decompression works
## Next Steps
- Encryption - Encrypt compressed backups
- Creating Jobs - Configure compression
- Storage Explorer - View backup sizes