# Set Up Automatic Backups
Configure automated backups — local or to Cloudflare R2 and other S3-compatible storage providers.
This tutorial walks through configuring automated backups. Vardo supports five backup target types:
- S3 — AWS S3 or any S3-compatible endpoint
- R2 — Cloudflare R2
- B2 — Backblaze B2
- SSH — Remote server via SSH/SCP
- Local — Server filesystem (good for quick restores, not recommended as the only target)
## Prerequisites
- A running Vardo instance
- An account with an S3-compatible storage provider (Cloudflare R2, AWS S3, Backblaze B2, etc.)
## Step 1: Create an R2 Bucket
These steps use Cloudflare R2. The process is similar for other providers.

- Log in to the Cloudflare dashboard
- Go to R2 Object Storage
- Click Create bucket
- Name it something like `vardo-backups`
- Choose a location (pick the region closest to your server)
- Click Create bucket
For AWS S3, create a bucket via the S3 console. For Backblaze B2, create a private bucket in the B2 dashboard.
## Step 2: Get Access Credentials
### Cloudflare R2
- In the R2 dashboard, click Manage R2 API tokens
- Click Create API token
- Give it a name (e.g., `vardo-backups`)
- Set permissions: Object Read & Write
- Scope it to your specific bucket
- Click Create API Token
- Copy the Access Key ID and Secret Access Key — you won't see the secret again

Also note your Account ID (visible in the R2 overview sidebar) — you'll need it for the endpoint URL:

`https://<ACCOUNT_ID>.r2.cloudflarestorage.com`

### AWS S3
Create an IAM user with `s3:GetObject`, `s3:PutObject`, `s3:DeleteObject`, and `s3:ListBucket` permissions on your bucket. Use the IAM user's access key and secret.

Endpoint: `https://s3.<region>.amazonaws.com`
### Backblaze B2
Generate an application key in the B2 dashboard with access to your bucket. Use the `keyID` as the access key and the `applicationKey` as the secret.

Endpoint: `https://s3.<region>.backblazeb2.com`
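The three endpoint formats above differ only in how they are parameterized: R2 is scoped by account ID, while AWS S3 and B2 are scoped by region. A small sketch capturing that (the helper name is illustrative, not part of Vardo):

```python
def s3_endpoint(provider: str, *, account_id: str = "", region: str = "") -> str:
    """Build the S3-compatible endpoint URL for each provider in this guide."""
    if provider == "r2":
        # R2 endpoints are account-scoped, not region-scoped
        return f"https://{account_id}.r2.cloudflarestorage.com"
    if provider == "s3":
        return f"https://s3.{region}.amazonaws.com"
    if provider == "b2":
        return f"https://s3.{region}.backblazeb2.com"
    raise ValueError(f"unknown provider: {provider}")
```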
## Step 3: Add a Backup Target in Vardo
### Option A: Local backup target
If you want to store backups on the server itself (useful for fast restores or as a secondary target):

- Go to Backups in the sidebar
- Click New Target
- Set type to Local
- Set the storage path — e.g., `/var/backups/vardo`
- Click Save
Local backups are fast but don't protect against disk failure. Pair them with an S3 target for real durability.
### Option B: S3-compatible target
- Go to Backups in the sidebar
- Click New Target
- Fill in the form:

  | Field | Value |
  |---|---|
  | Name | Cloudflare R2 (or any label) |
  | Type | S3 |
  | Endpoint | `https://<account_id>.r2.cloudflarestorage.com` |
  | Bucket | `vardo-backups` |
  | Access Key ID | (from Step 2) |
  | Secret Access Key | (from Step 2) |
  | Path prefix | `backups/` (optional — organizes files within the bucket) |

- Click Save
- Vardo will test the connection — if it fails, double-check the endpoint URL and credentials
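When the connection test fails, the usual culprits are a malformed endpoint URL or missing credentials. A minimal sanity check over the form values might look like this (the field names here are assumptions for illustration, not Vardo's actual schema):

```python
from urllib.parse import urlparse

def validate_target(cfg: dict) -> list[str]:
    """Return a list of problems with an S3 target config (empty = looks OK).

    A local check like this catches the most common failures
    (typoed endpoint, missing credentials) before any network call.
    """
    problems = []
    url = urlparse(cfg.get("endpoint", ""))
    if url.scheme != "https" or not url.netloc:
        problems.append("endpoint must be a full https:// URL")
    if not cfg.get("bucket"):
        problems.append("bucket name is required")
    if not cfg.get("access_key_id") or not cfg.get("secret_access_key"):
        problems.append("both access key ID and secret are required")
    return problems
```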
## Step 4: Create a Backup Job with a Schedule
- Click New Job
- Configure the job:

  | Field | Value |
  |---|---|
  | Name | Daily database backup |
  | Target | Select the target you just created |
  | What to back up | Select the app(s) or choose "All apps" |
  | Schedule | `0 2 * * *` (2am daily) |

- Set a retention policy (see Retention Policies below)
- Click Save
The job is now active. It will run on schedule and upload backups to your storage target.
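The schedule uses standard five-field cron syntax: minute, hour, day of month, month, day of week. For a daily schedule like `0 2 * * *`, the next run time can be sketched as follows (this is illustrative, not Vardo's scheduler, and it only handles the simple daily case):

```python
from datetime import datetime, timedelta

def next_daily_run(schedule: str, now: datetime) -> datetime:
    """Next run time for a simple daily cron like "0 2 * * *".

    Only handles a fixed minute/hour with the last three fields as "*";
    real cron parsers handle far more.
    """
    minute, hour, dom, month, dow = schedule.split()
    assert (dom, month, dow) == ("*", "*", "*"), "only daily schedules supported"
    run = now.replace(hour=int(hour), minute=int(minute), second=0, microsecond=0)
    if run <= now:
        run += timedelta(days=1)  # today's 2am slot already passed
    return run
```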
## Backup Strategies per Volume
Vardo picks the right backup strategy for each volume automatically:
- Database volumes (PostgreSQL, MySQL) — native dump tools (`pg_dump`, `mysqldump`) for consistent, restorable backups
- File volumes (uploads, media, config) — tar archives
You can override the strategy per volume in the job settings if needed.
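The dispatch described above can be sketched roughly like this. The command shapes are typical for these tools, but the function and field names are illustrative, not Vardo's internals:

```python
def backup_command(volume: dict) -> list[str]:
    """Choose a dump command for a volume based on its engine.

    Database engines get a native dump for a consistent snapshot;
    plain file volumes are archived with tar.
    """
    engine = volume.get("engine")
    if engine == "postgres":
        return ["pg_dump", "--format=custom", volume["database"]]
    if engine == "mysql":
        return ["mysqldump", "--single-transaction", volume["database"]]
    # default: file volume -> tar archive of the mount path
    return ["tar", "czf", "-", volume["path"]]
```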
## Step 5: Verify the First Backup Runs
You don't need to wait for the schedule — trigger a manual run immediately:
- Find your job in the Backup Jobs list
- Click Run Now
- Watch the status change from Running to Success
- Click on the job to see the backup history, file size, and duration
Check your R2 bucket (or S3 console) to confirm the backup file appeared.
## Step 6: Test a Restore
Verify your backups are restorable before you need them:
- Go to the backup job's history
- Find a completed backup
- Click Restore
- Select the target app and confirm
The restore process:
- Downloads the backup from storage
- Stops the app container
- Restores the data
- Restarts the container
Tip: Test restores to a staging environment first so you're not restoring over production.
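The four restore steps amount to a stop/restore/start sequence around a download. As a sketch of that ordering (the `storage` and `container` objects are stand-ins, not Vardo's API):

```python
def restore_backup(storage, container, backup_id: str) -> list[str]:
    """Run the restore sequence and return a log of the steps taken."""
    log = []
    archive = storage.download(backup_id)   # 1. fetch the backup from storage
    log.append("downloaded")
    container.stop()                        # 2. stop the app container
    log.append("stopped")
    container.restore(archive)              # 3. restore the data
    log.append("restored")
    container.start()                       # 4. restart the container
    log.append("started")
    return log
```

The ordering matters: the container must be stopped before data is restored, or the running app can write over (or read) a half-restored volume.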
## Retention Policies
Retention controls how many backups Vardo keeps. Older backups are deleted automatically after the job runs.
| Tier | Default | Recommended for... |
|---|---|---|
| Hourly | Keep last 24 | High-traffic apps needing fine-grained recovery |
| Daily | Keep last 7 | Standard apps |
| Weekly | Keep last 4 | Long-term coverage |
| Monthly | Keep last 3 | Compliance, audit trails |
Configure these under the job's Retention section. Keeping more backups costs more storage; tune the counts to match your recovery objectives.
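Tiered retention like the table above is commonly implemented by bucketing backups per period and keeping the newest in each bucket. A minimal sketch of the daily tier (illustrative, not Vardo's actual pruning code):

```python
from datetime import datetime

def prune_daily(backups: list[datetime], keep: int) -> tuple[list[datetime], list[datetime]]:
    """Keep the newest backup per calendar day, up to `keep` days; delete the rest."""
    newest_per_day = {}
    for ts in sorted(backups):
        newest_per_day[ts.date()] = ts  # later timestamps overwrite earlier same-day ones
    kept_days = sorted(newest_per_day)[-keep:]
    kept = [newest_per_day[d] for d in kept_days]
    deleted = [ts for ts in backups if ts not in kept]
    return kept, deleted
```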
## Monitoring Backup Health
Vardo shows backup status on the backups page. To catch failures proactively:
- Enable Notifications under Settings → Notifications — Vardo will alert you when a backup job fails
- Check the Backup History table on the backups page — each row shows status, size, and duration
- A job that consistently takes longer than expected may indicate growth in your data or a slow network path to storage
## Next Steps
- Deploy your first app if you haven't already
- Configure monitoring under Metrics to track resource usage over time
- Set up email notifications so you're alerted on backup failures: Settings → Notifications