Automating Daily Database Backups to Google Drive with Rclone
4/22/2025
Keeping regular backups of your database is essential, but storing them only on the same server isn’t enough: if that server is lost, the backups go with it. In this guide, I’ll walk through how I automated backing up a MongoDB database and securely uploading it to Google Drive using rclone.
We’ll go from local backups to a fully automated, space-efficient offsite backup strategy — all running on a headless Linux server.
🛠️ Step 1: The Local Backup Script
Before uploading anything, we needed a script to generate daily backups from a Docker-based MongoDB database.
The script:
- Runs mongodump inside the container.
- Compresses the dump into a .tar.gz archive.
- Copies it to a local directory (e.g., /home/Projects/Backups).
- Deletes any local backups older than 31 days.
This gave us a rolling 31-day local backup: lightweight, space-efficient, and with enough history to recover from a data-mutation scenario that isn’t noticed right away. A rough sketch of the script is shown below.
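The container name, database name, and paths here are assumptions from my setup; adjust them to match yours.
#!/bin/bash
# Hypothetical container/DB names and paths -- change these for your environment
CONTAINER="mongodb"
DB_NAME="mydb"
BACKUP_DIR="/home/Projects/Backups"
DATE=$(date +%Y%m%d)

# Dump the database inside the Docker container
docker exec "$CONTAINER" mongodump --db "$DB_NAME" --out /tmp/dump

# Copy the dump out of the container, compress it, then clean up the raw dump
docker cp "$CONTAINER":/tmp/dump "$BACKUP_DIR/dump-$DATE"
tar -czf "$BACKUP_DIR/db-backup-$DATE.tar.gz" -C "$BACKUP_DIR" "dump-$DATE"
rm -rf "$BACKUP_DIR/dump-$DATE"
docker exec "$CONTAINER" rm -rf /tmp/dump

# Keep a rolling 31-day window of local backups
find "$BACKUP_DIR" -name "db-backup-*.tar.gz" -mtime +31 -delete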
☁️ Step 2: Choosing Rclone for Cloud Sync
To get those backups off the server and into the cloud, we chose rclone — a command-line tool that supports syncing to Google Drive.
Why Rclone?
✅ Lightweight & scriptable
✅ Supports Google Drive folder structure
✅ Handles file versioning and cleanup
✅ Easy to schedule with cron
✅ Free and Open-Source
When using Rclone to upload files to Google Drive, you will face certain limits imposed by Google:
- Daily Upload Limit: You can upload up to 750 GB per day per user.
- Individual File Size Limit: Each file can be up to 5 TB, provided you have sufficient storage space in your Google Drive account.
🔐 Step 3: Configuring Rclone with Google Drive
Running rclone config, we created a new remote pointing to Google Drive.
🔒 Tip: Instead of granting full access (the drive scope), use the safer drive.file scope, which only allows access to files created by rclone.
We also specified a root folder ID in Google Drive so rclone uploads would be confined to that specific folder — clean and secure.
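For reference, the resulting remote in ~/.config/rclone/rclone.conf ends up looking roughly like this — the IDs below are placeholders, and the token block is filled in automatically by the OAuth flow:
[gdrive]
type = drive
client_id = YOUR_CLIENT_ID.apps.googleusercontent.com
client_secret = YOUR_CLIENT_SECRET
scope = drive.file
root_folder_id = ID_OF_YOUR_DRIVE_BACKUP_FOLDER
token = {"access_token":"...","refresh_token":"...","expiry":"..."}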
⚠️ Common Gotcha: Google Blocks Default OAuth Clients
On our EC2 server, rclone errored out during the OAuth authorization step. It turns out Google blocks rclone’s default OAuth credentials for new users, especially on headless servers.
✅ The Fix:
- Create your own Google Cloud project
- Enable the Google Drive API
- Create your own OAuth client (Desktop App)
- Use your client_id and client_secret in rclone config
- Add yourself as a Test User under the OAuth consent screen
After that — smooth sailing 🎉
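Because the EC2 box has no browser, the OAuth approval itself has to happen somewhere else. One way (a sketch using rclone’s authorize helper with the credentials created above; YOUR_CLIENT_ID and YOUR_CLIENT_SECRET are placeholders) is:
# Run on a machine that has a browser
rclone authorize "drive" "YOUR_CLIENT_ID" "YOUR_CLIENT_SECRET"
# Paste the token JSON it prints into the token prompt that
# rclone config shows on the headless server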
🧭 Step 4: Testing and Verifying Uploads
Once configured, we tested uploads like:
rclone copy /home/Projects/Backups/db-backup-20250415.tar.gz gdrive:Projects --progress
We confirmed uploads using:
rclone lsf gdrive:Projects
If files didn’t show up, we used:
rclone config show gdrive
…to verify the connected Google account.
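Since Google’s quota limits apply per account, it’s also handy to check how much space the backups are using. rclone’s about command reports usage for the account behind a remote:
# Show total, used, and free space for the Google Drive account
rclone about gdrive: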
🔁 Step 5: Automating the Upload
We created a daily script to:
- Find the latest .tar.gz backup
- Upload it to the correct Google Drive folder
- Delete backups older than 7 days from GDrive
#!/bin/bash
LOCAL_BACKUP_DIR="/home/Projects/Backups"
GDRIVE_REMOTE="gdrive:Projects"
LOG_FILE="/home/Projects/Backups/weekly_gdrive_backup.log"
LATEST_BACKUP=$(ls -t "$LOCAL_BACKUP_DIR"/db-backup-*.tar.gz | head -n 1)
if [ -f "$LATEST_BACKUP" ]; then
echo "[INFO] Uploading $LATEST_BACKUP at $(date)" >> "$LOG_FILE"
rclone copy "$LATEST_BACKUP" "$GDRIVE_REMOTE" --progress >> "$LOG_FILE" 2>&1
if [ $? -eq 0 ]; then
echo "[SUCCESS] Upload complete" >> "$LOG_FILE"
else
echo "[ERROR] Upload failed" >> "$LOG_FILE"
fi
else
echo "[ERROR] No backup file found" >> "$LOG_FILE"
fi
echo "[INFO] Cleaning up old backups..." >> "$LOG_FILE"
rclone delete --min-age 7d "$GDRIVE_REMOTE" >> "$LOG_FILE" 2>&1
if [ $? -eq 0 ]; then
echo "[SUCCESS] Deleted old backups" >> "$LOG_FILE"
else
echo "[ERROR] Failed to delete old backups" >> "$LOG_FILE"
fi
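Before handing this to cron, it’s worth one manual run. The --dry-run flag is useful here: it previews what the age-based cleanup would delete without actually deleting anything.
# Make the script executable and run it once by hand
chmod +x /home/Projects/Scripts/gdrive_backup.sh
/home/Projects/Scripts/gdrive_backup.sh

# Preview what the cleanup would remove, without deleting anything
rclone delete --min-age 7d --dry-run gdrive:Projects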
⏰ Step 6: Scheduling with Cron
We used cron
to run the upload every night after the local DB backup completes.
0 2 * * * /bin/bash /home/Projects/Scripts/gdrive_backup.sh
🕒 Cron uses the server’s local time zone, which on this box is Sri Lanka Time, so the job fires at 2:00 AM SLST (8:30 PM UTC the previous day), just after the daily backup script finishes.
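To install and verify the job, a quick sketch (crontab -e opens the current user’s crontab; the syslog path assumes a Debian/Ubuntu-style system):
# Add the schedule line above to the crontab
crontab -e

# Confirm it was saved
crontab -l

# Later, check that cron actually ran the job
grep CRON /var/log/syslog | tail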
💣 Gotcha: Files Get Deleted Prematurely?
Yes, this actually happened during testing.
Even freshly uploaded files were sometimes deleted when using:
rclone delete --min-age 31d ...
🤯 Why? Google Drive doesn’t always preserve or reflect accurate modification timestamps, especially when uploading via rclone, so even a freshly uploaded file can be treated as older than the --min-age cutoff. In practice this stops being an issue once the daily backup-and-upload routine has been running for a while, because each archive is created right before it is uploaded, and that creation date is what the cleanup ends up comparing against.
🔐 Workarounds
If you’re running into this issue, here are two safer alternatives:
Option 1: Use --use-server-modtime
rclone copy --use-server-modtime ...
This tells rclone to use the time a file arrived on the remote as its modification time, so age-based filters count from the upload date rather than the file’s original modtime.
Option 2: Delete by File Count Instead of File Age
Instead of relying on timestamps, delete everything except the latest 7 files:
# lsf prints bare file names; date-stamped names sort chronologically
rclone lsf "gdrive:Projects" | sort | head -n -7 | while read -r file; do
  rclone deletefile "gdrive:Projects/$file"
done
This keeps your backup history based on filename sorting — much more reliable and predictable.
📦 Final Thoughts
This setup gives you:
✅ Daily database backups stored locally and in Google Drive
✅ A self-cleaning routine that keeps only recent backups
✅ A safe, automated, low-maintenance solution using free tools
It’s perfect for small teams, personal projects, or even production workloads that don’t require large infrastructure.
Thanks for reading! 🙌 I’m Dhaneja, a software engineer living in Japan and building practical tools like this in my spare time.
If you enjoyed this post or found it helpful, feel free to reach out or follow along:
📸 Instagram
🐦 X / Twitter
🌐 dhaneja.com
Let’s connect!