In this post, I want to share my experience setting up a daily backup procedure using Rclone on a Windows server, with backups sent directly to Google Drive. This project was essential to protect critical FileMaker files and ensure an efficient and organized backup workflow. It took some time, a few challenges, and helpful support from ChatGPT to reach a solid result.
🎯 Objectives and Requirements
- Runs every weekday at 8:00 PM
- Finds all `.fmp12` FileMaker files in the daily backup folder and its subfolders
- Compresses them into ZIP archives, split into parts if larger than 2 GB
- Uploads the ZIP files to Google Drive using Rclone
- Organizes the uploads into a daily folder named with the current date
- Keeps only the last 4 daily, 4 weekly, and 4 monthly backups on Google Drive
- Sends email notifications at the beginning and end of the process
- Deletes all temporary files from the local SSD after upload
⚙️ Step-by-Step Implementation
1. Installing Rclone and Setting Up Google Drive
I configured Rclone to access my Google Drive by following the official documentation. Since the server has no browser, I used headless authentication to generate the required OAuth token, then connected Rclone to a dedicated backup folder on my Drive.
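The headless flow maps onto rclone's standard commands. Roughly (the remote name `gdrive` is the one the backup script expects; run these on the Windows server except where noted):

```powershell
# 1. On any machine WITH a browser, generate an OAuth token:
rclone authorize "drive"

# 2. On the server, run the interactive setup; when asked whether to
#    use the web browser to authenticate automatically, answer No and
#    paste the token produced in step 1:
rclone config

# 3. Verify the remote works (named "gdrive" here):
rclone lsd gdrive: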
2. PowerShell Script for Backup Automation
The PowerShell script performs the following:
- Scans recursively for `.fmp12` files
- Uses 7-Zip to compress files, splitting large ones into 2 GB parts
- Creates a detailed log of every step
- Uploads each archive to a date-named folder on Google Drive
- Deletes local ZIPs after upload
- Sends email notifications on start and finish
3. Retention Policy Logic
For each tier (daily, weekly, monthly), backup folders beyond the 4 most recent are deleted automatically, identified by parsing the date embedded in each folder's name.
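Rclone has no built-in "keep the newest N folders" option, so the retention pass can sort the date-named folder list and drop everything past the fourth entry. A minimal sketch for one tier, assuming the `gdrive:backup_fmp12` layout and `$logFile` variable used by the script (`rclone lsf` and `rclone purge` are the relevant subcommands; this is not the exact code from my setup):

```powershell
# List the daily folders (named yyyy-MM-dd); lexicographic sort on
# that format is also chronological, so newest comes first when descending
$folders = & "C:\rclone\rclone.exe" lsf "gdrive:backup_fmp12" --dirs-only
$stale = $folders | Sort-Object -Descending | Select-Object -Skip 4

# Purge everything except the 4 most recent folders
foreach ($dir in $stale) {
    & "C:\rclone\rclone.exe" purge "gdrive:backup_fmp12/$($dir.TrimEnd('/'))"
    Add-Content -Path $logFile -Value "Deleted old backup: $dir"
}
```

The weekly and monthly tiers follow the same pattern against their own folder prefixes.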
🧪 Testing and Troubleshooting
I ran several tests with different file sizes, folder structures, and simulated connection failures. I also validated the deletion logic with dummy folders and verified logs for accuracy.
❌ Problems Solved (Thanks to ChatGPT!)
- Corrected the 7-Zip command syntax
- Fixed incorrect Rclone remote paths
- Repaired the recursive file search
- Adjusted log behavior (overwrite vs. append)
- Improved the retention logic to avoid over-deletion
✅ Final Result
Now, every weekday at 8:00 PM:
- Files are compressed and split
- ZIPs are uploaded and sorted
- Old backups are deleted
- Logs are emailed
- Temporary files are removed
📝 Conclusion
This solution is now reliable, fast, and easy to monitor. The combination of PowerShell scripting, Rclone, and 7-Zip provides full control without third-party software dependencies.
Special thanks to ChatGPT for helping me resolve key issues and refine each step of the process!
💻 PowerShell Script Example
```powershell
# Define paths
$backupSource   = "D:\FileMakerBackups\Today"
$zipDestination = "D:\TempZIPs"
$logFile        = "D:\Logs\backup_log_$(Get-Date -Format 'yyyyMMdd').txt"
$gdriveTarget   = "gdrive:backup_fmp12/$(Get-Date -Format 'yyyy-MM-dd')"

# Start the log
Add-Content -Path $logFile -Value "=== Backup started at $(Get-Date) ==="

# Compress each .fmp12 file, splitting archives larger than 2 GB
Get-ChildItem -Path $backupSource -Recurse -Filter *.fmp12 | ForEach-Object {
    $zipName = "$zipDestination\$($_.BaseName).zip"
    & "C:\Program Files\7-Zip\7z.exe" a -v2g -tzip $zipName $_.FullName
    Add-Content -Path $logFile -Value "Compressed: $($_.FullName)"
}

# Upload to Google Drive using Rclone
& "C:\rclone\rclone.exe" copy $zipDestination $gdriveTarget --log-file=$logFile --log-level INFO

# Retention: delete local temp files
Remove-Item "$zipDestination\*" -Recurse -Force
Add-Content -Path $logFile -Value "Temporary ZIPs deleted."

# (Optional) Retention policy for Google Drive can be added using rclone delete/purge

# Send email notification (example using Send-MailMessage)
Send-MailMessage -From "backup@yourdomain.com" -To "admin@yourdomain.com" `
    -Subject "✅ Backup Completed" `
    -Body "Backup completed at $(Get-Date). See attached log." `
    -SmtpServer "smtp.yourdomain.com" `
    -Attachments $logFile

Add-Content -Path $logFile -Value "=== Backup completed at $(Get-Date) ==="
```
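The weekday 8:00 PM trigger comes from Windows Task Scheduler, not from the script itself. One way to register it from an elevated prompt (the task name and script path below are placeholders, not the actual ones from my server):

```powershell
schtasks /Create /TN "FileMakerDailyBackup" /TR "powershell.exe -ExecutionPolicy Bypass -File D:\Scripts\fmp12_backup.ps1" /SC WEEKLY /D MON,TUE,WED,THU,FRI /ST 20:00
```

`/SC WEEKLY` with `/D MON,TUE,WED,THU,FRI` restricts the run to weekdays, and `/ST 20:00` sets the 8:00 PM start time.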