When Do You Need rclone?
I once lost all my data when the SSD on my VPS failed without warning: no snapshots, no backups. After that, I stopped keeping important data in just one place. The problem is that rsync and scp only move data between machines you can reach over SSH. To push data to Google Drive or S3, you need something else.
rclone solves exactly this problem. It’s an open-source command-line tool that supports over 70 cloud storage providers — Google Drive, Amazon S3, OneDrive, Backblaze B2, Dropbox, SFTP, and more. The syntax is similar to rsync, but instead of syncing between two machines, the target is the cloud.
Three Core Concepts Before You Start
Skip this section and you’ll get lost on the very first command. Understanding these three things is all you need:
- Remote: The name you assign to a cloud connection (e.g., `gdrive:`, `mys3:`). Think of it as a mount point.
- Path: The directory path within a remote. `gdrive:backups/vps1` means the `backups/vps1` folder inside your Google Drive.
- Config file: rclone stores credentials at `~/.config/rclone/rclone.conf`. Every remote you create lives here.
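For illustration, here is roughly what `~/.config/rclone/rclone.conf` looks like after creating two remotes. All values below are placeholders; rclone writes the real tokens and keys for you during configuration:

```ini
[gdrive]
type = drive
scope = drive
token = {"access_token":"<placeholder>","refresh_token":"<placeholder>"}

[mys3]
type = s3
provider = AWS
access_key_id = <placeholder>
secret_access_key = <placeholder>
region = ap-northeast-1
```

Because everything lives in this one file, backing it up (somewhere safe, since it holds credentials) is enough to move your whole rclone setup to a new server.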
The three commands you’ll use most often:
- `rclone copy`: copy files from source to destination without deleting existing files at the destination
- `rclone sync`: full sync; deletes files at the destination if they no longer exist at the source
- `rclone ls` / `rclone lsd`: list files/directories in a remote
Installing rclone on Linux
The official install script works on any distro:
```bash
curl https://rclone.org/install.sh | sudo bash
```
On Ubuntu/Debian you can also use apt (note that the packaged version may lag behind the latest release):

```bash
sudo apt update && sudo apt install rclone -y
```
Verify the installation:
```bash
rclone version
```
Hands-On: Configuring Each Cloud Provider
1. Google Drive
Run the interactive configuration command:
```bash
rclone config
```
Follow these steps in the menu:
- Press `n` to create a new remote
- Set the name: `gdrive`
- Choose storage type: enter `drive` (or the number corresponding to Google Drive)
- Leave `client_id` and `client_secret` blank; rclone's defaults work fine
- Scope: choose `1` (full access)
- No browser on your VPS? Choose `n` at the "Use auto config" step. rclone prints a URL; open it on your local machine, authenticate with Google, then paste the returned token back into the VPS terminal.
Once done, verify by listing directories in your Drive:
```bash
rclone lsd gdrive:
```
2. Amazon S3
S3 authenticates with an Access Key + Secret Key, not OAuth like Google. First create an IAM user in the AWS Console, grant `AmazonS3FullAccess` permissions, and grab the credentials. Then run:
```bash
rclone config
```
- Create a new remote named `mys3`
- Choose storage type: `s3`
- Provider: `AWS`
- Enter your `access_key_id` and `secret_access_key`
- Region: e.g., `ap-northeast-1` (Tokyo) or `ap-southeast-1` (Singapore)
- Press Enter for the remaining steps to use defaults
Test the connection:
```bash
rclone lsd mys3:your-bucket-name
```
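If you would rather not paste keys into the interactive menu, recent rclone versions can create the remote in one line. This is a sketch: `env_auth=true` tells rclone to read the standard `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` environment variables instead of storing keys in the config file, and the key values shown are placeholders:

```
export AWS_ACCESS_KEY_ID=<placeholder>
export AWS_SECRET_ACCESS_KEY=<placeholder>
rclone config create mys3 s3 provider=AWS env_auth=true region=ap-northeast-1
```

This keeps credentials out of `rclone.conf` entirely, which is handy if that file is synced or backed up elsewhere.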
3. Microsoft OneDrive
Run the configurator again:

```bash
rclone config
```
- Create a remote named `onedrive`
- Storage type: `onedrive`
- Leave `client_id`/`client_secret` blank
- Authenticate via browser; same flow as Google Drive
- Choose account type: `onedrive` (personal) or `business` (Microsoft 365)
Practical Backup Commands
Copy a Directory to the Cloud
```bash
# Copy /var/www/html to Google Drive (backups/web folder)
rclone copy /var/www/html gdrive:backups/web --progress

# Copy a database dump to S3
rclone copy /backup/db.sql.gz mys3:my-backup-bucket/db/ --progress
```
Full Sync — Use This Command with Caution
```bash
# Always run --dry-run first to preview what will happen without making changes
rclone sync /home/ubuntu/data onedrive:vps-backup/data --dry-run

# Sync /home/ubuntu/data to OneDrive for real
rclone sync /home/ubuntu/data onedrive:vps-backup/data --progress
```
After managing over 10 VPS instances for 3 years, I learned an expensive lesson: always run `--dry-run` before using `rclone sync` for real. The sync command deletes files at the destination if they don't exist at the source; I once nearly wiped an entire S3 bucket by accidentally swapping the source and destination. That heart-pounding moment is something I never want to experience again.
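One habit that guards against the swapped-arguments disaster is to never call `rclone sync` directly. Here is a minimal sketch of such a guard; `safe_sync` is a hypothetical wrapper of my own, not an rclone feature:

```shell
#!/bin/bash
# safe_sync SRC DST: show the dry run first, then ask for confirmation
# before making real changes. Hypothetical helper, not part of rclone.
safe_sync() {
  local src="$1" dst="$2"
  echo "== Dry run: $src -> $dst =="
  rclone sync "$src" "$dst" --dry-run || return 1
  read -r -p "Apply these changes for real? [y/N] " answer
  if [ "$answer" = "y" ]; then
    rclone sync "$src" "$dst" --progress
  else
    echo "Aborted; nothing was changed."
  fi
}
```

Call it as `safe_sync /home/ubuntu/data onedrive:vps-backup/data`; if you ever swap the arguments, the mistake shows up harmlessly in the dry-run preview instead of in deleted files.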
Filtering Files and Excluding Directories
```bash
# Only copy .sql and .gz files
rclone copy /backup gdrive:backups --include "*.{sql,gz}" --progress

# Exclude node_modules and .git to save significant storage space
rclone copy /var/www/myapp gdrive:backups/myapp \
  --exclude "node_modules/**" \
  --exclude ".git/**" \
  --progress
```
Verify Before and After Backup
```bash
# List files in a remote
rclone ls gdrive:backups/web

# Compare local vs remote and detect missing or differing files
rclone check /var/www/html gdrive:backups/web
```
Automating with Cron
Create a clean backup script. If you’re new to scheduling tasks on Linux, it helps to understand how cron jobs work for automating backups before wiring everything together:
```bash
cat > /usr/local/bin/backup-to-cloud.sh << 'EOF'
#!/bin/bash
set -euo pipefail

DATE=$(date +%Y%m%d)
LOG="/var/log/rclone-backup.log"
# Fail early if the database password is not provided via the environment
MYSQL_PASS="${MYSQL_PASS:?Set MYSQL_PASS before running this script}"

echo "[$(date)] Starting backup..." >> "$LOG"

# Dump and compress the database
mysqldump -u root -p"$MYSQL_PASS" mydb > "/tmp/mydb-$DATE.sql"
gzip "/tmp/mydb-$DATE.sql"

# Upload to cloud
rclone copy "/tmp/mydb-$DATE.sql.gz" gdrive:backups/db/ >> "$LOG" 2>&1
rclone sync /var/www/html gdrive:backups/web/ >> "$LOG" 2>&1

# Clean up local dumps older than 7 days
find /tmp -name "mydb-*.sql.gz" -mtime +7 -delete

echo "[$(date)] Done!" >> "$LOG"
EOF
chmod +x /usr/local/bin/backup-to-cloud.sh
```
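The 7-day retention line is easy to sanity-check locally before trusting it with real dumps. This sketch creates a fresh file and an "old" one in a scratch directory (using GNU `touch -d` to backdate the timestamp) and shows that `find -mtime +7` removes only the stale one:

```shell
#!/bin/bash
# Simulate the script's retention rule in a throwaway directory
tmpdir=$(mktemp -d)
touch "$tmpdir/mydb-today.sql.gz"                 # fresh dump
touch -d "10 days ago" "$tmpdir/mydb-old.sql.gz"  # stale dump, backdated
find "$tmpdir" -name "mydb-*.sql.gz" -mtime +7 -delete
ls "$tmpdir"   # only mydb-today.sql.gz remains
rm -rf "$tmpdir"
```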
Add it to crontab to run every day at 2 AM — a low-traffic window that won’t impact your server:
```bash
crontab -e

# Add the following lines (cron doesn't inherit your shell environment,
# so define MYSQL_PASS in the crontab itself):
MYSQL_PASS=your-db-password
0 2 * * * /usr/local/bin/backup-to-cloud.sh
```
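If a run ever takes longer than expected (a big first upload, a slow network), overlapping cron invocations can collide on the same files. A common safeguard, assuming `flock` from util-linux is available at `/usr/bin/flock`, is to wrap the job so a second copy refuses to start while one is running:

```
0 2 * * * /usr/bin/flock -n /tmp/backup-to-cloud.lock /usr/local/bin/backup-to-cloud.sh
```

The `-n` flag makes the overlapping run exit immediately instead of queueing behind the lock.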
Useful Tips
- Bandwidth limit: Running backups during the day can congest your network. Add `--bwlimit 10M` to cap usage at 10 MB/s; your server keeps serving traffic normally while the backup runs in the background.
- Parallel transfers: By default, rclone copies 4 files at a time. If you're backing up many small files (under 1 MB), increasing to `--transfers 16` can make a noticeable difference in speed.
- Mount cloud as a drive: `rclone mount gdrive: /mnt/gdrive --daemon` makes Google Drive appear as a regular folder; `ls /mnt/gdrive` shows your files.
- Encrypt before uploading: The `crypt` remote type encrypts files before pushing them to the cloud. Even the provider can't read them, which is useful when backing up sensitive data to S3 or a corporate Drive.
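To give a feel for the `crypt` tip, here is roughly what such a remote looks like in `rclone.conf` once configured. The values are placeholders (rclone stores an obscured password, never plaintext), and `gdrive:encrypted` is just an example path on the wrapped remote:

```ini
[gdrive-secure]
type = crypt
remote = gdrive:encrypted
password = <obscured value written by rclone config>
filename_encryption = standard
```

After that, `rclone copy /backup/secrets gdrive-secure:` encrypts each file client-side before upload, while `rclone ls gdrive-secure:` transparently decrypts names and contents on the way back down.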
Conclusion
rclone has saved me from several stressful situations, not just for VPS backups but also for migrating data between cloud providers without downloading anything locally. The syntax is clean, multi-provider support is broad, and everything can be scripted. If you're managing a VPS without offsite backups, this is something you should install today. To keep your server in top shape alongside cloud backups, consider optimizing your Linux server performance for production as a natural next step. Start with `rclone config`, connect a remote, try `rclone copy --dry-run`, and you're already in a much safer place.
