10+ practical Rclone examples – Rclone tutorial for beginners

Rclone is a cross-platform, command-line data synchronization application, similar to rsync but focused on cloud storage management.

Rclone allows you to manage virtually any major cloud storage service, regardless of the operating system you’re using, at blazing speed. Like rsync, it only transfers what has changed: files are compared by size and modification time (or hash), and unchanged files are skipped, saving you from re-uploading your whole data set every time.

Beginners may be intimidated by the fact that rclone doesn’t have an official user-friendly graphical interface (GUI), apart from a web-based one.

However, if you’re familiar with just the most entry-level basics of how the terminal works, rclone shouldn’t be difficult to use. In this article, we will walk through the most commonly used rclone commands, followed by examples.

Rclone config examples – How to configure Rclone

This is the first command you’ll run once you’ve successfully installed rclone. The config command allows you to add, delete, and modify remotes. All your remotes are stored in a config file; if you run the rclone config file command, you will see where that config file is stored.

Since rclone config is an interactive command, it will guide you step by step through adding a storage service to its config. Every time you run rclone config, you will see something similar to this:

To create a new remote, type in option n, which stands for New remote, then press Enter.

At this step, you need to provide a name for the remote. It can be anything memorable, like DanGDrive or BettyDropbox, as long as it does not include special characters. The name should also be short, since you will type it a lot in the future; you really don’t want to manually type a 100-character name every time.

The next prompt will ask you for the Storage type. Look for your cloud storage name and its associated number, then input it into the prompt, followed by an Enter.

rclone supports a long list of providers, including Google Drive, Amazon S3, Dropbox, and more. In this example, we are going to configure a Google Drive account, so I’ll input 13 (the exact number may differ between rclone versions, so check the list your copy prints).

After that, just follow along with the configuration prompt and you’ll be ready to manipulate files in your cloud storage account with rclone right away.
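If you’d rather skip the interactive prompts, rclone can also create remotes non-interactively with rclone config create; a quick sketch (the remote name MyGDrive is just an example, and Google Drive will still open a browser for the OAuth step where needed):

```shell
# Create a Google Drive remote named "MyGDrive" without the wizard.
rclone config create MyGDrive drive

# List the remotes in your config file to confirm it was added.
rclone listremotes
```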

Rclone copy examples – Copying data between cloud storage services

In this section, we will walk through several common scenarios and how to leverage rclone‘s power to get things done.

Use rclone to copy one file

This is the most basic feature of rclone. To copy one file from source_path to destination_path, use the following syntax:

rclone copy source:source_path dest:destination_path

For example, if I have a file located at /home/nl/sample.txt, the command below copies it to the root of my MyGDrive remote (it ends up as MyGDrive/sample.txt, not MyGDrive/home/nl/sample.txt):

rclone -v copy /home/nl/sample.txt MyGDrive:/

Note: -v stands for verbose, meaning additional details about the process will be printed.

Use rclone to copy multiple files

rclone allows you to copy multiple files in one command, as long as they match a pattern. Check out the rclone filtering documentation page for more information.
In this example, let’s say we have three files: sample.txt and sample_2.txt in the current directory, and sample_3.txt in a subdirectory named sub. The command that copies all three files looks like this:

rclone copy . --include "sample*.txt" MyGDrive:/

If you just want sample.txt and sample_2.txt, skipping subdirectories, anchor the pattern with a leading /, which tells rclone to match it only at the top level of the source:

rclone copy . --include "/sample*.txt" MyGDrive:/

You can also copy all text files with the following command:

rclone copy . --include "*.txt" MyGDrive:/

Note: the dot . indicates that we’re working on the current directory. If you want to copy files from a different directory, change this path accordingly.
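To experiment with these filters safely, you can recreate the sample layout from this section in a scratch directory and preview each pattern with --dry-run (the directory name demo is just an illustration):

```shell
# Build the example tree: two files at the top level, one in a subdirectory.
mkdir -p demo/sub
touch demo/sample.txt demo/sample_2.txt demo/sub/sample_3.txt

# Preview which files the filter would copy, without transferring anything
# (uncomment once a remote named MyGDrive exists):
# rclone copy demo --include "sample*.txt" --dry-run MyGDrive:/
```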

Use rclone to copy the whole directory/folder, including directory structure and contents

By default, rclone already preserves the whole directory structure and contents, skipping only empty directories. So if this is the question that brought you here, what you probably need is one of the scenarios below.
Scenario 1:
If you also want empty directories from the source to be created on the destination, just add the --create-empty-src-dirs flag to the command.

rclone copy --create-empty-src-dirs . MyGDrive:/

Please do note that Amazon S3 buckets don’t have a real directory concept, so you can’t create an empty directory there.

Scenario 2:
You want to put a directory itself under another one, not just copy its contents. For example, let’s say we have a dir1 directory that contains two files, 1.txt and 2.txt, and we want it to end up under the stuff directory on the remote, so everything should look like this:

stuff
└── dir1
    ├── 1.txt
    └── 2.txt

If this is your case, append dir1 to the remote path:

rclone copy dir1 MyGDrive:stuff/dir1

Use rclone to copy only new files

"New files" can have several meanings.
Scenario 1:
If you only want files that have been modified or created in the last 2 days to be copied, the --max-age filter does exactly that.

rclone copy --max-age 2d dir1 MyGDrive:stuff/dir1

You can change 2d to another period of time, using the units below:

  • ms – Milliseconds
  • s – Seconds
  • m – Minutes
  • h – Hours
  • d – Days
  • w – Weeks
  • M – Months
  • y – Years

Instead of a duration, you can also put an absolute time in 2d’s place, in one of these formats:

  • RFC3339 – eg "2006-01-02T15:04:05Z07:00"
  • ISO8601 Date and time, local timezone – "2006-01-02T15:04:05"
  • ISO8601 Date and time, local timezone – "2006-01-02 15:04:05"
  • ISO8601 Date – "2006-01-02" (YYYY-MM-DD)
Scenario 2: Force rclone to skip existing files
If by "new files" you mean files that do not exist in the destination path, use the --ignore-existing flag. Note that if both the source and the destination have a file.txt with different contents, file.txt still won’t be copied or updated.

rclone copy --ignore-existing dir1 MyGDrive:stuff/dir1
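The two scenarios can also be combined in one command. A sketch, using the same illustrative remote and paths as above, guarded so it does nothing on machines without rclone installed:

```shell
# Copy only files modified in the last 2 days, never overwriting files
# that already exist on the remote; --dry-run previews the transfer.
if command -v rclone >/dev/null 2>&1; then
    rclone copy --max-age 2d --ignore-existing --dry-run dir1 MyGDrive:stuff/dir1
else
    echo "rclone not installed; skipping preview"
fi
```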

Force rclone to copy recursively

A lot of beginners ask us this question. However, if you read the rclone documentation, you’ll see this is its default behavior: rclone scans all the files and subdirectories in the source path, including hidden files, then copies them over to the destination path.

There’s no magic here; rclone copies recursively all the time. That’s it!

Rclone sync examples – Syncing directories with Rclone

Now that you know enough to get started with rclone, let’s move on to its next popular command: rclone sync.

From the documentation, sync makes the source and destination identical, modifying the destination only. It compares the lists of files on both sides by size and modification time (or hash); matching files are skipped, everything else is transferred, and files that exist only in the destination are deleted.

Important: Since sync can cause data loss, you should always do a test run with the --dry-run flag to see exactly what would be copied and deleted before actually running the command.

If you want to see how rclone is doing during the transfer process, add the -P or --progress flag to print out real-time transfer statistics. You can also add -v/--verbose to view additional details about each step of the process.

Rclone sync automatically

rclone allows you to achieve one-way syncing with any cloud storage service of your choice, using its sync command. This works similarly to what the Google Drive or Mega desktop clients do, but in a far more scriptable way.

To make a "continuous" sync, you can create a cronjob that runs every 5 minutes (or whichever period of time you want).

Create a file named cloudsync.sh with the contents below. Don’t forget to replace SOURCE_PATH and REMOTE_PATH with your own paths, and remote with the name of your rclone remote.

#!/bin/bash

# Exit if another copy of this script is already running
if pidof -x "$(basename "$0")" -o %PPID >/dev/null; then exit; fi

# Sync files to the cloud
/usr/bin/rclone sync [SOURCE_PATH] remote:[REMOTE_PATH] \
    --log-file /opt/rclone_upload.log

After that, we make the script executable by running chmod +x ./cloudsync.sh in the terminal.
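As an aside, the pidof guard in the script above can miss edge cases; a common alternative is flock(1) from util-linux, which guarantees only one copy of the script runs at a time. A sketch (the lock-file path is illustrative, and the rclone line is left commented as a placeholder):

```shell
#!/bin/bash
# Hold an exclusive lock on file descriptor 9 for the lifetime of the
# script; a second copy started by cron fails to get the lock and exits.
exec 9>"/tmp/cloudsync.lock"
if ! flock -n 9; then
    echo "another sync is already running" >&2
    exit 0
fi

echo "lock acquired"
# /usr/bin/rclone sync [SOURCE_PATH] remote:[REMOTE_PATH] \
#     --log-file /opt/rclone_upload.log
```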

Finally, we set up the cron job: type crontab -e to edit the configuration file, then add our job. If you haven’t done this before, check out How To Set Up A Cron Job In Linux. In our case, the added line should look like this:

*/5 * * * * /home/nl/cloudsync.sh

Rclone two-way sync

If you are looking for this, chances are you need a program that could mimic the functionalities of cloud storage clients (like Dropbox client or Google Drive client).

However, as of this writing, rclone doesn’t ship a built-in two-way sync/bisync solution.

With that in mind, in most situations you could be fine using rclone copy to back up your files to the remote.

Or you could use a full-fledged sync solution, like Insync or Odrive. They usually come with a nice GUI, good support, and advanced syncing features, which makes them well suited for beginners.

CloudCross, an open-source sync client, could also solve this problem. However, it does not have a graphical interface as of this writing, which means a somewhat steeper learning curve for beginners.

Show progress in long-running rclone copy/sync

There are situations where users upload a huge amount of data and want to check whether the process is still running, or to estimate the total transfer time.

rclone supports a -P/--progress flag that adds real-time progress details to any copy/sync command, including transfer speed, ETA, and per-file percentages.

Rclone mount examples – Using Rclone to mount cloud storage as local drive

The rclone mount command allows you to mount any of rclone’s supported cloud storage systems as a local file system directory.

Rclone mount on macOS or Linux

On Linux/macOS you can mount a remote named MyGDrive to /path/to/local/mount like this. Keep in mind that the destination path must be an empty, existing directory.

rclone mount MyGDrive:remote_file_path /path/to/local/mount
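To detach the remote later, unmount it like any FUSE file system (this is the Linux form; on macOS, use umount on the mount point instead):

```shell
fusermount -u /path/to/local/mount
```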

To automatically mount the remote when the computer boots up, add this command to a startup script. Additionally, passing the --daemon flag makes rclone run in the background, which is useful in this case.
Alternatively, you can add a cron job that runs rclone mount on boot, like this:

@reboot ( /path/to/rclone mount --<options> <src> <dst> &)&

Rclone mount on Windows

On Windows, you can mount a cloud storage system to an unused drive letter (X: in this example) or to a non-existent folder.

rclone mount MyGDrive:remote_file_path X:
rclone mount MyGDrive:remote_file_path C:\nonexistent_directory

Differences between Rclone mount and Rclone sync/copy

File systems expect operations to be 100% reliable, whereas cloud storage systems are a long way from that. The rclone sync/copy commands deal with this by retrying failed transfers. However, rclone mount can’t retry in the same way without keeping local copies of the uploads. Check out the file caching section of the mount manual for ways to make mount more reliable.
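For example, enabling the VFS write cache is a common first step from that manual: writes are buffered to local disk first, so failed uploads can be retried (remote name and mount point as in the earlier examples):

```shell
rclone mount MyGDrive:remote_file_path /path/to/local/mount --vfs-cache-mode writes
```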

We hope this article has helped you learn how to use various rclone commands to copy or sync your files with multiple cloud storage services. You might also like our guide on the best GUI and tools for Rclone.
