Rclone is a tool for managing material in cloud services. In DDC, it is primarily a way of transferring content out of a donor's cloud storage into DDC's digital storage.
Rclone should already be installed in BitCurator, but if you are transferring content that requires logging in to access, you'll need to do some additional setup to connect to the cloud storage service. This setup has to be repeated for each cloud storage service, and you may need to reconnect if some time has passed since you last used Rclone. Below is the general setup process, with sections for the specific cloud storage providers used in practice so far. If there is a cloud storage provider you need to access that isn't covered below, please contact the digital archivist.
This general setup will work with Dropbox.
Type "rclone config" and hit enter
In general, we use Rclone for transferring files from cloud services. When possible, we also use it to confirm the fixity of the downloaded files.
The command to copy files is fairly simple: you specify that you want to copy the files, enter their location, and then their destination. For instance:
rclone copy [name of remote as set up above]:[name of folder or file (if there are spaces in the name, put quotation marks around everything after the colon)] [/path/to/destination/folder/originalname, e.g. the processing folder; if you want to retain the original folder name, enter it here, quoted if there are spaces in it]
Here is an example:
rclone copy dropbox:"Radhika Nahpal INT" "/media/sf_BCShared01/processing/2022_061acc/Radhika Nahpal INT"
When transferring files from Google Drive that do not include Google objects (such as Docs, Sheets, and Slides), and additional analysis will most likely not be needed (for example, a small transfer of Word documents), you can direct the output of Rclone to a folder that aligns with Archivematica's standard packaging structure. This will save some work later when preparing for Archivematica. Here is an example:
rclone copy googledrive:"Radhika Nahpal INT" "/media/sf_BCShared01/processing/2022_061acc/objects/Radhika Nahpal INT"
If content has been shared with you and is not in your Google Drive, you can use the --drive-shared-with-me flag to look in that area for the content instead.
rclone copy googledrive:"Radhika Nahpal INT" --drive-shared-with-me "/media/sf_BCShared01/processing/2022_061acc/objects/Radhika Nahpal INT"
Additionally, when transferring content from Google Drive, there may be Google objects (Docs, Sheets, Slides, etc.). Because Rclone cannot tell the size of these files, they are all listed as having a file size of -1, so you can check for them by listing (ls) the content with max-size set to 0. Additionally, some formats (such as Forms) cannot be exported by Rclone and are not listed by default, so we also add the --drive-show-all-gdocs flag.
Here is an example:
rclone ls googledrive:"Radhika Nahpal INT" --drive-show-all-gdocs --max-size 0
Once you have your list of Google object files, you can assess how to export them. <Add more info here on that>
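If you save the output of the ls command above to a text file, a short script can pull out just the Google object entries for review. This is a minimal sketch, not part of the official workflow; it only assumes the standard rclone ls output format of "<size> <path>" per line, with Google objects reporting a size of -1. The sample paths are made up for illustration.

```python
def find_google_objects(ls_output: str) -> list[str]:
    """Return paths whose reported size is -1 (Google Docs/Sheets/Slides, etc.)."""
    paths = []
    for line in ls_output.splitlines():
        line = line.strip()
        if not line:
            continue
        # rclone ls prints the size, a space, then the path
        size, _, path = line.partition(" ")
        if size == "-1":
            paths.append(path)
    return paths

# Hypothetical rclone ls output for illustration
sample = (
    "       -1 Interview notes.gdoc\n"
    "   204800 Radhika Nahpal INT/audio.wav\n"
    "       -1 Consent form responses.gsheet\n"
)
print(find_google_objects(sample))
# ['Interview notes.gdoc', 'Consent form responses.gsheet']
```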
Some cloud providers store checksums in their system that you can extract to facilitate fixity checking. Some are unique to their system, while others are more standard types. Here is the general layout of the command to extract the checksums into a text file:
rclone hashsum [type of checksum] [remote source]:"folder_name or file" (same as used when copying) --output-file /path/to/output/file.txt
Here is an example for Dropbox:
rclone hashsum dropbox dropbox:"Radhika Nahpal INT" --output-file /media/sf_BCShared01/processing/2022_061acc/submissionDocumentation/dropbox_checksums.txt
Here is an example for OneDrive or SharePoint:
rclone hashsum quickxor onedrive:"Radhika Nahpal INT" --output-file /media/sf_BCShared01/processing/2022_061acc/submissionDocumentation/onedrive_checksums.txt
Here is an example for Google Drive. Because you can reuse MD5 checksums in Archivematica, we name the checksum file and store it in Archivematica's standard packaging and naming structure:
rclone hashsum md5 googledrive:"Radhika Nahpal INT" --output-file /media/sf_BCShared01/processing/2022_061acc/submissionDocumentation/checksum.md5
Google objects, such as Docs, Sheets, and Slides, do not have checksums stored in Google Drive that can be extracted. If you have any of these in the content you're transferring, they will be downloaded as regular files, but they will not have checksums in the checksum file extracted from Google Drive. In these cases, we will not reuse the checksum file we create here in Archivematica, and it should be named googledrive_checksums.txt in a location of your choosing.
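To see which downloaded files lack an entry in the extracted checksum file (typically the exported Google objects), you can compare the paths in the checksum file against the paths you copied down. This is a hedged sketch with made-up file names; it only assumes the rclone hashsum output format of "<hash>  <relative path>" (two spaces) per line.

```python
def uncovered_paths(sumfile_text: str, relative_paths) -> list[str]:
    """Return the relative paths that have no checksum entry in the sum file."""
    covered = {
        line.split("  ", 1)[1]
        for line in sumfile_text.splitlines()
        if "  " in line
    }
    return sorted(p for p in relative_paths if p not in covered)

# Hypothetical checksum file contents and downloaded file list
sumfile_text = "0cc175b9c0f1b6a831c399e269772661  audio.wav\n"
downloaded = ["audio.wav", "Interview notes.docx"]
print(uncovered_paths(sumfile_text, downloaded))
# ['Interview notes.docx']
```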
In order to confirm fixity, there are a number of options:
Confirm using the checksums you extracted in the steps above:
rclone checksum [checksum type] /path/to/checksum/file.txt /path/to/local_directory/of/copied_files
Here is an example for Dropbox:
rclone checksum dropbox /media/sf_BCShared01/processing/2022_061acc/submissionDocumentation/dropbox_checksums.txt "/media/sf_BCShared01/processing/2022_061acc/Radhika Nahpal INT"
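For MD5 checksum files, the same verification can be done with a short script, which mirrors what rclone checksum does locally. This is a minimal sketch under the assumption that the sum file uses "<md5>  <relative path>" lines; the demo file and its contents are hypothetical.

```python
import hashlib
import os
import tempfile

def verify_md5_sumfile(sumfile: str, base_dir: str) -> dict[str, bool]:
    """Recompute MD5s for the files listed in sumfile and compare to the stored values."""
    results = {}
    with open(sumfile, encoding="utf-8") as f:
        for line in f:
            if "  " not in line:
                continue
            expected, rel = line.rstrip("\n").split("  ", 1)
            with open(os.path.join(base_dir, rel), "rb") as fh:
                actual = hashlib.md5(fh.read()).hexdigest()
            results[rel] = (actual == expected)
    return results

# Self-contained demo: make a file, record its MD5, then verify it
with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "example.txt"), "wb") as fh:
        fh.write(b"fixity demo")
    digest = hashlib.md5(b"fixity demo").hexdigest()
    sumfile = os.path.join(tmp, "checksum.md5")
    with open(sumfile, "w", encoding="utf-8") as fh:
        fh.write(f"{digest}  example.txt\n")
    demo_result = verify_md5_sumfile(sumfile, tmp)
print(demo_result)
# {'example.txt': True}
```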
Confirm without locally extracted checksums, using those that Rclone generates:
rclone check [remote name]:[source folder] /path/to/local_copy/of/source_folder
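When both copies are already local (for example, comparing a working copy against an earlier download), a script can do an equivalent tree comparison by hashing every file, much as rclone check compares source and destination. This is a hedged sketch, reasonable for small transfers; the demo files are hypothetical.

```python
import hashlib
import os
import shutil
import tempfile

def md5_tree(base_dir: str) -> dict[str, str]:
    """Map each file's relative path to its MD5 digest."""
    tree = {}
    for root, _dirs, files in os.walk(base_dir):
        for name in files:
            full = os.path.join(root, name)
            rel = os.path.relpath(full, base_dir)
            with open(full, "rb") as fh:
                tree[rel] = hashlib.md5(fh.read()).hexdigest()
    return tree

# Self-contained demo: copy a file between two directories and compare trees
with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as dst:
    with open(os.path.join(src, "a.txt"), "wb") as fh:
        fh.write(b"hello")
    shutil.copy(os.path.join(src, "a.txt"), os.path.join(dst, "a.txt"))
    trees_match = md5_tree(src) == md5_tree(dst)
print(trees_match)
# True
```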
While Rclone exports the files from the cloud provider, under most circumstances it doesn't perform the packaging or analysis needed for processing the files. Proceed to the Logical Transfer section for the next steps in processing this content.