The Archives Transfer Tool is currently a Dropbox-based app for donors to transfer digital material (often MIT community groups and members).
The system is a small collection of platforms/tools:
- a Dropbox Shared Folder into which files can be saved by end users
- a folder on the NAS (Network Attached Storage) for preparation work on the files before ingest into preservation
- a Python script that moves files and metadata from Dropbox to the NAS
Setting up a transfer
This step is done synchronously with the collections team. Once they create a submission agreement with the group, take the following steps:
If this is a group's first transfer, create a subfolder in the Dropbox shared folder linked above that will receive content, then continue with the steps below. If the group has done at least one transfer already, its subfolder and metadata files are already in place; skip ahead to directing the user to transfer files.
Add the authorized user who will be doing the transfer as an editor of this subfolder.
Download the sample_default_metadata.json file and rename it to default_metadata.json.
Use a text editor to update the file with the information from the submission agreement that applies to all of the files that will be uploaded to the subfolder.
- You can edit or delete any field except the following, which will be overwritten by the tool as part of the transfer process:
  - "Transfer Date"
  - "Beginning Year"
  - "Ending Year"
  - "Description"
  - "Dropbox SHA256"
- If a field is not used (for instance, "campus-address"), delete the whole line
- Save the JSON file and upload it to the Dropbox subfolder for the transfer
- Download the sample_csv.csv file and upload it to the Dropbox subfolder for the transfer. This will be used by the person transferring the files to submit metadata.
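Putting the pieces together, a default_metadata.json for a hypothetical transfer might look like the sketch below. Only the field names shown here appear in this document; all of the values are placeholders, and the actual sample file may contain additional fields. Always start from the downloaded sample_default_metadata.json rather than this sketch.

```json
{
  "Transfer Date": "",
  "Beginning Year": "",
  "Ending Year": "",
  "Description": "",
  "Dropbox SHA256": "",
  "campus-address": "77 Massachusetts Ave, Cambridge, MA 02139"
}
```

The first five fields can be left empty because the tool overwrites them during the transfer. If "campus-address" does not apply, delete that whole line (and the comma ending the line above it) so the file remains valid JSON.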
On the NAS, create a subfolder within cdps-processing/ATT to receive the content copied by the Python script from Dropbox. Name it the same as the Dropbox subfolder.
Direct a user to follow these steps to transfer files.
When DDC has been notified that an authorized group transferred a set of files, confirm that all of the files have been uploaded to the associated folder in Dropbox.
Files now need to be downloaded from Dropbox to the NAS using a Python script before accessioning or processing can begin. From the steps below onward, the process is handed over fully to the DDC processing team.
Getting situated with the ATT Python app
Log in to Dropbox in your default web browser
Download the latest version of the ATT repository to your workstation (you will need Python 3.x installed on your computer) and follow the instructions in the README.
Mount the cdps-processing share on your workstation.
Moving the files
- Open Terminal (Mac/Linux) or PowerShell (Windows) and navigate to the root of the downloaded copy of the interim-att folder. Make sure that your environment is properly configured:
  - you should have a .env file in the folder (see the README for details)
  - you should have run make install (or make update) to get the rest of the environment configured
- Run the “check” command (pipenv run att check) to verify that the script can connect to Dropbox and to the NAS:

      interim-att % pipenv run att check
      Loading .env environment variables...
      1. Go to: https://www.dropbox.com/oauth2/authorize?response_type=code&client_id=6pvw4x2047kxu4d&token_access_type=offline&code_challenge=<challenge_code>&code_challenge_method=S256
      2. Click "Allow" (you might have to log in first).
      3. Copy the authorization code.
      Enter the authorization code here: <auth_code_from_browser>
      2025-06-06 12:04:46,793 INFO att.cli.check(): Successful Dropbox OAuth via PKCE
      2025-06-06 12:04:47,775 INFO att.cli.check(): SUCCESS: Connected to MIT Dropbox
      2025-06-06 12:04:47,777 INFO att.cli.check(): SUCCESS: NAS Folder is connected.
- Run the “bulk-file-copy” command (pipenv run att bulk-file-copy), passing the name of the CSV file (--remote-csv "DropBoxFolderName/csv_name.csv") as the required argument. You will need to include the Dropbox subfolder you are transferring from in the path to the CSV, for instance AMITA/sample_csv.csv:

      interim-att % pipenv run att bulk-file-copy --remote-csv "AMITA/sample_csv.csv" initial-application
      Loading .env environment variables...
- Navigate to the ATT folder in the cdps-processing share to verify that the files were copied and that the metadata files and checksum manifest files were generated in the correct subfolder.
Packaging the files for further processing
The downloaded folder is not organized in a way that allows for processing in Archivematica. To prepare it, you will need to use another packaging tool such as SIPCreator or DART. You can follow the steps on those pages, with the following specifics for content from ATT:
- When packaging the files, select the content files only
- Once packaged, add the original metadata files and manifest files to the submissionDocumentation folder as outlined in the section of the tool you used (SIPCreator or DART)
- Once complete, follow the next steps outlined in the respective tool's documentation.
- Once fully complete, clean out the subfolder in Dropbox. When cleaning up the subfolder, make sure that you leave behind the default_metadata.json and sample_csv.csv files!