Running the CLI tool on Linux

I followed the guide and ran the following:

"$HOME/.odrive-agent/bin/odrive" sync "$HOME/odrive-agent-mount/Google Drive.cloudf" --recursive --nodownload

This added the folder within odrive-agent-mount and created all of the placeholder files.

My goal is to download everything from Google Drive to my local machine.

I ran the following:

"$HOME/.odrive-agent/bin/odrive" sync "$HOME/odrive-agent-mount/Google Drive" --recursive

It's downloading files, but it's taking way too long… just downloading a small xl file takes about 20 seconds.

Am I doing something wrong?

In addition, I am getting tons of Google errors and then the sync stops.

Also, how do I sync to a USB drive?

Hi @damianhenry,
Can you post the errors that you are seeing?

Can you also use the diagnostics command to send a diagnostic so that I can take a closer look?
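
For reference, the diagnostic can be sent straight from the agent CLI, assuming the default install location:

"$HOME/.odrive-agent/bin/odrive" diagnostics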

You would need to create a mount on the USB drive using the mount command.
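
For example, something along these lines (the USB path is just a placeholder to adjust to wherever your drive is mounted, and the trailing / maps the mount to the root of your odrive):

mkdir -p "/media/<user>/<usb-drive>/odrive-agent-mount"
"$HOME/.odrive-agent/bin/odrive" mount "/media/<user>/<usb-drive>/odrive-agent-mount" /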

So right now I am pulling down from Amazon. Everything is running, but does it take a long time to download files from Amazon?

Just watching the images come down, it's taking about 3 to 4 seconds per photo. Does that seem right?

The images are 2 to 5 MB in size.

Hi @damianhenry,
That is within the expected range. The --recursive option is single-threaded, so it is not ideal for large bulk downloads like the one you are doing. You can use a one-line script to speed this up by running several sync processes concurrently.

Example command:
exec 6>&1;num_procs=4;output="go"; while [ "$output" ]; do output=$(find "$HOME/odrive-agent-mount/Amazon Drive" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs python "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done

This will use 4 concurrent processes to perform the download in the given folder (in this case $HOME/odrive-agent-mount/Amazon Drive).
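
For readability, here is the same one-liner broken out with comments (identical logic, just reformatted; the folder path is still an example to adjust):

exec 6>&1                 # duplicate stdout so progress from inside the loop is still shown
num_procs=4               # number of concurrent sync processes
output="go"
while [ "$output" ]; do
    # find any remaining placeholder files (*.cloud / *.cloudf) and sync them,
    # up to $num_procs at a time; the loop ends once find returns nothing
    output=$(find "$HOME/odrive-agent-mount/Amazon Drive" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs python "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6)
done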

Yes, I want to run the command, but how do I do this? I am a newbie, so I am sorry for the questions. I can tell you I have successfully pulled down a 50 GB folder from Amazon with the CLI tool… so far so good. My attempts with other programs and Amazon's tool failed.

Hi @damianhenry,
You can just copy and paste the exact command I have above into the terminal. You will just want to make sure the folder targeted for download is correct. Right now it is "$HOME/odrive-agent-mount/Amazon Drive", so you will need to change that path if it doesn't match your setup.

What version of Linux are you running?

Mint 20 Cinnamon version 4.67, Linux kernel 5.4.0-60

Getting this:

xargs: python: No such file or directory

"$HOME/.odrive-agent/bin/odrive" mount "$/media/Joe/USB STICK/odrive-agent-mount" /

Getting this error:
usage: odrive mount [-h] localPath remotePath
odrive mount: error: too few arguments

I am trying to mount to a USB hard drive.

Does the python3 command work? If so:

exec 6>&1;num_procs=4;output="go"; while [ "$output" ]; do output=$(find "$HOME/odrive-agent-mount/Amazon Drive" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs python3 "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done
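
One quick way to check whether python3 is available:

command -v python3 && python3 --version   # prints the path and version if python3 is installed; no output means it is not on your PATH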

The path you are using looks wrong (the $ at the beginning). Instead, is it just /media/Joe/USB STICK/odrive-agent-mount? So:
"$HOME/.odrive-agent/bin/odrive" mount "/media/Joe/USB STICK/odrive-agent-mount" /

"$HOME/.odrive-agent/bin/odrive" sync "/media/bob/MAGA/odrive-agent-mount/Amazon Drive.cloudf" --recursive --nodownload

Is this going through the entire Amazon Drive folder on Amazon and creating placeholder files on my local machine? If so, and I want to download everything, why wouldn't I just run it recursively?

I am getting SyntaxError: invalid syntax

exec 6>&1;num_procs=4;output=“go”; while [ “$output” ]; do output=$(find “/media/bob/Doe/odrive-agent-mount/Amazon Drive” -name “.cloud” -print0 | xargs -0 -n 1 -P $num_procs python3 “$HOME/.odrive-agent/bin/odrive.py” sync | tee /dev/fd/6); done

I think it's this part: odrive.py"

Yes, it is just exposing the entire structure, but not downloading anything. If you wanted to download, you would remove the --nodownload parameter. For your bulk downloading needs, however, the script below will be faster.

exec 6>&1;num_procs=4;output="go"; while [ "$output" ]; do output=$(find "/media/bob/Doe/odrive-agent-mount/Amazon Drive" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs python3 "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done

Try this command, exactly as it is above (just copy, paste, and hit enter). Copying from the forum here can get a little weird because the quotes are converted into fancy opposing quotes (“”), which won’t work within the terminal. The commands I am pasting in are designated as preformatted text, so that they are shown exactly as they should be (this is done by adding a ` character to the beginning and end of the line).
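
If you do end up with a command that contains the fancy quotes, one way to normalize it (a sketch, assuming you have saved the command into a file called cmd.sh) is:

sed -i 's/“/"/g; s/”/"/g' cmd.sh   # replace curly double quotes with straight quotes, in place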

If you are still getting an error, can you take a screenshot of the terminal with the command and the error output?

It looks like it is ripping now. This is the way this should work out of the box… this is light years better than before.

The only thing I see is that while the files are processing, some show "remaining (4%)" etc.
Is that because it's moving so fast that it doesn't get to 100% before moving on?

Also, if the system crashes or something happens and I have to run the command again, will it just pick up where it left off?

Also, how much can you tweak that script to get more speed?

BTW, thank you for the help!

Hi @damianhenry,

Yeah for smaller files it is likely going too fast to show the full progression. You can tail the main log to see the sync actions:
tail -F "$HOME/.odrive-agent/log/main.log"
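
If you just want to watch for problems, you could also filter that, for example (the "error" pattern is just a guess at what to look for):

tail -F "$HOME/.odrive-agent/log/main.log" | grep -i error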

Yes, if you run the command again it will just pick up where it left off.

You can increase num_procs=4 to num_procs=6 or num_procs=8, for example. The risk is that Amazon will start to throw rate-limit errors, and they can get progressively worse if you continue after they start, so I would increase gradually and be on the lookout for any errors they start throwing.
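
For example, the same command with eight workers (only the num_procs value changes; everything else stays identical):

exec 6>&1;num_procs=8;output="go"; while [ "$output" ]; do output=$(find "/media/bob/Doe/odrive-agent-mount/Amazon Drive" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs python3 "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done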

I think I will just leave it be; it's blazing fast…

I am sure you know this, but there are many companies facilitating the transfer of data from Amazon to various newer storage providers, e.g. Wasabi. Many of them are charging big money to handle this transfer. This tool would be a great solution for a growing market.

I tried so many tools on macOS and Windows, and I just kept getting errors and slow downloads. This tool just keeps chugging along. :slight_smile:


I’m glad to hear things are looking good now!

I appreciate the patience on this and the feedback. I have been wanting to create some “simple-ish” scripts/utilities to aid folks with migration from one storage to another (handle the download from one and upload to another automatically), but I haven’t had a chance to focus on it yet, unfortunately.

So how can I set this up to do backups all the time? I want to set up my local and remote folders and have them sync at all times. I want the process to start when I start my computer.

Do I need to purchase the software to do these things?

Hi @damianhenry,
You can run odriveagent automatically a few different ways. There was a post here where someone used what looked like a simple solution on Mint:

Another option would be to create a systemd service, which is a bit more involved.
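
As a rough sketch only (the unit name is made up and the path assumes the default agent install location), a systemd user service saved as ~/.config/systemd/user/odriveagent.service could look like this:

[Unit]
Description=odrive sync agent

[Service]
# %h expands to your home directory; path assumes the default agent install
ExecStart=%h/.odrive-agent/bin/odriveagent
Restart=on-failure

[Install]
WantedBy=default.target

You would then enable it with systemctl --user enable --now odriveagent.service.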

When odriveagent is running, it will automatically upload items that it finds during its routine local scans.

For downloads, you could add a cron job (crontab -e) that runs periodically and recursively downloads any files it finds, set up like this (it will run every hour):
0 * * * * "$HOME/.odrive-agent/bin/odrive" sync "/media/bob/MAGA/odrive-agent-mount/Amazon Drive" --recursive
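
If you want to keep a record of those runs, you could redirect the output to a log file (the log path here is just an example):

0 * * * * "$HOME/.odrive-agent/bin/odrive" sync "/media/bob/MAGA/odrive-agent-mount/Amazon Drive" --recursive >> "$HOME/odrive-cron.log" 2>&1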

The above stuff doesn’t require a Premium subscription.

Tony, I am seeing downloads like the following (I had to restart the script):

Hi @damianhenry,
Are you seeing the downloads hang? Seeing the progress count up multiple times is expected for larger files.

What were you seeing when tailing the main.log?

This is what I see: the files that are listed here show up in the main.log.

But most just say "remaining".