I have a very large (about 20 TB) repository for my business on Amazon Cloud Drive.
Since they announced they are ending ‘unlimited’ plans, I need to download everything and upload it to the Dropbox Pro account I recently paid for.
But here’s the problem: every time I force a folder sync including subfolders, it works for a bit, then suddenly stops syncing with no error. I have to force-sync file by file and subfolder by subfolder, which is insane.
Hi @armisoft,
Are you also sliding the slider all the way to the right for “Everything”? If so, can you do this again and then send a diagnostic from the odrive menu so I can take a look?
Hi @armisoft,
Let’s try using the CLI to force a sync instead, then.
To use the CLI commands from Mac:
Open a terminal session (type “terminal” in Spotlight search):
Run the following command in the terminal session (copy & paste + Enter):
```
exec 6>&1;num_procs=3;output="go"; while [ "$output" ]; do output=$(find "[path to folder here]" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs python $(ls -d "$HOME/.odrive/bin/"*/ | tail -1)odrive.py sync | tee /dev/fd/6); done
```
Change “[path to folder here]” to the proper path of your “sync to odrive” folder.
It is an ugly one-liner, but the above command will download everything in the folder using 3 concurrent workers. It won’t stop until everything has been downloaded.
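To make the loop easier to follow, here is a self-contained toy version of what the one-liner does. This is not odrive’s script: `fake_sync` is a stand-in for the real `python odrive.py sync` call (which turns a `.cloud` placeholder into the real file), and the demo folder is a temp directory, so it can run anywhere:

```shell
# Toy sketch of the one-liner's loop. In odrive, unsynced items appear as
# "*.cloud" placeholder files; here fake_sync stands in for the real
# "python odrive.py sync" call and just strips the suffix.
demo="$(mktemp -d)"
for i in 1 2 3; do : > "$demo/file$i.txt.cloud"; done

fake_sync() {
  mv "$1" "${1%.cloud}"          # "download": replace placeholder with file
  echo "synced: ${1%.cloud}"
}

# Same shape as the one-liner: keep sweeping the tree while a pass still
# finds any placeholder files; stop on the first pass that finds none.
output="go"
while [ -n "$output" ]; do
  output=""
  for f in "$demo"/*.cloud; do
    [ -e "$f" ] || continue
    output="$output$(fake_sync "$f")"
  done
done

result="$(ls "$demo")"
echo "$result"                   # the three files, no ".cloud" placeholders left
rm -rf "$demo"
```

The real command does the same sweep, but uses `find`/`xargs -P 3` to run three sync workers in parallel and `tee /dev/fd/6` so you still see progress while the output is captured for the loop test.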
Hi @tommysuriwong,
The UI will abort if it hits enough exceptions from the storage service, or “critical” errors, as it tries to be more intelligent and not just blindly retry indefinitely. Amazon Drive has a higher rate of exceptions than some other services, so you end up in this situation more often.
The CLI script is a brute force method, where it will just try and try forever. It is not elegant, but it can work for certain cases like this, where you are wanting to download lots and lots of content.
Hi @jonatan.rueda,
The desktop client will work just fine as long as there aren’t too many exceptions returned by the storage. The amount of data and the type of storage service affect the likelihood of the operation aborting, but it is definitely possible to get through large data sets in one attempt with the desktop client. The CLI method just brute forces it, no matter what. This can be a problem if you hit an issue that isn’t temporary, so you need to keep an eye on things with the CLI method to make sure it doesn’t just hammer on the service forever.
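One way to keep the brute-force loop from hammering the service forever is to cap the number of passes. This is a sketch, not an odrive feature: `SYNC_ROOT` and `MAX_PASSES` are placeholders you would set yourself, and the `odrive.py` path discovery is copied from the one-liner above:

```shell
# Sketch: the same brute-force sweep, but capped at MAX_PASSES so a
# permanent (non-temporary) error can't keep it retrying forever.
# SYNC_ROOT and MAX_PASSES are placeholders, not odrive settings.
SYNC_ROOT="[path to folder here]"
MAX_PASSES=50
ODRIVE_PY="$(ls -d "$HOME/.odrive/bin/"*/ 2>/dev/null | tail -1)odrive.py"

pass=0
output="go"
while [ -n "$output" ] && [ "$pass" -lt "$MAX_PASSES" ]; do
  pass=$((pass + 1))
  output="$(find "$SYNC_ROOT" -name "*.cloud*" -print0 2>/dev/null \
    | xargs -0 -n 1 -P 3 python "$ODRIVE_PY" sync 2>/dev/null)"
done

if [ "$pass" -ge "$MAX_PASSES" ]; then
  echo "Stopped after $MAX_PASSES passes; something may be failing permanently." >&2
else
  echo "Done after $pass pass(es)."
fi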
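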
Got it, thanks. I’m trying to pull all my data off Amazon Cloud Drive; the service really does suck as far as speed and interfaces go. BTW, the command-line process died and I’m trying to restart it:
When I do this, I just get a “>” at the prompt and nothing happens. I’m able to type into it, but it looks like it’s still waiting for input. Am I doing something wrong?
Hi @tommysuriwong,
If you are seeing the ‘>’, that means there is something wrong with the command, like it’s missing a terminating quote, for example.
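For what it’s worth, the ‘>’ is the shell’s continuation prompt: it thinks the command is unfinished and is waiting for the closing quote (press Ctrl-C to abandon it). As a sketch, you can pre-check a pasted command for unbalanced quotes with `bash -n`, which parses without executing; `check_quoting` here is just an illustrative helper name, not an odrive tool:

```shell
# "bash -n" parses a command without running it. An unterminated quote
# fails the parse -- the same condition that leaves the interactive shell
# sitting at the ">" continuation prompt.
check_quoting() {
  printf '%s\n' "$1" | bash -n 2>/dev/null && echo balanced || echo unbalanced
}

check_quoting 'echo "hello"'   # balanced
check_quoting 'echo "hello'    # unbalanced (this is what ">" means)
```

If the check says unbalanced, the paste lost a quote somewhere; re-copy the one-liner and make sure both quotes around each argument survived.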
Pasting the command into the forum without wrapping it in </> mangles it, so it’s hard to say what the problem is from the paste above. Let’s try this: