Sync to cloud stalls

I have a few hundred GB of files I'm using odrive to encrypt and sync to the cloud (Amazon Cloud Drive).

I’ve synced about 190GB so far, but each day I check ACD and see that no files have been synced for a few hours. I check my Mac doing the sync and, although odrive is running with (right now) 10 syncing and 60 waiting, none of the percentages on the 10 files are moving, and presumably haven’t for several hours, as it’s been 4 hours since a file last landed on ACD.

The last release (I’m running prod 6057) had notes about a fix for a problem that sounded like mine, so I was excited to try it, but I have seen my issue repeat over several days after upgrading. My workaround is to quit odrive and restart; then, after maybe an hour, it will start uploading to ACD again. This routine of constantly checking in is getting old, so I’m posting my problem here hoping for guidance.

Hi @ben1,
Can you send a diagnostic from the odrive tray menu the next time this occurs and then ping this thread? I can take a look.

Thanks!

I just sent the diagnostic report.

Hi @ben1,
Unfortunately the diagnostic didn’t go through, which may indicate that there is a breakdown in the app’s ability to make remote calls after a certain point.

When this happens, are you able to download an unsynced file (a .cloud file)? If not, what error does it give you?

Also, the next time this happens, let’s see if the CLI can talk to odrive and get a status.

You can copy and paste these commands into a command prompt on Windows to run (a rough Mac equivalent is sketched after them).

This first command will download the odrive CLI for windows:

powershell -command "& {$comm_bin=\"$HOME\.odrive\common\bin\";$o_cli_bin=\"$comm_bin\odrive.exe\";(New-Object System.Net.WebClient).DownloadFile(\"https://dl.odrive.com/odrivecli-win\", \"$comm_bin\oc.zip\");$shl=new-object -com shell.application; $shl.namespace(\"$comm_bin\").copyhere($shl.namespace(\"$comm_bin\oc.zip\").items(),0x10);del \"$comm_bin\oc.zip\";}"

This second command will query the odrive desktop status:

powershell -command "& {$comm_bin=\"$HOME\.odrive\common\bin\";$o_cli_bin=\"$comm_bin\odrive.exe\";$(& \"$o_cli_bin\" status);}"
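Since you’re on a Mac, the rough equivalent from a Terminal session would be something like the following. This is only a sketch: it assumes the Mac CLI bundle is published at https://dl.odrive.com/odrivecli-mac and that it unpacks an odrive.py into the same ~/.odrive/common/bin location used on Windows above.

o_bin="$HOME/.odrive/common/bin"
mkdir -p "$o_bin"
# download and unpack the CLI bundle (assumed Mac endpoint mirroring the Windows one)
curl -L "https://dl.odrive.com/odrivecli-mac" -o "$o_bin/oc.zip"
unzip -o "$o_bin/oc.zip" -d "$o_bin" && rm "$o_bin/oc.zip"
# ask the running odrive desktop client for its current status
python "$o_bin/odrive.py" status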

It seems to be going okay so far: 10 syncing, 270 waiting. But it’s using 5GB of memory and 90+% of the CPU. It’s at 215GB uploaded to ACD, 25 more than last check. It will probably fail in a few hours, and when it does I’ll try to resend a diagnostic, download a .cloud file, and run your Mac-compatible command.

With the high memory and CPU usage, I sent a diagnostic report. It’s still able to sync files, so I’m guessing you’ll get this report. Maybe it spins out of control and then stops. Since it looks close to using all the RAM and CPU, I thought now would be a good time to send a report.

I’m not sure this report went through either, though; I didn’t get the popup asking me to confirm sending the report that I remember getting last time…

FYI - another thing I did was turn off automatic sync, and the CPU went down from 102% to 3%. Trying to send a diagnostic still gave no popup/warning/confirmation - so either you got 2+, or none. The “Syncing…” list still says 10, but the waiting count is now 302 - so it’s churning, just not over the net.

After stopping auto sync and watching the CPU go down, I enabled auto sync again (all this without quitting). I saw no files upload to ACD (looking at ACD’s recent activity). I also tried to send another diagnostic, but got the same behavior.

I then ran odrive.py status and got the output below:

odrive Make Cloud Storage THE WAY IT SHOULD BE.

isActivated: True hasSession: True
email: XXXX accountType: Google
syncEnabled: True version: prod 6057
placeholderThreshold: neverDownload autoUnsyncThreshold: never
downloadThrottlingThreshold: unlimited uploadThrottlingThreshold: unlimited
autoTrashThreshold: never Mounts: 1
xlThreshold: never Backups: 0

Sync Requests: 1
Background Requests: 10
Uploads: 2
Downloads: 0
Trash: 0
Waiting: 302
Not Allowed: 0

Hi @ben1,
Unfortunately none of the diags made it through. Can you restart odrive and then send a diagnostic after it starts again?

Also, can you tell me approximately how many files and folders you have in your local odrive folder?

Thanks!

I just sent it after a restart.

Size is probably 2TB. Number of files? No idea.

This diagnostic didn’t make it through either, unfortunately.

Can you right-click and select “properties” on your odrive folder so we can see how many files/folders you have in there? If you have any “sync to odrive” folders you would also want to right-click -> properties on them as well. I have a feeling this could be related to data scale.

Can you also tell me where your odrive folder is located and, if you have any “sync to odrive” folders, where they are located?

Thanks!

Right-clicking on the odrive folder and showing properties shows:
390,506 bytes (414 KB on disk) for 10 items

But I know this isn’t right. I have, for example, a folder at /Users/bstout/odrive/Encryptor/encrypt/ben that is a symlink to /Volumes/Drobo5N/drb144401a00449/1/ben.

That one symlink to the Drobo makes up 537,701,400,576 bytes (537.7 GB on disk). Previously I had a few folders symlinked to the Drobo in the encrypted odrive folder so I could encrypt all my NAS stuff. But thinking that was too much for odrive to scan at once, I now have only the one folder ‘ben’ above and am trying to sync just that. But that, too, is failing.
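For reference, the link itself was set up with something like this (reconstructed from the paths above):

# symlink the NAS share on the Drobo into odrive's encrypt folder
ln -s "/Volumes/Drobo5N/drb144401a00449/1/ben" "/Users/bstout/odrive/Encryptor/encrypt/ben"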

Hi @ben1,
To see the number of files and folders we are talking about, you can run these two commands from a Terminal session:
For files:
find "/Users/bstout/odrive/Encryptor/encrypt/ben" -type f | wc -l

For folders:
find "/Users/bstout/odrive/Encryptor/encrypt/ben" -type d | wc -l

This will tell us how many objects we are talking about.
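If you also want the total size as seen through the symlink (Finder’s properties dialog won’t follow it, as you saw), something like the BSD du that ships with macOS should give a rough number; -L is what tells it to follow symbolic links:

# -s summarizes, -h is human-readable, -L follows the symlink so the Drobo target is measured
du -shL "/Users/bstout/odrive/Encryptor/encrypt/ben"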

Unfortunately, the problem may be related to the symlinking, especially since it is to a network volume. We do not officially support symlinking like this. Here is a post on some of the reasons why:

I understand why you set it up this way, however. We do not support “sync to odrive” for Encryption yet, which makes what you are doing impossible using odrive’s officially supported methods. Our next version of encryption is slated to have this feature, though.

Is your main use case encrypted backup of your NAS to Amazon Drive?

Yes, that’s the use case: sync an encrypted copy of my NAS data to ACD. The NAS drive isn’t local, so a symlink was the only way. However, the NAS is a Drobo and may be able to run the CLI - but I haven’t dug into that yet.

BTW: the counts are:
directories: 217,686
files: 1,033,627

Thanks @ben1,
I think the combination of symlink + network volume + 1.25 million objects being tracked is the major factor here, unfortunately.

I suppose the fact that it works for a while before stopping at least lets you make forward progress, but I know it is not a real solution.

Better support for your use case is coming, but it’s going to take a little bit.