I have a folder on an external drive that doesn’t exist in the cloud yet, so I wanted to use odrive to sync it to my Amazon Cloud Drive by right-clicking and selecting “Sync with odrive”.
Once the folder started syncing, it got stuck on a symlink and was syncing extremely slowly. The symlink is at the root level of the folder and is called “CDs”; it just points to another root-level subfolder called “Packs” (because I have some old project files that look for media in a folder called “CDs”).
odrive was syncing “CDs” as though it were a real folder (it was the only “folder” showing a pink sync icon in Finder) and it was uploading files from the “Packs” folder as you’d expect (the Packs folder itself wasn’t showing any sync icon yet in Finder), but it was going very very slowly.
I’ve just deleted the symlink and now it’s uploading normally, but is this behaviour expected?
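In case it helps anyone hitting the same thing: before pointing odrive at a folder, you can check whether it contains any symlinks that might trip up the sync. A minimal sketch (the path is a placeholder for your own external-drive folder):

```shell
# List any symbolic links inside the folder before syncing it,
# so links like "CDs" -> "Packs" can be spotted (or removed) up front.
DIR="${DIR:-/Volumes/ExternalDrive/MyFolder}"   # placeholder path

if [ -d "$DIR" ]; then
  # -type l matches symbolic links only, not regular files or folders
  find "$DIR" -type l
fi
```

Each path printed is a symlink that odrive may treat as a real folder.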
I would need to look at a diagnostic when this is occurring. As with the accents topic you posted yesterday, the actual transfer of a single file’s data should be completely unaffected by its name or structure. It is possible, however, that some exceptions are being hit somewhere else, causing total progress to slow.
If you are able to repro and send a diagnostic from the odrive menu, I can take a look.
My initial fix was to just delete the symlink (it’s only there to cover a handful of projects, for which there’s a workaround anyway) and reboot, and everything proceeded to sync fine after that…
…at least initially, but then this morning I noticed syncing had ground to a halt again in a similar way - nothing much actually uploading, and a large number of “Waiting” list items in the odrive menubar item. I just rebooted my computer and everything is proceeding fine again now.
I think the issue is that something is causing odrive to jam up after 12-24 hours of syncing this particular folder. It’s 1.3 TB, 1,314,271 items.
That amount of data will certainly take its toll as odrive tries to scan and sync it. That scale is something we are working on handling better in our next major revision, but it’s always going to be tough to actively monitor this many objects.
If you are able to progressively drop the data in (sync in chunks), instead of adding it all at once, this can mitigate the effect.
Are you able to tell approximately how far through the data you’ve made it, so far?
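To illustrate the chunked approach, one way is to keep the bulk of the data in a staging folder outside odrive and move one top-level subfolder at a time into the synced folder, letting each chunk finish uploading before moving the next. A rough sketch with placeholder paths:

```shell
# Sketch: feed data to odrive in chunks rather than all at once.
# Both paths below are placeholders for your own setup.
STAGING="${STAGING:-/Volumes/ExternalDrive/staging}"  # data not yet syncing
SYNCED="${SYNCED:-/Volumes/ExternalDrive/MyFolder}"   # folder odrive watches

for chunk in "$STAGING"/*/; do
  [ -d "$chunk" ] || continue        # skip if staging is empty
  mv "$chunk" "$SYNCED/"
  # Wait here until odrive shows this chunk fully synced
  # before moving the next one in.
done
```

The pause between chunks is manual in this sketch; the point is just to keep the number of new objects odrive has to scan at any one time manageable.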
I’ve got some more folders I’d like to sync so I’ll try doing those in chunks, but I might try letting this one run & see how it goes…
Not sure exactly how far through it’s gone, and not sure if the subfolders that are uploading right now are complete or not.
Is there any timeline on this next major revision?
This is the best answer I have at the moment: What's the status of the 'major release/revision'?