I would like to clarify specifically with regard to Google Drive, which has a number of idiosyncrasies.
In my case, I have a little over 1TB of data that was being kept in sync with Google Drive's Windows client, until it started crashing and Google advised me that the crashes were due to the large number of files being synced. So I am considering using odrive instead.
In my use case, I need all files to be on that local machine. It serves as a local repository: whenever anyone creates or updates a file in the shared folders, I want a full copy to land in that repository fairly promptly (i.e., the actual files, not placeholders).
Right now, over 90% of the files are already present locally (those that synced before the Google Drive client started misbehaving). I would prefer not to have to download those files again.
But what really scares me is that Google has no problem having two (or more) identically-named files sitting in the cloud in the same location. I had an incident once in the past where hundreds of files were re-uploaded to Google Drive from the repository, replicating (not replacing) files that were already there. I had to go through hundreds of pairs of files manually, figuring out which one of each pair was the up-to-date one and deleting the other. I do not relish the thought of doing that again.
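(In case it helps anyone reading this: last time I deduplicated by hand, but the triage could largely have been scripted. Here is a rough sketch, assuming file listings have already been pulled from the Drive API as id/name/modifiedTime records; the function names are just mine, not from any library:)

```python
def find_duplicate_groups(files):
    """Group file records by name and keep only names that occur more than once.

    Each record is a dict with 'id', 'name', and 'modifiedTime' (an RFC 3339
    UTC timestamp string, as the Drive API returns it)."""
    by_name = {}
    for f in files:
        by_name.setdefault(f["name"], []).append(f)
    return {name: group for name, group in by_name.items() if len(group) > 1}


def pick_stale_copies(group):
    """Return every copy in a duplicate group except the most recently modified.

    RFC 3339 UTC timestamps of equal format sort correctly as plain strings,
    so max() on the string value picks the newest copy."""
    newest = max(group, key=lambda f: f["modifiedTime"])
    return [f for f in group if f is not newest]
```

This only flags candidates for deletion; I would still review each pair before deleting anything, since the most recently modified copy is not always the one you want to keep.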
The second thing that scares me is that some of the files presently in the repository are obsolete, having since been updated in the cloud. I would not want them to overwrite the cloud copies, losing the newer versions already there.
Is there a straightforward solution to this? Is odrive smart enough to avoid the gotchas of Google Drive? Or will I have to compromise and re-download >1TB of data to be safe?