Best practices for syncing LARGE amounts of data to Google Drive

Hi, I am pretty new to odrive. I am syncing my archive to Google Drive; it is 24TB. I have a fiber connection at the office and can sustain a 5 MB/s upload. When I upload a file directly with Dropbox I see the full 5 MB/s, but with odrive syncing files to Google the transfer graph looks like peaks and valleys, and the most odrive transfers is 700 Mbps. I have the odrive agent set to no throttling. Right now I have a 2TB hard drive that has been syncing continuously for 45 days, and it is still not all up in the cloud. Please advise on the best way to use odrive.

Thank you!

Hi @shawncorrigan,
odrive interfaces directly with Google's API and will generally transfer as fast as the connection allows. It prioritizes smaller files first, which can produce erratic-looking transfer graphs (peaks and valleys), depending on the size of the files being transferred at the time. It will also limit uploads to one at a time when a file exceeds 100MB in size.
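This isn't odrive's actual code, but a minimal sketch of the scheduling behavior described above (smallest files first, and a single upload slot once a file crosses the 100MB threshold). The `upload` function and the worker count here are placeholders:

```python
import os
from concurrent.futures import ThreadPoolExecutor

LARGE_FILE_BYTES = 100 * 1024 * 1024  # the 100MB threshold mentioned above

def upload(path):
    """Placeholder for the actual upload call to the storage API."""
    print(f"uploading {path} ({os.path.getsize(path)} bytes)")

def sync_queue(paths):
    # Smallest files first: many small items finish quickly, while a
    # few large ones dominate the tail of the transfer.
    paths = sorted(paths, key=os.path.getsize)
    small = [p for p in paths if os.path.getsize(p) < LARGE_FILE_BYTES]
    large = [p for p in paths if os.path.getsize(p) >= LARGE_FILE_BYTES]

    # Several small uploads can run concurrently...
    with ThreadPoolExecutor(max_workers=4) as pool:
        pool.map(upload, small)

    # ...but files over the threshold go one at a time.
    for p in large:
        upload(p)
```

Mixing a few very large files into a batch of small ones is what makes the graph spike and dip: throughput climbs while small files stream in parallel, then drops to a single stream for each big file.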

I upload to Google Drive quite a bit and I usually see upload speeds of around 10-15MB/sec. If I upload from cloud infrastructure, like Amazon EC2, I can see even faster speeds.

It sounds like you are trying to perform a very large data import/backup to Google rather than an ongoing sync. odrive is built for sync, which means it constantly monitors remote and local content to keep them in lock-step with each other. That monitoring creates quite a bit of overhead when odrive is asked to track a very large set of data.
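To make the overhead concrete, here is a toy reconciliation pass (again, not odrive's implementation; `remote_index` is a hypothetical dict mapping remote paths to metadata). Even when nothing has changed, a pass like this has to stat every local file, so its cost grows with the number of files being monitored:

```python
import os

def reconcile(local_root, remote_index):
    """One monitoring pass: compare every local file against the
    remote listing. Cost scales with the file count, even when
    nothing has changed."""
    to_upload = []
    for dirpath, _dirnames, filenames in os.walk(local_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            key = os.path.relpath(path, local_root)
            local_mtime = os.path.getmtime(path)
            remote = remote_index.get(key)  # hypothetical remote metadata
            if remote is None or remote["mtime"] < local_mtime:
                to_upload.append(key)
    return to_upload
```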

As a best practice, I recommend performing large imports in chunks instead of all at once. There is a post here where I go into this a bit more:
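To illustrate the chunked approach, a rough sketch: stage your archive outside the synced folder, then move it in a batch at a time and let each batch finish uploading before adding more. The `/archive` source path, the synced-folder location, and the batch size below are all made up for illustration, and in practice you would watch odrive's status rather than sleep on a timer:

```python
import shutil
import time
from pathlib import Path

STAGING = Path("/archive")  # hypothetical source holding the full archive
SYNCED = Path("~/odrive/Google Drive").expanduser()  # hypothetical synced folder

def wait_until_batch_synced():
    # Stub: in practice, check the odrive menu/status until it reports idle.
    time.sleep(60)

def import_in_chunks(batch_size=50):
    """Move a limited number of subfolders into the synced folder,
    letting each batch upload fully before starting the next."""
    folders = sorted(p for p in STAGING.iterdir() if p.is_dir())
    for start in range(0, len(folders), batch_size):
        for folder in folders[start:start + batch_size]:
            shutil.move(str(folder), str(SYNCED / folder.name))
        wait_until_batch_synced()
```

Keeping the monitored set small while the bulk of the data is still outside the synced folder is what avoids the monitoring overhead described above.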

Is this number correct? 700 Mbps would be very fast.

To clarify, you have been syncing 2TB of data to Google Drive, using odrive, for 45 days so far?
How much of the 2TB is left to upload?
How many files and folders are in the data set?
When you look at the odrive status, what do you see?