Best practices for syncing LARGE amounts of data to Google Drive

Hi @shawncorrigan,
odrive interfaces directly with Google’s API and will generally transfer as fast as the connection allows. It prioritizes smaller files first, which can produce more erratic transfer trends (peaks and valleys), depending on the sizes of the files being transferred at the time. It will also limit uploads to one at a time when a file exceeds 100MB.

I upload to Google Drive quite a bit and I usually see upload speeds of around 10-15MB/sec. If I upload from cloud infrastructure, like Amazon EC2, I can see even faster speeds.

It sounds like you are trying to perform a very large data import/backup to Google rather than a sync. odrive is built for sync, which means it constantly monitors remote and local content to make sure they stay in lock-step with each other. This can create quite a bit of overhead if you ask odrive to monitor a very large data set.

As a best practice, I recommend performing large imports in chunks instead of all at once. There is a post here where I go into this a bit more:
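To give a rough idea of what "in chunks" can look like, here is a minimal sketch that stages subfolders into the synced folder a few at a time instead of dropping everything in at once. The paths, folder names, and batch size are all hypothetical, and the demo below uses temporary directories so it is safe to run as-is; in a real import you would point `STAGING` at your source data and `SYNC_DIR` at the folder odrive is watching.

```shell
# Hypothetical chunked-import sketch (demo paths only).
STAGING=$(mktemp -d)    # stand-in for data waiting to be imported
SYNC_DIR=$(mktemp -d)   # stand-in for the folder odrive is syncing
BATCH_SIZE=3            # subfolders handed to odrive per batch

# Create 7 fake subfolders to simulate a large data set.
for i in 1 2 3 4 5 6 7; do mkdir "$STAGING/part$i"; done

moved=0
for dir in "$STAGING"/*/; do
    mv "${dir%/}" "$SYNC_DIR/"
    moved=$((moved + 1))
    if [ $((moved % BATCH_SIZE)) -eq 0 ]; then
        # In a real import, pause here and wait until odrive reports
        # this batch fully uploaded before moving the next one in.
        echo "batch complete: $moved folders handed to odrive so far"
    fi
done
echo "total moved: $moved"
```

Keeping each batch small means odrive is only ever tracking a modest amount of in-flight data, which avoids the monitoring overhead described above.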

Is this number correct? 700Mbps would be very fast.

To clarify, you have been syncing 2TB of data to Google Drive, using odrive, for 45 days so far?
How much of the 2TB is left to upload?
How many files and folders are in the data set?
When you look at the odrive status, what do you see?