I recently provisioned a new machine and wanted several TB of content from encrypted folders available on it. The simple solution appeared to be copying the files over the LAN onto the new machine. No matter how many Sync or Refresh commands I issued, odrive steadfastly refused to acknowledge that the existing content had anything to do with the encrypted folder. An unsync command made odrive complain that unsynced files would be deleted.
Is there any method of populating an encrypted share other than downloading the files from the cloud provider? This gets expensive if using a business-level service such as S3, Azure, or Google Cloud.
Hi @Ethan,
To be clear, you want to take local data that resides inside the Encryptor folder on one machine and copy it into the Encryptor folder on another machine (all local/LAN ops). Is that correct?
There shouldn’t be anything preventing you from doing this, although you will need to have expanded the root of each Encryptor folder before trying to copy things over, so that you can enter the encryption passphrase for that folder. Without that, odrive won’t know what to do with the data.
Keep in mind that copying that data into the Encryptor folder will cause it to be re-uploaded, even if the data is “the same”. The reason is that the data is actually not going to be the same: the encryption key used for each file is unique, which changes both the data and the name of the resulting remote file.
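To illustrate why identical input produces different remote objects, here is a minimal toy sketch (this is NOT odrive’s actual cipher or naming scheme, just an analogy): when a fresh random key is drawn for each encryption, both the ciphertext and any name derived from it differ every time, even for byte-identical plaintext.

```python
import os
import hashlib

def toy_encrypt(plaintext: bytes):
    """Toy illustration only -- not odrive's real scheme.
    A fresh random key per encryption means the same plaintext
    yields different ciphertext and a different remote name."""
    key = os.urandom(32)  # unique per-file key, never reused
    # Toy stream cipher: XOR plaintext with a keystream derived from the key
    keystream = hashlib.shake_256(key).digest(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    # Obfuscated remote name derived from key + ciphertext
    remote_name = hashlib.sha256(key + ciphertext).hexdigest()[:24]
    return ciphertext, remote_name

data = b"identical file contents"
c1, n1 = toy_encrypt(data)
c2, n2 = toy_encrypt(data)
print(c1 != c2, n1 != n2)  # both differ despite identical input
```

Because the sync client compares the stored (encrypted) objects, it has no way to recognize the two results as “the same file”, which is why a local copy looks like new content.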
Damn. That’s not what I wanted to hear. This makes using odrive less attractive. The combination of PUTs and class C transactions gets substantial, not to mention the time required to upload 5-8 TB per computer.
I’m not following your statement that “The encryption key used for that file will be unique, which changes both the data and the name of the resulting remote file.” What I’m out to accomplish is to access an encrypted folder from more than one computer at a time without needing to upload or download new data.
An example: Server A already has an encrypted folder set up with bunches of local files. What we want is for workstation B to have the same files locally on it. I’m not talking about onesy-twosy numbers of files here. Whether the initial replication is performed over an office LAN, Amazon Snowballs, or a combination of external drives and FedEx is irrelevant. The volume of data is sufficient that intermediate hops through the cloud are neither cost effective nor practical.
If new data are uploaded for each server - duplicating each and every file in the cloud - that’s a non-starter. Paying an additional $225 per month in storage costs per computer just to have a deduplication nightmare … no thanks.
Hi @Ethan,
Can you elaborate a bit on the setup for me so I can be clear on the use case?
You mentioned a “server” and several workstations. What is the difference in the purposes of these systems and how they are used?
Are you logging in as the same system user on all of these systems?
Are you logging in with the same odrive user on all of these systems?
Each system needs to have a full, local copy of everything in Encryptor?
How many machines are we talking about?
Are you using versioning on the remote storage?
Currently, replacing a cloud file with a physical file inside an encryptor folder will be seen as an update and it will upload that file, even if it is technically the same exact content (minus encryption differences). Since it is an update, though, it will replace the file in the cloud and not store it alongside the old one. This should mean that you won’t get duplicate data in the cloud, unless you have versioning enabled on the bucket.
You will need to adjust the encryption design and implementation so that the same binary file in the same location on different installations/computers maps to the same entity on the remote side. The encryption key, name, and data should not differ between installations, so that no update occurs when the data has not actually changed (even if a local update event fires).
If identical data is not treated as equal locally and remotely, every such event cascades into an extra upload, and worse, into a download on every other installation, without a single bit of the file in question having changed. As you can see, this design cannot stand over time; it needs to change.
I had a similar issue on my encrypted folder, where I was not able to move the same file in from the filesystem to replace the identical, already-uploaded file without triggering a new upload.
This ended up in a mess, so I needed to download the content instead of just moving it into the local odrive encrypted folder.
Please re-think the encryption topic.
Hi @carsten,
I understand the use case, but there is a required randomness to the encryption scheme that results in different stored data, even for the same data input, so any new local file will have that randomness applied to it.
Any local addition on top of a placeholder will result in a new file uploaded to the cloud. You are correct that, since the properties of this file are now different in the remote storage, any other clients will see this as a change and download the file, as well. <-- @Ethan This is something I overlooked in my previous posts, so please be aware. Uploading data will actually result in more overhead because of the resulting downloads on other clients.
This means there really is no shortcut to populating local data of an Encrypted folder. It currently needs to be pulled in from the authoritative source (the remote storage).
I will pass this to the product team to see what other options could be possible for this use case.
The issue in my mind is what odrive’s target market is. Single user - likely freeware but possibly paid - or businesses? From a business perspective, odrive’s zero-knowledge encryption offering is a major selling point. The ability to protect confidential documents and data on any cloud platform rather than a select few is compelling.
Unfortunately the way you describe odrive’s implementation is the way I suspected it worked. After un-syncing all folders from an encrypted share and deleting the share in the odrive web interface, the folder on the cloud provider still had 300K+ files remaining. We clued into this from our latest AWS billing statements which showed both more usage than we could account for and inexplicably large numbers of transactions. I suspect the culprit was our trying to replicate encrypted odrive folders without uploading and downloading from the provider.
To answer your questions:
Are you logging in as the same system user on all of these systems? In some cases yes, others no. If odrive supports multiple users (for a business audience this is a given), it should not matter. In either case, the data could not be replicated locally.
Are you logging in with the same odrive user on all of these systems? Same as above.
Each system needs to have a full, local copy of everything in Encryptor? No. Our use case involves encrypted data of two types: internal business data and customer data where our contracts require full encryption of files. Most individual computers require only a subset of these files. The size of a particular bucket can be fairly large - several TB for instance. Enough that duplication in the cloud is undesired and wasting bandwidth on transfers is something to be avoided as well.
How many machines are we talking about? Less than 20 in our case. We’re a fairly small shop.
Are you using versioning on the remote storage? Not for encrypted syncs. That’s a waste of space and provides no benefit.
Ideally there would be a means of either duplicating the encryption token (isn’t it stored in the registry on Windows computers?) or having odrive be smart enough to figure out that files having the same name, size, modification date, and contents were indeed identical. Storing a hash of the unencrypted contents in the encrypted file header would be one approach; there certainly are others.
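The hash-in-header idea above could be sketched as follows. Everything here is hypothetical: `needs_upload`, the header layout, and the hash comparison are illustrative only, not an existing odrive API. If the remote object’s header carried a hash of the unencrypted contents, a client could compare before re-uploading.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Hash of the *unencrypted* contents, hypothetically stored
    in the encrypted file's header at upload time."""
    return hashlib.sha256(data).hexdigest()

def needs_upload(local_data: bytes, remote_header_hash) -> bool:
    """Upload only if no remote copy exists or the plaintext differs.
    remote_header_hash is None when the file is not yet in the cloud."""
    return (remote_header_hash is None
            or content_hash(local_data) != remote_header_hash)

data = b"same project file on server A and workstation B"
remote_hash = content_hash(data)  # hypothetically read from the header
print(needs_upload(data, remote_hash))          # identical -> skip upload
print(needs_upload(b"edited copy", remote_hash))  # changed -> re-upload
```

One trade-off worth noting: storing a plaintext hash alongside the ciphertext weakens the zero-knowledge property somewhat, since anyone with the header can confirm whether it matches a known file. That may be why such schemes require careful design rather than being a drop-in fix.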
Hi @Ethan
I have passed on the idea of trying to allow a local file to replace a placeholder in Encryptor without considering it an update. Encryption is unique in its handling (necessarily so) and we want to be careful not to compromise the integrity and security of the scheme.
My questioning around the same users was more me brainstorming other ways to accomplish this, like cloning the system entirely, which should preserve everything as-is and be usable as long as the users are the same.
Can you explain this one? What command did you use and was this an odrive Space? odrive won’t delete anything from the cloud unless an explicit delete is used on the content. Unlinking, removing a Space, or deleting a shared link will not delete any cloud content, as designed. We would have some disastrous results if we did that.
Re cloning: The system I first went down this rat hole on was a computer upgraded from Win 7 to Win 10. The odrive files were not on the OS drive, so they were not only identical but were the exact same files that were the master copies for the encrypted share.
The share mentioned above was not shared among other users. I kept a number of project files that I was the only one working on. I accessed the files on two workstations and a laptop. The master files were all on a single workstation, while the other computers had only downloaded snippets. I gave up syncing the data back to an encrypted share and performed unsyncs on each computer and deleted the Encrypted folder using the Manage Storage web link. Many of the files still existed in the cloud. If this is by design, that is fine but it should be documented that the only way to actually delete a full encrypted folder is via the cloud provider’s portal.
This comes back to a method of loading encrypted storage folders on more than one machine without requiring that each computer after the initial source download all the files from the cloud. Looking at our cloud storage, it appears that we inadvertently duplicated large numbers of files by pre-loading encrypted folders on machine #2 directly from the source on machine #1 rather than first uploading to the cloud then downloading every time on subsequent installs.
Given the file name mangling that odrive’s encryption does, it is not possible to determine which file came from what computer. We scrubbed all the affected buckets and went with another solution for the present time.