I assume they had a kind of pool for files, and a system linking files (or should I say "blobs") to each client's directory layout. It's a bit like having a disk with different subdirectories: I could run a tool (such tools do exist) to find duplicates, delete all but one copy, and hardlink the rest to that remaining one.
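The dedup-and-hardlink idea can be sketched in a few lines. This is purely illustrative, assuming hash-by-content as the duplicate test; the `dedupe` function and its use of SHA-256 are my own choices, not the provider's actual implementation:

```python
import hashlib
import os

def dedupe(root: str) -> None:
    """Walk a tree, keep one copy per unique content, hardlink the rest."""
    seen: dict[str, str] = {}  # content hash -> path of the first copy seen
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                os.remove(path)              # drop the duplicate's data...
                os.link(seen[digest], path)  # ...and hardlink to the kept copy
            else:
                seen[digest] = path
```

After running this, duplicate paths still exist in every subdirectory, but they all point at a single copy on disk.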
As for the cloud storage system, the files were, as mentioned, stored in encrypted form, using a hash of the original file as the key (possibly MD5, possibly something else; I can't recall at the moment). The cloud provider didn't know the key, but the client's application did. The same encrypted file is served to every client, and every client can decrypt it, because the clients keep the encryption keys (the original hashes, one per file).
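That scheme is what's usually called convergent encryption: because the key is a hash of the plaintext, identical files produce identical ciphertexts, so the provider can deduplicate data it cannot read. Here's a toy sketch of the idea, using SHA-256 and a simple counter-mode keystream of my own devising; the real service would have used a proper cipher, so treat this as a demonstration of the concept only:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 of key + counter, repeated until long enough."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    key = hashlib.sha256(plaintext).digest()  # key = hash of the original file
    stream = keystream(key, len(plaintext))
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    return key, ciphertext  # client keeps the key; provider stores the ciphertext

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    stream = keystream(key, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))
```

Two clients uploading the same file independently derive the same key and the same ciphertext, which is exactly what lets the provider store it once.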
I don't have the details of that anymore; there used to be a document describing the whole scheme. I probably got rid of all of it after they shut down the service (which I had used for several years with no issues).