For "cloud storage" I encrypt files and upload them to different free file hosting services like
anonfiles (the "[email protected]" thing is hiding "file=(atpersat)
filename"). I upload them to multiple places, so that if one of them deletes it, there are other copies. I think it would be nice for a program to regularly check that the file is still up, and reupload it if it's not. For example, my computer would take a file, encrypt it, upload it to one file host, to another file host, etc. Then, it would check that the file is still there, and if it's not, then my computer can download it from one of the other places, and them upload it again.
To keep the hosts from detecting that the same file is being uploaded multiple times, it can be encrypted with a different key each time. It could also be encrypted with a different key for each host, so that even if the hosts are working together, they won't notice that the same file is hosted in several places. A more sophisticated anti-detection feature would be splitting the file into multiple random-sized pieces and piecing it back together on retrieval; a sketch follows.
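Here is one way the splitting and per-host encryption could look. The chunk-size bounds are arbitrary, and the manifest of keys and chunk order has to be stored locally, or the file can't be reassembled.

    # Sketch: random-sized chunks, each encrypted with a fresh key per host,
    # so no two hosts (even colluding ones) see byte-identical uploads.
    import os
    from cryptography.fernet import Fernet

    def split_random(data: bytes, lo: int = 256_000, hi: int = 1_000_000) -> list[bytes]:
        chunks, i = [], 0
        while i < len(data):
            n = lo + int.from_bytes(os.urandom(4), "big") % (hi - lo)
            chunks.append(data[i:i + n])
            i += n
        return chunks

    def encrypt_per_host(chunk: bytes, hosts: list[str]) -> dict[str, tuple[bytes, bytes]]:
        out = {}
        for host in hosts:
            key = Fernet.generate_key()  # fresh key per host *and* per chunk
            out[host] = (key, Fernet(key).encrypt(chunk))
        return out

    # Reassembly: decrypt each chunk with its stored key, then
    # b"".join(chunks) in the original order recorded in the manifest.

Fernet already randomizes its IV, so re-encrypting with the same key yields a different ciphertext; separate keys per host additionally mean that no single leaked key exposes every copy.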
Using different internet proxies or Tor helps against detection, and so does doing the uploads and checks at random times. Spreading the uploads out in time also staggers the file expirations: even if all the hosts use the same expiration length, the copies won't all expire at once. A sketch of both ideas is below.
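Here is a rough sketch of both, assuming a local Tor daemon on its default SOCKS port (9050) and requests installed with SOCKS support (pip install "requests[socks]"); the base interval and jitter range are placeholders.

    # Sketch: route liveness checks through Tor and jitter their timing
    # so the checks and uploads don't form an identifiable pattern.
    import random
    import time
    import requests

    TOR = {"http": "socks5h://127.0.0.1:9050",   # assumes a local Tor daemon
           "https": "socks5h://127.0.0.1:9050"}  # "h": DNS resolved via Tor

    def alive_via_tor(url: str) -> bool:
        r = requests.head(url, proxies=TOR, allow_redirects=True)
        return r.status_code == 200

    def watch(urls: list[str], base: float = 6 * 3600) -> None:
        while True:
            url = random.choice(urls)  # random order, one host at a time
            print(url, "up" if alive_via_tor(url) else "DOWN")
            time.sleep(base * random.uniform(0.5, 1.5))  # jittered interval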