I'm intrigued by uBackup, using Usenet for backing up large files.
I thought it would be a good idea to use Python for this, but I'm having trouble understanding the correct protocol for posting large files.
I know you need to compress your files and, ideally, split them into smaller parts. But when you actually post a file to Usenet, the posting software yEnc-encodes it AND splits it into even smaller parts (because each article can only be a limited size).
But how are large files actually split into smaller parts this way?
Sorry for the confusion. The uBackup article on WikiHow explains in step 2 how to split files; it uses 7-Zip (7-zip.org) to do so. In this image you can see that the file(s) are split into 50 MB chunks: http://www.wikihow.com/Image:2T-7-zip.org-parameters.jpg
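Since you mentioned wanting to do this in Python: the splitting step itself is just writing fixed-size chunks to numbered files. Here's a minimal sketch; the `.001`/`.002` suffix naming is an assumption for illustration, not 7-Zip's actual volume format.

```python
def split_file(path, chunk_size=50 * 1024 * 1024):
    """Split `path` into numbered parts of at most `chunk_size` bytes.

    Rough sketch of what a split tool's volume option does; the
    `.001`-style part naming is a hypothetical convention.
    """
    parts = []
    with open(path, "rb") as src:
        index = 1
        while True:
            chunk = src.read(chunk_size)
            if not chunk:  # end of file reached
                break
            part_name = "%s.%03d" % (path, index)
            with open(part_name, "wb") as dst:
                dst.write(chunk)
            parts.append(part_name)
            index += 1
    return parts
```

Note this only splits; it does no compression, so in practice you'd compress first (as the article does with 7-Zip) and then split the archive.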
In step 4 you can see that the files are uploaded with 'Camelsystem Powerpost'. That program also does the yEnc encoding: http://en.wikipedia.org/wiki/File:Usenet_Binaries_Upload_process.PNG
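If you want to see what the encoding step does under the hood, the core of yEnc is simple: add 42 to each byte (mod 256) and escape the few characters that are unsafe in an article body. This is a bare-bones sketch of just that transform; a real poster like Powerpost also adds `=ybegin`/`=yend` headers, line wrapping, and CRC checks, which are omitted here.

```python
def yenc_encode(data):
    """Minimal yEnc core transform (no headers, no line wrapping).

    Each byte becomes (byte + 42) % 256; the critical characters
    NUL, LF, CR and '=' are escaped as '=' followed by value + 64.
    """
    out = bytearray()
    for byte in data:
        value = (byte + 42) % 256
        if value in (0x00, 0x0A, 0x0D, 0x3D):  # critical characters
            out.append(0x3D)                   # escape marker '='
            value = (value + 64) % 256
        out.append(value)
    return bytes(out)
```

So the posting program takes each 50 MB part, runs it through this kind of transform, and breaks the encoded text into article-sized messages.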
The splitting and the encoding are done (manually) by two different programs. When downloading, you have to reverse the same process to combine the split files. E.g. if you used RAR, ZIP, or another method/program to split, then you have to use the same method to rejoin them.
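As a rough sketch of that rejoin step, assuming the file was split into numbered parts like `data.bin.001`, `data.bin.002`, and so on (a hypothetical naming scheme; a real RAR/ZIP volume set needs its own tool to extract):

```python
import glob

def join_file(base_path, out_path):
    """Concatenate numbered parts (base_path.001, .002, ...) in order."""
    # Sorting the glob results puts zero-padded part numbers in sequence.
    parts = sorted(glob.glob(base_path + ".*"))
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
    return out_path
```

The key point stands either way: whatever tool and settings produced the parts, the downloader must apply the matching reverse step.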
Maybe this article will also help you: How to split large files efficiently