I'm building a .NET remoting client/server that will be transmitting thousands of files of varying sizes (everything from a few bytes to hundreds of MB), and I'd like some feedback on the best method for achieving this. As I see it, there are a few options:
- Serialize the entire file into my remoting object and transmit it all at once, regardless of size. This would probably be the fastest, but a failure during transmission requires that the whole file be re-transmitted, with no way to resume; it also means holding the entire file in memory on both ends.
- If the file is larger than some small threshold (like 4KB), break it into 4KB chunks, remote those individually, and re-assemble them on the server (see the sketch after this list). Besides the added complexity, this is slower because of the repeated round-trips and acknowledgements, though a failure of any one chunk doesn't waste much time.
- Include something like an FTP or SFTP server with my application - the client would use remoting to notify the server that a transfer is starting, upload the file over FTP/SFTP, then use remoting to signal completion. I'd prefer to contain everything in my app instead of requiring a separate FTP service, but I'm open to this option if it's needed.
- Use some kind of stateful TCP connection, WCF, or some other transmission method that's built to handle failures or is capable of some kind of checkpoint/resume.
- Any others I'm missing?
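For the chunked approach (option 2), the resume logic can be kept simple: the server persists whatever it has received and tells the client where to pick up. Here's a minimal sketch assuming a remoting-registered service; the `FileUploadService` name, the storage path, and the 64KB chunk size are all my own placeholders:

```csharp
using System;
using System.IO;

// Server-side remoting object: appends chunks and reports how much
// of a given file has already arrived, so the client can resume.
public class FileUploadService : MarshalByRefObject
{
    private readonly string storageDir = @"C:\uploads"; // assumed location

    // How many bytes of this file the server already has (0 if none).
    public long GetReceivedLength(string fileName)
    {
        string path = Path.Combine(storageDir, fileName);
        return File.Exists(path) ? new FileInfo(path).Length : 0;
    }

    // Append one chunk; the offset check guards against duplicate
    // or out-of-order calls after a reconnect.
    public void AppendChunk(string fileName, long offset, byte[] data)
    {
        string path = Path.Combine(storageDir, fileName);
        using (var fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
        {
            if (fs.Length != offset)
                throw new InvalidOperationException("Resume from offset " + fs.Length);
            fs.Seek(offset, SeekOrigin.Begin);
            fs.Write(data, 0, data.Length);
        }
    }
}

// Client-side loop: ask where to resume, then push chunks until done.
public static class Uploader
{
    private const int ChunkSize = 64 * 1024; // larger chunks cut round-trips

    public static void Upload(FileUploadService service, string localPath)
    {
        string name = Path.GetFileName(localPath);
        long offset = service.GetReceivedLength(name); // resume point
        using (var fs = File.OpenRead(localPath))
        {
            fs.Seek(offset, SeekOrigin.Begin);
            var buffer = new byte[ChunkSize];
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                byte[] chunk = buffer;
                if (read < buffer.Length) // trim the final partial chunk
                {
                    chunk = new byte[read];
                    Array.Copy(buffer, chunk, read);
                }
                service.AppendChunk(name, offset, chunk);
                offset += read;
            }
        }
    }
}
```

With this shape, a dropped connection costs at most one chunk: the client calls `GetReceivedLength` again and continues from there. Chunks in the 64KB-1MB range also amortize the per-call round-trip far better than 4KB.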
What's the most flexible/reliable transmission method? I'm not that concerned about speed, but more about reliability - I want the file to arrive, even if it arrives slowly. Since the client and server will be multi-threaded, I can transmit multiple files at the same time if the connection allows it.
Thanks for your feedback - I'll throw in a bounty to get some recommendations on ways people would accomplish this.
BITS (Background Intelligent Transfer Service) is a good solution. It has years of real-world hardening built in: it queues jobs, throttles itself to idle bandwidth, survives reboots and network drops, and resumes interrupted transfers automatically.
Some starting points:
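To give a feel for the programming model, here is a minimal sketch in the style of a managed BITS wrapper such as SharpBITS.NET. The `BitsManager`/`BitsJob` names, signatures, and the server URL below are assumptions for illustration, so verify them against whichever wrapper you pick (BITS itself is a COM API you can also call directly):

```csharp
using System;
using SharpBits.Base; // assumed managed BITS wrapper

public static class BitsUploadExample
{
    public static void Main()
    {
        // A BITS "job" is a persistent, OS-managed transfer that
        // survives reboots and network drops and resumes on its own.
        var manager = new BitsManager();
        BitsJob job = manager.CreateJob("file-sync", JobType.Upload); // assumed signature

        // Remote name first, local name second (upload direction).
        job.AddFile("https://server.example.com/uploads/big.bin", @"C:\data\big.bin");

        job.OnJobTransferred += (s, e) => job.Complete(); // finalize when done
        job.OnJobError += (s, e) => Console.WriteLine("BITS transfer error");

        job.Resume(); // queue the job; BITS takes it from here
    }
}
```

One caveat: BITS uploads require an IIS endpoint with the BITS server extension installed, while downloads work against any HTTP server that supports range requests.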