What is the fastest way to send a large binary file from one PC to another over the Internet?


I need to send large binary data (2 GB to 10 GB) from one PC (the client) to another PC (the server) over the Internet. First I tried a WCF service hosted in IIS, using the wsHttpBinding binding with message security, but it took a lot of time (a few days), which is unacceptable for me. Now I am thinking about writing client and server applications using sockets. Would that be faster?

What is the best way to do it?

Thanks


There are 5 answers

Felice Pollano (Best Answer)

Plain old FTP would, in my opinion, be suitable in this case. With it you have the chance to resume an interrupted transfer without needing to redo the whole job from the start. You need to take into account the possibility that such a massive download gets interrupted for some reason.
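
For example, a resumable download along those lines, using Python's standard ftplib (a sketch; the host, credentials, and file names are placeholders):

```python
import os
from ftplib import FTP

def download_with_resume(host, user, password, remote_path, local_path):
    """Download remote_path to local_path, resuming a partial file if present."""
    ftp = FTP(host)
    ftp.login(user, password)

    # If a partial file exists, continue from its current size.
    # The `rest` argument makes ftplib send an FTP REST command
    # so the server starts the transfer at that offset.
    offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0

    with open(local_path, "ab") as f:
        ftp.retrbinary(f"RETR {remote_path}", f.write,
                       blocksize=64 * 1024, rest=offset or None)
    ftp.quit()

download_with_resume("ftp.example.com", "user", "secret",
                     "/data/big.bin", "big.bin")
```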

GvS

When sending large amounts of data, you are limited by the bandwidth of the connection. You also have to take care of disruptions in the connection: small disruptions can have a big impact if they force you to resend a lot of data.

You can use BITS (Background Intelligent Transfer Service); it transfers the data in the background and divides it into blocks, so it takes care of a lot of this for you.

It depends on IIS on the server side and provides a client API to transfer the data, so you do not need to write the low-level transfer code yourself.

I don't know if it will be faster, but it is at least a lot more reliable than a single HTTP or FTP request, and you can have it running very quickly.

If bandwidth is the problem, and the data doesn't have to be sent over the Internet, you could consider a high-bandwidth, high-latency channel like sending a DVD by courier.

You can use BITS from .NET; there is a wrapper on CodeProject.
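
If you want to script a BITS transfer without writing COM interop code, one option is to shell out to PowerShell's Start-BitsTransfer cmdlet (a Windows-only sketch; the URL and destination path are placeholders):

```python
import subprocess

# Hand the transfer to BITS via PowerShell; BITS then handles
# background scheduling, block-wise transfer, and retry on its own.
subprocess.run(
    [
        "powershell", "-NoProfile", "-Command",
        "Start-BitsTransfer "
        "-Source 'https://server.example.com/big.bin' "
        "-Destination 'C:\\downloads\\big.bin'",
    ],
    check=True,
)
```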

mmix

Well, bandwidth is your problem; going even lower, down to sockets, won't help you much there, as the WCF overhead doesn't matter much for long binary responses. Maybe your option is to use a lossless streaming compression algorithm, provided that your data is compressible (do a dry run using zip: if it shrinks the file on a local disk, you can find a suitable streaming algorithm). By the way, I would suggest providing resume support :)
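
A quick way to do that dry run programmatically, using Python's zlib on a sample of the file (the path, sample size, and threshold below are arbitrary choices):

```python
import zlib

def compression_ratio(path, sample_bytes=64 * 1024 * 1024):
    """Compress up to sample_bytes of the file and return compressed/raw size."""
    compressor = zlib.compressobj(level=6)
    raw = compressed = 0
    with open(path, "rb") as f:
        while raw < sample_bytes:
            chunk = f.read(1024 * 1024)
            if not chunk:
                break
            raw += len(chunk)
            compressed += len(compressor.compress(chunk))
    compressed += len(compressor.flush())
    return compressed / raw if raw else 1.0

ratio = compression_ratio("big.bin")
verdict = "worth compressing" if ratio < 0.9 else "probably not worth it"
print(f"sample compressed to {ratio:.0%} of original size: {verdict}")
```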

hookenz

Usually it's most appropriate to leverage something that has already been written for this type of thing, e.g. FTP, SCP, rsync, etc.

FTP supports resuming a broken download, although I'm not sure whether it supports resuming an upload. rsync is much better at this kind of thing.
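
For instance, a scripted rsync transfer with resume support could look like this (assumes rsync 3.x and SSH access on the remote host; host and paths are placeholders):

```python
import subprocess

# --partial keeps an interrupted destination file instead of deleting it,
# --append-verify resumes it (checksumming the existing part), and
# -z compresses data on the wire.
subprocess.run(
    ["rsync", "--partial", "--append-verify", "-z", "--progress",
     "big.bin", "user@server.example.com:/data/big.bin"],
    check=True,
)
```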

EDIT: It might be worth considering something that I'm not terribly familiar with but could be another option: BitTorrent?

A further option is to roll your own client/server using a protocol library such as UDT, which can give you better-than-TCP performance. See: http://udt.sourceforge.net/

Brandon

Although there is some bandwidth overhead associated with higher-level frameworks, I have found WCF file transfer as a stream to be more than adequately fast, usually as fast as a regular file transfer over SMB. I have transferred hundreds of thousands of small files in a session, including larger files of 6-10 GB and sometimes more, and never once had any major issues over any sort of decent connection.

I really like the interfaces it provides. They allow you to do some pretty cool stuff that FTP can't, like remoting or duplex endpoints. You get programmatic control over every aspect of the connection on both sides, and the two ends can exchange messages along with the files. Fun stuff.

Yes, FTP is fast and simple, if you don't need all that.
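
WCF itself is .NET-only, but the underlying streamed-transfer idea looks roughly like this in Python with the requests library (an analogue, not WCF; the URL is a placeholder):

```python
import requests

# Passing a file object as `data` makes requests stream the upload
# in chunks rather than loading a 2-10 GB file into memory.
with open("big.bin", "rb") as f:
    resp = requests.post("https://server.example.com/upload",
                         data=f, timeout=300)
resp.raise_for_status()
```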