sftp_read jumps large blocks of bytes when reading from a file


I'm trying to download a file over SFTP. So far I've connected to the server via an SSH session, opened an SFTP session on top of it, and everything seems fine. I've opened a file on the server and I'm using sftp_read() to read blocks of bytes from it. My code is:

char buffer[16384] = {};
ssize_t nbytes;
ssize_t ntotal = 0;
for (;;) {
    nbytes = sftp_read(file, buffer, sizeof(buffer));

    if (nbytes == 0) {
        break; // EOF
    } else if (nbytes < 0) {
        break; // read error: sftp_read() returns a negative value on failure
    }

    localFile << buffer;
    ntotal += nbytes;
    //sftp_seek(file, ntotal);
}

But for some reason, while the blocks are being read, the first 20 bytes come out correct and bytes 21 through 16384 are wrong. It's as if, after reading 20 bytes, the reader jumps to the 16384th byte of the file and continues from there like nothing happened.

I did some testing, and whatever size I made the buffer (16384, 21, 22, 50, etc.), it still jumped to the 16384th byte after reading the first 20 bytes correctly.

Is there a reason for this? Is there a better library than libssh that I can use for SFTP without these errors?

1 Answer

Stapps (best answer):

I didn't need to use libssh after all. For anybody wanting another way of copying files over SFTP, just use:

system("scp -i private_key [email protected]:directory_to_file/file.txt path_to_local_dir/file.txt");