fast conversion of a text file to arrays C++


I've been working on a project that involves large heightmaps (3000x3000, ~60 MB). What I need to do is split the data into several 200x200 arrays (15x15 of them), then save them separately, but this time in a format that is as fast as possible to load again. I've tried using streams (I'm not that good at C++, so don't exclude ideas with streams), but it's agonizingly slow.

Stuff that might help (based on what I've seen while searching for the answer): the heightmaps are supplied as text files (.asc) with the numbers written like 125.123 (without quotes). Each entry has three decimals no matter what the number is ("0.123" and "100.123"). As far as I know there are no negative numbers, and the size of the heightmap is known beforehand (usually 3000x3000).

So my questions essentially:

  1. What's the best way to do this? (Preferably without Boost or the like, but if it helps a lot, then why not.)
  2. What format (for the 200x200 arrays) would allow the fastest loading time?

Any help, ideas, code, or links/literature?


There is 1 answer

pm100 (best answer)

Part 2:

If you are reading the file back on the same type of system (same endianness), then use a binary blittable format, i.e. store a straight binary dump of the 200x200 array. I would also multiply by 1000 and store as ints, since they are typically slightly faster. (You did not mention the range of values, nor the required precision: are the units feet, miles, nanometers?)