I'm going to read a big CSV file in MATLAB which contains rows like this:
1, 0, 1, 0, 1
1, 0, 1, 0, 1, 0, 1, 0, 1
1, 0, 1
1, 0, 1
1, 0, 1, 0, 1
0, 1, 0, 1, 0, 1, 0, 1, 0
For reading big files I'm using textscan; however, I have to define the number of expected values in each line of the text file (see the snippet at the end of this question for what I'm doing now).
Using csvread works, but it is too slow and does not seem efficient.
Is there any way to use textscan with an unknown number of values in each line, or do you have any other suggestions for this situation?
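For illustration, this is roughly the call I have now (data.csv and the fixed five-value format string are just placeholders):

```matlab
% Current approach: only works when every line has exactly five values,
% because the format string has to match the number of fields per line.
fid = fopen('data.csv', 'r');
C = textscan(fid, '%f%f%f%f%f', 'Delimiter', ',');
fclose(fid);
M = cell2mat(C);   % numeric matrix, one column per %f in the format
```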
Since you said "Numerical matrix padded with zeros would be good", there is a solution using textscan which can give you that. The catch, however, is that you have to know the maximum number of elements a line can have (i.e. the longest line in your file).

Provided you know that, a combination of the additional parameters of textscan allows you to read an incomplete line. If you set the parameter 'EndOfLine','\r\n', the documentation explains that fields which are missing before an end-of-line sequence are returned as empty values, which for a floating-point conversion means NaN.

So with the example data in your question saved as differentRows.txt, you can read the whole file into a single NaN-padded numeric matrix, as in the sketch below.
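A minimal sketch of that approach (the hard-coded maximum of 9 columns matches your example data; the 'CollectOutput' option is an assumption used to get a plain matrix back):

```matlab
% Read a CSV file whose lines have different numbers of values into one
% numeric matrix; short lines are padded with NaN.
maxCols = 9;                            % length of the longest line (must be known)
fmt     = repmat('%f', 1, maxCols);     % one %f conversion per possible column

fid = fopen('differentRows.txt', 'r');
C = textscan(fid, fmt, ...
    'Delimiter', ',', ...
    'EndOfLine', '\r\n', ...            % stop matching the format at each line break
    'CollectOutput', true);             % concatenate the numeric columns into one array
fclose(fid);

M = C{1};   % double matrix; fields missing from a line come back as NaN
```

With your six example rows this should give a 6-by-9 matrix in which, for instance, the first row reads 1 0 1 0 1 NaN NaN NaN NaN.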
As an alternative, if it makes a significant speed difference, you could import your data as integers instead of doubles. The trouble with that is that NaN is not defined for integers, so you have 2 options (both sketched after this list):

1) Accept the default value for empty integer fields, which is 0: just replace the line which defines the format specifier with an integer conversion ('%d' instead of '%f'). The missing fields then come back as 0, which is exactly the zero-padded matrix you asked for.

2) Define a value which you are sure you'll never have in your original data (e.g. 99, for quick identification of empty cells), then pass it through the 'EmptyValue' parameter of the textscan function; the missing fields will then come back as 99.
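A sketch of the two integer variants, under the same assumptions as above (9 columns maximum, differentRows.txt as the file name; the value 99 is just an example sentinel):

```matlab
maxCols = 9;
fmtInt  = repmat('%d', 1, maxCols);     % integer conversion instead of %f

% Option 1: keep the default empty value, so missing fields become 0.
fid = fopen('differentRows.txt', 'r');
C = textscan(fid, fmtInt, ...
    'Delimiter', ',', 'EndOfLine', '\r\n', 'CollectOutput', true);
fclose(fid);
Mzeros = C{1};        % int32 matrix, short rows padded with 0

% Option 2: use 'EmptyValue' to pad with a sentinel (99 here) instead,
% so the padded cells are easy to identify afterwards.
fid = fopen('differentRows.txt', 'r');
C = textscan(fid, fmtInt, ...
    'Delimiter', ',', 'EndOfLine', '\r\n', 'CollectOutput', true, ...
    'EmptyValue', 99);
fclose(fid);
Msentinel = C{1};     % int32 matrix, short rows padded with 99
```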