Memory limit in Azure Data Lake Analytics


I have implemented a custom extractor for NetCDF files that loads the variables into in-memory arrays before outputting them. Some of these arrays can be quite large, so I wonder what the memory limit in ADLA is. Is there a maximum amount of memory you can allocate?


There are 2 answers

saveenr (best answer)

Each vertex has 6 GB available. Keep in mind that this memory is shared between the OS, the U-SQL runtime, and the user code running on the vertex.

Michael Rys

In addition to Saveen's reply: please note that a row can contain at most 4 MB of data, so once you return it from your extractor, your SqlArray will also be limited by the maximum row size.
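To illustrate the 4 MB row limit, here is a minimal sketch (plain Python, not the actual U-SQL extractor API) of how a large NetCDF variable could be split into chunks that each fit within one row. The element size and limit constant are assumptions for the example:

```python
# Sketch: split a large variable into chunks so each emitted row
# stays under U-SQL's 4 MB per-row data limit.
MAX_ROW_BYTES = 4 * 1024 * 1024  # 4 MB row limit
ELEMENT_SIZE = 8                 # assume 8-byte doubles from the NetCDF variable

def chunk_indices(n_elements, element_size=ELEMENT_SIZE, max_bytes=MAX_ROW_BYTES):
    """Yield (start, stop) index pairs; each chunk fits within one row."""
    per_chunk = max_bytes // element_size  # elements per row
    for start in range(0, n_elements, per_chunk):
        yield start, min(start + per_chunk, n_elements)

# Example: a 10-million-element double array (~80 MB) cannot be
# emitted as a single row; emit one row per chunk instead.
for start, stop in chunk_indices(10_000_000):
    pass  # output.Set(...) / yield the slice [start:stop] as one row
```

The same budgeting applies on the read side: since the whole vertex only has 6 GB (shared with the OS and runtime), streaming the variable slice by slice rather than materializing it entirely in memory is safer for very large arrays.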