Memory limit in Azure Data Lake Analytics

Asked by Magnus Johannesson

I have implemented a custom extractor for NetCDF files that loads the variables into in-memory arrays before outputting them. Some of the arrays can be quite large, so I am wondering what the memory limit is in Azure Data Lake Analytics (ADLA). Is there a maximum amount of memory you can allocate?
Each vertex has 6 GB of memory available. Keep in mind that this memory is shared among the OS, the U-SQL runtime, and the user code running on the vertex.
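Since the 6 GB is shared with the runtime, a common way to stay within it is to avoid materializing an entire variable at once and instead process it in fixed-size chunks. The following is a minimal, language-agnostic sketch in Python (the `chunked` helper and the in-memory list standing in for a NetCDF variable are illustrative assumptions, not part of ADLA or any NetCDF library):

```python
# Sketch: bound peak memory by iterating over a large variable in
# fixed-size chunks rather than loading the whole array at once.

def chunked(values, chunk_size):
    """Yield successive slices of at most chunk_size items."""
    for start in range(0, len(values), chunk_size):
        yield values[start:start + chunk_size]

# Stand-in for one large NetCDF variable; in a real extractor this
# would come from a streaming read of the input file.
data = list(range(10))

# Compute per-chunk partial sums instead of holding everything.
totals = [sum(chunk) for chunk in chunked(data, 4)]
print(totals)  # [6, 22, 17]
```

The same idea applies inside a U-SQL custom extractor: emit rows per chunk as you read, so the working set stays well under the per-vertex limit.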