I am working on a task to show location markers on a Google map. I have around 40K records in my table, and I need to create a location marker for each of these records. I am using the gmaps4rails gem with my Ruby on Rails app, and Passenger Standalone as my app server.
Here is my code
my_models_controller.rb
def index
  @markers = []
  MyModel.select(:id, :first_name, :latitude, :longitude).find_in_batches do |batch_records|
    @markers += Gmaps4rails.build_markers(batch_records) do |record, marker|
      marker.lat record.latitude
      marker.lng record.longitude
      marker.title record.first_name
      marker.infowindow "<span> #{record.first_name} </span> "
    end
  end
end
index.html.haml
:javascript
  var markers_json = #{raw @markers.to_json};
%h1
- markers = nil
#my_model_map.map{ style: 'width: 100%; height: 600px;' }
my_model.js
if (typeof Gmaps !== 'undefined' && $('#my_model_map').length) {
  handler = Gmaps.build('Google', { markers: { maxRandomDistance: null } });
  handler.buildMap({ provider: {}, internal: { id: 'my_model_map' } }, function() {
    if (typeof markers_json !== 'undefined') {
      markers = handler.addMarkers(markers_json);
      handler.bounds.extendWith(markers);
      handler.fitMapToBounds();
    }
  });
}
I tried selecting only the required columns and added batch processing to reduce memory consumption, but the request still takes around 200 MB while constructing the marker hashes for all of these records.
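For reference, here is a minimal sketch of how the per-request memory growth can be observed. The current_rss_mb helper is hypothetical and not part of my app; it shells out to ps (Linux/macOS), which reports the process resident set size in KB, and the marker-building body is the same as in the controller above.

# Hypothetical helper, for measurement only: reads the RSS of the
# current process via `ps` (the `=` suppresses the column header).
def current_rss_mb
  `ps -o rss= -p #{Process.pid}`.to_i / 1024
end

def index
  rss_before = current_rss_mb
  # ... build @markers exactly as in the controller code above ...
  Rails.logger.info "RSS before: #{rss_before} MB, after: #{current_rss_mb} MB"
end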
The problem is that this memory is never released after each request is served, and the usage keeps growing incrementally with every request. Can anyone please advise me which part of my code causes this memory leak?