I have the following map:
```cpp
std::map<int, std::string> map;
```
and my output when I call:

```cpp
std::cout << map.max_size() << std::endl;
```

is 128102389400760775 on my Linux system (WSL2). I am searching for an alternative way to reach this result without std::numeric_limits.
So far I came up with the following approach, which works for std::vector but turns out to be wrong for std::map:

```cpp
std::map<int, std::string>::allocator_type a_type_map;
std::cout << a_type_map.max_size() << std::endl;
```
Probably it has something to do with the nodes, which take additional storage.
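For reference, this is roughly what I mean, with the allocator queried through std::allocator_traits instead of the C++17-deprecated member max_size(); built as C++17 on libstdc++, the two numbers should match for the vector but not for the map:

```cpp
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

int main()
{
    // For std::vector, the element allocator's max_size matches the container's.
    std::vector<int> v;
    std::vector<int>::allocator_type va;
    std::cout << v.max_size() << '\n';
    std::cout << std::allocator_traits<decltype(va)>::max_size(va) << '\n';

    // For std::map it does not: the map allocates tree nodes, not bare pairs.
    std::map<int, std::string> m;
    std::map<int, std::string>::allocator_type ma;
    std::cout << m.max_size() << '\n';
    std::cout << std::allocator_traits<decltype(ma)>::max_size(ma) << '\n';
}
```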
I could not find anywhere in the C++ standard a statement that clearly specifies how `max_size()` must be calculated. In the GCC/libstdc++ implementation, `max_size` is defined in `include/bits/stl_map.h`, where it simply forwards to `_M_t.max_size()`; `_M_t` is of type `std::_Rb_tree`, which defines `max_size()` in `include/bits/stl_tree.h` as `_Alloc_traits::max_size(_M_get_Node_allocator())`. So this limit is basically the maximum number of elements that the allocator itself can create.
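You can check this from outside the library by rebinding the map's allocator to the internal node type and asking `std::allocator_traits` for its `max_size`; on libstdc++ this should reproduce `map::max_size()`. Note that `std::_Rb_tree_node` is an internal, non-portable name, so treat this as a sanity check rather than something to rely on:

```cpp
#include <iostream>
#include <map>
#include <memory>
#include <string>

int main()
{
    using Map  = std::map<int, std::string>;
    using Node = std::_Rb_tree_node<Map::value_type>;  // libstdc++'s internal node type
    using NodeAlloc =
        std::allocator_traits<Map::allocator_type>::rebind_alloc<Node>;

    Map m;
    NodeAlloc node_alloc;

    // Both lines should print the same value
    // (128102389400760775 in the question's environment).
    std::cout << m.max_size() << '\n';
    std::cout << std::allocator_traits<NodeAlloc>::max_size(node_alloc) << '\n';
}
```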
`_Alloc_traits::max_size` is defined in `include/bits/alloc_traits.h` and ends up calling the allocator's own `max_size()`. Of course the allocator is whatever you use in your container, but the default implementation boils down to `include/ext/new_allocator.h`, which (cleaned up a bit for macros etc.) essentially returns `std::size_t(__PTRDIFF_MAX__) / sizeof(_Tp)`.
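A quick sanity check of that formula (this assumes a 64-bit system, a reasonably recent GCC and C++17 mode; other versions and modes may divide `SIZE_MAX` instead of `PTRDIFF_MAX`):

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <memory>

int main()
{
    // What the default allocator reports for a 4-byte element type...
    std::allocator<int> a;
    std::cout << std::allocator_traits<std::allocator<int>>::max_size(a) << '\n';

    // ...versus the PTRDIFF_MAX-based formula used by new_allocator on recent GCC.
    std::cout << static_cast<std::size_t>(PTRDIFF_MAX) / sizeof(int) << '\n';
}
```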
The type `_Tp` in that formula is `std::_Rb_tree_node<std::pair<int const, int>>`, which is basically the red-black tree node: a base part holding the node color and the parent/left/right pointers, followed by storage for the `std::pair` value.
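A self-contained sketch of that layout (`RbNodeBase`/`RbNode` are made-up stand-ins for the libstdc++ internals, not the verbatim source; the `static_assert` assumes 8-byte pointers, i.e. a typical 64-bit ABI):

```cpp
#include <cstdint>
#include <utility>

// Rough stand-in for std::_Rb_tree_node_base / std::_Rb_tree_node<_Val>.
enum class RbColor : std::uint32_t { red, black };

struct RbNodeBase
{
    RbColor     color;   // 4 bytes, padded to 8 before the pointers
    RbNodeBase* parent;  // 8 bytes
    RbNodeBase* left;    // 8 bytes
    RbNodeBase* right;   // 8 bytes
};

template <typename Val>
struct RbNode : RbNodeBase
{
    Val value;           // the std::pair<const Key, T> lives here
};

// 32 bytes of bookkeeping plus 8 bytes of payload for std::map<int32_t, int32_t>.
static_assert(sizeof(RbNode<std::pair<const std::int32_t, std::int32_t>>) == 40,
              "assumes 8-byte pointers and 8-byte alignment");

int main() {}
```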
So for `std::map<int32_t, int32_t>` that will be 40 bytes: 3 pointers at 8 bytes each, plus 4 bytes for the color, plus 4 bytes of padding, plus 8 bytes for the `std::pair`. Which means this is the theoretical maximum number of elements you can ever have, regardless of the amount of memory or any other limiting factors.
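The same arithmetic explains the number in the question: a `std::map<int, std::string>` node is 32 bytes of tree bookkeeping plus 40 bytes for the `std::pair<const int, std::string>` (assuming a 32-byte `std::string`), i.e. 72 bytes, and `PTRDIFF_MAX / 72` is exactly 128102389400760775:

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <map>
#include <string>

int main()
{
    using Node = std::_Rb_tree_node<std::pair<const int, std::string>>;

    std::cout << sizeof(Node) << '\n';  // 72 on a typical 64-bit libstdc++
    std::cout << static_cast<std::size_t>(PTRDIFF_MAX) / sizeof(Node) << '\n';
    std::cout << std::map<int, std::string>().max_size() << '\n';  // same number in the question's setup
}
```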
Godbolt test: https://godbolt.org/z/vME5qvTGP