ConcurrentDictionary on large numbers


I would like to know why the ConcurrentDictionary GetOrAdd method slows down heavily as the number of entries grows.

I call it inside 3 nested loops and, printing the time of each innermost loop to a file, I can see that the execution time of each innermost loop grows linearly, and I don't know why, since each innermost loop is the same size. I suspect it has something to do with ConcurrentDictionary itself, which is why I'm asking here.
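Roughly, the pattern looks like this (a minimal sketch; the key format and loop sizes here are placeholders, not my exact values):

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        var dict = new ConcurrentDictionary<string, int>();
        const int n = 100; // placeholder loop size

        for (int i = 0; i < n; i++)
        {
            for (int j = 0; j < n; j++)
            {
                var sw = Stopwatch.StartNew();
                for (int k = 0; k < n; k++)
                {
                    // innermost loop is always the same size,
                    // yet its measured time keeps growing
                    dict.GetOrAdd($"{i}-{j}-{k}", 0);
                }
                sw.Stop();
                Console.WriteLine($"{i},{j}: {sw.ElapsedMilliseconds} ms");
            }
        }
    }
}
```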

Any suggestions?

1 Answer

Michał Komorowski (Best Answer)

It is because of collisions. ConcurrentDictionary maintains a list of buckets, and the value returned by the GetHashCode method determines which bucket an entry is placed in. Ideally, each entry would be in a separate bucket, but in practice that is not possible. If GetHashCode returns the same value for 2 different keys, a collision occurs and more and more entries end up in the same bucket.
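To illustrate the effect (this is not the asker's code, just a deliberately degenerate example), a key type whose GetHashCode always returns the same value forces every entry into one bucket, and you can watch GetOrAdd slow down as the dictionary grows:

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;

// Hypothetical key type: every instance hashes to the same value,
// so all entries collide into the same bucket.
class BadKey
{
    public int Id { get; }
    public BadKey(int id) { Id = id; }

    public override int GetHashCode() => 42; // all keys collide
    public override bool Equals(object obj) =>
        obj is BadKey other && other.Id == Id;
}

class CollisionDemo
{
    static void Main()
    {
        var dict = new ConcurrentDictionary<BadKey, int>();
        var sw = Stopwatch.StartNew();

        for (int i = 0; i < 50_000; i++)
        {
            dict.GetOrAdd(new BadKey(i), i);
            if (i % 10_000 == 0)
                Console.WriteLine($"{i}: {sw.ElapsedMilliseconds} ms"); // intervals keep getting slower
        }
    }
}
```

With a well-distributed GetHashCode, the same loop stays close to constant time per insert.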

Under the hood, each bucket is implemented as a linked list. Every time a collision is detected, the dictionary has to scan the bucket's linked list to check whether the given entry is already there. The more elements the dictionary contains, the longer the linked lists get and the more time is required to iterate through them.
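Conceptually, the per-bucket lookup is just a walk down a chain of nodes that share a hash (a simplified sketch; the real ConcurrentDictionary internals also involve locking, volatile reads and resizing):

```csharp
// Simplified sketch of a per-bucket lookup, not the actual .NET source.
class Node<TKey, TValue>
{
    public TKey Key;
    public TValue Value;
    public Node<TKey, TValue> Next;
}

static class BucketScan
{
    public static bool TryFind<TKey, TValue>(
        Node<TKey, TValue> bucketHead, TKey key, out TValue value)
    {
        // The longer the chain, the more comparisons each lookup needs.
        for (var node = bucketHead; node != null; node = node.Next)
        {
            if (Equals(node.Key, key))
            {
                value = node.Value;
                return true;
            }
        }
        value = default(TValue);
        return false;
    }
}
```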

By the way, the standard Dictionary is implemented in a different way but behaves in a similar manner. A nice description of the internals of both data structures can be found here.