Argon2 by design is memory hungry. In the semi-official Go implementation, the following parameters are recommended when using IDKey:
key := argon2.IDKey([]byte("some password"), salt, 1, 64*1024, 4, 32)
where 1 is the time parameter and 64*1024 is the memory parameter. This means the library will create a 64 MB buffer when hashing a value. In scenarios where many hashing procedures might run at the same time, this creates high pressure on the host memory.
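For a rough picture of what that means in practice, here is a minimal sketch of my own (not from the library documentation; the concurrency limit of 4 and the 32 simulated requests are arbitrary values). Each concurrent IDKey call works on its own ~64 MB buffer, so a buffered channel used as a semaphore is one common way to cap the peak:

package main

import (
    "crypto/rand"
    "fmt"
    "sync"

    "golang.org/x/crypto/argon2"
)

func main() {
    const maxConcurrent = 4 // caps peak Argon2 memory at roughly 4 * 64 MB
    sem := make(chan struct{}, maxConcurrent)

    var wg sync.WaitGroup
    for i := 0; i < 32; i++ { // 32 simulated concurrent hash requests
        wg.Add(1)
        go func(i int) {
            defer wg.Done()
            sem <- struct{}{}        // acquire a slot before hashing
            defer func() { <-sem }() // release the slot when done

            salt := make([]byte, 16)
            if _, err := rand.Read(salt); err != nil {
                panic(err)
            }
            key := argon2.IDKey([]byte("some password"), salt, 1, 64*1024, 4, 32)
            fmt.Printf("request %d: derived %d-byte key\n", i, len(key))
        }(i)
    }
    wg.Wait()
}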
In cases where this is too much memory consumption, it is advised to decrease the memory parameter and increase the time factor:
The draft RFC recommends[2] time=1, and memory=64*1024 is a sensible number. If using that amount of memory (64 MB) is not possible in some contexts then the time parameter can be increased to compensate.
So, assuming I would like to limit memory consumption to 16 MB (1/4 of the recommended 64 MB), it is still unclear to me how I should adjust the time parameter: is it supposed to be multiplied by 4 so that the product of memory and time stays the same? Or is there some other logic behind the correlation of time and memory at play?
I think the key here is the phrase "to compensate", so in this context it is trying to say: to achieve similar hashing complexity as
IDKey([]byte("some password"), salt, 1, 64*1024, 4, 32)
you can try
IDKey([]byte("some password"), salt, 4, 16*1024, 4, 32)
But if you want to decrease the complexity of the hashing result (and reduce the performance overhead), you can decrease the memory uint32 parameter while disregarding the time parameter.
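A quick way to sanity-check this on your own machine is to time both parameter sets (a rough sketch of mine, not from the library docs; the fixed salt is only for benchmarking, and the absolute numbers will vary by hardware):

package main

import (
    "fmt"
    "time"

    "golang.org/x/crypto/argon2"
)

// measure returns the wall-clock time of a single IDKey call with the
// given time and memory (KiB) parameters.
func measure(timeParam, memoryKiB uint32) time.Duration {
    salt := []byte("0123456789abcdef") // fixed salt, acceptable for benchmarking only
    start := time.Now()
    argon2.IDKey([]byte("some password"), salt, timeParam, memoryKiB, 4, 32)
    return time.Since(start)
}

func main() {
    fmt.Println("time=1, memory=64 MB:", measure(1, 64*1024))
    fmt.Println("time=4, memory=16 MB:", measure(4, 16*1024))
}

If the two calls land in very different ballparks on your hardware, adjust the time parameter up or down accordingly; measuring on the target machine is more reliable than assuming the memory*time product must stay exactly constant.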
I don't think so; I believe the memory here means the length of the resulting hash, but the time parameter could mean "how many times the hashing result needs to be re-hashed until I get the end result". So these two parameters are independent of each other; they just control how much "brute-force cost savings due to time-memory tradeoffs" you want to achieve.