What is the best caching engine for large datasets that do not fit in memory?


I would like to serve a huge number of keys (100,000,000+), but only a few (50,000, the most frequently requested ones) can fit in memory. Does anyone have any experience with Redis, Membase, or other engines? Does anyone have benchmarks for serving keys from disk?
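For context, the pattern I have in mind is a small in-memory LRU in front of a big on-disk store. A minimal Python sketch of that pattern (the class name is mine, and the stdlib dbm backend just stands in for whatever engine ends up serving the disk side):

    import dbm
    from collections import OrderedDict

    class HotKeyCache:
        """Illustrative sketch: an in-RAM LRU for the ~50,000 hot keys
        in front of an on-disk store holding the 100,000,000+ total."""

        def __init__(self, path, max_hot=50000):
            self.disk = dbm.open(path, "c")  # stand-in disk store (bytes values)
            self.hot = OrderedDict()         # LRU order: most recently used last
            self.max_hot = max_hot

        def get(self, key):
            if key in self.hot:               # RAM hit: refresh recency, no disk I/O
                self.hot.move_to_end(key)
                return self.hot[key]
            value = self.disk[key]            # RAM miss: one disk read (KeyError if absent)
            self.hot[key] = value             # promote into the hot set
            if len(self.hot) > self.max_hot:
                self.hot.popitem(last=False)  # evict the least recently used key
            return value

        def put(self, key, value):
            self.disk[key] = value            # write through to disk first
            self.hot[key] = value
            self.hot.move_to_end(key)
            if len(self.hot) > self.max_hot:
                self.hot.popitem(last=False)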

Thanks


There are 2 answers

Vincent Chavelle

Salvatore Sanfilippo's response:

This is a use case for VM, but not when the difference between in-RAM and not-in-RAM is so big.

Btw in Redis unstable the VM is replaced by a new idea called "diskstore" that addresses your use case, but unfortunately it is not production-ready at this stage.
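For anyone who tries VM on those older releases anyway, the relevant knobs live in redis.conf. This is roughly how the 2.0-era sample config looked (values here are illustrative; VM was later deprecated and then removed entirely, so treat this as historical):

    vm-enabled yes
    vm-swap-file /tmp/redis.swap
    vm-max-memory 0      # 0 = swap values as aggressively as possible; set a RAM budget instead
    vm-page-size 32
    vm-pages 134217728
    vm-max-threads 4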

Anoop

If a large share of the keys is going to live on disk and the storage engine does not provide an efficient indexing mechanism, then performance will be hit severely. I don't think Redis B-trees are ready yet. You can check Tokyo Cabinet. It seems to provide key-value storage plus B-trees.
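To make the B-tree point concrete, here is a rough sketch against Tokyo Cabinet's B-tree database. It assumes the third-party pytc binding, and the method and flag names are an assumption that mirrors the C API (tcbdb*), so verify them against whichever binding you actually install:

    import pytc  # third-party Tokyo Cabinet binding (an assumption; APIs vary per binding)

    db = pytc.BDB()  # B-tree database: keys kept sorted in on-disk pages
    db.open('keys.tcb', pytc.BDBOWRITER | pytc.BDBOCREAT)

    db.put('user:42', 'some value')  # O(log n) insert into the on-disk B-tree
    print(db.get('user:42'))         # point lookup touches only a few pages, not the whole dataset

    db.close()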

http://www.igvita.com/2009/02/13/tokyo-cabinet-beyond-key-value-store/
http://colinhowe.wordpress.com/2009/04/27/redis-vs-mysql/