I have a lot of static data (i.e. read-only, non-transactional data) which gets updated only once every few days.
I have to support searches on that data (API calls, not SQL). So I am thinking I will just load it in memory and refresh the in-memory data once in a while. RAM should not be an issue since we are on 64-bit; the data will be in the 2 GB to 50 GB range.
I am hoping I can process searches on the in-memory data much faster than querying a database (even one with indexed tables).
Is there a certain "approach" I can take to design this in-memory data?
UPDATE:
My question isn't about what RDBMS / noSQLDB to use. I want to know how to structure data in-memory when I am no longer bound by a storage mechanism.
It totally depends on what kind of data you are working with and what kind of searches you want to perform on it.
For example, hash-based structures cannot support partial-word searches.
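To make that concrete (a minimal Python sketch, with made-up keys), a hash table gives O(1) exact-key lookups, but a prefix search degenerates into a scan of every key:

```python
# A dict (hash table) is fast for exact-match lookups...
inventory = {"apple": 10, "apricot": 4, "banana": 7}

print(inventory["apple"])  # exact match: O(1) -> 10

# ...but a partial-word (prefix) search has no index to use,
# so it must scan every key: O(n)
matches = [k for k in inventory if k.startswith("ap")]
print(sorted(matches))  # -> ['apple', 'apricot']
```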
You could go for an in-memory relational DB if your data is really relational (many columns and relations between tables), and index all the searchable columns. But an RDBMS is of little use if your data is just a bunch of key-value pairs or a bunch of paragraphs of text.
A specific data structure cannot be suggested here without knowing your requirements.
I suggest you explore data structures (search trees, tries, hash tables), in-memory databases (like Redis), and search engines (like Solr or Lucene) to find out which suits your needs best.
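As one illustration of the trade-off (a toy Python sketch, not a recommendation for your particular data), a trie supports exactly the prefix searches a hash table cannot:

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        # Walk/create one node per character.
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def words_with_prefix(self, prefix):
        # Descend to the node for the prefix, then collect
        # every word in the subtree below it.
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        results, stack = [], [(node, prefix)]
        while stack:
            cur, word = stack.pop()
            if cur.is_word:
                results.append(word)
            for ch, child in cur.children.items():
                stack.append((child, word + ch))
        return results

trie = Trie()
for w in ["apple", "apricot", "banana"]:
    trie.insert(w)
print(sorted(trie.words_with_prefix("ap")))  # -> ['apple', 'apricot']
```

The cost is memory overhead per character, which matters at your 2–50 GB scale; that is the kind of trade-off you can only weigh once you know your data and query patterns.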