Using Dict as a cache for Key/Value store vs database?

I’m using a dictionary as a key/value store to perform some rudimentary caching. In this context, my keys are strings anywhere from 20 to 60 characters in length, and values can be a kilobyte or two, maybe more.

The solution I’ve come up with works and prevents making time-consuming calls to retrieve identical data over and over, but I’ve only tested it with data sets of up to around 3,000 key/value pairs. Will a dict as my cache scale to 10,000 keys? 100,000 keys?

Or should I be looking to use a DB for these lookups?
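As a rough sanity check on scale (a back-of-the-envelope sketch using the ~2 KB value size quoted above; the numbers are illustrative):

    # assuming ~100,000 cached values at ~2 KB each (sizes from the post)
    entries = 100_000
    approx_bytes = entries * 2 * 1024
    print(f"{approx_bytes / 1e6:.0f} MB")  # ~205 MB, before per-entry dict overhead

That comfortably fits in memory on most machines, and dict lookups stay O(1) on average regardless of entry count, so 10,000 or even 100,000 keys should be well within a plain dict’s range as long as the process is long-lived and memory is available.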

What I’m currently doing is:

    def get_message(self, user, internet_message_id):
        # Cache hit: return the stored result without calling the API
        if internet_message_id in self.cache:
            return self.cache[internet_message_id]

        headers = {
            'Content-Type': 'application/json',
            'Authorization': 'Bearer ' + self.bearer_token()
        }
        response = self.session.get(
            f"https://graph.microsoft.com/v1.0/users/{user}/messages"
            f"?$filter=internetMessageId eq '{internet_message_id}'",
            headers=headers)
        # Cache the parsed JSON rather than the Response object, so cache
        # hits and misses return the same type
        result = response.json()
        self.cache[internet_message_id] = result
        return result
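If the cache ever needs to survive restarts or grow past comfortable memory, a lightweight database does the same job with little code. Below is a minimal sketch using the standard-library sqlite3 module; the SqliteCache class name and cache.db filename are illustrative assumptions, not anything from the original post:

    import json
    import sqlite3

    class SqliteCache:
        """Minimal persistent key/value cache backed by SQLite."""

        def __init__(self, path="cache.db"):
            self.conn = sqlite3.connect(path)
            self.conn.execute(
                "CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, value TEXT)"
            )

        def get(self, key):
            # Return the cached value, or None on a cache miss
            row = self.conn.execute(
                "SELECT value FROM cache WHERE key = ?", (key,)
            ).fetchone()
            return json.loads(row[0]) if row else None

        def set(self, key, value):
            # JSON-encode so the parsed API response can be stored as TEXT
            self.conn.execute(
                "INSERT OR REPLACE INTO cache (key, value) VALUES (?, ?)",
                (key, json.dumps(value)),
            )
            self.conn.commit()

get_message would then check self.cache.get(internet_message_id) first and call self.cache.set(...) after fetching; the lookup logic is otherwise unchanged. If persistence isn’t needed and the only worry is unbounded growth, an in-memory LRU (e.g. functools.lru_cache around a fetch function) is another common option.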

submitted by /u/identicalBadger
