Use a load balancer to spread requests across different servers. If each server keeps its own local cache, the caches can drift out of sync: the same key may hold different values on different servers. Use a shared cache like Memcached to solve this.
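The round-robin strategy mentioned in the notes below can be sketched in a few lines of Python; the server names here are made up for illustration:

```python
from itertools import cycle

# Hypothetical pool of application servers behind the load balancer.
SERVERS = ["app1", "app2", "app3"]
_rotation = cycle(SERVERS)

def route_request(request):
    """Hand the request to the next server in rotation."""
    server = next(_rotation)
    return server, request

# Requests go to app1, app2, app3, then wrap back to app1:
# route_request("GET /a") -> ("app1", "GET /a")
# route_request("GET /b") -> ("app2", "GET /b")
```

Real load balancers (nginx, HAProxy, ELB) do this at the connection level, but the rotation idea is the same.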

Code:
# implement the basic memcache functions
CACHE = {}

# return True after setting the data
def set(key, value):
    CACHE[key] = value
    return True

# return the value for key
def get(key):
    return CACHE.get(key)

# delete key from the cache
def delete(key):
    if key in CACHE:
        del CACHE[key]

# clear the entire cache
def flush():
    CACHE.clear()

# print(set('x', 1))
# >>> True
# print(get('x'))
# >>> 1
# print(get('y'))
# >>> None
# delete('x')
# print(get('x'))
# >>> None
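A common way to use a cache like this is the cache-aside pattern: try the cache first, and only hit the database on a miss. A minimal sketch, where `load_from_db` is a hypothetical stand-in for a real database query:

```python
CACHE = {}  # same in-memory store as above

def load_from_db(key):
    # Hypothetical stand-in for a real database query.
    return "value-for-" + key

def get_or_load(key):
    """Cache-aside read: check the cache first, fall back to the database."""
    value = CACHE.get(key)
    if value is None:              # cache miss
        value = load_from_db(key)  # slow path
        CACHE[key] = value         # populate the cache for the next request
    return value
```

Note that when a hot key expires, many requests can miss at once and all hit the database together; that is the cache stampede problem linked in the references.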
Note:
- A load balancer just spreads requests across servers. See Round Robin.
- Memcached is all in memory:
  - fast
  - not durable
  - limited by available memory
- When no more memory is available, Memcached evicts the data that was least recently used (LRU).
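The LRU eviction described above can be sketched with Python's `OrderedDict`: each read moves the key to the most-recently-used end, and a write that exceeds capacity evicts from the least-recently-used end. The capacity of 2 is just for illustration; real Memcached limits by bytes, not item count.

```python
from collections import OrderedDict

class LRUCache:
    """Toy cache that evicts the least-recently-used key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def set(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used

# cache = LRUCache(2)
# cache.set('a', 1); cache.set('b', 2)
# cache.get('a')     # touches 'a', so 'b' is now least recently used
# cache.set('c', 3)  # evicts 'b'
```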
References:
https://en.wikipedia.org/wiki/Cache_stampede
https://en.wikipedia.org/wiki/Memcached