Memcached users - Looking for some Guinea Pigs
"How are you gonna use your RAM"
So… that's what I'd like to try. I have most of a control panel coded up, and hope to have it to a "minimum viable" level by the end of the weekend.
The basic idea is to offer memcached instances over the private network. Each instance would obviously get its own port, and you can specify the IP addresses that should be able to access that port.
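To make it concrete, talking to one of these instances from Python would look roughly like this (the address and port below are made up, and pymemcache is just one of several client libraries you could use):

from pymemcache.client.base import Client

# hypothetical private-network address and the port assigned to your instance
client = Client(('192.168.128.10', 11211))

client.set('greeting', 'hello over the private network')
print(client.get('greeting'))  # b'hello over the private network'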
Pricing is TBD, but I would like to get some folks interested in testing it for a little while to make sure there aren't any surprises.
So, if you use memcached, and would like to spin up a couple extra instances in Newark (sorry, only Newark for right now), let me know.
5 Replies
I'm a Dallas man myself, so I can't participate; however, have you considered redis?
But they probably can't give you 0.1ms latency using the private network.
For the most part though, redis has the same commands as memcache. A few commands have different names ("gets" is "mget", for instance), and the only things memcache has that redis doesn't natively support, AFAIK, are "prepend" and "cas", but those can be implemented on top of the existing commands (using transactions). Of course, caches generally don't use these commands.
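For example, a cas-style check-and-set can be faked with WATCH/MULTI/EXEC via redis-py. Just a rough sketch, connection details made up, and note redis hands back bytes, so compare accordingly:

import redis
from redis import WatchError

r = redis.Redis(host='192.168.128.2', port=6379)  # made-up private address

def check_and_set(key, expected, new_value):
    # only write new_value if the key still holds expected (both bytes here)
    with r.pipeline() as pipe:
        try:
            pipe.watch(key)            # optimistic lock on the key
            if pipe.get(key) != expected:
                pipe.unwatch()
                return False           # value changed underneath us
            pipe.multi()               # queue the write as a transaction
            pipe.set(key, new_value)
            pipe.execute()             # aborts if the key was touched since WATCH
            return True
        except WatchError:
            return False               # lost the race; caller can retry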
There are probably also differences in behavior (like what happens when either system runs out of RAM, or how expirations work), but for general caching purposes you really wouldn't notice the difference.
Gives me an idea to write a redis-masquerading-as-memcached wrapper lib for Python, if one doesn't already exist. Would make dropping in redis a lot easier.
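Something like this thin shim is all I mean, memcache-flavoured calls mapped onto redis-py (class name, defaults, and the private address here are all made up):

import redis

class RedisAsMemcached:
    # toy shim: expose memcache-style get/set/delete/incr on top of redis
    def __init__(self, host='192.168.128.2', port=6379):
        self._r = redis.Redis(host=host, port=port)

    def set(self, key, value, expire=0):
        # memcached's expire of 0 means "never"; redis wants ex=None for that
        return self._r.set(key, value, ex=expire or None)

    def get(self, key):
        return self._r.get(key)

    def delete(self, key):
        return bool(self._r.delete(key))

    def incr(self, key, delta=1):
        return self._r.incr(key, delta)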