
From Google Cloud, Powering Up Caching With Memorystore for Memcached

Beta, managed and scalable service compatible with open source Memcached protocol

By Gopal Ashok, product manager, Memorystore, Google Cloud

In-memory data stores are a fundamental infrastructure for building scalable, high-performance applications.


Whether it is building a highly responsive e-commerce web site, creating multiplayer games with thousands of users, or doing real-time analysis on data pipelines with millions of events, an in-memory store helps provide low latency and scale for millions of transactions. Redis is a popular in-memory data store for use cases like session stores, gaming leaderboards, stream analytics, API rate limiting, threat detection, and more. Another in-memory data store, open source Memcached, continues to be a popular choice as a caching layer for databases and is used for its speed and simplicity.

Google is announcing Memorystore for Memcached in beta, a managed, scalable service that's compatible with the open source Memcached protocol. The company launched Memorystore for Redis in 2018 to let users harness the power of open source Redis without the burden of management, and this announcement brings more flexibility and choice to the caching layer.

Highlights of Memorystore for Memcached
Memcached offers a simple and powerful in-memory key value store and is popular as a front-end cache for databases. Using Memcached as a front-end store provides an in-memory caching layer for faster query processing, and can also help save costs by reducing the load on back-end databases.
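
To make the front-end caching role concrete, here is a minimal cache-aside sketch using the open source pymemcache client in Python. The instance IP, port, key naming, expiry, and the fetch_user_from_db helper are illustrative assumptions, not details from the announcement.

```python
# Cache-aside lookup: check Memcached first, fall back to the database on a miss.
import json

from pymemcache.client.base import Client

# Assumed Memorystore instance IP and default Memcached port (placeholders).
cache = Client(("10.0.0.3", 11211))

def fetch_user_from_db(user_id: int) -> dict:
    # Placeholder for a real back-end database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)               # cache hit: skip the database
    user = fetch_user_from_db(user_id)          # cache miss: query the database
    cache.set(key, json.dumps(user), expire=300)  # keep the result warm for 5 minutes
    return user
```

Every hit served from memory is a query the back-end database never sees, which is where the speed and cost benefits described above come from.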

Using Memorystore for Memcached provides several benefits:

  • Memorystore for Memcached is open source protocol compatible. If you are migrating applications that use self-deployed Memcached or another cloud provider's service, you can migrate your application with zero code changes (see the client sketch after this list).

  • Memorystore for Memcached is fully managed. Common tasks that you would otherwise spend time on, such as deployment, scaling, managing node configuration on the client, setting up monitoring, and patching, are all taken care of, so you can focus on building applications.

  • Right-sizing a cache is a common challenge with distributed caches. The scaling feature of Memorystore for Memcached, along with detailed open source Memcached monitoring metrics, allows you to scale instances up and down easily to optimize for cache-hit ratio and price. With Memorystore for Memcached, you can scale your cluster up to 5TB per instance.

  • Auto-discovery protocol lets clients adapt to changes programmatically, making it easy to deal with changes to the number of nodes during scaling. This reduces manageability overhead and code complexity.

  • You can monitor Memorystore for Memcached instances with built-in dashboards in the Cloud Console and rich metrics in Cloud Monitoring.
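
Because the service speaks the standard Memcached protocol, an existing open source client connects without code changes. The sketch below uses pymemcache's HashClient to distribute keys across nodes; the node addresses are placeholders, and in practice a client could refresh its node list from the auto-discovery service mentioned above rather than hard-coding it.

```python
# A standard open source Memcached client, pointed at (assumed) Memorystore node
# addresses. Keys are distributed across the nodes client-side by the hash ring.
from pymemcache.client.hash import HashClient

nodes = [
    ("10.0.0.3", 11211),   # placeholder node addresses
    ("10.0.0.4", 11211),
]

client = HashClient(nodes)
client.set("greeting", "hello", expire=60)
print(client.get("greeting"))   # b'hello'
```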

Memorystore for Memcached can be accessed from applications running on Compute Engine, Google Kubernetes Engine (GKE), App Engine Flex, App Engine Standard, and Cloud Functions.


The beta launch is available in major regions across the US, Asia, and Europe, and will be available globally soon.

To get started, check out the quick start guide. Sign up for a $300 credit to try Memorystore and the rest of Google Cloud. You can start with the smallest instance and when you’re ready, you can scale up to serve performance-intensive applications.
