AWS ElastiCache Service features and usage scenarios

ElastiCache Service

The goals of this post are:

  • Describe the features and benefits of the Amazon ElastiCache service,
  • Explain when to use the Amazon ElastiCache service,
  • Locate the ElastiCache option in the AWS console and discuss how to configure it, and
  • Give examples of use case scenarios for Amazon ElastiCache.

What is ElastiCache?

Amazon ElastiCache is a web service that makes it easy to deploy, operate, and scale an in-memory data store or cache in the cloud.

This service improves the performance of web applications by allowing you to retrieve information from fast, managed in-memory data stores, instead of relying entirely on slower, disk-based databases.

So, what does that mean? Let’s take a look at caching, and then expand our discussion to in-memory caching environments in the cloud.

Caching

Let’s start out by discussing a basic caching concept. Think about your own computing devices: if you need more space to store your files, applications, and so on, you buy a larger hard drive. The hard drive is for items that you need to keep; they are persistent. However, even if you keep adding hard drive space, your computer may not seem to get any faster. In fact, it may feel slower as you keep opening all of your new files and applications.

How do we make the computer perform faster and serve up our information more quickly? We add RAM, more memory. Why do we do this? So that the computer can store frequently accessed information in memory instead of having to run to the disk for the information.

Now, let’s take this analogy and relate it to how a web application uses in-memory caching. To be clear, ElastiCache is not just used for web applications; it can be used by any application that can benefit from the increased performance of an in-memory cache. But we’re going to use a web application for our example.

A common scenario is a web application that reads and writes data to persistent storage, such as a relational database like MySQL or a NoSQL database. However, persistent, disk-based storage tends to experience some fluctuation in latency, because each piece of data must be written to or retrieved from a permanent media store. This can affect overall performance.

This is where an in-memory cache is useful.

It’s generally used to improve read performance.

Many websites serve a high percentage of reads and relatively few writes. An in-memory cache can store frequently accessed, read-heavy information and serve it up much more quickly than having the application continually request it from a persistent data store. Picture a very simple web app: web servers reading from and writing to a persistent data store, with an in-memory cache sitting in front of that store for frequently requested items.
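As a rough illustration of that read path, here is a minimal cache-aside sketch in Python. The Redis endpoint and the `query_database` helper are placeholders, not part of any particular application:

```python
import json
import redis  # standard redis-py client; works against an ElastiCache Redis endpoint

# Hypothetical ElastiCache endpoint; substitute your cluster's address.
cache = redis.Redis(host="my-cache.example.amazonaws.com", port=6379)

def query_database(product_id):
    """Placeholder for a (slower) read from the persistent data store."""
    return {"id": product_id, "name": "example product"}

def get_product(product_id):
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        # Cache hit: serve straight from memory, no database round trip.
        return json.loads(cached)
    # Cache miss: read from the database, then populate the cache for next time.
    product = query_database(product_id)
    cache.set(key, json.dumps(product), ex=300)  # expire after 5 minutes
    return product
```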

Now imagine that your app becomes more popular, and you need to scale up. Adding more web servers is not that difficult. But vertically scaling a persistent data store, such as a relational database, is usually more complicated. That is where a caching layer can really make a difference.

You can add more web servers and grow your ElastiCache caching layer to keep up with the increased demand. This can reduce or even eliminate the need to scale up your persistent data store.

How is ElastiCache being used today?

Some common uses for ElastiCache include:

  • Online gaming applications, where it’s important that the game presents information, like the scoreboard, as quickly and as consistently as possible to all players.
  • Social networking sites, which need a way to store temporary session information and handle session management.
  • Q&A sites, where some articles are very popular and therefore requested frequently.
  • Recommendation engines, where the data sets that make up a recommendation need to be presented quickly to a large number of users.

As you can see, these are applications with lots of read-heavy content that would benefit from an in-memory cache. Oftentimes, users are scanning these sites for information: who currently has the high score in a game, what their friends are up to, a how-to, or guidance on the best restaurant. Obviously, information is written to these sites as well, such as when something is updated or changed, and that information is sent to the permanent data store. By the way, in-memory caching is not limited to read-only data. An application developer may also want to store write data in the cache, which ElastiCache supports; it’s just less common, so this lesson focuses mainly on read caching.
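Session state is a good example of that write use case: it is small, short-lived, and does not need to survive in the permanent data store. Here is a minimal sketch, assuming a Redis-based ElastiCache cluster; the endpoint and key names are placeholders:

```python
import json
import uuid
import redis

# Hypothetical ElastiCache Redis endpoint.
sessions = redis.Redis(host="my-sessions.example.amazonaws.com", port=6379)

def create_session(user_id):
    """Store session state in the cache with a 30-minute expiry."""
    session_id = str(uuid.uuid4())
    sessions.set(f"session:{session_id}", json.dumps({"user_id": user_id}), ex=1800)
    return session_id

def load_session(session_id):
    data = sessions.get(f"session:{session_id}")
    if data is None:
        return None  # expired or never existed
    # Refresh the expiry so active users stay logged in.
    sessions.expire(f"session:{session_id}", 1800)
    return json.loads(data)
```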

When to use ElastiCache?

Amazon ElastiCache can be a very useful tool for developers. Application developers are always looking for ways to ensure the best performance and availability for their applications, and databases and other permanent storage options are relatively slow. Users are impatient today: waiting a few extra seconds for an app to respond can mean losing business to a competitor. Developers are thrilled when an app becomes popular, but then there is the challenge of scaling. Adding front-end capacity, like more web servers, is fairly simple; scaling persistent storage is where things get complicated and messy. On top of that, there are management tasks to consider: updating, patching, monitoring, and securing data all take time. Using ElastiCache lets developers focus on building and improving their apps by simplifying and reducing these administrative tasks.

ElastiCache Features

Now that we have a better understanding of what the ElastiCache service is, let’s look at the product features. Adding an in-memory caching layer with ElastiCache can increase performance, because the cache responds much more rapidly than a persistent data store, and it makes it easier for the application to scale as it grows.

ElastiCache supports both Memcached and Redis, so existing applications can be easily moved to ElastiCache.

But why would you consider the Amazon ElastiCache service over running your own Redis or Memcached installation on your own servers? The core benefit of using ElastiCache is that it’s a managed service, which means configuration is simplified. The ElastiCache management console allows you to create new nodes with just a few clicks.

A cache node is a fixed-size chunk of secure, network-attached RAM, essentially the building block of the ElastiCache service.

ElastiCache also supports clustered configurations: a cluster is a collection of one or more cache nodes.

Once you’ve provisioned a cluster, Amazon ElastiCache automatically detects and replaces failed nodes, which helps reduce the risk of overloaded databases and therefore reduces website and application load times. Without ElastiCache, the tasks of managing the underlying infrastructure, like scaling and patching, fall to you. Securing the data also falls to the developer when running Memcached or Redis themselves, whereas ElastiCache can make use of AWS and Amazon Virtual Private Cloud (VPC) security features to protect the data.
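If you prefer to script what the console does, the same provisioning can be driven through the API. Below is a minimal sketch using boto3, assuming appropriate IAM permissions are in place; the cluster ID and node type are illustrative choices, not required values:

```python
import boto3

elasticache = boto3.client("elasticache", region_name="us-east-1")

# Create a small, single-node Memcached cluster.
elasticache.create_cache_cluster(
    CacheClusterId="demo-cache",
    Engine="memcached",
    CacheNodeType="cache.t3.micro",
    NumCacheNodes=1,
)

# Wait until the cluster is available, then print its node endpoints.
waiter = elasticache.get_waiter("cache_cluster_available")
waiter.wait(CacheClusterId="demo-cache")

response = elasticache.describe_cache_clusters(
    CacheClusterId="demo-cache", ShowCacheNodeInfo=True
)
for node in response["CacheClusters"][0]["CacheNodes"]:
    print(node["Endpoint"]["Address"], node["Endpoint"]["Port"])
```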

Redis vs Memcached

To give you a general idea of the capabilities of the two supported cache engines, here is a broad comparison.

In broad terms, Redis offers more features, whereas Memcached is recognized for its simplicity and raw speed.

Memcached really suits workloads where memory allocation is going to be consistent, and increased performance is more important than the additional Redis features. Often, the choice between the two comes down to memory and how you expect to use it.
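Whichever engine you choose, your application keeps using the ordinary client libraries, because ElastiCache exposes standard Memcached and Redis endpoints. A quick sketch with hypothetical endpoints, using redis-py for Redis and pymemcache for Memcached:

```python
import redis
from pymemcache.client.base import Client as MemcachedClient

# Hypothetical ElastiCache endpoints; substitute your own cluster addresses.
redis_cache = redis.Redis(host="my-redis.example.amazonaws.com", port=6379)
memcached_cache = MemcachedClient(("my-memcached.example.amazonaws.com", 11211))

# The calling code is essentially the same either way: set a value, read it back.
redis_cache.set("greeting", "hello", ex=60)
print(redis_cache.get("greeting"))

memcached_cache.set("greeting", "hello", expire=60)
print(memcached_cache.get("greeting"))
```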

ElastiCache Management Console

From the AWS console, open the ElastiCache Management Console. From there you can:

  • create a cluster using either the Memcached or Redis engine,
  • authorize and connect to clusters,
  • add resources and manage your clusters, and
  • modify configuration settings and monitor your ElastiCache environments.

Choosing the default configuration values will generally suffice while you get familiar with the ElastiCache service.

Sample Scenario

Now, let’s take a look at a scenario to get a better understanding of how ElastiCache really works. Let’s imagine we manufacture motorcycles, and we have a website that provides support information about the range of motorcycles that we sell worldwide. We’ve sold five million motorcycles since 2010.

Our support website usually receives around 100 thousand hits a day, generally from people looking for information about the specifications of our motorcycles and user guides. One day, a fault is reported in a hose pipe commonly used in motorcycle engines, and anyone who owns a motorcycle wants to verify that their bike does not use this part. Luckily, our motorcycles do not use the faulty hose pipe, and we put out a press release stating that fact. However, we fear the worst: last year a similar fault was announced and our website crashed when two million customers checked it for information about the fault.

This time, our website received seven million views. However, it was able to respond to those requests because, after last year’s crash, we implemented Amazon ElastiCache between the web servers and the MySQL database to cache site content. Now, when a web server requests the press release page, the content of that page is delivered out of Amazon ElastiCache. This removes the need for the web server to fetch the page content from the MySQL database and dramatically reduces the time it takes to display the press release.
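In code terms, this is the same cache-aside pattern sketched earlier, applied to whole page content: the handler is keyed by the page path, and the MySQL query only runs on a cache miss. The endpoint, schema, and credentials below are purely illustrative:

```python
import redis
import pymysql  # plain MySQL client, standing in for the persistent store

cache = redis.Redis(host="my-cache.example.amazonaws.com", port=6379)

def render_page(path):
    cached = cache.get(f"page:{path}")
    if cached is not None:
        # Millions of identical requests are served straight from memory.
        return cached.decode()

    # Only cache misses reach the MySQL database.
    conn = pymysql.connect(
        host="db.example.internal", user="web", password="...", database="site"
    )
    with conn.cursor() as cur:
        cur.execute("SELECT body FROM pages WHERE path = %s", (path,))
        row = cur.fetchone()
    conn.close()

    body = row[0] if row else "Not found"
    cache.set(f"page:{path}", body, ex=600)  # cache the rendered page for 10 minutes
    return body
```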

FAQs