Course: Cache concepts

This beginner caching course covers the following topics:

  • What is caching
  • Why it matters
  • Why a serverless service is great for caching
  • How caching works
  • Caching strategies
  • How to employ those strategies
  • Time to live (TTL)
  • Cache eviction vs cache expiration

Video

This intro video (1:38) covers what you need to know to get started with caching in your architecture.

Transcript

If you have never worked with caching or want a refresher on the basics, you're in the right place. In this course, you'll learn what caching is, why it matters, and why a serverless service is a great choice for caching. I'll also walk through the basics of how caching works, common caching strategies, and how to employ them in your application architecture.

Let's talk about what caching is. Caching is a technique used in software development to store frequently used data in a temporary storage area to speed up the performance of an application. The storage area might be in-memory, local disk-based storage, or a distributed caching service. The goal of application caching is to reduce the number of times an application has to retrieve data from a slower storage system, such as a database, an API, or a remote service, and instead to retrieve that data from a cache closer to where it is needed.
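As a minimal sketch of that read path, the Python snippet below checks an in-memory dictionary before falling back to the slower store. The function names, the dictionary cache, and the simulated database call are all hypothetical and illustrative; they are not from any specific caching library.

```python
import time

# Hypothetical in-memory cache shared by the application process.
_cache: dict[str, dict] = {}


def slow_database_lookup(user_id: str) -> dict:
    """Stand-in for a slower storage system (database, API, or remote service)."""
    time.sleep(0.5)  # simulate network/database latency
    return {"id": user_id, "name": "example"}


def get_user(user_id: str) -> dict:
    """Cache-aside read: check the cache first, fall back to the slower store on a miss."""
    if user_id in _cache:
        return _cache[user_id]      # cache hit: no trip to the slower store
    value = slow_database_lookup(user_id)
    _cache[user_id] = value         # populate the cache for the next caller
    return value
```

The first call pays the full lookup cost; repeated calls for the same key are served from memory, which is the speedup described above.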

Application caching can be used to store a variety of content, including database query results, API responses, images, and other frequently accessed data. By caching this data, an application can reduce the time it takes to retrieve it and thus improve the overall experience of the application.
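To illustrate caching query results or API responses generically, here is a small, hypothetical TTL-based cache decorator in Python; it is a sketch under the assumption of a single-process, in-memory cache, not a production caching client.

```python
import functools
import time


def ttl_cache(ttl_seconds: float):
    """Cache a function's return values in memory for ttl_seconds per argument set."""
    def decorator(func):
        store: dict[tuple, tuple[object, float]] = {}

        @functools.wraps(func)
        def wrapper(*args):
            now = time.time()
            hit = store.get(args)
            if hit is not None and now < hit[1]:
                return hit[0]                       # fresh cached result
            result = func(*args)                    # slow path: query or API call
            store[args] = (result, now + ttl_seconds)
            return result
        return wrapper
    return decorator


@ttl_cache(ttl_seconds=30)
def fetch_product(product_id: str) -> dict:
    # Imagine a database query or HTTP request here.
    return {"id": product_id, "price": 9.99}
```

Repeated calls to `fetch_product` within the 30-second window return the cached result; after that, the entry is considered stale and the slow path runs again.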

The choice of caching technology depends on the specific needs of the application and a few key criteria, which we will get into later in the course. With that foundation, let's press on to more topics.