Using Compression in the Momento Node.js SDK

Why Compression?

If your cache data consists of large string values, especially those that can be repetitive like JSON, enabling client-side compression may reduce the size of the data that you are transferring to and from Momento by as much as 90%. This can result in significant cost savings due to reduced network traffic and storage costs.

Enabling Compression

The get/set and getBatch/setBatch cache methods in the Momento Node.js SDK support compression. To avoid requiring additional dependencies in the main SDK, compression support is provided by installing an extension package. There are two extension packages to choose from, both available on npm: one based on gzip and one based on zstd.

Because gzip is available in the Node.js standard library, the gzip extension requires no additional dependencies, so packaging your app works the same as for any other Momento Node.js app. The zstd extension requires a native dependency that is specific to your target platform, so you will need to make sure your builds include the correct dependency for each target architecture.

In our testing we found that, for typical JSON data stored in a Momento cache, the compression ratio is very similar between the gzip and zstd extensions. However, for very large values (100 KB or larger), zstd can compress and decompress the data up to 20% faster.

If you're not sure which extension is right for you, we recommend starting with the default @gomomento/sdk-nodejs-compression package. You can switch to zstd later if you find that you need the additional performance.

To get started with compression, first add the compression extension package to your project:

npm install @gomomento/sdk-nodejs-compression

Once that is installed, you can enable compression by adding a compression strategy to the cache client configuration:

compressorFactory: CompressorFactory.default(),
compressionLevel: CompressionLevel.Balanced,
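In context, these options are passed when building the cache client. A minimal sketch, assuming the `withCompressionStrategy` configuration builder and the import locations shown (the `Laptop` configuration, environment variable name, and TTL are placeholders, not requirements):

```typescript
import {CacheClient, Configurations, CredentialProvider} from '@gomomento/sdk';
import {CompressorFactory, CompressionLevel} from '@gomomento/sdk-nodejs-compression';

const cacheClient = await CacheClient.create({
  // Attach the compression strategy to an existing configuration.
  configuration: Configurations.Laptop.v1().withCompressionStrategy({
    compressorFactory: CompressorFactory.default(),
    compressionLevel: CompressionLevel.Balanced,
  }),
  credentialProvider: CredentialProvider.fromEnvironmentVariable('MOMENTO_API_KEY'),
  defaultTtlSeconds: 60,
});
```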

Compressing your data

With the dependency installed and the client configured, you can specify compress: true when calling set or setBatch to compress that value:

const result = await cacheClient.set(cacheName, 'test-key', 'test-value', {compress: true});
if (result instanceof CacheSet.Success) {
  console.log("Key 'test-key' stored successfully");
} else if (result instanceof CacheSet.Error) {
  throw new Error(
    `An error occurred while attempting to store key 'test-key' in cache '${cacheName}': ${result.errorCode()}: ${result.toString()}`
  );
}

Automatic Decompression

By default, when you enable compression, the SDK also enables automatic decompression. This means that any cache value that the SDK reads via get or getBatch will be automatically decompressed if it was compressed when it was written. Therefore, you don't need to change any of your existing calls to get or getBatch to handle compressed data.

If you want to be able to compress data, but don't want the SDK to automatically decompress it, you can also configure that:

compressorFactory: CompressorFactory.default(),
compressionLevel: CompressionLevel.Balanced,
automaticDecompression: AutomaticDecompression.Disabled,

If automatic decompression is disabled, you can specify decompress: true when calling get or getBatch to tell the SDK to decompress that particular value:

const result = await cacheClient.get(cacheName, 'test-key', {decompress: true});
if (result instanceof CacheGet.Hit) {
  console.log(`Retrieved value for key 'test-key': ${result.valueString()}`);
} else if (result instanceof CacheGet.Miss) {
  console.log(`Key 'test-key' was not found in cache '${cacheName}'`);
} else if (result instanceof CacheGet.Error) {
  throw new Error(
    `An error occurred while attempting to get key 'test-key' from cache '${cacheName}': ${result.errorCode()}: ${result.toString()}`
  );
}

Values that were stored uncompressed are unaffected by the compression configuration.

More Examples

Additional examples are available in the SDK's GitHub repository.