Redis Document Store

Redis can serve as an efficient document store for Retrieval-Augmented Generation (RAG) applications using Quarkus LangChain4j. This guide explains how to configure and use Redis as a vector-capable document store.

This page focuses on integrating Redis as a vector store, which lets you store and retrieve documents enriched with vector embeddings for RAG scenarios. You can also implement a custom ContentRetriever on top of it.

Prerequisites

To use Redis as a document store, the following conditions must be met:

  • Redis must support both the RedisJSON and RedisSearch modules.

  • We recommend using Redis Stack, which bundles these modules.

  • Vector embeddings must have a fixed dimension matching your embedding model.

Redis Stack Required

Redis support requires the RedisJSON and RedisSearch modules. These are not part of standard Redis deployments. Use the redis-stack:latest image or an equivalent setup that includes the necessary modules.
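
If you are not relying on Dev Services, one way to start a suitable instance is with Docker. The command below is a minimal sketch using the official Redis Stack image and its default ports:

docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack:latest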

When using Dev Services, the quarkus-langchain4j-redis extension automatically uses the redis-stack:latest container image. You can override this by setting the following property:

quarkus.redis.devservices.image-name=your/custom/image

Dependency

To enable Redis integration in your Quarkus project, add the following Maven dependency:

<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-redis</artifactId>
    <version>1.0.2</version>
</dependency>

This extension builds on top of the Quarkus Redis client. Ensure the default Redis data source is properly configured; for more details, refer to the Quarkus Redis reference guide.
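
For example, a minimal configuration pointing the default Redis data source at a locally running Redis Stack instance might look like this (the host and port are assumptions for a local setup):

quarkus.redis.hosts=redis://localhost:6379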

Embedding Dimension

You must configure the dimension of the embedding vectors to match your model by setting:

quarkus.langchain4j.redis.dimension=384

Typical values include:

  • AllMiniLmL6V2QuantizedEmbeddingModel → 384

  • OpenAI text-embedding-ada-002 → 1536

If the embedding dimension is missing or mismatched, ingestion and retrieval will fail or produce inaccurate results.

If you switch to a different embedding model, ensure the dimension value is updated accordingly.
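
For instance, when pairing the store with OpenAI's text-embedding-ada-002, the configuration could look like the following sketch (the embedding-model property assumes the quarkus-langchain4j-openai extension is used):

quarkus.langchain4j.openai.embedding-model.model-name=text-embedding-ada-002
quarkus.langchain4j.redis.dimension=1536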

Usage Example

Once the extension is installed and configured, you can use the Redis document store as shown below:

package io.quarkiverse.langchain4j.samples;

import static dev.langchain4j.data.document.splitter.DocumentSplitters.recursive;

import java.util.List;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import dev.langchain4j.data.document.Document;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import io.quarkiverse.langchain4j.redis.RedisEmbeddingStore;

@ApplicationScoped
public class IngestorExampleWithRedis {

    /**
     * The embedding store (the database).
     * The bean is provided by the quarkus-langchain4j-redis extension.
     */
    @Inject
    RedisEmbeddingStore store;

    /**
     * The embedding model (how the vector of a document is computed).
     * The bean is provided by an LLM extension (such as OpenAI).
     */
    @Inject
    EmbeddingModel embeddingModel;

    public void ingest(List<Document> documents) {
        EmbeddingStoreIngestor ingestor = EmbeddingStoreIngestor.builder()
                .embeddingStore(store)
                .embeddingModel(embeddingModel)
                .documentSplitter(recursive(500, 0))
                .build();
        // Warning - this can take a long time...
        ingestor.ingest(documents);
    }
}
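
As a usage sketch, the ingestor could be invoked when the application starts. The directory name below is a placeholder, and FileSystemDocumentLoader comes from the core LangChain4j library:

package io.quarkiverse.langchain4j.samples;

import java.nio.file.Path;
import java.util.List;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.event.Observes;
import jakarta.inject.Inject;

import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import io.quarkus.runtime.StartupEvent;

@ApplicationScoped
public class StartupIngestion {

    @Inject
    IngestorExampleWithRedis ingestor;

    void onStart(@Observes StartupEvent event) {
        // Load all documents from a local directory (placeholder path)
        List<Document> documents = FileSystemDocumentLoader.loadDocuments(Path.of("documents"));
        ingestor.ingest(documents);
    }
}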

Configuration

By default, the extension uses the default Redis data source and standard configuration values. You can customize its behavior using the following options:

quarkus.langchain4j.redis.client-name
The name of the Redis client to use. These clients are configured by means of the redis-client extension. If unspecified, the default Redis client is used.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_CLIENT_NAME
Type: string

quarkus.langchain4j.redis.dimension
The dimension of the embedding vectors. This has to be the same as the dimension of vectors produced by the embedding model that you use. For example, AllMiniLmL6V2QuantizedEmbeddingModel produces vectors of dimension 384, and OpenAI’s text-embedding-ada-002 produces vectors of dimension 1536.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_DIMENSION
Type: long (required)

quarkus.langchain4j.redis.index-name
Name of the index that will be used in Redis when searching for related embeddings. If this index doesn’t exist, it will be created.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_INDEX_NAME
Type: string
Default: embedding-index

quarkus.langchain4j.redis.textual-metadata-fields
Names of fields that will store textual metadata associated with embeddings. NOTE: Filtering based on textual metadata fields is not supported at the moment.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_TEXTUAL_METADATA_FIELDS
Type: list of string

quarkus.langchain4j.redis.numeric-metadata-fields
Names of fields that will store numeric metadata associated with embeddings.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_NUMERIC_METADATA_FIELDS
Type: list of string

quarkus.langchain4j.redis.distance-metric
Metric used to compute the distance between two vectors.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_DISTANCE_METRIC
Allowed values: l2, ip, cosine
Default: cosine

quarkus.langchain4j.redis.vector-field-name
Name of the key that will be used to store the embedding vector.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_VECTOR_FIELD_NAME
Type: string
Default: vector

quarkus.langchain4j.redis.scalar-field-name
Name of the key that will be used to store the embedded text.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_SCALAR_FIELD_NAME
Type: string
Default: scalar

quarkus.langchain4j.redis.prefix
Prefix to be applied to all keys by the embedding store. Embeddings are stored in Redis under a key that is the concatenation of this prefix and the embedding ID. If the configured prefix does not end with :, it will be added automatically to follow the Redis convention.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_PREFIX
Type: string
Default: embedding:

quarkus.langchain4j.redis.vector-algorithm
Algorithm used to index the embedding vectors.
Environment variable: QUARKUS_LANGCHAIN4J_REDIS_VECTOR_ALGORITHM
Allowed values: flat, hnsw
Default: hnsw
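
As an illustrative example, an application.properties tuning several of these options might look like the following (the index name and prefix values are arbitrary):

quarkus.langchain4j.redis.dimension=384
quarkus.langchain4j.redis.index-name=docs-index
quarkus.langchain4j.redis.distance-metric=cosine
quarkus.langchain4j.redis.vector-algorithm=hnsw
quarkus.langchain4j.redis.prefix=docs: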

How It Works

Each ingested document is stored in Redis as a JSON object that includes:

  • The original text

  • Metadata

  • A vector embedding

The extension automatically indexes documents using RedisSearch, enabling fast similarity queries. Retrieval operations use FT.SEARCH combined with KNN (k-nearest neighbor) queries to find the most relevant results.
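
For retrieval, a content retriever can be built on top of the same store. The sketch below uses LangChain4j's EmbeddingStoreContentRetriever; the maxResults and minScore values are arbitrary examples:

package io.quarkiverse.langchain4j.samples;

import java.util.List;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.rag.content.Content;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.rag.query.Query;
import io.quarkiverse.langchain4j.redis.RedisEmbeddingStore;

@ApplicationScoped
public class RetrieverExampleWithRedis {

    @Inject
    RedisEmbeddingStore store;

    @Inject
    EmbeddingModel embeddingModel;

    public List<Content> retrieve(String question) {
        EmbeddingStoreContentRetriever retriever = EmbeddingStoreContentRetriever.builder()
                .embeddingStore(store)
                .embeddingModel(embeddingModel)
                .maxResults(3)   // arbitrary example value
                .minScore(0.7)   // arbitrary example value
                .build();
        return retriever.retrieve(Query.from(question));
    }
}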

Metadata Filtering

The Redis document store supports limited metadata filtering with the following constraints:

  • Only numeric fields can be used in filters.

  • Only AND conditions are supported (no OR, NOT, or nested logic).

  • You must explicitly declare the numeric fields that can be used for filtering:

quarkus.langchain4j.redis.numeric-metadata-fields=year,pageCount

You may also declare textual fields, which will be returned with search results but cannot be used for filtering:

quarkus.langchain4j.redis.textual-metadata-fields=language,author

Textual metadata fields are returned in query results but are not currently usable as filter criteria.
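
As an illustration, a similarity search restricted by numeric metadata might be expressed with LangChain4j's metadata filter API. The sketch below assumes the year and pageCount fields declared above; the threshold values are arbitrary:

package io.quarkiverse.langchain4j.samples;

import static dev.langchain4j.store.embedding.filter.MetadataFilterBuilder.metadataKey;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingSearchRequest;
import dev.langchain4j.store.embedding.EmbeddingSearchResult;
import io.quarkiverse.langchain4j.redis.RedisEmbeddingStore;

@ApplicationScoped
public class FilteredSearchExampleWithRedis {

    @Inject
    RedisEmbeddingStore store;

    @Inject
    EmbeddingModel embeddingModel;

    public EmbeddingSearchResult<TextSegment> search(String question) {
        // Embed the query text with the configured embedding model
        Embedding queryEmbedding = embeddingModel.embed(question).content();
        EmbeddingSearchRequest request = EmbeddingSearchRequest.builder()
                .queryEmbedding(queryEmbedding)
                .maxResults(5) // arbitrary example value
                // Only numeric fields combined with AND conditions are supported by the Redis store
                .filter(metadataKey("year").isGreaterThan(2020)
                        .and(metadataKey("pageCount").isLessThan(500)))
                .build();
        return store.search(request);
    }
}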

Summary

To use Redis as a vector store with Quarkus LangChain4j:

  • Ensure Redis Stack (or Redis with RedisJSON and RedisSearch) is available

  • Add the required extension and configure the Redis client

  • Set the embedding dimension to match your embedding model

  • Declare metadata fields to support filtering and enrichment

  • Use RedisEmbeddingStore for ingestion and retrieval