Comparison · Intermediate · 3 min read

FAISS IndexFlatL2 vs IndexFlatIP comparison

Quick answer
FAISS's IndexFlatL2 performs exact nearest-neighbor search using Euclidean (L2) distance, ideal when geometric distance is the natural similarity measure. IndexFlatIP uses inner-product similarity, which equals cosine similarity when vectors are L2-normalized, making it the common choice for semantic search and RAG.

VERDICT

Use IndexFlatL2 when Euclidean distance is the natural metric, such as image embeddings; use IndexFlatIP for semantic search with normalized vectors where cosine similarity is preferred.
| Index type | Distance metric | Cosine similarity | Best for | Normalization required |
|---|---|---|---|---|
| IndexFlatL2 | Euclidean (L2) distance | No | Geometric nearest-neighbor search, image embeddings | No |
| IndexFlatIP | Inner product | Yes (when vectors are normalized) | Semantic search, text embeddings, RAG | Yes, for cosine similarity |
| Both | Exact search | Exact similarity | Small to medium datasets, baseline comparisons | Depends on metric |
| Both | Flat (brute-force) | No index compression | High accuracy, slower on large datasets | N/A |

Key differences

IndexFlatL2 measures similarity with Euclidean distance, the straight-line distance between vectors. IndexFlatIP measures similarity with the inner (dot) product, which equals cosine similarity when vectors are L2-normalized. IndexFlatL2 requires no normalization; IndexFlatIP typically requires it so that scores behave as cosine similarity.
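The two metrics are tightly linked: for unit-length vectors, squared L2 distance and inner product determine each other via ||a − b||² = 2 − 2(a·b), so on normalized data both indexes produce the same ranking. A minimal sketch of this identity (the vector values are arbitrary examples):

```python
import numpy as np

# Two arbitrary example vectors
a = np.array([1.0, 2.0, 2.0], dtype='float32')
b = np.array([2.0, 0.0, 1.0], dtype='float32')

# Normalize both to unit length
a /= np.linalg.norm(a)
b /= np.linalg.norm(b)

l2_sq = np.sum((a - b) ** 2)  # squared Euclidean distance (what IndexFlatL2 returns)
ip = np.dot(a, b)             # inner product (what IndexFlatIP returns)

# For unit vectors: ||a - b||^2 = 2 - 2 * (a . b)
print(np.isclose(l2_sq, 2 - 2 * ip))  # True
```

This is why normalizing vectors and using IndexFlatIP gives cosine-similarity rankings: larger inner product means smaller L2 distance, and vice versa.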

Side-by-side example: IndexFlatL2

This example creates a FAISS IndexFlatL2 index and performs a nearest-neighbor search using Euclidean distance. Note that IndexFlatL2 returns squared L2 distances, not their square roots.

python
import faiss
import numpy as np

# Create 5 vectors of dimension 3
vectors = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [0, 1, 1]], dtype='float32')

# Initialize IndexFlatL2
index = faiss.IndexFlatL2(3)
index.add(vectors)

# Query vector
query = np.array([[1, 0, 0]], dtype='float32')

# Search for 2 nearest neighbors
distances, indices = index.search(query, 2)
print('Indices:', indices)
print('Distances:', distances)
output
Indices: [[0 3]]
Distances: [[0. 1.]]

Side-by-side example: IndexFlatIP

This example creates a FAISS IndexFlatIP index and performs a nearest-neighbor search using inner-product similarity. Both the stored vectors and the query are normalized, so the scores are cosine similarities.

python
import faiss
import numpy as np

# Create 5 vectors of dimension 3
vectors = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [0, 1, 1]], dtype='float32')

# Normalize vectors for cosine similarity
faiss.normalize_L2(vectors)

# Initialize IndexFlatIP
index = faiss.IndexFlatIP(3)
index.add(vectors)

# Query vector
query = np.array([[1, 0, 0]], dtype='float32')
faiss.normalize_L2(query)

# Search for 2 nearest neighbors
scores, indices = index.search(query, 2)
print('Indices:', indices)
print('Scores:', scores)
output
Indices: [[0 3]]
Scores: [[1.         0.70710677]]

When to use each

Use IndexFlatL2 when your application requires true geometric distance, such as image or sensor-data embeddings where Euclidean distance reflects similarity. Use IndexFlatIP when working with semantic embeddings (e.g., from text or language models) where cosine similarity is preferred; in that case, normalize vectors both before indexing and before querying.
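The metric choice matters because, without normalization, the two measures can disagree about which vector is the nearest match. A small sketch using plain NumPy to reproduce the brute-force computation each flat index performs (the vector values are arbitrary examples):

```python
import numpy as np

# Hypothetical unnormalized vectors: v0 sits right next to the query,
# while v1 points the same direction as the query but is much longer.
vectors = np.array([[1.0, 0.1], [10.0, 0.0]], dtype='float32')
query = np.array([1.0, 0.0], dtype='float32')

# Brute-force versions of what IndexFlatL2 / IndexFlatIP compute
l2_sq = np.sum((vectors - query) ** 2, axis=1)  # squared L2 distances
ip = vectors @ query                            # inner products

print('L2 nearest:', int(np.argmin(l2_sq)))  # 0: geometrically closest
print('IP nearest:', int(np.argmax(ip)))     # 1: largest dot product
```

Here L2 picks the geometrically closest vector, while IP rewards the long vector with the larger dot product; normalizing removes this magnitude effect and makes IP behave like cosine similarity.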

| Use case | Recommended index | Reason |
|---|---|---|
| Image embeddings | IndexFlatL2 | Euclidean distance matches perceptual similarity |
| Text embeddings for semantic search | IndexFlatIP | Inner product on normalized vectors equals cosine similarity |
| Small datasets needing exact search | Either | Both provide exact search; choose the metric that fits the data |
| Normalized vectors from language models | IndexFlatIP | Optimized for cosine similarity |

Pricing and access

FAISS is an open-source library from Meta and is free to use with no API costs. It runs locally or on your servers, so pricing depends on your compute resources.

| Option | Free | Paid | API access |
|---|---|---|---|
| FAISS IndexFlatL2 | Yes | No | No (local library) |
| FAISS IndexFlatIP | Yes | No | No (local library) |
| Managed vector search services (e.g., Pinecone, Weaviate) | Varies | Yes | Yes |

Key Takeaways

  • IndexFlatL2 uses Euclidean distance, no normalization needed.
  • IndexFlatIP uses inner product, requires normalization for cosine similarity.
  • Choose IndexFlatIP for semantic search with text embeddings.
  • Both are exact search indexes, suitable for small to medium datasets.
  • FAISS is free and runs locally; no API or usage costs.
Verified 2026-04