
Overview

Generate vector embeddings for text through LiteLLM. Supports 100+ embedding models, including OpenAI, Cohere, HuggingFace, Azure, and more.

Quick Start

from praisonaiagents import embedding

result = embedding("Hello, world!")
print(f"Dimensions: {len(result.embeddings[0])}")  # 1536

Agent-Centric Usage

Embeddings are used internally by PraisonAI Agents for:
  • Knowledge retrieval (RAG) - semantic search over documents
  • Memory storage - storing and retrieving conversation context
  • Semantic similarity - finding related content
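
The semantic similarity case can also be done directly with the embedding API. A minimal sketch, using only the documented embedding function plus a hand-rolled cosine similarity helper (the helper is not part of the library):

from praisonaiagents import embedding

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

result = embedding(
    ["AI agents can use tools", "Agents call tools to act", "The weather is sunny"],
    model="text-embedding-3-small"
)

vectors = result.embeddings
print(cosine_similarity(vectors[0], vectors[1]))  # related sentences -> higher score
print(cosine_similarity(vectors[0], vectors[2]))  # unrelated sentences -> lower score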

Agent with Knowledge (uses embeddings internally)

from praisonaiagents import Agent

agent = Agent(
    instructions="You are a helpful assistant.",
    knowledge=["AI agents can use tools to accomplish tasks."],
    knowledge_config={
        "embedder": {
            "provider": "openai",
            "config": {"model": "text-embedding-3-small"}
        }
    }
)

response = agent.chat("What can AI agents do?")

Agent with Memory (uses embeddings internally)

from praisonaiagents import Agent

agent = Agent(
    instructions="You remember conversations.",
    memory=True,
    memory_config={"embedding_model": "text-embedding-3-small"}
)

agent.chat("My name is Alice.")
response = agent.chat("What is my name?")  # Uses embedding for retrieval

Direct Embedding API

Single Text Embedding

from praisonaiagents import embedding

result = embedding("Hello, world!", model="text-embedding-3-small")
print(f"Dimensions: {len(result.embeddings[0])}")  # 1536

Batch Embeddings

from praisonaiagents import embedding

result = embedding(
    ["Hello", "World", "AI agents"],
    model="text-embedding-3-small"
)

print(f"Generated {len(result.embeddings)} embeddings")

Get Model Dimensions

from praisonaiagents import get_dimensions

dims = get_dimensions("text-embedding-3-small")  # 1536
dims = get_dimensions("text-embedding-3-large")  # 3072

With Custom Dimensions

from praisonaiagents import embedding

result = embedding(
    "Hello world",
    model="text-embedding-3-large",
    dimensions=256  # Reduce dimensions
)

print(f"Dimensions: {len(result.embeddings[0])}")  # 256

Async Usage

import asyncio
from praisonaiagents import aembedding

async def main():
    result = await aembedding(
        "Hello world",
        model="text-embedding-3-small"
    )
    print(f"Dimensions: {len(result.embeddings[0])}")

asyncio.run(main())
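
Because aembedding is a coroutine, multiple requests can run concurrently. A small sketch using asyncio.gather with the same aembedding call shown above:

import asyncio
from praisonaiagents import aembedding

async def main():
    # Issue two embedding requests concurrently
    results = await asyncio.gather(
        aembedding("First document", model="text-embedding-3-small"),
        aembedding("Second document", model="text-embedding-3-small"),
    )
    for result in results:
        print(f"Dimensions: {len(result.embeddings[0])}")

asyncio.run(main())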

Import Options

# Recommended: Direct import from praisonaiagents
from praisonaiagents import embedding, EmbeddingResult, get_dimensions
from praisonaiagents import aembedding  # async version

# Plural aliases (OpenAI style)
from praisonaiagents import embeddings, aembeddings

# Short aliases
from praisonaiagents import embed, aembed

# Alternative: Import from embedding submodule
from praisonaiagents.embedding import embedding, aembedding

Parameters

Parameter        Type              Default                    Description
input            str or List[str]  Required                   Text(s) to embed
model            str               "text-embedding-3-small"   Embedding model
dimensions       int               None                       Output dimensions
encoding_format  str               "float"                    "float" or "base64"
timeout          float             600.0                      Request timeout
api_key          str               None                       API key override
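
A sketch combining several of these parameters in one call; the specific values (dimensions, timeout, placeholder API key) are illustrative, not required:

from praisonaiagents import embedding

result = embedding(
    ["Hello", "World"],
    model="text-embedding-3-small",
    dimensions=512,            # optional: shrink the output vectors
    encoding_format="float",   # "float" (default) or "base64"
    timeout=30.0,              # override the 600.0 second default
    api_key="YOUR_API_KEY"     # optional per-call API key override
)
print(f"Generated {len(result.embeddings)} embeddings")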

Result Object

The EmbeddingResult object contains:
  • embeddings: List of embedding vectors
  • model: Model used
  • usage: Token usage statistics
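
A short sketch of reading these fields; the exact contents of usage depend on the underlying provider, so treat that part as illustrative:

from praisonaiagents import embedding

result = embedding("Hello, world!", model="text-embedding-3-small")

print(result.model)               # model used, e.g. "text-embedding-3-small"
print(len(result.embeddings))     # number of vectors (1 for a single input)
print(len(result.embeddings[0]))  # vector dimensions, e.g. 1536
print(result.usage)               # token usage statistics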

Supported Models

Since PraisonAI wraps LiteLLM, all LiteLLM-supported embedding models work:
  • OpenAI: text-embedding-3-small, text-embedding-3-large, text-embedding-ada-002
  • Cohere: cohere/embed-english-v3.0, cohere/embed-multilingual-v3.0
  • Azure: azure/text-embedding-ada-002
  • HuggingFace: huggingface/sentence-transformers/all-MiniLM-L6-v2
  • Voyage: voyage/voyage-01, voyage/voyage-lite-01
  • And many more via LiteLLM
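
Switching providers is just a matter of changing the model string. For example, a sketch using a Cohere model, assuming the corresponding provider credentials (e.g. COHERE_API_KEY) are set in the environment:

from praisonaiagents import embedding

# Cohere embeddings via LiteLLM's provider-prefixed model name
result = embedding(
    ["Hello", "Bonjour", "Hola"],
    model="cohere/embed-multilingual-v3.0"
)
print(f"Generated {len(result.embeddings)} embeddings")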