
LlmProvider

Defined in the LLM Providers module.
A trait for LLM providers in the Rust AI Agent SDK.

Methods

chat

async fn chat(
        &self,
        messages: &[Message],
        tools: Option<&[ToolDefinition]>,
    ) -> Result<LlmResponse>
Send a chat completion request.

Parameters:

  Name        Type
  messages    &[Message]
  tools       Option<&[ToolDefinition]>

chat_stream

async fn chat_stream(
        &self,
        messages: &[Message],
        tools: Option<&[ToolDefinition]>,
    ) -> Result<Box<dyn futures::Stream<Item = Result<String>> + Send + Unpin>>
Stream a chat completion (returns chunks).

Parameters:

  Name        Type
  messages    &[Message]
  tools       Option<&[ToolDefinition]>

model

fn model(&self) -> &str
Get the model name.