Tools are functions that agents can use to interact with external systems and perform actions. They are essential for creating agents that can do more than just process text.
Create any function you want to use as a tool; it should perform one specific task.
```python
from duckduckgo_search import DDGS
from typing import List, Dict

# Tool Implementation
def internet_search_tool(query: str) -> List[Dict]:
    """
    Perform Internet Search using DuckDuckGo

    Args:
        query (str): The search query string

    Returns:
        List[Dict]: List of search results containing title, URL, and snippet
    """
    results = []
    ddgs = DDGS()
    for result in ddgs.text(keywords=query, max_results=5):
        results.append({
            "title": result.get("title", ""),
            "url": result.get("href", ""),
            "snippet": result.get("body", "")
        })
    return results
```
2. Assign the tool to an agent
```python
data_agent = Agent(
    name="DataCollector",
    role="Search Specialist",
    goal="Perform internet searches to collect relevant information.",
    backstory="Expert in finding and organising internet data.",
    tools=[internet_search_tool],  # Add the tool to the agent, i.e. the function name
)
```
That's it!
You have created a custom tool and assigned it to an agent.
Generate your OpenAI API key from the OpenAI platform.
To use other LLM providers such as Ollama, Anthropic, Groq, or Google, refer to the Models documentation for more information.
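As a sketch of switching providers (assuming the `llm` parameter accepts provider-prefixed model strings; the exact model name below is a hypothetical placeholder, check the Models documentation for supported values), changing provider is a one-line difference:

```python
from praisonaiagents import Agent

# Hypothetical provider/model string; verify against the Models docs
data_agent = Agent(
    name="DataCollector",
    role="Search Specialist",
    goal="Perform internet searches to collect relevant information.",
    llm="groq/llama-3.1-8b-instant",  # assumption: provider-prefixed model string
)
```

Everything else about the agent definition stays the same regardless of provider.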
3. Create Agent with Tool
Create app.py
```python
from praisonaiagents import Agent, Task, AgentTeam
from duckduckgo_search import DDGS
from typing import List, Dict

# 1. Tool Implementation
def internet_search_tool(query: str) -> List[Dict]:
    """
    Perform Internet Search using DuckDuckGo

    Args:
        query (str): The search query string

    Returns:
        List[Dict]: List of search results containing title, URL, and snippet
    """
    results = []
    ddgs = DDGS()
    for result in ddgs.text(keywords=query, max_results=5):
        results.append({
            "title": result.get("title", ""),
            "url": result.get("href", ""),
            "snippet": result.get("body", "")
        })
    return results

# 2. Assign the tool to an agent
data_agent = Agent(
    name="DataCollector",
    role="Search Specialist",
    goal="Perform internet searches to collect relevant information.",
    backstory="Expert in finding and organising internet data.",
    tools=[internet_search_tool],
    reflection=False
)

# 3. Task Definition
collect_task = Task(
    description="Perform an internet search using the query: 'AI job trends in 2024'. Return results as a list of title, URL, and snippet.",
    expected_output="List of search results with titles, URLs, and snippets.",
    agent=data_agent,
    name="collect_data",
)

# 4. Start Agents
agents = AgentTeam(
    agents=[data_agent],
    tasks=[collect_task],
    process="sequential"
)
agents.start()
```
4. Start Agents
Execute your script:
Terminal
```shell
python app.py
```
1. Install PraisonAI
Install the core package and the duckduckgo_search package:
Terminal
```shell
pip install praisonai duckduckgo_search
```
2. Create Custom Tool
Adding extra tools or features requires some code, which can be generated using ChatGPT or any other LLM.
Create a new file tools.py with the following content:
```python
from duckduckgo_search import DDGS
from typing import List, Dict

# 1. Tool
def internet_search_tool(query: str) -> List[Dict]:
    """Perform Internet Search"""
    results = []
    ddgs = DDGS()
    for result in ddgs.text(keywords=query, max_results=5):
        results.append({
            "title": result.get("title", ""),
            "url": result.get("href", ""),
            "snippet": result.get("body", "")
        })
    return results
```
3. Create Agent
Create a new file agents.yaml with the following content:
```yaml
framework: praisonai
topic: create movie script about cat in mars
agents: # Canonical: use 'agents' instead of 'roles'
  scriptwriter:
    instructions: # Canonical: use 'instructions' instead of 'backstory'
      Expert in dialogue and script structure, translating concepts into scripts.
    goal: Write a movie script about a cat in Mars
    role: Scriptwriter
    tools:
      - internet_search_tool # <-- Tool assigned to Agent here
    tasks:
      scriptwriting_task:
        description: Turn the story concept into a production-ready movie script, including dialogue and scene details.
        expected_output: Final movie script with dialogue and scene details.
```
MCP allows agents to use external tools via standardized protocols. This is the recommended way to add powerful tools to your agents.
```python
from praisonaiagents import Agent, MCP

# Use MCP tools with environment variables
agent = Agent(
    instructions="Search the web for information",
    tools=MCP(
        command="npx",
        args=["-y", "@anthropic/mcp-server-brave-search"],
        env={"BRAVE_API_KEY": "your-api-key"}
    )
)
agent.start("Search for AI trends in 2025")
```
Following these best practices will help you create robust, efficient, and secure tools in PraisonAI.
Design Principles
Single Responsibility
Each tool should have one clear purpose and do it well. Avoid creating tools that try to do too many things.
```python
# Good Example
def process_image(image: np.array) -> np.array:
    return processed_image

# Avoid
def process_and_save_and_upload(image):
    # Too many responsibilities
    pass
```
Clear Interfaces
Define explicit input/output types and maintain consistent parameter naming.
```python
def search_tool(
    query: str,
    max_results: int = 10
) -> List[Dict[str, Any]]:
    """
    Search for information with clear parameters
    """
    pass
```
Documentation
Always include detailed docstrings and type hints.
```python
def analyze_text(
    text: str,
    language: str = "en"
) -> Dict[str, float]:
    """
    Analyze text sentiment and emotions.

    Args:
        text: Input text to analyze
        language: ISO language code

    Returns:
        Dict with sentiment scores
    """
    pass
```
Performance Optimization
Efficient Processing
Optimize resource usage and processing time.
```python
# Use generators for large datasets
def process_large_data():
    for chunk in data_generator():
        yield process_chunk(chunk)
```
Resource Management
Properly handle resource allocation and cleanup.
```python
async with aiohttp.ClientSession() as session:
    # Resource automatically managed
    await process_data(session)
```
```python
async def fetch_data(urls: List[str]):
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_url(session, url) for url in urls]
        return await asyncio.gather(*tasks)
```
Security Best Practices
Input Validation
Always validate and sanitize inputs to prevent security vulnerabilities.
```python
def process_user_input(data: str) -> str:
    if not isinstance(data, str):
        raise ValueError("Input must be string")
    return sanitize_input(data.strip())
```
Rate Limiting
Implement rate limiting for API calls to prevent abuse.
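As an illustrative sketch (not a PraisonAI API; the decorator and function names here are hypothetical), a minimal rate limiter can space out calls by sleeping between invocations:

```python
import time
from functools import wraps

def rate_limited(calls_per_second: float):
    """Allow at most `calls_per_second` invocations by sleeping as needed."""
    min_interval = 1.0 / calls_per_second
    last_call = [0.0]  # mutable cell so the wrapper can update it

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            elapsed = time.monotonic() - last_call[0]
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)
            last_call[0] = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator

@rate_limited(calls_per_second=5)  # hypothetical tool wrapped for illustration
def call_api(query: str) -> str:
    return f"result for {query}"
```

Production code would more commonly use a token-bucket algorithm or a dedicated library, which also handles bursts and concurrent callers; this sketch only enforces a minimum interval between sequential calls.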
By default, LLMs may skip calling tools even when instructed. Use tool_choice to control this behavior.
YAML
Python
```yaml
agents:
  researcher:
    role: Research Specialist
    tools:
      - search_web
    tool_choice: required # Forces the agent to call a tool
    llm: gpt-4o-mini
```
```python
from praisonaiagents import Agent
from praisonaiagents.tools import search_web

# For direct Agent usage, tool_choice is passed via chat()
agent = Agent(
    name="researcher",
    tools=[search_web],
    llm="gpt-4o-mini"
)

# Force tool usage in a specific call
response = agent.chat("Search for AI news", tool_choice="required")
```
- `auto` - LLM decides whether to call tools (default)
- `required` - LLM must call a tool before responding
- `none` - LLM cannot call tools (text response only)
Use tool_choice: required in YAML workflows when you need guaranteed tool execution. This is especially important for research agents that must search the web.