The `run` command manages asynchronous job execution for agents and recipes through a jobs server.

Quick Start

# Start the jobs server
python -m uvicorn praisonai.jobs.server:create_app --port 8005 --factory

# Submit a job
praisonai run submit "Analyze this data"

# Submit a recipe as job
praisonai run submit "Analyze AI trends" --recipe news-analyzer

Commands Overview

| Command | Description |
|---------|-------------|
| `praisonai run submit` | Submit a new job |
| `praisonai run status <id>` | Get job status |
| `praisonai run result <id>` | Get job result |
| `praisonai run stream <id>` | Stream job progress |
| `praisonai run list` | List all jobs |
| `praisonai run cancel <id>` | Cancel a job |

Starting the Jobs Server

Before using job commands, start the jobs server:
# Start server on default port (8005)
python -m uvicorn praisonai.jobs.server:create_app --port 8005 --factory

# Or with custom host/port
python -m uvicorn praisonai.jobs.server:create_app --host 0.0.0.0 --port 8080 --factory

Submit a Job

Basic Submission

# Basic submission with prompt
praisonai run submit "Analyze this data"

# Wait for completion
praisonai run submit "Quick task" --wait

# Stream progress after submission
praisonai run submit "Long task" --stream

Submit with Recipe

# Submit with recipe
praisonai run submit "Analyze AI trends" --recipe news-analyzer

# With recipe config
praisonai run submit "Analyze AI trends" --recipe news-analyzer --recipe-config '{"format": "json"}'

Submit with Agent File

# With agent file
praisonai run submit "Process task" --agent-file agents.yaml

Advanced Options

# With timeout
praisonai run submit "Complex task" --timeout 7200

# With webhook
praisonai run submit "Task" --webhook-url https://example.com/callback

# With idempotency
praisonai run submit "Task" --idempotency-key order-123 --idempotency-scope session

# With metadata
praisonai run submit "Task" --metadata user=john --metadata priority=high

# JSON output
praisonai run submit "Task" --json

# Custom API URL
praisonai run submit "Task" --api-url http://localhost:8080

Submit Options

| Option | Description |
|--------|-------------|
| `--agent-file` | Path to agents.yaml |
| `--recipe` | Recipe name (mutually exclusive with `--agent-file`) |
| `--recipe-config` | Recipe config as JSON string |
| `--framework` | Framework to use (default: `praisonai`) |
| `--timeout` | Timeout in seconds (default: 3600) |
| `--wait` | Wait for completion |
| `--stream` | Stream progress after submission |
| `--idempotency-key` | Key to prevent duplicate submissions |
| `--idempotency-scope` | Scope: `none`, `session`, `global` |
| `--webhook-url` | Webhook URL called on completion |
| `--session-id` | Session ID for grouping related jobs |
| `--metadata` | Custom metadata (`KEY=VALUE`, repeatable) |
| `--json` | Output JSON for scripting |
| `--api-url` | Jobs API URL (default: `http://127.0.0.1:8005`) |
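The repeatable `--metadata KEY=VALUE` flag collapses into a flat key/value map attached to the job. A minimal sketch of that parsing, assuming the simplest interpretation (the helper name is illustrative, not part of the CLI):

```python
def parse_metadata(pairs):
    """Parse repeatable KEY=VALUE strings into a dict (illustrative helper)."""
    metadata = {}
    for pair in pairs:
        key, sep, value = pair.partition("=")
        if not sep or not key:
            raise ValueError(f"expected KEY=VALUE, got {pair!r}")
        metadata[key] = value
    return metadata

# Mirrors: praisonai run submit "Task" --metadata user=john --metadata priority=high
print(parse_metadata(["user=john", "priority=high"]))
# → {'user': 'john', 'priority': 'high'}
```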

Check Job Status

# Get status
praisonai run status run_abc123

# JSON output
praisonai run status run_abc123 --json

Status Output

Job: run_abc123
Status: running
Progress: 45%
Step: Processing data
Created: 2024-01-15 10:30:00
Duration: 45.2s
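With `--json`, the same information arrives as a machine-readable object. A sketch of rendering part of it back into the human-readable form above; the field names (`job_id`, `status`, `progress`, `step`) are assumptions inferred from the example output, so verify them against your server's actual response:

```python
def render_status(job: dict) -> str:
    """Format a status payload like the human-readable output above.

    Field names are assumptions inferred from the example output.
    """
    return "\n".join([
        f"Job: {job['job_id']}",
        f"Status: {job['status']}",
        f"Progress: {job['progress']}%",
        f"Step: {job['step']}",
    ])

print(render_status({
    "job_id": "run_abc123",
    "status": "running",
    "progress": 45,
    "step": "Processing data",
}))
```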

Get Job Result

# Get result
praisonai run result run_abc123

# JSON output
praisonai run result run_abc123 --json

Stream Job Progress

Stream real-time progress updates via Server-Sent Events (SSE):
# Stream progress
praisonai run stream run_abc123

# Raw JSON events
praisonai run stream run_abc123 --json

Stream Output

[10%] Initializing agent
[20%] Loading recipe
[50%] Processing data
[90%] Finalizing
[100%] Completed
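Under the hood, SSE delivers one `data:` line per event. A sketch of turning raw event lines into the progress display above, assuming each event is a JSON object with `progress` and `step` fields (an assumption based on the output shown, not a documented event schema):

```python
import json

def progress_lines(sse_lines):
    """Turn raw SSE 'data:' lines into '[NN%] step' strings.

    The event field names (progress, step) are assumptions based on
    the stream output shown above.
    """
    out = []
    for line in sse_lines:
        if not line.startswith("data:"):
            continue  # skip comments and blank keep-alive lines
        event = json.loads(line[len("data:"):].strip())
        out.append(f"[{event['progress']}%] {event['step']}")
    return out

print(progress_lines([
    'data: {"progress": 10, "step": "Initializing agent"}',
    'data: {"progress": 100, "step": "Completed"}',
]))
# → ['[10%] Initializing agent', '[100%] Completed']
```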

List Jobs

# List all jobs
praisonai run list

# Filter by status
praisonai run list --status running
praisonai run list --status succeeded
praisonai run list --status failed

# Pagination
praisonai run list --page 1 --page-size 20

# JSON output
praisonai run list --json

Cancel a Job

# Cancel job
praisonai run cancel run_abc123

# JSON output
praisonai run cancel run_abc123 --json

Idempotency

Prevent duplicate job submissions:
# First submission creates job
praisonai run submit "Process order" --idempotency-key order-123 --json

# Second submission returns same job (no duplicate)
praisonai run submit "Process order" --idempotency-key order-123 --json

Idempotency Scopes

| Scope | Description |
|-------|-------------|
| `none` | No idempotency (default) |
| `session` | Key is unique within a session |
| `global` | Key is unique across all sessions |
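One way to picture the three scopes: the server deduplicates on a lookup key that includes more or less context. A toy model, not the server's actual implementation:

```python
def dedup_key(scope, session_id, idempotency_key):
    """Build the deduplication lookup key for each scope (toy model)."""
    if scope == "none":
        return None  # never deduplicate
    if scope == "session":
        return (session_id, idempotency_key)  # unique per session
    if scope == "global":
        return ("*", idempotency_key)  # unique everywhere
    raise ValueError(f"unknown scope: {scope}")

# The same key in two different sessions: 'session' scope keeps the
# jobs distinct, 'global' scope treats them as duplicates.
print(dedup_key("session", "s1", "order-123") == dedup_key("session", "s2", "order-123"))  # False
print(dedup_key("global", "s1", "order-123") == dedup_key("global", "s2", "order-123"))    # True
```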

Webhooks

Configure webhooks to receive notifications when jobs complete:
praisonai run submit "Task" --webhook-url https://example.com/callback
Webhook payload:
{
  "job_id": "run_abc123",
  "status": "succeeded",
  "result": {"output": "..."},
  "completed_at": "2024-01-15T10:30:00Z",
  "duration_seconds": 45.2
}
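Your endpoint receives that payload as a JSON POST body. A minimal sketch of handling it; the handler name and the choice of required fields are illustrative, and the field names come from the example payload above:

```python
import json

REQUIRED = ("job_id", "status")

def handle_webhook(body: str) -> str:
    """Parse the completion payload and return a one-line summary."""
    payload = json.loads(body)
    missing = [f for f in REQUIRED if f not in payload]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return f"{payload['job_id']} finished with status {payload['status']}"

body = json.dumps({
    "job_id": "run_abc123",
    "status": "succeeded",
    "result": {"output": "..."},
    "completed_at": "2024-01-15T10:30:00Z",
    "duration_seconds": 45.2,
})
print(handle_webhook(body))
# → run_abc123 finished with status succeeded
```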

Session Grouping

Group related jobs by session:
praisonai run submit "Task 1" --session-id project-alpha
praisonai run submit "Task 2" --session-id project-alpha

Python API

from praisonai import recipe

# Submit recipe as async job
job = recipe.submit_job(
    "my-recipe",
    input={"query": "What is AI?"},
    config={"max_tokens": 1000},
    session_id="session_123",
    timeout_sec=3600,
    webhook_url="https://example.com/webhook",
    idempotency_key="unique-key-123",
    api_url="http://127.0.0.1:8005",
)

print(f"Job ID: {job.job_id}")
print(f"Status: {job.status}")

# Wait for completion
result = job.wait(poll_interval=5, timeout=300)
print(f"Result: {result}")

Complete Workflow Example

# 1. Start the server (in another terminal)
python -m uvicorn praisonai.jobs.server:create_app --port 8005 --factory

# 2. Submit a job with recipe
praisonai run submit "Analyze AI news" --recipe news-analyzer --json
# Output: {"job_id": "run_abc123", "status": "queued", ...}

# 3. Check status
praisonai run status run_abc123

# 4. Stream progress
praisonai run stream run_abc123

# 5. Get result when done
praisonai run result run_abc123

# 6. List all jobs
praisonai run list

Scripting with JSON

#!/bin/bash

API_URL="http://127.0.0.1:8005"

# Submit job
RESULT=$(praisonai run submit "Analyze data" --recipe analyzer --json --api-url "$API_URL")
JOB_ID=$(echo "$RESULT" | jq -r '.job_id')

echo "Submitted job: $JOB_ID"

# Poll for completion
while true; do
    STATUS=$(praisonai run status "$JOB_ID" --json --api-url "$API_URL" | jq -r '.status')
    echo "Status: $STATUS"
    
    if [ "$STATUS" = "succeeded" ] || [ "$STATUS" = "failed" ]; then
        break
    fi
    
    sleep 5
done

# Get result
praisonai run result "$JOB_ID" --json --api-url "$API_URL"
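The same submit-poll-fetch loop can be written in Python around the CLI's `--json` output. A sketch under two assumptions: `praisonai` is on `PATH`, and `succeeded`/`failed` are the only terminal statuses (matching the shell script's check; verify whether your server also reports e.g. a cancelled status):

```python
import json
import subprocess
import time

def is_terminal(status: str) -> bool:
    """Terminal statuses, matching the shell script's check above."""
    return status in {"succeeded", "failed"}

def run_cli(*args, api_url="http://127.0.0.1:8005"):
    """Invoke the CLI with --json and parse its output (needs praisonai on PATH)."""
    proc = subprocess.run(
        ["praisonai", "run", *args, "--json", "--api-url", api_url],
        check=True, capture_output=True, text=True,
    )
    return json.loads(proc.stdout)

def submit_and_wait(prompt, recipe, poll_seconds=5):
    """Submit a job, poll until it reaches a terminal status, return the result."""
    job_id = run_cli("submit", prompt, "--recipe", recipe)["job_id"]
    while not is_terminal(run_cli("status", job_id)["status"]):
        time.sleep(poll_seconds)
    return run_cli("result", job_id)
```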

See Also