AGiXT — Extensible AI Agent Automation Framework
Open-source AI agent automation platform with 50+ provider integrations, plugin system, chain-of-thought workflows, and persistent memory. Self-hostable via Docker.
What it is
AGiXT is an open-source AI agent automation platform that combines 50+ LLM provider integrations, a plugin system for external tools, chain-of-thought workflows (called 'chains'), and persistent memory. It provides a web UI, REST API, and Python SDK for building and managing autonomous AI agents.
AGiXT is designed for teams that need customizable autonomous agents with tool integrations. It works with Claude, GPT, Gemini, Mistral, Ollama, and 45+ more providers, and can be set up in under 10 minutes with Docker.
How it saves time or tokens
AGiXT reduces development time by shipping pre-built integrations for over 50 LLM providers and a plugin system for external tools. Instead of writing custom code for each provider or tool, you configure agents through the web UI or API. The chain-of-thought workflow system lets you compose multi-step agent behaviors without manually orchestrating API calls. At an estimated budget of roughly 4,000 tokens per workflow run, AGiXT keeps costs predictable through configurable limits and per-step provider switching.
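The per-run token cap described above can be enforced with a few lines of plain Python. This is an illustrative sketch only: the `TokenBudget` class and per-step usage numbers are hypothetical, not part of AGiXT, and assume each workflow step reports its token usage.

```python
# Illustrative budget guard (not part of AGiXT): caps cumulative
# token spend across the steps of one workflow run.
class TokenBudget:
    def __init__(self, limit=4000):
        self.limit = limit   # per-run cap, e.g. 4,000 tokens
        self.used = 0

    def charge(self, tokens):
        """Record a step's usage; raise once the run would exceed its cap."""
        if self.used + tokens > self.limit:
            raise RuntimeError(
                f"budget exceeded: {self.used + tokens} > {self.limit}"
            )
        self.used += tokens

budget = TokenBudget(limit=4000)
for step_tokens in [1200, 900, 1500]:   # hypothetical per-step usage
    budget.charge(step_tokens)
print(budget.used)  # 3600
```

A guard like this fails fast on the step that would blow the budget, which is usually preferable to discovering the overrun on the provider's invoice.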
How to use
- Clone the repository and start AGiXT with Docker:
git clone https://github.com/Josh-XT/AGiXT
cd AGiXT
docker compose up -d
- Open http://localhost:8501 in your browser for the web UI and configure your first agent with an LLM provider and plugins.
- Create chains (multi-step workflows) by linking agent actions — each chain step can call different providers, tools, or sub-agents.
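The chain idea can be sketched in plain Python: each step is a callable that receives the previous step's output. The function names below are hypothetical stand-ins, not the AGiXT chain API.

```python
# Minimal illustration of chain composition (hypothetical names, not
# the AGiXT API): each step transforms the previous step's output.
def search_step(topic):
    return f"notes on {topic}"       # stand-in for a web-browsing plugin

def summarize_step(notes):
    return f"summary of {notes}"     # stand-in for an LLM call

def run_chain(steps, user_input):
    result = user_input
    for step in steps:               # each step could use a different provider
        result = step(result)
    return result

output = run_chain([search_step, summarize_step], "AI safety")
print(output)  # summary of notes on AI safety
```

In AGiXT the same composition is configured through the web UI or API rather than written by hand, but the data flow between steps is the same.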
Example
# Using the AGiXT Python SDK
from agixtsdk import AGiXTSDK

agixt = AGiXTSDK(base_uri='http://localhost:7437')

# Create an agent
agixt.add_agent(
    agent_name='researcher',
    settings={
        'provider': 'anthropic',
        'model': 'claude-sonnet-4-20250514',
    },
)

# Run a prompt
response = agixt.prompt_agent(
    agent_name='researcher',
    prompt_name='Think About It',
    prompt_args={'user_input': 'Summarize the latest AI safety papers'},
)
print(response)
Related on TokRepo
- Multi-agent frameworks compared — Compare AGiXT with CrewAI, AutoGen, and other multi-agent orchestration tools.
- AI agent tools — Browse more agent-building frameworks and utilities curated on TokRepo.
Common pitfalls
- Starting with too many plugins enabled slows agent response time. Enable only the plugins your workflow actually needs.
- Chain-of-thought workflows can loop indefinitely if exit conditions are not set properly. Always define a maximum iteration count.
- Mixing providers within a single chain can cause format incompatibilities. Test each chain step individually before composing them.
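The iteration-cap advice above can be made concrete with a small loop guard. This is a generic sketch, not AGiXT code: `refine` and `is_good_enough` are hypothetical placeholders for a chain step and its exit condition.

```python
# Illustrative loop guard for a self-correcting chain step: stop after
# MAX_ITERATIONS even if the exit condition never fires.
MAX_ITERATIONS = 5

def refine(draft):
    return draft + "+"               # stand-in for another LLM pass

def is_good_enough(draft):
    return False                     # worst case: condition is never met

draft = "v0"
for _ in range(MAX_ITERATIONS):      # hard cap prevents infinite loops
    if is_good_enough(draft):
        break
    draft = refine(draft)
print(draft)  # v0+++++
```

Without the `range` cap, a condition that never fires would loop (and bill) forever; with it, the worst case is bounded and observable.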
Frequently Asked Questions
Which LLM providers does AGiXT support?
AGiXT supports 50+ LLM providers including OpenAI, Anthropic, Google, Cohere, Mistral, Ollama, vLLM, Hugging Face, llama.cpp, Azure, AWS Bedrock, Google Vertex, OpenRouter, LiteLLM, and Portkey. You can switch providers per agent or per chain step.
How do I install AGiXT?
Clone the GitHub repository and run docker compose up -d. The web UI is available at localhost:8501 and the API at localhost:7437. No additional infrastructure is required for a basic setup.
What are chains in AGiXT?
Chains are multi-step workflows where each step can call a different LLM provider, execute a plugin (web browsing, code execution, file operations), or delegate to another agent. They support conditional branching and looping for complex automation.
Can AGiXT use external tools?
Yes. AGiXT has a plugin system with built-in plugins for web browsing, code execution (Python, Shell), file operations, email, GitHub integration, database queries, image generation, and voice. You can also write custom plugins.
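The plugin pattern can be illustrated with a toy registry in plain Python. This is not AGiXT's actual extension API; the decorator, registry, and `shell_echo` command are all hypothetical.

```python
# Toy plugin registry illustrating the pattern (not AGiXT's real API):
# commands are registered by name and dispatched by the agent at runtime.
PLUGINS = {}

def plugin(name):
    def register(fn):
        PLUGINS[name] = fn
        return fn
    return register

@plugin("shell_echo")
def shell_echo(text):
    return f"echo: {text}"           # stand-in for real shell execution

def run_plugin(name, *args):
    return PLUGINS[name](*args)      # agent looks up the tool by name

print(run_plugin("shell_echo", "hello"))  # echo: hello
```

Registering tools by name is what lets an agent pick them at runtime; the pitfall noted earlier about enabling too many plugins follows directly, since every registered tool adds to the agent's decision space.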
How does AGiXT compare to CrewAI and AutoGen?
AGiXT focuses on provider breadth (50+ integrations) and a plugin-based tool system with a visual web UI. CrewAI emphasizes role-based multi-agent collaboration. AutoGen focuses on conversational multi-agent patterns. AGiXT is more infrastructure-oriented while CrewAI and AutoGen are more pattern-oriented.
Citations (3)
- AGiXT GitHub — AGiXT supports 50+ LLM providers and plugin system
- Anthropic API Docs — Anthropic Claude model integration for AI agents
- Docker Documentation — Docker Compose for container orchestration
Source & Thanks
Created by Josh-XT. Licensed under MIT.
Josh-XT/AGiXT — 3k+ stars