The Deep Agent Factory v1.0.3

Build agents that think deeply.

Zero dependencies. Bring your own LLM. Attach remote MCP servers. Stream native thinking blocks. Iterate with strict Pydantic schemas.

Get Started
$ pip install 'shipit-agent[all]'
521+ Tests Passing
11 LLM Providers
30+ Built-in Tools
0 Core Dependencies
521 Tests: Unit + E2E
11 Providers: OpenAI, Anthropic, Bedrock, Gemini...
30+ Tools: Built-in & Extensible
100% Local: No Data Leaves Your Machine
Open Source: MIT Licensed
Zero Deps: Only Pydantic
The Deep Agent Factory

Cognitive architectures out of the box.

When a single agent loop isn't enough, switch to a Deep Agent. Gain planning, sub-agent delegation, self-reflection, and runtime tool creation.

goal_example.py
from shipit_agent.deep import GoalAgent, Goal

agent = GoalAgent.with_builtins(
    llm=llm,
    goal=Goal(
        objective="Compare Python async libraries",
        success_criteria=[
            "Speed benchmarks",
            "Memory usage comparison",
            "Cites data sources",
        ],
    ),
)

result = agent.run()
print(result.goal_status)   # "completed"
print(result.criteria_met)  # [True, True, True]
Live Architecture: Goal → Planner → Execute → criteria_met: [True, True, True]
Full Docs
Super RAG Subsystem

Zero-hallucination context injection.

A powerful, pluggable retrieval-augmented generation (RAG) subsystem built directly into the agent. It runs hybrid search (vector + BM25 + Reciprocal Rank Fusion) out of the box with zero required dependencies.

Explore RAG Pipeline
Vector Store (dense, semantic) + BM25 Store (sparse, keywords) → Reciprocal Rank Fusion → AST metadata + exponential time decay
agent.run("Explain JWT rotation")
Injected Hit [1] lib/auth.ts (Score: 0.98)
Injected Hit [2] docs/auth.md (Score: 0.91)
Result attached to result.rag_sources
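The fusion step above can be sketched in plain Python. This is a minimal illustration of Reciprocal Rank Fusion, not Shipit's internal implementation; the function name, the `k=60` constant, and the sample document lists are all illustrative.

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse ranked lists: each document scores the sum of 1 / (k + rank)."""
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

# The dense and sparse retrievers disagree; RRF rewards documents
# that rank well in both lists.
dense  = ["lib/auth.ts", "docs/auth.md", "src/jwt.py"]
sparse = ["docs/auth.md", "README.md", "lib/auth.ts"]
fused = reciprocal_rank_fusion([dense, sparse])
# "docs/auth.md" wins: rank 2 dense + rank 1 sparse beats rank 1 + rank 3
```

Because scores depend only on ranks, not raw similarity values, RRF needs no score normalization between the vector and BM25 backends.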
The Skills Factory

Domain knowledge as executable code.

Skills are runtime behavior packages. They auto-match your intent and inject specialized tools on the fly.

Skill Catalog
Skill Runtime
User: "Create a FastAPI REST API with User and Task models, CRUD endpoints, and a Dockerfile."
Matched skill: Full-Stack Developer
Injected: +write_file +edit_file +bash +run_code +plan_task
SUCCESS: Created 6 files — app/main.py, models.py, routes/tasks.py, database.py, requirements.txt, Dockerfile.
Status: Complete (13 tools injected)
Explore Skills
Pipelines & Agent Teams

Deterministic logic meets dynamic reasoning.

Chain agents together like UNIX pipes. Use Pipeline.sequential() for strict step-by-step processing, or parallel() to fan-out sub-tasks concurrently.

Explore Agent Teams
multi_agent.py
from shipit_agent import Pipeline, step, parallel

pipe = Pipeline.sequential(
    step("plan", agent=planner, prompt="Decompose: {objective}"),
    parallel(
        step("code_auth", agent=coder, prompt="Auth module from: {plan.output}"),
        step("code_db", agent=coder, prompt="DB module from: {plan.output}"),
    ),
    step("verify", agent=verifier, prompt="Verify {code_auth.output} and {code_db.output}"),
)

result = pipe.run(objective="Build a secure REST API")
Execution Graph: Planner → [Coder (Auth), Coder (DB) in parallel] → Verifier
Real-Time Events

Watch the thought process unfold.

Under the hood, Shipit runs on a background thread and pushes AgentEvent objects through a thread-safe queue. Every tool invocation, planning step, and raw reasoning block reaches your loop the instant it's emitted—no buffering.

View Streaming API
agent.stream() loop
run_started (0.0s)
mcp_attached (0.1s)
planning_started (0.2s)
reasoning_started (0.8s)
tool_called (2.4s)
tool_completed (4.1s)
run_completed (5.5s)
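The background-thread-plus-queue mechanism described above can be sketched with the standard library alone. This is an illustration of the pattern, not Shipit's actual internals; the `AgentEvent` fields and the sentinel convention are assumptions, while the event names come from the timeline above.

```python
import queue
import threading
from dataclasses import dataclass, field

@dataclass
class AgentEvent:
    type: str
    payload: dict = field(default_factory=dict)

def run_agent(events: queue.Queue) -> None:
    # Worker thread: push each event the instant its phase completes.
    for name in ("run_started", "tool_called", "tool_completed", "run_completed"):
        events.put(AgentEvent(type=name))
    events.put(None)  # sentinel: the stream is finished

events: queue.Queue = queue.Queue()
threading.Thread(target=run_agent, args=(events,), daemon=True).start()

seen = []
# Consumer loop: get() blocks until the next event arrives, so there
# is no polling and no buffering between producer and consumer.
while (event := events.get()) is not None:
    seen.append(event.type)
```

`queue.Queue` is thread-safe by design, so the agent thread and your loop need no explicit locking.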
Advanced Memory System

Three memory types, one line of code.

Initialize with AgentMemory.default() and the agent handles conversation, semantic history, and entity tracking automatically.

conversation_memory.py
mem = ConversationMemory(
    strategy="summary",
    summary_llm=llm,
    window_size=20,
)

# Old messages → LLM summary
# Recent 20 → kept verbatim
msgs = mem.get_messages()
# [summary_msg, msg_81, ..., msg_100]
Strategies: buffer, window, summary, token
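The summary strategy's core idea can be shown in a few lines of plain Python: everything older than the window collapses into one summary message, and the recent window survives verbatim. This is a sketch of the mechanism, not the library's code; the `compact` name and the stub summarizer stand in for the real LLM-backed summarization.

```python
def compact(messages, window_size=20,
            summarize=lambda msgs: f"[summary of {len(msgs)} messages]"):
    """Keep the last `window_size` messages verbatim; fold the rest into one summary."""
    if len(messages) <= window_size:
        return list(messages)
    old, recent = messages[:-window_size], messages[-window_size:]
    return [summarize(old)] + recent

history = [f"msg_{i}" for i in range(1, 101)]  # 100 conversation turns
msgs = compact(history)
# msgs == ["[summary of 80 messages]", "msg_81", ..., "msg_100"]
```

The prompt stays bounded at `window_size + 1` messages no matter how long the conversation runs.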
Developer Experience

Clean runtime. Observable execution.

We built Shipit to expose low-level control over its execution loop. Keep clean boundaries between your runtime, tools, policies, and profiles.

Zero Core Dependencies

Shipit keeps its footprint light. The base library requires only `pydantic`. Provider SDKs like `openai`, `anthropic`, or `litellm` are strictly opt-in extras.

Real-time Event Streaming

Watch the thought process unfold instantly. Every token, tool argument, reasoning block, and retry streams natively out of the agent loop.

Native MCP Integration

Instantly attach any remote or local Model Context Protocol server. Give your agent secure access to Linear, Slack, Postgres, or internal tooling with one line of code.

Pydantic Structured Output

Stop parsing raw text. Define your exact output schema using Pydantic models. The agent is forced to return strict, typed JSON—perfect for data pipelines.

main.py
from pydantic import BaseModel
from shipit_agent import Agent, Tool

class ResearchResult(BaseModel):
    summary: str
    confidence: float

# Bring your own LLM and Tools
agent = Agent.with_builtins(
    llm=AnthropicChatLLM(model="claude-3-7-sonnet"),
    tools=[web_search, git_status],
    output_schema=ResearchResult,
)

# Stream reasoning and tool calls natively
for chunk in agent.stream("Analyze recent commits"):
    print(chunk.content, end="")
$ python main.py
<thinking> I need to run git_status first to see...
{ 'summary': '...', 'confidence': 0.95 }
Why SHIPIT

How we compare.

SHIPIT Agent is a library, not a framework. Small, focused, and observable. Here's how it stacks up against the alternatives.

Features compared (SHIPIT vs LangChain, CrewAI, AutoGen):
Zero core dependencies
Native reasoning/thinking blocks
Bring your own LLM
MCP server integration
Parallel tool execution
Built-in Super RAG
Deep agent architectures
Pydantic structured output
Real-time event streaming
Extensible markdown skills
Bulletproof Bedrock pairing
Agent memory system

Complete Toolkit

Everything you need for autonomous engineering. No wrappers, no bloated abstractions — highly capable tools that plug directly into your workflow.

25+ Built-in Tools

web_search, open_url, bash, read_file, edit_file, write_file, run_code, plan_task, verify_output, sub_agent, and 15 more. All opt-in, all discoverable via tool_search.

Learn More

9 SaaS Connectors

Gmail, Google Drive, Slack, Linear, Jira, Notion, Confluence, GitHub, PostgreSQL. Each surfaces as agent tools — no wrapper code needed.

Learn More

100% Local & Secure

Your code stays on your machine. Shipit runs locally, isolates memory per project, and requires explicit permission for tool executions.

Learn More

Native MCP Integration

Attach any remote or local Model Context Protocol server. Give agents access to Linear, Slack, Postgres, or internal tooling with one line.

Learn More

Parallel Execution

When the LLM returns multiple tool calls, run them concurrently. Results stay in order. Typically 2-3x faster for multi-tool turns.

Learn More
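The ordered fan-out described above maps cleanly onto the standard library. This is a sketch of the pattern, not Shipit's implementation; the `execute_tools` helper and the call-dict shape are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def execute_tools(calls):
    """Run independent tool calls concurrently.

    executor.map yields results in input order, so the LLM sees
    tool results in the same order it requested them, even when
    a later call finishes first.
    """
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda call: call["fn"](**call["args"]), calls))

# Two illustrative tool calls an LLM might return in a single turn.
calls = [
    {"fn": lambda path: f"read {path}", "args": {"path": "app/main.py"}},
    {"fn": lambda query: f"searched {query}", "args": {"query": "asyncio"}},
]
results = execute_tools(calls)
# results == ["read app/main.py", "searched asyncio"], regardless of finish order
```

Threads suit this workload because tool calls are typically I/O-bound (network, disk, subprocesses), so the GIL is not a bottleneck.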

Pydantic Structured Output

Define output schemas using Pydantic models. The agent returns strict, typed JSON — perfect for data pipelines and downstream systems.

Learn More

Extensible Markdown Skills

Drop a skill file and the agent treats it as an executable behavior package. Skills auto-match prompts and inject tools at runtime.

Learn More

Bulletproof Bedrock Pairing

Every toolUse gets a paired toolResult — even on errors, hallucinated tools, or planner output. Multi-iteration Bedrock loops just work.

Learn More
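The invariant above can be sketched as a small pairing routine: every `toolUse` block produces exactly one `toolResult`, and a failure becomes an error result rather than a missing entry. The message shapes loosely follow Bedrock's Converse tool-use format; the helper names and the toy tool registry are assumptions for illustration.

```python
def pair_tool_results(tool_uses, run_tool):
    """Return exactly one toolResult per toolUse; failures become error results."""
    results = []
    for use in tool_uses:
        try:
            output = run_tool(use["name"], use["input"])
            results.append({"toolResult": {"toolUseId": use["toolUseId"],
                                           "content": [{"text": str(output)}]}})
        except Exception as exc:  # hallucinated tool name, crash, bad args, ...
            results.append({"toolResult": {"toolUseId": use["toolUseId"],
                                           "content": [{"text": str(exc)}],
                                           "status": "error"}})
    return results

def run_tool(name, args):
    # Toy registry standing in for the real tool dispatcher.
    tools = {"echo": lambda text: text}
    if name not in tools:
        raise KeyError(f"unknown tool: {name}")
    return tools[name](**args)

paired = pair_tool_results(
    [{"toolUseId": "t1", "name": "echo", "input": {"text": "hi"}},
     {"toolUseId": "t2", "name": "made_up_tool", "input": {}}],
    run_tool,
)
# Both calls get a result; the hallucinated tool yields status "error".
```

Without this pairing, a single unmatched `toolUse` invalidates the conversation on the next iteration, which is why multi-turn loops break.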

Bash & Code Execution

Full terminal support with sandboxed subprocess execution. Agents can run code, install packages, run tests, and create git commits.

Learn More
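A minimal version of subprocess-isolated code execution fits in a few lines of standard-library Python. This sketch shows only the timeout-and-capture core; it is not Shipit's sandbox, and real isolation adds more (resource limits, restricted environments, containers). The `run_code` name is illustrative.

```python
import subprocess
import sys

def run_code(source: str, timeout: float = 10.0) -> dict:
    """Execute Python source in a child process with a hard timeout.

    The child cannot hang the agent loop (timeout kills it) or
    corrupt the parent interpreter's state.
    """
    proc = subprocess.run(
        [sys.executable, "-c", source],
        capture_output=True,  # collect stdout/stderr instead of inheriting
        text=True,            # decode bytes, normalize newlines
        timeout=timeout,
    )
    return {"stdout": proc.stdout, "stderr": proc.stderr, "exit_code": proc.returncode}

result = run_code("print(2 + 2)")
# result["stdout"] == "4\n", result["exit_code"] == 0
```

Returning a structured dict rather than raw output lets the agent reason separately about the program's result and its failure mode.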
Ready to deploy

Start shipping today.

Whether you're exploring deep architectures, executing multi-step workflows, or integrating custom tools — Shipit is your autonomous Python engineer.

Get Started Free
$ pip install 'shipit-agent[all]'