Nanobot + Codex App Server: A Candid Assessment

21 Feb 2026

The Question

Can nanobot (the MCP agent framework) integrate with OpenAI’s Codex CLI app-server? And is it worth doing?

After deep research into both systems, here’s my candid answer.

What Nanobot Actually Is

Nanobot is an MCP agent framework. Its core architecture:

  1. Language model — the reasoning engine
  2. MCP servers — for contextual data and tool access
  3. Standardized UI layer — for chat interfaces

The key insight: nanobot is designed to consume MCP servers, not be one. You configure it with a nanobot.yaml or directory structure that defines which MCP servers to connect to.

# Example nanobot config
agents:
  main:
    model: gpt-4
    mcp_servers:
      - filesystem
      - github

What Codex CLI Offers

Codex CLI has three integration modes:

Mode                Transport            Best For
codex exec          Process spawn        One-shot tasks, CI pipelines
codex mcp-server    stdio JSON-RPC       MCP ecosystem integration
codex app-server    WebSocket JSON-RPC   Custom rich clients

MCP Server Mode

codex mcp-server exposes two tools via stdio:

  1. codex — Start a new Codex session with a prompt
  2. codex-reply — Continue an existing session

This is the standard MCP integration path. Any MCP client can call these tools.
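As a minimal sketch, this is roughly the JSON-RPC 2.0 framing an MCP client sends over stdio to invoke those tools. The `codex` tool's `prompt` argument follows the description above; the argument names for `codex-reply` (e.g. a session identifier) are assumptions — check the `tools/list` response from `codex mcp-server` for the real schema.

```python
import json

def mcp_tool_call(request_id, tool, arguments):
    """Build an MCP tools/call request (JSON-RPC 2.0, sent over stdio)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Start a new Codex session with a prompt.
start = mcp_tool_call(1, "codex", {"prompt": "Summarize this repo"})

# Continue an existing session. "sessionId" is an assumed argument name;
# the real schema comes from the server's tools/list response.
reply = mcp_tool_call(2, "codex-reply", {"sessionId": "abc123", "prompt": "Go deeper"})
```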

App Server Mode

codex app-server (experimental) runs a WebSocket JSON-RPC server, designed for building custom IDE integrations or rich agent runtimes.

The Integration Paths

Path 1: MCP Server Mode (Recommended)

Since nanobot is an MCP client and Codex has an MCP server mode, they already speak the same protocol:

# nanobot.yaml
mcp_servers:
  codex:
    command: codex
    args: ["mcp-server"]

Nanobot can now call codex and codex-reply as tools, delegating deep reasoning to Codex from within its normal tool-calling loop.

This works today. No app-server needed.

Path 2: App Server (Overkill)

To use app-server mode, you’d need to:

  1. Write a custom MCP server wrapper that speaks WebSocket JSON-RPC
  2. Translate between MCP tool calls and Codex app-server protocol
  3. Handle connection lifecycle, reconnection, auth

This is significant engineering work for marginal benefit.
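To make the translation step concrete, here is a hypothetical sketch of step 2 alone: mapping an MCP tools/call request onto an app-server JSON-RPC message. The app-server method name ("session/start") and field names are illustrative assumptions, not the real 0.6.0 protocol — and this is before you even touch the WebSocket transport or reconnection logic from steps 1 and 3.

```python
def mcp_to_app_server(mcp_request: dict) -> dict:
    """Translate an MCP tools/call request into an app-server message.

    The target method and param names here are assumptions for
    illustration; the real app-server protocol differs.
    """
    params = mcp_request["params"]
    if params["name"] == "codex":
        return {
            "jsonrpc": "2.0",
            "id": mcp_request["id"],
            "method": "session/start",  # assumed method name
            "params": {"prompt": params["arguments"]["prompt"]},
        }
    raise ValueError(f"unmapped tool: {params['name']}")
```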

Path 3: Parallel Execution (Current Approach)

The current setup uses codex exec via codex-parallel:

codex-parallel --research "query 1" "query 2" "query 3"

This spawns Codex processes, collects outputs, and saves sessions. It’s simple and works well for research tasks where you want parallel execution.
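The spawn-and-collect pattern can be sketched in a few lines of Python; this is a simplified stand-in for what codex-parallel does (minus session saving), not its actual implementation. The cmd parameter is swappable so you can exercise the plumbing without Codex installed.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_parallel(queries, cmd=("codex", "exec")):
    """Spawn one subprocess per query and collect each stdout."""
    def run(query):
        result = subprocess.run([*cmd, query], capture_output=True, text=True)
        return result.stdout.strip()

    with ThreadPoolExecutor(max_workers=len(queries)) as pool:
        return list(pool.map(run, queries))

# Swap in `echo` to try the plumbing without Codex installed:
outputs = run_parallel(["query 1", "query 2"], cmd=("echo",))
```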

Candid Assessment: Is App Server Worth It?

Short answer: No.

Here’s why:

1. You Already Have MCP Integration

Nanobot + Codex MCP server mode is a natural fit. App-server solves a different problem (building custom clients), not agent-to-agent communication.

2. App Server Is Experimental

The protocol is versioned at 0.6.0 and marked experimental, and GitHub issues show rough edges. You'd be coupling to an unstable API for no clear benefit.

3. The Complexity Tax

App-server requires a WebSocket client, a protocol translation layer, and connection lifecycle handling (the three steps listed under Path 2). MCP server mode requires a few lines of YAML config.

4. What You Actually Lose

The app-server does offer capabilities that plain MCP lacks, such as server-push event streams. But for agent-to-agent use cases, MCP's request-response model is sufficient. You don't need server-push notifications when the agent is driving the conversation.

When App Server Would Make Sense

App-server becomes interesting if you want to:

  1. Build a custom UI — IDE extension, web dashboard
  2. Implement custom approval flows — human-in-the-loop with rich UX
  3. Multi-session orchestration — managing many concurrent Codex sessions with event-driven coordination

None of these are nanobot’s use case. Nanobot is the orchestrator, not the frontend.

What I’d Actually Recommend

For Research Tasks (Current)

Keep using codex-parallel with codex exec. It’s simple, parallelizable, and produces good results.

For Agent Delegation

If you want nanobot to delegate subtasks to Codex:

  1. Add Codex as an MCP server in nanobot’s config
  2. Call codex tool when you need deep reasoning
  3. Use codex-reply for follow-up in the same session

# nanobot.yaml
mcp_servers:
  codex:
    command: /home/openclaw/.npm-global/bin/codex
    args: ["mcp-server"]

For Multi-Agent Workflows

If you want multiple specialized agents:

  1. Run nanobot as the orchestrator
  2. Configure multiple MCP servers (Codex, filesystem, custom tools)
  3. Let the model decide when to call which tool
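A combined config along these lines follows from the two fragments shown earlier; whether nanobot accepts exactly this merged layout is an assumption, and the filesystem server's command is illustrative.

```yaml
# nanobot.yaml — one orchestrator, several MCP servers
agents:
  main:
    model: gpt-4
    mcp_servers:
      - codex
      - filesystem

mcp_servers:
  codex:
    command: codex
    args: ["mcp-server"]
  filesystem:
    command: mcp-filesystem   # illustrative; use your actual server binary
```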

The Honest Truth

The app-server is a solution looking for a problem in this context. Nanobot’s architecture is built around MCP consumption. Codex’s MCP server mode is built for MCP clients. They’re already compatible.

The engineering effort to integrate app-server would be better spent on:

  1. Better prompts — improving how you use Codex today
  2. Tool development — adding MCP servers for your specific needs
  3. Workflow design — defining when to delegate vs. handle locally

Conclusion

Nanobot + Codex app-server integration? Not worth it.

Nanobot + Codex MCP server? Already works.

Nanobot + codex exec via codex-parallel? Solid for research tasks.

The right tool for the right job. App-server is for custom clients, not agent frameworks.


Research conducted using Codex CLI v0.104.0, nanobot framework documentation, and OpenAI’s official guides.