OPTX DOCS
JOE — Jett Optics Engine

HEDGEHOG Gateway

HEDGEHOG (Handshake Encrypted Delegated Gesture Envelope Handler Optical Gateway) is the AI gateway that powers AstroJOE's intelligence. It runs on OPTX Validator Nodes and proxies requests to Grok 4.20 via the xAI API. Current version: v3.6.0.

Two Modes of Operation

HEDGEHOG operates in two distinct modes:

1. MCP Server (stdio) — For Claude Code / Cursor

When launched as an MCP server, HEDGEHOG communicates via stdio using the Model Context Protocol. This is how developers interact with HEDGEHOG from their IDE.

  • Protocol: stdio (not HTTP)
  • 12 tools exposed via MCP
  • Configuration: ~/.claude/mcp.json or ~/.cursor/mcp.json
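
A hypothetical entry in ~/.claude/mcp.json might look like the sketch below; only the file location comes from this document, so the server name, command, and flag are assumptions about how the HEDGEHOG binary is launched:

```json
{
  "mcpServers": {
    "hedgehog": {
      "command": "hedgehog",
      "args": ["--mcp"]
    }
  }
}
```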

2. HTTP Service (port 8811) — REST API for Agents

When running as a service, HEDGEHOG exposes an OpenAI-compatible REST API. This is how Hermes Agent and AgenC connect for LLM inference.

  • Protocol: HTTP REST (OpenAI-compatible)
  • No /mcp endpoint — this is a pure REST API, not an MCP-over-HTTP server
  • HEDGEHOG serves as both the LLM gateway AND the tool-calling proxy: web_search and x_search are handled via the /v1/responses endpoint

HTTP Endpoints

OpenAI-Compatible Chat Completions

POST /v1/chat/completions

Supports SSE streaming. Hermes Agent points its OPENAI_BASE_URL at this endpoint for LLM inference.
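
As a sketch, a request body for this endpoint can be assembled as below; the localhost address and the default model are assumptions drawn from the port and routing notes in this document, not pinned configuration:

```python
import json

# Assumed local address; HEDGEHOG's HTTP service listens on port 8811.
HEDGEHOG_BASE_URL = "http://localhost:8811"

def build_chat_request(prompt: str, model: str = "grok-3-fast", stream: bool = False) -> dict:
    """Assemble an OpenAI-compatible /v1/chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # True enables SSE streaming
    }

# The body would be POSTed to f"{HEDGEHOG_BASE_URL}/v1/chat/completions".
print(json.dumps(build_chat_request("ping", stream=True), indent=2))
```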

Multi-Agent Responses (Tool Calling Proxy)

POST /v1/responses

Proxies to xAI's /v1/responses with Grok 4.20 multi-agent mode. Built-in tools:

  • web_search — Search the web
  • x_search — Search X/Twitter

This is how AstroJOE's hedgehog-websearch skill accesses web and X search — by calling the HEDGEHOG REST API, not via MCP.
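
A hedged sketch of the proxied request: the endpoint path, model name, and tool names come from this document, but the exact field names of xAI's Responses API ("input", "tools", "type") are illustrative assumptions:

```python
def build_responses_request(query: str) -> dict:
    """Sketch of a /v1/responses request body; field names are illustrative."""
    return {
        "model": "grok-4.20-multi-agent-beta-0309",  # primary multi-agent model
        "input": query,
        "tools": [
            {"type": "web_search"},  # built-in web search tool
            {"type": "x_search"},    # built-in X/Twitter search tool
        ],
    }
```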

Memory Endpoints

POST /memory/store    — Store a key-value memory
GET  /memory/recall   — Recall by key
GET  /memory/recent   — Get recent memories
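
As a sketch, the memory endpoints can be addressed as follows; the host and the "key"/"value" parameter names are assumptions inferred from the endpoint descriptions above:

```python
from urllib.parse import urlencode

BASE = "http://localhost:8811"  # assumed HEDGEHOG service address

def store_memory_request(key: str, value: str) -> tuple:
    """URL and JSON body for POST /memory/store."""
    return f"{BASE}/memory/store", {"key": key, "value": value}

def recall_memory_url(key: str) -> str:
    """URL for GET /memory/recall; 'key' as a query parameter is an assumption."""
    return f"{BASE}/memory/recall?{urlencode({'key': key})}"

def recent_memories_url() -> str:
    """URL for GET /memory/recent."""
    return f"{BASE}/memory/recent"
```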

Models

GET /v1/models        — List available models

MCP Tools (12 total)

Tool                            Description
hedgehog_grok_query             Query Grok 4.20 with project context
hedgehog_get_context            Retrieve jOSH-spatial work context
hedgehog_store_gaze_data        Store COG/ENV/EMO gaze tensors
hedgehog_retrieve_gaze_data     Retrieve gaze tracking history
hedgehog_analyze_gaze_pattern   AI-powered gaze pattern analysis
hedgehog_chat_completion        Multi-model AI chat
hedgehog_xai_api_history        xAI API call audit trail
hedgehog_xai_api_stats          xAI API usage statistics
hedgehog_gateway                xAI API Gateway with embedded key
hedgehog_memory_store           Store memory in SpacetimeDB
hedgehog_memory_recall          Recall memory by key
hedgehog_memory_search          Search memories

Model Routing

Priority   Model                             Use Case
Primary    grok-4.20-multi-agent-beta-0309   Web search, X search, research, spatial reasoning
Fallback   grok-4.20-0309-reasoning          Complex reasoning without multi-agent tools
Chat       grok-3-fast                       Standard chat completions (AgenC, Hermes)

Multi-Agent Fallback

If the /v1/responses endpoint fails, HEDGEHOG automatically falls back to /v1/chat/completions with the reasoning model.
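
The fallback above can be sketched as follows; call_responses and call_chat are hypothetical stand-ins for HTTP calls to the two endpoints, and only the model names and the try-then-fall-back order come from this document:

```python
def query_with_fallback(query, call_responses, call_chat):
    """Try the multi-agent /v1/responses path first; on any failure,
    retry via /v1/chat/completions with the reasoning model."""
    try:
        return call_responses("grok-4.20-multi-agent-beta-0309", query)
    except Exception:
        return call_chat("grok-4.20-0309-reasoning", query)
```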

Integration Notes

  • Hermes Agent connects to HEDGEHOG via OPENAI_BASE_URL pointed at port 8811. Custom skills (hedgehog-websearch, hedgehog-memory) handle tool access through REST calls.
  • Claude Code / Cursor connects via MCP stdio — no HTTP involved.
  • HEDGEHOG was removed from Hermes' mcp_servers config because there is no /mcp endpoint on the HTTP service. LLM routing continues via OPENAI_BASE_URL.