- Launcher Auth Flow Upgrade: Unified login/setup/logout with browserless OAuth support
- Dashboard Auth Upgrade: Web UI authentication migrated from Token to Password
- Gateway Auth Simplification: Consolidated from PID+Token to Token-only auth
- Two-Phase Agent Loop Refactor: Execution pipeline split and reorganized for better maintainability
- Native Gemini Provider: Added native Gemini provider with separated thought/output messages
- Context & Memory Improvements: Added a `/context` usage indicator ring and extended `/clear` to purge the Seahorse short-term memory DB
- Scheduled task isolation: each job now runs in an independent session
- Improved MCP/Discovery hot-reload recovery, including stronger recovery after unsupported image inputs
- Tool-calling consistency fix: stable tool-call IDs across turns when reusing calls
- Web tools page refactor: switched to "Tool Library + Web Search Settings" tab structure
- Markdown syntax highlighting support for chat and skill content
- Improved composer/tooltip disabled-state reason prompts, with regression fixes
- Added configurable Sogou search backend with stronger proxy/network-error fallback
- Channel config evolved toward multi-instance management with list-based channel editing
- Added Android arm64 cross-compilation and integrated it into the GoReleaser pipeline
- CI upgraded to pnpm/action-setup workflow and README install steps updated
- Documentation system reorganized: added session/routing docs and updated Gemini protocol docs
- Sandbox Isolation: Agent execution isolation for safer tool runs
- Seahorse Short-Term Memory: New LCM (short-term memory) engine for the seahorse memory subsystem
- Enhanced Hooks System: New `respond` action and comprehensive hook documentation
- Microsoft Teams Channel: New `teams_webhook` output-only channel
- HTTP Provider Headers: Custom header injection for HTTP-based LLM providers
- MCP Artifact Storage: Oversized text results from MCP tools are now stored as artifacts
- New LOCOMO memory benchmark tool for evaluating memory engines
- Per-candidate provider support for `model_fallbacks`
- Feishu: enriched reply context now includes card and file replies
- Hardened gateway PID liveness, ownership validation, and WebSocket proxy state
- Web UI now derives the WebSocket URL from the browser location instead of the backend
- Fix: WebUI could not connect to the gateway started by WebUI itself
- Fix: Docker launcher missing `-console` flag and open network access
- Fix: `write_file` nested-JSON escape semantics clarified with new tests
- Fix: duplicate `v` in CLI help banner version line
- Disabled the seahorse context manager on FreeBSD/ARM and other unsupported platforms
- New LLM Providers: Mimo AI, Venice AI, LM Studio, and Z.AI added
- VK Channel: VK social network support
- Skill Marketplace: Web UI hub for browsing and installing skills from registry
- Launcher Dashboard: Token-protected dashboard with SPA login page
- New Tools: `load_image` (local vision), `read_file` by lines, exec background/PTY, reaction tool
- LLM Rate Limiting: Per-model rate limiting and a ContextManager abstraction
- Image message support in Pico chat Web UI
- Multi-message sending via a `---` split marker
- BM25 index precomputation for faster memory/knowledge searches
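The `---` split marker above can be sketched as a splitter that breaks composed text into separate outbound messages on any line containing only the marker. This is a hypothetical illustration (the function name `splitMessages` and exact trimming behavior are assumptions, not PicoClaw's actual implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// splitMessages splits composed text into separate outbound messages on
// lines consisting solely of "---". Empty segments are dropped.
func splitMessages(text string) []string {
	var msgs []string
	var buf []string
	flush := func() {
		if part := strings.TrimSpace(strings.Join(buf, "\n")); part != "" {
			msgs = append(msgs, part)
		}
		buf = buf[:0]
	}
	for _, line := range strings.Split(text, "\n") {
		if strings.TrimSpace(line) == "---" {
			flush()
			continue
		}
		buf = append(buf, line)
	}
	flush()
	return msgs
}

func main() {
	fmt.Printf("%q\n", splitMessages("first part\n---\nsecond part"))
	// prints ["first part" "second part"]
}
```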
- First-time onboarding tour guide for new users
- `PICOCLAW_LOG_FILE` env var for file-only logging
- Telegram: quoted reply context and media included in inbound turns
- Fix: Telegram HTML links broken by italic regex inside href URLs
- Fix: WeChat/WeCom new protocol support
- Fix: token over-estimation causing premature context pruning
- Fix: Retry-After header honored for 429 rate limit responses
- Fix: GitHub Copilot session creation failure
- Largest Update Ever: 539 files changed, ~86,000 lines of code added, surpassing 26K Stars
- Agent Architecture Overhaul: Turn & Sub-turn lifecycle, EventBus, Hook system, Steering dynamic intervention
- Deep WeChat/WeCom Integration: Enhanced message processing and context management
- Security System Upgrade: Comprehensive security configuration and permission management
- New Providers & Channels: Extensive new model and channel support
- Turn & Sub-turn lifecycle with max concurrency 5, max nesting depth 3
- EventBus with 18 event types covering full Agent lifecycle
- Hook system: Observer, Interceptor, Approval with 5 checkpoints (Python/Node.js compatible)
- Steering dynamic intervention: inject instructions into running Agent
- Context budget management with automatic safe compression
- `AGENT.md` structured definition for Agent identity and capabilities
- System Tray UI: Desktop tray support for all platforms (macOS, Linux, FreeBSD)
- Exec Controls: Configurable exec settings with cron command gating
- Web Gateway: Hot reload and polling state sync for gateway management
- SpawnStatusTool: New tool for reporting subagent statuses
- Configurable cron command execution settings in web UI
- WebSocket proxy through web server port
- Fix: GLM provider nil input handling in tool_use blocks
- Fix: gateway no longer starts when server is not running
- Fix: normalize whitelist path checks for symlinked allowed roots
- Refactored gateway helpers with server.pid health check
- Voice Transcription: Echo voice audio transcription for Discord, Slack, and Telegram
- Agent Management UI: New web UI for agent management and launcher integration
- Security Hardening: Hardened unauthenticated tool-exec paths
- Exec `allow_remote` config support in web settings
- Session key sanitization for slash characters in forum topic keys
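The session key sanitization above guards against slash characters in forum topic keys leaking into file paths. A minimal hypothetical sketch (the function name and the `_` replacement character are assumptions, not the project's actual choice):

```go
package main

import (
	"fmt"
	"strings"
)

// sanitizeSessionKey replaces path separators in a session key so it is
// safe to use as a flat file or map name.
func sanitizeSessionKey(key string) string {
	return strings.ReplaceAll(key, "/", "_")
}

func main() {
	fmt.Println(sanitizeSessionKey("forum/topic/42")) // prints forum_topic_42
}
```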
- Fix: gateway binary path resolution and `--config` flag passing
- Fix: skip meta JSON files during session migration
- Fix: Slack double message in threads
- Refactored skills loader markdown metadata parsing
- Web & UI Architecture Upgrade: Launcher migrated to modular web frontend/backend
- Vision Pipeline: Full image/media support with Base64 conversion and auto file type detection
- JSONL Memory Storage Engine: Append-only session storage with crash recovery
- MCP Deep Integration: Full Model Context Protocol support with parallel tool execution
- New Providers: Minimax, Moonshot (Kimi), Exa AI, SearXNG, and more
- WeCom AIBot integration with streaming task context and reasoning display
- Discord proxy support, channel reference parsing, quote/reply context
- Telegram custom Bot API Server, long message chunking
- Feishu enhanced cards, media, @mention, random reaction emoji
- Native Matrix and IRC channel support
- `send_file` tool for outbound multimedia file sending
- Docker: base image switched from Debian to Alpine
- Cross-compilation: added Linux/s390x and Linux/mipsle
- Architecture Overhaul: Protocol-based provider creation, unified OpenAI/Anthropic/Gemini/Mistral integration
- Channel System Upgrade: Automated placeholder scheduling, typing indicators, emoji reactions
- PicoClaw Launcher & WebUI: Visual configuration management and gateway scheduling
- Skill Discovery: ClawHub integration for skill search, cache, and install
- New providers: Cerebras, Mistral AI, Volcengine (Doubao), Perplexity, Qwen, Google Antigravity
- Native WhatsApp channel with high-performance implementation
- WeCom (Enterprise WeChat) channel support
- Low-resource optimization with comprehensive ARM build support (ARMv6/v7/v8.1)
- Sandbox security: fixed ExecTool directory escape and TOCTOU race conditions
- Multi-language docs: Vietnamese, Portuguese (Brazil) README translations
- LINE Channel: LINE Official Account support
- OneBot Channel: OneBot protocol support
- Health Check: `/health` and `/ready` endpoints for container orchestration
- Local AI: Ollama integration for local inference
- DuckDuckGo search fallback with configurable providers
- USB device hotplug event notifications on Linux
- Telegram structured command handling
- Security: blocked symlink workspace escape, tightened file permissions
- Linux/loong64 build support
- Migrated to GoReleaser for release management
- Dynamic Context Compression: Intelligent conversation context compression
- New Channels: Feishu, QQ, DingTalk Stream Mode, Slack Socket Mode
- Agent Memory System: Memory and tool execution improvements
- Cron Tool: Scheduled task support with shell command execution
- OAuth login: SDK-based subscription provider authentication
- `picoclaw migrate` command for OpenClaw workspace migration
- Telegram channel migrated to the Telego library
- CLI-based LLM provider for subprocess integration
- macOS Darwin ARM64 build target
- Initial release of PicoClaw — ultra-lightweight AI assistant written in Go
- Single binary deployment with multi-platform support