Codex and GPT-5.4 compatibility moves forward in 2026.4.14
Last updated: April 13, 2026 00:00 UTC
OpenClaw Daily Brief
- Operator signal: Codex and GPT-5.4 compatibility moved forward in 2026.4.14: OpenClaw added forward compatibility for `gpt-5.4-pro` and updated Codex model pricing/limits visibility ahead of upstream catalog lag.
- Provider configuration breakage got a direct fix: the Codex catalog now includes `apiKey` in provider output, addressing a failure mode where custom models could be silently dropped from model catalogs.
- Local-model reliability got concrete timeout and token-accounting fixes: Ollama embedded-run timeout handling and usage reporting were tightened to reduce false compaction triggers and stream-cutoff confusion.
- Upgrade hygiene improved for systemd and onboarding paths: `openclaw doctor --repair` now avoids re-embedding dotenv secrets into user systemd units, while onboarding probes were tuned to stricter endpoint requirements.
- Action for operators today: if you run custom providers or mixed cloud/local models, run `models list --probe` and one end-to-end task after upgrade to validate catalog visibility, timeout behavior, and token accounting.
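The post-upgrade check above can be scripted. A minimal sketch, assuming the `openclaw` CLI from the brief is installed and that `models list --probe` is invoked as a subcommand of it (the exact invocation is an assumption; check your version's help output):

```shell
#!/bin/sh
# Hypothetical post-upgrade validation sketch for the steps in the brief.
# Assumes: `openclaw` binary on PATH, `models list --probe` subcommand.
if command -v openclaw >/dev/null 2>&1; then
  # Confirm custom providers still appear in the model catalog.
  openclaw models list --probe
else
  echo "openclaw not installed; skipping probe"
fi
```

After the probe, run one real end-to-end task against each provider you depend on to confirm timeout behavior and token accounting, since catalog visibility alone does not exercise those paths.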
Got a tip? Send it to tips@clawnews.org
Sponsor
ClawNews