refactor: prepare unified exec for zsh-fork backend (#13392)
Why
`shell_zsh_fork` already provides stronger guarantees around which executables receive elevated permissions. To reuse that machinery from unified exec without pushing Unix-specific escalation details through generic runtime code, the escalation bootstrap and session lifetime handling need a cleaner boundary. That boundary also needs to be safe for long-lived sessions: when an intercepted shell session is closed or pruned, any in-flight approval workers and any already-approved escalated child they spawned must be torn down with the session, and the inherited escalation socket must not leak into unrelated subprocesses.
What Changed
- Extracted a reusable `EscalationSession` and `EscalateServer::start_session(...)` in `shell-escalation` so callers can get the wrapper/socket env overlay and keep the escalation server alive without immediately running a one-shot command.
- Documented that `EscalationSession::env()` and `ShellCommandExecutor::run(...)` exchange only that env overlay, which callers must merge into their own base shell environment.
- Clarified the prepared-exec helper boundary in `core` by naming the new helper APIs around `ExecRequest`, while keeping the legacy `execute_env(...)` entrypoints as thin compatibility wrappers for existing callers that still use the older naming.
- Added a small post-spawn hook on the prepared execution path so the parent copy of the inheritable escalation socket is closed immediately after both the existing one-shot shell-command spawn and the unified-exec spawn.
- Made session teardown explicit with session-scoped cancellation: dropping an `EscalationSession` or canceling its parent request now stops intercept workers, and the server-spawned escalated child uses `kill_on_drop(true)` so teardown cannot orphan an already-approved child.
- Added `UnifiedExecBackendConfig` plumbing through `ToolsConfig`, a `shell::zsh_fork_backend` facade, and an opaque unified-exec spawn-lifecycle hook so unified exec can prepare a wrapped `zsh -c`/`-lc` request without storing `EscalationSession` directly in generic process/runtime code.
- Kept the existing `shell_command` zsh-fork behavior intact on top of the new bootstrap path. Tool selection is unchanged in this PR: when `shell_zsh_fork` is enabled, `ShellCommand` still wins over `exec_command`.

Verification
- `cargo test -p codex-shell-escalation`
  - includes coverage for `start_session_exposes_wrapper_env_overlay`
  - includes coverage for `exec_closes_parent_socket_after_shell_spawn`
  - includes coverage for `dropping_session_aborts_intercept_workers_and_kills_spawned_child`
- `cargo test -p codex-core shell_zsh_fork_prefers_shell_command_over_unified_exec`
- `cargo test -p codex-core --test all shell_zsh_fork_prompts_for_skill_script_execution`
Stack created with Sapling. Best reviewed with ReviewStack.
`npm i -g @openai/codex` or `brew install --cask codex`

Codex CLI is a coding agent from OpenAI that runs locally on your computer.
If you want Codex in your code editor (VS Code, Cursor, Windsurf), install in your IDE.
If you want the desktop app experience, run `codex app` or visit the Codex App page. If you are looking for the cloud-based agent from OpenAI, Codex Web, go to chatgpt.com/codex.
Quickstart
Installing and running Codex CLI
Install globally with your preferred package manager:
Then simply run `codex` to get started. You can also go to the latest GitHub Release and download the appropriate binary for your platform.
Each GitHub Release contains many executables, but in practice, you likely want one of these:
- `codex-aarch64-apple-darwin.tar.gz`
- `codex-x86_64-apple-darwin.tar.gz`
- `codex-x86_64-unknown-linux-musl.tar.gz`
- `codex-aarch64-unknown-linux-musl.tar.gz`

Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.

Using Codex with your ChatGPT plan
Run `codex` and select Sign in with ChatGPT. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. Learn more about what’s included in your ChatGPT plan.

You can also use Codex with an API key, but this requires additional setup.
Docs
This repository is licensed under the Apache-2.0 License.