Using Codex CLI with gpt-oss:120b on an NVIDIA DGX Spark via Tailscale
This technical blog post details how to configure and use OpenAI's Codex CLI coding agent with the gpt-oss:120b model running locally in Ollama on an NVIDIA DGX Spark. The setup uses a Tailscale network to let the author run the Codex CLI from their laptop anywhere in the world against this self-hosted model, demonstrated by building a Space Invaders clone.
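The post does not include the exact commands, but a setup like this can be sketched roughly as follows. The Tailscale machine name `spark` is a placeholder, and the specific flags are assumptions based on Codex CLI's open-weight model support and Ollama's standard environment variables:

```shell
# On the DGX Spark: pull the open-weight model into Ollama
# (assumes Ollama is installed and serving on its default port, 11434)
ollama pull gpt-oss:120b

# On the laptop: point the Ollama client/SDK at the Spark over Tailscale.
# "spark" is a hypothetical Tailscale MagicDNS hostname for the DGX Spark.
export OLLAMA_HOST="http://spark:11434"

# Run Codex CLI against the local open-weight provider instead of the
# hosted OpenAI API (--oss selects the local Ollama backend).
codex --oss -m gpt-oss:120b
```

Because Tailscale puts both machines on the same private network, no port forwarding or public exposure of the Ollama server is needed; the laptop reaches the Spark by its tailnet hostname from anywhere.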