Run GPT‑5.5 from the Terminal with a Single OpenAI CLI Command

OpenAI has open‑sourced openai‑cli, an Apache‑2.0‑licensed tool installable via Homebrew or Go. It lets users invoke models such as GPT‑5.5 directly from the command line, outputs structured JSON/YAML, and supports piping, file arguments, and built‑in GJSON filtering, streamlining AI workflows without writing SDK code.

AI Engineering

OpenAI released the open‑source CLI tool openai-cli under the Apache 2.0 license. The project lead described it as a “small ship / passion project,” emphasizing its lightweight nature.

# Homebrew
brew install openai/tools/openai

# Go
go install 'github.com/openai/openai-cli/cmd/openai@latest'

After installation, a model can be invoked with a single command, for example:

export OPENAI_API_KEY="sk-..."
openai responses create --input "Say this is a test" --model gpt-5.5

From Writing Code to Writing Commands

Previously, using OpenAI required language‑specific SDKs and multi‑line scripts. The CLI calls the Responses API directly and supports all cloud‑side tools, including web search, code interpreter, file retrieval, image generation, image editing, speech‑to‑text, and text‑to‑speech. This enables a complete agent workflow to run entirely in the terminal.

Unix‑Style Practical Value

The output is available in Unix‑style structured formats (JSON, YAML, JSONL, pretty, raw), allowing easy piping to other tools. Built‑in GJSON syntax enables field extraction similar to jq. File arguments use the @file.ext syntax, matching curl conventions, and binary data can be supplied with @data:// for explicit base64 encoding.
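As a sketch of what such a pipeline looks like, the snippet below stands in a literal JSON string for the CLI's output; the response shape, the `--output json` flag, and the field names are assumptions for illustration, not documented behavior, and a Python one-liner stands in for GJSON/jq extraction so the example is self-contained:

```shell
# Stand-in for structured CLI output. In real use this would come from
# something like:
#   openai responses create --input "Say this is a test" \
#     --model gpt-5.5 --output json      # flag name is an assumption
# The JSON shape below is illustrative, not the documented response format.
response='{"id":"resp_demo","model":"gpt-5.5","output_text":"This is a test."}'

# Extract one field, as a GJSON filter (or jq) would:
printf '%s\n' "$response" |
  python3 -c 'import json, sys; print(json.load(sys.stdin)["output_text"])'
```

The point is the shape of the workflow: once output is structured, any downstream Unix tool (grep, awk, jq, a CI step) can consume it without SDK code.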

Bringing AI Back to the Unix World

The CLI reconnects the OpenAI API to traditional Unix toolchains—JSON/YAML pipelines, CI systems, audit logs, and secret management—so existing workflows can incorporate LLM agents without Python dependencies. Management commands allow project creation and API‑key assignment, which benefits operations and team administrators.
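To illustrate the audit-log angle, a CI step might append one JSONL record per API call. Everything here is a hedged sketch: the log path, the record fields, and the `--output jsonl` flag are assumptions, and the records are written by hand rather than captured from the CLI:

```shell
# Append-only audit log in JSONL, one record per API call.
log="${TMPDIR:-/tmp}/openai-audit.jsonl"
: > "$log"   # start fresh for the demo

# In real use each line would come from the CLI's JSONL output, e.g.
#   openai responses create ... --output jsonl >> "$log"
# Here we append illustrative records by hand:
printf '%s\n' '{"ts":"2025-01-01T00:00:00Z","model":"gpt-5.5","status":"ok"}' >> "$log"
printf '%s\n' '{"ts":"2025-01-01T00:01:00Z","model":"gpt-5.5","status":"ok"}' >> "$log"

# Standard tools then answer audit questions directly:
grep -c '"model":"gpt-5.5"' "$log"   # count of gpt-5.5 calls
```

Because the log is plain JSONL, the same file slots into existing log shipping, retention, and review tooling with no extra dependencies.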

Anthropic’s claude-cli follows a similar direction, indicating that command‑line interfaces are the next stage of LLM integration as AI moves from a novelty to infrastructure.

The tool follows a “small and beautiful” philosophy, focusing on a clean command‑line interface rather than bundling excessive functionality. For developers who frequently call the OpenAI API in scripts, the CLI can become a daily productivity aid, reducing code overhead.

The project is still being refined, with additional documentation planned. Repository: https://github.com/openai/openai-cli

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: CLI, AI, automation, OpenAI, Unix, GPT-5.5
Written by AI Engineering

Focused on cutting‑edge product and technology information and practical experience sharing in the AI field (large models, MLOps/LLMOps, AI application development, AI infrastructure).
