Repository: janlay/openai-cli
Branch: master
Commit: 933130188dc5
Files: 3
Total size: 20.6 KB
Directory structure:
gitextract_j5ojrd0i/
├── LICENSE
├── README.md
└── openai
================================================
FILE CONTENTS
================================================
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2023 Janlay Wu
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# openai-cli
A universal CLI for OpenAI, written in Bash.
# Features
- [x] Scalable architecture allows for continuous support of new APIs.
- [x] Custom API name, version, and all relevant properties.
- [x] Dry-run mode (without actually initiating API calls) to facilitate debugging of APIs and save costs.
- [x] New in v3: Supports any AI services that provide OpenAI-compatible APIs.
Important changes in version 3:
- `-v api_version` from previous versions has been removed. If you use a custom `OPENAI_API_ENDPOINT`, append the API version to it. The internal API version was removed because some services, such as DeepSeek, use no version prefix before `chat/completions`.
- `OPENAI_CHAT_MODEL` is no longer supported; use `OPENAI_API_MODEL` instead.
- By default, requests no longer include the optional parameters `temperature` / `max_tokens`. Add `+temperature` / `+max_tokens` explicitly if you need them.
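The `+key=value` properties mentioned above are typed automatically. A simplified sketch (not the script's exact code) of the rule it applies: numeric, boolean, array, and object values go into the payload as native JSON literals; everything else is sent as a JSON string.

```shell
# Simplified sketch of how the CLI decides whether a +key=value property
# becomes a JSON literal or a JSON string in the request payload.
prop_type() {
  local value="${1#*=}"
  if [[ $value =~ ^[+-]?[0-9.]+$ || $value = true || $value = false || $value == [\[\{]* ]]; then
    echo json    # passed through as a JSON number/boolean/array/object
  else
    echo string  # quoted as a JSON string
  fi
}

prop_type temperature=1.1   # json
prop_type stream=false      # json
prop_type model=gpt-4o      # string
```

So `+temperature=1.1` produces `"temperature": 1.1` in the payload, while `+model=gpt-4o` produces `"model": "gpt-4o"`.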
Available APIs:
- [x] `chat/completions` (default API)
- [x] `models`
- [x] `images/generations`
- [x] `embeddings`
- [x] `moderations`
The default API `chat/completions` provides:
- [x] Complete pipelining to interoperate with other applications
- [x] Allow prompts to be read from command line arguments, file, and stdin
- [x] Support streaming
- [x] Support multiple topics
- [x] Support continuous conversations.
- [ ] Token usage
# Installation
- [jq](https://stedolan.github.io/jq/) is required.
- Linux: `sudo apt install jq`
- macOS: `brew install jq`
- Download script and mark it executable:
```bash
curl -fsSLOJ https://go.janlay.com/openai
chmod +x openai
```
You may want to add this file to a directory in `$PATH`.
Also install the manual page, e.g.:
```bash
pandoc -s -f markdown -t man README.md > /usr/local/man/man1/openai.1
```
<details>
<summary>Further reading: curl's killer feature</summary>
<a href="https://daniel.haxx.se/blog/2020/09/10/store-the-curl-output-over-there/"><code>-OJ</code> is a killer feature</a>
</details>
Now you can try it out!
# Tips
## Getting started
To begin, type `openai -h` to access the help manual.
⚠️ If you run `openai` with no arguments, it may appear to hang: it is waiting for prompt content on stdin. Press Ctrl+C to interrupt the process.
<details>
<summary>Why are you so serious?</summary>
What happens when the `openai` command is executed without any parameters? It means that:
- The default API used will be `chat/completions`, and the schema version will be `v1`.
- The prompt will be read from stdin.
- The program will wait for input while stdin remains empty.
</details>
## Quick Examples
The best way to understand how to use `openai` is to see various usage cases.
- Debug API data for testing purposes
`openai -n foo bar`
- Say hello to OpenAI
`openai Hello`
- Use another model
`openai +model=gpt-3.5-turbo-0301 Hello`
- Disable streaming and allow more variation in the answer
`openai +stream=false +temperature=1.1 Hello`
- Call another available API
`openai -a models`
- Create a topic named `en2fr` with initial prompt
`openai @en2fr Translate to French`
- Use existing topic
`openai @en2fr Hello, world!`
- Read prompt from clipboard then send result to another topic
`pbpaste | openai | openai @en2fr`
## Providing prompt
There are multiple ways to provide a prompt to `openai`:
- Enclose the prompt in single quotes `'` or double quotes `"`
`openai "Please help me translate '你好' into English"`
- Use any argument that does not begin with a minus sign `-`
`openai Hello, world!`
- Place any arguments after `--`
`openai -n -- What is the purpose of the -- argument in Linux commands`
- Input from stdin
`echo 'Hello, world!' | openai`
- Specify a file path with `-f /path/to/file`
`openai -f question.txt`
- Use `-f-` for input from stdin
`cat question.txt | openai -f-`
Choose any one you like :-)
## OpenAI key
`$OPENAI_API_KEY` must be set to use this tool. Make your OpenAI key available by adding this line to your `~/.profile`:
```bash
export OPENAI_API_KEY=sk-****
```
Or you may want to run with a temporary key for one-time use:
```bash
OPENAI_API_KEY=sk-**** openai hello
```
Environment variables can also be set in `$HOME/.openai/config`.
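For example, a minimal `$HOME/.openai/config` might look like this (the values below are placeholders; the script sources this file at startup, so it uses plain shell syntax):

```shell
# ~/.openai/config — sourced by the script at startup
export OPENAI_API_KEY=sk-****
export OPENAI_API_MODEL=gpt-4o
```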
## Working with compatible AI services
You can point the tool at another OpenAI-compatible service by setting `OPENAI_COMPATIBLE_PROVIDER` to the service name in uppercase, such as `DEEPSEEK`, and then setting the three environment variables `DEEPSEEK_API_ENDPOINT`, `DEEPSEEK_API_KEY`, and `DEEPSEEK_API_MODEL`:
```bash
export OPENAI_COMPATIBLE_PROVIDER=DEEPSEEK
export DEEPSEEK_API_ENDPOINT=https://api.deepseek.com
export DEEPSEEK_API_KEY=sk-***
export DEEPSEEK_API_MODEL=deepseek-chat
```
You may have noticed that if you set `OPENAI_COMPATIBLE_PROVIDER` to `FOO`, you also need to set `FOO_API_ENDPOINT`, `FOO_API_KEY`, and `FOO_API_MODEL` accordingly.
Once multiple AI providers are configured, you can switch between them by setting a temporary `OPENAI_COMPATIBLE_PROVIDER` environment variable, as shown below:
```bash
# Switch to DeepSeek
OPENAI_COMPATIBLE_PROVIDER=DEEPSEEK openai
# Switch to Qwen
OPENAI_COMPATIBLE_PROVIDER=QWEN openai
# Switch to the default provider (OpenAI)
OPENAI_COMPATIBLE_PROVIDER= openai
```
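Under the hood, the script resolves these provider-prefixed variables with Bash indirect expansion. The following sketch shows the mechanism (the `get_env` function is taken from the script itself):

```shell
# How provider-prefixed variables are looked up (the script's get_env):
provider="${OPENAI_COMPATIBLE_PROVIDER:-OPENAI}"

get_env() {
  local env_key="${provider}_${1}"
  echo "${!env_key}" # indirect expansion: value of the variable named in env_key
}

# With OPENAI_COMPATIBLE_PROVIDER=DEEPSEEK, `get_env API_KEY` reads $DEEPSEEK_API_KEY;
# with the default provider, it reads $OPENAI_API_KEY.
```

This is why setting `OPENAI_COMPATIBLE_PROVIDER=FOO` requires `FOO_API_ENDPOINT`, `FOO_API_KEY`, and `FOO_API_MODEL` to exist.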
## Testing your API invocations
`openai` offers a [dry-run mode](https://en.wikipedia.org/wiki/Dry_run) that allows you to test command composition without incurring any costs. Give it a try!
```bash
openai -n hello, world!
# This is equivalent:
openai -n 'hello, world!'
```
<details>
<summary>Command and output</summary>
```
$ openai -n hello, world!
Dry-run mode, no API calls made.
Request URL:
--------------
https://api.openai.com/v1/chat/completions
Authorization:
--------------
Bearer sk-cfw****NYre
Payload:
--------------
{
"model": "gpt-3.5-turbo",
"temperature": 0.5,
"max_tokens": 200,
"stream": true,
"messages": [
{
"role": "user",
"content": "hello, world!"
}
]
}
```
</details>
With full pipelining support, you can achieve the same functionality using alternative methods:
```bash
echo 'hello, world!' | openai -n
```
<details>
<summary>For BASH gurus</summary>
This is equivalent:
```bash
echo 'hello, world!' >hello.txt
openai -n <hello.txt
```
Even this one:
```bash
openai -n <<<'hello, world!'
```
and this:
```bash
openai -n < <(echo 'hello, world!')
```
</details>
Now that you have the basic usage down, try getting a real answer from OpenAI:
```bash
openai hello, world!
```
<details>
<summary>Command and output</summary>
```
$ openai hello, world!
Hello there! How can I assist you today?
```
</details>
## Topics
A topic starts with an `@` sign, so `openai @translate Hello, world!` calls the topic named `translate`.
To create a new topic, such as `translate`, with an initial prompt (stored internally with the system role):
```bash
openai @translate 'Translate, no other words: Chinese -> English, Non-Chinese -> Chinese'
```
Then you can use the topic:
```bash
openai @translate 'Hello, world!'
```
You should get answer like `你好,世界!`.
Again, to see what happens, use dry-run mode by adding `-n`. You will see the payload that would be sent:
```json
{
"model": "gpt-3.5-turbo",
"temperature": 0.5,
"max_tokens": 200,
"stream": true,
"messages": [
{
"role": "system",
"content": "Translate, no other words: Chinese -> English, Non-Chinese -> Chinese"
},
{
"role": "user",
"content": "Hello, world!"
}
]
}
```
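Each topic is persisted as a JSON file named `<topic>.json` under the data directory (default `~/.openai`). A simplified sketch of how messages accumulate in that file, mirroring the script's `update_conversation` (requires `jq`; `append_message` is a name used here for illustration):

```shell
# append_message <file> <role> <content> — simplified version of the
# script's update_conversation: appends one message to a topic file.
append_message() {
  local file=$1 data='{}'
  [ -s "$file" ] && data=$(cat "$file")
  jq --arg role "$2" --arg content "$3" \
    '.messages += [{$role, $content}]' <<<"$data" >"$file"
}

topic_file=$(mktemp)
append_message "$topic_file" system 'Translate, no other words: Chinese -> English, Non-Chinese -> Chinese'
append_message "$topic_file" user 'Hello, world!'
jq -c '.messages | map(.role)' "$topic_file"   # ["system","user"]
```

When you reuse a topic without `-c`, only the first (system) message is sent; with `-c`, the whole message history is sent.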
## Chatting
All of the use cases above are standalone queries, not conversations. To chat with OpenAI, use `-c`. This can also continue an existing topic's conversation by prepending `@topic`.
Please note that chat requests will quickly consume tokens, leading to increased costs.
## Advanced
To be continued.
## Manual
To be continued.
# LICENSE
This project uses the MIT license. Please see [LICENSE](https://github.com/janlay/openai-cli/blob/master/LICENSE) for more information.
================================================
FILE: openai
================================================
#!/usr/bin/env bash
#
# OpenAI CLI v3.0
# Created by @janlay
#
set -eo pipefail
declare _config_dir="${OPENAI_DATA_DIR:-$XDG_CONFIG_HOME}"
OPENAI_DATA_DIR="${_config_dir:-$HOME/.openai}"
# Read config file?
[ -e "$OPENAI_DATA_DIR"/config ] && . "$OPENAI_DATA_DIR"/config
# openai-cli accepts various exported environment variables:
# OPENAI_API_KEY : OpenAI's API key
# OPENAI_API_ENDPOINT : Custom API endpoint
# OPENAI_API_MODEL : Which model to use
# OPENAI_DATA_DIR : Directory to store data
OPENAI_API_ENDPOINT="${OPENAI_API_ENDPOINT:-https://api.openai.com/v1}"
OPENAI_API_KEY="${OPENAI_API_KEY:-}"
OPENAI_API_MODEL="${OPENAI_API_MODEL:-gpt-4o}"
# defaults
readonly _app_name=openai _app_version=3.0
readonly provider="${OPENAI_COMPATIBLE_PROVIDER:-OPENAI}"
readonly default_api_name=chat/completions default_model="$OPENAI_API_MODEL" default_topic=General
declare -i chat_mode=0 dry_run=0
declare tokens_file="$OPENAI_DATA_DIR/total_tokens" api_name=$default_api_name topic=$default_topic
declare dump_file dumped_file data_file temp_dir rest_args prompt_file prompt
trap cleanup EXIT
cleanup() {
if [ -d "$temp_dir" ]; then
rm -rf -- "$temp_dir"
fi
}
get_env() {
local env_key="${provider}_${1}"
echo "${!env_key}"
}
raise_error() {
[ "$2" = 0 ] || echo -n "$_app_name: " >&2
echo -e "$1" >&2
exit "${2:-1}"
}
load_conversation() {
[ -f "$data_file" ] && cat "$data_file" || echo '{}'
}
update_conversation() {
local entry="$2" data
[[ $entry == \{* ]] || entry=$(jq -n --arg content "$entry" '{$content}')
entry=$(jq --arg role "$1" '. += {$role}' <<<"$entry")
data=$(load_conversation)
jq --argjson item "$entry" '.messages += [$item]' <<<"$data" >"$data_file"
}
save_tokens() {
local data num="$1"
[ -f "$data_file" ] && {
data=$(load_conversation)
jq --argjson tokens "$num" '.total_tokens += $tokens' <<<"$data" >"$data_file"
}
data=0
[ -f "$tokens_file" ] && data=$(cat "$tokens_file")
echo "$((data + num))" >"$tokens_file"
}
read_prompt() {
# read prompt from args first
local word accepts_props=1 props='{}' real_prompt
if [ ${#rest_args[@]} -gt 0 ]; then
# read file $prompt_file word by word, and extract words starting with '+'
for word in "${rest_args[@]}"; do
if [ $accepts_props -eq 1 ] && [ "${word:0:1}" = '+' ]; then
word="${word:1}"
# determine value's type for jq
local options=(--arg key "${word%%=*}") value="${word#*=}" arg=--arg
[[ $value =~ ^[+-]?\ ?[0-9.]+$ || $value = true || $value = false || $value == [\[\{]* ]] && arg=--argjson
options+=("$arg" value "$value")
props=$(jq "${options[@]}" '.[$key] = $value' <<<"$props")
else
real_prompt="$real_prompt $word"
accepts_props=0
fi
done
[ -n "$props" ] && echo "$props" >"$temp_dir/props"
fi
if [ -n "$real_prompt" ]; then
[ -n "$prompt_file" ] && echo "* Prompt file \`$prompt_file' will be ignored as the prompt parameters are provided." >&2
echo -n "${real_prompt:1}" >"$temp_dir/prompt"
elif [ -n "$prompt_file" ]; then
[ -f "$prompt_file" ] || raise_error "File not found: $prompt_file." 3
[[ -s $prompt_file ]] || raise_error "Empty file: $prompt_file." 4
fi
}
openai_models() {
call_api | jq
}
openai_moderations() {
local prop_file="$temp_dir/props" payload="{\"model\": \"text-moderation-latest\"}"
# overwrite default properties with user's
read_prompt
[ -f "$prop_file" ] && payload=$(jq -n --argjson payload "$payload" '$payload | . += input' <"$prop_file")
# append user's prompt to messages
local payload_file="$temp_dir/payload" input_file="$temp_dir/prompt"
[ -f "$input_file" ] || input_file="${prompt_file:-/dev/stdin}"
jq -Rs -cn --argjson payload "$payload" '$payload | .input = input' "$input_file" >"$payload_file"
call_api | jq -c '.results[]'
}
openai_images_generations() {
local prop_file="$temp_dir/props" payload="{\"n\": 1, \"size\": \"1024x1024\"}"
# overwrite default properties with user's
read_prompt
[ -f "$prop_file" ] && payload=$(jq -n --argjson payload "$payload" '$payload | . += input | . += {response_format: "url"}' <"$prop_file")
# append user's prompt to messages
local payload_file="$temp_dir/payload" input_file="$temp_dir/prompt"
[ -f "$input_file" ] || input_file="${prompt_file:-/dev/stdin}"
jq -Rs -cn --argjson payload "$payload" '$payload | .prompt = input' "$input_file" >"$payload_file"
call_api | jq -r '.data[].url'
}
openai_embeddings() {
local prop_file="$temp_dir/props" payload="{\"model\": \"text-embedding-ada-002\"}"
# overwrite default properties with user's
read_prompt
[ -f "$prop_file" ] && payload=$(jq -n --argjson payload "$payload" '$payload | . += input' <"$prop_file")
# append user's prompt to messages
local payload_file="$temp_dir/payload" input_file="$temp_dir/prompt"
[ -f "$input_file" ] || input_file="${prompt_file:-/dev/stdin}"
jq -Rs -cn --argjson payload "$payload" '$payload | .input = input' "$input_file" >"$payload_file"
call_api | jq -c
}
openai_chat_completions() {
local streaming=0
if [ -n "$dumped_file" ]; then
# only succeeds when the dumped file is not streamed
jq -er '.choices[0].message.content' <"$dumped_file" 2>/dev/null && return
streaming=1
else
local prop_file="$temp_dir/props" model payload
model=$(get_env API_MODEL)
payload="{\"model\": \"$model\", \"stream\": true}"
# overwrite default properties with user's
read_prompt
[ -f "$prop_file" ] && {
payload=$(jq -n --argjson payload "$payload" '$payload | . += input | . += {messages: []}' <"$prop_file")
}
local data
data=$(load_conversation | jq .messages)
[ "$topic" != "$default_topic" ] && {
if [ $chat_mode -eq 1 ]; then
# load all messages for chat mode
payload=$(jq --argjson messages "$data" 'setpath(["messages"]; $messages)' <<<"$payload")
else
# load only first message for non-chat mode
payload=$(jq --argjson messages "$data" 'setpath(["messages"]; [$messages[0]])' <<<"$payload")
fi
}
# append user's prompt to messages
local payload_file="$temp_dir/payload" input_file="$temp_dir/prompt"
[ -f "$input_file" ] || input_file="${prompt_file:-/dev/stdin}"
jq -Rs -cn --argjson payload "$payload" '$payload | .messages += [{role: "user", content: input}]' "$input_file" >"$payload_file"
streaming=$(jq -e 'if .stream then 1 else 0 end' <"$payload_file")
# check o1's parameters
jq -e 'select(.model | test("^o\\d")) and (.temperature or .top_p or .presence_penalty or .frequency_penalty or .logprobs or .top_logprobs or .logit_bias)' <"$payload_file" &>/dev/null && raise_error 'One or more unsupported API parameters used for model o1. See https://platform.openai.com/docs/guides/reasoning#limitations for more details.' 5
fi
local chunk reason text role fn_name
if [ $streaming -eq 1 ]; then
call_api | while read -r chunk; do
[ -z "$chunk" ] && continue
chunk=$(cut -d: -f2- <<<"$chunk" | jq '.choices[0]')
reason=$(jq -r '.finish_reason // empty' <<<"$chunk")
[[ $reason = stop || $reason = function_call ]] && break
[ -n "$reason" ] && raise_error "API error: $reason" 10
# get role and function info from the first chunk
[ -z "$role" ] && {
role=$(jq -r '.delta.role // empty' <<<"$chunk")
fn_name=$(jq -r '.delta.function_call.name // empty' <<<"$chunk")
}
# workaround: https://stackoverflow.com/a/15184414
chunk=$(
jq -r '.delta | .function_call.arguments // .content // empty' <<<"$chunk"
printf x
)
# ensure chunk is not empty
[ ${#chunk} -ge 2 ] || continue
chunk="${chunk:0:${#chunk}-2}"
text="$text$chunk"
echo -n "$chunk"
done
[ "$dry_run" -eq 0 ] && echo
else
text=$(call_api | jq -er '.choices[0].message.content')
echo "$text"
fi
# append response to topic file for chat mode
if [ "$chat_mode" -eq 1 ]; then
[ -n "$fn_name" ] && text=$(jq -n --arg name "$fn_name" --argjson arguments "${text:-\{\}}" '{function_call: {$name, $arguments}}')
update_conversation user "$prompt"
update_conversation "$role" "$text"
fi
}
# shellcheck disable=SC2120
call_api() {
# return dumped file if specified
[ -n "$dumped_file" ] && {
cat "$dumped_file"
return
}
local url="$(get_env API_ENDPOINT)/$api_name" auth="Bearer $(get_env API_KEY)"
# dry-run mode
[ "$dry_run" -eq 1 ] && {
echo "Dry-run mode, no API calls made."
echo -e "\nRequest URL:\n--------------\n$url"
echo -en "\nAuthorization:\n--------------\n"
sed -E 's/(sk-.{3}).{41}/\1****/' <<<"$auth"
[ -n "$payload_file" ] && {
echo -e "\nPayload:\n--------------"
jq <"$payload_file"
}
exit 0
} >&2
local args=("$url" --no-buffer -fsSL -H 'Content-Type: application/json' -H "Authorization: $auth")
[ -n "$payload_file" ] && args+=(-d @"$payload_file")
[ $# -gt 0 ] && args+=("$@")
[ -n "$dump_file" ] && args+=(-o "$dump_file")
curl "${args[@]}"
[ -z "$dump_file" ] || exit 0
}
create_topic() {
update_conversation system "${rest_args[*]}"
raise_error "Topic '$topic' created with initial prompt '${rest_args[*]}'" 0
}
usage() {
raise_error "OpenAI Client v$_app_version
SYNOPSIS
ABSTRACT
$_app_name [-n] [-a api_name] [-o dump_file] [INPUT...]
$_app_name -i dumped_file
DEFAULT_API ($default_api_name)
$_app_name [-c] [+property=value...] [@TOPIC] [-f file | prompt ...]
prompt
Prompt string for the request to OpenAI API. This can consist of multiple
arguments, which are considered to be separated by spaces.
-f file
A file to be read as prompt. If file is - or neither this parameter nor a prompt
is specified, read from standard input.
-c
Continues the topic, the default topic is '$default_topic'.
property=value
Overwrites default properties in payload. Prepend a plus sign '+' before property=value.
eg: +model=gpt-3.5-turbo-0301, +stream=false
TOPICS
Topic starts with an at sign '@'.
To create new topic, use \`$_app_name @new_topic initial prompt'
OTHER APIS
$_app_name -a models
GLOBAL OPTIONS
Global options apply to all APIs.
-a name
API name, default is '$default_api_name'.
-n
Dry-run mode, don't call API.
-o filename
Dumps API response to a file and exits.
-i filename
Uses specified dumped file instead of requesting API.
Any request-related arguments and user input are ignored.
--
Ignores rest of arguments, useful when unquoted prompt consists of '-'.
-h
Shows this help" 0
}
parse() {
local opt
while getopts 'v:a:f:i:o:cnh' opt; do
case "$opt" in
c)
chat_mode=1
;;
a)
api_name="$OPTARG"
;;
f)
prompt_file="$OPTARG"
[ "$prompt_file" = - ] && prompt_file=
;;
n)
dry_run=1
;;
i)
dumped_file="$OPTARG"
;;
o)
dump_file="$OPTARG"
;;
h | ?)
usage
;;
esac
done
shift "$((OPTIND - 1))"
# extract the leading topic
[[ "$1" =~ ^@ ]] && {
topic="${1#@}"
shift
}
[ $chat_mode -eq 0 ] || {
[[ -n $topic && $topic != "$default_topic" ]] || raise_error 'Topic is required for chatting.' 2
}
rest_args=("$@")
}
check_bin() {
command -v "$1" >/dev/null || raise_error "$1 not found. Use package manager (Homebrew, apt-get etc.) to install it." "${2:-1}"
}
main() {
# show compatible provider if applicable
if [ "${SUPPRESS_PROVIDER_TIPS:-0}" -eq 0 ] && [ -n "$OPENAI_COMPATIBLE_PROVIDER" ]; then
echo "OpenAI compatible provider: $OPENAI_COMPATIBLE_PROVIDER" >&2
fi
# check required config
local key
for key in API_ENDPOINT API_KEY API_MODEL; do
[ -z "$(get_env "$key")" ] && raise_error "Missing environment variable: ${provider}_$key." 1
done
parse "$@"
check_bin jq 10
mkdir -p "$OPENAI_DATA_DIR"
data_file="$OPENAI_DATA_DIR/$topic.json"
temp_dir=$(mktemp -d)
if [[ $topic == "$default_topic" || -f "$data_file" ]]; then
local fn="openai_${api_name//\//_}"
[ "$(type -t "$fn")" = function ] || raise_error "API '$api_name' is not available." 12
"$fn"
else
[ ${#rest_args[@]} -gt 0 ] || raise_error "Prompt for new topic is required" 13
create_topic
fi
}
main "$@"