EXEC:PROMPT(1)               User Commands              EXEC:PROMPT(1)

NAME

execprompt - terminal-native LLM client for local and remote models

SYNOPSIS

execprompt [--local] [--cloud] [--offline]
execprompt [--model=MODEL] [--theme=THEME]
execprompt --help | --version

DESCRIPTION

[SYS]
Modern AI interfaces require cloud connectivity, force mouse-driven GUIs, and stream your conversations to third-party logs. The terminal has been abandoned in favor of rounded corners and animated gradients.
[USR]
I need to run local models with a keyboard-first workflow. No telemetry. No hand-holding. Just stdin, processing, and stdout.
[EXEC:PROMPT]
Execute Your Thoughts.

ExecPrompt multiplexes local Ollama instances with cloud APIs in a TTY-optimized interface. Function over form, executed with style.
  • Local-first: Ollama at localhost:11434
  • Cloud-capable: Ollama Cloud API support
  • Offline-ready: Full functionality without network
  • Theme system: 5 cyberpunk palettes (P1-P5)
  • Export/import: JSON conversation backups
  • No tracking: Your prompts stay private
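
The local-first default can be sanity-checked from the shell before launching. The sketch below assumes only a stock Ollama daemon on the default port; /api/tags is Ollama's standard endpoint listing locally installed models.

[CODE]
# Check whether an Ollama daemon is listening where --local expects it.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
    STATUS="reachable"
else
    STATUS="not reachable"
fi
echo "ollama at localhost:11434: $STATUS"

If the daemon is not reachable, start it with `ollama serve` before running execprompt --local.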

OPTIONS

--local     Connect to the local Ollama instance at localhost:11434
--cloud     Authenticate to the Ollama Cloud API
--offline   Block all network requests
--model     Select the model to run (e.g. llama3.2)
--theme     Set the color palette (green|amber|white|cyan|synthwave)
--export    Write conversation history to stdout as JSON

EXAMPLES

[CODE]
# Start with local Ollama
execprompt --local --model=llama3.2

# Export conversation history
execprompt --export > backup.json

# Switch to amber theme
execprompt --theme=amber

FILES

~/.config/execprompt/config.toml
~/.local/share/execprompt/history
~/.local/share/execprompt/themes/
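
A minimal config.toml sketch for the file above. The keys shown (model, theme, offline) are illustrative assumptions mirroring the command-line flags, not a documented schema.

[CODE]
# ~/.config/execprompt/config.toml
# Hypothetical keys; the real schema may differ.
model = "llama3.2"
theme = "amber"
offline = false

Command-line flags would typically override values set here.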

SEE ALSO

install(1), privacy(7), changelog(7), ollama(1)