A terminal-based AI assistant that provides both interactive REPL and one-shot modes for interacting with LLM providers. Built in Go using the Bubble Tea TUI framework.
- REPL Mode: Interactive terminal interface with conversation history
- One-shot Mode: Single command execution, perfect for scripting
- Multiple LLM Providers: Currently supports LMStudio (OpenAI-compatible)
- Clean Architecture: Redux-like state management with provider pattern
- Thread-safe: Concurrent operations with proper synchronization
Download the latest binary from the releases page and place it in your PATH.
```bash
git clone https://github.com/adamveld12/tai.git
cd tai
make build
sudo mv build/tai /usr/local/bin/
```

REPL Mode (Interactive):
```bash
tai
```

One-shot Mode:
```bash
tai --oneshot "What's the weather like?"
echo "Explain this code" | tai --oneshot
tai --oneshot "Summarize this:" < file.txt
```

- Go 1.24.4 or later
- LMStudio running on `localhost:1234` (default provider)
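One-shot mode accepts a prompt as an argument, piped stdin, or both. A minimal sketch of how piped input could be detected and merged with an argument prompt — the function name and merging behavior are illustrative, not TAI's actual implementation:

```go
package main

import (
	"fmt"
	"io"
	"os"
	"strings"
)

// readPrompt combines an argument prompt with piped stdin, if any.
// Illustrative sketch only, not TAI's actual code.
func readPrompt(arg string) (string, error) {
	stat, err := os.Stdin.Stat()
	if err != nil {
		return arg, err
	}
	// When stdin is not a character device, data was piped or redirected in.
	if stat.Mode()&os.ModeCharDevice == 0 {
		data, err := io.ReadAll(os.Stdin)
		if err != nil {
			return arg, err
		}
		piped := strings.TrimSpace(string(data))
		if piped == "" {
			return arg, nil
		}
		if arg == "" {
			return piped, nil
		}
		// Both present: treat the argument as the instruction,
		// the piped text as the material it applies to.
		return arg + "\n\n" + piped, nil
	}
	return arg, nil
}

func main() {
	prompt, err := readPrompt(strings.Join(os.Args[1:], " "))
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(prompt)
}
```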
```bash
# Clone the repository
git clone https://github.com/adamveld12/tai.git
cd tai

# Install dependencies
make deps

# Build the project
make build

# Run the application
make run

# Run tests
make test

# Run with race detection
make test-race

# Check code quality
make check
```

```bash
make build       # Build binary to ./build/tai
make run         # Run the application
make test        # Run tests with coverage
make test-race   # Run tests with race detection
make check       # Run all quality checks (fmt, vet, lint, test)
make clean       # Clean build artifacts
make install     # Install to $GOPATH/bin
```

- Download and install LMStudio
- Load your preferred model
- Start the local server (default: `http://localhost:1234/v1`)
- Run TAI; it will connect automatically
Alternative: Use the included helper:
```bash
make lmstudio    # Start LMStudio server
```

TAI follows a layered architecture with Redux-like state management:
```
cmd/tai/main.go   → Entry point and mode selection
internal/cli/     → Configuration and one-shot handler
internal/ui/      → Bubble Tea UI components (REPL)
internal/state/   → Redux-like state management
internal/llm/     → Provider interface and implementations
```
- State Management: Immutable state updates with thread-safe dispatching
- Provider Pattern: Pluggable LLM providers implementing a common interface
- Mode Separation: REPL for interactive use, one-shot for scripting
- UI Components: Modular Bubble Tea components with clean separation
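The Redux-like, thread-safe pattern described above can be sketched as a store that applies pure reducers to state snapshots under a mutex. Types and action names below are hypothetical, not the actual `internal/state` API:

```go
package main

import (
	"fmt"
	"sync"
)

// State is treated as immutable; reducers return copies, never mutate.
type State struct {
	Messages []string
}

// Action describes a requested state transition.
type Action struct {
	Type    string
	Payload string
}

// reduce is a pure function: (State, Action) -> new State.
func reduce(s State, a Action) State {
	switch a.Type {
	case "ADD_MESSAGE":
		// Copy the slice so the previous snapshot stays untouched.
		msgs := append(append([]string{}, s.Messages...), a.Payload)
		return State{Messages: msgs}
	case "CLEAR":
		return State{}
	}
	return s
}

// Store serializes dispatches with a mutex for thread safety.
type Store struct {
	mu    sync.Mutex
	state State
}

func (st *Store) Dispatch(a Action) {
	st.mu.Lock()
	defer st.mu.Unlock()
	st.state = reduce(st.state, a)
}

func (st *Store) State() State {
	st.mu.Lock()
	defer st.mu.Unlock()
	return st.state
}

func main() {
	s := &Store{}
	s.Dispatch(Action{Type: "ADD_MESSAGE", Payload: "hello"})
	fmt.Println(len(s.State().Messages))
}
```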
TAI uses sensible defaults but can be configured:
- LLM Provider: Currently LMStudio at `http://localhost:1234/v1`
- Models: Automatically detects available models from the provider
- REPL Commands: `:help`, `:clear`, `:quit`
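Colon-prefixed commands like these are typically dispatched with a simple switch before input reaches the LLM; a hedged sketch of how that might look (handler behavior is illustrative, not TAI's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// handleCommand interprets a colon-prefixed REPL command.
// Returns whether the input was a command and whether to exit.
// Illustrative only, not TAI's actual dispatcher.
func handleCommand(input string) (handled, quit bool) {
	if !strings.HasPrefix(input, ":") {
		return false, false // regular prompt text, forward to the LLM
	}
	switch strings.TrimSpace(input) {
	case ":help":
		fmt.Println("commands: :help :clear :quit")
	case ":clear":
		fmt.Println("conversation cleared")
	case ":quit":
		return true, true
	default:
		fmt.Println("unknown command")
	}
	return true, false
}

func main() {
	_, quit := handleCommand(":quit")
	fmt.Println(quit)
}
```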
The project emphasizes production confidence with comprehensive testing:
```bash
make test            # Run all tests with coverage
make test-race       # Run with race detection
make test-coverage   # Generate HTML coverage report
```

Current coverage: 92.7% (LLM providers), 80% (state management)
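Coverage like this is usually reached with table-driven tests, the idiomatic Go style. A generic sketch against a hypothetical helper (`normalizePrompt` is not part of TAI; it stands in for any function under test):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizePrompt is a hypothetical helper under test: it collapses
// all runs of whitespace to single spaces and trims the ends.
func normalizePrompt(s string) string {
	return strings.Join(strings.Fields(s), " ")
}

func main() {
	// Table-driven cases: one row per scenario, easy to extend.
	cases := []struct{ name, in, want string }{
		{"collapses spaces", "a  b", "a b"},
		{"trims ends", "  hi  ", "hi"},
		{"empty input", "", ""},
	}
	for _, c := range cases {
		if got := normalizePrompt(c.in); got != c.want {
			fmt.Printf("FAIL %s: got %q want %q\n", c.name, got, c.want)
			continue
		}
		fmt.Printf("PASS %s\n", c.name)
	}
}
```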
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make changes and add tests
- Run quality checks: `make check`
- Submit a pull request
- Additional LLM providers (OpenAI, Anthropic, Ollama)
- Tool system for file operations and shell execution
- Enhanced logging and formatting
- Configuration file support
- CI/CD and automated releases
MIT License - see LICENSE.md file for details.
- Bubble Tea - TUI framework
- LMStudio - Local LLM runtime
- Lipgloss - Terminal styling