Pinned Repositories
- **aroc** (Public · Python): Agentic Read-Only Chat, a rich terminal chat interface powered by a locally installed llama.cpp and g023/g023-Qwen3.5-9B-GGUF:IQ2_M.
- **harnessharvest** (Public): A self-learning, self-correcting, LLM-powered harness creation and management system with FAISS-powered RAG, sandboxed execution, and autonomous improvement modes. Powered by Ollama and offline mod…
- **localmodelrouter** (Public · Python · ★ 2): A local LLM server that provides drop-in API compatibility with both Ollama and OpenAI, using your locally installed [llama.cpp](https://github.com/ggerganov/llama.cpp)'s `llama-server` as the infer…
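Because localmodelrouter advertises drop-in OpenAI API compatibility, a client should be able to send a standard chat-completions request body to it. The sketch below builds such a payload; the base URL, port, and model name are assumptions for illustration, not the project's documented defaults.

```python
import json

# Hypothetical endpoint: the actual host/port depend on how you start the router.
BASE_URL = "http://localhost:8000"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completions request body.

    Any OpenAI-compatible client (or plain HTTP POST) could send this
    body to f"{BASE_URL}/v1/chat/completions"; the router would then
    hand the request to llama-server for inference.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("Hello from a drop-in OpenAI client")
print(json.dumps(payload, indent=2))
```

Because the wire format matches OpenAI's, existing SDKs should only need their base URL pointed at the local server.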
- **turboquant** (Public · Python · ★ 4): Standalone TurboQuant KV Cache Inference for https://huggingface.co/g023/Qwen3-1.77B-g023.
- **g023-OllamaMan** (Public · PHP · ★ 7): A concept Ollama server management OS that runs in a web browser.
