⚡ Accelerate chat and IDE workflows with a proxy for llama.cpp, managing slots and cached context for efficient, low-latency interactions.
Python · Updated Apr 23, 2026
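The slot-and-context-cache idea behind such a proxy can be sketched briefly. The snippet below is a hypothetical illustration, not the repository's actual code: it maps a request's prompt prefix to a server slot so that repeated chat or IDE requests land on a slot whose cached context already matches, with least-recently-used eviction when all slots are busy. The class name `SlotCache` and the fixed 64-character prefix key are assumptions for the sketch.

```python
from collections import OrderedDict

class SlotCache:
    """Hypothetical sketch: route requests to llama.cpp server slots by
    prompt prefix so cached context is reused (LRU eviction of slots)."""

    def __init__(self, num_slots: int):
        self.num_slots = num_slots
        self.slots = OrderedDict()  # prefix key -> slot id, in LRU order

    def acquire(self, prompt: str) -> int:
        # Key on a short prompt prefix; a real proxy would match the
        # longest shared token prefix against each slot's cached context.
        key = prompt[:64]
        if key in self.slots:
            self.slots.move_to_end(key)   # cache hit: reuse the same slot
            return self.slots[key]
        if len(self.slots) < self.num_slots:
            slot_id = len(self.slots)     # a free slot is still available
        else:
            # All slots taken: evict the least-recently-used mapping.
            _, slot_id = self.slots.popitem(last=False)
        self.slots[key] = slot_id
        return slot_id

# Two requests sharing a prefix are routed to the same slot, so the
# server can skip re-processing the cached portion of the context.
cache = SlotCache(num_slots=2)
first = cache.acquire("You are a helpful assistant. Hello!")
second = cache.acquire("You are a helpful assistant. Hello!")
assert first == second
```

Routing on prefix identity is what makes the latency win possible: llama.cpp keeps per-slot KV caches, so hitting the same slot with the same prefix avoids re-evaluating those tokens.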
Web proxy caching for Node [Not Maintained]