
Commit 45b31a9

feat(adapters): respect OLLAMA_HOST env var for remote connections (#2878)

1 parent: 6c423e4

4 files changed (+46 −13)

doc/codecompanion.txt (19 additions, 11 deletions)
@@ -1,4 +1,4 @@
-*codecompanion.txt*          For NVIM v0.11          Last change: 2026 March 07
+*codecompanion.txt*          For NVIM v0.11          Last change: 2026 March 09
 
 ==============================================================================
 Table of Contents                           *codecompanion-table-of-contents*
@@ -1327,9 +1327,10 @@ The configuration for both types of adapters is exactly the same, however they
 sit within their own tables (`adapters.http.*` and `adapters.acp.*`) and have
 different options available. HTTP adapters use `models` to allow users to
 select the specific LLM they’d like to interact with. ACP adapters use
-`commands` to allow users to customize their interaction with agents (e.g.
-enabling `yolo` mode). As there is a lot of shared functionality between the
-two adapters, it is recommend that you read this page alongside the ACP one.
+`commands` to allow users to customize their interaction with agents
+(e.g. enabling `yolo` mode). As there is a lot of shared functionality between
+the two adapters, it is recommend that you read this page alongside the ACP
+one.
 
 
 CHANGING THE DEFAULT ADAPTER ~
@@ -1383,7 +1384,7 @@ the adapter’s URL, headers, parameters and other fields at runtime.
 <https://github.com/olimorris/codecompanion.nvim/discussions/601>
 Supported `env` value types: - **Plain environment variable name (string)**: if
 the value is the name of an environment variable that has already been set
-(e.g.`"HOME"` or `"GEMINI_API_KEY"`), the plugin will read the value. -
+(e.g. `"HOME"` or `"GEMINI_API_KEY"`), the plugin will read the value. -
 **Command (string prefixed with cmd:)**: any value that starts with `cmd:` will
 be executed via the shell. Example: `"cmd:op read
 op://personal/Gemini/credential --no-newline"`. - **Function**: you can provide
@@ -1560,8 +1561,15 @@ LLAMA.CPP WITH --REASONING-FORMAT DEEPSEEK
 
 OLLAMA (REMOTELY)
 
-To use Ollama remotely, change the URL in the env table, set an API key and
-pass it via an "Authorization" header:
+The simplest way to connect to a remote Ollama instance is to set the
+`OLLAMA_HOST` environment variable (the same variable used by the Ollama CLI):
+
+>bash
+    export OLLAMA_HOST="http://192.168.1.100:11434"
+<
+
+Alternatively, configure it directly in your setup using `extend()`. If you
+need authentication, set an API key and pass it via an "Authorization" header:
 
 >lua
 require("codecompanion").setup({
@@ -3384,7 +3392,7 @@ The fastest way to copy an LLM’s code output is with `gy`. This will yank the
 nearest codeblock.
 
 
-APPLYING AN LLM’S EDITS TO A BUFFER OR FILE ~
+APPLYING AN LLMS EDITS TO A BUFFER OR FILE ~
 
 The |codecompanion-usage-chat-buffer-agents-tools-files| tool, combined with
 the |codecompanion-usage-chat-buffer-editor-context.html-buffer| editor context
@@ -4222,7 +4230,7 @@ message to the LLM.
 
 [!IMPORTANT] With the exception of `#{buffer}` and `#{buffers}`, editor context
 captures a point-in-time snapshot when your message is sent. If the underlying
-data changes (e.g.new diagnostics, a different quickfix list), simply use the
+data changes (e.g. new diagnostics, a different quickfix list), simply use the
 context again in a new message to share the latest state.
 
 #BUFFER ~
@@ -5118,7 +5126,7 @@ These handlers manage tool/function calling:
 as a great reference to understand how they’re working with the output of the
 API
 
-OPENAI’S API OUTPUT
+OPENAIS API OUTPUT
 
 If we reference the OpenAI documentation
 <https://platform.openai.com/docs/guides/text-generation/chat-completions-api>
@@ -7024,7 +7032,7 @@ tool to function. In the case of Anthropic, we insert additional headers.
 <
 
 Some adapter tools can be a `hybrid` in terms of their implementation. That is,
-they’re an adapter tool that requires a client-side component (i.e.a
+they’re an adapter tool that requires a client-side component (i.e. a
 built-in tool). This is the case for the
 |codecompanion-usage-chat-buffer-agents-tools-memory| tool from Anthropic. To
 allow for this, ensure that the tool definition in `available_tools` has

doc/configuration/adapters-http.md (7 additions, 1 deletion)

@@ -362,7 +362,13 @@ require("codecompanion").setup({
 
 ### Ollama (remotely)
 
-To use Ollama remotely, change the URL in the env table, set an API key and pass it via an "Authorization" header:
+The simplest way to connect to a remote Ollama instance is to set the `OLLAMA_HOST` environment variable (the same variable used by the Ollama CLI):
+
+```bash
+export OLLAMA_HOST="http://192.168.1.100:11434"
+```
+
+Alternatively, configure it directly in your setup using `extend()`. If you need authentication, set an API key and pass it via an "Authorization" header:
 
 ```lua
 require("codecompanion").setup({

lua/codecompanion/adapters/http/ollama/init.lua (3 additions, 1 deletion)

@@ -26,7 +26,9 @@ return {
     ["Content-Type"] = "application/json",
   },
   env = {
-    url = "http://localhost:11434",
+    url = function()
+      return os.getenv("OLLAMA_HOST") or "http://localhost:11434"
+    end,
   },
   handlers = {
     setup = function(self)
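The new `url` value is plain Lua's nil-coalescing idiom: `os.getenv()` returns `nil` when the variable is unset, so `or` falls through to the local default. A minimal standalone sketch of the same pattern (the function name here is illustrative, not part of the plugin):

```lua
-- Sketch of the fallback used in the diff above: os.getenv() yields
-- nil when OLLAMA_HOST is unset, so `or` supplies the local default.
local function resolve_ollama_url()
  return os.getenv("OLLAMA_HOST") or "http://localhost:11434"
end

print(resolve_ollama_url())
```

Making `url` a function rather than a string also means the environment variable is read lazily, at the time the adapter resolves its env vars, not once at module load.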

tests/adapters/http/test_ollama.lua (17 additions, 0 deletions)

@@ -343,4 +343,21 @@ T["Ollama adapter"]["No Streaming"]["can output for the inline assistant"] = fun
   h.eq("Dynamic Scripting language", adapter.handlers.inline_output(adapter, json).output)
 end
 
+T["Ollama adapter"]["OLLAMA_HOST"] = new_set()
+
+T["Ollama adapter"]["OLLAMA_HOST"]["uses OLLAMA_HOST when set"] = function()
+  local adapter_utils = require("codecompanion.utils.adapters")
+  vim.env.OLLAMA_HOST = "http://192.168.1.100:11434"
+  adapter_utils.get_env_vars(adapter)
+  h.eq("http://192.168.1.100:11434", adapter.env_replaced.url)
+  vim.env.OLLAMA_HOST = nil
+end
+
+T["Ollama adapter"]["OLLAMA_HOST"]["fallback to localhost when OLLAMA_HOST is not set"] = function()
+  local adapter_utils = require("codecompanion.utils.adapters")
+  vim.env.OLLAMA_HOST = nil
+  adapter_utils.get_env_vars(adapter)
+  h.eq("http://localhost:11434", adapter.env_replaced.url)
+end
+
 return T
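The tests above use `vim.env` to toggle the variable, which works because they run inside Neovim. Standalone Lua has no portable way to set environment variables, so a self-contained check of the same fallback logic can inject the lookup instead (a hypothetical refactor for illustration, not how the adapter is actually written):

```lua
-- Inject the env lookup so both branches of the fallback can be
-- exercised without mutating the real process environment.
local function make_url_resolver(getenv)
  return function()
    return getenv("OLLAMA_HOST") or "http://localhost:11434"
  end
end

-- Simulated lookups: one "set", one "unset".
local remote = make_url_resolver(function() return "http://192.168.1.100:11434" end)
local unset = make_url_resolver(function() return nil end)

print(remote()) -- http://192.168.1.100:11434
print(unset())  -- http://localhost:11434
```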
