
[Feature]: Add ollama-cloud: configured model ollama-cloud/kimi-k2.5:cloud is not valid #3515

@Pivert

Description

Prerequisites

  • I will write this issue in English (see our Language Policy)
  • I have searched existing issues and discussions to avoid duplicates
  • This feature request is specific to oh-my-opencode (not OpenCode core)
  • I have read the documentation or asked an AI coding agent with this project's GitHub URL loaded and couldn't find the answer

Problem Description

  • When using Kimi K2.5 for Sisyphus, we get the warning: Agent Sisyphus - Ultraworker's configured model ollama-cloud/kimi-k2.5:cloud is not valid, even though Kimi K2.5 is already a fallback model for Sisyphus.
  • ollama-cloud/glm-5.1:cloud should also be a fallback model
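For reference, a minimal sketch of what registering ollama-cloud as a custom provider might look like in an opencode.json. This is an assumption, not a confirmed configuration: the npm package, baseURL, and apiKey placeholder shown here are illustrative and would need to be checked against the actual Ollama cloud endpoint and opencode provider schema.

```json
{
  "provider": {
    "ollama-cloud": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://ollama.com/v1",
        "apiKey": "{env:OLLAMA_API_KEY}"
      },
      "models": {
        "kimi-k2.5:cloud": { "name": "Kimi K2.5 (cloud)" },
        "glm-5.1:cloud": { "name": "GLM-5.1 (cloud)" }
      }
    }
  }
}
```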

Proposed Solution

I think it boils down to:

  • Registering ollama-cloud as a distinct provider (unrelated to ollama / local inference)
  • Properly handling the :cloud suffix present on all ollama-cloud models
  • Matching glm-5.1, and providing Sisyphus integration & prompt
  • Ensuring we no longer see messages such as:
    • Agent Sisyphus - Ultraworker's configured model ollama-cloud/kimi-k2.5:cloud is not valid, even though Kimi K2.5 is already a fallback model for Sisyphus.
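The :cloud suffix handling above could be sketched roughly as follows. This is a hypothetical helper, not code from the oh-my-opencode codebase; the names normalizeModelId, isValidCloudModel, and KNOWN_CLOUD_MODELS are all illustrative assumptions.

```typescript
// Illustrative list of ollama-cloud model bases (assumption, not the real registry).
const KNOWN_CLOUD_MODELS = new Set(["kimi-k2.5", "glm-5.1"]);

// Strip the ":cloud" suffix from ollama-cloud model IDs so validation
// compares against the bare model name.
// e.g. "ollama-cloud/kimi-k2.5:cloud" -> "ollama-cloud/kimi-k2.5"
function normalizeModelId(id: string): string {
  const [provider, rest] = id.split("/", 2);
  if (provider !== "ollama-cloud" || rest === undefined) return id;
  const model = rest.endsWith(":cloud")
    ? rest.slice(0, -":cloud".length)
    : rest;
  return `${provider}/${model}`;
}

// Validate a configured model against the known ollama-cloud models.
function isValidCloudModel(id: string): boolean {
  const [provider, model] = normalizeModelId(id).split("/", 2);
  return provider === "ollama-cloud" && KNOWN_CLOUD_MODELS.has(model ?? "");
}
```

With a normalization step like this, ollama-cloud/kimi-k2.5:cloud and ollama-cloud/kimi-k2.5 would validate identically, which should make the "is not valid" warning disappear for suffixed model IDs.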

Alternatives Considered

There's no direct alternative: just accept the permanent warnings and work like that. They are non-blocking warnings.

Doctor Output (Optional)

bunx not used

Additional Context

No response

Feature Type

Other

Contribution

  • I'm willing to submit a PR for this feature
  • I can help with testing
  • I can help with documentation

Labels

enhancement — New feature or request