[Bug]: response_format.type #344

@mr-Lime197

Description

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • I believe this is a legitimate bug, not just a question or feature request.

Describe the bug

I successfully deployed the gemma-4-e2b model in LM Studio, and chat works properly.
However, I can't generate a visualization; it fails with this error:
openai.BadRequestError: Error code: 400 - {'error': "'response_format.type' must be 'json_schema' or 'text'"}
Looking at the logs, I saw that the request sent to the model contains:
"response_format": { "type": "json_object" }
I haven't checked the other functions; they may have this problem too.
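As a sketch of a possible client-side workaround (not an official fix): since the server accepts only "json_schema" or "text", a request asking for "json_object" could be rewritten into an equivalent "json_schema" form with a permissive schema before it is sent. The exact "json_schema" payload shape below follows the OpenAI structured-outputs convention, which LM Studio is assumed to accept; the schema name is illustrative.

```python
def to_json_schema_format(response_format: dict) -> dict:
    """Rewrite an OpenAI-style 'json_object' response_format into the
    'json_schema' form, using a permissive schema that allows any JSON
    object. 'text' and 'json_schema' formats pass through unchanged."""
    if response_format.get("type") != "json_object":
        return response_format
    return {
        "type": "json_schema",
        "json_schema": {
            # Hypothetical schema name; LM Studio's expected shape is assumed
            # to match OpenAI's structured-outputs convention.
            "name": "any_json_object",
            "schema": {"type": "object"},
        },
    }
```

Applying this to the request body from the logs, `{"type": "json_object"}` would be sent as a `json_schema` request instead, which should avoid the 400 response if the assumption about the accepted shape holds.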

Steps to reproduce

No response

Expected Behavior

The visualization function works properly.

Related Module

API/Backend

Configuration Used

No response

Logs and screenshots

No response

Additional Information

  • DeepTutor Version:
  • Operating System:
  • Python Version:
  • Node.js Version:
  • Browser (if applicable):
  • Related Issues:

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests