Do you need to file an issue?
Describe the bug
I successfully deployed the model gemma-4-e2b in LM Studio, and chat works properly.
However, visualization fails with this error:
openai.BadRequestError: Error code: 400 - {'error': "'response_format.type' must be 'json_schema' or 'text'"}
Looking at the logs, I saw that the request sent to the model contains:
"response_format": { "type": "json_object" }
I haven't checked the other functions; they may have the same problem.
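A minimal workaround sketch for this mismatch, assuming the request payload can be intercepted before it is sent (the helper name `adapt_response_format` is hypothetical, not part of DeepTutor or the OpenAI SDK): it rewrites the unsupported `json_object` response format into a permissive `json_schema` one that LM Studio reportedly accepts.

```python
def adapt_response_format(payload: dict) -> dict:
    """Rewrite a 'json_object' response_format into a 'json_schema' one.

    LM Studio rejects response_format.type == 'json_object' with a 400,
    accepting only 'json_schema' or 'text'. This hypothetical helper swaps
    in a schema that matches any JSON object, preserving the intent of
    'json_object' (force the model to emit JSON).
    """
    fmt = payload.get("response_format")
    if fmt and fmt.get("type") == "json_object":
        payload = dict(payload)  # copy so the caller's dict is untouched
        payload["response_format"] = {
            "type": "json_schema",
            "json_schema": {
                "name": "any_json",
                "schema": {"type": "object"},  # accept any JSON object
            },
        }
    return payload
```

Payloads whose `response_format.type` is already `text` or `json_schema` pass through unchanged, so the helper is safe to apply to every outgoing request.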
Steps to reproduce
No response
Expected Behavior
The visualization function works properly.
Related Module
API/Backend
Configuration Used
No response
Logs and screenshots
No response
Additional Information
- DeepTutor Version:
- Operating System:
- Python Version:
- Node.js Version:
- Browser (if applicable):
- Related Issues: