Fix #180: Handle unsupported response_format parameter for non-OpenAI providers #186
Open
hkc5 wants to merge 1 commit into 666ghj:main
Conversation
The LLM client now gracefully handles providers (like Groq) that don't
support the OpenAI-specific response_format={"type": "json_object"} parameter.
Changes:
- llm_client.py: Added try/except with fallback when response_format fails
- simulation_config_generator.py: Same fallback logic for direct API calls
- oasis_profile_generator.py: Same fallback logic for direct API calls
When a 400/500 error related to response_format is detected, the code
automatically retries without the parameter and relies on the system
prompt to enforce JSON output format.
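For reference, a minimal sketch of the fallback pattern (not the actual diff; chat_json is a hypothetical helper, and it assumes an OpenAI-compatible SDK client, which Groq also exposes):

```python
import openai

def chat_json(client: openai.OpenAI, model: str, messages: list[dict]) -> str:
    """Hypothetical helper illustrating the PR's fallback approach."""
    try:
        # First attempt: ask for OpenAI's JSON mode.
        resp = client.chat.completions.create(
            model=model,
            messages=messages,
            response_format={"type": "json_object"},
        )
    except openai.APIError as exc:
        # Providers that lack JSON mode may reject the parameter with a
        # 400/500. If the error mentions response_format, retry without
        # it and rely on the system prompt to enforce JSON output.
        if "response_format" not in str(exc):
            raise
        resp = client.chat.completions.create(model=model, messages=messages)
    return resp.choices[0].message.content
```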
This is critical; hoping it gets fixed soon.
Thanks, I've verified that it really works. I can finally use my own API.
Author
This PR fixes the 500 error when using non-OpenAI LLM providers (like Groq). The issue was the hardcoded response_format={"type": "json_object"} parameter. The fix adds a graceful fallback: when a provider doesn't support response_format, it retries without it while relying on the system prompt for JSON enforcement. A user has already verified this works (see comments above). Would appreciate a review when you have time! Thanks
Problem
Ontology generation fails with a 500 error when using LLM providers like Groq that don't support the OpenAI-specific response_format={"type": "json_object"} parameter.
Root Cause
The code uses response_format={"type": "json_object"}, which is an OpenAI-specific feature. When using Groq or other providers that don't support this parameter, the API call fails with a 500 error.
Solution
Added graceful fallback logic in three files: llm_client.py, simulation_config_generator.py, and oasis_profile_generator.py.
When a 400/500 error related to response_format is detected, the code automatically retries without the parameter and relies on the system prompt to enforce JSON output format.
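Since dropping the parameter leaves JSON formatting entirely to prompting, here is a sketch of the kind of system prompt and defensive parsing this relies on (illustrative only; SYSTEM_PROMPT and parse_json_reply are assumptions, not the PR's actual wording):

```python
import json

SYSTEM_PROMPT = (
    "You are a JSON generator. Respond with a single valid JSON object "
    "and nothing else: no prose, no markdown code fences."
)

def parse_json_reply(text: str) -> dict:
    """Defensively parse a model reply that should be JSON; strips the
    markdown fences some models add when native JSON mode is unavailable."""
    cleaned = text.strip()
    if cleaned.startswith("```"):
        # Drop a leading ```json fence and the trailing ``` fence.
        cleaned = cleaned.split("```", 2)[1]
        cleaned = cleaned.removeprefix("json").strip()
    return json.loads(cleaned)
```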
Testing
Fixes #180