fix: respect enable_encrypted_reasoning flag in Responses API #1208
base: main
Conversation
The `select_responses_options` function was always adding `reasoning.encrypted_content` to the include list for stateless requests, completely ignoring the `LLM.enable_encrypted_reasoning` flag. This caused errors when using models that don't support encrypted content: "Encrypted content is not supported with this model."

Changes:
- Only include encrypted reasoning when `enable_encrypted_reasoning=True`
- Add a test to verify the flag is respected when set to `False`
- Default behavior (`False`) now prevents encrypted content from being added

Fixes: https://github.com/OpenHands/OpenHands/issues/11406
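A minimal sketch of the patched logic, assuming a simplified stand-in for the real `LLM` pydantic model and a reduced `select_responses_options` that only builds the `include` list (names other than those two are hypothetical):

```python
from dataclasses import dataclass


# Hypothetical stand-in for the real LLM config (a pydantic model upstream).
@dataclass
class LLM:
    enable_encrypted_reasoning: bool = False


def select_responses_options(llm: LLM) -> dict:
    """Sketch: build Responses API options for a stateless request."""
    include_list: list[str] = []
    # Before the fix, this append ran unconditionally; now it is
    # gated on the model's opt-in flag.
    if llm.enable_encrypted_reasoning:
        if "reasoning.encrypted_content" not in include_list:
            include_list.append("reasoning.encrypted_content")
    return {"include": include_list}
```

With the default of `False`, models that reject encrypted content never see `reasoning.encrypted_content` in the request.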
    # Only add encrypted reasoning if the LLM has enable_encrypted_reasoning=True
    if getattr(llm, "enable_encrypted_reasoning", False):
        if "reasoning.encrypted_content" not in include_list:
            include_list.append("reasoning.encrypted_content")
Suggested change:
-    if getattr(llm, "enable_encrypted_reasoning", False):
+    if llm.enable_encrypted_reasoning:
The LLM class is a pydantic model, so the field always exists, right?
Maybe we should make it True by default, but, as your PR does, respect it if set to False
    enable_encrypted_reasoning: bool = Field(
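A hedged sketch of the suggested default flip, assuming the field lives on a pydantic `BaseModel`; the description string here is illustrative, not the upstream wording:

```python
from pydantic import BaseModel, Field


class LLM(BaseModel):
    # Suggested change: default to True, but still honored when set to False.
    enable_encrypted_reasoning: bool = Field(
        default=True,
        description="Include reasoning.encrypted_content in Responses API calls.",
    )
```

Because pydantic declares the field with a default, it always exists on instances, which is why a plain `llm.enable_encrypted_reasoning` check suffices over `getattr` with a fallback.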
@OpenHands Let's make enable_encrypted_reasoning True by default, but make sure in this PR that we do respect it if set to False.
Run precommit and tests before pushing, then push to PR.
enyst left a comment
Thank you! Left a small question, but this looks good to me!
[Automatic Post]: It has been a while since there was any activity on this PR. @mr-karan, are you still working on it? If so, please go ahead; if not, then please request review, close it, or request that someone else follow up.