fix(llm): ensure store=False is passed to Codex Responses API (#7089)
Forces store: false into the extra_body payload for Codex-style models so
that LiteLLM passes it through to the ChatGPT Responses API backend,
fixing the BadRequestError.

Fixes #7056. Original investigation and first PR by @Darshan174 (#7065).

Co-authored-by: Darshan174 <Darshan002321@gmail.com>
@@ -1959,6 +1959,10 @@ class LiteLLMProvider(LLMProvider):
         if self._codex_backend:
             kwargs.pop("max_tokens", None)
             kwargs.pop("stream_options", None)
+            # Pass store directly to OpenAI in case litellm drops it as unknown
+            if "extra_body" not in kwargs:
+                kwargs["extra_body"] = {}
+            kwargs["extra_body"]["store"] = False
 
         request_summary = _summarize_request_for_log(kwargs)
         logger.debug(
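The added branch is pure dict manipulation, so it can be exercised in isolation. Below is a minimal sketch; `prepare_codex_kwargs` is a hypothetical standalone helper (not part of the patch) that mirrors the patched logic, assuming `kwargs` is the keyword dict later handed to LiteLLM:

```python
def prepare_codex_kwargs(kwargs: dict) -> dict:
    """Hypothetical mirror of the patched Codex branch: strip params the
    Responses API rejects and force store=False via extra_body so LiteLLM
    forwards it verbatim instead of dropping it as an unknown parameter."""
    kwargs.pop("max_tokens", None)
    kwargs.pop("stream_options", None)
    # setdefault covers both the missing-key and pre-existing extra_body cases
    kwargs.setdefault("extra_body", {})
    kwargs["extra_body"]["store"] = False
    return kwargs


if __name__ == "__main__":
    out = prepare_codex_kwargs({"model": "gpt-5-codex", "max_tokens": 256})
    print(out)  # max_tokens removed, extra_body carries store: False
```

Routing the flag through `extra_body` (rather than as a top-level keyword) matters because LiteLLM forwards `extra_body` contents to the provider payload as-is, which is what makes the fix survive the LiteLLM layer.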