fix: change prompt_cache_retention literal from in-memory to in_memory#2898

Open
thakoreh wants to merge 1 commit into openai:main from thakoreh:fix-prompt-cache-retention-literal

Conversation

@thakoreh

Fixes #2883

The SDK type annotations declared `prompt_cache_retention` as `Literal["in-memory", "24h"]` (hyphen), but the API expects `"in_memory"` (underscore). Passing the typed value as-is therefore causes a `400 Bad Request`.

Changed all occurrences in 9 files:

  • src/openai/types/chat/completion_create_params.py
  • src/openai/types/responses/response.py
  • src/openai/types/responses/response_create_params.py
  • src/openai/types/responses/responses_client_event.py
  • src/openai/types/responses/responses_client_event_param.py
  • src/openai/resources/chat/completions/completions.py
  • src/openai/resources/responses/responses.py
  • tests/api_resources/chat/test_completions.py
  • tests/api_resources/test_responses.py

Commit message:

> The API expects 'in_memory' (underscore) but the type annotations
> declared 'in-memory' (hyphen), causing 400 errors when using the
> typed value.
>
> Fixes openai#2883
@thakoreh thakoreh requested a review from a team as a code owner February 26, 2026 02:20

Development

Successfully merging this pull request may close these issues.

prompt_cache_retention type declares "in-memory" but API expects "in_memory"
