
fix(openrouter): support monitoring cache tokens / reasoning tokens on OpenRouter response#992

Open
kargnas wants to merge 1 commit into prism-php:main from kargnas:fix/openrouter-cache-usage

Conversation


@kargnas kargnas commented Mar 28, 2026

Currently, there is no way to tell how many tokens were cache reads or cache writes.

1. I added two cache token detail fields:

Usage object fields

The usage object in API responses includes detailed cache metrics in the prompt_tokens_details field:

{
  "usage": {
    "prompt_tokens": 10339,
    "completion_tokens": 60,
    "total_tokens": 10399,
    "prompt_tokens_details": {
      "cached_tokens": 10318,
      "cache_write_tokens": 0
    }
  }
}

Document: https://openrouter.ai/docs/guides/best-practices/prompt-caching#usage-object-fields

2. Additionally, I added `reasoning_tokens` as well (this one is not cache-related).
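As a minimal sketch of what consuming these fields looks like, the snippet below reads the cache and reasoning token counts out of a response payload shaped like the usage object documented above. The `prompt_tokens_details` field names come from the OpenRouter docs linked above; placing `reasoning_tokens` under `completion_tokens_details` is an assumption based on the OpenAI-compatible schema, and the hardcoded `response` dict is a stand-in for a real API response.

```python
# Stand-in for a parsed OpenRouter chat-completion response (assumed shape;
# prompt_tokens_details matches the documented usage object, and
# completion_tokens_details.reasoning_tokens follows the OpenAI-style schema).
response = {
    "usage": {
        "prompt_tokens": 10339,
        "completion_tokens": 60,
        "total_tokens": 10399,
        "prompt_tokens_details": {
            "cached_tokens": 10318,
            "cache_write_tokens": 0,
        },
        "completion_tokens_details": {
            "reasoning_tokens": 15,
        },
    }
}

usage = response.get("usage", {})
# Both detail objects may be absent on providers that do not report them,
# so default to an empty dict before reading individual counters.
prompt_details = usage.get("prompt_tokens_details") or {}
completion_details = usage.get("completion_tokens_details") or {}

cache_read_tokens = prompt_details.get("cached_tokens", 0)
cache_write_tokens = prompt_details.get("cache_write_tokens", 0)
reasoning_tokens = completion_details.get("reasoning_tokens", 0)

print(cache_read_tokens, cache_write_tokens, reasoning_tokens)
```

Defaulting each counter to 0 keeps monitoring code working against older responses that omit the detail objects entirely.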

@kargnas kargnas force-pushed the fix/openrouter-cache-usage branch from dc4f809 to a85d46f on March 28, 2026 13:31
@kargnas kargnas changed the title from "fix(openrouter): support cache usage tokens on OpenRouter response" to "fix(openrouter): support monitoring cache tokens / reasoning tokens on OpenRouter response" on Mar 28, 2026