feat(telemetry): emit system prompt on chat spans per GenAI semconv #1818
Open

sanjeed5 wants to merge 2 commits into strands-agents:main from
Conversation
The system prompt was missing from chat (model invoke) span events, so observability backends could not render the full conversation context on individual LLM calls.

- Legacy mode (v1.36): emits a `gen_ai.system.message` span event before the conversation message events.
- Latest experimental mode: emits `gen_ai.system_instructions` on the `gen_ai.client.inference.operation.details` event, keeping it separate from `gen_ai.input.messages` per the spec guidance that system instructions provided separately from chat history should use `gen_ai.system_instructions`.
- Backwards compatible: new optional params default to `None`.
Motivation
Observability backends that render per-LLM-call conversation history are missing the system prompt on chat spans. Currently, `start_model_invoke_span` emits `gen_ai.user.message`, `gen_ai.assistant.message`, `gen_ai.tool.message`, and `gen_ai.choice` events, but the system prompt is never included even though it is sent to the model on every call.

Per the OpenTelemetry GenAI semantic conventions, system instructions provided separately from chat history (which is exactly how Strands handles them via `agent.system_prompt` / `agent._system_prompt_content`) should be recorded using:

- the `gen_ai.system.message` event (v1.36 legacy conventions)
- the `gen_ai.system_instructions` attribute on the `gen_ai.client.inference.operation.details` event (latest experimental conventions)

Related: #822, #1452
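The two conventions can be sketched as plain event payloads. This is illustrative only: the event and attribute names come from the semconv, but the exact JSON structure of the attribute values shown here is an assumption, not the literal output of the Strands tracer.

```python
import json

# Legacy (v1.36) conventions: the system prompt becomes its own span event,
# emitted before the conversation message events.
legacy_event = {
    "name": "gen_ai.system.message",
    "attributes": {
        "content": json.dumps([{"text": "You are a helpful assistant."}]),
    },
}

# Latest experimental conventions: system instructions ride on the
# operation-details event, separate from gen_ai.input.messages.
details_event = {
    "name": "gen_ai.client.inference.operation.details",
    "attributes": {
        "gen_ai.system_instructions": json.dumps(
            [{"type": "text", "content": "You are a helpful assistant."}]
        ),
        "gen_ai.input.messages": json.dumps(
            [{"role": "user", "parts": [{"type": "text", "content": "hi"}]}]
        ),
    },
}
```

Keeping the system prompt out of `gen_ai.input.messages` in the experimental mode matches the spec's distinction between instructions supplied separately and instructions embedded in chat history.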
Public API Changes
`start_model_invoke_span` gains two optional keyword arguments with `None` defaults.

No breaking changes. Callers that omit the new params get identical behavior as before.
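A minimal stand-in for the changed signature, to show the backward-compatibility contract. This is not the real Strands `Tracer` method (which takes more parameters and emits real OpenTelemetry events); the kwarg names `system_prompt` and `system_prompt_content` are assumptions inferred from the test names below.

```python
from typing import Any, Optional

def start_model_invoke_span(
    messages: list[dict[str, Any]],
    system_prompt: Optional[str] = None,                       # assumed name
    system_prompt_content: Optional[list[dict[str, Any]]] = None,  # assumed name
) -> list[dict[str, Any]]:
    """Simplified stand-in: return the span events that would be emitted."""
    events: list[dict[str, Any]] = []
    # Content blocks take priority over the plain string prompt.
    system = system_prompt_content or system_prompt
    if system is not None:
        events.append({"name": "gen_ai.system.message", "content": system})
    for msg in messages:
        events.append(
            {"name": f"gen_ai.{msg['role']}.message", "content": msg["content"]}
        )
    return events

# Omitting the new kwargs keeps the old behavior.
legacy = start_model_invoke_span([{"role": "user", "content": "hi"}])

# Passing a system prompt emits the new event first.
with_system = start_model_invoke_span(
    [{"role": "user", "content": "hi"}], system_prompt="You are helpful."
)
```

Because both new parameters default to `None`, existing call sites compile and behave unchanged.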
Type of Change
New feature
Testing
- Updated `test_start_model_invoke_span` to verify the `gen_ai.system.message` event is emitted before conversation messages.
- Updated `test_start_model_invoke_span_latest_conventions` to verify `gen_ai.system_instructions` appears on the details event.
- Added `test_start_model_invoke_span_without_system_prompt` for backward-compatibility regression.
- Added `test_start_model_invoke_span_with_system_prompt_content` to verify content block priority.
- Updated `test_event_loop_cycle_creates_spans` to assert system prompt kwargs are forwarded.
- Ran `hatch run prepare` (fmt + lint + tests pass).

Checklist