
fix(models): avoid duplicate interaction final responses #4385

Open
sasukearata123 wants to merge 1 commit into google:main from sasukearata123:codex/fix-is_final_response-duplication-issue

Conversation

@sasukearata123

Please ensure you have read the contribution guide before creating a pull request.

Link to Issue or Description of Change

1. Link to an existing issue (if applicable):

  • Closes: #issue_number
  • Related: #issue_number

2. Or, if no issue exists, describe the change:

If applicable, please follow the issue templates to provide as much detail as
possible.

Problem:
A clear and concise description of what the problem is.

Solution:
A clear and concise description of what you want to happen and why you chose this solution.

Testing Plan

Please describe the tests that you ran to verify your changes. This is required
for all PRs that are not small documentation or typo fixes.

Unit Tests:

  • I have added or updated unit tests for my change.
  • All unit tests pass locally.

Please include a summary of the passing pytest results.

Manual End-to-End (E2E) Tests:

Please provide instructions on how to manually test your changes, including any
necessary setup or configuration. Please provide logs or screenshots to help
reviewers better understand the fix.

Checklist

  • I have read the CONTRIBUTING.md document.
  • I have performed a self-review of my own code.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have added tests that prove my fix is effective or that my feature works.
  • New and existing unit tests pass locally with my changes.
  • I have manually tested my changes end-to-end.
  • Any dependent changes have been merged and published in downstream modules.

Additional context

Add any other context or screenshots about the feature request here.

@google-cla

google-cla bot commented Feb 5, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

@gemini-code-assist
Contributor

Summary of Changes

Hello @sasukearata123, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request resolves issues in the handling of streaming LLM responses, specifically preventing duplicate final responses and correctly aggregating text parts that may overlap. By tightening the logic for determining interaction completion and adding a robust text-merging mechanism, the changes ensure that the system consistently produces a single, accurate, and complete final response from streamed interactions.

Highlights

  • Refined Final Response Detection: The is_final_response logic in event.py now explicitly checks the turn_complete status, ensuring more accurate determination of an interaction's finality.
  • Duplicate Text Prevention in Streaming: Introduced new utility functions (_merge_text_with_overlap, _append_delta_text_part) in interactions_utils.py to intelligently merge streamed text parts, preventing duplication, especially with overlapping chunks (see the sketch after this list).
  • Streamlined LLM Response Emission: Modified convert_interaction_event_to_llm_response to avoid emitting redundant final responses from content.stop and interaction events during streaming, relying instead on interaction.status_update or stream end for final content.
  • Robust Streaming Completion Handling: The generate_content_via_interactions function now tracks whether a turn_complete response has been emitted, providing a fallback mechanism to ensure a final aggregated response even if a completion event is missed.
  • Comprehensive Unit Testing: New unit tests have been added for the updated is_final_response logic and for the streaming response aggregation, including scenarios for single final responses and correct merging of overlapping text deltas.
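
As a rough illustration of the overlap-aware merging described in the Highlights, a helper pair of the following shape would do the job. The function names (_merge_text_with_overlap, _append_delta_text_part) come from the PR; the bodies below are a sketch rather than the actual implementation, and the use of google.genai types.Part is an assumption about the surrounding code.

```python
from google.genai import types


def _merge_text_with_overlap(existing: str, new: str) -> str:
  """Concatenate two strings, dropping the largest suffix of `existing`
  that is also a prefix of `new`, so overlapping stream chunks are not
  duplicated. (Sketch only; the real implementation may differ.)"""
  max_overlap = min(len(existing), len(new))
  for overlap in range(max_overlap, 0, -1):
    if existing.endswith(new[:overlap]):
      return existing + new[overlap:]
  return existing + new


def _append_delta_text_part(parts: list[types.Part], delta: str) -> None:
  """Append streamed delta text to the last text part when possible,
  merging overlaps; otherwise start a new text part."""
  if parts and parts[-1].text is not None:
    parts[-1].text = _merge_text_with_overlap(parts[-1].text, delta)
  else:
    parts.append(types.Part(text=delta))
```

For example, _merge_text_with_overlap("The qui", "quick brown") yields "The quick brown" rather than "The quiquick brown".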


Changelog
  • src/google/adk/events/event.py
    • The is_final_response method now incorporates a check for self.turn_complete is not False to refine its logic for identifying final responses (see the sketch after this changelog).
  • src/google/adk/models/interactions_utils.py
    • Added _merge_text_with_overlap function to merge two strings by finding and removing the largest overlapping suffix/prefix.
    • Added _append_delta_text_part function to append text to a list of types.Part objects, utilizing _merge_text_with_overlap to prevent duplication if the last part is text.
    • In convert_interaction_event_to_llm_response, text deltas are now processed using _append_delta_text_part.
    • The content.stop and interaction event types no longer directly emit LlmResponse objects in streaming mode to prevent duplicate final responses.
    • In generate_content_via_interactions, a has_emitted_turn_complete flag was introduced to manage the emission of the final aggregated response, ensuring it's only sent once.
  • tests/unittests/events_test.py
    • New test cases test_is_final_response_false_when_turn_incomplete and test_is_final_response_true_when_turn_complete were added to validate the updated is_final_response logic.
  • tests/unittests/models/test_interactions_utils.py
    • Added TestGenerateContentViaInteractionsStreaming class with tests test_emits_single_final_response_with_status_update and test_merges_overlapping_text_deltas_in_final_response to verify the new streaming behavior and text merging logic.
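
To make the event.py refinement concrete, here is a minimal, self-contained sketch. EventSketch is a stand-in invented for illustration; it reproduces only the turn_complete guard and not the real Event class's other finality conditions.

```python
from typing import Optional


class EventSketch:
  """Stand-in for the ADK Event class, reduced to the fields used here."""

  def __init__(self, turn_complete: Optional[bool] = None, partial: bool = False):
    self.turn_complete = turn_complete
    self.partial = partial

  def is_final_response(self) -> bool:
    # The PR's refinement: an event explicitly marked turn-incomplete is
    # never final, regardless of any other signal.
    if self.turn_complete is False:
      return False
    # Placeholder for the pre-existing conditions (the real Event checks
    # more, e.g. pending function calls); here we only require non-partial.
    return not self.partial


# Mirrors the two new unit tests named in the changelog.
assert not EventSketch(turn_complete=False).is_final_response()
assert EventSketch(turn_complete=True).is_final_response()
```
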
Activity
  • The pull request was opened by sasukearata123.
  • The author has provided a detailed description including a problem statement, solution, and testing plan.
  • Unit tests have been added and passed locally.
  • Manual end-to-end tests have been performed.
  • The author has completed a self-review and checked all items in the contribution checklist.

@adk-bot added the "live [Component] This issue is related to live, voice and video chat" label on Feb 5, 2026
@adk-bot
Collaborator

adk-bot commented Feb 5, 2026

Response from ADK Triaging Agent

Hello @sasukearata123, thank you for creating this PR!

To help us review your contribution, could you please fill out the pull request template in the description? This includes linking to an existing issue (or creating one), describing the problem and solution, and providing a testing plan.

This information will help reviewers to review your PR more efficiently. Thanks!

@gemini-code-assist
Contributor

gemini-code-assist bot left a comment

Code Review

This pull request effectively resolves an issue with duplicate final responses in streaming mode from the interactions API. The core of the fix involves centralizing the logic for emitting the final response to either an interaction.status_update event or a fallback at the end of the stream, while ignoring content.stop and interaction events that previously caused duplicates. A flag, has_emitted_turn_complete, is cleverly used to prevent the fallback from firing unnecessarily.

Additionally, the PR introduces robust logic to merge overlapping text deltas from the stream, ensuring that the aggregated text content in the final response is correct. This is implemented in the new _merge_text_with_overlap and _append_delta_text_part helper functions.

The changes are supported by a comprehensive set of new unit tests that validate both the prevention of duplicate responses and the correct merging of text content. The code is clear, well-commented, and the overall solution is solid.
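
The end-of-stream fallback the review refers to can be illustrated with a simplified sketch. Only the has_emitted_turn_complete flag is taken from the PR; the event shapes, the StreamedResponse class, and the aggregate_stream function below are invented for the example and do not mirror the real interactions API types.

```python
from dataclasses import dataclass


@dataclass
class StreamedResponse:
  """Simplified stand-in for LlmResponse: text plus a completion flag."""
  text: str
  turn_complete: bool = False


async def aggregate_stream(events):
  """Emit the aggregated final response exactly once: via the completion
  (status_update) event when one arrives, otherwise after the stream ends."""
  aggregated = ""
  has_emitted_turn_complete = False

  async for event in events:
    if event["type"] == "text_delta":
      # The real code merges overlapping chunks (_merge_text_with_overlap);
      # plain concatenation keeps this sketch short.
      aggregated += event["text"]
      yield StreamedResponse(text=event["text"])
    elif event["type"] == "status_update" and event.get("turn_complete"):
      has_emitted_turn_complete = True
      yield StreamedResponse(text=aggregated, turn_complete=True)

  # Fallback: no completion event arrived, so emit the aggregate once here,
  # instead of duplicating it when a completion event was already seen.
  if not has_emitted_turn_complete and aggregated:
    yield StreamedResponse(text=aggregated, turn_complete=True)
```
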

@ryanaiagent self-assigned this on Feb 5, 2026
@ryanaiagent
Collaborator

Hi @sasukearata123, thank you for your contribution! We need additional information to start the review. Please provide a clear description of the changes and, if applicable, supporting details such as screenshots, logs, and/or unit tests. Providing more context will help us understand the purpose of your contribution and expedite the review process.

@ryanaiagent added the "request clarification [Status] The maintainer need clarification or more information from the author" label on Feb 5, 2026
