fix(openai): parse finish_reason from chat completion stream#1526

Merged
hassiebp merged 1 commit into main from openai-add-finish-reason on Feb 17, 2026

Conversation


@hassiebp hassiebp commented Feb 17, 2026

Important

Parse finish_reason from chat completion streams in _extract_streamed_openai_response() and include it in metadata if present.

  • Behavior:
    • Parse finish_reason from chat completion streams in _extract_streamed_openai_response() in openai.py.
    • Include finish_reason in returned metadata if not None.
  • Functions:
    • Modify _extract_streamed_openai_response() to capture finish_reason from choices in response chunks.
  • Misc:
    • Adjust return statement in _extract_streamed_openai_response() to include finish_reason in metadata.

This description was created by Ellipsis for a4f65ba.

Disclaimer: Experimental PR review

Greptile Summary

This PR extracts finish_reason from streamed OpenAI chat completion responses and passes it as metadata to the Langfuse generation update. Previously, the fourth tuple element returned by _extract_streamed_openai_response was hardcoded to None, so finish_reason was lost during streaming. Now it is captured per choice and wrapped in a metadata dict {"finish_reason": finish_reason}.

  • Adds finish_reason variable tracking in _extract_streamed_openai_response
  • Extracts finish_reason from each streamed choice (chat type only)
  • Returns the value as metadata in the function's 4-tuple return, consistent with how _extract_streamed_response_api_response returns metadata
  • Only applies to chat completions, not legacy completion or embedding types

Confidence Score: 4/5

  • This PR is safe to merge — it adds a small, well-scoped feature to capture finish_reason from streamed responses.
  • The change is minimal (3 lines), follows existing patterns in the codebase, and only affects streaming metadata. The logic correctly handles the standard OpenAI streaming protocol where finish_reason appears in the final content chunk. One minor consideration: finish_reason could theoretically be overwritten by a subsequent chunk with choices, but this doesn't occur in practice with the OpenAI API. No tests were added, but the change is low risk.
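The overwrite concern mentioned above can be checked with a tiny self-contained loop. SimpleNamespace stands in for the real OpenAI chunk objects, which are an assumption here: a trailing usage chunk with choices=[] never touches the captured value.

```python
from types import SimpleNamespace as NS  # stand-in for OpenAI chunk objects

# Assumed chunk shapes: a content chunk, the final chunk carrying
# finish_reason, then a usage chunk whose choices list is empty.
chunks = [
    NS(choices=[NS(finish_reason=None)]),
    NS(choices=[NS(finish_reason="stop")]),
    NS(choices=[]),  # usage chunk: no choices, so nothing is overwritten
]

finish_reason = None
for chunk in chunks:
    for choice in chunk.choices:
        if choice.finish_reason is not None:
            finish_reason = choice.finish_reason

print(finish_reason)  # "stop" survives the trailing empty-choices chunk
```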
  • No files require special attention

Important Files Changed

langfuse/openai.py: Adds finish_reason extraction from streamed chat completion chunks and passes it as metadata. The logic correctly preserves the value from the last chunk with choices, which is the standard behavior for OpenAI streaming.

Sequence Diagram

sequenceDiagram
    participant Client
    participant OpenAI as OpenAI API
    participant Extract as _extract_streamed_openai_response
    participant Update as _create_langfuse_update
    participant Langfuse as Langfuse Generation

    Client->>OpenAI: Chat completion (stream=True)
    loop For each streamed chunk
        OpenAI-->>Extract: chunk (delta, finish_reason=null)
        Extract->>Extract: Accumulate content
    end
    OpenAI-->>Extract: Final chunk (finish_reason="stop")
    Extract->>Extract: Capture finish_reason
    OpenAI-->>Extract: Usage chunk (choices=[])
    Extract->>Extract: finish_reason preserved (no choices)
    Extract-->>Update: (model, completion, usage, {finish_reason})
    Update->>Langfuse: generation.update(metadata={finish_reason})

Last reviewed commit: a4f65ba


@greptile-apps greptile-apps bot left a comment


1 file reviewed, no comments


@hassiebp hassiebp merged commit 2246953 into main Feb 17, 2026
7 of 12 checks passed
@hassiebp hassiebp deleted the openai-add-finish-reason branch February 17, 2026 17:41