
Fix whisper long-form generation when eos_token_id is a list#45570

Open
ronansgd wants to merge 1 commit into huggingface:main from ronansgd:fix-whisper-eos-token

Conversation


@ronansgd ronansgd commented Apr 22, 2026

What does this PR do?

Fixes: #45584

Fixes a bug in the Whisper generation code that occurs when `generation_config.eos_token_id` is a `list[int]` rather than an `int` (which happens, for instance, after `align_special_tokens` is called in `Trainer.train`).

The fix is to normalize `eos_token_id` to a list and use membership checks instead of scalar equality, as sketched below.
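
A minimal sketch of that normalization (the helper name and call site below are illustrative, not the exact code in `modeling_whisper.py`):

```python
from typing import Optional, Union


def normalize_eos_token_id(eos_token_id: Optional[Union[int, list[int]]]) -> list[int]:
    # eos_token_id may be an int, a list[int], or None; normalize to a list once
    # so downstream code can always use membership checks.
    if eos_token_id is None:
        return []
    if isinstance(eos_token_id, int):
        return [eos_token_id]
    return list(eos_token_id)


eos_ids = normalize_eos_token_id([50257, 50362])  # also works for a plain int
last_token = 50362
# Membership check instead of `last_token == eos_token_id`, which misbehaves
# when eos_token_id is a list.
is_eos = last_token in eos_ids
```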

Code Agent Policy

The Transformers repo is currently being overwhelmed by a large number of PRs and issue comments written by
code agents. We are currently bottlenecked by our ability to review and respond to them. As a result,
we ask that new users do not submit pure code agent PRs at this time.
You may use code agents in drafting or to help you diagnose issues. We'd also ask autonomous "OpenClaw"-like agents
not to open any PRs or issues for the moment.

PRs that appear to be fully agent-written will probably be closed without review, and we may block users who do this
repeatedly or maliciously.

This is a rapidly-evolving situation that's causing significant shockwaves in the open-source community. As a result,
this policy is likely to be updated regularly in the near future. For more information, please read CONTRIBUTING.md.

  • I confirm that this is not a pure code agent PR.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

cc @eustlb

`generation_config.eos_token_id` can be `int | list[int]`, but the
Whisper long-form generation code compared it as a scalar in two
places, causing silently wrong behavior or a RuntimeError. Normalize
to a list and use membership checks instead of equality.

Made-with: Cursor
@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: whisper

Contributor

@eustlb eustlb left a comment


Hey @ronansgd, thanks for opening this PR!

  1. Can you share a reproducer of the bug you're describing? Even better, open an issue and reference it in this PR.
  2. Ideally we'd like to align Whisper's custom `generate` with how this is handled in `GenerationMixin`'s `generate` (i.e., initializing an `eos_token_tensor` instead); see the sketch after this list. I have a commit ready to push to this PR for that, but I'm waiting for your reproducer first.
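
A hedged sketch of what that tensor-based approach might look like (variable names and values are illustrative, not the commit mentioned above):

```python
import torch

# generation_config.eos_token_id may be an int or a list[int]
eos_token_id = [50257, 50362]

# Build a tensor once, mirroring the eos_token_tensor idea from GenerationMixin.
eos_token_tensor = torch.tensor(
    [eos_token_id] if isinstance(eos_token_id, int) else eos_token_id
)

# EOS checks then become tensor membership tests instead of scalar equality.
last_tokens = torch.tensor([50362, 123, 50257])     # e.g. last decoded token per sequence
is_eos = torch.isin(last_tokens, eos_token_tensor)  # tensor([True, False, True])
```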

Author

ronansgd commented Apr 22, 2026

Hey @eustlb, I've opened issue #45584 with a minimal reproducer script.


Development

Successfully merging this pull request may close these issues.

Whisper generation fails on empty transcription after align_special_tokens
