Fix block size handling for cross attention in TPU flash attention #301

Open

Commit 6bb59bb
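The PR title points at how the query and key/value block sizes are chosen when the two sequence lengths differ, which is exactly the cross-attention case. As a minimal sketch of that kind of handling (hypothetical names throughout, not the actual change in commit 6bb59bb), each block size can be clamped against the length of the axis it tiles rather than a single shared sequence length:

# Minimal sketch (hypothetical, not the code changed by this PR): in cross
# attention the query and key/value sequence lengths can differ, so each
# block size must be bounded by its own sequence length instead of one
# shared length.
from dataclasses import dataclass


@dataclass
class BlockSizes:
    block_q: int   # tile size along the query sequence axis
    block_kv: int  # tile size along the key/value sequence axis


def clamp_block_sizes(sizes: BlockSizes, q_seq_len: int, kv_seq_len: int) -> BlockSizes:
    """Bound each block size by the length of the axis it tiles."""
    return BlockSizes(
        block_q=min(sizes.block_q, q_seq_len),
        block_kv=min(sizes.block_kv, kv_seq_len),
    )


# Example: 1024 query tokens attending over 256 key/value tokens.
# A default block_kv of 512 would exceed kv_seq_len, so it is clamped to 256.
print(clamp_block_sizes(BlockSizes(block_q=512, block_kv=512), 1024, 256))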
Google CLA / cla/google succeeded Dec 23, 2025 in 1s

✅ All contributors are covered under a CLA with Google

See https://cla.developers.google.com/ for more info about Google's Contributor License Agreement (CLA).


Details

The following contributors were found for this pull request:

6bb59bb Author: @michelle-yooh <y**h@google.com>

(Only the first commit from each unique contributor is listed.)