Add support for TransformerEngine flash attention in WAN #299

Commit a7345e2: generalize across sharding parallelisms
Google CLA / cla/google succeeded Dec 17, 2025 in 2s

✅ All contributors are covered under a CLA with Google

See https://cla.developers.google.com/ for more info about Google's Contributor License Agreement (CLA).


Details

The following contributors were found for this pull request:

a7345e2 Author: @cpersson-amd <car******son@amd.com>

(Only the first commit for a unique contributor is listed.)