
[megatron] fix: update patch for MLA flashattn forward#6005

Merged
wuxibin89 merged 1 commit into verl-project:main from HollowMan6:mcore_patch
Apr 15, 2026

Conversation

Collaborator

@HollowMan6 HollowMan6 commented Apr 14, 2026

What does this PR do?

Now that NVIDIA/Megatron-LM@5dcda19 has been merged into main, the patch becomes optional when the mcore version is greater than or equal to 0.16.2.
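The version gate described above can be sketched as follows. This is an illustrative, stdlib-only sketch (not verl's actual code); `parse_version` stands in for `packaging.version.parse`, and the function name `needs_mla_forward_patch` is hypothetical.

```python
# Hedged sketch: only apply the local MLA forward patch when the installed
# mcore version predates the upstream fix that landed in 0.16.2.

def parse_version(v: str) -> tuple:
    """Turn a version string like '0.16.2' into a comparable tuple (0, 16, 2)."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def needs_mla_forward_patch(mcore_version: str) -> bool:
    """The upstream fix is in mcore >= 0.16.2, so only patch older versions."""
    return parse_version(mcore_version) < parse_version("0.16.2")

# Usage: older releases still need the patch; newer ones skip it.
print(needs_mla_forward_patch("0.16.1"))  # True
print(needs_mla_forward_patch("0.16.2"))  # False
```

Tuple comparison handles multi-digit components correctly (e.g. 0.16.10 > 0.16.2), which naive string comparison would get wrong.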

Checklist Before Starting

  • Search for similar PRs. Paste at least one query link here: ...
  • Format the PR title as [{modules}] {type}: {description} (This will be checked by the CI)
    • {modules} include fsdp, megatron, veomni, sglang, vllm, rollout, trainer, ci, training_utils, recipe, hardware, deployment, ray, worker, single_controller, misc, perf, model, algo, env, tool, ckpt, doc, data, cfg, reward, fully_async, one_step_off
  • If this PR involves multiple modules, separate them with a comma, e.g. [megatron, fsdp, doc]
    • {type} is in feat, fix, refactor, chore, test
    • If this PR breaks any API (CLI arguments, config, function signature, etc.), add [BREAKING] to the beginning of the title.
    • Example: [BREAKING][fsdp, megatron] feat: dynamic batching

Test

For changes that cannot be tested by CI (e.g., algorithm implementation, new model support), validate by experiment(s) and show results such as training curve plots, evaluation results, etc.

API and Usage Example

Demonstrate how the API changes if any, and provide usage example(s) if possible.

# Add code snippet or script demonstrating how to use this

Design & Code Changes

Demonstrate the high-level design if this PR is complex, and list the specific changes.

Checklist Before Submitting

Important

Please check all the following items before requesting a review, otherwise the reviewer might deprioritize this PR for review.

Copilot AI left a comment (Contributor)

Pull request overview

This PR updates the Megatron mcore MLA (MultiLatentAttention) forward patch logic so it is only applied for mcore versions older than 0.16.2, aligning with the upstream fix merged into Megatron-LM.

Changes:

  • Add an mcore version gate (>= 0.16.2) to skip applying the local MultiLatentAttention.forward patch.
  • Refine THD packed-sequence handling by padding/slicing V only when Q/V head dims differ, tracking the original V dim for output slicing.
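The second change above can be sketched in miniature. This is a hypothetical, framework-free illustration of the pad-then-slice idea (plain Python lists instead of tensors; the actual patch operates on CUDA tensors inside `MultiLatentAttention.forward`): when the query head dim exceeds the value head dim, as in MLA, V is zero-padded up to Q's dim before flash attention, and the original dim is kept so the attention output can be sliced back.

```python
# Hedged sketch of the THD packed-sequence handling: pad V only when the
# Q and V head dims differ, and remember V's original dim for output slicing.

def pad_value_heads(v_rows, q_head_dim):
    """Zero-pad each value vector from v_head_dim up to q_head_dim.

    Returns (padded_rows, orig_v_dim); a no-op when the dims already match.
    """
    v_head_dim = len(v_rows[0])
    if v_head_dim == q_head_dim:
        return v_rows, v_head_dim
    pad = [0.0] * (q_head_dim - v_head_dim)
    return [row + pad for row in v_rows], v_head_dim

def slice_attn_output(out_rows, orig_v_dim):
    """Slice the attention output back down to the original value head dim."""
    return [row[:orig_v_dim] for row in out_rows]

# Usage: V vectors of dim 2 padded to match a Q head dim of 4, then sliced back.
v = [[1.0, 2.0], [3.0, 4.0]]
padded, orig_dim = pad_value_heads(v, 4)
print(padded)     # [[1.0, 2.0, 0.0, 0.0], [3.0, 4.0, 0.0, 0.0]]
print(slice_attn_output(padded, orig_dim))  # [[1.0, 2.0], [3.0, 4.0]]
```

In the real patch the padding would be done with something like `torch.nn.functional.pad` on the value tensor, but the control flow (pad only on mismatch, track the original dim, slice the output) is the same.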


@gemini-code-assist bot left a comment (Contributor)

Code Review

This pull request updates the Megatron-Core patching logic to support versions 0.16.2 and later, while refactoring the patch_forward method to handle cases where query and value head dimensions differ in THD packed sequences. The review feedback identifies a potential AttributeError when accessing configuration attributes and a logic regression where the DSA attention variant might incorrectly trigger post-attention reshaping. Suggestions were provided to use getattr for safer attribute access and to restore the exclusion of the DSA variant in the affected block.
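The getattr suggestion from the review can be illustrated as follows. This is a hypothetical sketch: `LegacyConfig`, the attribute name `attention_variant`, and the `"dsa"` value are stand-ins for whatever the actual mcore config exposes, chosen only to show the safe-access pattern and the DSA exclusion.

```python
# Hedged sketch of the review feedback: read an optional config attribute
# with getattr and a default, instead of risking AttributeError on older
# configs that predate the attribute.

class LegacyConfig:
    """Stands in for an older mcore TransformerConfig without the attribute."""
    pass

config = LegacyConfig()

# Unsafe: `config.attention_variant` would raise AttributeError here.
# Safe: getattr with a default returns None when the attribute is absent.
variant = getattr(config, "attention_variant", None)

# Per the review, the DSA variant must be excluded from the post-attention
# reshaping block so it does not incorrectly trigger it.
apply_reshape = variant != "dsa"
print(variant, apply_reshape)  # None True
```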

Now NVIDIA/Megatron-LM@5dcda19
has already been merged into main, so the patch becomes optional
when the mcore version is greater than or equal to 0.16.2

Signed-off-by: Hollow Man <hollowman@opensuse.org>
@wuxibin89 wuxibin89 merged commit b9d71f9 into verl-project:main Apr 15, 2026
122 of 136 checks passed


3 participants