
Fix: moe_align_block_size() supports non-power-of-2 num_experts#24

Merged
xjmxyt merged 2 commits into NVIDIA:main from huanghua1994:fix-moe-align-block on Dec 30, 2025

Conversation

@huanghua1994
Contributor

Description

Support non-power-of-2 num_experts in moe_align_block_size().
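For context, `moe_align_block_size()` groups token indices by their assigned expert and pads each expert's group up to a multiple of the kernel block size, so every block processes tokens of exactly one expert. Below is a minimal pure-Python reference sketch of that behavior; the function name, argument names, and `pad_id` sentinel are illustrative assumptions, not the repository's actual kernel API. The point of the fix is that nothing here requires `num_experts` to be a power of 2.

```python
def moe_align_block_size_ref(topk_ids, num_experts, block_size, pad_id):
    # Count tokens routed to each expert; works for any num_experts,
    # power of 2 or not.
    counts = [0] * num_experts
    for e in topk_ids:
        counts[e] += 1
    # Round each expert's token count up to a multiple of block_size.
    padded = [(c + block_size - 1) // block_size * block_size for c in counts]
    total = sum(padded)
    sorted_ids = [pad_id] * total
    # One expert id per block of block_size slots.
    expert_ids = []
    for e, p in enumerate(padded):
        expert_ids.extend([e] * (p // block_size))
    # Prefix sums give each expert's start offset in the padded layout.
    offsets = [0] * num_experts
    for e in range(1, num_experts):
        offsets[e] = offsets[e - 1] + padded[e - 1]
    # Scatter token indices into their expert's padded region.
    fill = offsets[:]
    for tok, e in enumerate(topk_ids):
        sorted_ids[fill[e]] = tok
        fill[e] += 1
    return sorted_ids, expert_ids, total
```

For example, with `num_experts=3` (not a power of 2), `block_size=4`, and assignments `[0, 1, 0, 2, 1]`, each expert's group is padded to 4 slots, giving 12 padded slots and one block per expert.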

CI Configuration

config:
  build: true
  # valid options are "ops" and "benchmark"
  test: [ops]

Checklist

  • Code formatted and imports sorted per repo specifications (./format.sh)
  • Documentation updated (if needed)
  • CI configuration reviewed

@copy-pr-bot

copy-pr-bot bot commented Dec 29, 2025

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@xjmxyt
Collaborator

xjmxyt commented Dec 30, 2025

/ok to test 72691dc

@xjmxyt
Collaborator

xjmxyt commented Dec 30, 2025

Thank you for the contribution! As this is your first contribution to TileGym, please submit your signed CLA document as outlined in CONTRIBUTING.md.

@xjmxyt xjmxyt enabled auto-merge (squash) December 30, 2025 06:33
@xjmxyt
Collaborator

xjmxyt commented Dec 30, 2025

/ok to test f53a4ff

@xjmxyt xjmxyt merged commit 469fd81 into NVIDIA:main Dec 30, 2025
6 checks passed
@huanghua1994 huanghua1994 deleted the fix-moe-align-block branch January 5, 2026 17:42
2 participants