[CI] Add llama case for profile test #1716


Open
wants to merge 4 commits into
base: main

Conversation

RUIJIEZHONG66166
Contributor

Add a llama inference test that checks call counts when profiling.
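The call-count idea behind the test can be sketched in plain Python. This is a hypothetical stand-in (the `CallCounter` class and `fake_llama_inference` function are illustrative names, not code from this PR); the real CI test presumably counts recorded operations from the profiler's output in the same spirit:

```python
from collections import Counter

# Hypothetical stand-in for a profiler: records how many times each
# named operation runs, similar in spirit to counting kernel calls
# in a profile trace.
class CallCounter:
    def __init__(self):
        self.counts = Counter()

    def record(self, op_name):
        self.counts[op_name] += 1

def fake_llama_inference(profiler, num_layers=4):
    # Each decoder layer performs one attention and one MLP "op".
    for _ in range(num_layers):
        profiler.record("attention")
        profiler.record("mlp")
    profiler.record("lm_head")

profiler = CallCounter()
fake_llama_inference(profiler)

# A call-number check asserts that each op appears the expected
# number of times in the collected profile.
assert profiler.counts["attention"] == 4
assert profiler.counts["mlp"] == 4
assert profiler.counts["lm_head"] == 1
```

The point of such a check is that a regression in dispatch or profiling (e.g. an op silently not being traced) changes the counts even when numerical outputs remain correct.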

@RUIJIEZHONG66166 RUIJIEZHONG66166 changed the title Add llama case for profile test [CI] Add llama case for profile test Jun 4, 2025
@pytorchxpubot

@sys_pytorchxpubot triage result for run 15434458769. Triage bot UT analysis result is for reference only; note that each unique error message is reported only once:
  1. third_party.torch-xpu-ops.test.xpu.test_nn_xpu.TestNN test_LayerNorm_3d_no_affine_large_feature_cuda failed with error message:
 AssertionError: Tensor-likes are not close!

Triage bot response:

{
  "similar_issue_id": 845,
  "similar_issue_state": "closed",
  "issue_owner": "daisyden",
  "issue_description": "The test TestNN.test_LayerNorm_3d_no_affine_large_feature_cuda failed with an AssertionError: Tensor-likes are not close! The error suggests a mismatch in tensor values between CUDA and XPU implementations. The failure is consistent and not random, indicating a potential discrepancy in the implementation or computation between the two devices.",
  "root_causes": [
    "Discrepancies in LayerNorm implementation between CUDA and XPU leading to tensor value mismatches.",
    "Potential differences in precision, kernel behavior, or synchronization issues between CUDA and XPU implementations."
  ],
  "suggested_solutions": [
    "Align the XPU LayerNorm implementation with CUDA to ensure consistent results.",
    "Investigate and correct any precision or kernel-specific issues causing the mismatch.",
    "If alignment is not feasible, consider temporarily skipping the test until the underlying issue is resolved."
  ]
}
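The "Tensor-likes are not close!" message comes from an elementwise tolerance check. A minimal pure-Python sketch of the comparison rule (a value passes if |actual − expected| ≤ atol + rtol·|expected|, the same form PyTorch's `torch.testing.assert_close` uses; the default rtol/atol below are the float32 defaults, and `all_close` is an illustrative helper name):

```python
def all_close(actual, expected, rtol=1.3e-6, atol=1e-5):
    # Elementwise tolerance check: each pair passes if
    # |actual - expected| <= atol + rtol * |expected|.
    return all(
        abs(a - e) <= atol + rtol * abs(e)
        for a, e in zip(actual, expected)
    )

# Identical CUDA-vs-XPU style outputs pass...
assert all_close([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
# ...while a mismatch beyond tolerance fails, which is what the
# "Tensor-likes are not close!" assertion reports.
assert not all_close([1.0, 2.0, 3.1], [1.0, 2.0, 3.0])
```

A consistent (non-random) failure of this check, as the triage notes, points at a genuine implementation difference between devices rather than floating-point noise, since noise would fall within these tolerances.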
