chore: improve reliability metric #71


Conversation

Christopher-Chianelli (Collaborator)

- Don't use disjoint types in test assertions
- Rename `equal` to `pythonEquals` in numeric types to avoid
  any potential confusion with `Object.equals`
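Neither fix is shown in the thread itself; a rough Python sketch of the two pitfalls (with `parse_port` and all values purely hypothetical, not from this PR):

```python
# 1) Disjoint types in test assertions: comparing values of unrelated
#    types is always False in Python, so the assertion says nothing
#    about the code under test. parse_port is a hypothetical function.
def parse_port(raw: str) -> int:
    return int(raw)

result = parse_port("8080")
assert (result == "8080") is False  # int vs str: always False, a useless check
assert result == 8080               # same-type comparison actually tests the logic

# 2) Why a distinct pythonEquals name helps: Python's == has cross-type
#    numeric semantics that Java's Object.equals does not share, so
#    reusing a name like "equal" invites confusion between the two.
assert 1 == 1.0    # True under Python equality; not Object.equals semantics
assert True == 1   # bool is a numeric type in Python
```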
zepfred (Contributor) commented Jun 13, 2024

The daemon thread PR will solve the failing test, right?

Christopher-Chianelli (Collaborator, Author)

> The daemon thread PR will solve the failing test, right?

Looks like it's still flaky even with the daemon PR; I will add `@pytest.mark.xfail` until the root cause of the flakiness is found.
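For context, a minimal sketch of how a flaky test can be marked with `pytest.mark.xfail` while a root cause is investigated (`test_daemon_shutdown` is a hypothetical name, not a test from this PR):

```python
import pytest

# strict=False (the default) means an unexpected pass is reported as
# XPASS rather than failing the run, which suits an intermittently
# flaky test whose root cause is still unknown.
@pytest.mark.xfail(reason="flaky; root cause not yet identified", strict=False)
def test_daemon_shutdown():
    ...
```

Once the root cause is fixed, the marker can be removed, or `strict=True` can be set so that an unexpected pass fails the build and flags the stale marker.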


Quality Gate failed

Failed conditions
60.0% Coverage on New Code (required ≥ 75%)
25.8% Duplication on New Code (required ≤ 3%)

See analysis details on SonarCloud

Christopher-Chianelli merged commit 8ac2833 into TimefoldAI:main Jun 13, 2024
4 of 5 checks passed
2 participants