
feat: Add error data to ApifyApiError #314


Merged: Pijukatel merged 5 commits into master from dataset-schema-validation-errors on Dec 16, 2024

Conversation

@Pijukatel (Contributor) commented on Dec 13, 2024

Add error data to ApifyApiError when available.
Add tests.

Closes: #306
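
For illustration, a minimal sketch of what this enables for callers. The import path, the push_items call, and the exact shape of the data attribute are assumptions based on the PR title and the linked issue, not code from this PR:

from apify_client import ApifyClient
from apify_client._errors import ApifyApiError  # assumed import path

client = ApifyClient(token='my-token')

try:
    # Push an item that violates the dataset's schema.
    client.dataset('my-dataset-id').push_items([{'field': 'bad value'}])
except ApifyApiError as exc:
    print(exc.message)  # human-readable error message
    print(exc.data)     # structured error details from the API, when available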

@Pijukatel added the enhancement (New feature or request.) and t-tooling (Issues with this label are in the ownership of the tooling team.) labels on Dec 13, 2024
@github-actions bot added this to the 104th sprint - Tooling team milestone on Dec 13, 2024
@github-actions bot added the tested label (Temporary label used only programmatically for some analytics.) on Dec 13, 2024
@Pijukatel Pijukatel changed the title Add error data to ApifyApiError feat: Add error data to ApifyApiError Dec 13, 2024
@Pijukatel Pijukatel marked this pull request as ready for review December 13, 2024 15:19
@Pijukatel Pijukatel requested a review from janbuchar December 13, 2024 15:19


Review thread on the following lines of the diff:

@respx.mock
@pytest.fixture
Contributor: Is this autoused somehow?

Pijukatel (Author):
It is used by
@pytest.mark.usefixtures('mocked_response')

I would normally just pass the fixture into the test's arguments, but then ruff complains about unused arguments :-(, so I thought this somewhat less common way of using fixtures is better than an #ignore comment.
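
A minimal sketch of that pattern (the test name and body are illustrative):

import pytest

@pytest.mark.usefixtures('mocked_response')
def test_error_data_is_available() -> None:
    # The marker activates the fixture without it appearing as an argument,
    # so ruff does not flag an unused argument.
    ...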

Contributor: I would prefer either fixture(autouse=True) or disabling that rule for tests.
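
A minimal sketch of the autouse variant (the mocked URL and error payload are placeholders, not taken from this PR):

import pytest
import respx
from httpx import Response

@pytest.fixture(autouse=True)
def mocked_response():
    # autouse=True applies the fixture to every test in the module,
    # so no test has to request it by name.
    with respx.mock(assert_all_called=False) as mock:
        mock.post('https://api.apify.com/v2/datasets/some-id/items').mock(
            return_value=Response(
                400,
                json={'error': {'type': 'schema-validation-error', 'data': {}}},
            ),
        )
        yield mock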

Pijukatel (Author): Done.

@Pijukatel Pijukatel requested a review from janbuchar December 16, 2024 06:33
@vdusek vdusek self-requested a review December 16, 2024 11:49
@Pijukatel Pijukatel requested a review from vdusek December 16, 2024 14:28
@vdusek (Contributor) left a comment:

LGTM

@Pijukatel Pijukatel merged commit df2398b into master Dec 16, 2024
28 checks passed
@Pijukatel Pijukatel deleted the dataset-schema-validation-errors branch December 16, 2024 15:32
Pijukatel added a commit to apify/apify-docs that referenced this pull request on Dec 16, 2024 (apify/apify-docs#1356):

Add Python variant example of catching dataset validation errors.

Documents this change:
apify/apify-client-python#314
Labels
enhancement (New feature or request.) · t-tooling (Issues with this label are in the ownership of the tooling team.) · tested (Temporary label used only programmatically for some analytics.)
Development

Successfully merging this pull request may close these issues.

Dataset schema validation: add data property to API error object
3 participants