
Feature: Don't "skip" this file, "ignore" this file. #3844


Closed
dev351 opened this issue Aug 21, 2018 · 7 comments
Labels
topic: reporting (related to terminal output and user-facing messages and errors)
type: enhancement (new feature or API change, should be merged into features branch)
type: proposal (proposal for a new feature, often to gather opinions or design the API around the new feature)

Comments

dev351 commented Aug 21, 2018

While pytestmark = pytest.mark.skip at module scope is useful for skipping all tests in a file, it would be useful to be able to "ignore" the file altogether using a similar global assignment. That is, to be able to provide a marker in the test file that hides the module from discovery in the same manner as --ignore=<test-file> does on the command line.

My motivation here is as follows: pytest.mark.skip emits a SKIPPED message for each test in the file when the -v flag is used, so it would also be useful to suppress that output altogether by simply "ignoring" the file.

My enhancement proposal is to provide "ignore" behaviour around the following global syntax (or similar):
pytestmark = pytest.mark.ignore
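
For context, a minimal sketch comparing the existing module-level skip with the proposed marker (pytest.mark.skip is real pytest API; pytest.mark.ignore is the hypothetical marker being requested here and does not exist):

```python
# test_example.py
import pytest

# Existing behaviour: every test in this module is still collected, and
# each one is reported as SKIPPED (one line per test under -v).
pytestmark = pytest.mark.skip(reason="example file")

# Requested behaviour (hypothetical, not implemented): the module would
# be hidden from collection entirely, exactly as if
# --ignore=test_example.py had been passed on the command line.
# pytestmark = pytest.mark.ignore

def test_one():
    assert True

def test_two():
    assert 1 + 1 == 2
```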

nicoddemus (Member) commented

Hi @dev351, thanks for writing.

May I ask why you need to suppress the SKIPPED messages for the test files? It seems strange because users won't get any feedback and might wonder if there's some problem with their configuration or environment that is preventing the ignored test modules from being picked up.

@nicoddemus added the status: needs information label (reporter needs to provide more information; can be closed after 2 or more weeks of inactivity) Aug 21, 2018
dev351 (Author) commented Aug 22, 2018

May I ask why you need to suppress the SKIPPED messages for the test files?

Yes. I have a Python package skeleton project that provides a number of example test_*.py files. These test files are useful references for developers writing their own test files in the same directory (tests/). Running pytest -v tests/ results in quite a bit of noise while you are trying to view the output of the run on your "real" test files.

The only ways I found to suppress the "SKIPPED" output on all these example tests were to:

  1. reference these example files on the command line with the --ignore=<file-path> option, which I find tedious; or
  2. rename the test files so that they fall outside the standard discovery pattern. However, I was trying to emphasize correct test file names while having the files ignored during a test run.

My proposal, I feel, extends nicely from the existing mechanism for "skipping" all tests in a file.
It provides a means to enable or disable an entire file for discovery by commenting or uncommenting a single line inside each file (i.e., something like pytestmark = pytest.mark.ignore).

Thanks for your consideration.

@dev351 closed this as completed Aug 22, 2018
@dev351 reopened this Aug 22, 2018
RonnyPfannschmidt (Member) commented

there are multiple mechanisms for ignoring files already, which actually prevent them from being collected at all

so i don't see any value in an ignore marker
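
For reference, one such existing mechanism is the collect_ignore list in a conftest.py, which excludes the named files from collection entirely; a minimal sketch, with placeholder file names:

```python
# tests/conftest.py
# Modules listed here are never collected, so they produce no report
# output at all, not even SKIPPED lines under -v.
collect_ignore = ["test_example_one.py", "test_example_two.py"]

# Recent pytest versions also support glob patterns:
collect_ignore_glob = ["test_example_*.py"]
```

The --ignore=<path> command-line option and the pytest_ignore_collect hook achieve the same effect.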

blueyed (Contributor) commented Aug 25, 2018

Related: #3730 ("silently skip a test").

@nicoddemus added the type: enhancement and type: proposal labels and removed the status: needs information label Oct 13, 2018
h-vetinari commented

This would be extremely useful to me in several scenarios.

Pytest makes it easy (especially through parametrization and parametrized fixtures) to test a cartesian product of parameter combinations. It's slightly less easy (not least because fixtures can't be reused as parameters) to reduce that cartesian product where necessary.

Often, certain combinations simply do not make sense for what's being tested, and currently one can only really skip or xfail them (unless one starts complicated plumbing and splitting up of the parameters/fixtures).

But skips and xfails get output to the log (and for good reason: they should command attention and eventually be fixed), so it's a simple consideration that one does not want to pollute the results by skipping invalid parameter combinations.

For this task, pytest.ignore would be the perfect tool. That this would be very intuitive is underlined by the fact that I wanted to open just such an issue before I found the exact same request here already.
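
As a rough sketch of the workaround being described here, one can prune the cartesian product by hand before parametrizing, so invalid combinations never appear in the report at all; the parameter names and the validity rule below are purely illustrative:

```python
import itertools
import pytest

DTYPES = ["int", "float"]
MODES = ["strict", "lenient"]

def _is_valid(dtype, mode):
    # Hypothetical rule: "strict" mode only makes sense for ints.
    return not (dtype == "float" and mode == "strict")

# Build the reduced product up front; dropped combinations are never
# collected, so nothing is reported for them.
PARAMS = [p for p in itertools.product(DTYPES, MODES) if _is_valid(*p)]

@pytest.mark.parametrize("dtype,mode", PARAMS)
def test_combination(dtype, mode):
    assert (dtype, mode) in PARAMS
```

This is exactly the kind of plumbing the comment above calls complicated, and it does not help when the parameters come from fixtures, which is why a silent pytest.ignore would be attractive.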

@blueyed blueyed added the topic: reporting related to terminal output and user-facing messages and errors label Nov 14, 2018
RonnyPfannschmidt (Member) commented

@h-vetinari ignoring a file and deselecting tests are two very different kinds of things, please open a new issue

i'm going to close this one as wont-fix, as it's about ignoring whole files

h-vetinari commented

@RonnyPfannschmidt
I think that #3730 is very close to the sort of issue I'd have opened, so I commented there. The one improvement that would help is to change that issue's title to include the word "ignore" somehow.
