allow to disable the asserts #135


Closed
pytestbot opened this issue Apr 10, 2012 · 6 comments
Labels
type: proposal proposal for a new feature, often to gather opinions or design the API around the new feature

Comments

@pytestbot
Contributor

Originally reported by: Anonymous


Hey - we use py.test as our product test platform, and we find it very useful.

One feature we use that might be nice to share with the world is disabling asserts during test development.
When we write integration and system tests, which take a long time to run, we assert a lot of things (the single-assert-per-test methodology is nice, but in practice it would make the tests take far longer to run), and most of the time (again, during development) we fail on asserts.
Moreover, we use mocks to speed up the development of this kind of test (to find NameErrors and other things we don't want to waste half an hour to discover), but the mocks tend to fail on the first assert they meet.

Therefore we took your great framework and wrote a plugin that monkey-patches the assert rewrite module, substituting the following class for the AssertionRewriter class:

{{{
#!python
import ast

from _pytest.assertion.rewrite import AssertionRewriter


class AssertionRemover(AssertionRewriter):
    def visit_Assert(self, assert_):
        # Replace every assert statement with a bare `pass`.
        return ast.copy_location(ast.Pass(), assert_)
}}}

It also patches PYC_EXT to '.pycNoAsserts'.
That works nicely and has helped us a lot, but because the whole monkey patch is kind of fragile (the code of the rewrite module might change someday and we would never know), it would be nice if this option were supported natively by py.test. (We know it sounds like a strange feature to ask for - the ability to disable asserts in tests - but it works and allows us to reduce our development time.)
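The monkey patch above depends on pytest internals, but the underlying idea can be demonstrated with the standard library alone. The following sketch (the names `AssertRemover` and `check` are invented for illustration) uses an `ast.NodeTransformer` to swap every `assert` for `pass` before compiling:

```python
import ast


class AssertRemover(ast.NodeTransformer):
    """Replace every `assert` statement with `pass` (illustration only)."""

    def visit_Assert(self, node):
        return ast.copy_location(ast.Pass(), node)


source = """
def check(x):
    assert x > 0, "x must be positive"
    return x * 2
"""

tree = AssertRemover().visit(ast.parse(source))
ast.fix_missing_locations(tree)

namespace = {}
exec(compile(tree, "<demo>", "exec"), namespace)
print(namespace["check"](-5))  # prints -10: the assert was stripped
```

pytest's real `AssertionRewriter` is built on the same `ast` visitor machinery, which is why a one-method override like the one in the report is enough to change its behavior.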


@pytestbot
Contributor Author

Original comment by Ronny Pfannschmidt (BitBucket: RonnyPfannschmidt, GitHub: RonnyPfannschmidt):


that use-case does indeed sound rather strange

before accepting we'd like to get a better understanding about the reasons you do things the way you do them

your general description leaves me with a "something is not right/missing" feeling

@pytestbot
Contributor Author

Original comment by Ronny Pfannschmidt (BitBucket: RonnyPfannschmidt, GitHub: RonnyPfannschmidt):


let me recap my basic understanding of the problem

in real tests you use system objects, stuff is fine

in quick tests you use mocks, asserts fail

is it correct that you want to skip over assertions if a certain mock object is used?

@pytestbot
Contributor Author

Original comment by Anonymous:


Thanks for your quick response.
I will try to give more details about how we work in order to clarify the issue.

We develop a complex project, not in Python, but we use py.test and Python to write our integration tests.
Most of our tests look like this:
{{{
#!python

def test_scenario(a_remote_object):
    res1 = do_step1(a_remote_object)
    check_results(res1)

    res2 = do_step2(a_remote_object)
    check_results(res2)

    res3 = do_step3(a_remote_object)
    check_results(res3)

    ...
}}}

As you can see, our tests are just a series of actions against a remote object.
What we are trying to test is the remote object's behavior (not the API for working with it).
The problem is that working against the remote object is very, very slow (as I said, it is a complex project).
So when we develop an integration test, we run it against mock objects.
We don't want to run the tests against mock objects after we have finished developing them, because, as mentioned before, we test the remote object.
We use mocks to discover Python NameErrors/TypeErrors and mistakes in the test flows (most of the tests are more complex than three-step scenarios).
The problem is that our mock objects don't entirely mock the remote objects, so they tend to be failed by the asserts in the test at early steps. It is frustrating to work like this.
The solution found by one of our programmers is to use the AST rewriting that pytest contains to add a --cancel-asserts command line option that removes the asserts from the tests and replaces them with pass or printing.

I guess the feature we are really asking for is the ability to customize the use of py.test's AST rewrite code.
Right now we just hack around it, but we think it would be great if we could implement a hook that returns our own implementation of the AssertionRewriter class.
For example, wouldn't it be nice if, instead of a failed assert, you could run your tests in a mode that opens a pdb shell at the assert site and lets you continue the test after seeing what failed?
All that is required is to replace:
{{{
assert (expr) (msg)
}}}

with nodes that do:

{{{
if not (expr):
    print "Assert Failed : (msg)"
    import pdb
    pdb.set_trace()
}}}

And the options are endless.
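The rewrite sketched above can be prototyped without touching pytest internals. Here is a stdlib-only sketch that turns each `assert expr, msg` into an `if not expr:` branch that records the failure and keeps going (all names here, including `failures` and `scenario`, are invented for this example; a pdb variant would call `pdb.set_trace()` in the branch instead of appending to a list):

```python
import ast


class AssertReporter(ast.NodeTransformer):
    """Turn `assert expr, msg` into code that records the failure in a
    `failures` list and continues, instead of raising AssertionError."""

    def visit_Assert(self, node):
        # Build: if not expr: failures.append(msg)
        msg = node.msg if node.msg is not None else ast.Constant("assert failed")
        record = ast.Expr(
            ast.Call(
                func=ast.Attribute(
                    value=ast.Name("failures", ctx=ast.Load()),
                    attr="append",
                    ctx=ast.Load(),
                ),
                args=[msg],
                keywords=[],
            )
        )
        branch = ast.If(
            test=ast.UnaryOp(op=ast.Not(), operand=node.test),
            body=[record],
            orelse=[],
        )
        return ast.copy_location(branch, node)


source = """
def scenario():
    assert 1 == 2, "step 1 failed"
    assert 2 == 2, "step 2 failed"
    assert 3 == 4, "step 3 failed"
    return "finished"
"""

tree = ast.fix_missing_locations(AssertReporter().visit(ast.parse(source)))
ns = {"failures": []}
exec(compile(tree, "<demo>", "exec"), ns)
result = ns["scenario"]()
print(result, ns["failures"])
# prints: finished ['step 1 failed', 'step 3 failed']
```

The scenario runs to completion and every failed step is reported, which is exactly the continue-past-failures behavior the comment describes.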

@pytestbot
Contributor Author

Original comment by Benjamin Peterson (BitBucket: gutworth, GitHub: gutworth):


I'm reluctant to expose such intimate implementation details of py.test as hooks.

@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


Anonymous, the example you are giving does not directly contain asserts. Are they contained in the check_results function?

I am also reluctant to expose AssertionRewriter, and I don't think you could implement this pdb behaviour easily - or did you try that?

Did you try to route your checks through a function which could be instrumented, via e.g. --cancel-asserts, to ignore assertion errors?

Maybe in the end a general --cancel-asserts like option might make some sense. Probably spelled --assert=remove. I am not convinced yet.
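One way to read the suggestion above: instead of hooking the rewriter, funnel every check through a single helper that a command-line switch can soften. A minimal sketch, assuming a hypothetical `CANCEL_ASSERTS` flag (the real option name and the wiring to a pytest option are not specified in the thread):

```python
# Hypothetical flag; imagine it set from a --cancel-asserts style option.
CANCEL_ASSERTS = False


def check(condition, message=""):
    """Raise on failure in strict mode; only report it in relaxed mode."""
    if condition:
        return
    if CANCEL_ASSERTS:
        print("check failed (ignored):", message)  # report and keep going
    else:
        raise AssertionError(message)


# Strict mode: a failing check raises, like a plain assert.
try:
    check(1 == 2, "numbers differ")
except AssertionError as exc:
    print("raised:", exc)

# Relaxed mode: failures are only reported and the test continues.
CANCEL_ASSERTS = True
check(1 == 2, "numbers differ")
check(2 == 2, "fine")
```

This needs no AST rewriting at all, which is presumably why it was offered as the simpler alternative.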

@pytestbot
Contributor Author

Original comment by Ronny Pfannschmidt (BitBucket: RonnyPfannschmidt, GitHub: RonnyPfannschmidt):


seems like this is now a dead end and more constructive discussion did not happen, closing

@pytestbot pytestbot added the type: proposal proposal for a new feature, often to gather opinions or design the API around the new feature label Jun 15, 2015
mgorny pushed a commit to mgorny/pytest that referenced this issue May 27, 2023
Revert "dist: Remove support for building and distributing *.egg files"