coverlet is 'blind' to code coverage data in certain XUnit test projects #1311


Closed
jasells opened this issue Mar 13, 2022 · 5 comments

jasells commented Mar 13, 2022

Similar to #578

Microsoft's coverage collector in VS Enterprise/Test sees the coverage just fine, but coverlet is "blind" to all coverage that should be available from a particular XUnit test project. It seems to be linked to a particular call into a dependency library that starts a thread.

If anyone else is experiencing this, this link explains how to use Microsoft's collector in your pipeline to collect coverage data (without VS Enterprise). This is my plan B:
https://medium.com/swlh/generating-code-coverage-reports-in-dotnet-core-bb0c8cca66
I can verify that Microsoft's coverage collector works.
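For reference, the Microsoft collector can be invoked from the CLI roughly like this (a sketch; the exact pipeline wiring in the linked article may differ):

```shell
# Use the built-in Microsoft data collector (Windows agents;
# writes a .coverage file under TestResults/):
dotnet test --collect:"Code Coverage"
```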

I have a repro here. The issue revolves around a .NET Standard library project referencing a third-party package/lib: NetMQ. The symptoms are the same as issue #578, so possibly a similar underlying issue?

See the readme in the repro for more details; the highlights are:

  • All tests are run, results are displayed.
  • MissingCoverlet.Repro.Tests.SubscriberTests.NoCoverage() test does not collect coverage data while MissingCoverlet.Repro.Tests2.UnitTest1.CoverageIsCollected does (the difference is a single line calling into the 3rd party lib).
  • Even if other tests are added to the test project MissingCoverlet.Repro.Tests, NO tests will be seen by coverlet (or at least, no coverage data is collected)
  • VS Code Coverage analysis does see coverage in all cases

Note: I have run this on Windows 10 both manually (in VS) and in a DevOps pipeline, with the same results.

petli (Collaborator) commented Mar 14, 2022

@jasells, a first comment on the repro: you shouldn't typically use both coverlet.msbuild and coverlet.collector. From the output it looks like coverlet.msbuild is the one actually used, and that driver suffers from a limitation in the vstest platform: it can only collect coverage on the process exit event, and has to do it quickly.

I suspect starting the NetMQ thread may interfere with this, so that the process exit event doesn't trigger. The method your repro calls starts a foreground thread, so you can try instead starting NetMQ as a background thread so that it doesn't interfere with process exit:
https://docs.microsoft.com/en-us/dotnet/api/system.threading.thread.isbackground?view=net-6.0#remarks
https://github.com/zeromq/netmq/blob/0b58c232799ce578868524814ce7a59ed13a0a37/src/NetMQ/NetMQPoller.cs#L438
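To illustrate the difference (a minimal C# sketch, not the actual NetMQ code): a foreground thread keeps the process alive and so delays or blocks the process exit event, while a background thread does not.

```csharp
using System;
using System.Threading;

// A foreground (default) thread keeps the process alive until it
// finishes, so the process exit event -- which coverlet.msbuild relies
// on to flush coverage data -- fires late or not at all.
var worker = new Thread(() => Thread.Sleep(Timeout.Infinite));

// Marking it as a background thread means the runtime will not wait
// for it on exit, so process exit (and the coverage flush) proceeds:
worker.IsBackground = true;
worker.Start();

Console.WriteLine("main done");
// Main returns here; the background thread is torn down with the process.
```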

You can also try switching to using coverlet.collector which collects coverage statistics on the test run finished event, and is thus not impacted by any issues around the limitations for the process exit event.
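A sketch of the test project setup this advice suggests, with only coverlet.collector referenced (package versions here are illustrative, not prescriptive):

```xml
<!-- In the test .csproj: reference coverlet.collector only,
     not coverlet.msbuild (versions are illustrative) -->
<ItemGroup>
  <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.1.0" />
  <PackageReference Include="xunit" Version="2.4.1" />
  <PackageReference Include="xunit.runner.visualstudio" Version="2.4.3" />
  <PackageReference Include="coverlet.collector" Version="3.1.2" />
</ItemGroup>
```

The collector is then driven via `dotnet test --collect:"XPlat Code Coverage"`.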

@petli petli added the waiting for customer Waiting for customer action label Mar 14, 2022
jasells (Author) commented Mar 14, 2022

You can also try switching to using coverlet.collector which collects coverage statistics on the test run finished event, and is thus not impacted by any issues around the limitations for the process exit event.

No luck there. Same issue.

However, re:

I suspect starting the NetMQ thread may interfere with this, so that the process exit event doesn't trigger. The method your repro calls starts a foreground thread, so you can try instead starting NetMQ as a background thread so that it doesn't interfere with process exit:

Yes! Starting that thread as a background thread did result in coverlet gathering coverage data! TY! I did have to update the NetMQ package to the latest, as the older version I used in the repro did not expose the isBackground param in the public API. (I linked to the latest source though... sorry if that caused confusion.)
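For anyone landing here later, the workaround looks roughly like this (a hedged sketch; the exact overload name and parameter order depend on your NetMQ version, and are assumed here from this thread rather than verified against a specific release):

```csharp
using NetMQ;

// Sketch of the workaround described above: start the poller thread as
// a background thread so it does not block process exit. The overload
// shown is an assumption -- check your NetMQ package's public API.
var poller = new NetMQPoller();
poller.RunAsync("netmq-poller", isBackground: true);
// The poller thread no longer keeps the process alive, so coverlet
// can flush its coverage data on process exit.
```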

I have been picking at this for 2+ years and finally got the time to track it down to the source. This was my first time using XUnit and coverlet, so I thought I was doing something wrong, but as I used XUnit and coverlet on more than a dozen other repos and never had an issue, I slowly built the evidence I needed that it was somehow related to NetMQ.

In my defense, even if I had known about the process exit event, NetMQ's API says that it starts a background thread, even though looking at the source it doesn't.

@petli petli removed the waiting for customer Waiting for customer action label Mar 14, 2022
petli (Collaborator) commented Mar 14, 2022

Great that it's solved, @jasells!

@petli petli closed this as completed Mar 14, 2022
jasells (Author) commented Mar 14, 2022

@jasells, a first comment on the repro: you shouldn't typically use both coverlet.msbuild and coverlet.collector. From the output it looks like coverlet.msbuild is the one actually used, and that driver suffers from a limitation in the vstest platform: it can only collect coverage on the process exit event, and has to do it quickly.

Is there anything I can read to get a little more insight into this?

And what does MS's tool do differently than coverlet that allows it to not be subject to the same issue? They both seem to be using vstest...

petli (Collaborator) commented Mar 14, 2022

https://github.com/coverlet-coverage/coverlet/blob/master/Documentation/KnownIssues.md discusses this a bit more, and links to a vstest issue with further information. I don't know what the MS coverage collector might be doing differently, but given what you described I would hazard a guess that it collects coverage stats continuously as the tests run, rather than transferring them at the end.

We made an attempt at solving that using shared memory, but (somewhat ironically) that implementation made coverlet unusable for the .NET runtime coverage testing. There's a discussion initiated here about having another go at it: #1251
