Solver chooses mono-traversable-0.4.0 when attempting to build stack-0.1.3.0 #2759
Wrong. There is an issue in stack (and its dependencies): mono-traversable-0.4.0 has an upper bound on base of "< 5", yet it does not compile with GHC >= 7.8. Nonetheless I agree it looks strange that cabal chooses that version of mono-traversable. The behaviour can be observed in the install plan for chunked-data-0.2.0. The diff of the plans without/with a manual constraint on mono-traversable is
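(The diff itself isn't shown above; a comparison like it can be reproduced with cabal's dry-run mode. This is only a sketch of the procedure, with flags as in cabal-install of that era, not the original output.)

    # compare the solver's plan for chunked-data with and without the manual constraint
    cabal install --dry-run chunked-data-0.2.0 > plan-default.txt
    cabal install --dry-run --constraint 'mono-traversable >= 0.9' chunked-data-0.2.0 > plan-constrained.txt
    diff plan-default.txt plan-constrained.txt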
It appears that cabal has to choose between non-latest
It seems to me that
Also: Just a rant, please take it as such. It is a response to the community's attitude towards cabal in general, not this particular report.
@snoyberg maybe the best practical solution would be to x-revision the bound on base to "< ghc-7.10" for mono-traversable <= 0.4.0? (Not that I am a big fan of x-revisions.)
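(Concretely, such a revision would presumably amount to tightening the build-depends entry in mono-traversable.cabal along these lines; the exact bound is an assumption, with base < 4.8 being the base shipped before GHC 7.10.)

    -- hypothetical revised dependency in mono-traversable.cabal;
    -- base < 4.8 rules out the base versions that ship with GHC 7.10 and later
    build-depends: base >= 4 && < 4.8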
Ah, the vector-algorithms restrictive upper bound is indeed the problem, thanks for figuring that out:
Revisions published. I can't really test this though, since the problem is non-deterministic (or at least appears to be, since my
I gave a bug report, with what I did, what I expected to happen, and what happened instead. Expecting more than that from end-users is unreasonable, as it assumes that only people with a certain level of expertise should be allowed to alert the developers when funny shit happens.
I did spend an hour or two trying to track down the culprit. Not everything that is wrong is due to Cabal. In fact, were it not for Cabal and Hackage, then Haskell would probably still be quite successfully "avoiding success". So thanks.
When I now do
How about now? I added some preferred-version info and a lower bound in chunked-data.
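(Presumably the lower bound is something like the following entry in chunked-data.cabal; the exact form is a guess based on the constraint that made the build succeed.)

    -- hypothetical lower bound in chunked-data.cabal, steering the solver
    -- away from the old mono-traversable releases that no longer build
    build-depends: mono-traversable >= 0.9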
@tebello-thejane I agree with your criticism of what I seem to expect of end-users; that asks too much. But then you should admit your limited knowledge and ask where the problem might lie, not make a bold claim like "So, this is not an issue with stack [..]". There is a big difference in tone between "there is an error in this tool" and "I wish this command succeeded; please help". Also, I suggest you mention how much time you invested in trying to diagnose the issue. I am aware that it was @snoyberg who first claimed a fault in
@snoyberg seems all fine now :)
Looks like it wasn't a solver bug after all, so closing. @lspitzner Thanks for diagnosing the cause of the error.
I'm not going to go much deeper on this... but equating "technically speaking this is doing something that can be justified even though no one would ever want it to behave this way" with "not a bug" has been a large source of frustration with cabal. The fact that the dependency solver chose an ancient version of mono-traversable instead of a recent version of vector, when release dates would provide a clear indication that the other way around would be more likely to succeed, is the kind of technically-correct-yet-practically-wrong thing that causes users grief constantly.
@23Skidoo I've investigated that before, and yes, release dates are available in the form of timestamps on the package descriptions in the index tar.
So I never said that this definitely is not a bug, or that the implementation could not be improved to handle such a situation better. I left room for more concrete input. The criticism seems inappropriate. And regardless of interpretation, such vague criticism is not constructive.
@lspitzner Please look at the context of my comment; it was clearly not aimed at you, nor was it a vague criticism, but an actual suggestion for a change in approach to which issues are treated by the cabal team as bugs and which aren't. In fact, the rest of my comment was very concrete about how I think cabal could do better here. I'm actually not sure how someone who read my entire comment could possibly interpret it as a vague criticism.
@ttuegel IIRC the release dates are not reliably available in the 00-index tarball, as they get shadowed by cabal edits.
PS: I think I've stumbled over a couple of other pitfalls you'd run into by relying on timestamps for the solver, but I need to think it through a bit more... I've been toying with the idea of using timestamps for guiding supervised
@tebello-thejane for future reference, this sounds a lot like the kind of issue the Hackage Trustee issue tracker over at https://github.com/haskell-infra/hackage-trustees/issues is for, as this isn't cabal's fault (even though I see a point that the
@23Skidoo You could do me and others who occasionally work on the solver a huge favour if you'd turn the enhancement request to use package timestamps in the solver into a new issue. It's difficult to figure out the actual reason this is still open from a bug report with a lot of discussion, where the important info is buried deep inside a few comments. The new bug can of course reference this one.
I was not aware of the Trustee issue tracker, @hvr. Thanks for the link.
@hvr You're quite right! I forgot momentarily that
I don't think the idea here is to rely on the timestamps in general, but only to use them to prioritize conflicting constraints; i.e. prefer constraints that rule out older packages to constraints that rule out newer ones.
It needs to be traversed anyway when creating the index cache; the upload date could be saved in the cache as well. |
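(A minimal sketch of that idea, assuming the tar package's API and taking the newest header timestamp seen for each .cabal entry; all names are illustrative, and as noted above cabal edits would still shadow the original upload times.)

    import qualified Codec.Archive.Tar as Tar
    import qualified Codec.Archive.Tar.Entry as Tar
    import qualified Data.ByteString.Lazy as LBS
    import qualified Data.Map.Strict as Map
    import Data.List (isSuffixOf)

    -- walk 00-index.tar once and record a timestamp per .cabal entry,
    -- keeping the newest one if an entry appears several times
    indexTimestamps :: FilePath -> IO (Map.Map FilePath Tar.EpochTime)
    indexTimestamps indexPath = do
      bytes <- LBS.readFile indexPath
      let entries = Tar.read bytes
      -- on a malformed archive this sketch just gives up with an empty map
      pure (Tar.foldEntries record Map.empty (const Map.empty) entries)
      where
        record entry acc
          | ".cabal" `isSuffixOf` Tar.entryPath entry =
              Map.insertWith max (Tar.entryPath entry) (Tar.entryTime entry) acc
          | otherwise = acc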
Yes, but if the archive is append only, it becomes much larger, right? By a factor of the average number of package revisions. |
Right, with a persistent package-index this should work again, assuming the history won't be truncated. Btw, can we please create a new focused ticket like @kosmikus suggested, and more importantly specify exactly how the timestamps are to affect the solving process? That'd make it easier to reason about the semantics and test it on paper by trying to come up with counter-examples. |
@ttuegel some numbers: We currently have 8565 packages, 58397 package-versions, 4574 of those package-versions have a non-zero x-rev, and in total there were 5543 cabal edits so far. Right now if we created an uncompressed index-tarball with full history, it'd not be "much larger" but rather just roughly 10% larger. Compressed with an algorithm with a large enough dictionary and/or window-size should be able to bring the relative 10% increase down a bit. |
Trying to install the latest stack-0.1.3.0 causes the following failure:

However, specifying a constraint manually via

    cabal install --constraint 'mono-traversable >= 0.9' stack

succeeds. So, this is not an issue with stack, since it builds just fine when cabal is given a nudge.
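(For anyone hitting the same plan in the meantime, one way to make that nudge stick across invocations is a project-local cabal.config, assuming a cabal-install recent enough to honour its constraints field.)

    -- cabal.config next to the project, picked up automatically by cabal install
    constraints: mono-traversable >= 0.9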