Support for minimum language version enforcement in macro deps #3466
Note that per-library language versions can also cause problems here. Using a macro in a library which has had its version downgraded via a language version comment is probably generally unsafe, if that language version is below the min SDK constraint of the macro package.
A macro should probably have a way to access the language version of the library it is generating code for, so it can choose which solution to use for a given library, or fail gracefully or informatively if it cannot generate code for that language version. There is nothing wrong with a macro refusing to work for a particular library; you just shouldn't be using it then. The problem comes if the macro library changes which library versions it works for. That should probably be a breaking change and require a major version increment for the macro.

Which language version a library supports can also affect the introspection API, and definitely the code generation API, which may need to be versioned. Macros are really contravariant in language version, since they don't (only) consume code, they produce it. The min-SDK needed by a macro package is the SDK needed to run it.
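To make that concrete, here is a minimal sketch of "choose a solution per language version, or fail informatively". Everything in it is hypothetical: `TargetLibrary`, its `languageVersion`, and `MacroException` are illustrative stand-ins, not part of any existing macros API.

```dart
// Hypothetical stand-ins; no real macros API is assumed here.
class TargetLibrary {
  final (int major, int minor) languageVersion;
  const TargetLibrary(this.languageVersion);
}

class MacroException implements Exception {
  final String message;
  MacroException(this.message);
  @override
  String toString() => 'MacroException: $message';
}

/// Picks an output strategy from the target library's language version, and
/// fails with an informative error below the oldest supported version.
String generateFor(TargetLibrary library) {
  final (major, minor) = library.languageVersion;
  if (major > 3 || (major == 3 && minor >= 0)) {
    // Newer libraries can receive output that uses records.
    return '(int, int) coordinates() => (0, 0);';
  }
  if (major == 2 && minor >= 17) {
    // Older libraries get an equivalent, pre-records shape.
    return 'List<int> coordinates() => const [0, 0];';
  }
  throw MacroException(
      'This macro cannot generate code for language version $major.$minor; '
      'it supports 2.17 and above.');
}
```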
I don't think it is macros' responsibility to emit an error if they are run on a library with an incompatible language version. For me, that's 100% the job of the version solver.
In any case, I don't think macro authors should have to consider this scenario. It is too unintuitive for folks to think about when working on their macros.
They can, but as @rrousselGit says this would be much better solved by the version solver.
The big problem here is that a macro might fail on some transitive dependency, not your own code.
Breaking changes are one option, but that would cause a lot of churn in macro package versions, churn that could be avoided if the constraint were expressed in another way. Especially for highly used macros, major version releases are very painful.
This is true but it matters less - it affects the package being actively developed and not transitive packages. It is definitely possible that I might not be able to use some new language feature immediately if I am using a macro that can't handle it, but I can just not use that feature until the macro does support it. However, when a macro starts using a new feature itself, that macro is no longer usable from any package on an earlier language version, so it can break any of your transitive deps. The solution would be to restrict the macro version, but you might not even have a dependency at all on that macro, so it both feels wrong to have to add that dependency just to enforce an older version, and it is probably much harder to figure out that that is what you need to do.
This is exactly the point of this issue: discussing whether we can have this second constraint expressed in such a way that it can be a part of the pub solve :).
I also doubt that many macros will use different constraints in their implementation than in the generated code. At least that wasn't my experience with build_runner.

This is in part caused by lints. I'm sure that mistake is going to be quite common, even if the consequences aren't that bad.
For build-to-source code generators the problem is less pronounced, because you ship the generated code with your package. So the output is fixed at a point in time where it was valid, and you don't have this "transitive deps got randomly broken" issue. I think that is why you probably didn't get issues filed; the fix was always easy (and local, just update your SDK constraint). For macros it is much more likely people would get broken in such a way that they can't fix the problem themselves, and will file issues.
If we were able to guarantee that we'd have a reliable
Can you elaborate more on this?
It might break our performance targets, but if a macro outputs code at a language version newer than the library it applies to, that could be detected and reported. The reverse situation (and main topic of the issue) would be handled by changing the pub solver so that a new-language macro is never used with old-language usage.
Ah ok, that is an interesting point, but it also is less of a concern generally, given the fix is local :).
Yeah, this was the solution @rrousselGit suggested as well. We would have to know that a package has any macros at all, ideally by only looking at the pubspec, and we weren't originally planning on requiring anything special in the pubspec for macros. But it is probably the simplest solution.
Yeah, I think it's also something users may want to see on the pub site.
I don't think we should assume that that is possible.

If a macro can only be applied to Dart 3.0+ libraries, because the code it generates uses records, then using the version solver to prevent misuse means that the macro cannot be used by any package which has a min-SDK below 4.0. Any package with a min-SDK below 4.0 can be used by another package which has a min-SDK of 3.X, and which has a library with a lower language-version override. Version solving doesn't work for that scenario.

The min-SDK of a macro is the SDK needed to run the macro. (If it can only generate code for a higher language version than that, it should probably up its min-SDK too.) The "min-supported-library-language-version" of a macro is the minimum language version it can be applied to. That should be checked for each macro application. We can, and probably should, make that part of the macro API, something the macro must declare, and that the analyzer can check when it sees a macro application. Then the macro should still be able to check the language version it's generating code for, because it might want to generate different code for different language versions.

The upper bound is more tricky. When we add new language features in the future, that changes the introspection API, and possibly the code-generation API. The macro API is used both for reading and writing, and is therefore, generally, language version invariant. We can try to promise some kind of backwards compatibility, so you can always generate old code. But it won't be allowed to hold back language evolution, so we might as well be prepared for how to handle breaking changes.

We could have a "macro API version", so that any change (or any breaking change, for some definition) to the macro APIs increases the version, and a macro must declare itself compatible with the new version before it gets access to it. Maybe even allow it to declare a specific entry point per macro API version, so that we can provide completely different types if needed. (Maybe it just needs to implement a version-specific interface.) We can then provide the newest API that a macro supports, for it to use, also for older language versions if they are compatible. If not, we'll have to provide the old API version for those libraries.
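A minimal sketch of what such a declared "min-supported-library-language-version" might look like, assuming a hypothetical interface that the analyzer could check before applying a macro; none of these names exist in any published API.

```dart
// Hypothetical shape only; none of these types exist in a published macros API.

/// Stand-in for a language version, e.g. (3, 0) for Dart 3.0.
typedef LanguageVersion = (int major, int minor);

/// A macro could be required to declare the lowest library language version
/// it can be applied to, so the analyzer can reject an application in an
/// older library before the macro ever runs.
abstract interface class VersionedMacro {
  LanguageVersion get minimumSupportedLibraryVersion;
}

class MyRecordEmittingMacro implements VersionedMacro {
  const MyRecordEmittingMacro();

  @override
  LanguageVersion get minimumSupportedLibraryVersion =>
      (3, 0); // the generated code uses records
}
```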
The lower bound is what we are discussing with the version solver.
Those packages cannot use macro syntax without changing their lower constraint to a version high enough for the pub solver to work. The SDK must be new enough if the macro package was in the version solve. Any other packages in the version solve with older language versions necessarily will not reference or use any macro. Any other packages that use the macro with a new-but-not-new-enough language version will be constrained by the pub solve and not included in the solution.
All that said, it's likely that adding these types of backwards constraints would be challenging or impossible in the pub solver... @jonasfj
@lrhn IMO the case of
I think it would be more reasonable to flip the logic. That would enable the macro to run on older language versions, as opposed to the other way around, where even though the macro supports
I don't understand why this wouldn't be possible, can you explain?
I agree that language version comments complicate things. I am not sure if you can alleviate this with lints or something. Ultimately, I don't think these are very common any more, and we don't have to solve for that case. In other words, I would consider this a "don't let perfect be the enemy of good" scenario. Having some level of checking in the pub solver is better than having none.
I agree this could work if macros always do breaking changes when they update this version. I would just like to avoid that solution if possible.
I agree but I am also not nearly as concerned about the reading part. The difference is the kind of breakage that happens. The "reading" scenario is (mostly) going to occur when a user explicitly opts their own code into a newer language version. At that point, it is possible that some macro might fail, but the user can just reset their language version back to what it was (or possibly upgrade the macro package). We also are planning on using a package to expose the macro APIs (sort of similar to how dart_internal works). We could do breaking changes in that package when new features are released, if we thought it was useful. Which is related to the last point:
We are planning to use a package for this versioning. It will only directly export some internal library, but it will provide a way to version the macro API separately from SDK versions. This package will have tight SDK constraints.
A few thoughts.

Generally speaking, macro implementations should not be super excited about using new language features for implementation, by which I mean, concealed in the generated code. Let's take the addition of records to the language as an example. As a macro author I don't think I would update my existing macro implementation to generate implementation code that uses records--I know it would make the macro apply to less code, so there would have to be a big benefit, and there likely isn't one. (Switch to records for better memory/performance?)

On the flip side, if a macro is applied to code that uses records in the API, there is a good chance the macro generated code also wants records in the API. So for example if a field in an annotated class has a record type, the generated code will likely want to mention that record type too. Then this comes almost for free: it takes input code with records to trigger output code with records, so the language version automatically is high enough.

Stretching a bit, it's possible to imagine a case where this breaks: what if the macro is applied to a file with too low a language version for records, but that file references source with a higher language version that does use records, and it references it in a way that triggers the macro to want to generate with records. But is that something we'll see in practice? In the examples I've considered so far it looks like not: the main information a macro is going to want to pull from distant sources is about which types have that same macro applied, so it can assume the features it itself generates. It would be useful if we can find concrete examples.

The most important concrete example we have so far is #3466 (comment) about lints: lints that force generated code to move to new features are a big problem :) ... if we don't do that--if we instead encourage generated code to stick to the minimum language version possible ... then what kind of problems are we left with? Thanks.
I do generally agree that if macros just stay somewhat off of the bleeding edge of the language, that likely avoids many of the problems. My main concern is communicating that to macro authors in a way that is reliable and doesn't require them to break their users before understanding the issue.
I think the main issue with staying somewhat off of the bleeding edge would be that a macro written with Dart 2.x (assuming we could write macros with it) wouldn't have access to the AST/elements related to 3.0 features. So if that macro were to be applied on a sealed class, the macro couldn't do something special for it without bumping the minimum necessary Dart version.
A macro can usually still support newer language versions without dropping support for older ones. It can increase its min SDK constraint (or macro API package constraint) and still be applied to packages on older language versions. Typically it will only need to actually emit code using newer language features if they appear in the library it is running against. Consider for instance extension types: a macro can support running on extension types (and generating augmentations of them) without dropping support for libraries that existed prior to extension types, since those libraries won't contain any extension types. Similarly it can support generating record types in APIs, because they will generally appear in the library it is generating code into as well. This breaks down for super types from other libraries at some point, but I don't see any way around that.
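A rough sketch of that "only emit new syntax when the input already uses it" strategy, written as plain string generation; `Field` and its `usesRecords` flag are made-up stand-ins for whatever a real introspection API would provide.

```dart
// Illustrative stand-ins for a real introspection API.
class Field {
  final String name;
  final String type;
  final bool usesRecords; // whether the field's declared type mentions a record
  const Field(this.name, this.type, {this.usesRecords = false});
}

/// Emits record syntax only when the annotated code already uses records,
/// so libraries on a pre-records language version keep getting older output.
String generateToRecordOrList(List<Field> fields) {
  final names = fields.map((f) => f.name).join(', ');
  if (fields.any((f) => f.usesRecords)) {
    // The target library must already support records for this input to exist.
    final typedFields = fields.map((f) => '${f.type} ${f.name}').join(', ');
    final namedArgs = fields.map((f) => '${f.name}: ${f.name}').join(', ');
    return '({$typedFields}) toRecord() => ($namedArgs);';
  }
  // Fallback that compiles on older language versions.
  return 'List<Object?> toList() => [$names];';
}
```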
I guess we will at some point have a recommended way for macro authors to test their implementations. Do we have the capability anywhere to detect language version? i.e. can we give macro authors a method:
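(For illustration, a sketch of the kind of helper being asked about. `requiredLanguageVersion` is hypothetical, and its naive body only hints at what a real, presumably analyzer-based, implementation would do.)

```dart
import 'package:test/test.dart';

/// Hypothetical helper: the lowest language version needed to compile
/// [generatedCode]. This naive sketch just looks for a couple of syntax
/// markers; a real version would parse the code properly.
(int, int) requiredLanguageVersion(String generatedCode) {
  if (generatedCode.contains('switch') && generatedCode.contains('=>')) {
    return (3, 0); // switch expressions need Dart 3.0
  }
  return (2, 12);
}

void main() {
  test('macro output stays compatible with Dart 3.0', () {
    // Imagine this is the captured output of running the macro under test.
    const output = 'int describe(int x) => switch (x) { 0 => 0, _ => 1 };';
    final (major, minor) = requiredLanguageVersion(output);
    expect(major < 3 || (major == 3 && minor <= 0), isTrue,
        reason: 'output must not require a language version above 3.0');
  });
}
```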
So that if they change the macro to output a 3.2 feature in this case, the test fails? That would be for unit testing. I guess for e2e testing they can easily do this with a test case input containing a library pinned to the older language version.
@jakemac53 The problem is, if a macro relies on a higher SDK to generate code for older language levels, folks may not have the new SDK installed. For instance, a macro may want to support both the stable and master channels of Flutter.
I guess that means that each major version of that package will have a tight SDK range, and if we release a new SDK which it is compatible with, we'll release a minor version update with the wider SDK range. If not, it's a major version increment of the package. That still means that the macro itself will either inherit the limited SDK range, no matter what range it claims compatibility with, or it will have a package dependency which spans multiple major versions.
Can we usefully introduce flexibility at the serialization boundary? It would be possible to have a macro built against a newer macros API talk to an older SDK, if the serialization is forwards/backwards compatible. The older SDK is guaranteed to not send data that the macro won't understand. The other way around is slightly more awkward: a macro built against an older macros API talking to a newer SDK. But we can have a well-defined concept of "unknown to this macros API version".
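As an illustration of the "unknown to this macros API version" idea, a tiny decoding sketch under an assumed string-based wire format; the enum and its values are invented for the example and say nothing about the real macro protocol.

```dart
// Invented wire-format handling; the real macro protocol may look nothing like this.
enum DeclarationKind {
  classDeclaration,
  mixinDeclaration,
  // Fallback for kinds introduced by an SDK newer than this macros API version.
  unknown;

  static DeclarationKind decode(String wire) => DeclarationKind.values
      .firstWhere((k) => k.name == wire, orElse: () => DeclarationKind.unknown);
}

void main() {
  // A macro built against an older API degrades gracefully on newer input.
  print(DeclarationKind.decode('classDeclaration'));         // classDeclaration
  print(DeclarationKind.decode('extensionTypeDeclaration')); // unknown
}
```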
That's just normal dependency solving. Running the macro code is the least of the issues here; that's just a dependency like any other, a late-applied dev-dependency. The possible problems are:
I think my conclusion is that successful macro packages will support generating code for any language version that is supported by their min-SDK, at least any version that they have supported at any time. A new macro package release will keep the major version number, and support more language versions, backwards compatibly, but never fewer. Not until the SDK itself stops supporting that language version.
If new features aren't needed by macros until Dart 4, does Dart even need those new features? 😝 |
It certainly places limits on what macros can do. And I'm not entirely sure it actually solves any fundamental problem. If
Then a macro in a package with an SDK constraint of But doing so will probably mean that, either:
I think I'm beginning to see how introducing a separately versioned macro API package could help here. Essentially, giving that package tight SDK constraints ties each of its versions to a range of language versions. Obviously, different versions of that package would then support different sets of language features. I don't know if this is sound; I can see @lrhn's argument that increasing the minimum supported language version is effectively a breaking change.

An alternative would be to have each macro definition declare the minimum supported language-version, either using an annotation, or by virtue of implementing an interface from either the macro API package or the SDK. Then it's perhaps reasonable to have pub enforce that constraint when solving.

Again, this would treat bumping the minimum supported language-version for a macro as a breaking change, something that requires bumping the major version, but leave it to the package author to decide if they wish to do so. The only difference would be some tooling to help package authors not accidentally break their stable APIs. I'm not sure how feasible it is to do this. I suppose a cheap version is simply to check if a previous version of the package being published has a lower lower-bound dependency on the macro API package.
A macro package can always refuse to work on a library with a language version above what it understands. Or it can choose to (try to) work, but only when applied to declarations that it understands. So a macro that works on classes can try to work on classes in a later language version library than what it knows about, but refuse to work on a kind of declaration it doesn't recognize. That is fine, as long as the macro still works on a lower-language-level library the way it used to, even on a new SDK.
I think all these discussions ignore one core issue: macro authors are only human. They will 100% make mistakes, likely multiple times. For the sake of having a good experience at scale, I think it is critical to have a good default. Expecting certain good practices of macro authors is not a good default IMO.
Agreed, but we do already expect package owners not to break their APIs unless they do a major version bump. That said, I'd agree that it is much easier to think it's fine to update your macro to generate code that uses new language features. That's not fine, and something to help users avoid doing so by accident is probably warranted. That could be warnings at publishing time, lints (maybe), different code-gen APIs for different language versions, or annotations in source or pubspec.yaml, or ?
For me this issue is not tied to breaking changes. A brand new user of a macro could still install the macro and end up with a compilation error, because their min SDK doesn't match the macro's min SDK. That's a bad experience, regardless of what happens when a new version is released.
The "my min-SDK is lower than any min-SDK for my dependency" problem is not unique to macro packages. It's just easier to hit when depending on packages that increase their min-SDK or major version often. If I have my package That's not something you can see just from looking at a single package version solve. If it worked, all it proves is that there exists a solution for the current SDK. I really do want all existing code to keep running successfully on a new minor version SDK increment. Maybe that's OK. But it's going to be annoying. The macro code generation API should definitely warn or err if a macro tries to generate new features into pre-feature libraries. But it shouldn't prevent a macro from generating older code into older libraries, or older code into newer libraries, where it will still work. The macro package can always require a min SDK version big enough for it to have the features it needs. |
This is (one reason) why packages should run tests against their lower SDK bound, as well as at least one of the latest stable or dev. If you can't get a version solve, it will fail on that job, and you can increase the lower bound such that it is actually real. And you shouldn't claim support for a version you can't test. Language versioning being tied to the min SDK does throw a bit of a wrench into this: you might end up needing to increase your language version at the same time. These things should never have been so tightly linked, but luckily most language versions are easy to migrate to.
Absolutely, and it will; this is really the primary reason for
The cost is likely to be very high though, and it isn't even clear this would be possible for all categories of breaking changes. Note that, being a package, we really don't have to address this right now. We will have the flexibility in the future to do either option (a regular breaking change, add a new library with the API changes, etc.).
I don't think the code generation API itself needs to do this; whatever tool is consuming the generated code is just going to treat it as normal code, and produce whatever errors it needs to if invalid code is generated. Macro generated code doesn't get any kind of back door to bypass static checking.
If you can still get a valid version solve, yes. It has never been the case that we guarantee a valid version solve across SDK releases. We should certainly strive to have that be the case the vast majority of the time, but with Flutter package pinning and max SDK constraints this guarantee isn't realistic.
I do think it is unique to macros, at least with reasonable coding practices.
And if a version of a dependency doesn't support your SDK, the solver simply won't select it. Macros currently don't meet that standard. Afaik, the only case where there's a similar issue without macros is when using packages tightly bound to the SDK.

I think we're going in circles. I don't want to push too much. But there's one thing I'm failing to understand at the moment. What's the problem with adding a new optional entry in pubspec.yaml? There's clearly some pushback against it. But I'm not sure why.
I'm saying that you must be able to get a version solve. Otherwise I'll consider that a breaking change for the SDK. A new minor version of the SDK should be able to achieve the same version solve as the previous version. That's the goal I'd aim for, and anything less than that is not a good user experience.
@leafpetersen do you want to weigh in here? We are going in circles.
@lrhn I don't understand what you're saying here. Whether or not it is a best practice to not have a max SDK constraint (or pin to an SDK), it is definitely possible for packages to do so. And moreover, even if there is a version solve, there is absolutely no expectation that the same version solve will work. We have packages that we tightly bind to the SDK: when we release an SDK, we rev those packages. That absolutely means that an existing solve may stop working, right? So I don't understand what you mean when you say "I'll consider that a breaking change for the SDK". This has nothing to do with the SDK, right?
Let me try to summarize the various concerns here, since it seems like the discussion has ranged fairly widely.

First, a macro is a package, and like any package, it requires a certain language version in order to run. This requirement is expressed via an SDK constraint like any other package. So I believe that we can leave this concern to the side. We have a solution for this (though we need to ensure that any changes we make for subsequent concerns don't break this).

Second, a macro is a consumer of code. It introspects on code. And when we add new features to the language, we need to worry about (at least): the macro erroring out because it encounters an unexpected thing during introspection; or the macro failing to generate the correct output because it is unaware of the new feature. Tools we might have to (partially or entirely) address this include doing something with versioning in the pubspec; versioning the package which does the introspection; or perhaps explicitly structuring the introspection API in such a way as to be as forgiving as possible.

Third, a macro is a producer of code. As such, it must produce code for use in a library which has a specific language version.
Is that a fair summary of the problem space? Am I missing anything major?

Given the above, I would encourage us to be very pragmatic in how we think about this. In an ideal world, we would provide guarantees about breakage, but in practice, I suspect that given the level of tight coupling here, we may need to aim for "not too much breakage". With that in mind, it might be worth trying to quantify a bit how serious the different issues are. Some comments.

I would highly recommend that we simply not try to deal with or address per-library language version overrides.

It feels to me that the introspection problem is relatively tractable via package versioning. Does that seem right? That is, we propose to release a package which does the introspection, and my vague sense is that we should in most cases be able to update this package to handle new language versions in a non-breaking way. Much like package:analyzer, this package will claim to handle "all future 3.X versions" when it may in fact not, but in practice (much like package:analyzer), this will likely mostly work out. Is that right? Is that the story? Do we need something more on the introspection side?

That leaves the code generation side, which is harder. Let's look at a few cases.

First, it seems to me that trying to handle the case where code which was valid in language version X becomes invalid in language version Y is probably intractable. Preparing for that in general would, I believe, simply require that every macro package always declare itself as only applying to one language version, and releasing a new macro package every time. In practice, we tend not to release these kinds of changes unless we expect the impact to be relatively small. And if we release such a change with larger impact, then I think (much as with null safety) we would need to explicitly work with macro authors to ensure a smooth transition. Am I missing anything here? Is there any feasible path to do better?

Assuming not, that leaves the case of a macro which generates code which uses a new feature not available in previous language versions. As discussed above this can happen either unconditionally, or conditionally based on what code the macro reads. Let's start with the unconditional case. This one is in some sense easier, it seems to me. A macro which unconditionally generates code that requires language version X should never be applied to a library at a lower language version.
Finally, that leaves the conditional code generation case. The macro propagates types/features that it reads into its output. If it reads something at language version X (possibly via a transitive dep), it may produce something at language version X.
Is that a fair summary of the problem space (even if you don't agree with my tentative map of the severity/possible solutions)?
This is a good summary! I think this may be missing the usability concern with regard to macros being code producers. The problem is twofold:
The macro author should always be able to know the language version of the library they are generating code into. Our macro API should be able to say, for each library, whether any known language feature is supported by it.

A macro author should not just generate, e.g., records because they see a record somewhere. Or maybe they do, because it's necessary to satisfy the macro annotations on a pre-record-feature library which refer to a post-feature typedef from another library, and then it just fails to compile. But if someone used a macro which generates literals of some predictable type, and literals of that type are not allowed in this library, then you're holding the macro wrong. They should easily be able to check whether the records feature is supported or not, if they want to err gracefully.

Maybe we can prevent it, if the code generation goes through an API that is versioned by the library language version, but more likely we can just make it easy to avoid creating something invalid, if one tries.
It can apply or not, but that's a property of the macro implementation. That doesn't mean that that particular macro can't be used in a program containing a version 3.4 package; that package will just not have a direct dependency on the macro package, and therefore not use it directly. I guess a "minimum applicable language level" for a macro package makes sense for this. A macro can declare itself incompatible with a direct dependee that won't have any libraries the macro can apply to, ensuring that if the package did have a dependency, it'll get a prior version of the macro. Not sure how solving will work for this.
It's true that we can't always guarantee that a new SDK is valid for an existing package solve. I'll bet that in most cases it will be, because while users can have SDK upper bounds, most of them really have no reason to. There is no benefit.

If we make a package which does have an SDK upper bound, I consider it our damn responsibility to ensure that there is a new backwards compatible version when the next minor-version, otherwise unsupported, SDK is released, so that a working version solve still exists. Otherwise upgrading the SDK will break existing code that uses only that package.
@lrhn I think it would be useful for us to be very precise about what we mean by statically here, because it's one of the confusing things about macros and about this discussion. "At runtime" for a macro is "at compile time" for a program. Which do you mean here by statically? More generally, I think the discussion here touches on multiple different points in time, any of which could be considered "statically known".
Different solutions afford different user experiences here; which do we aim for?
Who knew that having a rapidly evolving language, supporting multiple versions of a language in a single program, requiring packages to share versions of their dependencies, and adding static typing would lead to complex versioning problems? 😆🫠

In case another voice helps, here's mine. This thread wandered off into discussing issues around evolving the macro API itself, but I'll ignore that here. The core problem in this thread is:
In practice, I think this will be a fairly rare problem. But when it does happen, the user will be both profoundly confused about what's going on (a compile error in some library they know nothing about) and have no clear path to un-wedging themselves. Even if uncommon, I think the latter raises the severity of this concern. I think it makes sense to try to tackle it using pub during version solving:
Here's an idea for how we could model this in the solver:
This doesn't help with per-library language version comments. Would that work?
Yes, that is IMO the best fix. I think it should possibly be slightly generalized, and made less specific to macros, although I don't know what to call it. But essentially this is just a way of enforcing the minimum Dart language version a package can have in order to depend on your package. The original suggestion was, I think, to just always do this by default for any package with macros, using the macro's own min SDK constraint as the minimum Dart language version required to depend on the macro. This would also work but is slightly less flexible.
We probably will generalize it, and not make it about macros. Whether a package exposes a macro is ... probably decidable, but I'm not absolutely certain. There could be a helper package which imports and re-exposes a macro from another package.

So, just allow any package to declare a minimum required min-SDK for other packages to take a direct dependency on the first package. (If we can see that a package is a "macro providing package", then we could use its min-SDK as the default min-supported-language-requirement. As long as you have a way to override it. If I am writing a macro, I'd want to be able to support older language versions too.)

It's still a little iffy to make this about direct dependencies, because we have never formalized that you can only directly use packages that you have a declared dependency on. Programs still work if you import a library from a package that you only have an indirect dependency on. Including using macros that you only have an indirect dependency on.

So, again completely generally, a package can declare a minimum language-version requirement on packages that depend on it. Assume a package release,
(And, also as usual, both have SDK ranges which contain the current SDK, otherwise they'd have been filtered out earlier.) I don't know how this affects constraint solving. If it contains edges for all valid direct dependencies, then we can filter those out easily. If it just remembers ranges and assumes those are contiguous, then it might need modification. Also, it means that a package which exposes significant non-macro functionality, and maybe also a few optional macro helpers, may become more restricted by adding the dependency-language-support requirement, even for other packages which don't actually use the macros. Not sure that'll be a problem in practice.
I did implement a lint for this a while back and it has been in the core set for a while now. Of course that is just a lint, but being in the core set hopefully most people have it (although I don't know if we have any idea of the actual % of users?).
I really don't think having more kinds of versioning constraints is what we want here. Macros should be designed to "just work"(tm). Especially given that we are only allowing augmentations to existing code, we should put an effort into defining augmentations (and thereby macros) such that different language versions can interact. We end up compiling everything to the same internal representation anyway, so it should be possible to define what augmenting a 2.x class with a 3.y (or vice versa) macro means (we are anyway going to run 2.x and 3.y code in the same isolate). In the few cases where things cannot be well defined, I think it is ok for the compiler to give up with an error: "Sorry, you cannot extend a pre-3.0 class with a newfancyfeature-method." That is easy for me (a pub developer) to say, but I think that will make maintaining macros 100x smoother (no need to release an update for each new language version, no need to have different code paths for generating code for different versions, no need to understand the implications of a new kind of constraint...).
IMO the best-case scenario would be to treat generated code as being part of the macro package instead of the package using said macro. That would make generated code behave exactly like code from a random package.
This implies either multi-language-version libraries or changing augmentations such that they aren't a part of the same library. Both of those I think are actually very challenging issues to solve, much more so than the dependency solver solution. But, of course that is easy to say as a language spec writer :).
There are very few cases where I think this change would help here. It will already be the case that macros will not have to update for most language versions. As an example, many code generators today generate code into part files, and it is extremely rare for them to have to update due to a new language version. We have good existing evidence this will not be a problem.
While some macros may choose to do this, it should also be exceedingly rare. Very few code generators today do this, and they work out just fine. It might be slightly more problematic than before because the code generator (macro now) is no longer a dev dependency, but version solving should be sufficient to resolve the issue.
If we can allow an augmentation file to have a different language version than the file it's augmenting, then that would remove one source of incompatibility - that of generating code newer than the library it's added to. Both should be technically possible, as long as we can maintain a single consistent interpretation of the declarations at all times, one which is compatible with all the code, both that which is added before and that which is added later.

What would the pitfalls be? A macro introduces a feature that the macro application library doesn't understand, and cannot be compatible with. We try to make such things backwards compatible, but if (using today's features as an example) the macro implementation introduces a declaration that places new requirements on the existing code, that code may no longer compile. So it's not really the language version difference; it's that the macro generates code which introduces requirements, and the existing code doesn't satisfy those requirements. (The language version just makes it so that it cannot satisfy them without upgrading its language version, but that's really a secondary issue.)

The more I think about this, the more it feels "solvable".
Spawned from a Discord thread here.
Macros generate code which has some language version requirement, but the way current SDK constraints work, they are not sufficient to enforce that macros are only applied to packages that meet that requirement.
Consider the following scenario:

- A user has a package `a`, which uses a macro from package `b`. Package `a` has a min SDK constraint of `3.0.0`, but package `b` does not generate code that uses any features from `>3.0.0`, so everything is good.
- `b` then releases a new version, which requires a feature from `3.1.0`, but it is only used as an implementation detail. They update their min sdk constraint, and ship a feature release.
- A user on a `3.1.0` (or greater) SDK runs a `pub upgrade`, which selects the new version of package `b`, and they start getting errors (in package `a` 😭). This is because the macro re-runs on the same version of package `a`, but it now requires a language version of `3.1.0`, and it is running on libraries at version `3.0.0`.

All the SDK constraints are met here, but the existence of a package with an older language version also using that macro causes things to break down.
I think we will need to get a bit creative here, with a solution that is integrated into the version solver itself. In the scenario above, everything would have been fine if we just didn't select the newest version of the macro.
Today, any time a macro uses a new language feature, it would need to release that as a breaking change.
cc @jonasfj @sigurdm @natebosch @davidmorgan