Macros API stability across language versions #3479

Closed

davidmorgan opened this issue Nov 23, 2023 · 6 comments
Labels
static-metaprogramming Issues related to static metaprogramming

Comments

@davidmorgan
Contributor

Forked from the discussion in #3466; the topic is how to avoid breaking changes to the macros API when there are language changes.

If the macros API can always avoid major version increases then macros will continue working "for free" on new SDK releases.

#3466 (comment)

The simplest approach is to have a package:macros/3.5.dart library for each SDK release that supports macros, and keep the old ones until we stop supporting that language version. Many will likely be identical, exports of the same underlying library. Some will have additions. Some will have breaking changes, which is why it's great that you have to opt in to it.
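A rough sketch of how that could look on disk, assuming a shared implementation under src/ (the file names and `show` lists here are illustrative, not a committed design):

```dart
// lib/3.5.dart: the macros API surface frozen at the Dart 3.5 release.
// lib/3.6.dart would usually be an identical re-export of the same
// underlying library, growing its `show` list only when the new
// language version adds introspectable constructs.
export 'src/api.dart' show Macro, ClassDeclaration, MethodDeclaration;
```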

@jakemac53
Contributor

I am currently of the opinion that simply exporting a single version of the APIs through a versioned package (with only one library in it) is all we need here. That package will only have a breaking change on SDK releases that have breaking changes in the macro APIs, which I expect to be rare.

@davidmorgan
Contributor Author

Thanks! I guess the suggestion is to review the APIs once they are substantially complete, to check whether anything could be made more robust against changes, e.g. by adding "unknown" to some enum.
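For example (a hypothetical enum, not the actual API), a kind enum could reserve a catch-all value so that future language additions surface as unknown instead of requiring new enum values, which would break macros that switch exhaustively:

```dart
// Hypothetical introspection enum in package:macros.
enum DeclarationKind {
  classDeclaration,
  methodDeclaration,
  fieldDeclaration,
  // Reported for declarations from language versions this API version
  // doesn't model yet; existing switches keep compiling unchanged.
  unknown,
}
```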

@lrhn
Member

lrhn commented Nov 28, 2023

@jakemac53 We need to consider the version constraints on those packages too, and how creating a new SDK or macro API package release affects the ecosystem.

Assume we have a package:macros, the macro API package, with version 1.0.0 released along with Dart SDK 3.5, the release where macros ship.
That package depends on SDK hooks to work, so it will do one of the following (sketched as pubspec constraints after this list):

  • Have a narrow SDK constraint, >=3.5.0 <3.6.0, which ensures that we can change those hooks in SDK 3.6, or
  • Have a wide SDK constraint, >=3.5.0 <4.0.0, which requires us to maintain the same hooks until the next major SDK release. We can always add more hooks, but the old ones must keep working. (Major SDK releases are rare and difficult enough already, so also updating and phasing out macro hooks is probably not a big extra effort.)
  • Have a slightly larger, but still narrow, constraint, >=3.5.0 <3.7.0, where we promise to keep the hooks valid for at least n=1 more release, giving people time to move to the next package release before we stop being able to run the old one on a new SDK.
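Expressed as pubspec environment constraints for that package:macros 1.0.0, the three options would look roughly like this:

```yaml
# Option 1: narrow, hooks are free to change in SDK 3.6.
environment:
  sdk: '>=3.5.0 <3.6.0'

# Option 2: wide, hooks must keep working until SDK 4.0.
environment:
  sdk: '>=3.5.0 <4.0.0'

# Option 3: one release of grace, hooks stay valid through SDK 3.6.
environment:
  sdk: '>=3.5.0 <3.7.0'
```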

What you write sounds like it's the first choice, but they can all work.

In any of these cases, a client of this package then has a dependency of macros: ^1.0.0, and all is well.

We keep this up for a while, releasing package:macros up to 1.4.0, along with and depending on SDK 3.9, without needing to make breaking changes, and client code keeps working. Some macros may occasionally see a declaration they don't know, but we should document that that can happen. As long as we don't need to add to an enum or sealed family of types, or use an interface-based visitor pattern, adding new types isn't breaking, just surprising: the new types are simply not directly supported by existing macros.
If someone depends on package:fancy_macro version 1.17, which depends on package:macros version ^1.3.0, it still works in SDK 3.9 by solving to version 1.4.0, which is API compatible. All is well.
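To make the sealed-family caveat concrete, here is a sketch with invented types (not the real macros API):

```dart
// If the API exposed a sealed hierarchy...
sealed class Declaration {}

class ClassDecl extends Declaration {}

class MethodDecl extends Declaration {}

// ...macro authors would write exhaustive switches over it. Adding a
// third subtype in a minor release would then break every macro with
// a switch like this, so such additions are breaking changes. Types
// without a sealed supertype don't have this problem.
String describe(Declaration d) => switch (d) {
      ClassDecl() => 'class',
      MethodDecl() => 'method',
    };
```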

Then, for some reason we can't predict, we need to make a breaking change to the macro API for SDK 3.10.
If we don't make package:macros 1.4.0 accept SDK 3.10 (and promise hook compatibility), and we just release SDK 3.10 and package:macros 2.0.0 with SDK constraint >=3.10.0 <3.11.0, then every existing macro package will be incompatible with SDK 3.10. Any program using macros will fail package resolution on SDK 3.10.

If we can release a package:macros 1.5.0 too, which is backwards API compatible but runs on SDK 3.10, then at least everybody can keep compiling, even if they can't use the new package API yet.
I think that is the minimum of backwards compatibility that we must provide. New minor-version SDKs must compile all code that was valid in the previous SDK version, and package:macros should be considered a part of the SDK in that regard, especially if it has a tight SDK constraint.

Then, best case, package:fancy_macro quickly (maybe even early, based on dev-releases of the SDK and pre-release package:macros version 2.0.0-dev.124 or something) releases a version 1.18.0 which depends on package:macros version 2.0.0.
So does every other macro package. Projects depending on a macro package, directly or indirectly, can start solving to the new versions, everything works again, and people can move on.

But if just one macro package author is on vacation, and doesn't update their package for ten days, nobody who depends on that macro package, directly or indirectly, can compile their programs on SDK 3.10.
There is just no solve for the package constraints.

That failure mode is not viable. We must not end in that situation for anything less than a major SDK release.

Even if we can keep using the previous major version of package:macros on the new SDK, it's a migration issue when every program depends on multiple macros, which have to agree on the package:macros version. No program can use new language features with macros until all macro packages support the new package:macros version, whether they make use of it or not.

This is where it might be reasonable to release the new API as package:macros_2 version 1.0, instead of package:macros version 2.0.
At first glance, every use of the macros API is internal to a single macro implementation, running independently of every other macro. However, there will likely be helper libraries built on the macros API and shared by multiple macro packages, and then we're back to shared dependency version restrictions.

Which kinds of breaking macro API changes are we expecting at all?
Do we have enums, sealed type hierarchies, or interface-based visitors? (If so, we probably shouldn't.)
Do we expect API changes to be correlated with SDK hook changes?

@jakemac53
Contributor

What you write sounds like it's the first choice, but they can all work.

Correct, I think the first choice is the right option. It is the lowest maintenance burden and simplest model, and I believe the dev/beta release periods are sufficient to shake out any needed dependency upgrades, so that when a new stable release drops, most users can in fact use it.

In the cases where a macro dependency hasn't yet upgraded, you can always try a dependency_override as well.
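For example, if a macro package is only missing a constraint bump, an application could unblock itself like this while waiting for a proper release (package names from the scenario above):

```yaml
# Forces resolution past fancy_macro's stale `macros: ^1.3.0` bound.
# Only safe if the new macros release is API compatible with the
# parts fancy_macro actually uses.
dependency_overrides:
  macros: ^2.0.0
```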

If we can release a package:macros 1.5.0 too, which is backwards API compatible but runs on SDK 3.10, then at least everybody can keep compiling, even if they can't use the new package API yet.
I think that is the minimum of backwards compatibility that we must provide. New minor-version SDKs must compile all code that was valid in the previous SDK version, and package:macros should be considered a part of the SDK in that regard, especially if it has a tight SDK constraint.

I do not agree that this is a requirement. If you look at the Flutter SDK, it already violates this principle through its package pinning. Yes, that makes it painful to do a breaking change in any of the pinned packages, but it is still ultimately possible. In fact, it is all but guaranteed that Flutter will pin package:macros anyway (assuming it ships some macros, which I think is quite likely), so package:macros will already be forced to exactly one version for each SDK, regardless of what we do, for the vast majority of our users.

Then, best case, package:fancy_macro quickly (maybe even early, based on dev-releases of the SDK and pre-release package:macros version 2.0.0-dev.124 or something) releases a version 1.18.0 which depends on package:macros version 2.0.0.

For any macro with a sufficient number of users, this should come up well before any stable release. We could even maintain a list of popular macro packages and proactively file issues on them if/when a breaking change is coming. This is actually a pretty common tactic: you keep a text file with a list of package names in the repo, and people can file a PR to add their package to the list.

But if just one macro package author is on vacation, and doesn't update their package for ten days, nobody who depends on that macro package, directly or indirectly, can compile their programs on SDK 3.10.
There is just no solve for the package constraints.

It is worth mentioning that they can use dependency overrides, and this might be enough (if all that is needed is a version constraint change in that package). But yes, in this situation a user may be blocked from upgrading their SDK to the latest version.

This is where it might be reasonable to release the new API as package:macros_2 version 1.0, instead of package:macros version 2.0.

The primary issue with this is that it would be a potentially very large maintenance burden for us. All supported versions of the APIs would have to exist side by side in the SDK, and at a minimum the CFE and analyzer would have to support all of those versions explicitly.

It is imo too much complexity to take on compared to the size of the problem it fixes, at least under the assumption that breaking API changes are quite rare anyway.

Which kinds of breaking macro API changes are we expecting at all?

I honestly can't think of any case in the past that would have required one. Even null safety probably would not have necessitated a breaking change. And even if we removed something from the language (let's say, we removed classes), the API for classes would remain because macros could still be running against older language versions.

We would essentially have to fundamentally change the structure of some declaration in the language in such a way that the API for introspecting on that declaration had to be changed in order to support it. As an example, if we gave functions multiple return values/types.

The API surface area is nearly identical to that of dart:mirrors.

Do we have enums, sealed type hierarchies, or interface-based visitors? (If so, we probably shouldn't.)

We have a Severity enum for Diagnostics. It is really not intended to be "read", though; it is only for creating diagnostics. We could possibly hide it, and instead have a special constructor for each severity level.
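A sketch of that idea; Diagnostic and Severity are the real names under discussion, but the shapes below are invented:

```dart
// The enum becomes private, so no user code can switch over it.
enum _Severity { info, warning, error }

class Diagnostic {
  final _Severity _severity;
  final String message;

  // One named constructor per level; adding a level later is then an
  // ordinary, non-breaking API addition.
  Diagnostic.info(this.message) : _severity = _Severity.info;
  Diagnostic.warning(this.message) : _severity = _Severity.warning;
  Diagnostic.error(this.message) : _severity = _Severity.error;
}
```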

The Code object is also sealed, as is the DiagnosticTarget class, and it might be worth changing those. It is useful for them to be sealed in our internal code that deals with them, but that may be harmful in a public API. The Code API is the more concerning one here.

There are no visitor interfaces; my thought is that if people want them, it should be fairly trivial to implement them as a separate package on top of the existing APIs.

Do we expect API changes to be correlated with SDK hook changes?

Yes, the package will almost certainly be just a re-export of a private SDK library, so they will be directly correlated, if I understand your question correctly. We would probably do explicit exports of certain symbols to ensure we don't accidentally leak any API we didn't mean to, which could give us time to work on an addition to the API without directly exposing it to users.
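A sketch of what that explicit re-export could look like (the library path and `show` list are illustrative only):

```dart
// package:macros/macros.dart
// An explicit `show` list means nothing leaks by accident; symbols
// still under development simply stay off the list until ready.
export 'src/private_sdk_macros.dart'
    show Macro, Declaration, Diagnostic, DiagnosticTarget;
```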

@munificent
Member

This is where it might be reasonable to release the new API as package:macros_2 version 1.0, instead of package:macros version 2.0.

The package manager for Go does this implicitly. Major versions of the same package are considered different package names, and an application can have multiple major versions of the same package in the same build. Go's structural typing makes that much more tenable, but I do wonder sometimes if a similar approach would be a good idea for Dart.

@davidmorgan
Contributor Author

Closing in favour of #3706
