Ericson2314 closed this pull request 1 year ago.
I greatly appreciate the amount of work put into the proposal, and I'd love to fall in love with it. This is an awesome idea, but IMHO (and I hate to say it!) wrong time: too late or too early.
In the current state we don't have enough resources.
Firstly, I'm not confident that the disruption on the way to the eventual goal is sustainable for the ecosystem. Indeed, we Haskellers tend to brush off such temporary concerns for the greater good of eternal shining. However, even if we accept such a school of thought, I claim that even skipping ahead several years of churn we would not be able to maintain the split `base` in the long term.
Currently we support only 1 major version of `base` at a time, which needs to be compatible with exactly 1 version of compiler, and most of daily maintenance is done by GHC developers themselves. Under the proposal, to achieve the goal "upgrading GHC does not require bumping `base`", someone'd have to support N major series of `base`, each of them compatible with N versions of compiler, and do it all on their own (because GHC developers will be tasked with their internal `ghc-base`), maintaining N^2 instances of CI and test infrastructure.

The thing is that we don't have resources for any N > 1. So in practice, once a couple of burnt-out maintainers leave the frustrated community, we'll get back to N = 1 and status quo.
With regards to Step 1A, "Task the CLC with defining new standard libraries", currently I don't see this happening. Ten person-years and one million dollars later? Maybe. We can barely cope with a never-ending flood of community proposals business-as-usual. There is no bandwidth for anything else unless we are full-time occupied.
I greatly appreciate the amount of work put into the proposal, and I'd love to fall in love with it.
:heart: this is still the most positive feedback I have heard from you on this sort of thing, so yes, I will take it! Thank you.
Firstly, I'm not confident that the disruption on the way to the eventual goal is sustainable for the ecosystem.
Well there are no actual breaking changes proposed here; `base` stays `base` in perpetuity. The new stuff is also supposed to be more stable. There is disruption on a meta-level in that this is a long marathon of work that will take up people's braincells, but I don't think more code will be broken at all.
Under the proposal, to achieve the goal "upgrading GHC does not require bumping base", someone'd have to support N major series of base, each of them compatible with N versions of compiler, and do it all on their own (because GHC developers will be tasked with their internal ghc-base), maintaining N^2 instances of CI and test infrastructure.
The thing is that we don't have resources for any N > 1. So in practice, once a couple of burnt-out maintainers leave the frustrated community, we'll get back to N = 1 and status quo.
OK, a few things to take apart.

`n` is the length of the sliding support window, and so `n` is bounded at, say, 2 or 3; n^2 = 4 or 9 isn't such a large number. `cabal-install` has about that many CI jobs, for example.

maintaining N^2 instances of CI and test infrastructure
Computers are cheap! I would gladly CI an order of magnitude more, say 100 combinations! Compared to the volunteer person-hours put into Haskell this is nothing. Maybe the dev-ops of managing that much testing is large, but if so that is simply a sign that our CI needs to be more reliable.
someone'd have to support N major series of base, each of them compatible with N versions of compiler, and do it all on their own (because GHC developers will be tasked with their internal ghc-base)
Considering the amount of time we waste individually (human individuals and firms) trying to upgrade our projects, I think this is less work overall. And pooling our resources so we don't inefficiently suffer in isolation is exactly why the Haskell Foundation exists in the first place.
And the more we rationalize the libraries the easier it should get. We have lots of friction / maintainership overhead precisely because the division of labor is so bad between these libraries today. This is why `base` takes a whole committee, meanwhile @RyanGlScott can maintain a fuckton of libraries almost all by himself.
I am banking on the productivity gains here swamping the fixed initial investment; I don't think that is pie-in-the-sky thinking.
With regards to Step 1A, "Task the CLC with defining new standard libraries", currently I don't see this happening. Ten person-years and one million dollars later? Maybe. We can barely cope with a never-ending flood of community proposals business-as-usual. There is no bandwidth for anything else unless we are full-time occupied.
So I too think front-loading the CLC bike-shedding is silly and will waste CLC time! But the feedback I got in https://github.com/haskell/core-libraries-committee/issues/105 was repeatedly "let's not do anything till we have a design of where we want to end up". I would much rather do the `ghc-base` split to de-risk and start locking in those productivity gains before tasking the CLC with The Big Bikeshed, but I didn't want to write the proposal in a way people had specifically told me not to.
If there is consensus that it does make sense to explore the behind-the-scenes stuff first after all, I will gladly change this.
Similarly, I would much rather just straight up commit to the idea that there should be an IO-free 100% portable standard library that the IO-full ones build upon. If we can commit to that, I would also delay wrestling with what the Browser and WASI IO interfaces should look like until later --- in fact we can simply experiment with designs on top of the IO-free design and then have the CLC ratify something that is implemented vs sketch something tentative out from a blank slate.
If there is consensus that that too sounds good --- also in the past I got feedback that having too many libraries would piss off users --- I will also gladly rearrange the steps to reflect this.
Finally (and I should put this in the proposal), I know this proposal is a big lift --- not because it is technically challenging, but because it is administratively challenging --- but I think that can be a good thing. This can be the marquee project that the Haskell Foundation takes on, something super user-visible and impactful, that can drive a lot of interest and fundraising.
As the saying goes, "you have to spend money to make money": I think if the Haskell Foundation can rise to the occasion and pull this off, we'll grow our resources and administrative capacity to meet the needs, and with such momentum be in a better place to tackle whatever comes next.
Regarding the concern of CI complexity and testing: we have a full time engineer working on that already, employed by HF.
Afaiu their scope currently lies in fixing GHC tests and CI stability. However, this work seems very much in alignment with the required work of this proposal.
@chreekat
Very nice and thoughtful write up, thanks a lot! I hope it'll happen, one way or another.
Thank you for this thoughtful writeup John. I appreciate it. It is not easy to navigate the best path given the differing needs of our users, and limited resources. But debating and (I hope) agreeing on a North Star destination would be really helpful, even if it takes us a while to get there.
So I'd argue for not getting enmeshed too quickly in "we can't afford it". (Having said which, "we can never afford it" is a reasonable argument. e.g. It's a waste of time to debate which particular exoplanet we want to colonise when we have no feasible way to get to any of them.)
New Goal: Split Base
I wonder if you could elaborate the proposal to explain why splitting base will help? After all, if we become 100% clear about what is
@hasufell , thanks for the mention. :) Yes, I'm here, and in fact you could say my current mission is to increase the GHC team's bandwidth. I trust that will have far-reaching positive effects on topics such as this one.
This is definitely an interesting topic and a cool development. I look forward to watching it progress.
I echo Simon's request for more information about how splitting base will help. As a matter of fact, I do have a couple guesses, but they are only guesses! It would be good to see the reasons fleshed out in the proposal.
Good point. Splitting base is done to address Problem 4 without the maintenance burden explosion @Bodigrim warns of. However, this is indeed not yet described well. I will update the "New Goal: Split Base" section to make the connection between these 3 things (Problem 4, maintenance cost control, and split base) clear.
Thanks for this. I left some minor spelling/grammar suggestions.
On the subject of `template-haskell`: I am quite sympathetic to the concern over the churn that TH imposes on its users. The quotes-as-patterns proposal is clever, although it also strikes me as non-trivial to implement. I can't help but wonder whether we can't cheaply improve the TH story with the tools that we already have at our disposal. I have a few ideas along these lines:
There already exist widely-used abstractions built on top of `template-haskell` which provide a simpler, more stable interface to a subset of its functionality (e.g. `th-abstraction`, `template-haskell-compat-v0208`). Making these more well-known would both improve users' lives (since `th-abstraction` in particular is much nicer to use) and reduce sensitivity to churn in `template-haskell`.
Moreover, many TH uses are essentially only working with the Haskell 98 subset of the language. Perhaps we could simply expose a stable `Language.Haskell.TH.Haskell98` module, consisting of a set of pattern synonyms exposing precisely the features of the Haskell 98 language. The same could be done for, e.g., GHC2021. This would greatly reduce the sensitivity of splices constructing Haskell to language changes.
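To make the idea concrete, here is a minimal sketch of what such a facade could look like. The module and pattern names are hypothetical; only `Exp`, `VarE`, `AppE`, and `LamE` come from `template-haskell` as it exists today.

```haskell
{-# LANGUAGE PatternSynonyms #-}
-- A hypothetical stable facade over template-haskell's Exp type (sketch only).
module Language.Haskell.TH.Haskell98
  ( pattern Var, pattern App, pattern Lam ) where

import Language.Haskell.TH.Syntax (Exp (..), Name, Pat)

-- These synonyms expose only Haskell 98 shapes. Code that matches on them is
-- insulated from changes to Exp itself; only this facade would need updating.
pattern Var :: Name -> Exp
pattern Var n = VarE n

pattern App :: Exp -> Exp -> Exp
pattern App f x = AppE f x

pattern Lam :: [Pat] -> Exp -> Exp
pattern Lam ps body = LamE ps body
```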
Finally, `template-haskell`'s current implementation is not well-designed to shield users from language changes. The simple act of adding record field names to its data constructors would solve the problem of pattern matching nearly completely. In conjunction with a set of simple (e.g. Haskell 98-inspired) smart constructors, I suspect we could significantly improve the status quo with no changes to the compiler at all.
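As a self-contained illustration of why field names help (this is not the real `template-haskell` AST, just a stand-in type): a record pattern names only the fields it uses, so it keeps compiling when the constructor later grows new fields, whereas a positional pattern breaks.

```haskell
-- Stand-in for a TH-style constructor; imagine a later release adding a field.
data FunDecl = FunDecl
  { funName  :: String
  , funArity :: Int
  -- , funFixity :: Maybe Int   -- hypothetical future field
  }

-- Record pattern: mentions only what it needs, so adding fields is harmless.
nameOf :: FunDecl -> String
nameOf FunDecl{funName = n} = n

-- Positional pattern: would break as soon as the constructor's arity changes.
-- nameOf (FunDecl n _) = n
```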
@bgamari Yes, I think it is good to "walk before we run", and I support starting with those lower-tech solutions first. The template haskell stuff being a "bonus step" I meant more as future work just to provide context to the rest / establish a larger pattern. I would happily include the things you mentioned in that bonus step too.
(Indeed, the use of field names to avoid breakage is also a pattern I'd like to promote; c.f. https://github.com/ghc-proposals/ghc-proposals/discussions/513.)
I think that the concerns that this proposal addresses are important, and that the outlined solutions plausibly address them. However, I worry that this proposal is so big that we aren't really able to have a realistic discussion about what it would take to implement it.
As far as I can see, there are two essentially orthogonal sets of concerns being addressed:
The first concern comes from a real problem that existing users have today. A new GHC release might enable support for a crucial piece of hardware or a fancy new cool-running long-battery-life laptop, and yet changes to `base` can make it inaccessible until the long chain of transitive dependencies have had their compatibility issues worked out. Or, a new version of base may implement a needed function, but the newer compiler has a regression on the code in question or needs to undergo some auditing or validation prior to adoption.
The second concern seems to be less about existing users (the Windows impedance mismatch is real, but my understanding is that it's more a mild annoyance than anything), and more about opportunities for strategic expansion in the coming years.
What about explicitly picking one of these to solve first? I think that disentangling them would make it easier to evaluate the fit between the proposed solution and the particular subsets of your problem statement that they address.
Also, I don't think that asking the CLC to design a library is a workable solution. They signed up to evaluate proposals, rather than to herd cats. I think that a process that takes cat herding more into account and proposes a way to get from what we have today to the desired goal more incrementally might be more likely to get done.
Thanks for this important work!
What about explicitly picking one of these to solve first? I think that disentangling them would make it easier to evaluate the fit between the proposed solution and the particular subsets of your problem statement that they address.
Definitely!
I think the best way to do this is to focus on a standard library that does no external IO. (It can have an IO monad for IORefs, MVars, and other "internal" IO.) That is both exactly what portable libraries that want to run everywhere would want to rely on and a good foundation to brainstorm (without CLC or HF hand-holding) what such a WASI or Browser library should look like. (Indeed @hamishmack's work binding Browser APIs has existed for years, is very good quality, and could depend on such a library instead of `base`.)
Also, I don't think that asking the CLC to design a library is a workable solution. They signed up to evaluate proposals, rather than to herd cats. I think that a process that takes cat herding more into account and proposes a way to get from what we have today to the desired goal more incrementally might be more likely to get done.
Yes. I agree that it is better for something to get thrown together and wrapped up into a proposal so they can vote on it, rather than them having to babysit the whole process.
I sent a PR with some grammar fixes. I support the proposal overall. One improvement would be to clarify that step 1A, Task the CLC with defining new standard libraries, is literally just tasking, or putting the CLC in long-term charge of approving new libraries to encapsulate and later replace parts of `base`. There would probably be no visible results for a while.
This was not completely clear to me until I reached section 6, Timeline. I think the overall proposal could be clearer if it made this timeline more prominent.
Looking at the current state of discussion, I'd summarize the status as:

- Working with `base` may not be enough to accomplish goals such as staggered compiler/library version upgrades, due to other strongly coupled libraries like `template-haskell`.

Is that a fair summary of the current status?
- Working with `base` may not be enough to accomplish goals such as staggered compiler/library version upgrades, due to other strongly coupled libraries like `template-haskell`.
Fundamentally I don't think that bumping upper bounds once a year is a big deal. Even if a maintainer is absent, one can file an issue at https://github.com/haskell-infra/hackage-trustees and pass `--allow-newer` to Cabal / Stack in the meantime. It's actual breakage which makes things troublesome.
There is some low-hanging fruit in this area. One is to educate maintainers and migrate packages away from `ghc-prim`: clients are supposed to consume it via `GHC.Exts` from `base` only. Voila! One less annoying upper bound to update.
Another interesting proposal is https://github.com/haskell/core-libraries-committee/issues/64 - yet I don't see much activity there, even though it's orders of magnitude easier to accomplish than actually splitting `base`.
Unless they're coauthors on a proposal, I'd prefer that a proposal instead say that a standard library revision will be sent to CLC for approval as part of execution.
Just to be clear about it upfront: CLC proposals with breaking changes require impact assessment and patches for all Stackage packages.
My high-level refactor suggestion to John would be the following (we chatted about this a bit verbally): Make a proposal solely for a non-user-facing refactor of base into two libraries, `ghc-base` which is not reinstallable, and `base` which is. Perhaps the latter can in fact be `pure-base` which has no IO, and `base` which depends on it.
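A rough sketch of what the reinstallable package's description could look like under such a refactor; the package split follows the comment above, while the version number and module list are placeholders.

```cabal
-- base.cabal (sketch only)
cabal-version: 3.0
name:          base
version:       4.19.0.0
synopsis:      Reinstallable standard library, layered on the GHC-coupled ghc-base

library
  build-depends:      ghc-base
  -- Stable modules are re-exported unchanged; GHC-specific ones stay behind
  -- in ghc-base.
  reexported-modules: Data.List, Data.Maybe, Control.Monad
```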
This is a single, conceptually easy but technically difficult task.
The remainder of the proposal can then be motivating this change, talking about potential difficulties in carrying it out, and also sketching various future changes this would enable, without committing to any such particular future changes in detail.
I think that if one presumes we need forward motion, then this proposal is a baseline necessity for any sort of forward motion, and hopefully innocuous enough on its own (no breaking changes, clarifies and refactors existing dependencies), that it can reach wide consensus and support.
In so far as this "big bang proposal" is a response to folks asking "where is stuff going" the future plans section can still have lots of that sketched out.
Thanks everyone for the feedback.
My plan has been to rewrite the technical content and timeline sections to first answer "where are we trying to go?" and only after answer "how are we trying to get there?". I think that will be a good way to address just about all the feedback so far.
Sorry for the delay, I have pushed a new version doing basically @gbaz's high level suggestion while leaving the rest as non-normative follow-up steps to provide more context.
I'll next try to go over @david-christiansen's more in-the-weeds feedback, seeing which parts still apply. I am sure many parts still do, but I wanted to make the broad brushstrokes before the little details.
I just want to echo the gratitude expressed elsewhere — these are really important issues and the analysis is impressive; it is great that we are thinking about these things.
The hard bit is, of course, working out a realistic evolution path, and the key to that is surely:
We in the HFTT are happy with the current state of this proposal and are proceeding to get feedback, buy-in and approval from the CLC and ghc developers.
FWIW, I just noticed some conversation notes in https://github.com/haskellfoundation/stability/blob/main/meetings/2023-01-23.md that I think was meant to be cross-posted here, but that didn't happen.
@Ericson2314 The proposal does not seem to speak about whether the split-out package has to follow the PVP. It only says so for `base`.
To me, that is a major concern (as in: I'd want both of them to strictly adhere to PVP).
My understanding of the intention is that we end up with two packages that both follow the PVP. `ghc-base` ends up being strongly coupled to GHC, just as `base` is today, which implies a major release of `ghc-base` for each major release of GHC. Once some of the sorting out has been done, it should be possible to make a minor release of `base` that supports a new major release of `ghc-base` in many cases, either because `base` doesn't expose the things that were changed in `ghc-base`, or because some work is put in to provide a consistent interface over time.
That way, people who really do need to depend on GHC guts end up with strongly-coupled dependencies that they have to keep updating, and the build plans tell them about it. Those who just use the "public" parts of `base` get more flexibility around upgrades. Having both packages adhere to the PVP seems essential to achieving this goal.
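For illustration (all version numbers invented), the resulting difference in bounds might look like this: a library that sticks to the public surface keeps wide `base` bounds spanning several GHC releases, while one reaching into internals tracks `ghc-base` tightly.

```cabal
-- A library using only the public, stable surface:
build-depends: base >=4.18 && <4.21

-- A library poking at GHC internals, honest about its tight coupling:
build-depends: ghc-base ==9.10.*
```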
Here's a summary of what I see as the current state of discussion in the various threads, classified by topic. Many of these issues have already been discussed, and I think even answered in a satisfactory manner, but it would be good to see these answers incorporated into the text, as the proposal text will serve as a basis for getting the work done.
This summary is based on the discussion on https://github.com/haskell/core-libraries-committee/issues/145, the `ghc-devs` mailing list thread, and a number of phone calls and chat threads that I've participated in.
The problem statement:

No clear boundary between private/unstable and public/stable interfaces in the standard library.

includes a lot of implicit knowledge that needs to be expanded to give a fuller view. In particular, this lack of a boundary, conjoined with the PVP, implies new major versions of `base` for every major version of GHC, which imposes an ongoing upgrade cost. Furthermore, it makes it more difficult to know whether an application or library (or its transitive dependencies) relies on GHC internals and would thus be more fragile in the face of upgrades. Spelling out these consequences would be useful.
Today, part of the reason for the close coupling between `base` and GHC that necessitates a one-to-one major version correspondence is that various bits of GHC internals are part of `base`. Exposing these internals is important, as they allow programs to be written that otherwise could not be, but the tight coupling introduces fragility around updates.
CLC issue 146 is a proposal to begin a process that indicates which parts of `base` are GHC internals, and which parts are version-agnostic standard library. That proposal suggests a documentation field to capture this.

The relationship here is that this proposal is an alternative means to accomplish the same ends. The process of untangling the GHC internals in `base` from the standard library of Haskell will need to happen to get the full value of this proposal; this proposal sets the stage for it to happen incrementally and without needing special exemptions from or changes to the PVP.
One question that has come up in discussion is whether a documentation field is sufficient to meet the goals of the proposal. The explicit problem statement is about clear boundaries between public and private declarations.
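For reference, Haddock's existing module header already has a slot for this kind of annotation; a documentation-field approach might amount to something like the following sketch (the module name is made up).

```haskell
{-|
Module      : GHC.Internal.Example
Description : Coupled to a specific GHC release; no cross-release stability guarantees.
Stability   : internal
-}
module GHC.Internal.Example () where
```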
It seems that this boundary could be made clear through Haddock. In discussion in a variety of fora, the following points have been raised:

- `Control.Applicative`'s "Stability" field read "experimental" as recently as GHC 9.2.

One further advantage of this proposal that should probably be more explicit is that a single major version of `base` could support multiple major versions of `ghc-base`, which could make it easier to adopt new compilers for those who don't depend on the internals. The proposal does not achieve this, but it's a preliminary step in this direction.
What about `base-compat`?

`base-compat` is a long-lived library that makes it possible for a library or application to support a wider range of GHC versions without needing to write their own CPP. Essentially, it centralizes the work of many of the compatibility shims into one place.

Today, `base-compat` is widely used, but it is not universally used. One alternative way to achieve the proposal's goals could be to either promote the use of `base-compat` in fresh projects, or to widely publicize it to users who do not use it.
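A minimal usage sketch: `Prelude.Compat` is one of `base-compat`'s drop-in modules, giving a uniform Prelude-like interface across the `base` versions that `base-compat` supports, so the shims live there instead of in per-project CPP.

```haskell
{-# LANGUAGE NoImplicitPrelude #-}

-- Import the compat Prelude instead of relying on whichever base the compiler
-- happens to ship.
import Prelude.Compat

main :: IO ()
main = putStrLn "same interface across supported base versions"
```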
Here are some pros and cons that the proposal should address:

- `base-compat` already exists, so using it more widely could be a cheap way to achieve the goals of the proposal.
- `base-compat` requires using its namespace, making compatibility shims a visible part of the code (at the very least in the import lines).
- `base-compat` does not backport everything.

Why is this in `base` at all?

One question that came up is why GHC-specific details can't be in the `ghc` package. The answer to this is that the `ghc` package contains the implementation of GHC, while `base` contains things that GHC provides Haskell programs but that reasonable other Haskell compilers (including past or future versions of GHC) may not, especially things like low-level access to system and RTS primitives.
What about moving these things out of `base` and exempting them from CLC supervision?

Reasons why not:

@gbaz posted an excellent summary of how the proposal is intended to help all three by more explicitly negotiating expectations and boundaries.

@cartazio asked a number of insightful questions about how GHC development might work. I'm not so sure that we need answers on them right now - it seems clear to me that GHC development can proceed as before when this proposal is implemented, and that the GHC team can make whatever decisions they see fit WRT combinations of a decoupled `base` and GHC itself.
The proposal should state, but does not, that both `base` and `ghc-base` will follow the PVP, and be able to do so more strictly than `base` has been able to. We should expect that every major GHC release results in a major `ghc-base` release (and even that minor ones might), but that most of these will result in only a minor `base` release. This allows programs that do depend on GHC internals to be honest about this relationship, while not requiring everyone to bump bounds as much.
implies new major versions of base for every major version of GHC,
This is not necessarily a bad thing, as the `base` version is the only well-working way to specify the Haskell language version as implemented by GHC.
Small or big changes in how GHC works happen in essentially every major GHC version (otherwise it wouldn't be major, would it?), e.g. whitespace sensitivity, requiring `FlexibleInstances` to be enabled to type-check some instances, or the upcoming change requiring `TypeOperators` for `~` to work.
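For instance (GHC version numbers approximate), a signature like the one below compiled for years with only an equality-constraint-enabling extension such as `TypeFamilies` or `GADTs`; recent GHCs warn about it, and a future release is slated to reject it, unless `TypeOperators` is also enabled.

```haskell
{-# LANGUAGE TypeFamilies #-}
-- Newer GHCs additionally want:
-- {-# LANGUAGE TypeOperators #-}

asInt :: (a ~ Int) => a -> Int
asInt = id
```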
However, the `base` version is the only tool currently available to specify bounds on GHC.
The

```cabal
if impl(ghc <x.y)
  buildable: False
```

approach does not work: it doesn't do what most people think it does, and the current semantics of that cannot be changed easily either. (Search for Cabal issues if you want to learn more.)
Therefore, as a Hackage Trustee who tries to keep the metadata on Hackage correct, so that packages which don't work with a given GHC declare that:

Please continue to couple new major GHC versions with new major `base` versions.
Alternatively, give some other way to specify GHC compatibility ranges, such that (ideally) all existing `.cabal` files on Hackage can use it (i.e. old `Cabal`s don't error on them), and so that maintainers and Hackage Trustees can make revisions modifying them.
This point is too often forgotten, but it's vital for the health of the ecosystem to be able to declare which GHCs packages work with.
Alternatively, give some other way to specify GHC compatibility ranges, such that (ideally) all existing .cabal files on Hackage can use it
What's the problem with existing .cabal files? They specify `base` versions that already exist, a technique which will continue to work retrospectively for `base` versions prior to this proposed change.
Thank you for this important feedback, I don't think that this point had come up yet in our discussions.
I'm not convinced that the benefits you mention for tight coupling outweigh the costs in terms of expensive upgrades, delayed access to new compilers, and the high need for coordinated updates throughout the ecosystem. But I also don't really know how to measure either of these.
What about this slightly different design: we could retain `base` as the name of the package that contains both the standard library and exposed GHC primitives and that is tightly coupled to GHC. Then, we could introduce a new package called something like `stdlib` that all "new project" templates point at. In this version `base` would play the role of `ghc-base` in the proposal, and `stdlib` the role of `base`, and contain all the same things.
To interpret your question further, you write
it's vital for the health of the ecosystem to be able to declare which GHCs packages work with
Do you see this as something that it's good to be able to do, or something that should be normative for all Haskell packages? Today, we essentially have the latter, and it imposes significant costs.
@tomjaguarpaw
What's the problem with existing .cabal files? They specify base versions that already exist, a technique which will continue to work retrospectively for base versions prior to this proposed change.
Say there is a package `foo` uploaded five years ago, which uses a `~` constraint. It works fine. It has `base <5`. No-one has touched it in these five years.

Then GHC-X.Y makes use of `~` require enabling `TypeOperators`, but still ships, say, `base-4.20.u.v`, the same as GHC-(X-1).Y and GHC-(X-2).Y.

How would you revise `foo`'s metadata so it works with all GHCs prior to GHC-X.Y, but not with the new GHC-X.Y?
Do you see this as something that it's good to be able to do, or something that should be normative for all Haskell packages? Today, we essentially have the latter, and it imposes significant costs.
Yes, it is a significant cost, and there is no way around it until GHC stops breaking the language, like the recent whitespace change, or the upcoming requirement to enable `TypeOperators` to use `~`.
To say it clearly:

There is no guarantee that any package working with GHC-9.6 will compile with GHC-9.8, even if nothing changes in the bundled libraries. The language may change in a non-backward-compatible way.

And such changes have happened even when using just the report language; when using extensions the risk is exponentially bigger.
(Search for Cabal issues if you want to learn more)
@phadej please link the issues that matter. The idea that Cabal couldn't somehow express bounds on the GHC version is, until further evidence is given, plainly ridiculous.
Maybe the current `buildable: False` and `impl(ghc bound)` don't work and can't be fixed for back-compat reasons, but then we can just introduce a new mechanism that does work.
There is no guarantee that any package working with GHC-9.6 will compile with GHC-9.8, even if nothing changes in bundled libraries. The language may change in non-backward compatible way.
I absolutely do agree with that. So let's have compiler bounds that work. Problem solved. And intent better expressed too.
@phadej please link the issues that matter. The idea that Cabal couldn't somehow express bounds on the GHC version is, until further evidence is given, plainly ridiculous.
You can express bounds indirectly using

```cabal
if !impl(ghc >=8.0 && <9.7)
  build-depends: unbuildable <0
```

but it's currently impossible to add such a block using a revision.

EDIT: it's also not a very beginner-friendly way, but a more convenient way could be introduced in newer `cabal-version`s of the spec, if someone puts the design and implementation work in.
It's currently possible to add a dependency on `base` even if a package didn't depend on it previously (but most do).
How would you revise the foo's metadata so it works with all GHC's prior to GHC-X.Y, but not with new GHC-X.Y
There is still no need for a new constraint on ghc specifically. You simply place a base constraint that makes use of the minor version as well as a major version.
I.e. this proposal would not eliminate the fact that distinct versions of base correspond to distinct versions of ghc. It would simply mean that distinct major versions of ghc might ship versions of base with only minor version bounds differences. We currently already have the situation where distinct minor versions of ghc ship base with minor version bounds differences (e.g. there are versions 4.16.0.0, 4.16.1.0... 4.16.4.0), and existing mechanisms are fine for using those to distinguish ghcs.
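Concretely (hypothetical version numbers): if GHC-(X-1).Y ships `base-4.20.0.0` while GHC-X.Y ships `base-4.20.1.0`, a revision can exclude only the newer compiler by bounding on the minor component.

```cabal
-- Accepts the base shipped by GHC-(X-1).Y and earlier, but not the
-- base-4.20.1.0 shipped by GHC-X.Y:
build-depends: base >=4.14 && <4.20.1
```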
We currently already have the situation where distinct minor versions of ghc ship base with minor version bounds
We don't. GHC-9.2.5, GHC-9.2.6 and GHC-9.2.7 all ship with the same minor version, `base-4.16.4.0`.
But if it's required that new (major) GHC will ship with a new (minor) version of `base` even if nothing changes there, that would work too.
That said: it will still be a sad situation that we use the `base` version as a proxy for the GHC version. Hopefully we can be more direct eventually. (And be clear where we don't care about the `base` library version, but only about GHC's, or vice versa.)
But if it's required that new (major) GHC will ship with a new (minor) version of base even if nothing changes there, that would work too.
I think that's the simplest thing to do at the moment. Introducing a way to control for version of ghc directly in the future seems like it would be important, but I think it is a bridge too far for this immediate proposal, because the design space is rather large, and there are some interesting considerations involved (in particular, if one doesn't use extensions, one is ideally writing something that is conceptually "compiler-independent" despite our basically one-compiler world, so we don't want to force all packages to genuinely specify a ghc, etc).
so we don't want to force all packages to genuinely specify a ghc, etc).
Indeed. But I think the ship has sailed. We are in a one-compiler world, and we might as well take advantage of that.
I.e. this proposal would not eliminate the fact that distinct versions of base correspond to distinct versions of ghc.
Yes. There is important follow-up work that does, but that is not this proposal. If/when we get to a proposal for that, it can also call out finding an easy, ergonomic way to have bounds in cabal files that are revisable.
Say there is a package foo uploaded five years ago, which uses a ~ constraint. It works fine. It has base <5. No-one has touched it in these five years.

Then GHC-X.Y makes use of ~ require enabling TypeOperators, but still ships, say, base-4.20.u.v, the same as GHC-(X-1).Y and GHC-(X-2).Y.

How would you revise foo's metadata so it works with all GHCs prior to GHC-X.Y, but not with the new GHC-X.Y?
It sounds like you are saying that the difficulty arises (only) when trying to perform revisions on existing .cabal files, is that right?
it's currently impossible to add such block using a revision.
What causes it to be impossible? What solutions could there be? I don't know how Hackage revisions work. Could Hackage simply be extended to allow adding such a block?
@tomjaguarpaw Yes as far as I can tell that is true. No fundamental issues, just some accidental complexity to wade through while being mindful of Chesterton fences.
There's no technical obstacle, merely a design decision: the supported types of revisions are currently restricted to much more limited edits than what this would need.
one is ideally writing something that is conceptually "compiler-independent"
Library code and its cabal file should not mention ghc, a specific ghc version or range of versions anywhere. Instead, each ghc version should specify the range of base versions it can build.
The task of a compiler is to build code written in a specific language, but it is not the task of a library to pay attention to which compiler version it can be built with.
Everything else is bad design.
@jdkr
Currently each major GHC implements a different language. Some wish it would be the same language, or at least that newer languages would be strict supersets of previous ones, but that is not the case now and doesn't look likely to change any time soon.

And even with a strict-superset approach one needs to be able to specify the minimum required language somehow.
Everything else is bad design.
I'd prefer it if discussion in this repository did not include broad, general, negative statements that are unsupported by concrete arguments. I don't think it tends to lead to progress in our collective understanding of the issues at hand.
I'd prefer it if discussion in this repository did not include broad, general, negative statements that are unsupported by concrete arguments.
Sorry, didn't mean to offend anyone personally. I'll try to be more factual.
Sorry, didn't mean to offend anyone personally. I'll try to be more factual.
No worries - I don't think that the message caused anyone personal offense. The concern here is to avoid unproductive flamewars, and that's easier to do while temperatures remain low. Statements like "X is bad" are much more likely to escalate than statements like "Here are some drawbacks of X".
For what it's worth, I have recently stumbled across a use-case where `ghc-base` would be extremely useful. Specifically, the exception backtrace proposal proposes that exceptions carry stack backtrace information represented by GHC's new `StackSnapshot#` type, among other representations. It further proposes that backtraces can be decoded and formatted for display to the user. The logic for decoding such stack snapshots currently lives in the `ghc-heap` package, which cannot be depended upon by `base` (since this would induce a dependency cycle).
One way to address this would be to move the implementation of `ghc-heap` into `ghc-base` and expose it as a set of internal modules. These could then be used by the exception machinery in `ghc-base`/`base` and re-exported from `ghc-heap` (which would continue to be their canonical home).
OK, pushed some more stuff based on https://github.com/haskellfoundation/tech-proposals/pull/47#issuecomment-1479101114, i.e. @david-christiansen and me discussing some things about a month ago.
Issues with the standard library are holding back the Haskell ecosystem. The problems and solutions are multifaceted, and so the Haskell Foundation in its "umbrella organization" capacity is uniquely suited to coordinate the fixing of them.
Proposed here is a first step in this process: splitting `base`. Future work addressing a larger set of goals is also included, to situate the `base` split in context.
Rendered