Randgalt / record-builder

Record builder generator for Java records
Apache License 2.0

null values and addConcreteSettersForOptional #107

Closed lpandzic closed 2 years ago

lpandzic commented 2 years ago

I find it quite surprising that addConcreteSettersForOptional uses Optional#of instead of Optional#ofNullable, especially since IntelliJ IDEA by default doesn't warn against passing nulls to javax.validation.constraints.NotNull annotated parameters. What is the reasoning behind this decision?

Randgalt commented 2 years ago

Yeah - I think that might be a mistake. I see no reason why it shouldn't be ofNullable().
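For context, here is a minimal sketch of the kind of concrete setter under discussion. The record, builder, and method names are illustrative only, not the exact code record-builder generates:

```java
import java.util.Optional;

// Hypothetical record with an Optional component.
record Person(Optional<String> nickname) {}

// Hand-written stand-in for a generated builder.
class PersonBuilder {
    private Optional<String> nickname = Optional.empty();

    // Setter taking the Optional itself.
    public PersonBuilder nickname(Optional<String> nickname) {
        this.nickname = nickname;
        return this;
    }

    // Concrete setter of the addConcreteSettersForOptional flavor.
    // With Optional.of, a null argument throws NullPointerException here;
    // with Optional.ofNullable (the change proposed in this issue), a null
    // argument is silently mapped to Optional.empty().
    public PersonBuilder nicknameValue(String nickname) {
        this.nickname = Optional.ofNullable(nickname);
        return this;
    }

    public Person build() {
        return new Person(nickname);
    }
}
```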

lpandzic commented 2 years ago

@mads-b I see you are the author of https://github.com/Randgalt/record-builder/pull/95. Is this a bug or intentional?

mads-b commented 2 years ago

@lpandzic it is intentional, but feel free to change it.

My reasoning is that if you have nulls you shouldn't be calling the setter at all; you should know whether your value is nullable or not. Another way of wording it: If you have a code base full of optionals you should already be passing parameters as optionals, so you should be using the setter for the optional type.

That being said, there is no harm in just tolerating it in the library code like you suggest.

lpandzic commented 2 years ago

So my reasoning why this is problematic:

  1. if I have an Optional parameter, why wouldn't I use the record builder method with the Optional parameter?
  2. if I have a parameter that I'm not sure is null or not, I have to do a nullness check anyway, so I might as well wrap it in Optional and use the Optional parameter method

Now of course there's the third case, as you've mentioned, where you have a non-Optional parameter whose nullness you already know, and for that case the current API makes sense.

So with @NotNull parameters, case 2 is problematic.

If we flip the @NotNull to @Nullable we solve case 2, and I don't see any hindrance to the third case: if you already know whether it's null or not, I don't see any value in double checking.

So to finish off: I see it as slightly more sensible to have @Nullable with an Optional#ofNullable implementation, because the chance of that case being a pitfall or surprise is slimmer than in the opposite case. That, at least, is my opinion.

agentgt commented 2 years ago

since Intellij IDEA by default doesn't warn against passing nulls to javax.validation.constraints.NotNull annotated parameters.

Just a clarification for @lpandzic and @mads-b: you really, really should not use the javax.validation.constraints.NotNull annotation for null analysis, and that is probably why IntelliJ doesn't do anything.

javax.validation is designed for validation. The values are expected to be possibly null: before the object is validated, the values could be null!

Null analysis, on the other hand, is an invariant and is type based. That's why you should use JSpecify, Checker, or Eclipse's annotations, as they are TYPE_USE annotations. That is why #106 is important to fix.

@Randgalt it doesn't even make sense to have methods with javax.validation.constraints annotations (your gist you shared on reddit showing the full builder) unless you immediately call the validator. In fact the requireNonNull on justAString is wrong, as are the previous lines; it should throw a constraint violation exception. The other two parameters are @Nullable and should be marked as such.

To be honest, a record isn't the ideal place for validation; the builder is. A web request or whatever fills the builder, and then when you build the record the builder is validated, making sure the record has all the invariants like not-null.
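A minimal sketch of that workflow, with plain requireNonNull checks standing in for a real Validator call (all names here are illustrative):

```java
import java.util.Objects;

record User(String email, String displayName) {}

// The mutable builder may hold nulls while it is being filled (e.g. by
// data binding); the invariants are enforced once, in build().
class UserBuilder {
    private String email;       // may be null while the form is being bound
    private String displayName;

    UserBuilder email(String email) {
        this.email = email;
        return this;
    }

    UserBuilder displayName(String name) {
        this.displayName = name;
        return this;
    }

    // The not-null invariant lives at the builder-to-record boundary.
    User build() {
        Objects.requireNonNull(email, "email is required");
        Objects.requireNonNull(displayName, "displayName is required");
        return new User(email, displayName);
    }
}
```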

lpandzic commented 2 years ago

Just a clarification for @lpandzic and @mads-b you really really should not use javax.validation.constraints.NotNull annotation for null analysis and that is probably why intellij doesn't do anything.

I'm not using it. I'm using the fact that IDEA lacks inference for bean validation @NotNull as an example of why the current Optional handling doesn't make sense.

Whether bean validation's @NotNull should be kept separate from the many nullness-inference annotation libraries is a debatable question. Whether we should be using Optionals in non-JDK libraries is an even more debatable question. My opinion is that once they introduce inline types (or value types, as they were known back in 2013/2014...), they should make Optional a non-nullable inline type; then this nullness annotation problem goes away and we can stop wasting time on it.

To be honest a record isn't the ideal use case for using validation. It's actually the builder. A web request or whatever fills the builder and then when you build the record the builder is validated makes sure record has all the invariants likes not null.

I disagree. Invariants should be both documented on and validated against records, not builders. Records are the single source of truth for fields and the properties of those fields. Whether a builder validates anything is, and should be, a configurable option and not a requirement. Any larger codebase with a larger API surface cannot guarantee that builders are always used (e.g. by external developers). If an external developer creates a record directly, what happens then? Is validation done twice?
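For what it's worth, a record can enforce such an invariant itself via a compact constructor, so it holds no matter how the record is created, builder or not. A sketch with illustrative names:

```java
import java.util.Objects;

// The compact constructor runs for every instantiation path, including
// direct construction by an external developer who bypasses the builder.
record Account(String id, String owner) {
    Account {
        Objects.requireNonNull(id, "id");
        Objects.requireNonNull(owner, "owner");
    }
}
```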

Anyway, I believe this side story is a topic for another issue. I'd like to focus on changing the Optional handling in this issue. @Randgalt can you please review the PR?

agentgt commented 2 years ago

I'm not using it - I'm using the fact that IDEA lacks inference for bean validation @NotNull as an example of why current Optional handling doesn't make sense.

Maybe you can clarify that some more, since I'm working on the other issue and don't want to impact it. What inference are we talking about that isn't null analysis? Some completion? I haven't used IntelliJ in a year.

I disagree. Invariants should be both documented and validated against records and not builders. Records are single source of truth for fields and properties of those fields. Whether a builder validates anything is and should be a configurable option and not a requirement. Any larger codebase with larger API surface cannot guarantee that builders are always used (e.g. external developers). If an external developer creates a record directly what happens then? Is validation done twice?

Well, that's the thing: they are not really invariants like types. I don't really care what people's interpretation of the javax annotations is, so long as you're not using them for null analysis (you should not), as they are not TYPE_USE annotations. I have seen libraries do this and it is really annoying.

Java validation is mostly used for this workflow: you have an HTML form presented on a GET request. Your model is an object with validation annotations that is either completely blank or partially filled. The object that gets put in the model (e.g. via data binding) is often largely not valid. Then a POST of the form happens, and you run validation in your controller. If you don't run validation first, you could get a NullPointerException instead of a ConstraintViolationException. Regardless, you need the object in an invalid state at times. That isn't the same as the static-analysis invariance I'm talking about with null analysis, which you would want with records. What I meant earlier is that builders are a better fit for model objects to be validated, because they are inherently mutable and can very easily be in the wrong state (and some data binding requires mutable POJOs). It's fine if the records have the validation annotations, but they are not invariants.

If you do magical AspectJ or whatever bytecode-weaving magic where you validate on construction, I guess I could see it being an invariant, but I largely disagree with using the annotations as invariant-like types. Besides, that totally breaks the data-binding workflow that validation is normally used for.

Any larger codebase with larger API surface cannot guarantee that builders are always used (e.g. external developers). If an external developer creates a record directly what happens then? Is validation done twice?

You can certainly use records with validation annotations, but the expectation is that you will be creating invalid objects all the time... and usually not failing immediately like a NullPointerException. This is completely separate from null analysis.

My opinion on the matter is that once they introduce inline types (or value types as they were known back in 2013/2014...) they should make Optional a non nullable inline type and this nullness check annotations problem can go away and we can stop wasting time on this.

Assuming Valhalla does come in the next 5 years (we will be lucky if Loom comes in that time frame), we still have 20 or more years of code bases that can easily be annotated without breaking compilation, including the JDK (e.g. Map.get isn't going to be changed to return Optional). Loom largely will not break existing libraries, but trying to retrofit everyone to use Optional will. Besides, null will still exist.

The only reason I make a big deal out of the validation annotations is to get folks to at least use proper annotations, and this does not include javax.validation or the findbugs annotations. If people did, we might have JSpecify or something like it widespread, but we can't if people use validation annotations for null analysis or just assume the JDK will fix it.

Anyway, it's probably all moot, because I assumed you were doing null analysis; if IntelliJ does something else then you can clearly ignore me.

agentgt commented 2 years ago

IMO the fundamental problem is that record-builder is interpreting what javax.validation.NotNull means (by accident, because of the regex) when clearly it is open for interpretation by design. In my code base javax.validation.NotNull is usually @Nullable, because the object is usually filled in an invalid state first (you may disagree, but I largely say this is by design, as you can clearly see it used this way in Spring MVC).

A TYPE_USE @Nullable is far more important because it is not ambiguous, unlike javax.validation. That is, it is very possible to write @Nullable @NotNull String someAttribute. Not to mention javax.validation has check groups, for example @Nullable @NotNull(groups = UpdateChecks.class) String someAttribute. Validation has different contexts, unlike null-analysis annotations. How should this library interpret different validation groups?

        /**
         * Add not-null checks for record components annotated with any annotation named either "NotNull",
         * "NoNull", or "NonNull" (see {@link #interpretNotNullsPattern()} for the actual regex matching pattern).
         */
        boolean interpretNotNulls() default false;

        /**
         * If {@link #interpretNotNulls()} is true, this is the regex pattern used to determine if an annotation name
         * means "not null"
         */
        String interpretNotNullsPattern() default "(?i)((notnull)|(nonnull)|(nonull))";
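As a sketch of what that default pattern matches when applied to annotation simple names (the helper class below is illustrative, not library code):

```java
import java.util.regex.Pattern;

// The default regex from interpretNotNullsPattern(), quoted above.
class NotNullPatternDemo {
    static final Pattern NOT_NULL =
            Pattern.compile("(?i)((notnull)|(nonnull)|(nonull))");

    // Returns true when an annotation's simple name would be treated
    // as "not null" by the library's default matching.
    static boolean looksLikeNotNull(String annotationSimpleName) {
        return NOT_NULL.matcher(annotationSimpleName).matches();
    }
}
```

This is how javax.validation.constraints.NotNull gets picked up: only the simple name "NotNull" is matched, regardless of which package or semantics the annotation actually has.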

Obviously I can configure it so it ignores them, but the library should be designed around how it interprets a TYPE_USE @Nullable, not @NotNull or @NonNull, particularly because in a null-analysis code base you almost never use @NonNull (JSpecify doesn't even have an annotation for it... btw, javax.validation thankfully does not have a @Nullable). Libraries that have a TYPE_USE Nullable are not open for interpretation: the type is inherently nullable. If #106 is implemented, all this comes for free if you are using TYPE_USE annotations.

And the reason that is important right now is your idea of creating the concrete setter, which I'm not against, but it should be annotated @Nullable if you are using those annotations. However, if you are using Optional, particularly as fields (which I find disgusting, but to each their own), you are probably not using @Nullable, in which case all the more reason for @mads-b's point of just keeping it the way it is. Your objects are either always non-null or wrapped in an Optional.

Frankly, I think the whole business of trying to do validation and interpret non-TYPE_USE NotNull adds bloat and confusion to the library. Propagating javax.validation annotations onto setters doesn't make much sense other than as documentation, and actually requires the library to guess what you want (e.g. should we do Objects.requireNonNull or use a validator?). And as you saw, it serves little purpose: IntelliJ is doing the right thing by not complaining when you pass null. However, if you used IntelliJ's @NotNull, which is TYPE_USE, instead of javax.validation, then it would complain to you... but guess what, that will not work until #106 is fixed, even though it is literally part of the type (when you drop the TYPE_USE annotations it's essentially type erasure).

As for validation being bloat for a record: you could trivially add a mixin interface in your own codebase to do validation, perhaps with a method called validate.

public interface ValidateMixin {
    default void validate() {
       SomeStatic.getValidator().validate(this);
    }
}

I think even adding the concrete setter for optional is bloat. It's not hard to write someSetter(ofNullable(v)).

Randgalt commented 2 years ago

@Randgalt it doesn't even make sense to have methods with javax.validation.constraints (your gist you shared on reddit showing the full builder) unless you immediately call the validator. In fact the requireNonNull on justAString is wrong

Wrong is a loaded term here. As I recall we were trying to emulate what the Immutables library does and, again as I recall, it treats any non-null-like annotation as implying a null check. The goal of the library is utility, so while it may not conform to a particular spec it does serve those who wish to migrate from the Immutables library. If there's a strong desire to conform to a different spec, that can be done via another option, but I'd like to see a community desire for it first.

agentgt commented 2 years ago

Can you provide a link to that behavior? Because the behavior I'm proposing, focusing on @Nullable, is what Immutables seems to be doing, not @NotNull/@NonNull.

Now let us go look at your gist again.

    @Generated("io.soabase.recordbuilder.core.RecordBuilder")
    public static FullRecord FullRecord(@NotNull List<Number> numbers,
            @NotNull Map<Number, FullRecord> fullRecords, @NotNull String justAString) {
        numbers = __list(numbers);
        fullRecords = __map(fullRecords);
        Objects.requireNonNull(justAString, "justAString is required");
        return new FullRecord(numbers, fullRecords, justAString);
    }

__list and __map coerce nulls to empty collections, and not just any empty collections but special immutable ones that will fail if you do something like contains(null) (a regular ArrayList does not do this). Why not fail fast here? The irony is you just made the contract actually:

    public static FullRecord FullRecord(@Nullable List<Number> numbers,
            @Nullable Map<Number, FullRecord> fullRecords, @NotNull String justAString) {

Then you arbitrarily decide that Strings, on the other hand, cannot be coerced; that null cannot be coerced to an empty string.

I'll go check but I doubt Immutables does the above.
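The null-hostility described above can be reproduced with the JDK's own immutable collections. Whether __list returns exactly these is an assumption, but List.of / List.copyOf lists behave this way:

```java
import java.util.ArrayList;
import java.util.List;

class NullHostileDemo {
    // A regular ArrayList tolerates null queries.
    static boolean regularContainsNull() {
        List<String> regular = new ArrayList<>();
        regular.add("a");
        return regular.contains(null);   // false, no exception
    }

    // The JDK's immutable lists reject even a null *query*.
    static boolean immutableThrowsOnContainsNull() {
        try {
            List.of("a").contains(null); // throws NullPointerException
            return false;
        } catch (NullPointerException e) {
            return true;
        }
    }
}
```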

It just seems like the NotNull behavior needs some rethinking and was added too quickly to this library, especially if you are trying to mimic Immutables. Also, given that the last three currently open issues have to do with non-null, it clearly has some issues even in the community.

agentgt commented 2 years ago

Speaking of community: to rope this all back to the original issue (#107) and why I have gone on this crusade (I'm sorry, I am an asshole, but for good reason): I was putting in work to add Nullable and fix #106, but it is a significant amount of work (not so much the propagating, but dealing with the fact that builders are inherently nullable). BTW, I don't need this project, as we have our own processor, but I'm knowledgeable about annotation processing so I figured I could help.

So when @lpandzic proposes to change a generated method from implied @NonNull to @Nullable, I need to know to mark it as Nullable, just like Immutables would. Tangential but related was their assumption about javax.validation.NotNull, which I dwelled on too long. BTW, @lpandzic's change is a breaking change that requires a major version, but seeing how this project is using Guava-style versioning I guess no worries. But seriously, this change could break people's code, and it's the worst kind of break because it would appear to be their code.

And that's the thing: this change is largely caused by a previous PR, the one that added addConcreteSettersForOptional, as well as the NotNull work, none of which, from what I can tell, Immutables does. That's what I mean by bloat and how it "was wrong" (whatever shade you want to give that). They all seem like minor and "reasonable" requests, but I'm sure you know you can't just keep accepting "reasonable" PRs without a holistic view or some sort of spec, otherwise it becomes a house of cards.

lpandzic commented 2 years ago

@agentgt like I said in one of my previous replies, I'm not proposing to change anything about nullness inference, only that the implementation of the concrete setter for an Optional record field use Optional#ofNullable instead of Optional#of. If you look at the PR that fixes this issue you can clearly see that nothing has changed regarding nullness inference.

agentgt commented 2 years ago

Let us say I have a record that has an Optional and, as an extra (not a requirement), let's say I am doing null analysis where the default is non-null, that is, only nullable is marked (I stress analysis, not inference as you are saying).

In my code I expect this to fail fast with a null pointer exception:

someSetter(null);

I might even have a unit test expecting that behavior. It certainly will get flagged if mutation testing is done. So hopefully your change only breaks people's unit tests.

Now, for null analysis, without it being marked, this method appears to be:

someSetter(@NonNull SomeType argument)

And indeed that was the original contract. Thus your change will never be seen, will possibly be misunderstood, and arguably breaks the contract.

Furthermore, it's confusing. If you are following the Optional paradigm of preventing null, you wrap the value in an Optional immediately. Otherwise you are in the annotation-paradigm house, or you just don't care, and we can ignore that latter case for now.

Maybe if the setter were prefixed, like someSetterNullable or maybeSomeSetter, and we added @Nullable to the parameter, it might be better. But is it really worth it when you can, and really should, be calling ofNullable yourself explicitly?

Speaking of which, ofNullable is often considered bad practice (as is get, though I disagree on that one). I'll have to find the links, but I remember even Brian mentioning it at some point. Some of it is stupid things like breaking monad rules, but I believe there were some good points.

Finally, let's not forget why this whole issue was brought up in the first place: you assumed that javax.validation.NotNull would prevent you from passing null, which in my opinion was caused by another issue (basing non-null on a NotNull annotation regex), not by the setter needing to be nullable. Do you want that fixed (albeit with the proper annotations), or do you just want more @Nullable stuff exposed? I'm saying if you want to avoid null there should be less of that.

agentgt commented 2 years ago

@lpandzic Immutables, which is apparently what this library is supposed to emulate, does not do what you want.

import org.immutables.value.Value;

@Value.Immutable
public interface Stuff {
    Optional<String> someString();
}
And the builder Immutables generates for it:

  @Generated(from = "Stuff", generator = "Immutables")
  public static final class Builder {
    private String someString;

  //snip

    /**
     * Initializes the optional value {@link Stuff#someString() someString} to someString.
     * @param someString The value for someString
     * @return {@code this} builder for chained invocation
     */
    public final Builder someString(String someString) {
      this.someString = Objects.requireNonNull(someString, "someString");
      return this;
    }

See how this "reasonable" PR can lead to many more issues than you might think.

Randgalt commented 2 years ago

Can we pause this discussion for a bit please? When I get some more time I'll look into this deeper. I feel confident there's a good solution here that can meet everyone's needs.

agentgt commented 2 years ago

Yes, that will probably be good for all. However, I can't keep working on #106 or the other nullable-annotation bugs until some decisions are made, which is why I have been so back and forth on this: there are some serious previous design decisions that are impacting me.

So I'm fine with holding off indefinitely till some decisions are made (like lets copy what immutables does).

And I'm sorry for coming off rough and flooding the inbox, but I have already sunk some time into this (a lot more than just changing a call from of to ofNullable), as I sort of felt obligated given previous discussions on reddit and whatnot.

I'll stop work and commenting till the dust clears.