dfinity / motoko

Simple high-level language for writing Internet Computer canisters
Apache License 2.0

Allow explicit imports #2354

Closed kritzcreek closed 2 years ago

kritzcreek commented 3 years ago

Story

As a developer, I want to be able to explicitly import unqualified types and functions from other modules.

Motivation

Currently we can only import modules as a whole, under a qualified identifier. This leads to a lot of Result.Result, which looks silly and makes people create module-local private aliases (a workaround sketched below). This is bad because:

  1. They're adding boilerplate to work around a compiler limitation.
  2. They don't get hyperlinked documentation, because their type signatures point to module-local private type definitions.
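
For illustration, here is a sketch of the workaround described above, assuming the base library's Result module (the helper shown is hypothetical and only mirrors the demo below):

    import R "mo:base/Result";

    module {
      // Module-local private alias, only there to avoid writing R.Result
      // in every signature; generated documentation links to this private
      // alias instead of the base library's Result type.
      type Result<Ok, Err> = R.Result<Ok, Err>;

      public func fromOption<Ok, Err>(err : Err, opt : ?Ok) : Result<Ok, Err> {
        switch opt {
          case null { #err(err) };
          case (?ok) { #ok(ok) };
        };
      };
    }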

Demo

I want something like this to compile:

    import Result "Result";
    // Showcases both type-level and value-level imports
    import { type Result, mkErr } "Result";

    module {
      public func fromOption<Ok, Err>(err : Err, opt : ?Ok) : Result<Ok, Err> {
        switch opt {
          case null { mkErr(err) };
          case (?ok) { Result.mkOk(ok) };
        };
      };
    }

Design

Allowing object patterns in imports isn't enough by itself, because we can't project type components that way.

It might also be nice to have a more compact format for the "import both as qualified and explicit" pattern, as shown above.

nomeata commented 3 years ago

> Allowing object patterns in imports isn't enough by itself, because we can't project type components that way.

Would that be hard to add, @crusso? Would this make sense as an orthogonal feature?

> It might also be nice to have a more compact format for the "import both as qualified and explicit" pattern, as shown above.

Again, if we add as-patterns (@, the dual of alternative patterns) to the language, we could use this here, right?

crusso commented 3 years ago

Yes, I think this might be doable, hopefully just by extending object patterns and allowing imports with non-trivial patterns. Not sure how type field patterns would look for synthesized (not checked) patterns though - I guess you'd just have to provide the type definition.

The other question is whether you should be allowed to refer to the type identifier within the enclosing pattern, or only in the continuation of the let. I also worry that our super-liberal recursion might pose problems here.

nomeata commented 3 years ago

Can’t you always desugar (conceptually) object patterns into accessors? If so, then hopefully there are no (conceptual) problems with recursive typing. But only trying it will tell…
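
A small sketch of the intended desugaring, with made-up field names:

    let obj = { a = 1; b = "x" };

    // The object pattern ...
    let { a; b } = obj;

    // ... would conceptually desugar into plain accessors:
    // let a = obj.a;
    // let b = obj.b;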

rossberg commented 3 years ago

@crusso, I think it's inevitable that type fields are not allowed in synthesis patterns. But the only place with synthesis patterns is function parameters, where they do not seem particularly useful anyway.

@nomeata, desugaring only works for patterns in analysis position, but those are unproblematic anyway. It's the synthesis case that we cannot handle. Consider:

    func f({a; type B}) { ... }

nomeata commented 3 years ago

Is that different from the following?

    func (o) { do { let a = o.a; type B = o.B; … } }

ggreif commented 2 years ago

EDIT: please ignore most of what follows. #3076 implements specific imports and the pattern syntax { external = local } allows renaming out of the box.


Wouldn't this be an instance of an and-pattern, with the small twist that the RHS is a label pattern (appearing in the record to be destructured) and that it only applies to record values? Also, the RHS is meant to be hidden from the enclosing/importing scope, right? Thus there are only bindings on the LHS, and things get much easier to check.

Just spotted https://github.com/dfinity/motoko/issues/2354#issuecomment-779204472 above; we seem to be in agreement! @nomeata

    import { type Result; mkErr = fooErr } "Result";

would import fooErr renamed as mkErr. Thus this could be regarded as the pattern dual of unpunned field definitions when building records. Also, instead of the simple named pattern mkErr, we could in turn deconstruct further.
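
For reference, a minimal sketch of the implemented syntax mentioned in the EDIT above, with { external = local } renaming (module path and field names are illustrative, assuming the base library's Result):

    import Result "mo:base/Result";
    // Specific import with renaming: the external isOk is bound locally
    // as resultIsOk.
    import { isOk = resultIsOk } "mo:base/Result";

    module {
      public func describe<Ok, Err>(r : Result.Result<Ok, Err>) : Text {
        if (resultIsOk(r)) { "ok" } else { "err" }
      };
    }

Importing the same module twice covers the "both qualified and explicit" use from the demo in the issue body.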