gsscoder opened this issue 9 years ago
Hey! It would be nice to have user-defined attributes, and this is something I'll ask @MovingtoMars about. I'm not sure about reflection, though. For now, we aren't too sure; but this will definitely be considered.
@ark-lang/owners ???
Also I'll fix that type-inference thing: functions used to be `fn` instead of `func`, we must've forgotten to change that, so I'll fix it :)
I like this idea. Maybe there should be something in the syntax to distinguish user-defined attributes from compiler ones, so you'll still get an error if you have a typo on a compiler-defined attribute.
@MovingtoMars, if I can use .NET as a comparison: with reflection you're also able to read compiler-consumed attributes.
I think a cool thing might be: `core.compiler` (a library that can host a compiler-as-a-service version of Ark).

@gsscoder Yes, I agree with compiler and user-defined attributes being accessible by reflection.
I think different syntax for each (maybe `[compiler_defined]` and `[#user_defined]`) would help clarity and prevent the compiler from silently accepting misspelled compiler-defined attributes (for instance, `[depercated]`).
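The kind of check described above could be sketched like this (in Go, since the current compiler is written in Go; the whitelist and function names here are hypothetical, not actual Ark compiler API):

```go
package main

import (
	"fmt"
	"strings"
)

// compilerAttrs is a hypothetical whitelist of compiler-defined
// attribute names; the real set would live in the Ark compiler.
var compilerAttrs = map[string]bool{
	"deprecated": true,
	"attribute":  true,
}

// checkAttr validates an attribute name. User-defined attributes
// carry a leading '#' and pass through unchecked; anything else
// must be a known compiler-defined attribute.
func checkAttr(name string) error {
	if strings.HasPrefix(name, "#") {
		return nil // user-defined: the compiler doesn't validate these
	}
	if !compilerAttrs[name] {
		return fmt.Errorf("unknown compiler-defined attribute: [%s]", name)
	}
	return nil
}

func main() {
	fmt.Println(checkAttr("#my_attr"))   // nil: user-defined, accepted
	fmt.Println(checkAttr("deprecated")) // nil: known compiler attribute
	fmt.Println(checkAttr("depercated")) // error: the typo is caught
}
```

With a shared prefix like `#` for user attributes, the typo case can never be silently treated as a user-defined attribute.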
On the topic of reflection, I like the idea but I'm not sure if it will make it into the Go-based Ark compiler. It would probably be easiest to wait until the compiler gets bootstrapped in Ark to implement compiler-as-a-service.
On the other hand, we could begin CaaS earlier by exposing a C API from the Go-based Ark compiler, which could be wrapped by an Ark library using Ark's C FFI. Scratch that, after reading up on cgo I've realised that won't work. We'll have to wait for the bootstrap.
@MovingtoMars, I agree that the compiler-as-a-service module should be implemented after Ark is bootstrapped.
For user-defined attributes (beyond using a token to discriminate them from built-in ones), I guess you don't want to define a type for describing them.
Suppose I want to supply extra data:
[#bar(key1="value1", key2="value2")]
func foo(): int {
    return 0;
}
But without using a type we're forced to use a plain string map to represent the data, which could also be acceptable for the earliest releases. Another possible solution could be using types:
[attribute] // <- this is built-in/compiler-consumed
struct my_attr {
    str_field: string = "a string",
    int_field: int = 1234,
    float_field: float = 1.234
}

[#my_attr(int_field=9999, float_field=9.999)] // str_field <- takes default
func foo(): int {
    return 0;
}
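Just to make the string-map idea concrete, here's how a compiler written in Go could represent a parsed user-defined attribute (a sketch only; the type and field names are illustrative, not actual Ark compiler internals):

```go
package main

import "fmt"

// Attribute is a hypothetical compiler-side representation of a
// user-defined attribute: just a name plus a flat string map,
// as suggested for the earliest releases.
type Attribute struct {
	Name   string
	Values map[string]string
}

func main() {
	// [#bar(key1="value1", key2="value2")] would parse to:
	attr := Attribute{
		Name: "bar",
		Values: map[string]string{
			"key1": "value1",
			"key2": "value2",
		},
	}
	fmt.Println(attr.Name, attr.Values["key1"])
}
```

The typed variant (`[attribute] struct my_attr { ... }`) would instead reuse the compiler's existing struct machinery for defaults and type checking, at the cost of more up-front work.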
What do you think?
P.S.: maybe I'll add a post proposing reflection syntax.
@gsscoder For attribute values, I think a string type and an integer type should be enough (only string type is implemented so far). Unless, of course, you have any compelling reasons for others (floats etc).
I'd like to keep the attribute system simple, so I don't really think multiple key/value pairs for one attribute is a good idea.
@felixangell ?
> P.S.: maybe I'll add a post proposing reflection syntax.
Yes please, I don't think any of the team members have really thought about this!
@MovingtoMars I'm not sure; unless there are useful benefits to fleshing out the attributes a little more, I think it's a good idea to keep them simple. I also don't see any reason to support floats or other types for attributes.
And yeah, I never considered reflection in the initial development of this compiler; so please do! :+1:
@MovingtoMars, @felixangell, I don't think reflection can be implemented solely within the upcoming Ark standard libraries without some support at the compiler level. It should probably be done at both levels.
Before proposing something, I want to add that this could impact the eventual introduction of a macro subsystem. For example, will Ark implement LISP-like macros, becoming homoiconic? Or textual machinery like C's (which I'd discourage)? Or something in between (which is probably the better choice)?
Coming back to reflection, I imagine an operator that returns a simple AST in the form of `struct` instances.
A complete extension to the specification would be necessary to explain it, and writing it all here would be off-topic, so I'll proceed with a simple snippet. (Also, I haven't thought everything through...)
!use "reflect" // contains reflection types

struct Cat {
    name: str,
    age: int,
    weight: float
}

func main(): int {
    terry: Cat = { // fields omitted
    };

    // query reflection info
    tx: Type = reflect(terry); // alternative 1: from a variable binding
    ty: Type = typeof(Cat);    // alternative 2: from the type
    assert(tx == ty);

    // inspect reflection info
    fields: []Field = tx.fields();
    ageField: Field = fields[1];
    ageFieldType: Type = ageField.Type;
    assert(ageFieldType == typeof(int));
    assert(fields[2].Name == "weight");
    assert(fields[0].Type.Name == "str");
    assert(fields[0].Type.BuiltIn == true);

    return 0;
}
Just a starting point for reasoning, nothing more...
@gsscoder Nice snippet, it looks like an efficient way of doing reflection from a syntax and semantics point of view. I don't know how it'll work on the compiler-infrastructure side, though.
@gsscoder I'm liking the look of this, especially as it would go well with a Julia-like macro system, which I've been considering.
@SamTebbs33, @MovingtoMars, I'm very glad if I've been helpful in some way!
I have no knowledge of Julia's macro system, though I do think it's an interesting language.
Staying close to the macro discussion, I'd like to suggest this reading, which compares how a compilation check is implemented in F#, Nemerle and D.
In this section of the reference it is stated that only a few attributes are supported for the moment.
I guess this refers to compiler-specific attributes.
Is it currently possible to define custom attributes and examine them programmatically via reflection (if that feature is implemented)?
If not, will it be possible in the future?
Thanks for sharing this interesting project!
P.S. (off topic): in the Type Inference paragraph a function is defined using `fn`, maybe a typo?