Closed tamasfe closed 2 years ago
Beware that this is still valid Rhai code if the user calls `bar` via `Engine::call_fn`. The variable `foo` can be provided in an external scope (in which case it would essentially be the same as the `let foo = 2;` here).
So, maybe a switch is needed to specify whether any scope (especially functions) is allowed to have undeclared variables.
Or, say, that `foo` inside `bar` can be recognized if it is in the scope's definition file?
Ah, this complicates things. I'll have to make a distinction between declarations in scripts and definition files.
It still might need to be considered, because a lot of Rhai usage is calling functions directly with `call_fn`.
I don't think we can express per-function external scopes, since I don't want to alter the Rhai syntax in the slightest. We can allow comments similar to `// @ts-ignore` to suppress resolution errors.
> We can allow comments similar to `// @ts-ignore` to suppress resolution errors.
OK. So you want to restrict the check within the parser itself. That's ok...
How about a simple syntax in the function or script's doc-comment?
//! This is my script
//!
//! # Expected Scope Variables
//! ```scope
//! /// a global variable that can be used for all functions
//! let global_variable: int;
//! ```
/// This is my function
///
/// # Expected Scope Variables
/// ```scope
/// /// a variable that is only available when calling func
/// let scope_variable: String;
/// ```
fn func() {
let x = "variable = " + scope_variable;
print(x);
}
As long as we agree on a standard syntax, the user can easily work this out as documentation itself, or filter the lines out. For example, in the proposed syntax above, we wrap definitions in a code block marked with `scope`.
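For illustration, extracting such a `scope` block from a function's doc comments could look roughly like this (a stdlib-only sketch; `extract_scope_block` is a hypothetical helper name, not rhai-lsp's actual code):

```rust
/// Collects the body of a ```scope fenced block from `///` doc-comment
/// lines. Returns `None` if the block is missing or unterminated.
/// Hypothetical sketch, not the tool's real implementation.
fn extract_scope_block(doc_lines: &[&str]) -> Option<Vec<String>> {
    let mut in_block = false;
    let mut body = Vec::new();
    for line in doc_lines {
        // Strip the doc-comment prefix (`///` plus one optional space).
        let stripped = line
            .trim_start()
            .strip_prefix("///")
            .map(|rest| rest.strip_prefix(' ').unwrap_or(rest))?;
        if !in_block {
            if stripped.trim_end() == "```scope" {
                in_block = true;
            }
        } else if stripped.trim_end() == "```" {
            return Some(body); // closing fence found
        } else {
            body.push(stripped.to_string());
        }
    }
    None
}

fn main() {
    let docs = [
        "/// This is my function",
        "/// ```scope",
        "/// let scope_variable: String;",
        "/// ```",
    ];
    let block = extract_scope_block(&docs).unwrap();
    assert_eq!(block, vec!["let scope_variable: String;"]);
    println!("{:?}", block);
}
```

The stripped lines can then be fed to the regular Rhai parser; the remaining problem, discussed below, is that their positions no longer match the original file.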
I'll have to think about it, but I like your proposal.
I implemented your proposal, and it will work just fine:
/// ```rhai-scope
/// let name: ?;
/// ```
fn hello() {
return name;
}
However, parsing after parsing is always tricky: currently, errors are swallowed and spans are incorrect, meaning that hover and go-to-definition will jump to the wrong locations.
In order to correct these, we just have to adjust the offsets so that the beginning of the code block is not 0 but rather its actual position in the file. However, to do this I'd probably refactor the code that adds symbols to the HIR, because we are recursively adding expressions and passing down a lot of state.
Is it in your parser or do you want me to add that offset option to Rhai's parser?
> However parsing after parsing is always tricky

It truly is. The trick is to provide an initial value for the position when starting to parse.
Also, stripping those `///` prefixes will change the character offset of each line...
> Is it in your parser

Yes, I parse the docs when adding the function to the HIR.
> It truly is. Trick is to provide an initial value to the position

Yes, that is the plan. I just don't want to further bloat the recursive code that adds things to the HIR with even more parameters, so I'll create some sort of context where I can put things like the starting offset.
> I just don't want to further bloat the recursive code

But you already need to pass in the current position in order to be able to "add on" to it, right? So you either already have that parameter, or you keep the current position in a variable.

The current line number is easy: you simply start with a larger number. The character offset for each line, however, I agree will be a problem, since you'd need a way to know what it is for each line.
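The arithmetic itself is small: the line number shifts by a constant, while the column shift depends on how many prefix characters were stripped from that particular line. A minimal sketch (all names here are hypothetical, not rhai-lsp's actual HIR code):

```rust
/// Maps a (line, column) position inside code extracted from doc
/// comments back to a position in the original file.
/// `block_start_line` is the file line where the block's body begins;
/// `prefix_cols[i]` is the number of characters stripped from the
/// i-th extracted line (e.g. 4 for a "/// " prefix).
fn to_file_position(
    block_start_line: usize,
    prefix_cols: &[usize],
    line: usize,
    col: usize,
) -> (usize, usize) {
    // The line shift is constant, but the column shift varies per
    // line -- which is exactly why a single global offset is not enough.
    (block_start_line + line, col + prefix_cols[line])
}

fn main() {
    // A block whose body starts at file line 10, every line prefixed
    // by "/// " (4 characters).
    let prefixes = [4, 4, 4];
    assert_eq!(to_file_position(10, &prefixes, 0, 0), (10, 4));
    assert_eq!(to_file_position(10, &prefixes, 1, 4), (11, 8));
    println!("ok");
}
```

Storing a `prefix_cols`-style table per block is one way to carry the per-line offsets without threading yet another parameter through the recursion.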
> But you already need to pass in the current position

`add_expression` receives the syntax node of the `fn` expression, but the doc comments are not part of it; right now I only temporarily pass down the parsed comments without any position information. There is a lot of stuff passed down and propagated, so I'll need a refactor anyway.
Also, it gets even more complicated: we need to parse each comment code block as a whole, and the following will completely bork the positions:
/// ```rhai-scope
/// /// Doc of "name"
/// let name: ?;
// Hello I'm a comment between doc comments
/// ```
fn hello() {
return name;
}
I don't think I'll handle these; if these issues pop up for users, we can detect splits in doc-comment code blocks and just throw a warning or error for them.
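Such a warning could be produced by scanning the doc-comment run for interruptions, along these lines (a hedged sketch with a hypothetical helper name):

```rust
/// Detects a "split" doc-comment block: a plain comment or other
/// non-doc line interrupting a run of `///` lines, which would bork
/// positions when parsing code embedded in the docs.
/// Hypothetical helper, not rhai-lsp's actual code.
fn doc_block_is_split(lines: &[&str]) -> bool {
    let mut seen_doc = false;
    let mut interrupted = false;
    for line in lines {
        if line.trim_start().starts_with("///") {
            if seen_doc && interrupted {
                return true; // doc comments resume after a gap
            }
            seen_doc = true;
        } else if seen_doc {
            interrupted = true;
        }
    }
    false
}

fn main() {
    let ok = ["/// ```rhai-scope", "/// let name: ?;", "/// ```"];
    let split = [
        "/// ```rhai-scope",
        "/// let name: ?;",
        "// Hello I'm a comment between doc comments",
        "/// ```",
    ];
    assert!(!doc_block_is_split(&ok));
    assert!(doc_block_is_split(&split));
    println!("ok");
}
```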
Can you simply ignore whitespace, so that all these comment blocks become a single block with the blank lines skipped over during tokenization? Probably not easily, as you're keeping a parse tree, not a syntax tree, so whitespace matters...
I can't ignore anything; the syntax tree has to remain lossless. Ignoring whitespace and comments would throw off pretty much all positions, because the IDE doesn't ignore them either.
In fact, the issue above comes from the fact that I'm ignoring whitespace and comments while parsing code in doc comments; a solution would be to include them in the doc-comment code.
But I stand by my last comment for now, I don't want to spend any significant effort on dealing with this unless it actually becomes a problem.
> But I stand by my last comment for now, I don't want to spend any significant effort on dealing with this unless it actually becomes a problem.

Yeah, I agree with that 100%. Better to have something reasonable going than to worry about the edge cases right now.
BTW, I just found out that you only handle the case of exactly three backticks immediately followed by `rhai-scope`.
In fact, the Markdown code-block standard allows fences made of three or more backticks (`` ` ``) or tildes (`~`).
Therefore, the following should be allowed:
~~~~~ rhai-scope
let x: int;
~~~~~
Let's not worry, for the time being, about a code block that does not start immediately but has whitespace before it...
Not sure if this can be easily handled, but I suppose if you use a regex to match it, it should be pretty straightforward to match the closing fence by reusing the match from the opening.
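A stdlib-only sketch of such fence matching, following the CommonMark rule that a closing fence must use the same character and be at least as long as the opening one (hypothetical helpers, not the actual implementation):

```rust
/// Recognizes a fence opener: three or more identical backticks or
/// tildes, followed by an info string such as "rhai-scope".
/// Returns the fence character, its length, and the info string.
fn fence_open(line: &str) -> Option<(char, usize, &str)> {
    let trimmed = line.trim_start();
    let ch = trimmed.chars().next()?;
    if ch != '`' && ch != '~' {
        return None;
    }
    let len = trimmed.chars().take_while(|&c| c == ch).count();
    if len < 3 {
        return None;
    }
    // Both fence characters are ASCII, so byte indexing by `len` is safe.
    Some((ch, len, trimmed[len..].trim()))
}

/// A line closes the fence if it consists solely of the same
/// character, at least as many times as the opener.
fn fence_closes(line: &str, ch: char, len: usize) -> bool {
    let trimmed = line.trim();
    !trimmed.is_empty()
        && trimmed.chars().all(|c| c == ch)
        && trimmed.chars().count() >= len
}

fn main() {
    let (ch, len, info) = fence_open("~~~~~ rhai-scope").unwrap();
    assert_eq!((ch, len, info), ('~', 5, "rhai-scope"));
    assert!(fence_closes("~~~~~", ch, len));
    assert!(!fence_closes("~~~", ch, len)); // too short to close
    println!("ok");
}
```

Capturing the opening run and requiring the closer to match it is exactly the "reuse the match from the opening" idea, just without a regex dependency.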
I initially wanted to use a markdown parser for this, and if we want to handle every edge-case, we will use one. I did not think this alone would warrant an additional dependency.
> I initially wanted to use a markdown parser for this, and if we want to handle every edge-case, we will use one. I did not think this alone would warrant an additional dependency.

True, that. Still, I think we should at least recognize tildes (`~`) in addition to backticks. Many people have the habit of using `~~~`.
BTW, does ``//! ```rhai-scope`` work at the global level? It seems to work fine for `///` on functions, but not for `//!`.
I've implemented this for functions only; for global definitions, you can use definition files.
Ah OK.
We currently make no distinction between closures and functions, so the following is valid:
We'll simply have to restrict function scopes.