dotnet / command-line-api

Command line parsing, invocation, and rendering of terminal output.
https://github.com/dotnet/command-line-api/wiki

question: Do you have info on the philosophy of using dotnet-suggest #1000

Open michaelgwelch opened 4 years ago

michaelgwelch commented 4 years ago

My motivation for looking at command-line-api was that my JavaScript library took 0.2 seconds or so to do completions, even when it was a fixed small list. That seemed to be related to Node's launch time.

My prototype dotnet app returned completions in 0.02 seconds or less. I perceived them to be as quick as git's completions, which are super fast.

So I convinced myself that I was relying on completions enough with my app that I should switch to dotnet so I'd get lightning-fast completions.

But relying on dotnet-suggest seems to make things much slower than my Node app.

I'm curious what the motivation was. I know one obvious reason is that you can have one shim file for all dotnet tools. Are there other goals as well? Perhaps the performance issues I'm seeing are related to the fact that I'm not really seeing completions work reliably, so there may be another factor in play on my box.

I see you are measuring performance, so it's obviously important. Is it "fast" normally? I use git as my baseline; it always seems "instantaneous".

jonsequitur commented 4 years ago

The primary goal of the dotnet-suggest approach is to consolidate the completion logic in one place so that you don't have to write different completion scripts for each shell, which is laborious enough that most people don't do it. Because the completions come from the actual tool's parser, rather than a separate piece of code (a bespoke, shell-specific script), it also means the completions will be more correct.
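To make that concrete: the shell script that dotnet-suggest registers knows nothing about any individual tool's grammar. It just forwards the raw command line and cursor position, and the tool's own parser answers through the suggest directive, roughly like this (the app name and output here are placeholders, and the exact directive syntax may differ between preview builds):

```
$ myapp "[suggest:8]" "myapp co"
config
connect
```

dotnet-suggest brokers that call for every registered tool, which is why a single script per shell covers all of them.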

It's expected at this stage that the completions will be slower than a separate script. This can be fixed, but this project is still in preview and we haven't done a lot of performance optimization. One area of planned optimization is using C# 9 source generators to improve parser performance, and potentially also emitting shell-specific scripts per platform to take dotnet-suggest out of the loop and avoid the extra process (of dotnet-suggest). Another approach that's been discussed is to allow the parser to be serialized and stored by dotnet-suggest, avoiding an extra process (of your tool). I'm confident we can close the performance gap, but we need to do some experimentation to see by how much.

michaelgwelch commented 4 years ago

> The primary goal of the dotnet-suggest approach is to consolidate the completion logic in one place so that you don't have to write different completion scripts for each shell, which is laborious enough that most people don't do it. Because the completions come from the actual tool's parser, rather than a separate piece of code (a bespoke, shell-specific script), it also means the completions will be more correct.

Can't the completion logic be consolidated in the command-line API that is included in each tool? (Much like how you already know how to invoke my commands/subcommands based on the Invoke handler I register, and you know how to ask for suggestions based on the suggest directive.)

It seems what would be needed is a shim that passes $COMP_LINE, $COMP_WORDS, etc. into each tool. You'd define what that template looks like; it would be the same template for each tool, with only the command name changing. Then you might want an extra built-in subcommand like install-shim that could be run once per tool (or maybe automated as part of dotnet tool install) to add the shim to the standard completions folder for bash (and do whatever is needed for PowerShell).
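For bash, I'm picturing something along these lines. This is just a rough sketch: mytool is a placeholder, and I'm assuming the tool can be invoked directly with the same suggest directive that dotnet-suggest uses (the exact directive syntax is a guess on my part):

```bash
# Hypothetical per-tool shim; only the command name would change from tool to tool.
_mytool_completions()
{
    local candidates
    # Ask the tool's own parser for completions at the current cursor position.
    candidates=$(mytool "[suggest:${COMP_POINT}]" "${COMP_LINE}" 2>/dev/null)
    # Offer only the candidates that match the word currently being completed.
    COMPREPLY=( $(compgen -W "${candidates}" -- "${COMP_WORDS[COMP_CWORD]}") )
}
complete -F _mytool_completions mytool
```

That would skip the extra dotnet-suggest process entirely; the only thing launched is the tool itself.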

Perhaps all of this is a moot point if you already know things can be optimized.

> It's expected at this stage that the completions will be slower than a separate script. This can be fixed, but this project is still in preview and we haven't done a lot of performance optimization.

Sure, understood. Perhaps I'll keep my little Node script project alive a little longer and keep an eye on this project.

Thanks for all your time.