DanTup opened this issue 6 years ago
> Do nothing and assume that because the requests are parallel it won't be too bad
Server is single threaded, so all requests are handled sequentially, not in parallel.
> Server is single threaded, so all requests are handled sequentially, not in parallel.
Oh yeah, I was thinking about my round-trips. That said, depending on how you compute quick fixes, it might not make much difference in the server (I guess you'd walk the tree looking for Widgets and then compute their fixes, which might not be much different from me asking you for each location one at a time?). There should still be some savings in the overhead of each request (and if you have to search for the element at the offset I give you, I guess it saves a bunch of that work).
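For context, the per-node approach on the client side amounts to one `edit.getAssists` round-trip per outline node, roughly like the sketch below; the `AnalysisServer` wrapper and the node shape are assumptions for illustration, not the actual Dart Code implementation.

```typescript
// Rough sketch of the "just send them all" approach: one edit.getAssists
// request per outline node. The AnalysisServer wrapper and OutlineNode shape
// are assumptions; edit.getAssists itself takes a file/offset/length triple.
interface OutlineNode {
  offset: number;
  length: number;
  children: OutlineNode[];
}

interface AnalysisServer {
  sendRequest(method: string, params: unknown): Promise<unknown>;
}

async function assistsForOutline(server: AnalysisServer, file: string, roots: OutlineNode[]) {
  // Flatten the outline so we can issue one request per node.
  const nodes: OutlineNode[] = [];
  const visit = (node: OutlineNode): void => {
    nodes.push(node);
    node.children.forEach(visit);
  };
  roots.forEach(visit);

  // The requests overlap on the client, but the single-threaded server still
  // answers them one at a time.
  return Promise.all(
    nodes.map((node) =>
      server.sendRequest("edit.getAssists", {
        file,
        offset: node.offset,
        length: node.length,
      }),
    ),
  );
}
```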
Do you have a preference for these? I think 3 is most convenient for me, but I don't know what overhead it adds for other editors that don't care. It's only really the IDs of the assists I need, but supplying just them might be a little VS-Code specific!
I don't like this. If VS Code does not support dynamic menus, we should request this feature from VS Code, not perform lots of computation to work around this.
I'm not sure how this would work - they can't stall the rendering of the menu while waiting for a response from us? Slow context menus make a poor user experience.
Does IntelliJ have these on the context menu, or just the buttons? If the context menu, how does it get the data before the menu is rendered?
When the user right-clicks on a tree item, this first causes a selection event, and when selection happens we ask for assists. Once we receive assists from the Analysis Server, we update the toolbar. In the context menu provider we wait up to 100 ms for the assists (scheduled on selection) to arrive. This is a bit generous, but we usually get a response from the server in 3-5 ms, so there is never any noticeable delay between the right click and displaying the context menu. It is OK-ish even with the full 100 ms, although if you know what to look for, you will notice it.
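For illustration only (the real IntelliJ plugin is Java, not TypeScript), the pattern described above amounts to remembering the in-flight request and racing it against a 100 ms cap when the menu is built; `requestAssists` here is a placeholder for the real server round-trip.

```typescript
type Assist = { id: string; message: string };

// Placeholder for however the client actually asks the server for assists.
async function requestAssists(file: string, offset: number, length: number): Promise<Assist[]> {
  return [];
}

let pendingAssists: Promise<Assist[]> | undefined;

// Selection happens first on right click, so the request is already in flight
// (and usually already answered, in a few milliseconds) when the menu is built.
function onSelectionChanged(file: string, offset: number, length: number): void {
  pendingAssists = requestAssists(file, offset, length);
}

function withTimeout<T>(promise: Promise<T>, ms: number, fallback: T): Promise<T> {
  return Promise.race([
    promise,
    new Promise<T>((resolve) => setTimeout(() => resolve(fallback), ms)),
  ]);
}

// Wait at most 100 ms; on a timeout the menu simply shows without the assists.
async function assistsForContextMenu(): Promise<Assist[]> {
  return pendingAssists ? withTimeout(pendingAssists, 100, []) : [];
}
```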
I've raised a request with Code (https://github.com/Microsoft/vscode/issues/45325) but I'm not hopeful it'll be accepted :( Honestly, I think it'd be fair to reject - laggy UIs really suck and it's a valid question about how we have the data to render the tree node but not the data to know which context menu commands are valid.
If it is accepted, it's going to be at least a month away (maybe two if they put a proposed API into the next release). We can decide what to do once we know, though (we're already blocked on a bunch of things, so I suspect it might be a little while before we have something we're happy to ship anyway).
This does seem expensive - that we'd need to request the available refactorings for every widget node, on most file changes. I'm not sure what workarounds there might be. Perhaps we just rely on the source code as the best place for users to discover these actions?
@scheglov The case for dynamic context menus doesn't seem to be progressing well... Is there any possibility of including info about which refactors are available inside the FlutterOutline nodes (option 3 of the original list)? Is it expensive to look them up there, or is the logic for deciding which are available simple enough that it wouldn't add much cost (we're only interested in those seven or so Flutter refactors)?
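For concreteness, option 3 would mean the server attaching assist information to each outline node, roughly as below; the node shape is simplified and the `availableAssistIds` field is purely hypothetical, not part of the current FlutterOutline structure.

```typescript
// Simplified view of a FlutterOutline node; only a few of its real fields are
// shown, and availableAssistIds is a hypothetical addition, not existing API.
interface FlutterOutlineNode {
  kind: string;
  offset: number;
  length: number;
  children?: FlutterOutlineNode[];
  // Hypothetical: ids of the handful of Flutter assists that apply to this
  // node, so the client never has to issue per-node getAssists requests.
  availableAssistIds?: string[];
}
```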
I would not like computing assists for all nodes. This would make it slow for all clients, including IntelliJ-based ones. It is 3-5 ms per node, and that will add up quickly if we do it for all nodes. We could limit this to only the nodes that represent items in the outline, but even then there can be quite a few.
Ah, yes, I was talking only about Flutter widgets in the outline, not about all AstNode(s) in the compilation unit (ouch). The calculator demo in flutter_gallery has about 50 widget creations. Adding 150-250 ms to every change is quite a cost.
Could it be opt-in?
If we don't get dynamic context menus in Code (I'm not hopeful it'll happen soon - feel free to 👍 or add support to https://github.com/Microsoft/vscode/issues/45325!) then we'll have to decide between not having the context menu (which will make Flutter Outline less useful) or paying this price (either internally in the server, or by Code just sending lots of requests).
Btw, we're only interested in some specific refactors - is there no short-cut to figuring out which ones are applicable without computing all assists?
Just to chip in, my sense is that the improved discoverability from adding a context menu likely isn't worth the additional performance cost (and the small bit of customization to the API). My $0.02.
> ... is there no short-cut to figuring out which ones are applicable without computing all assists?
There isn't currently, but there could be. `AssistProcessor` could have a flutter-outline-specific (ahem, flutter-layout-view-specific) entry point.
Another approach would be to show all actions in the menu, but when one is invoked ask the server for the actual Quick Assist, and report it as unavailable if it is not returned or the request times out.
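A minimal sketch of that fallback, assuming placeholder helpers (`requestAssists`, an arbitrary 500 ms cap) rather than real Dart Code or server API: every action is always shown, and only on invocation do we check whether the assist really applies.

```typescript
import * as vscode from "vscode";

type Assist = { id: string; message: string };

// Placeholder for the real round-trip to the analysis server.
async function requestAssists(file: string, offset: number, length: number): Promise<Assist[]> {
  return [];
}

function withTimeout<T>(promise: Promise<T>, ms: number, fallback: T): Promise<T> {
  return Promise.race([
    promise,
    new Promise<T>((resolve) => setTimeout(() => resolve(fallback), ms)),
  ]);
}

// Every menu item is always shown; only when one is invoked do we check with
// the server whether the assist really applies at this location.
async function invokeOutlineAction(assistId: string, file: string, offset: number, length: number) {
  const assists = await withTimeout(requestAssists(file, offset, length), 500, []);
  const assist = assists.find((a) => a.id === assistId);
  if (!assist) {
    // Not returned in time, or not applicable here.
    vscode.window.showInformationMessage("That action is not available here.");
    return;
  }
  // Apply the returned change (omitted from this sketch).
}
```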
It doesn't feel like a great user experience to show items that then fail when clicked. I have added a comment to the Code issue asking whether they'd be more likely to allow us to enable/disable the items asynchronously if they won't let us control the menu asynchronously. It would be up to us to make sure it's fast (and to handle the case where a user manages to invoke an item before we disable it).
It's not looking good for getting support in Code:
> Blocking/delaying the right click with extension code won't happen
I've asked about allowing the menu to appear immediately but letting us (asynchronously) mark some commands as disabled, so they'd become grey - I think we can do that fast enough that the user wouldn't notice the change, and we can gracefully reject the command if the user manages to click it first.
Is there a specific part of computing which refactors are available that is particularly expensive? If there were an option to do less expensive work that filters out nonsense refactors but perhaps leaves a few false positives, that could still provide a good user experience.
There are some use cases outside of VS Code where getting an outline with all refactors might also be useful. For example, if we showed icons that trigger some refactors directly in the code editor in IntelliJ, it would be nice if we could get them with the outline.
In VS Code we need the context menu information at the time of rendering a tree node (context menus must be declared up-front with filters, and then we apply context information to a node to tell VS Code which nodes have which contexts). This means that in order to start the Outline expanded by default we send a `getAssists` request for every single node. I think there are a couple of options:
- … (`dartElement == null`)

For now I'm doing 2 so I can progress - the collapsed-by-default tree is annoying. LMK what you think!
@devoncarew @scheglov
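To spell out what "declared up-front with filters" means on the VS Code side: menu items are contributed statically in package.json with a `when` clause, and the extension can only influence matching by setting `contextValue` on each tree item as it is rendered, which is why the applicable assists need to be known before the node is shown. The view id, command id, and context values below are illustrative, not the actual Dart Code contributions.

```typescript
// Illustrative only: the menu entry itself lives in package.json, e.g.
//   "menus": { "view/item/context": [
//     { "command": "flutter.wrapCenter",
//       "when": "view == flutterOutline && viewItem == widgetWithAssists" } ] }
// All the extension can do at render time is pick the contextValue each node
// exposes, so anything the `when` clause needs must already be known here.
import * as vscode from "vscode";

function toTreeItem(label: string, isWidget: boolean, hasAssists: boolean): vscode.TreeItem {
  const item = new vscode.TreeItem(label, vscode.TreeItemCollapsibleState.Expanded);
  item.contextValue = isWidget && hasAssists ? "widgetWithAssists" : "other";
  return item;
}
```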