dotnet / roslyn

The Roslyn .NET compiler provides C# and Visual Basic languages with rich code analysis APIs.
https://docs.microsoft.com/dotnet/csharp/roslyn-sdk/
MIT License

RoslynCodeAnalysisService makes full-solution-analysis almost unusable with large projects #61832

Open lorcanmooney opened 2 years ago

lorcanmooney commented 2 years ago

Version Used: VS 17.2.3

Steps to Reproduce: Between each step, watch RoslynCodeAnalysisService.exe in task manager, and wait for it to settle down before moving to the next step.

  1. Acquire a solution with one or more large projects and open it in VS.
  2. Build the solution.
  3. Build the solution again.
  4. Open an empty .cs file from the large project.
  5. Move keyboard focus from the text editor to a tool window, then back to the editor.
  6. Open a file from another project, then Ctrl-Tab back to the original file.

Actual Behavior: After every single step, RoslynCodeAnalysisService kicks in and burns minutes of CPU time, despite the fact that nothing has changed after step 1. On my solution & machine, step 1 requires just over 3 minutes of CPU time, while each subsequent step burns almost the same 3 minutes.

During this, my system becomes noticeably less responsive. The editor becomes sluggish, the quick-actions menu seems to time out, and my app takes a noticeable performance hit during debugging sessions.

Expected Behavior: Maybe I'm misunderstanding the purpose of full-solution-analysis, but I expect it to make all diagnostics available all the time. So, since nothing has changed after step 1...

  2. Ideally no work needs to be done here.
  3. Since MSBuild reports everything is up-to-date, shouldn't RoslynCodeAnalysisService be able to figure this out too?
  4. You'll never convince me that opening a completely empty file requires 3 minutes of re-analysis.
  5. & 6. Simply navigating around the UI shouldn't incur minutes of computation!

I'd also expect that when background analysis is needed, it should be de-prioritized so that typing and debugging aren't negatively affected.

CyrusNajmabadi commented 2 years ago

@lorcanmooney can you file traces?

WRT the individual pieces:

So, since nothing has changed after step 1...

This is often not actually the case. For example, if a build happens, that will produce new dlls that the compilations will then point at. These new dlls could still change meaning, so all reanalysis must be done.

Open an empty .cs file from the large project.

Generally, when a file is opened, it switches representation entirely. This is often because it may have an on-disk representation (like utf8) but then gets an in-memory representation backed by a completely different text object. From Roslyn's perspective it's a new immutable snapshot, and that means recomputing everything, as that new snapshot may have different meaning than before.
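The snapshot model described above can be sketched in a few lines (an assumed model for illustration, not Roslyn's actual API): because snapshots are immutable and analysis results are keyed by snapshot identity, swapping in a new text object forces a full recomputation even when the characters are byte-for-byte identical.

```python
# Sketch (assumed names, not Roslyn's API): an immutable solution snapshot,
# with analysis results cached per snapshot identity.

class Snapshot:
    """Immutable bag of file texts; any change produces a new Snapshot."""
    def __init__(self, texts):
        self.texts = dict(texts)

    def with_text(self, path, text):
        texts = dict(self.texts)
        texts[path] = text
        return Snapshot(texts)  # the old snapshot is untouched

analysis_runs = 0
_cache = {}

def diagnostics_for(snapshot):
    """Full reanalysis unless this exact snapshot was seen before."""
    global analysis_runs
    if snapshot not in _cache:      # keyed by object identity (default hash)
        analysis_runs += 1          # stands in for minutes of CPU work
        _cache[snapshot] = f"analyzed {len(snapshot.texts)} file(s)"
    return _cache[snapshot]

on_disk = Snapshot({"Empty.cs": ""})
diagnostics_for(on_disk)            # first full analysis
diagnostics_for(on_disk)            # same snapshot: cache hit, no new work

# "Opening" Empty.cs replaces its on-disk text with an editor-backed text
# holding the same characters; the snapshot is new, so analysis runs again.
opened = on_disk.with_text("Empty.cs", "")
diagnostics_for(opened)
```

In this toy model, `analysis_runs` ends at 2: identical content does not help, because the cache key is the snapshot object itself.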

You'll never convince me that opening a completely empty file requires 3 minutes of re-analysis.

This is not correct. Consider any potential analyzer or source generator. All of them might produce different outputs on any file for any reason. It's totally legal for an analyzer/sg to even report different diagnostics for these empty files.
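A toy model of that point (a hypothetical analyzer sketched for illustration, not a real Roslyn `DiagnosticAnalyzer`): since an analyzer may consult compilation-wide state, its diagnostics for an empty file can legitimately change whenever anything else in the compilation changes.

```python
# Hypothetical analyzer whose per-file diagnostics depend on the whole
# compilation, so even an untouched empty file must be reanalyzed.

def analyze_file(path, compilation):
    """Reports on *every* document (even empty ones) whenever the
    compilation as a whole contains more than two documents."""
    if len(compilation) > 2:
        return [f"{path}: TOO_MANY_FILES"]
    return []

small = {"A.cs": "class A {}", "Empty.cs": ""}
large = {**small, "B.cs": "class B {}"}   # add one unrelated file

# Nothing in Empty.cs changed, yet its diagnostics did:
before = analyze_file("Empty.cs", small)  # no diagnostics
after = analyze_file("Empty.cs", large)   # new diagnostic on the empty file
```

This is why the IDE cannot assume an empty (or unchanged) file yields unchanged diagnostics without rerunning the analyzers.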

Simply navigating around the UI shouldn't incur minutes of computations!

This should not impact diagnostic computation (just the order we do things). If you're seeing this, we'd need some traces. @mavasani may also have ideas for logs to collect to help figure out what's going on here.

CyrusNajmabadi commented 2 years ago

Note: I would not ever recommend Full Solution Analysis on a large project. It's actively something that will use enormous amounts of CPU, as many things may require a full reanalysis for full correctness.

--

Note: This is also going through a rewrite now to a different system. But I would still expect that if you're doing full-solution-analysis you will see extremely high CPU all the time. Even if we optimized for the above cases, you'll still see the exact same heavy CPU usage the moment you start typing, as edits will absolutely require full reanalysis no matter what.

Eli-Black-Work commented 2 years ago

@CyrusNajmabadi We're seeing somewhat similar behaviour. Should we file traces via the VS feedback tool or by some other means? 🙂

CyrusNajmabadi commented 2 years ago

@Bosch-Eli-Black sure. Traces would help. That said, FSA is expensive by definition. It's doing analysis of everything. This is why it's off by default.

Eli-Black-Work commented 2 years ago

@CyrusNajmabadi Thanks 🙂 Do you mean traces generated by passing /reportanalyzer to MSBuild or traces generated via the VS feedback tool?

CyrusNajmabadi commented 2 years ago

The latter. Thanks!

kirsan31 commented 1 year ago

---------UPD----------

Yes - wrong place :( Correct one: https://github.com/dotnet/roslyn-analyzers/issues/6412

---------UPD----------

I am not 100% sure if this is the right place to post - sorry. But lately (after 17.4) I can't work normally on systems with 16GB of memory or less. ServiceHub.RoslynCodeAnalysisService eats it all and then very often crashes VS.

My developercommunity feedback with all needed data: https://developercommunity.visualstudio.com/t/Flow-analysis-for-CA2000-DisposeObjects/10237578 Same feedback: https://developercommunity.visualstudio.com/t/ServiceHubRoslynCodeAnalysisService-Tak/10230728

Changing this setting: [screenshot of the background analysis scope setting]

from entire solution to open documents or current document helps, but not for long; as I understand it, only until the problematic document is opened...