sourcegraph / go-langserver

Go language server to add Go support to editors and other tools that use the Language Server Protocol (LSP)
https://sourcegraph.com
MIT License

High memory usage #310

Open ehames opened 6 years ago

ehames commented 6 years ago

I'm using Atom with its go-langserver integration. Memory usage is close to 18 GB as I type this. I cannot capture memory/CPU profiles because the pprof port is not open:

$ go tool pprof -svg $GOPATH/bin/go-langserver http://localhost:6060/debug/pprof/profile > cpu.svg
Fetching profile over HTTP from http://localhost:6060/debug/pprof/profile
http://localhost:6060/debug/pprof/profile: Get http://localhost:6060/debug/pprof/profile: dial tcp [::1]:6060: connect: connection refused
failed to fetch any source profiles
[screenshots: memory usage, 2018-08-23 11:09 AM and 11:12 AM]

BTW, this started when I renamed a function, which produced many compilation errors. After 4 or 5 minutes, memory went back down to 1 GB, but my computer was quite slow in the meantime.

joshua commented 6 years ago

I'm experiencing the same issue, along with very high CPU usage. It is much more noticeable with diagnostics enabled.

keegancsmith commented 6 years ago

Ah, yes, this is very likely related to having diagnostics enabled. Enabling diagnostics starts the typechecker, which can be a huge memory hog. Unfortunately there is no quick fix here other than disabling diagnostics.
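
For reference, this is roughly how the toggle looks from the client side (a sketch assuming the VS Code Go extension and go-langserver's -diagnostics flag; check go-langserver -help for the exact flag set in your version):

"go.languageServerFlags": []                   // diagnostics off (the default)
"go.languageServerFlags": ["-diagnostics"]     // diagnostics on, at the cost of typechecker memory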

lloiser commented 6 years ago

Note: as stated by @keegancsmith, this happens whenever the typechecker gets started, which happens in quite a few code paths: diagnostics, references, implementation, and so on. See https://sourcegraph.com/github.com/sourcegraph/go-langserver@7df19dc017efdd578d75c81016e0b512f3914cc1/-/blob/langserver/loader.go#L27:23&tab=references

keegancsmith commented 6 years ago

Yeah, this is a pretty bad problem for us, since the typechecking is so useful :) I think the future is bright (once we have time to implement it): Go's build caching now records a lot more useful information, which means we can probably rely on the on-disk cache that Go ships with today.

ehames commented 6 years ago

This seems to be closely related to #209. Both issues are due to the typechecker.

harikb commented 6 years ago

I have to periodically kill the language server and live with missing features in VS Code. Is there anything I can help with? What logs/traces can I extract the next time this happens?

EmpireJones commented 5 years ago

Having the same issue here. It sometimes takes up all available memory (e.g. 30 GB), freezing the OS. Just a guess, but this feels more like a bug than simple inefficiency. Any details I can provide?

slimsag commented 5 years ago

You can set "go.languageServerFlags": ["-pprof", ":6060"] in your VS Code settings and then follow the steps in the README to capture a heap profile and upload the SVG. That would tell us where the memory is allocated. I agree this looks more like a regression, but I think we can't know without more details.
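
Concretely, once the server is restarted with that flag, the capture mirrors the command tried earlier in this thread (/debug/pprof/heap and /debug/pprof/profile are the standard net/http/pprof endpoints):

$ go tool pprof -svg $GOPATH/bin/go-langserver http://localhost:6060/debug/pprof/heap > heap.svg
$ go tool pprof -svg $GOPATH/bin/go-langserver http://localhost:6060/debug/pprof/profile > cpu.svg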

If the memory usage is coming from typechecking rather than a regression such as a memory leak, then we likely cannot do anything yet. The long-term fix will be the official Go language server, which the Go developers are actively working on (it is a difficult problem to solve).

doxxx commented 5 years ago

I've been using the language server since yesterday and it was relatively well behaved, using only a few hundred MB. This morning I started making some edits, and the language server started consuming 80-100% CPU while memory spiked to 5 GB. I managed to capture a heap snapshot: heap.zip

I also managed to catch the tail end of the CPU spike: cpu.zip. It looks like it might just be the garbage collector, though. If it happens again, I'll try to collect a CPU profile first.

I should also note that this only lasted a minute or two before CPU and memory usage dropped back down.

doxxx commented 5 years ago

This is probably a better CPU profile than the previous one: cpu.zip

doxxx commented 5 years ago

This time the heap grew to 10 GB: heap.zip

slimsag commented 5 years ago

@doxxx

Both traces show the memory being allocated in the golang.org/x/tools/go/loader package, which is the entrypoint for typechecking. This is unfortunate and a known issue, but currently expected. It will improve in the future when the official Go language server is released.
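
For context, here is a minimal sketch of what that entrypoint does (illustrative only, not the server's actual code, which lives in langserver/loader.go): it parses and typechecks the requested package plus all of its transitive dependencies and keeps the results in memory.

package main

import (
	"fmt"
	"log"

	"golang.org/x/tools/go/loader"
)

func main() {
	var conf loader.Config
	// Request a single package; Load() parses and typechecks it
	// together with every transitive dependency.
	conf.Import("net/http")

	prog, err := conf.Load()
	if err != nil {
		log.Fatal(err)
	}

	// Every dependency's AST and type information stays resident in
	// prog, which is why a large workspace can cost gigabytes.
	fmt.Printf("loaded %d packages\n", len(prog.AllPackages))
}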

If you notice the memory usage does not drop down after a minute or two, that would indicate a leak and a bug we could fix, however.

doxxx commented 5 years ago

I just had another occurrence: the go-langserver process has been at ~20 GB, with ~100-300 MB/s disk I/O and ~100% CPU, for about 10 minutes so far.

Here are the heap and CPU graphs: heap_cpu_2.zip

It still seems to be the loader package, although the second dump appears to involve the build package as well.

Is there nothing that can be done about this in the interim?