parawanderer opened 7 months ago
If you have an issue when starting the debugger, it should not be related to the parsing of source files, which is done in the background. What could be the case is that, because Perl::LanguageServer parses all these modules (to be able to support goto-symbol etc.), it might get very big, so maybe you are running out of memory. Could you take a look with `top` at your memory situation when starting to debug?
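For example, assuming a Linux host with a procps-style `ps` (and that the language-server processes show up under the name `perl`), something like this sums their resident memory:

```shell
# Sum the resident set size (RSS) of all processes named "perl", in MB.
# Adjust the process name if your language-server process is named differently.
ps -C perl -o rss= | awk '{sum += $1} END {printf "perl total RSS: %.1f MB\n", sum/1024}'
```

Watching that number while the debugger starts up should show whether memory is the bottleneck.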
If it's not a memory issue, please set LogLevel to 2 and post the resulting output from the output pane of Perl::LanguageServer here.
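In settings.json that would look something like the following (assuming the setting is named `perl.logLevel`, as in the extension's README; verify against your installed version):

```json
{
    "perl.logLevel": 2
}
```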
Hello there,
First of all, this is a great extension! It has brought the development experience of working on Perl code much closer to that of more popular languages, and I greatly appreciate the effort that went into making this possible.
My situation is as follows: I'm working on a large Perl monorepo over an SSH connection. I have a configuration set up that gets the debugger running, but I'm facing problems with debugger startup time and timeouts. The pattern seems to be that the more Perl modules get loaded, the slower the experience and the higher the likelihood of a disconnect during the process. In some cases debugging never starts at all when I load modules that are known to be very large; the debugging log indicates this is due to a timeout.
I noticed in the "Output" window for the "Perl Language Server" that there's a constant stream of this:
And so on. Now, what I'm doing is opening the very large monorepo at its root in VSCode, and I have configured the extension's `perl.perlInc` setting in settings.json to point at a large subdirectory of this monorepo. I assume it's loading either all the files in the opened directory or all the files reachable from `perl.perlInc`. I'd really like to be able to use this extension when working on files that do fairly heavy imports, including the ones that are currently timing out, and I was wondering if there's any advice on how to make the setup more performant (possibly by not scanning all the files as it does now, if this is contributing to the poor performance).
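For reference, this is a sketch of the kind of settings.json I imagine could narrow the scan. I'm assuming the `perl.ignoreDirs` and `perl.fileFilter` setting names from the extension's README, and the paths and directory names here are placeholders:

```json
{
    "perl.perlInc": ["/path/to/monorepo/some/subdir/lib"],
    "perl.ignoreDirs": [".git", "build", "third_party"],
    "perl.fileFilter": [".pm", ".pl"]
}
```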
Thanks in advance