As far as I know, jdtls cannot determine what JDK a project uses; it is specified in the command that starts the jdtls server. nvim-jdtls does not expose any API for this either, because it runs the command with the configuration given by the user, so I don't think we can use them for this purpose. See this and this.
I find it preferable to read the pom.xml or the build.gradle, but from what I've seen in several projects there is no deterministic way to know which Java version is required.
A third idea could be to have a project dotfile that determines which JDK the project uses. I don't think that is the responsibility of the adapter, though, as it would be easier to use a JDK manager such as sdkman.
This is incorrect: the JDK you start the jdtls server with has nothing to do with the runtime the project is using.

I start my jdtls with Java 20, yet on a daily basis I use a single jdtls instance managing many projects at work, as workspaces, ranging from Java 8 to Java 17. I have a local Lua script set up to execute the method below to obtain the runtime version, so that I can run Maven correctly, for example by setting JAVA_HOME in jobstart before Maven is run, so each project is built with its corresponding version.
As I said, you can extract the current runtime for each project with something like this:
    -- `execute_command`, `bufnr` and `cb` come from the surrounding script:
    -- `execute_command` forwards a workspace/executeCommand request to jdtls,
    -- and `cb` receives the resolved settings plus the configured runtimes.
    local COMPILER = "org.eclipse.jdt.core.compiler.source"
    local LOCATION = "org.eclipse.jdt.ls.core.vm.location"

    local uri = vim.uri_from_bufnr(bufnr)
    execute_command({
      command = "java.project.getSettings",
      arguments = { uri, { COMPILER, LOCATION } },
    }, function(err, settings, client)
      local config = client.config.settings.java or {}
      config = config.configuration or {}
      if err then
        vim.notify(string.format("Unable to resolve or find project settings %s", err.message), vim.log.levels.WARN)
      else
        cb(settings, config.runtimes)
      end
    end, bufnr)
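For completeness, a minimal sketch of what the callback could then do with the resolved location, assuming the settings table maps the requested keys to their values; the run_maven helper name is made up for illustration:

    -- Run Maven with JAVA_HOME pointing at the project's JDK, without touching
    -- the global environment. `java_home` would be settings[LOCATION] from above.
    local function run_maven(java_home, cwd, args)
      vim.fn.jobstart(vim.list_extend({ "mvn" }, args), {
        cwd = cwd,
        -- merged with the current environment; only JAVA_HOME is overridden
        env = { JAVA_HOME = java_home },
        on_exit = function(_, code)
          vim.schedule(function()
            vim.notify("mvn exited with code " .. code)
          end)
        end,
      })
    end

The cb from the snippet above would then call something like run_maven(settings[LOCATION], project_root, { "test" }), with project_root being whatever root the caller resolved.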
By default jdtls will try to resolve the version specified in the pom, or you can force it to use a specific version by configuring a default runtime like this:
"java.configuration.runtimes": [
{
"default": true,
"name": "JavaSE-11",
"path": "/home/linuxbrew/.linuxbrew/opt/openjdk@11/libexec"
}
],
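For reference, the same runtimes can also be declared from Lua when the server is started through nvim-jdtls; a minimal sketch, with placeholder paths:

    -- Sketch of an nvim-jdtls client config carrying the runtimes; the cmd and
    -- the JDK paths below are examples only.
    local config = {
      cmd = { "jdtls" },
      settings = {
        java = {
          configuration = {
            runtimes = {
              { name = "JavaSE-11", path = "/usr/lib/jvm/java-11-openjdk", default = true },
              { name = "JavaSE-17", path = "/usr/lib/jvm/java-17-openjdk" },
            },
          },
        },
      },
    }
    require("jdtls").start_or_attach(config)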
The fallback should be considered when the user has no running LSP client.
@rcasia even with the addition of DAP support I believe the issue above is still valid. Here is an example project which uses Java 11: https://github.com/feltex/server-status.git. However, the java on my path is Java 21, and there is no way that is going to be compatible and able to compile and run the tests. I can port the implementation mentioned above into neotest-java, but I would need some guidance as to where best to put the fetching logic for the correct java executable (I would also like to add support for coc.nvim as well as the native LSP client).
If I try to debug one of the test cases from the project above I get the error neotest-java: .../nvim/lazy/neotest-java/lua/neotest-java/command/run.lua:20: error while running command cat target/neotest-java/classpath.txt exit code: 1 cat: target/neotest-java/classpath.txt: No such file..., even if I manually build the classpath.txt with mvn -q dependency:build-classpath -Dmdep.outputFile=target/neotest-java/classpath.txt (neotest then simply runs indefinitely, not reporting any output, error or otherwise).
The best idea would be to create lua/neotest-java/lsp/init.lua and put the fetching logic there.

I have been trying out running some jdtls commands and I have this code set aside, in case it helps: https://github.com/rcasia/neotest-java/commit/98f1585c6d652783911995786d16c2f7e28badf0
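A rough sketch of how such a module could be shaped, going through the native LSP client only; the function names and layout here are hypothetical, not what the linked commit contains:

    -- Hypothetical layout for lua/neotest-java/lsp/init.lua
    local M = {}

    --- Find an active jdtls client attached to the buffer, if any
    --- (vim.lsp.get_clients needs Neovim 0.10+; older versions have get_active_clients).
    function M.jdtls_client(bufnr)
      local clients = vim.lsp.get_clients({ bufnr = bufnr, name = "jdtls" })
      return clients[1]
    end

    --- Ask the language server for project settings and hand them to `cb`.
    --- Reports an error when no jdtls client is attached.
    function M.get_project_settings(bufnr, keys, cb)
      if not M.jdtls_client(bufnr) then
        cb({ message = "no jdtls client attached to buffer" }, nil)
        return
      end
      -- buf_request goes through the native LSP client, so there is no hard
      -- dependency on nvim-jdtls; jdtls is the client answering java.* commands
      vim.lsp.buf_request(bufnr, "workspace/executeCommand", {
        command = "java.project.getSettings",
        arguments = { vim.uri_from_bufnr(bufnr), keys },
      }, function(err, result)
        cb(err, result)
      end)
    end

    return M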
@rcasia I just did a quick and dirty implementation, however the entire binaries module is called from a luv event-loop callback, which means that no vim.* API methods can be called there, and that is a big problem!
Forgot to mention you can use require("nio") to call those functions in async mode.
Never used it, but in the binaries module we would need a synchronous invocation of vim.* API functions, so if you have an example that would be great.
Here is an example I have tested:
    local nio = require("nio")

    local M = {}

    ---@return nio.lsp.Client
    function M.jdtls()
      local future_client = nio.control.future()

      nio.run(function()
        local clients = nio.lsp.get_clients({ name = "jdtls" })
        if #clients == 0 then
          future_client.set_error("No jdtls clients found")
          return
        end
        future_client.set(clients[1])
      end)

      return future_client.wait()
    end

    ---Executes workspace command on jdtls
    ---@param cmd_info {command: string, arguments: any }
    ---@param timeout number?
    ---@param buffer number?
    ---@return { err: { code: number, message: string }, result: any }
    function M.execute_command(cmd_info, timeout, buffer)
      timeout = timeout and timeout or 5000
      buffer = buffer and buffer or 0
      local result = M.jdtls().request("workspace/executeCommand", cmd_info, timeout, buffer)
      return result
    end

    return M
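For instance, the earlier runtime query could sit on top of that wrapper. An untested sketch, where M refers to the module above and the call is expected to happen inside an async context such as nio.run:

    local LOCATION = "org.eclipse.jdt.ls.core.vm.location"

    -- Untested sketch: resolve the JDK location of the project owning `bufnr`,
    -- relying on the return shape annotated on M.execute_command above.
    local function project_jdk_location(bufnr)
      local response = M.execute_command({
        command = "java.project.getSettings",
        arguments = { vim.uri_from_bufnr(bufnr), { LOCATION } },
      }, 5000, bufnr)
      return response and response.result and response.result[LOCATION]
    end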
Hmm, I do not see any vim.api.* examples here; I am very much aware of the LSP client wrapper, but that is not what I am looking for, as I mentioned already. Maybe within the future we can invoke those calls?
Oh sorry, misunderstood. Yes, you have the same wrapper for vim.api.* functions by just replacing vim with nio, as nvim-nio provides this:

    --- Safely proxies calls to the vim.api module while in an async context.
    nio.api = proxy_vim("api")
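So inside a task something along these lines should work (a small sketch):

    local nio = require("nio")

    nio.run(function()
      -- nio.api mirrors vim.api but schedules the call onto the main loop
      -- when needed, so it is safe to use from the async context
      local bufnr = nio.api.nvim_get_current_buf()
      local name = nio.api.nvim_buf_get_name(bufnr)
      nio.api.nvim_echo({ { "current buffer: " .. name } }, false, {})
    end)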
https://github.com/rcasia/neotest-java/pull/153 fixes the issue, as it compiles using nvim-jdtls.
@rcasia but this PR also allowed people to build using coc, what about that? I see that the implementation has a dependency on nvim-jdtls, which is IMO not a great idea; the build command is simple enough that we can use the native LSP or coc to run it.
I didn't add it because I lack knowledge of how to use coc, but it should be really simple to implement if it has a way to just call an API to compile.

Also, it would be a good idea to have an optional function in the config for the compilation step, in case anyone wants to compile in a different way than through the LSP.
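As a sketch of what that could look like from the user's side, with a hypothetical compile option that does not exist yet:

    require("neotest").setup({
      adapters = {
        require("neotest-java")({
          -- hypothetical option: let the user override how the project is
          -- compiled instead of going through the LSP client
          compile = function(project_root)
            vim.fn.system({ "mvn", "-q", "-f", project_root .. "/pom.xml", "test-compile" })
          end,
        }),
      },
    })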
This is part of the original implementation ported on top of the latest changes - https://github.com/rcasia/neotest-java/pull/162
Maven builds fail because mvn by default uses the JAVA_HOME environment variable, while different projects can be configured to use different runtimes. Furthermore, I myself have 5 JDKs installed, and by default my JAVA_HOME points to JDK 20. However, my projects do not use Java 20; they use a range of versions from 11 to 17, which are explicitly configured in the jdtls runtimes, so jdtls is made aware of which runtime it must use (or the user can provide all available runtimes on the system, and jdtls can deduce the default one to use when it scans the gradle or pom files).
Before starting the mvn jobstart, one must export the correct JAVA_HOME variable pointing to the project's default JDK. That can be obtained either by querying jdtls or, as a fallback, by scanning the pom.
The runtime locations available on the system could be user-configurable in the plugin config, in case the fallback route is taken.
The primary idea, however, is to go with a jdtls query, which will return the currently configured default runtime and where it is located on the system. That will then be used to set JAVA_HOME in jobstart/termopen.
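A rough sketch of that fallback, assuming the user supplies a map of runtime locations in the plugin config; the pom properties scanned here vary between projects, so it is best effort only:

    -- Best-effort pom scan: read a source/target level from pom.xml and map it
    -- to a user-provided runtime path, e.g. runtimes = { ["11"] = "/usr/lib/jvm/java-11" }.
    local function jdk_from_pom(pom_path, runtimes)
      local file = io.open(pom_path, "r")
      if not file then
        return nil
      end
      local content = file:read("*a")
      file:close()
      local version = content:match("<maven%.compiler%.source>%s*([%d%.]+)%s*</maven%.compiler%.source>")
        or content:match("<java%.version>%s*([%d%.]+)%s*</java%.version>")
        or content:match("<release>%s*([%d%.]+)%s*</release>")
      return version and runtimes[version] or nil
    end

The resolved path would then be exported as JAVA_HOME for the jobstart/termopen call, just like when the location comes from the jdtls query.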