Closed: schalkk closed this issue 2 years ago.
Could you reproduce this error, or it just occurs randomly?
It looks like the TCP message between vscode and languageserver could be incomplete in this case.
@randy3k Do you think we could tryCatch reading data from the LSP client and ignore the message if parsing fails? From the LSP specification, it looks like ignoring an LSP message might lead to bad consequences, such as a request that is never handled.
From your message, it looks like the Content-Length is not correctly handled, because the rest of the message is the JSON we need.
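A minimal sketch of that tryCatch idea (parse_payload is a hypothetical wrapper, not actual languageserver code); the trade-off is exactly the one noted above: a swallowed message may leave a request unanswered.

```r
# Sketch: wrap JSON parsing so a malformed payload yields NULL
# instead of crashing the server loop. parse_payload is a
# hypothetical helper, not actual languageserver code.
parse_payload <- function(data) {
  tryCatch(
    jsonlite::fromJSON(data, simplifyVector = FALSE),
    error = function(e) {
      message("error handling json: ", conditionMessage(e))
      NULL  # the caller must then skip this message, at the risk
            # of leaving the originating request unanswered
    }
  )
}

str(parse_payload('{"jsonrpc":"2.0","id":1}'))  # parses to a list
parse_payload('Content-Length: 8217 {"jsonr')   # NULL, with a logged message
```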
@renkun-ken No, sorry; at the moment it still appears to be random.
At the moment it appears to be related to lint warnings, as it normally happens while I am busy clearing out blanks at the ends of code lines.
Could it be related to a local firewall? I am running McAfee.
This kind of error is often seen with non-UTF-8 files. Could you verify that your files are encoded in UTF-8?
Perhaps you could add some logic in vscode-r-lsp to check whether files are encoded in UTF-8, to avoid errors downstream.
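As a quick local check, base R's validUTF8() can tell whether a file's bytes are valid UTF-8; a minimal sketch (is_utf8_file is a hypothetical helper, not part of any package mentioned here):

```r
# Sketch: check whether a file's raw bytes are valid UTF-8 using
# base R's validUTF8(). is_utf8_file is a hypothetical helper;
# it assumes the file contains no embedded NUL bytes.
is_utf8_file <- function(path) {
  txt <- rawToChar(readBin(path, "raw", n = file.size(path)))
  all(validUTF8(txt))
}

f <- tempfile()
writeBin(charToRaw("x <- 1\n"), f)  # plain ASCII is valid UTF-8
is_utf8_file(f)                     # TRUE
writeBin(as.raw(c(0x68, 0xE9)), f)  # lone 0xE9 is Latin-1, not UTF-8
is_utf8_file(f)                     # FALSE
```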
I did double check. It is UTF-8.
Which style of line ending do you use: Unix, Windows, or something else?
Another thing that you could try is to enable the debug setting in VSCode R setting, it should give us a more detailed log.
CRLF Line endings
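If line endings matter here at all, it would be through byte counting: the Content-Length header counts bytes of the JSON body, and CRLF files carry one extra byte per line. A minimal illustration:

```r
# The LSP Content-Length header counts bytes of the JSON body;
# CRLF endings add one byte per line relative to LF, so any
# component that counts characters instead of bytes (or mixes
# the two conventions) will misreport the length.
unix_text <- "x <- 1\ny <- 2\n"
crlf_text <- "x <- 1\r\ny <- 2\r\n"
nchar(unix_text, type = "bytes")  # 14
nchar(crlf_text, type = "bytes")  # 16
```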
After I cleared out all the errors flagged by lint, the language server is now a lot more stable. I just had it running with debug enabled for more than 30 minutes with no crash.
Hello!
This is very similar to the problem I described here: https://github.com/REditorSupport/languageserver/issues/325 (cf. the bug repro at the end)
Regards JJ
@jj-9 Yes, these seem very similar! Mine also happens when changing basic things like removing spaces, adding spaces, and so on. In most cases it takes a number of changes before the language server crashes.
BTW, I also see the error "Error: stdin connection close" when I try to start the languageserver from an R session outside VSCode (e.g. in PowerShell, calling the R 4.0.3 binary directly).
Maybe this helps ;-)
I'm also having crashes seconds after initialization. It started when I changed my .lintr file from
linters: with_defaults(assignment_linter = NULL
To
linters: with_defaults(assignment_linter = NULL, commented_code_linter = NULL, object_usage_linter = NULL, line_length_linter = NULL)
But changing it back doesn't solve it.
EDIT: reopening, then saving with UTF-8 encoding, seems to solve it
EDIT 2: nope, still broken: Error: Total path length must be less than PATH_MAX: 260
@albersonmiranda
It seems that your issue is a different one. Try to shorten your file/folder names to see if it fixes the issue.
@schalkk
How large is the file that you are editing?
BTW - I also see the issue "Error: stdin connection close" when I try to start the languageserver from an R session outside the VSCode space (e.g. in Powershell calling R 4.03 binary directly).
It is expected. You can't run languageserver interactively.
@albersonmiranda
It seems that your issue is a different one. Try to shorten your file/folder names to see if it fixes the issue.
I don't think that's really the issue: C:\Users\alber\Documents\R\econdashboard\R\mod_credito.R
Would you like it to be treated as a separate issue?
Hi, I have managed to capture a logfile. RLandServ.log
Here is the editor I was using:
The R file is 168 lines long - 8kb in size.
The error is like #465.
[2021-02-07 16:59:24.385] error handling json: Error: lexical error: invalid char in json text.
Content-Length: 8217 {"jsonr
(right here) ------^
Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector,
simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix,
flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
Is it possible that we make the connection reading more robust?
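One hedged idea for making the reading more robust: validate the header line before trusting it, so a corrupted header leads to a resync rather than a crash. A rough sketch (parse_content_length is a hypothetical helper, not existing languageserver code):

```r
# Sketch: validate an LSP header line before trusting it, so a
# corrupted header can trigger a resync instead of an abort.
# parse_content_length is a hypothetical helper, not existing
# languageserver code.
parse_content_length <- function(line) {
  m <- regmatches(line, regexec("^Content-Length: *([0-9]+)[ \t]*$", line))[[1]]
  if (length(m) == 2) as.integer(m[2]) else NA_integer_
}

parse_content_length("Content-Length: 8217")      # 8217
parse_content_length('{"jsonrpc":"2.0","method')  # NA: resync needed
```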
Hello!
Just in case it helps: the error doesn't occur in VSCode on Ubuntu.
Regards JJ
I am having the same issue. I recently moved to macOS and started to have this problem; when I was using Ubuntu I never had this kind of problem.
The languageserver works for a while, then suddenly stops.
[2022-04-23 13:55:22.791] document_code_action_reply: {
"uri": "file:///Users/andryaswavrzenczak/Documents/exploring-properties-montreal/montreal.Rmd",
"range": {
"start": {
"row": 1003,
"col": 0
},
"end": {
"row": 1005,
"col": 9
}
},
"context": {
"diagnostics": []
},
"result": []
}
[2022-04-23 13:55:22.791] deliver: ["Response", "Message", "R6"]
[2022-04-23 13:55:23.318] received: Content-Length: 28677
[2022-04-23 13:55:23.369] error handling json: Error: parse error: premature EOF
{"jsonrpc":"2.0","method":"text
(right here) ------^
Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector,
simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix,
flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2022-04-23 13:55:23.371] received: marize(prop_sale, var = \"price\", neighbourhood) |>\n arrange(desc(med)) |>\n filter(total_obs >= 30),\n by = c(\"NOM\" = \"neighbourhood\")\n ) |>\n ggplot() +\n geom_sf(aes(fill = sd)) +\n theme_minimal() +\n theme(legend.title = element_blank()) +\n scale_fill_continuous(labels = format_num)\n```\n\n```{r}\npal_sale <- as.character(wes_palette(\"Zissou1\", length(unique(prop_sale$price_cat)), type = \"continuous\"))\n\nwalk2(unique(levels(prop_sale$price_cat)),\n pal_rent,\n function(.x, .y) {\n map_sale <<- map_sale |>\n addCircleMarkers(\n data = prop_sale |>\n filter(price_cat == !!.x),\n lat = ~lat,\n lng = ~lng,\n group = .x,\n radius = which(pal_rent == .y) + 3,\n color = .y,\n popup = ~popup,\n stroke = FALSE,\n fillOpacity = 0.75\n )\n })\n\nmap_sale %>%\n addLayersControl(\n overlayGroups = levels(prop_sale$price_cat),\n options = layersControlOptions(collapsed = FALSE)\n )\n```\n\n## Top 10 expensive properties\n\n```{r}\nprop_sale |>\n arrange(desc(price)) |>\n slice(1:10) |>\n select(url, price)\n```\n\n## Best deals\n\n```{r}\nprop_sale_mod <- prop_sale |>\n filter(rooms != 0 & bedrooms != 0 & bathrooms != 0) |>\n select(url, rooms, walkscore, year_built, neighbourhood, tipology2, price) |>\n drop_na() |>\n filter(between(price, 100000, 2000000))\n\nprop_sale_mod$pricelog <- log(prop_sale_mod$price)\n\nm0 <- MASS::rlm(\n pricelog ~ .,\n data = prop_sale_mod |>\n select(-price, -url)\n)\n## summary(m0)\n\nprop_sale_mod2 <- prop_sale_mod |>\n filter(\n !(url %in% (prop_sale_mod |>\n mutate(\n pricelogPredict = predict(m0),\n pricePredict = exp(pricelogPredict),\n diff = price - pricePredict\n ) |>\n arrange(diff) |>\n slice(1:2) |>\n pull(url))\n )\n )\n\nm2 <- MASS::rlm(\n pricelog ~ .,\n data = prop_sale_mod2 |>\n select(-price, -url)\n)\n\nprop_sale_mod2 |>\n mutate(\n pricelogPredict = predict(m2),\n pricePredict = exp(pricelogPredict),\n diff = price - pricePredict,\n ratio = pricePredict / price - 1\n ) |>\n 
arrange(desc(ratio)) |>\n slice(1:10)\n```\n\n# Brokers brokers brokers\n\nSome of the things that I liked to see here are the real estate agents, the people who sell, \nbuy and rent houses or apartments. I think all the available houses for one of those operations \nhave a real estate sign. \n\nSomething interesting that I discover here is that agents are different from brokers. It is a \nslight difference; the brokers can build their own business and hire others agents to work for \nthem; the agents can't do that; they only can work. For the analysis, it doesn't matter.\n\nLet's take a look at the top 10 agents in Montreal.\n\n```{r}\nagents_top <- agents |>\n group_by(agents_name) |>\n count() |>\n ungroup() |>\n arrange(desc(n)) |>\n slice(1:10)\n\nagents_top |>\n kbl(digits = 4) |>\n kable_styling(c('striped', 'hover'))\n```\n\n\nFor the top 10, the main negotiation that they are doing.\n\n```{r}\nagents_prop <- agents |>\n left_join(\n prop |>\n select(url, type, tipology2, neighbourhood, price)\n ) %>%\n drop_na(type)\n\nagents_prop |>\n filter(agents_name %in% agents_top$agents_name) |>\n group_by(agents_name) |>\n count(type) |>\n spread(type, n) |>\n ungroup() |>\n arrange(desc(sale)) |>\n kbl(digits = 4) |>\n kable_styling(c('striped', 'hover'))\n```\n\nThe main kind of propertie that they are working. Here \n\n```{r}\nagents_prop |>\n filter(agents_name %in% agents_top$agents_name) |>\n group_by(agents_name) |>\n count(tipology2) |>\n spread(tipology2, n) |>\n ungroup() |>\n kbl(digits = 4) |>\n kable_styling(c('striped', 'hover'))\n```\n\n\n\n```{r}\n\n filter(agents_name %in% agents_top$agents_name) |>\n group_by(agents_name) |>\n count(neighbourhood) |>\n group_by(neighbourhood) |>\n mutate(total_neighbourhood = sum(n)) |>\n group_by(agents_name) |>\n mutate(relevance = n / total_neighbourhood) |>\n filter(relevance == max(relevance) & total_neighbourhood >= 30)\n```"}]}}Content-Length: 28678
[2022-04-23 13:55:23.371] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2022-04-23 13:55:23.371] exiting
There were 11 warnings (use warnings() to see them)
[Error - 1:55:23 PM] Connection to server is erroring. Shutting down server.
[Error - 1:55:23 PM] Connection to server is erroring. Shutting down server.
[Error - 1:55:23 PM] Connection to server is erroring. Shutting down server.
Another example:
Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector,
simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix,
flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2022-04-23 14:22:15.278] received: prop |>\n select(neighbourhood, .x) |>\n group_by(neighbourhood) |>\n summarise_all(median) |>\n rename(name = neighbourhood, value = .x) |>\n e_charts(name) |>\n e_map_register('Montreal', montreal_json) |>\n e_map(value, map = 'Montreal') |>\n e_visual_map(value) |>\n e_title(.x)\n }\n)\n```\n\nRemember that you can check the values in the last table.\n\n# Rent \n\nTaking a general overview of the rent in Montreal.\n\n```{r}\nprop_rent <- prop |>\n filter(type == \"rent\") |>\n mutate(\n price_cat = cut2(price, c(seq(0, 5000, 500), 10000, Inf)),\n popup = str_c(\n \"Rent price: \",\n format_num(price),\n \"<br/>\",\n \"<a href='\",\n url,\n \"'>Property<a/>\"\n )\n )\n\nsimple_summarize(prop_rent, var = \"price\") |>\n select(-var) |>\n kbl(digits = 4) |>\n kable_styling(c(\"striped\", \"hover\"))\n```\n\nAdding the neighbourhood as a group for the statistics.\n\n```{r}\nsimple_summarize(prop_rent, var = \"price\", neighbourhood) |>\n arrange(desc(med)) |>\n select(-var) |>\n kbl(digits = 4) |>\n kable_styling(c(\"striped\", \"hover\")) |>\n scroll_box(width = \"100%\")\n```\n\nAs we did before for styles, a visual graphic for the `median.` We use the median here instead of \naverage because it is a robust statistic when we have too many outliers.\n\n```{r}\nprop_rent |>\n select(neighbourhood, price) |>\n group_by(neighbourhood) |>\n summarise(median = median(price)) |>\n rename(name = neighbourhood, value = median) |>\n e_charts(name) |>\n e_map_register('Montreal', montreal_json) |>\n e_map(value, map = 'Montreal') |>\n e_visual_map(value) |>\n e_title(\"Rental price\")\n```\n\nThe hard thing about rent a house/apartment is that you are giving money for some else enjoy\nthe life or pay for the place you are renting, but the beautiful thing about rent is that\nyour place is anywhere! 
So let's take a look in all rent data set that we have.\n\n```{r}\npal_rent <- as.character(wes_palette(\"Zissou1\", length(unique(prop_rent$price_cat)), type = \"continuous\"))\n\nmap_rent <- map_sale <- leaflet() |>\n addTiles()\n\nwalk2(unique(levels(prop_rent$price_cat)),\n pal_rent,\n function(.x, .y) {\n map_rent <<- map_rent |>\n addCircleMarkers(\n data = prop_rent |>\n filter(price_cat == !!.x),\n lat = ~lat,\n lng = ~lng,\n group = .x,\n radius = which(pal_rent == .y) + 3,\n color = .y,\n popup = ~popup,\n stroke = FALSE,\n fillOpacity = 0.75\n )\n })\n\nmap_rent %>%\n addLayersControl(\n overlayGroups = levels(prop_rent$price_cat),\n options = layersControlOptions(collapsed = FALSE)\n )\n```\n\nOBSERVATION: You'll see that exists different circle size, the smaller the cheaper the\nbigger the more expensive. Also, you can click in each circle, that is a link to the property.\n\n# Sale\n\n```{r}\nprop_sale <- prop |>\n filter(type == \"sale\") |>\n mutate(\n price_cat = as.character(cut2(price, c(seq(0, 300000, 50000), 10000000, Inf))),\n popup = str_c(\n \"Sales price: \",\n format_num(price),\n \"<br/>\",\n \"<a href='\",\n url,\n \"'>Property<a/>\"\n )\n )\n\nsimple_summarize(prop_sale, var = \"price\") |>\n select(-var)\n\nsimple_summarize(prop_sale, var = \"price\", neighbourhood) |>\n arrange(desc(med)) |>\n filter(total_obs >= 30)\n\nmontreal |>\n left_join(\n simple_summarize(prop_sale, var = \"price\", neighbourhood) |>\n arrange(desc(med)) |>\n filter(total_obs >= 30),\n by = c(\"NOM\" = \"neighbourhood\")\n ) |>\n ggplot() +\n geom_sf(aes(fill = med)) +\n theme_minimal() +\n theme(legend.title = element_blank()) +\n scale_fill_continuous(labels = format_num)\n\nmontreal |>\n left_join(\n simple_summarize(prop_sale, var = \"price\", neighbourhood) |>\n arrange(desc(med)) |>\n filter(total_obs >= 30),\n by = c(\"NOM\" = \"neighbourhood\")\n ) |>\n ggplot() +\n geom_sf(aes(fill = sd)) +\n theme_minimal() +\n theme(legend.title = element_blank()) 
+\n scale_fill_continuous(labels = format_num)\n```\n\n```{r}\npal_sale <- as.character(wes_palette(\"Zissou1\", length(unique(prop_sale$price_cat)), type = \"continuous\"))\n\nwalk2(unique(levels(prop_sale$price_cat)),\n pal_rent,\n function(.x, .y) {\n map_sale <<- map_sale |>\n addCircleMarkers(\n data = prop_sale |>\n filter(price_cat == !!.x),\n lat = ~lat,\n lng = ~lng,\n group = .x,\n radius = which(pal_rent == .y) + 3,\n color = .y,\n popup = ~popup,\n stroke = FALSE,\n fillOpacity = 0.75\n )\n })\n\nmap_sale %>%\n addLayersControl(\n overlayGroups = levels(prop_sale$price_cat),\n options = layersControlOptions(collapsed = FALSE)\n )\n```\n\n## Top 10 expensive properties\n\n```{r}\nprop_sale |>\n arrange(desc(price)) |>\n slice(1:10) |>\n select(url, price)\n```\n\n## Best deals\n\n```{r}\nprop_sale_mod <- prop_sale |>\n filter(rooms != 0 & bedrooms != 0 & bathrooms != 0) |>\n select(url, rooms, walkscore, year_built, neighbourhood, tipology2, price) |>\n drop_na() |>\n filter(between(price, 100000, 2000000))\n\nprop_sale_mod$pricelog <- log(prop_sale_mod$price)\n\nm0 <- MASS::rlm(\n pricelog ~ .,\n data = prop_sale_mod |>\n select(-price, -url)\n)\n## summary(m0)\n\nprop_sale_mod2 <- prop_sale_mod |>\n filter(\n !(url %in% (prop_sale_mod |>\n mutate(\n pricelogPredict = predict(m0),\n pricePredict = exp(pricelogPredict),\n diff = price - pricePredict\n ) |>\n arrange(diff) |>\n slice(1:2) |>\n pull(url))\n )\n )\n\nm2 <- MASS::rlm(\n pricelog ~ .,\n data = prop_sale_mod2 |>\n select(-price, -url)\n)\n\nprop_sale_mod2 |>\n mutate(\n pricelogPredict = predict(m2),\n pricePredict = exp(pricelogPredict),\n diff = price - pricePredict,\n ratio = pricePredict / price - 1\n ) |>\n arrange(desc(ratio)) |>\n slice(1:10)\n```\n\n# Brokers brokers brokers\n\nSome of the things that I liked to see here are the real estate agents, the people who sell, \nbuy and rent houses or apartments. 
I think all the available houses for one of those operations \nhave a real estate sign. \n\nSomething interesting that I discover here is that agents are different from brokers. It is a \nslight difference; the brokers can build their own business and hire others agents to work for \nthem; the agents can't do that; they only can work. For the analysis, it doesn't matter.\n\nLet's take a look at the top 10 agents in Montreal.\n\n```{r}\nagents_top <- agents |>\n group_by(agents_name) |>\n count() |>\n ungroup() |>\n arrange(desc(n)) |>\n slice(1:10)\n\nagents_top |>\n kbl(digits = 4) |>\n kable_styling(c('striped', 'hover'))\n```\n\n\nFor the top 10, the main negotiation that they are doing.\n\n```{r}\nagents_prop <- agents |>\n left_join(\n prop |>\n select(url, type, tipology2, neighbourhood, price)\n ) %>%\n drop_na(type)\n\nagents_prop |>\n filter(agents_name %in% agents_top$agents_name) |>\n group_by(agents_name) |>\n count(type) |>\n spread(type, n) |>\n ungroup() |>\n arrange(desc(sale)) |>\n kbl(digits = 4) |>\n kable_styling(c('striped', 'hover'))\n```\n\nThe main kind of propertie that they are working. Here \n\n```{r}\nagents_prop |>\n filter(agents_name %in% agents_top$agents_name) |>\n group_by(agents_name) |>\n count(tipology2) |>\n spread(tipology2, n) |>\n ungroup() |>\n kbl(digits = 4) |>\n kable_styling(c('striped', 'hover'))\n```\n\nand to end this post I created a index that I called relevance that \nis the total properties agents by n\n\n```{r}\nagents_prop |>\n filter(agents_name %in% agents_top$agents_name) |>\n group_by(agents_name) |>\n count(neighbourhood) |>\n left_join(\n agents_prop |>\n group_by(neighbourhood) |>\n count(name = \"total_neighbourhood\"),\n by = \"neighbourhood\"\n ) |> \n mutate(relevance = n / total_neighbourhood) |>\n group_by(neighbourhood) |> \n filter(relevance == max(relevance) & total_neighbourhood >= 30)\n```"}]}}Content-Length: 251
[2022-04-23 14:22:15.278] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2022-04-23 14:22:15.278] exiting
[Error - 2:22:15 PM] Connection to server got closed. Server will not be restarted.
[Error - 2:22:15 PM] Request textDocument/documentSymbol failed.
Error: Connection got disposed.
at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1835784)
at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916023)
at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916236)
at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1976767)
at t (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1914325)
at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
at X (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1825042)
at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
at h.fireClose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1846051)
at Socket.<anonymous> (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1847636)
at Socket.emit (node:events:402:35)
at Pipe.<anonymous> (node:net:687:12)
[Error - 2:22:15 PM] Request textDocument/completion failed.
Error: Connection got disposed.
at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1835784)
at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916023)
at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916236)
at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1976767)
at t (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1914325)
at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
at X (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1825042)
at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
at h.fireClose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1846051)
at Socket.<anonymous> (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1847636)
at Socket.emit (node:events:402:35)
at Pipe.<anonymous> (node:net:687:12)
R version 4.1.2 (2021-11-01)
Platform: aarch64-apple-darwin20 (64-bit)
Running under: macOS Monterey 12.2.1
Matrix products: default
BLAS: /Library/Frameworks/R.framework/Versions/4.1-arm64/Resources/lib/libRblas.0.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/4.1-arm64/Resources/lib/libRlapack.dylib
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages:
[1] stats graphics grDevices utils datasets methods base
loaded via a namespace (and not attached):
[1] processx_3.5.2 compiler_4.1.2 R6_2.5.1 cli_3.2.0 parallel_4.1.2 tools_4.1.2 collections_0.3.5 xml2_1.3.3 callr_3.7.0 jsonlite_1.8.0 ps_1.6.0 rlang_1.0.2 languageserver_0.3.12
I have the same issue and have no idea why; it just suddenly crashes.
I once tried setting r.lsp.debug = true in the LSP settings, but it outputs so much noise that the really useful information is lost.
My R --version is:
R version 4.1.2 (2021-11-01) -- "Bird Hippie"
Copyright (C) 2021 The R Foundation for Statistical Computing
Platform: aarch64-apple-darwin20 (64-bit)
R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under the terms of the
GNU General Public License versions 2 or 3.
For more information about these matters see
https://www.gnu.org/licenses/.
I took a look at my lsp.log, and a typical error is:
[2022-05-09 04:04:40.036] error handling json: Error: lexical error: invalid char in json text.
ys <- runif(n) >\n {"params":{"textDocument":{"versi
(right here) ------^
Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector,
simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix,
flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2022-05-09 04:04:40.039] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2022-05-09 04:04:40.039] exiting
The code that yields the error is here:
generation <- function(n, gamma0, beta0, gamma1, beta1, alpha) {
`%>%` <- tidyverse::`%>%` # pass lintr object-usage linter
ys <- runif(n) %>%
(function(x) ifelse(x > alpha, 1, 0))
ps <- runif(n)
xs <- rep(0, n)
xs[ys == 0] <- (log(ps[ys == 0] / (1 - ps[ys == 0])) + beta0) / gamma0
xs[ys == 1] <- (log(ps[ys == 1] / (1 - ps[ys == 1])) + beta1) / gamma1
ys <- ys[xs >= 0]
xs <- xs[xs >= 0]
data.frame(x = xs, y = factor(ys))
}
I tried some debugging: in languagebase.R, at line 120, I added the following code to capture exactly what the LSP client sends to the server:
if (inherits(payload, "error")) {
    logger$error("error handling json: ", payload)
    tstfile <- file("testfails.json", "w")
    writeLines(data, tstfile)
    close(tstfile)
    return(NULL)
}
And I find that testfails.json is indeed non-parsable text. I will show a brief excerpt of the JSON, ignoring the irrelevant parts (I manually broke the lines to make it more readable; the body text below is my commentary, not the real content).
{"jsonrpc":"2.0",
"method":"textDocument\/didChange",
"params":{"contentChanges":
[{"text":"--- the following is the normal, parsable text (ASCII, I can make sure), so I will skip it.
Then the text is suddenly clipped, without a closing quote,
and a completely new JSON string (maybe the client sent a new request?)
is loaded: {"jsonrpc":"2.0","method":"textDocument\/didChange",
"params":{"contentChanges":
[{"text":"---\ The next part
repeats the content of my document
and has no legal JSON end delimiter (it is clipped in the middle).
So there may be a problem in the text-loading part, related to languagebase.LanguageBase$read_header()
and languageserver.LanguageServer$read_char().
However, since the text is clipped in the middle and then a completely new string begins, I think it is possible that the stream sent from the LSP client is simply incomplete when the next one is sent.
According to the LSP specification, subsection Base Protocol, JSON Structures, Response Message:
A Response Message sent as a result of a request. If a request doesn’t provide a result value the receiver of a request still needs to return a response message to conform to the JSON-RPC specification. The result property of the ResponseMessage should be set to null in this case to signal a successful request.
So maybe in cases like a failed read, we should just send a null-result message?
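For reference, such a null-result response would serialize roughly as follows (a sketch using jsonlite; the id must echo the original request's id):

```r
# Sketch: a JSON-RPC response whose result is null, as the LSP
# spec requires when a request produces no result value. The id
# must echo the id of the original request.
null_response <- function(id) {
  jsonlite::toJSON(
    list(jsonrpc = "2.0", id = id, result = NULL),
    auto_unbox = TRUE,
    null = "null"
  )
}

null_response(1L)
# {"jsonrpc":"2.0","id":1,"result":null}
```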
@milanglacier what editor are you using? Specifically, do you know if you are using TCP connection or STDIO?
I am using neovim with builtin LSP, and it is STDIO connection.
After tons of controlled experiments, I found that the crashes happen when the Neovim plugin cmp-buffer
(which provides a completion source from the buffer text, mainly words appearing in it) is enabled simultaneously with the nvim LSP completion source.
While it is hard to believe that two completion sources could conflict, this is what the controlled experiments show. I do believe the crashes could be triggered in other situations too, but currently, if I disable the cmp-buffer
source, R-LSP works fine.
In any case, I am quite sure the crashes only happen during auto-completion. I wonder whether cmp-buffer
fires didChange
notifications so frequently that the stdio pipe gets clipped? I don't have experience with JSON-RPC, so forgive my random guessing.
I faced another situation where R-LSP crashes without cmp-buffer
turned on, with only the lsp
source enabled, so the root cause should not be a conflict between different sources.
I think this may relate to #127, since it shows the same error: error handling json: Error: lexical error: invalid char in json text
I find that this error is relatively easy to reproduce with moderately large files (over 500 lines), for example:
a example file
I opened this file and, before I had typed more than 50 characters, r-lsp crashed.
I followed the solution merged in #140 and tried to change the bufsize from 2 * n
gradually all the way to 10000 * n,
but the crash still occurs; now it occurs at
error handling json: Error: lexical error: invalid char in json text.
ases. Taking logarithm woul{"method":"textDocument\/didChang
So the stdio buffer is still clipped. But could 10000 * n
really not be enough for reading the buffer? Hard to believe (I am not familiar with low-level mechanisms like streaming data, so this is just a guess; forgive me).
If you want to test, try changing this function to:
read_content = function(nbytes) {
data <- ""
logger$debug("original nbytes:", nbytes)
while (nbytes > 0) {
newdata <- self$read_char(nbytes)
if (length(newdata) > 0) {
nbytes <- nbytes - nchar(newdata, type = "bytes")
logger$debug("newdata:", newdata)
logger$debug("nbytes:", nbytes)
data <- paste0(data, newdata)
}
Sys.sleep(0.01)
}
data
},
self$read_char
may return a string shorter than nbytes, but successive reads should add up to the original nbytes.
Here is an example log file showing the error, produced by editing the example file. Note that I changed logger$debug
in your example code to logger$error,
because setting debug = TRUE
generates too much noise.
I find that, for the content that causes the error, newdata
shows exactly the same content twice, i.e. read_char()
reads the same bytes twice:
[2022-05-21 18:27:25.439] nbytes: 22619
[2022-05-21 18:27:25.451] nbytes: 14427
[2022-05-21 18:27:25.462] nbytes: 6235
[2022-05-21 18:27:25.474] nbytes: 0
[2022-05-21 18:27:25.486] error handling json: Error: parse error: premature EOF
Here the content of newdata
followed by nbytes: 14427
and the content of newdata
followed by nbytes: 6235
are exactly the same.
Since the log file is too large, I omit the output from newdata
here; I shared the complete log file (including the output from newdata)
via https://gist.github.com/milanglacier/f6e65d276698edda53e70400441e114c
I ran several other experiments:
The duplicate can appear anywhere (it is not necessarily the last two chunks read).
It is also possible for the sum of nbytes to exceed the original nbytes, but one thing is for sure: the contents of newdata
are exactly the same. For personal privacy I clip the content of the files I am editing (i.e. the output from newdata) and only show the log file.
I made a simple change to the function self$read_content()
that checks whether two consecutive batches read by read_char()
are the same; if they are, it ignores the batch and continues reading.
Currently it seems to work fine on the Rmd file I listed as an example;
I will run more tests on my problematic files.
Once I find that it works fine on all of them, I will submit a PR.
(Update: tested on an R file with 1500 lines, caret/resamples.R,
and it works fine; completion is smooth.)
That's useful. Not sure why the data is doubled. For some reason, the cursor position is reset.
Lately I have had frequent crashes of the language server.
Here is the information I have collected. Please let me know if you need any additional information.
Environment
Output after crash
R Language Server (41964) started
[2021-01-25 10:36:16.006] error handling json: Error: lexical error: invalid char in json text.
Content-Length: 5204 {"jsonr
(right here) ------^
Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector,
simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix,
flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2021-01-25 10:36:16.219] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2021-01-25 10:36:16.219] exiting
Warning message:
In readLines(self$inputcon, n = 1, encoding = "UTF-8") :
incomplete final line found on '->localhost:50454'
[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
R Language Server (41964) exited with exit code 0