REditorSupport / languageserver

An implementation of the Language Server Protocol for R

Language Server crashing sporadically #368

Closed schalkk closed 2 years ago

schalkk commented 3 years ago

Lately I have had frequent crashes of the language server.

Here is the information I have collected. Please let me know if you need any additional information.

Environment

Output after crash

R Language Server (41964) started
[2021-01-25 10:36:16.006] error handling json:  Error: lexical error: invalid char in json text.
                                       Content-Length: 5204    {"jsonr
                     (right here) ------^

Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector,
    simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix,
    flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)

[2021-01-25 10:36:16.219] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)

[2021-01-25 10:36:16.219] exiting
Warning message:
In readLines(self$inputcon, n = 1, encoding = "UTF-8") :
  incomplete final line found on '->localhost:50454'

[Error - 10:36:16 AM] Connection to server is erroring. Shutting down server.
(message repeated 6 times)
R Language Server (41964) exited with exit code 0

renkun-ken commented 3 years ago

Could you reproduce this error, or does it just occur randomly?

It looks like the TCP message between vscode and languageserver could be incomplete in this case.

@randy3k Do you think we could tryCatch reading data from the LSP client and ignore the message if there's an error parsing it? From the LSP specification, it looks like ignoring an LSP message might lead to bad consequences, such as requests that are never handled.
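A minimal sketch of what such a guard might look like (the helper name `parse_message` is made up; the real handler would still have to decide how to answer a request whose body was dropped):

```r
# Hypothetical sketch: wrap the JSON parse in tryCatch so one malformed
# payload is logged and skipped instead of killing the server loop.
parse_message <- function(data) {
    tryCatch(
        jsonlite::fromJSON(data, simplifyVector = FALSE),
        error = function(e) {
            message("error handling json: ", conditionMessage(e))
            NULL  # signal "unparsable"; the caller decides how to respond
        }
    )
}
```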

renkun-ken commented 3 years ago

From your message, it looks like the Content-Length is not correctly handled because the rest of the message is the JSON we need.
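For reference, a base-R sketch of the framing the base protocol uses: each message is a `Content-Length` header, a blank line, then exactly that many bytes of JSON. If the byte count and the body get out of sync, the JSON parser ends up seeing header text inside the body, which is exactly the symptom in the log. (The function name and the assumption of a single complete frame in `raw_text` are mine.)

```r
# Sketch (assumption: raw_text holds one complete, well-formed LSP frame).
# Wire format: "Content-Length: <n>\r\n\r\n" followed by exactly n bytes of JSON.
split_lsp_frame <- function(raw_text) {
    header_end <- regexpr("\r\n\r\n", raw_text, fixed = TRUE)
    header <- substr(raw_text, 1, header_end - 1)
    n <- as.integer(sub(".*Content-Length: *([0-9]+).*", "\\1", header))
    body_start <- header_end + 4
    list(
        content_length = n,
        body = substr(raw_text, body_start, body_start + n - 1)
    )
}
```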

schalkk commented 3 years ago

@renkun-ken No, sorry, at the moment it still appears to be random.

It appears to be related to lint, as it normally happens while I am busy clearing out blanks at the end of code lines.

Could it be related to a local firewall? I am running McAfee.

randy3k commented 3 years ago

This kind of error is often seen in non UTF-8 files. Could you verify that your files are encoded in UTF-8?
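One quick way to check this from R (a sketch using base R's `validUTF8()`; the helper name is made up):

```r
# Made-up helper: TRUE if every line of the file is valid UTF-8.
is_valid_utf8_file <- function(path) {
    lines <- readLines(path, encoding = "UTF-8", warn = FALSE)
    all(validUTF8(lines))
}
```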

randy3k commented 3 years ago

Perhaps, you want to add some logic in vscode-r-lsp to check if the files are encoded in UTF-8 to avoid errors downstream.

schalkk commented 3 years ago

I did double check. It is UTF-8.

[screenshot showing the file's UTF-8 encoding]

randy3k commented 3 years ago

Which style of line ending do you use? Unix, Windows or?

randy3k commented 3 years ago

Another thing that you could try is to enable the debug setting in VSCode R setting, it should give us a more detailed log.

schalkk commented 3 years ago

CRLF Line endings


schalkk commented 3 years ago

After I cleared out all the errors mentioned by lint, the language server is now a lot more stable. I just had it running with debug for more than 30 minutes with no crash.

jj-9 commented 3 years ago

Hello !

This is very similar to the problem I described here: https://github.com/REditorSupport/languageserver/issues/325 (see the bug repro at the end).

Regards JJ

schalkk commented 3 years ago

@jj-9 Yes, these seem very similar! Mine also happens when changing basic things like removing spaces, adding spaces, etc. In most cases it takes a number of changes before the language server crashes.

BTW, I also see the error "Error: stdin connection close" when I try to start the languageserver from an R session outside VSCode (e.g. in PowerShell, calling the R 4.0.3 binary directly).

Maybe this helps ;-)

albersonmiranda commented 3 years ago

I'm also having crashes seconds after initialization. It started when I changed my .lintr file from

linters: with_defaults(assignment_linter = NULL

To

linters: with_defaults(assignment_linter = NULL, commented_code_linter = NULL, object_usage_linter = NULL, line_length_linter = NULL)

But changing it back doesn't solve it.

EDIT: reopening and then saving with UTF-8 encoding seems to solve it

EDIT 2: nope, still broken: Error: Total path length must be less than PATH_MAX: 260

randy3k commented 3 years ago

@albersonmiranda

It seems that your issue is a different one. Try to shorten your file/folder names to see if it fixes the issue.

randy3k commented 3 years ago

@schalkk

How large is the file that you are editing?

BTW - I also see the issue "Error: stdin connection close" when I try to start the languageserver from an R session outside the VSCode space (e.g. in Powershell calling R 4.03 binary directly).

It is expected. You can't run languageserver interactively.

albersonmiranda commented 3 years ago

@albersonmiranda

It seems that your issue is a different one. Try to shorten your file/folder names to see if it fixes the issue.

I don't think that's really the issue: C:\Users\alber\Documents\R\econdashboard\R\mod_credito.R

Would you like it to be treated as a separate issue?

schalkk commented 3 years ago

Hi, I have managed to capture a logfile: RLandServ.log

Here is the editor I was using: [screenshot]

The R file is 168 lines long, 8 KB in size.

renkun-ken commented 3 years ago

The error is similar to #465.

[2021-02-07 16:59:24.385] error handling json:  Error: lexical error: invalid char in json text.
                                       Content-Length: 8217    {"jsonr
                     (right here) ------^

Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector, 
    simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix, 
    flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
renkun-ken commented 3 years ago

Is it possible to make the connection reading more robust?

jj-9 commented 2 years ago

Hello !

Just in case it could help: the error doesn't occur in VSCode on Ubuntu.

Regards JJ

Andryas commented 2 years ago

I am having the same issue. I recently moved to macOS and started having this problem; when I was using Ubuntu I never had this kind of problem.

The language server works for a while, then suddenly stops.

[2022-04-23 13:55:22.791] document_code_action_reply:  {
  "uri": "file:///Users/andryaswavrzenczak/Documents/exploring-properties-montreal/montreal.Rmd",
  "range": {
    "start": {
      "row": 1003,
      "col": 0
    },
    "end": {
      "row": 1005,
      "col": 9
    }
  },
  "context": {
    "diagnostics": []
  },
  "result": []
}
[2022-04-23 13:55:22.791] deliver:  ["Response", "Message", "R6"]
[2022-04-23 13:55:23.318] received:  Content-Length: 28677
[2022-04-23 13:55:23.369] error handling json:  Error: parse error: premature EOF
                                       {"jsonrpc":"2.0","method":"text
                     (right here) ------^

Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector, 
    simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix, 
    flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2022-04-23 13:55:23.371] received:  marize(prop_sale, var = \"price\", neighbourhood) |>\n      arrange(desc(med)) |>\n      filter(total_obs >= 30),\n    by = c(\"NOM\" = \"neighbourhood\")\n  ) |>\n  ggplot() +\n  geom_sf(aes(fill = sd)) +\n  theme_minimal() +\n  theme(legend.title = element_blank()) +\n  scale_fill_continuous(labels = format_num)\n```\n\n```{r}\npal_sale <- as.character(wes_palette(\"Zissou1\", length(unique(prop_sale$price_cat)), type = \"continuous\"))\n\nwalk2(unique(levels(prop_sale$price_cat)),\n  pal_rent,\n  function(.x, .y) {\n    map_sale <<- map_sale |>\n      addCircleMarkers(\n        data = prop_sale |>\n          filter(price_cat == !!.x),\n        lat = ~lat,\n        lng = ~lng,\n        group = .x,\n        radius = which(pal_rent == .y) + 3,\n        color = .y,\n        popup = ~popup,\n        stroke = FALSE,\n        fillOpacity = 0.75\n      )\n  })\n\nmap_sale %>%\n  addLayersControl(\n    overlayGroups = levels(prop_sale$price_cat),\n    options = layersControlOptions(collapsed = FALSE)\n  )\n```\n\n## Top 10 expensive properties\n\n```{r}\nprop_sale |>\n  arrange(desc(price)) |>\n  slice(1:10) |>\n  select(url, price)\n```\n\n## Best deals\n\n```{r}\nprop_sale_mod <- prop_sale |>\n  filter(rooms != 0 & bedrooms != 0 & bathrooms != 0) |>\n  select(url, rooms, walkscore, year_built, neighbourhood, tipology2, price) |>\n  drop_na() |>\n  filter(between(price, 100000, 2000000))\n\nprop_sale_mod$pricelog <- log(prop_sale_mod$price)\n\nm0 <- MASS::rlm(\n  pricelog ~ .,\n  data = prop_sale_mod |>\n    select(-price, -url)\n)\n## summary(m0)\n\nprop_sale_mod2 <- prop_sale_mod |>\n  filter(\n    !(url %in% (prop_sale_mod |>\n      mutate(\n        pricelogPredict = predict(m0),\n        pricePredict = exp(pricelogPredict),\n        diff = price - pricePredict\n      ) |>\n      arrange(diff) |>\n      slice(1:2) |>\n      pull(url))\n    )\n  )\n\nm2 <- MASS::rlm(\n  pricelog ~ .,\n  data = prop_sale_mod2 |>\n    select(-price, 
-url)\n)\n\nprop_sale_mod2 |>\n  mutate(\n    pricelogPredict = predict(m2),\n    pricePredict = exp(pricelogPredict),\n    diff = price - pricePredict,\n    ratio = pricePredict / price - 1\n  ) |>\n  arrange(desc(ratio)) |>\n  slice(1:10)\n```\n\n# Brokers brokers brokers\n\nSome of the things that I liked to see here are the real estate agents,  the people who sell, \nbuy and rent houses or apartments. I think all the available houses for one of those operations \nhave a real estate sign. \n\nSomething interesting that I discover here is that agents are different from brokers. It is a \nslight difference; the brokers can build their own business and hire others agents to work for \nthem; the agents can't do that; they only can work. For the analysis, it doesn't matter.\n\nLet's take a look at the top 10 agents in Montreal.\n\n```{r}\nagents_top <- agents |>\n  group_by(agents_name) |>\n  count() |>\n  ungroup() |>\n  arrange(desc(n)) |>\n  slice(1:10)\n\nagents_top |>\n  kbl(digits = 4) |>\n  kable_styling(c('striped', 'hover'))\n```\n\n\nFor the top 10, the main negotiation that they are doing.\n\n```{r}\nagents_prop <- agents |>\n  left_join(\n    prop |>\n      select(url, type, tipology2, neighbourhood, price)\n  ) %>%\n  drop_na(type)\n\nagents_prop |>\n  filter(agents_name %in% agents_top$agents_name) |>\n  group_by(agents_name) |>\n  count(type) |>\n  spread(type, n) |>\n  ungroup() |>\n  arrange(desc(sale)) |>\n  kbl(digits = 4) |>\n  kable_styling(c('striped', 'hover'))\n```\n\nThe main kind of propertie that they are working. 
Here \n\n```{r}\nagents_prop |>\n  filter(agents_name %in% agents_top$agents_name) |>\n  group_by(agents_name) |>\n  count(tipology2) |>\n  spread(tipology2, n) |>\n  ungroup() |>\n  kbl(digits = 4) |>\n  kable_styling(c('striped', 'hover'))\n```\n\n\n\n```{r}\n\n  filter(agents_name %in% agents_top$agents_name) |>\n  group_by(agents_name) |>\n  count(neighbourhood) |>\n  group_by(neighbourhood) |>\n  mutate(total_neighbourhood = sum(n)) |>\n  group_by(agents_name) |>\n  mutate(relevance = n / total_neighbourhood) |>\n  filter(relevance == max(relevance) & total_neighbourhood >= 30)\n```"}]}}Content-Length: 28678
[2022-04-23 13:55:23.371] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2022-04-23 13:55:23.371] exiting
There were 11 warnings (use warnings() to see them)
[Error - 1:55:23 PM] Connection to server is erroring. Shutting down server.
[Error - 1:55:23 PM] Connection to server is erroring. Shutting down server.
[Error - 1:55:23 PM] Connection to server is erroring. Shutting down server.

Other example

Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector, 
    simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix, 
    flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2022-04-23 14:22:15.278] received:   prop |>\n      select(neighbourhood, .x) |>\n      group_by(neighbourhood) |>\n      summarise_all(median) |>\n      rename(name = neighbourhood, value = .x) |>\n      e_charts(name) |>\n      e_map_register('Montreal', montreal_json) |>\n      e_map(value, map = 'Montreal') |>\n      e_visual_map(value) |>\n      e_title(.x)\n  }\n)\n```\n\nRemember that you can check the values in the last table.\n\n# Rent \n\nTaking a general overview of the rent in Montreal.\n\n```{r}\nprop_rent <- prop |>\n  filter(type == \"rent\") |>\n  mutate(\n    price_cat = cut2(price, c(seq(0, 5000, 500), 10000, Inf)),\n    popup = str_c(\n      \"Rent price: \",\n      format_num(price),\n      \"<br/>\",\n      \"<a href='\",\n      url,\n      \"'>Property<a/>\"\n    )\n  )\n\nsimple_summarize(prop_rent, var = \"price\") |>\n  select(-var) |>\n  kbl(digits = 4) |>\n  kable_styling(c(\"striped\", \"hover\"))\n```\n\nAdding the neighbourhood as a group for the statistics.\n\n```{r}\nsimple_summarize(prop_rent, var = \"price\", neighbourhood) |>\n  arrange(desc(med)) |>\n  select(-var) |>\n  kbl(digits = 4) |>\n  kable_styling(c(\"striped\", \"hover\")) |>\n  scroll_box(width = \"100%\")\n```\n\nAs we did before for styles, a visual graphic for the `median.` We use the median here instead of \naverage because it is a robust statistic when we have too many outliers.\n\n```{r}\nprop_rent |>\n  select(neighbourhood, price) |>\n  group_by(neighbourhood) |>\n  summarise(median = median(price)) |>\n  rename(name = neighbourhood, value = median) |>\n  e_charts(name) |>\n  e_map_register('Montreal', montreal_json) |>\n  e_map(value, map = 'Montreal') |>\n  e_visual_map(value) |>\n  e_title(\"Rental price\")\n```\n\nThe hard thing about rent a house/apartment is that you are giving money for some else enjoy\nthe life or pay for the place you are renting, but the beautiful thing about rent is that\nyour place is anywhere! 
So let's take a look in all rent data set that we have.\n\n```{r}\npal_rent <- as.character(wes_palette(\"Zissou1\", length(unique(prop_rent$price_cat)), type = \"continuous\"))\n\nmap_rent <- map_sale <- leaflet() |>\n  addTiles()\n\nwalk2(unique(levels(prop_rent$price_cat)),\n  pal_rent,\n  function(.x, .y) {\n    map_rent <<- map_rent |>\n      addCircleMarkers(\n        data = prop_rent |>\n          filter(price_cat == !!.x),\n        lat = ~lat,\n        lng = ~lng,\n        group = .x,\n        radius = which(pal_rent == .y) + 3,\n        color = .y,\n        popup = ~popup,\n        stroke = FALSE,\n        fillOpacity = 0.75\n      )\n  })\n\nmap_rent %>%\n  addLayersControl(\n    overlayGroups = levels(prop_rent$price_cat),\n    options = layersControlOptions(collapsed = FALSE)\n  )\n```\n\nOBSERVATION: You'll see that exists different circle size, the smaller the cheaper the\nbigger the more expensive. Also, you can click in each circle, that is a link to the property.\n\n# Sale\n\n```{r}\nprop_sale <- prop |>\n  filter(type == \"sale\") |>\n  mutate(\n    price_cat = as.character(cut2(price, c(seq(0, 300000, 50000), 10000000, Inf))),\n    popup = str_c(\n      \"Sales price: \",\n      format_num(price),\n      \"<br/>\",\n      \"<a href='\",\n      url,\n      \"'>Property<a/>\"\n    )\n  )\n\nsimple_summarize(prop_sale, var = \"price\") |>\n  select(-var)\n\nsimple_summarize(prop_sale, var = \"price\", neighbourhood) |>\n  arrange(desc(med)) |>\n  filter(total_obs >= 30)\n\nmontreal |>\n  left_join(\n    simple_summarize(prop_sale, var = \"price\", neighbourhood) |>\n      arrange(desc(med)) |>\n      filter(total_obs >= 30),\n    by = c(\"NOM\" = \"neighbourhood\")\n  ) |>\n  ggplot() +\n  geom_sf(aes(fill = med)) +\n  theme_minimal() +\n  theme(legend.title = element_blank()) +\n  scale_fill_continuous(labels = format_num)\n\nmontreal |>\n  left_join(\n    simple_summarize(prop_sale, var = \"price\", neighbourhood) |>\n      arrange(desc(med)) |>\n 
     filter(total_obs >= 30),\n    by = c(\"NOM\" = \"neighbourhood\")\n  ) |>\n  ggplot() +\n  geom_sf(aes(fill = sd)) +\n  theme_minimal() +\n  theme(legend.title = element_blank()) +\n  scale_fill_continuous(labels = format_num)\n```\n\n```{r}\npal_sale <- as.character(wes_palette(\"Zissou1\", length(unique(prop_sale$price_cat)), type = \"continuous\"))\n\nwalk2(unique(levels(prop_sale$price_cat)),\n  pal_rent,\n  function(.x, .y) {\n    map_sale <<- map_sale |>\n      addCircleMarkers(\n        data = prop_sale |>\n          filter(price_cat == !!.x),\n        lat = ~lat,\n        lng = ~lng,\n        group = .x,\n        radius = which(pal_rent == .y) + 3,\n        color = .y,\n        popup = ~popup,\n        stroke = FALSE,\n        fillOpacity = 0.75\n      )\n  })\n\nmap_sale %>%\n  addLayersControl(\n    overlayGroups = levels(prop_sale$price_cat),\n    options = layersControlOptions(collapsed = FALSE)\n  )\n```\n\n## Top 10 expensive properties\n\n```{r}\nprop_sale |>\n  arrange(desc(price)) |>\n  slice(1:10) |>\n  select(url, price)\n```\n\n## Best deals\n\n```{r}\nprop_sale_mod <- prop_sale |>\n  filter(rooms != 0 & bedrooms != 0 & bathrooms != 0) |>\n  select(url, rooms, walkscore, year_built, neighbourhood, tipology2, price) |>\n  drop_na() |>\n  filter(between(price, 100000, 2000000))\n\nprop_sale_mod$pricelog <- log(prop_sale_mod$price)\n\nm0 <- MASS::rlm(\n  pricelog ~ .,\n  data = prop_sale_mod |>\n    select(-price, -url)\n)\n## summary(m0)\n\nprop_sale_mod2 <- prop_sale_mod |>\n  filter(\n    !(url %in% (prop_sale_mod |>\n      mutate(\n        pricelogPredict = predict(m0),\n        pricePredict = exp(pricelogPredict),\n        diff = price - pricePredict\n      ) |>\n      arrange(diff) |>\n      slice(1:2) |>\n      pull(url))\n    )\n  )\n\nm2 <- MASS::rlm(\n  pricelog ~ .,\n  data = prop_sale_mod2 |>\n    select(-price, -url)\n)\n\nprop_sale_mod2 |>\n  mutate(\n    pricelogPredict = predict(m2),\n    pricePredict = exp(pricelogPredict),\n  
  diff = price - pricePredict,\n    ratio = pricePredict / price - 1\n  ) |>\n  arrange(desc(ratio)) |>\n  slice(1:10)\n```\n\n# Brokers brokers brokers\n\nSome of the things that I liked to see here are the real estate agents,  the people who sell, \nbuy and rent houses or apartments. I think all the available houses for one of those operations \nhave a real estate sign. \n\nSomething interesting that I discover here is that agents are different from brokers. It is a \nslight difference; the brokers can build their own business and hire others agents to work for \nthem; the agents can't do that; they only can work. For the analysis, it doesn't matter.\n\nLet's take a look at the top 10 agents in Montreal.\n\n```{r}\nagents_top <- agents |>\n  group_by(agents_name) |>\n  count() |>\n  ungroup() |>\n  arrange(desc(n)) |>\n  slice(1:10)\n\nagents_top |>\n  kbl(digits = 4) |>\n  kable_styling(c('striped', 'hover'))\n```\n\n\nFor the top 10, the main negotiation that they are doing.\n\n```{r}\nagents_prop <- agents |>\n  left_join(\n    prop |>\n      select(url, type, tipology2, neighbourhood, price)\n  ) %>%\n  drop_na(type)\n\nagents_prop |>\n  filter(agents_name %in% agents_top$agents_name) |>\n  group_by(agents_name) |>\n  count(type) |>\n  spread(type, n) |>\n  ungroup() |>\n  arrange(desc(sale)) |>\n  kbl(digits = 4) |>\n  kable_styling(c('striped', 'hover'))\n```\n\nThe main kind of propertie that they are working. 
Here \n\n```{r}\nagents_prop |>\n  filter(agents_name %in% agents_top$agents_name) |>\n  group_by(agents_name) |>\n  count(tipology2) |>\n  spread(tipology2, n) |>\n  ungroup() |>\n  kbl(digits = 4) |>\n  kable_styling(c('striped', 'hover'))\n```\n\nand to end this post I created a index that I called relevance that \nis the total properties agents by n\n\n```{r}\nagents_prop |>\n  filter(agents_name %in% agents_top$agents_name) |>\n  group_by(agents_name) |>\n  count(neighbourhood) |>\n  left_join(\n    agents_prop |>\n      group_by(neighbourhood) |>\n      count(name = \"total_neighbourhood\"),\n    by = \"neighbourhood\"\n  ) |> \n  mutate(relevance = n / total_neighbourhood) |>\n  group_by(neighbourhood) |> \n  filter(relevance == max(relevance) & total_neighbourhood >= 30)\n```"}]}}Content-Length: 251
[2022-04-23 14:22:15.278] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2022-04-23 14:22:15.278] exiting
[Error - 2:22:15 PM] Connection to server got closed. Server will not be restarted.
[Error - 2:22:15 PM] Request textDocument/documentSymbol failed.
Error: Connection got disposed.
    at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1835784)
    at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916023)
    at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916236)
    at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1976767)
    at t (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1914325)
    at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
    at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
    at X (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1825042)
    at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
    at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
    at h.fireClose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1846051)
    at Socket.<anonymous> (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1847636)
    at Socket.emit (node:events:402:35)
    at Pipe.<anonymous> (node:net:687:12)
[Error - 2:22:15 PM] Request textDocument/completion failed.
Error: Connection got disposed.
    at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1835784)
    at Object.dispose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916023)
    at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1916236)
    at T.handleConnectionClosed (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1976767)
    at t (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1914325)
    at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
    at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
    at X (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1825042)
    at r.invoke (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1837401)
    at a.fire (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1838162)
    at h.fireClose (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1846051)
    at Socket.<anonymous> (/Users/andryaswavrzenczak/.vscode/extensions/ikuyadeu.r-2.4.0/dist/extension.js:2:1847636)
    at Socket.emit (node:events:402:35)
    at Pipe.<anonymous> (node:net:687:12)
R version 4.1.2 (2021-11-01)
Platform: aarch64-apple-darwin20 (64-bit)
Running under: macOS Monterey 12.2.1

Matrix products: default
BLAS:   /Library/Frameworks/R.framework/Versions/4.1-arm64/Resources/lib/libRblas.0.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/4.1-arm64/Resources/lib/libRlapack.dylib

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

loaded via a namespace (and not attached):
 [1] processx_3.5.2        compiler_4.1.2        R6_2.5.1              cli_3.2.0             parallel_4.1.2        tools_4.1.2           collections_0.3.5     xml2_1.3.3            callr_3.7.0           jsonlite_1.8.0        ps_1.6.0              rlang_1.0.2           languageserver_0.3.12
milanglacier commented 2 years ago

I have the same issue and have no idea why; it just suddenly crashes. I once tried setting r.lsp.debug = true in the LSP settings, but it outputs too much garbage and the really useful information gets lost.

My R --version is:

R version 4.1.2 (2021-11-01) -- "Bird Hippie"
Copyright (C) 2021 The R Foundation for Statistical Computing
Platform: aarch64-apple-darwin20 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under the terms of the
GNU General Public License versions 2 or 3.
For more information about these matters see
https://www.gnu.org/licenses/.

I have taken a look at my lsp.log, and one of the typical errors I see is:

[2022-05-09 04:04:40.036] error handling json:  Error: lexical error: invalid char in json text.
            ys <- runif(n) >\n     {"params":{"textDocument":{"versi
                     (right here) ------^

Stack trace:
1: parse_string(txt, bigint_as_char)
2: parseJSON(txt, bigint_as_char)
3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector, 
    simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix, 
    flatten = flatten, ...)
4: jsonlite::fromJSON(data, simplifyVector = FALSE)
[2022-05-09 04:04:40.039] Error: Unexpected non-empty line
Call: self$read_header()
Stack trace:
1: stop("Unexpected non-empty line")
2: self$read_header()
3: self$fetch(blocking = FALSE)
[2022-05-09 04:04:40.039] exiting

The code that yields the error is here:

generation <- function(n, gamma0, beta0, gamma1, beta1, alpha) {
    `%>%` <- tidyverse::`%>%` # pass lintr object-usage linter
    ys <- runif(n) %>%
        (function(x) ifelse(x > alpha, 1, 0))
    ps <- runif(n)

    xs <- rep(0, n)

    xs[ys == 0] <- (log(ps[ys == 0] / (1 - ps[ys == 0])) + beta0) / gamma0
    xs[ys == 1] <- (log(ps[ys == 1] / (1 - ps[ys == 1])) + beta1) / gamma1

    ys <- ys[xs >= 0]
    xs <- xs[xs >= 0]

    data.frame(x = xs, y = factor(ys))
}
milanglacier commented 2 years ago

I did some debugging: in languagebase.R at line 120, I added the following code to capture what on earth the LSP client sends to the LSP server,

if (inherits(payload, "error")) {
    logger$error("error handling json: ", payload)
    tstfile <- file("testfails.json", "w")
    writeLines(data, tstfile)
    close(tstfile)
    return(NULL)
}

And I find that testfails.json is indeed non-parsable text. I will show a brief excerpt of the JSON, ignoring the irrelevant parts (I manually broke the lines to make it more readable); the body of the text is my commentary instead of the real content.


{"jsonrpc":"2.0",
"method":"textDocument\/didChange",
"params":{"contentChanges":
[{"text":"--- (the following is the normal, parsable text, all ASCII as far as I can tell, so I will skip it).
Then the text is suddenly clipped, without an end quote,
and a completely new JSON string (maybe the client sent a new request?)
is loaded: {"jsonrpc":"2.0","method":"textDocument\/didChange",
"params":{"contentChanges":
[{"text":"---\ (the next part
repeats the content of my document,
and it does not have a legal JSON end identifier: it is clipped in the middle).

So there may be a problem in the text-loading part, related to languagebase.LanguageBase$read_header() and languageserver.LanguageServer$read_char().

However, since the text is clipped in the middle and a completely new string starts, I think it is possible that the stream sent from the LSP client is simply incomplete and the next one is already being sent.

According to the LSP specification, in the subsection on the base protocol's JSON structure (Response Message):

A Response Message sent as a result of a request. If a request doesn’t provide a result value the receiver of a request still needs to return a response message to conform to the JSON-RPC specification. The result property of the ResponseMessage should be set to null in this case to signal a successful request.

So maybe, in cases where reading the text fails, we should just send a null-result message?
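For illustration, the null-result response the spec quote describes could be as simple as this hand-rolled sketch (the function name and the request id are hypothetical):

```r
# Sketch: a JSON-RPC response with a null result, echoing the request's id,
# as the quoted spec passage requires for requests without a result value.
null_response <- function(id) {
    sprintf('{"jsonrpc":"2.0","id":%d,"result":null}', id)
}
```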

randy3k commented 2 years ago

@milanglacier what editor are you using? Specifically, do you know if you are using TCP connection or STDIO?

milanglacier commented 2 years ago

@milanglacier what editor are you using? Specifically, do you know if you are using TCP connection or STDIO?

I am using neovim with builtin LSP, and it is STDIO connection.

milanglacier commented 2 years ago

After tons of controlled-variable experiments, I found that the crashes happen when a Neovim plugin, cmp-buffer, which provides a completion source from the buffer text (mainly words that appear in the text), is enabled simultaneously with the built-in LSP as another completion source. It is hard to believe that two completion sources could conflict, but that is what the experiments show me. I do believe the crashes could be triggered in other situations, but currently, if I disable the cmp-buffer source, the R language server works fine.

In any case, I am quite sure the crashes only happen during auto-completion. I wonder whether cmp-buffer causes the didChange notifications to fire too frequently, making the stdio pipe get clipped? I don't have experience with JSON-RPC, so forgive my random guessing.

milanglacier commented 2 years ago

I faced another situation where the R language server crashes with cmp-buffer turned off and only the LSP source enabled, so the root cause should not be a conflict between different sources.

I think this may relate to #127, since it shows the same error: error handling json: Error: lexical error: invalid char in json text

I find that this error is relatively easy to reproduce with moderately large files (over 500 lines); for example, I opened an example file and before typing more than 50 characters, the language server crashed.

I followed the solution merged in #140 and tried increasing the bufsize from 2 * n gradually, even up to 10000 * n, but the crash still occurs; now it occurs at

error handling json:  Error: lexical error: invalid char in json text. 
ases. Taking logarithm woul{"method":"textDocument\/didChang

So the stdio buffer is still clipped. But could 10000 * n really still not be enough for reading the buffer? Hard to believe (I am not familiar with low-level mechanisms like streaming data; I am just guessing, so forgive me).

randy3k commented 2 years ago

If you want to test,

try changing this function to

```r
        read_content = function(nbytes) {
            data <- ""
            logger$debug("original nbytes:", nbytes)
            while (nbytes > 0) {
                newdata <- self$read_char(nbytes)
                if (length(newdata) > 0) {
                    nbytes <- nbytes - nchar(newdata, type = "bytes")
                    logger$debug("newdata:", newdata)
                    logger$debug("nbytes:", nbytes)
                    data <- paste0(data, newdata)
                }
                Sys.sleep(0.01)
            }
            data
        },
```
randy3k commented 2 years ago

self$read_char may return a string shorter than nbytes, but successive calls should eventually return exactly nbytes in total.
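That accumulation loop can be illustrated with a toy reader that yields the payload in arbitrary-sized chunks; the loop terminates only once the byte counts add up. This is a self-contained simulation (ASCII payloads only, so character and byte counts coincide), not the server's actual connection code:

```r
# Simulate a stream that returns at most `chunk` characters per call,
# like a partial read from a pipe or socket.
make_reader <- function(payload, chunk = 8192) {
  pos <- 0
  function(nbytes) {
    take <- min(chunk, nbytes, nchar(payload) - pos)
    if (take <= 0) return(character(0))   # nothing available yet
    out <- substr(payload, pos + 1, pos + take)
    pos <<- pos + take
    out
  }
}

# Accumulate partial reads until exactly nbytes have been consumed,
# mirroring the read_content loop above (minus logging and sleeping).
read_all <- function(read_char, nbytes) {
  data <- ""
  while (nbytes > 0) {
    newdata <- read_char(nbytes)
    if (length(newdata) > 0) {
      nbytes <- nbytes - nchar(newdata, type = "bytes")
      data <- paste0(data, newdata)
    }
  }
  data
}
```

Note that if the stream ever delivers fewer total bytes than the header promised, this loop spins forever, and if it delivers duplicates, nbytes goes negative: both failure modes match the logs in this thread.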

milanglacier commented 2 years ago

Here is a log file showing the error, produced by editing an example file. Note that I changed logger$debug in your example code to logger$error, because setting debug = TRUE generates too much noise.

``` [2022-05-21 18:26:42.986] original nbytes: 46933 [2022-05-21 18:26:42.987] nbytes: 39025 [2022-05-21 18:26:42.998] nbytes: 30833 [2022-05-21 18:26:43.009] nbytes: 22641 [2022-05-21 18:26:43.020] nbytes: 14449 [2022-05-21 18:26:43.031] nbytes: 6257 [2022-05-21 18:26:43.043] nbytes: 0 [2022-05-21 18:26:43.082] original nbytes: 191 [2022-05-21 18:26:43.082] nbytes: 0 [2022-05-21 18:26:45.117] original nbytes: 191 [2022-05-21 18:26:45.118] nbytes: 0 [2022-05-21 18:27:22.283] original nbytes: 46939 [2022-05-21 18:27:22.284] nbytes: 38772 [2022-05-21 18:27:22.298] nbytes: 30580 [2022-05-21 18:27:22.312] nbytes: 22388 [2022-05-21 18:27:22.324] nbytes: 14196 [2022-05-21 18:27:22.338] nbytes: 6004 [2022-05-21 18:27:22.350] nbytes: 0 [2022-05-21 18:27:22.794] original nbytes: 46941 [2022-05-21 18:27:22.795] nbytes: 38774 [2022-05-21 18:27:22.809] nbytes: 30582 [2022-05-21 18:27:22.823] nbytes: 22390 [2022-05-21 18:27:22.837] nbytes: 14198 [2022-05-21 18:27:22.848] nbytes: 6006 [2022-05-21 18:27:22.861] nbytes: 0 [2022-05-21 18:27:23.951] original nbytes: 46942 [2022-05-21 18:27:23.951] nbytes: 38775 [2022-05-21 18:27:23.964] nbytes: 30583 [2022-05-21 18:27:23.977] nbytes: 22391 [2022-05-21 18:27:23.989] nbytes: 14199 [2022-05-21 18:27:24.000] nbytes: 6007 [2022-05-21 18:27:24.010] nbytes: 0 [2022-05-21 18:27:24.051] original nbytes: 46943 [2022-05-21 18:27:24.052] nbytes: 38776 [2022-05-21 18:27:24.062] nbytes: 30584 [2022-05-21 18:27:24.075] nbytes: 22392 [2022-05-21 18:27:24.086] nbytes: 14200 [2022-05-21 18:27:24.097] nbytes: 6008 [2022-05-21 18:27:24.108] nbytes: 0 [2022-05-21 18:27:24.122] original nbytes: 254 [2022-05-21 18:27:24.122] nbytes: 0 [2022-05-21 18:27:24.134] original nbytes: 191 [2022-05-21 18:27:24.134] nbytes: 0 [2022-05-21 18:27:24.679] original nbytes: 46942 [2022-05-21 18:27:24.679] nbytes: 38775 [2022-05-21 18:27:24.692] nbytes: 30583 [2022-05-21 18:27:24.705] nbytes: 22391 [2022-05-21 18:27:24.717] nbytes: 14199 [2022-05-21 18:27:24.730] nbytes: 
6007 [2022-05-21 18:27:24.743] nbytes: 0 [2022-05-21 18:27:24.976] original nbytes: 46955 [2022-05-21 18:27:24.976] nbytes: 38788 [2022-05-21 18:27:24.987] nbytes: 30596 [2022-05-21 18:27:25.000] nbytes: 22404 [2022-05-21 18:27:25.013] nbytes: 14212 [2022-05-21 18:27:25.026] nbytes: 6020 [2022-05-21 18:27:25.039] nbytes: 0 [2022-05-21 18:27:25.399] original nbytes: 191 [2022-05-21 18:27:25.400] nbytes: 0 [2022-05-21 18:27:25.413] original nbytes: 46956 [2022-05-21 18:27:25.413] nbytes: 39003 [2022-05-21 18:27:25.426] nbytes: 30811 [2022-05-21 18:27:25.439] nbytes: 22619 [2022-05-21 18:27:25.451] nbytes: 14427 [2022-05-21 18:27:25.462] nbytes: 6235 [2022-05-21 18:27:25.474] nbytes: 0 [2022-05-21 18:27:25.486] error handling json: Error: parse error: premature EOF {"jsonrpc":"2.0","method":"text (right here) ------^ Stack trace: 1: parse_string(txt, bigint_as_char) 2: parseJSON(txt, bigint_as_char) 3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector, simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix, flatten = flatten, ...) 4: jsonlite::fromJSON(data, simplifyVector = FALSE) [2022-05-21 18:27:25.490] Error: Unexpected non-empty line Call: self$read_header() Stack trace: 1: stop("Unexpected non-empty line") 2: self$read_header() 3: self$fetch(blocking = FALSE) [2022-05-21 18:27:25.490] exiting ```

I find that, for the content causing the error, newdata shows exactly the same content twice, i.e. read_char() read the same bytes twice:

[2022-05-21 18:27:25.439] nbytes: 22619
[2022-05-21 18:27:25.451] nbytes: 14427
[2022-05-21 18:27:25.462] nbytes: 6235
[2022-05-21 18:27:25.474] nbytes: 0
[2022-05-21 18:27:25.486] error handling json:  Error: parse error: premature EOF

Here the newdata chunk followed by nbytes: 14427 and the chunk followed by nbytes: 6235 are identical.

Since the log file is too large, I have omitted the newdata output here; the complete log (including the newdata output) is shared at https://gist.github.com/milanglacier/f6e65d276698edda53e70400441e114c

milanglacier commented 2 years ago

I ran several other experiments:

  1. The duplicated chunk can appear anywhere (not necessarily in the last two reads).

  2. It is also possible for the sum of the chunks to exceed the original nbytes, but one thing is certain: the duplicated newdata chunks are byte-for-byte identical. For personal privacy I have clipped the content of the files I am editing (i.e. the newdata output) from the log file below.

``` [2022-05-21 20:02:26.323] original nbytes: 29555 [2022-05-21 20:02:26.324] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:26.324] nbytes: 21647 [2022-05-21 20:02:26.337] newdata: interpolation,\nas we assume a smooth transition betw [2022-05-21 20:02:26.337] nbytes: 13455 [2022-05-21 20:02:26.350] newdata: .align = 'center', fig.cap = 'The scatterplot of SHAP [2022-05-21 20:02:26.350] nbytes: 5263 [2022-05-21 20:02:26.363] newdata: ated, anonymized sets\",\n \"Measured with [2022-05-21 20:02:26.363] nbytes: 0 [2022-05-21 20:02:26.407] original nbytes: 183 [2022-05-21 20:02:26.407] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:26.407] nbytes: 0 [2022-05-21 20:02:26.782] original nbytes: 183 [2022-05-21 20:02:26.783] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:26.783] nbytes: 0 [2022-05-21 20:02:27.246] original nbytes: 29561 [2022-05-21 20:02:27.247] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:27.248] nbytes: 21394 [2022-05-21 20:02:27.261] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:27.262] nbytes: 13202 [2022-05-21 20:02:27.276] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:27.276] nbytes: 5010 [2022-05-21 20:02:27.289] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:27.289] nbytes: 0 [2022-05-21 20:02:27.512] original nbytes: 29564 [2022-05-21 20:02:27.513] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:27.513] nbytes: 21397 [2022-05-21 20:02:27.524] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:27.524] nbytes: 13205 [2022-05-21 20:02:27.535] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:27.536] nbytes: 5013 [2022-05-21 20:02:27.547] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:27.547] nbytes: 0 
[2022-05-21 20:02:27.668] original nbytes: 29565 [2022-05-21 20:02:27.668] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:27.668] nbytes: 21398 [2022-05-21 20:02:27.679] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:27.679] nbytes: 13206 [2022-05-21 20:02:27.690] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:27.690] nbytes: 5014 [2022-05-21 20:02:27.703] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:27.703] nbytes: 0 [2022-05-21 20:02:27.717] original nbytes: 245 [2022-05-21 20:02:27.718] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:27.718] nbytes: 0 [2022-05-21 20:02:28.138] original nbytes: 29564 [2022-05-21 20:02:28.138] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:28.139] nbytes: 21397 [2022-05-21 20:02:28.149] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:28.150] nbytes: 13205 [2022-05-21 20:02:28.162] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:28.163] nbytes: 5013 [2022-05-21 20:02:28.175] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:28.176] nbytes: 0 [2022-05-21 20:02:28.506] original nbytes: 29577 [2022-05-21 20:02:28.507] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:28.508] nbytes: 21410 [2022-05-21 20:02:28.521] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:28.522] nbytes: 13218 [2022-05-21 20:02:28.534] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:28.535] nbytes: 5026 [2022-05-21 20:02:28.548] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:28.549] nbytes: 0 [2022-05-21 20:02:28.780] original nbytes: 29579 [2022-05-21 20:02:28.781] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:28.781] nbytes: 21412 
[2022-05-21 20:02:28.794] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:28.795] nbytes: 13220 [2022-05-21 20:02:28.808] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:28.808] nbytes: 5028 [2022-05-21 20:02:28.821] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:28.822] nbytes: 0 [2022-05-21 20:02:28.840] original nbytes: 245 [2022-05-21 20:02:28.840] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:28.840] nbytes: 0 [2022-05-21 20:02:29.125] original nbytes: 29581 [2022-05-21 20:02:29.126] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:29.126] nbytes: 21414 [2022-05-21 20:02:29.139] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:29.139] nbytes: 13222 [2022-05-21 20:02:29.152] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:29.152] nbytes: 5030 [2022-05-21 20:02:29.165] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:29.165] nbytes: 0 [2022-05-21 20:02:29.390] original nbytes: 29584 [2022-05-21 20:02:29.390] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:29.391] nbytes: 21417 [2022-05-21 20:02:29.401] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:29.401] nbytes: 13225 [2022-05-21 20:02:29.414] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:29.414] nbytes: 5033 [2022-05-21 20:02:29.427] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:29.427] nbytes: 0 [2022-05-21 20:02:30.087] original nbytes: 29585 [2022-05-21 20:02:30.087] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:30.087] nbytes: 21418 [2022-05-21 20:02:30.099] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:30.099] nbytes: 13226 [2022-05-21 20:02:30.111] newdata: y.\nAlso the 
numerical contributions to predicted valu [2022-05-21 20:02:30.111] nbytes: 5034 [2022-05-21 20:02:30.122] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:30.122] nbytes: 0 [2022-05-21 20:02:30.241] original nbytes: 29586 [2022-05-21 20:02:30.241] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:30.242] nbytes: 21419 [2022-05-21 20:02:30.255] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:30.255] nbytes: 13227 [2022-05-21 20:02:30.268] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:30.268] nbytes: 5035 [2022-05-21 20:02:30.281] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:30.281] nbytes: 0 [2022-05-21 20:02:30.296] original nbytes: 220 [2022-05-21 20:02:30.296] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:30.296] nbytes: 0 [2022-05-21 20:02:30.436] original nbytes: 29587 [2022-05-21 20:02:30.436] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:30.436] nbytes: 21420 [2022-05-21 20:02:30.449] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:30.449] nbytes: 13228 [2022-05-21 20:02:30.462] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:30.462] nbytes: 5036 [2022-05-21 20:02:30.475] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:30.475] nbytes: 0 [2022-05-21 20:02:30.491] original nbytes: 220 [2022-05-21 20:02:30.491] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:30.491] nbytes: 0 [2022-05-21 20:02:30.516] original nbytes: 29588 [2022-05-21 20:02:30.517] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:30.517] nbytes: 21664 [2022-05-21 20:02:30.529] newdata: ake the logarithm of our target data: the new infectio [2022-05-21 20:02:30.530] nbytes: 13472 [2022-05-21 20:02:30.541] newdata: 
nclude_graphics(\".\/figures_sz\/6.2.png\")\n```\n\nSp [2022-05-21 20:02:30.542] nbytes: 5280 [2022-05-21 20:02:30.554] newdata: d, anonymized sets\",\n \"Measured with agg [2022-05-21 20:02:30.554] nbytes: 0 [2022-05-21 20:02:30.569] original nbytes: 246 [2022-05-21 20:02:30.569] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:30.570] nbytes: 0 [2022-05-21 20:02:30.621] original nbytes: 183 [2022-05-21 20:02:30.621] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:30.621] nbytes: 0 [2022-05-21 20:02:30.634] original nbytes: 29589 [2022-05-21 20:02:30.634] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:30.634] nbytes: 21628 [2022-05-21 20:02:30.645] newdata: the new infectious\ncases. Taking logarithm would mak [2022-05-21 20:02:30.646] nbytes: 13436 [2022-05-21 20:02:30.658] newdata: png\")\n```\n\nSpecifically, for feature gender, high [2022-05-21 20:02:30.659] nbytes: 5244 [2022-05-21 20:02:30.670] newdata: Measured with aggregated, anonymized sets\",\n [2022-05-21 20:02:30.670] nbytes: 0 [2022-05-21 20:02:30.682] original nbytes: 221 [2022-05-21 20:02:30.682] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:30.682] nbytes: 0 [2022-05-21 20:02:30.800] original nbytes: 29590 [2022-05-21 20:02:30.800] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:30.800] nbytes: 21423 [2022-05-21 20:02:30.813] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:30.813] nbytes: 13231 [2022-05-21 20:02:30.826] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:30.826] nbytes: 5039 [2022-05-21 20:02:30.838] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:30.838] nbytes: 0 [2022-05-21 20:02:30.851] original nbytes: 221 [2022-05-21 20:02:30.851] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:30.851] 
nbytes: 0 [2022-05-21 20:02:31.076] original nbytes: 29591 [2022-05-21 20:02:31.076] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:31.077] nbytes: 21424 [2022-05-21 20:02:31.089] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:31.090] nbytes: 13232 [2022-05-21 20:02:31.102] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:31.103] nbytes: 5040 [2022-05-21 20:02:31.115] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:31.115] nbytes: 0 [2022-05-21 20:02:31.128] original nbytes: 221 [2022-05-21 20:02:31.128] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:31.128] nbytes: 0 [2022-05-21 20:02:31.246] original nbytes: 29593 [2022-05-21 20:02:31.247] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:31.247] nbytes: 21426 [2022-05-21 20:02:31.260] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:31.260] nbytes: 13234 [2022-05-21 20:02:31.272] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:31.273] nbytes: 5042 [2022-05-21 20:02:31.283] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:31.283] nbytes: 0 [2022-05-21 20:02:31.297] original nbytes: 221 [2022-05-21 20:02:31.297] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:31.297] nbytes: 0 [2022-05-21 20:02:31.417] original nbytes: 29594 [2022-05-21 20:02:31.417] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:31.417] nbytes: 21427 [2022-05-21 20:02:31.430] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:31.430] nbytes: 13235 [2022-05-21 20:02:31.443] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:31.443] nbytes: 5043 [2022-05-21 20:02:31.456] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:31.456] 
nbytes: 0 [2022-05-21 20:02:31.470] original nbytes: 221 [2022-05-21 20:02:31.470] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:31.471] nbytes: 0 [2022-05-21 20:02:31.590] original nbytes: 29595 [2022-05-21 20:02:31.590] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:31.590] nbytes: 21428 [2022-05-21 20:02:31.603] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:31.614] nbytes: 13236 [2022-05-21 20:02:31.628] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:31.628] nbytes: 5044 [2022-05-21 20:02:31.638] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:31.638] nbytes: 0 [2022-05-21 20:02:31.651] original nbytes: 221 [2022-05-21 20:02:31.651] newdata: {"params":{"position":{"character":8,"line":727},"text [2022-05-21 20:02:31.651] nbytes: 0 [2022-05-21 20:02:32.635] original nbytes: 29597 [2022-05-21 20:02:32.636] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:32.636] nbytes: 21430 [2022-05-21 20:02:32.649] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:32.649] nbytes: 13238 [2022-05-21 20:02:32.662] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:32.662] nbytes: 5046 [2022-05-21 20:02:32.672] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:32.672] nbytes: 0 [2022-05-21 20:02:32.688] original nbytes: 184 [2022-05-21 20:02:32.688] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:32.688] nbytes: 0 [2022-05-21 20:02:32.912] original nbytes: 29599 [2022-05-21 20:02:32.912] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:32.913] nbytes: 21432 [2022-05-21 20:02:32.925] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:32.926] nbytes: 13240 [2022-05-21 20:02:32.938] newdata: y.\nAlso the numerical 
contributions to predicted valu [2022-05-21 20:02:32.939] nbytes: 5048 [2022-05-21 20:02:32.951] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:32.952] nbytes: 0 [2022-05-21 20:02:33.583] original nbytes: 29600 [2022-05-21 20:02:33.583] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:33.583] nbytes: 21433 [2022-05-21 20:02:33.596] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:33.596] nbytes: 13241 [2022-05-21 20:02:33.607] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:33.607] nbytes: 5049 [2022-05-21 20:02:33.618] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:33.619] nbytes: 0 [2022-05-21 20:02:33.840] original nbytes: 29602 [2022-05-21 20:02:33.840] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:33.840] nbytes: 21435 [2022-05-21 20:02:33.850] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:33.851] nbytes: 13243 [2022-05-21 20:02:33.863] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:33.864] nbytes: 5051 [2022-05-21 20:02:33.876] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:33.876] nbytes: 0 [2022-05-21 20:02:34.207] original nbytes: 29603 [2022-05-21 20:02:34.208] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:34.208] nbytes: 21436 [2022-05-21 20:02:34.221] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:34.221] nbytes: 13244 [2022-05-21 20:02:34.234] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:34.234] nbytes: 5052 [2022-05-21 20:02:34.247] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:34.247] nbytes: 0 [2022-05-21 20:02:34.363] original nbytes: 29605 [2022-05-21 20:02:34.363] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:34.363] 
nbytes: 21438 [2022-05-21 20:02:34.376] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:34.376] nbytes: 13246 [2022-05-21 20:02:34.389] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:34.389] nbytes: 5054 [2022-05-21 20:02:34.402] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:34.402] nbytes: 0 [2022-05-21 20:02:34.415] original nbytes: 246 [2022-05-21 20:02:34.415] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:34.415] nbytes: 0 [2022-05-21 20:02:34.638] original nbytes: 184 [2022-05-21 20:02:34.638] newdata: {"params":{"textDocument":{"uri":"file:\/\/\/Users\/no [2022-05-21 20:02:34.638] nbytes: 0 [2022-05-21 20:02:34.651] original nbytes: 29607 [2022-05-21 20:02:34.652] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:34.652] nbytes: 21647 [2022-05-21 20:02:34.664] newdata: : the new infectious\ncases. Taking logarithm would ma [2022-05-21 20:02:34.665] nbytes: 13455 [2022-05-21 20:02:34.677] newdata: .png\")\n```\n\nSpecifically, for feature gender, high [2022-05-21 20:02:34.677] nbytes: 5263 [2022-05-21 20:02:34.688] newdata: "Measured with aggregated, anonymized sets\",\n [2022-05-21 20:02:34.688] nbytes: 0 [2022-05-21 20:02:34.906] original nbytes: 29611 [2022-05-21 20:02:34.906] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:34.906] nbytes: 21444 [2022-05-21 20:02:34.917] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:34.917] nbytes: 13252 [2022-05-21 20:02:34.929] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:34.930] nbytes: 5060 [2022-05-21 20:02:34.941] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:34.941] nbytes: 0 [2022-05-21 20:02:35.373] original nbytes: 29612 [2022-05-21 20:02:35.373] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:35.374] 
nbytes: 21445 [2022-05-21 20:02:35.387] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:35.387] nbytes: 13253 [2022-05-21 20:02:35.397] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:35.398] nbytes: 5061 [2022-05-21 20:02:35.411] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:35.411] nbytes: 0 [2022-05-21 20:02:35.532] original nbytes: 29613 [2022-05-21 20:02:35.532] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:35.532] nbytes: 21446 [2022-05-21 20:02:35.545] newdata: e also take two lags: 7 days and 14 days, as this woul [2022-05-21 20:02:35.546] nbytes: 13254 [2022-05-21 20:02:35.558] newdata: y.\nAlso the numerical contributions to predicted valu [2022-05-21 20:02:35.559] nbytes: 5062 [2022-05-21 20:02:35.571] newdata: )\n ),\n caption = \"Filed specification [2022-05-21 20:02:35.572] nbytes: 0 [2022-05-21 20:02:35.588] original nbytes: 222 [2022-05-21 20:02:35.588] newdata: {"params":{"position":{"character":13,"line":729},"tex [2022-05-21 20:02:35.588] nbytes: 0 [2022-05-21 20:02:35.604] original nbytes: 29614 [2022-05-21 20:02:35.604] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:35.604] nbytes: 21692 [2022-05-21 20:02:35.617] newdata: take the logarithm of our target data: the new infect [2022-05-21 20:02:35.617] nbytes: 13500 [2022-05-21 20:02:35.630] newdata: :include_graphics(\".\/figures_sz\/6.2.png\")\n```\n\n [2022-05-21 20:02:35.630] nbytes: 5308 [2022-05-21 20:02:35.643] newdata: ted, anonymized sets\",\n \"Measured with a [2022-05-21 20:02:35.643] nbytes: 0 [2022-05-21 20:02:35.660] original nbytes: 222 [2022-05-21 20:02:35.660] newdata: {"params":{"position":{"character":13,"line":729},"tex [2022-05-21 20:02:35.661] nbytes: 0 [2022-05-21 20:02:36.734] original nbytes: 29616 [2022-05-21 20:02:36.734] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:36.735] 
nbytes: 21449 [2022-05-21 20:02:36.747] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:36.748] nbytes: 13282 [2022-05-21 20:02:36.760] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:36.761] nbytes: 5115 [2022-05-21 20:02:36.774] newdata: {"params":{"contentChanges":[{"text":"---\ntitle: 'On [2022-05-21 20:02:36.774] nbytes: -3052 [2022-05-21 20:02:36.787] error handling json: Error: lexical error: invalid char in json text. @tsapp].\nFor covariates, w{"params":{"contentChanges":[{"te (right here) ------^ Stack trace: 1: parse_string(txt, bigint_as_char) 2: parseJSON(txt, bigint_as_char) 3: parse_and_simplify(txt = txt, simplifyVector = simplifyVector, simplifyDataFrame = simplifyDataFrame, simplifyMatrix = simplifyMatrix, flatten = flatten, ...) 4: jsonlite::fromJSON(data, simplifyVector = FALSE) [2022-05-21 20:02:36.899] Error: Unexpected non-empty line Call: self$read_header() Stack trace: 1: stop("Unexpected non-empty line") 2: self$read_header() 3: self$fetch(blocking = FALSE) ```
milanglacier commented 2 years ago

I made a simple change to self$read_content() that checks whether two consecutive batches read by read_char() are identical; if so, the duplicate batch is ignored and reading continues. It currently works fine on the Rmd file I listed as an example. I will test it further on my other problematic files, and once it works on all of them I will submit a PR.

(update: tested on an R file with 1500 lines, caret/resamples.R, and it works fine; completion is smooth)
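A sketch of the workaround described above (this is an assumed shape, not the actual patch; the real PR may differ): compare each chunk against the previous one and skip exact repeats before counting their bytes.

```r
# Hypothetical standalone version of the dedup read loop: skip a chunk that
# is byte-identical to the previous one, since the observed failure mode is
# the same block arriving twice. Caveat: this would also drop a payload that
# legitimately contains two identical consecutive chunks.
read_content_dedup <- function(read_char, nbytes) {
  data <- ""
  lastdata <- NULL
  while (nbytes > 0) {
    newdata <- read_char(nbytes)
    if (length(newdata) > 0) {
      if (!identical(newdata, lastdata)) {
        nbytes <- nbytes - nchar(newdata, type = "bytes")
        data <- paste0(data, newdata)
      }
      lastdata <- newdata
    }
    Sys.sleep(0.01)
  }
  data
}
```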

randy3k commented 2 years ago

> I make a simple change to the function self$read_content() which checks if the two batches read by read_char() are the same, if the two batches are the same, then ignore this batch and continue to read. Currently it seems work fine on the rmd file I list as example, I will try to give more test on my problematic files. Once I find that it works fine on all my problematic files, I will submit a PR.
>
> (updated: test on a R file with 1500 lines caret/resamples.R and it works fine, the completion is smooth)

That's useful. Not sure why the data is doubled. For some reason, the cursor position is reset.