PCdLf / wearalyze

Shiny app for wearables package

Spark installation #16

Open luciennedenuil opened 1 week ago

luciennedenuil commented 1 week ago

Hi Veerle, @hypebright

When trying to upload the Nowatch and Empatica data in Wearalyze, the app crashes and I receive the following message in R:

Warning in spark_install_find(version, hadoop_version, latest = FALSE) :
  The Spark version specified may not be available. Please consider running spark_available_versions() to list all known available Spark versions.
Warning: Error in : Running Spark on Windows requires the Microsoft Visual C++ 2010 SP1 Redistributable Package. Please download and install from:

https://www.microsoft.com/en-us/download/details.aspx?id=26999

Then restart R after the installation completes

85: stop
84: verify_msvcr100 [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/install_spark_windows.R#26]
83: prepare_windows_environment [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/install_spark_windows.R#47]
82: shell_connection [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/shell_connection.R#57]
81: spark_connect_method.default [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/connection_spark.R#355]
79: spark_connect [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/connection_spark.R#235]
78: read_raw_embrace_plus [C:/Users/Ipse1/AppData/Local/Temp/RtmpOgPVQQ/R.INSTALL4be833442c79/PCdLf-wearables-4e15339/R/read_embrace_plus.R#263]
77: read_embrace_plus [C:/Users/Ipse1/AppData/Local/Temp/RtmpOgPVQQ/R.INSTALL4be833442c79/PCdLf-wearables-4e15339/R/read_embrace_plus.R#149]
73: observe [C:\Users\Ipse1\Downloads\wearalyze-main\wearalyze-main\app\view\dataUpload.R#276]
72: [C:/Users/Ipse1/AppData/Local/Temp/RtmpaaqI0u/R.INSTALL1eb071365529/shiny/R/utils.R#1455]
1: shiny::runApp [C:/Users/Ipse1/AppData/Local/Temp/RtmpaaqI0u/R.INSTALL1eb071365529/shiny/R/runapp.R#388]

Could you help me with this issue?

hypebright commented 1 week ago

Hi @luciennedenuil ,

For Spark to run properly, Windows needs some system dependencies installed. Normally those are already present, but they may be outdated or missing on your machine.

Can you try downloading and installing the Microsoft Visual C++ Redistributable, and then restart R to apply the changes? You can download the latest version here: https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist?view=msvc-170#latest-microsoft-visual-c-redistributable-version
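After restarting R, one way to double-check whether sparklyr can actually see a Spark installation is to query it directly. This is a minimal sketch using standard sparklyr functions (the error message above already points at `spark_available_versions()`):

```r
library(sparklyr)

# Spark builds that sparklyr has already installed locally
# (on Windows these typically live under %LOCALAPPDATA%/spark)
spark_installed_versions()

# Spark versions that sparklyr knows how to download
spark_available_versions()
```

If `spark_installed_versions()` returns an empty table or an unexpected version, that would explain why connecting fails.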

I think you're on x64, but if you're in doubt, do the following to check:

  1. Press Windows Key + R to open the Run dialog box.
  2. Type msinfo32 and press Enter.
  3. Look at the "System Type" entry (e.g. "x64-based PC").

Let me know if that works!

luciennedenuil commented 1 week ago

Hi @hypebright,

Thanks! I have installed the Redistributable. Luckily it is now possible to upload the Nowatch zip files :) However, it did not work for the Empatica zip files: Wearalyze still crashes when trying to upload them. Console output for the Empatica error:

Listening on http://127.0.0.1:3730
ℹ Connecting to local Spark cluster
Warning in sprintf(versions$pattern, version$spark, version$hadoop) :
  2 arguments not used by format 'spark-3.5-bin-hadoop2.7'
Warning: Error in system2: '"C:\Users\Ipse1\AppData\Local/spark/spark-3.5-bin-hadoop2.7/bin/spark-submit"' not found
86: system2
85: versionAttempt [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/spark_version.R#110]
84: spark_version_from_home [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/spark_version.R#125]
83: validate_java_version [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/java.R#57]
82: shell_connection [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/shell_connection.R#61]
81: spark_connect_method.default [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/connection_spark.R#355]
79: spark_connect [C:/Users/Ipse1/AppData/Local/Temp/RtmpwD9hrQ/R.INSTALL4de813bf54c8/sparklyr/R/connection_spark.R#235]
78: read_raw_embrace_plus [C:/Users/Ipse1/AppData/Local/Temp/RtmpOgPVQQ/R.INSTALL4be833442c79/PCdLf-wearables-4e15339/R/read_embrace_plus.R#263]
77: read_embrace_plus [C:/Users/Ipse1/AppData/Local/Temp/RtmpOgPVQQ/R.INSTALL4be833442c79/PCdLf-wearables-4e15339/R/read_embrace_plus.R#149]
73: observe [C:\Users\Ipse1\Downloads\wearalyze-main\wearalyze-main\app\view\dataUpload.R#276]
72: [C:/Users/Ipse1/AppData/Local/Temp/RtmpaaqI0u/R.INSTALL1eb071365529/shiny/R/utils.R#1455]
1: shiny::runApp [C:/Users/Ipse1/AppData/Local/Temp/RtmpaaqI0u/R.INSTALL1eb071365529/shiny/R/runapp.R#388]

luciennedenuil commented 1 week ago

In addition, I have tried to open the Empatica files using the 'select folder' button, but that does not work either.

Listening on http://127.0.0.1:4703
Warning: Error in file: invalid 'description' argument
78: file
77: read.table
76: read.csv
75: read_aggregated_embrace_plus [C:/Users/Ipse1/AppData/Local/Temp/RtmpOgPVQQ/R.INSTALL4be833442c79/PCdLf-wearables-4e15339/R/read_embrace_plus.R#203]
74: read_embrace_plus [C:/Users/Ipse1/AppData/Local/Temp/RtmpOgPVQQ/R.INSTALL4be833442c79/PCdLf-wearables-4e15339/R/read_embrace_plus.R#153]
73: observe [C:\Users\Ipse1\Downloads\wearalyze-main\wearalyze-main\app\view\dataUpload.R#376]
72: [C:/Users/Ipse1/AppData/Local/Temp/RtmpaaqI0u/R.INSTALL1eb071365529/shiny/R/utils.R#1455]
1: shiny::runApp [C:/Users/Ipse1/AppData/Local/Temp/RtmpaaqI0u/R.INSTALL1eb071365529/shiny/R/runapp.R#388]

hypebright commented 1 week ago


It seems like something is still not right with the Spark installation on your computer. Perhaps you can try installing Spark manually from here: https://spark.apache.org/downloads.html
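Alternatively, since the error points at a broken build under C:\Users\Ipse1\AppData\Local/spark, you could try removing that build and letting sparklyr re-download it. A minimal sketch, where the version numbers are assumptions based on the "spark-3.5-bin-hadoop2.7" folder name in your error:

```r
library(sparklyr)

# Inspect what sparklyr thinks is installed
spark_installed_versions()

# Remove the broken build (version/hadoop_version assumed from the
# folder name in the error message; adjust to what the table above shows)
spark_uninstall(version = "3.5.0", hadoop_version = "2.7")

# Re-download and install a fresh build
spark_install(version = "3.5")
```

After that, restart R and try the upload again.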

Can you also tell me exactly which files you tried to upload when using the "select folder" button, and whether you ticked the "aggregated data" checkbox?

When using aggregated data, the app uses the .csv files instead of the .avro files. We only need Spark for the .avro files.

luciennedenuil commented 6 days ago

Hi @hypebright ,

Good news: it is now possible to upload the Empatica files through the 'select folder' button. However, it still does not work with a zip folder. I will look into the manual Spark installation you suggested next week.