Closed: devilgate closed this 5 years ago
Needs to log to a separate log4j file, or log rotation will not work. This has been done before by copying RIFLogger.java, but it is probably time for a better solution.
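For illustration, a separate file per service would look something like this in log4j2 syntax. This is a hedged sketch only: the appender names, file paths, and logger name prefix (`org.sahsu.stats`) are assumptions for the example, not the project's actual configuration.

```xml
<!-- Hypothetical sketch: route the statistics service to its own rolling
     file so the two services never rotate the same log. Names and paths
     are illustrative, not the project's real configuration. -->
<Configuration>
  <Appenders>
    <RollingFile name="MiddlewareLog" fileName="logs/RIF_middleware.log"
                 filePattern="logs/RIF_middleware.%d{yyyy-MM-dd}-%i.log">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger : %msg%n"/>
      <Policies><TimeBasedTriggeringPolicy/></Policies>
    </RollingFile>
    <RollingFile name="StatisticsLog" fileName="logs/RIF_statistics.log"
                 filePattern="logs/RIF_statistics.%d{yyyy-MM-dd}-%i.log">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger : %msg%n"/>
      <Policies><TimeBasedTriggeringPolicy/></Policies>
    </RollingFile>
  </Appenders>
  <Loggers>
    <!-- Assumed package prefix for the statistics service loggers -->
    <Logger name="org.sahsu.stats" level="info" additivity="false">
      <AppenderRef ref="StatisticsLog"/>
    </Logger>
    <Root level="info">
      <AppenderRef ref="MiddlewareLog"/>
    </Root>
  </Loggers>
</Configuration>
```

With `additivity="false"` on the statistics logger, its events reach only its own appender, so each service's file rotates independently.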
Postgres: OK. SQL Server: fails updating the results map table. Probably a piece of SQL Server-specific code has been missed in the migration. The error trace appears correctly in the front end.
!!!!!! Statistics WebService !!!!!!
07:53:31.040 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.StatisticsResource]:
In runScript, serving /statistics/script
07:53:31.041 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.StatisticsResource]:
Parameters object is not null
07:53:31.041 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
In runScript
07:53:31.041 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
R parameters:
userID=peter
password=XXXXXXXX
odbcDataSource=SQLServer13
db_driver_prefix=jdbc:sqlserver
db_host=localhost
db_port=1433
db_name=sahsuland
db_driver_class_name=com.microsoft.sqlserver.jdbc.SQLServerDriver
db_url=jdbc:sqlserver://localhost:1433
java_lib_path_dir=C:\Program Files\Apache Software Foundation\Tomcat 8.5\webapps\rifServices\WEB-INF\lib
odbcDataSource=SQLServer13
studyID=174
investigationName=TEST_1002
names.adj.1=NONE
adj.1=FALSE
studyName=TEST 1002
studyDescription=TEST 1002 LUNG CANCER HET 95_96
investigationId=160
studyType=diseaseMapping
model=HET
working_dir=c:\rifDemo\scratchSpace
databaseType=ms
07:53:31.042 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Loading R script C:\Program Files\Apache Software Foundation\Tomcat 8.5\webapps\statistics\WEB-INF\classes\Statistics_JRI.R
07:53:31.042 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Source(\): 'C:\\Program Files\\Apache Software Foundation\\Tomcat 8.5\\webapps\\statistics\\WEB-INF\\classes\\Statistics_JRI.R'
07:53:32.022 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.logging.LoggingConsole]:
rFlushConsole[2] calls: 58, length: 846, time period: PT1M38.861S
"C:/Program Files/R/R-3.4.4/library"
R version 3.4.4 (2018-03-15)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 17134)
Matrix products: default
locale:
[1] LC_COLLATE=English_United Kingdom.1252
[2] LC_CTYPE=English_United Kingdom.1252
[3] LC_MONETARY=English_United Kingdom.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United Kingdom.1252
attached base packages:
[1] stats graphics grDevices utils datasets methods base
loaded via a namespace (and not attached):
[1] compiler_3.4.4
R Error/Warning/Notice: Loading required package: RODBC
R Error/Warning/Notice: Loading required package: RJDBC
R Error/Warning/Notice: Loading required package: DBI
R Error/Warning/Notice: Loading required package: rJava
R Error/Warning/Notice: Loading required package: sp
07:53:33.680 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.logging.LoggingConsole]:
rFlushConsole[3] calls: 2, length: 186, time period: PT1.657S
R Error/Warning/Notice: Loading required package: Matrix
R Error/Warning/Notice: This is INLA_17.06.20 built 2017-06-20 03:42:30 UTC.
See www.r-inla.org/contact-us for how to get help.
07:53:35.123 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Done: 'C:\\Program Files\\Apache Software Foundation\\Tomcat 8.5\\webapps\\statistics\\WEB-INF\\classes\\Statistics_JRI.R'
07:53:35.125 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Loading R script C:\Program Files\Apache Software Foundation\Tomcat 8.5\webapps\statistics\WEB-INF\classes\performSmoothingActivity.R
07:53:35.126 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Source(\): 'C:\\Program Files\\Apache Software Foundation\\Tomcat 8.5\\webapps\\statistics\\WEB-INF\\classes\\performSmoothingActivity.R'
07:53:35.136 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Done: 'C:\\Program Files\\Apache Software Foundation\\Tomcat 8.5\\webapps\\statistics\\WEB-INF\\classes\\performSmoothingActivity.R'
07:53:35.136 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.ScriptService]:
Calling Disease Mapping R function
07:53:35.298 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.logging.LoggingConsole]:
rFlushConsole[4] calls: 6, length: 689, time period: PT1.617S
R Error/Warning/Notice:
Attaching package: 'INLA'
R Error/Warning/Notice: The following object is masked from 'package:pryr':
f
R Error/Warning/Notice: Checking rgeos availability: FALSE
Note: when rgeos is not available, polygon geometry computations in maptools depend on gpclib,
which has a restricted licence. It is disabled by default;
to enable gpclib, type gpclibPermit()
R Error/Warning/Notice: Loading required package: spData
R Error/Warning/Notice: To access larger datasets in this package, install the spDataLarge
package with: `install.packages('spDataLarge',
repos='https://nowosad.github.io/drat/', type='source'))`
In runRSmoothingFunctions in JRI script
07:54:05.131 [http-nio-8080-exec-7] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.stats.service.logging.LoggingConsole]:
rFlushConsole[5] calls: 14, length: 1031, time period: PT29.833S
In establishTableNames; scratchSpace is scratchSpace/d101-200/s174/data
Table names established
Copy: C:\Program Files\Apache Software Foundation\Tomcat 8.5/webapps/statistics/WEB-INF/classes/performSmoothingActivity.R to: scratchSpace/d101-200/s174/data/performSmoothingActivity.R
Copy: C:\Program Files\Apache Software Foundation\Tomcat 8.5/webapps/statistics/WEB-INF/classes/Statistics_csv.R to: scratchSpace/d101-200/s174/data/Statistics_csv.R
Copy: C:\Program Files\Apache Software Foundation\Tomcat 8.5/webapps/statistics/WEB-INF/classes/Statistics_Common.R to: scratchSpace/d101-200/s174/data/Statistics_Common.R
Copy: C:\Program Files\Apache Software Foundation\Tomcat 8.5/webapps/statistics/WEB-INF/classes/rif40_run_R.bat to: scratchSpace/d101-200/s174/data/rif40_run_R.bat
Create: scratchSpace/d101-200/s174/data/rif40_run_R_env.bat
About to test exitValue 0 and connection
Performing basic stats and smoothing
Statistics_JRI.R exitValue: 1; append statsOutput tracer: 217; to errorTrace tracer: 3
07:54:05.514 [pool-7-thread-1] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.rif.services.datastorage.common.StatisticsProcessing]:
Statistics service called: POST http://localhost:8080/statistics/service/script returned a response status of 500 Internal Server Error
07:54:05.515 [pool-7-thread-1] ERROR org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.rif.services.datastorage.common.StatisticsProcessing]:
Received a failure response from the Statistics Service. Response as a string is
POST http://localhost:8080/statistics/service/script returned a response status of 500 Internal Server Error
The Entity from the Response is
Trying JDBC connection using driver class com.microsoft.sqlserver.jdbc.SQLServerDriver, lib dir C:\Program Files\Apache Software Foundation\Tomcat 8.5\webapps\rifServices\WEB-INF\lib, URL jdbc:sqlserver://localhost:1433, user peter
... JDBC connection established
Connected to DB
connectToDb exitValue: 0
About to fetch extract table
JDBC EXTRACT TABLE NAME: rif_studies.s174_extract
Querying by JDBC: select * from rif_studies.s174_extract
JDBC Saving extract frame to: scratchSpace/d101-200/s174/data/tmp_s174_extract.csv
JDBC rif_studies.s174_extract numberOfRows=108328==
About to get adjacency matrix
Querying by JDBC: SELECT b2.adjacencytable
FROM [rif40].[rif40_studies] b1, [rif40].[rif40_geographies] b2
WHERE b1.study_id = 174
AND b2.geography = b1.geography
Querying by JDBC: WITH b AS ( /* Tilemaker: has adjacency table */
SELECT b1.area_id, b3.geolevel_id
FROM [rif40].[rif40_study_areas] b1, [rif40].[rif40_studies] b2, [rif40].[rif40_geolevels] b3
WHERE b1.study_id = 174
AND b1.study_id = b2.study_id
AND b2.geography = b3.geography
)
SELECT c1.areaid AS area_id, c1.num_adjacencies, c1.adjacency_list
FROM [rif_data].[adjacency_sahsuland] c1, b
WHERE c1.geolevel_id = b.geolevel_id
AND c1.areaid = b.area_id
ORDER BY 1
rif40_GetAdjacencyMatrix numberOfRows=1229==
Saving adjacency matrix to: scratchSpace/d101-200/s174/data/tmp_s174_adjacency_matrix.csv
About to smooth
Covariates: none
Removed: 0 of 8012 non adjacent areas
Bayes smoothing with HET model type no adjustment
Posterior probability calculated
callPerformSmoothingActivity exitValue: 0
performSmoothingActivity() OK: 0
typeof(result$area_id[1]) ----> integer ; check.integer(result$area_id[1]): FALSE ; result$area_id[1]: 01.001.000100.1
About to save to table
In saveDataFrameToDatabaseTableSaving data frame to: scratchSpace/d101-200/s174/data/tmp_s174_map.csv
Creating temporary table: peter.tmp_s174_map
Replace INF will NA for temporary table: peter.tmp_s174_map
Replace NAN with NA for temporary table: peter.tmp_s174_map
Replace "" will NA for temporary table: peter.tmp_s174_map
About to write temporary table: peter.tmp_s174_map; first 10 rows
area_id gid gid_rowindex username study_id inv_id band_id genders
1 01.001.000100.1 NA NA NA 174 160 1 1
2 01.001.000100.1 NA NA NA 174 160 1 2
3 01.001.000100.1 NA NA NA 174 160 1 3
4 01.001.000100.2 NA NA NA 174 160 2 1
5 01.001.000100.2 NA NA NA 174 160 2 2
6 01.001.000100.2 NA NA NA 174 160 2 3
7 01.001.000200.1 NA NA NA 174 160 3 1
8 01.001.000200.1 NA NA NA 174 160 3 2
9 01.001.000200.1 NA NA NA 174 160 3 3
10 01.001.000300.1 NA NA NA 174 160 4 1
direct_standardisation adjusted observed expected lower95 upper95
1 0 0 4 9.298619 0.1172072 1.101410
2 0 0 10 6.215759 0.7714888 2.958666
3 0 0 14 16.502903 0.4637930 1.423363
4 0 0 8 9.799017 0.3524672 1.608650
5 0 0 4 6.032311 0.1806713 1.697788
6 0 0 12 16.576684 0.3740540 1.264522
7 0 0 16 8.413628 1.0869725 3.088204
8 0 0 4 4.376545 0.2490242 2.340108
9 0 0 20 12.706711 0.9614226 2.430871
10 0 0 8 9.577352 0.3606250 1.645882
relative_risk smoothed_relative_risk posterior_probability
1 0.4301714 0.4301714 0.1171262
2 1.6088139 1.6088139 0.9480916
3 0.8483356 0.8483356 0.4156899
4 0.8164085 0.8164085 0.4203541
5 0.6630957 0.6630957 0.3690862
6 0.7239084 0.7239084 0.2528017
7 1.9016766 1.9016766 0.9849630
8 0.9139630 0.9139630 0.5702991
9 1.5739714 1.5739714 0.9689681
10 0.8353040 0.8353040 0.4408056
posterior_probability_upper95 posterior_probability_lower95
1 NA NA
2 NA NA
3 NA NA
4 NA NA
5 NA NA
6 NA NA
7 NA NA
8 NA NA
9 NA NA
10 NA NA
residual_relative_risk residual_rr_lower95 residual_rr_upper95 smoothed_smr
1 NA NA NA 0.6453322
2 NA NA NA 1.3677098
3 NA NA NA 0.8699985
4 NA NA NA 0.8522729
5 NA NA NA 0.7294029
6 NA NA NA 0.7878698
7 NA NA NA 1.5300215
8 NA NA NA 0.8778297
9 NA NA NA 1.3548498
10 NA NA NA 0.8637252
smoothed_smr_lower95 smoothed_smr_upper95
1 0.3577265 1.112127
2 0.7438924 2.360420
3 0.5606205 1.307479
4 0.4971489 1.400670
5 0.3422524 1.433292
6 0.5006380 1.199196
7 0.9550995 2.364329
8 0.4001204 1.773553
9 0.8962529 1.989012
10 0.5031049 1.421521
Stack tracer >>>
.handleSimpleError(function (obj)
{
calls = sys.calls()
calls = ca stop("execute JDBC update query failed in dbSendUpdate (", .jcall(x, "S", " .local(conn, statement, ...) dbSendUpdate(conn, inss, list = list) dbSendUpdate(conn, inss, list = list) .local(conn, name, value, ...) dbWriteTable(connection, name = temporarySmoothedResultsTableName, value = dbWriteTable(connection, name = temporarySmoothedResultsTableName, value = Statistics_Common.R#178: withVisible(expr) Statistics_Common.R#178: withCallingHandlers(withVisible(expr), error = err withErrorTracing({
if (dbExistsTable(connection, temporarySmoothedResul doTryCatch(return(expr), name, parentenv, handler) tryCatchOne(expr, names, parentenv, handlers[[1]]) tryCatchList(expr, names[-nh], parentenv, handlers[-nh]) doTryCatch(return(expr), name, parentenv, handler) tryCatchOne(tryCatchList(expr, names[-nh], parentenv, handlers[-nh]), names tryCatchList(expr, classes, parentenv, handlers) tryCatch({
withErrorTracing({
if (dbExistsTable(connection, tem eval(expr, pf) eval(expr, pf) withVisible(eval(expr, pf)) evalVis(expr) JdbcHandler.R#186: capture.output({
cat("In saveDataFrameToDatabaseTabl saveDataFrameToDatabaseTable(result) tryCatch({
withErrorTracing({
cat(paste0("About to fetch extrac eval(expr, pf) eval(expr, pf) withVisible(eval(expr, pf)) evalVis(expr) Statistics_JRI.R#97: capture.output({
tryCatch({
withErrorTraci runRSmoothingFunctions()
<<< End of stack tracer.
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: .local
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: conn
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: statement
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: ...
saveDataFrameToDatabaseTable finishing
In saveDataFrameToDatabaseTableSaving data frame to: scratchSpace/d101-200/s174/data/tmp_s174_map.csv
Creating temporary table: peter.tmp_s174_map
Replace INF will NA for temporary table: peter.tmp_s174_map
Replace NAN with NA for temporary table: peter.tmp_s174_map
Replace "" will NA for temporary table: peter.tmp_s174_map
About to write temporary table: peter.tmp_s174_map; first 10 rows
area_id gid gid_rowindex username study_id inv_id band_id genders
1 01.001.000100.1 NA NA NA 174 160 1 1
2 01.001.000100.1 NA NA NA 174 160 1 2
3 01.001.000100.1 NA NA NA 174 160 1 3
4 01.001.000100.2 NA NA NA 174 160 2 1
5 01.001.000100.2 NA NA NA 174 160 2 2
6 01.001.000100.2 NA NA NA 174 160 2 3
7 01.001.000200.1 NA NA NA 174 160 3 1
8 01.001.000200.1 NA NA NA 174 160 3 2
9 01.001.000200.1 NA NA NA 174 160 3 3
10 01.001.000300.1 NA NA NA 174 160 4 1
direct_standardisation adjusted observed expected lower95 upper95
1 0 0 4 9.298619 0.1172072 1.101410
2 0 0 10 6.215759 0.7714888 2.958666
3 0 0 14 16.502903 0.4637930 1.423363
4 0 0 8 9.799017 0.3524672 1.608650
5 0 0 4 6.032311 0.1806713 1.697788
6 0 0 12 16.576684 0.3740540 1.264522
7 0 0 16 8.413628 1.0869725 3.088204
8 0 0 4 4.376545 0.2490242 2.340108
9 0 0 20 12.706711 0.9614226 2.430871
10 0 0 8 9.577352 0.3606250 1.645882
relative_risk smoothed_relative_risk posterior_probability
1 0.4301714 0.4301714 0.1171262
2 1.6088139 1.6088139 0.9480916
3 0.8483356 0.8483356 0.4156899
4 0.8164085 0.8164085 0.4203541
5 0.6630957 0.6630957 0.3690862
6 0.7239084 0.7239084 0.2528017
7 1.9016766 1.9016766 0.9849630
8 0.9139630 0.9139630 0.5702991
9 1.5739714 1.5739714 0.9689681
10 0.8353040 0.8353040 0.4408056
posterior_probability_upper95 posterior_probability_lower95
1 NA NA
2 NA NA
3 NA NA
4 NA NA
5 NA NA
6 NA NA
7 NA NA
8 NA NA
9 NA NA
10 NA NA
residual_relative_risk residual_rr_lower95 residual_rr_upper95 smoothed_smr
1 NA NA NA 0.6453322
2 NA NA NA 1.3677098
3 NA NA NA 0.8699985
4 NA NA NA 0.8522729
5 NA NA NA 0.7294029
6 NA NA NA 0.7878698
7 NA NA NA 1.5300215
8 NA NA NA 0.8778297
9 NA NA NA 1.3548498
10 NA NA NA 0.8637252
smoothed_smr_lower95 smoothed_smr_upper95
1 0.3577265 1.112127
2 0.7438924 2.360420
3 0.5606205 1.307479
4 0.4971489 1.400670
5 0.3422524 1.433292
6 0.5006380 1.199196
7 0.9550995 2.364329
8 0.4001204 1.773553
9 0.8962529 1.989012
10 0.5031049 1.421521
Stack tracer >>>
.handleSimpleError(function (obj)
{
calls = sys.calls()
calls = ca stop("execute JDBC update query failed in dbSendUpdate (", .jcall(x, "S", " .local(conn, statement, ...) dbSendUpdate(conn, inss, list = list) dbSendUpdate(conn, inss, list = list) .local(conn, name, value, ...) dbWriteTable(connection, name = temporarySmoothedResultsTableName, value = dbWriteTable(connection, name = temporarySmoothedResultsTableName, value = Statistics_Common.R#178: withVisible(expr) Statistics_Common.R#178: withCallingHandlers(withVisible(expr), error = err withErrorTracing({
if (dbExistsTable(connection, temporarySmoothedResul doTryCatch(return(expr), name, parentenv, handler) tryCatchOne(expr, names, parentenv, handlers[[1]]) tryCatchList(expr, names[-nh], parentenv, handlers[-nh]) doTryCatch(return(expr), name, parentenv, handler) tryCatchOne(tryCatchList(expr, names[-nh], parentenv, handlers[-nh]), names tryCatchList(expr, classes, parentenv, handlers) tryCatch({
withErrorTracing({
if (dbExistsTable(connection, tem eval(expr, pf) eval(expr, pf) withVisible(eval(expr, pf)) evalVis(expr) JdbcHandler.R#186: capture.output({
cat("In saveDataFrameToDatabaseTabl saveDataFrameToDatabaseTable(result) tryCatch({
withErrorTracing({
cat(paste0("About to fetch extrac eval(expr, pf) eval(expr, pf) withVisible(eval(expr, pf)) evalVis(expr) Statistics_JRI.R#97: capture.output({
tryCatch({
withErrorTraci runRSmoothingFunctions()
<<< End of stack tracer.
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: .local
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: conn
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: statement
saveDataFrameToDatabaseTable() ERROR: execute JDBC update query failed in dbSendUpdate (The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 4 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.) ; call stack: ...
saveDataFrameToDatabaseTable finishing
.Primitive("return")
----------------- End of Statistics failure details -------------------
07:54:05.516 [pool-7-thread-1] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.rif.services.datastorage.common.CommonStudyStateManager]:
CommonStudyStateManager updateStudyStatus: study_id: 174; state: F; statusMessage: The study results have been created, but there were errors in the statistics processing.
07:54:05.568 [pool-7-thread-1] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.rif.services.datastorage.common.CommonStudyStateManager]:
CommonStudyStateManager: study_id: 174; COMMIT
07:54:05.569 [pool-7-thread-1] INFO org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.rif.services.datastorage.common.RunStudyThread]:
run() FAILED: null
07:54:05.569 [pool-7-thread-1] ERROR org.sahsu.rif.generic.util.RIFLogger : [org.sahsu.rif.generic.system.RIFServiceException]:
1 error(s). Error code is 'null'. Message list is: 'Status is 'The study results have been created, but there were errors in the statistics processing.'; error study state is 'STUDY_RESULTS_RFAILURE''
=============================================
Stack trace of cause follows
=============================================
org.sahsu.rif.services.datastorage.common.StatisticsProcessing.performStep(StatisticsProcessing.java:159)
org.sahsu.rif.services.datastorage.common.RunStudyThread.smoothResults(RunStudyThread.java:247)
org.sahsu.rif.services.datastorage.common.RunStudyThread.run(RunStudyThread.java:107)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
Hi Peter,
On the logging: I copied the existing log configuration (not the class), and on my system the statistics service logs to a separate file. The only problem is that it's a second instance of RIF_middleware*.log, so the naming could be better. I agree that the whole area needs an overhaul, but I didn't want to touch that as part of this change, since this is quite big already. We have a separate story, #86, for handling all that.
On SQL Server: I really need to set up an environment where I can test with it. Maybe I can get that done as part of my installation-process-improvement work. For now, I'm surprised to see "execute JDBC update" in the above log. Shouldn't it be using ODBC? That could perhaps be the problem: something I've done by moving the R to a new service might have disrupted the JDBC/ODBC decision. I'll investigate.
But you suggest that "Probably a piece of SQL Server specific code has been missed in the migration." Do you mean the migration of code to the new service? I didn't knowingly touch any SQL in that. I moved the R scripts, and wrote new Java. There is SQL in the R scripts, but not in anything I touched.
Would you be able to investigate this a bit more?
Actually, the problem is that SQL Server is running the JDBC code and hitting the known MARS (array INSERT/UPDATE) bug. I will look at this further. Also, all the logging is going into one file, which will need to be changed to avoid log4j logrotate issues.
odbcDataSource=SQLServer13
db_driver_prefix=jdbc:sqlserver
db_host=localhost
db_port=1433
db_name=sahsuland
db_driver_class_name=com.microsoft.sqlserver.jdbc.SQLServerDriver
databaseType=ms
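For reference, the TDS error in the trace above (`Parameter 4 (""): The supplied value is not a valid instance of data type float`) is what happens when an empty string is bound into a numeric column. The `Replace INF will NA` / `Replace "" will NA` lines show the scripts already attempt this kind of cleanup; a hedged sketch of the idea (not the project's actual code, and the function name is hypothetical) might be:

```r
# Hypothetical sketch: coerce the sentinel values the log mentions --
# Inf, NaN and empty strings -- to NA before the data frame is written
# with dbWriteTable(), so no "" is ever bound to a float parameter.
sanitiseForDb <- function(df) {
  for (col in names(df)) {
    if (is.numeric(df[[col]])) {
      df[[col]][!is.finite(df[[col]])] <- NA   # Inf, -Inf, NaN -> NA
    } else if (is.character(df[[col]])) {
      df[[col]][df[[col]] == ""] <- NA         # "" -> NA
    }
  }
  df
}
```

Even with that cleanup in place, the failure here suggests the batched (array) INSERT path in dbSendUpdate is still sending a value SQL Server rejects, which fits the MARS bug diagnosis above.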
I'd come to the same conclusion. I was loading both the ODBC and JDBC scripts all the time. This change fixes that, so please give it another try.
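The shape of that fix is roughly the following, as a hedged sketch: source exactly one handler script based on the `databaseType` parameter seen in the R parameter dump above (`ms` for SQL Server, Postgres otherwise). The function name and the ODBC handler filename here are assumptions; only JdbcHandler.R is visible in the stack trace.

```r
# Hypothetical sketch: load one database handler, not both.
# "OdbcHandler.R" is an assumed name; JdbcHandler.R appears in the trace.
loadDbHandler <- function(databaseType, scriptDir) {
  handler <- if (databaseType == "ms") {
    "OdbcHandler.R"   # SQL Server via the odbcDataSource DSN
  } else {
    "JdbcHandler.R"   # Postgres via the JDBC URL
  }
  source(file.path(scriptDir, handler))
}
```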
As to the logging: strange. I get two files: RIF_middleware.2018-11-22-1.log has the rifServices details, and RIF_middleware.log-2018-11-22-1.log has the statsService. The naming is weird, but they are separated.
But why will there be log rotation issues if both services log to the same file?
OK, so I think this is now fine to merge. I have added a separate statistics logger with a new log4j.xml and an abstract CommonLogger class. Logging is now fine, and all the services will auto-restart except for the statistics service: it does restart, but rifServices then hangs. The same hang is seen if you try to auto-restart all the services at once (e.g. the make install target). I suspect this is a threading bug in log4j.
Postgres and SQL Server now both OK.
This change introduces a new service, the Statistics Service. It comes in the form of a new module, statsService, which produces a new WAR file, statistics.war. You shouldn't have to do anything special to run it, except make sure the new WAR file is deployed to Tomcat along with the others.
With this in place, everything runs the same as before, but you can hot-deploy rifServices.war without it causing problems. You can't, however, safely hot-deploy this new WAR file, for exactly the same reason as the original problem, not surprisingly. I remain convinced that the root problem is in JRI, but it might be inherent to the fact that it loads a native shared library (DLL or similar). I experimented with ways to ensure that the native library gets unloaded on shutdown, but without success so far.
For now I've just documented the fact that, should you need to redeploy statistics.war, you will have to restart Tomcat. It's not something that should happen very often anyway. This closes #111 and also #87.