dams-mcda / Dams-MCDA

Emma Fox R/Shiny project with a Docker server configuration

Converting Matlab logic to R #90

Closed sythel closed 4 years ago

sythel commented 5 years ago

If successful, this closes issue #89; if it fails, remove the on-hold label from issue #89.

elbfox commented 4 years ago

WSMUpdate_Dams branch started to address these changes. Outline of tasks:

Retrieve criteria scores for each dam (referred to as DamsDataMatrix) for each MCDA scenario (from the server?): a 3D matrix [dams, criteria, alternatives]

Normalization procedure:

Rank:

Retrieve table, map of highest ranked scenario:

elbfox commented 4 years ago

For retrieval of the 3D matrix [dams, criteria, alternatives], I created a hyperframe using cbind() to pull specific 2D matrices [dams, criteria] together under each of the 5 decision alternatives. I may have to break this step up by decision alternative to normalize before binding into the 3D matrix.

elbfox commented 4 years ago

A hyperframe (each cell entry a matrix) is apparently NOT the same as an array (an actual 3D matrix), so I switched to abind(), specifying force.array=TRUE with along = 3, to get the 3D matrix we discussed with Sharon. I'm working on normalizing within the 3D matrix, because I'm not sure pulling it apart is the right call here.
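For reference, a minimal sketch of that abind() step. The object names and dimensions here are made up (8 dams × 14 criteria, matching the data posted further down, but only two alternatives for brevity):

```r
library(abind)  # install.packages("abind") if needed

# hypothetical per-alternative score matrices [dams, criteria]
alt1 <- matrix(runif(8 * 14), nrow = 8, ncol = 14)
alt2 <- matrix(runif(8 * 14), nrow = 8, ncol = 14)

# stack the 2D matrices into one 3D array [dams, criteria, alternatives]
DamsDataMatrix <- abind(alt1, alt2, along = 3, force.array = TRUE)
dim(DamsDataMatrix)  # 8 14 2
```

With all five alternatives bound, the third dimension would be 5 instead of 2.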

elbfox commented 4 years ago

Normalization and weighting debugging help needed in the WSMUpdate_Dams branch if at all possible, @sythel. I'm using a test 3D array.

I haven't gotten to the Matlab scenario-ranking stuff yet, @samGroy. I have to get the WSM function working in three dimensions before I start translating from Matlab to R.

sythel commented 4 years ago

@elbfox, regarding the normalization procedure below: how are we dealing with the alternatives (3rd dimension) part of DamsDataMatrix?

Normalization procedure: get the maximum and minimum criteria score for each criterion and each dam, producing two 2D matrices [dams, max/min criteria]

for positive scores: norm = (f - f_min) / (f_max - f_min)
for negative scores (like cost): norm = (f - f_max) / (f_min - f_max)
the result is a 3D matrix with dam-specific criteria scores normalized by the min and max criteria sampled over all alternatives

sythel commented 4 years ago

It would be nice to have a data example of each step (i.e., the 2D/3D matrix). This would make it easier to verify that any progress I make is correct.

sythel commented 4 years ago

DamsDataMatrix in my working branch as of now: (levels -> preference)

level 1
0, 0, 0.04, 1229307, 2, 0, 98100000, 3, 2, 3.7, 2.9, 2.4, 1, 1
0, 0, 0, 406400, 1, 0, 37700000, 7, 2.1, 3.3, 2.9, 2.5, 1, 1
0, 0, 0.01, 1657029, 1, 0, 203300000, 4, 2, 3.3, 3, 2.4, 1, 1
0, 0, 0.05, 1000, 2, 0, 0, 1, 1, 1, 1, 1, 1, 1
0, 0, 0.03, 402800, 2, 0, 47300000, 8, 1, 1, 1, 1, 1, 1
0, 0, 1.7, 747000, 3, 0, 2.34e+08, 5, 1, 1, 1, 1, 1, 1
0, 0, 0, 948973, 3, 5, 73200000, 2, 1, 1, 1, 1, 1, 1
0, 0, 0, 246000, 3, 0, 28118000, 6, 1, 1, 1, 1, 1, 1
level 2
0, 0, 0.04, 1229307, 2, 0, 98100000, 18268, 2.7, 3.5, 3.5, 3.3, 1, 1
0, 0, 0, 1897182, 1, 0, 59789504.04, 11134, 2.7, 3.3, 3.4, 3, 1, 1
0, 0, 0.01, 1657029, 1, 0, 203300000, 37859, 2.6, 3.4, 3.5, 3.2, 1, 1
0, 0, 0.05, 101877, 2, 0, 730000, 136, 1, 1, 1, 1, 1, 1
0, 0, 0.03, 1880377, 2, 0, 74627079.75, 13897, 1, 1, 1, 1, 1, 1
0, 0, 1.7, 3487193, 3, 0, 280575539.6, 52249, 1, 1, 1, 1, 1, 1
0, 0, 0, 948973, 3, 5, 73200000, 13631, 1, 1, 1, 1, 1, 1
0, 0, 0, 1148393, 3, 0, 48225639.53, 8981, 1, 1, 1, 1, 1, 1
level 3
0, 0, 0.04, 1414530, 2, 0, 98100000, 3, 4, 3.8, 4.1, 3.9, 1, 1
0, 0, 0, 470743, 1, 0, 37700000, 7, 3.9, 3.5, 3.7, 3.9, 1, 1
0, 0, 0.01, 1969831, 1, 0, 203300000, 4, 3.6, 3.3, 3.9, 3.7, 1, 1
0, 0, 0.05, 6646, 2, 0, 0, 1, 1, 1, 1, 1, 1, 1
0, 0, 0.03, 467463, 2, 0, 47300000, 8, 1, 1, 1, 1, 1, 1
0, 0, 1.7, 1072415, 3, 0, 2.34e+08, 5, 1, 1, 1, 1, 1, 1
0, 0, 0, 1066577, 3, 5, 73200000, 2, 1, 1, 1, 1, 1, 1
0, 0, 0, 278819, 3, 0, 28118000, 6, 1, 1, 1, 1, 1, 1
level 4
0, 0, 0.04, 1414530, 2, 0, 98100000, 18268, 3.9, 3.8, 3.9, 4.1, 1, 1
0, 0, 0, 1961525, 1, 0, 59789504.04, 11134, 3.6, 3.7, 3.7, 3.7, 1, 1
0, 0, 0.01, 1969831, 1, 0, 203300000, 37859, 3.6, 3.5, 3.9, 3.7, 1, 1
0, 0, 0.05, 107523, 2, 0, 730000, 136, 1, 1, 1, 1, 1, 1
0, 0, 0.03, 1945040, 2, 0, 74627079.75, 13897, 1, 1, 1, 1, 1, 1
0, 0, 1.7, 3812608, 3, 0, 280575539.6, 52249, 1, 1, 1, 1, 1, 1
0, 0, 0, 1066577, 3, 5, 73200000, 13631, 1, 1, 1, 1, 1, 1
0, 0, 0, 1181212, 3, 0, 48225639.53, 8981, 1, 1, 1, 1, 1, 1
level 5
0, 0, 0.04, 319433, 2, 0, 0, 0, 3.7, 4, 2.6, 3.3, 1, 1
0, 0, 0, 168458, 1, 0, 0, 0, 3.6, 1.8, 2.3, 3.6, 1, 1
0, 0, 0.01, 215152, 1, 0, 0, 0, 3.6, 1.8, 2.5, 3.5, 1, 1
0, 0, 0.05, 61843, 2, 0, 0, 0, 1, 1, 1, 1, 1, 1
0, 0, 0.03, 212088, 2, 0, 0, 0, 1, 1, 1, 1, 1, 1
0, 0, 1.7, 723720, 3, 0, 0, 0, 1, 1, 1, 1, 1, 1
0, 0, 0, 179289, 3, 5, 0, 0, 1, 1, 1, 1, 1, 1
0, 0, 0, 160151, 3, 0, 0, 0, 1, 1, 1, 1, 1, 1
elbfox commented 4 years ago

@elbfox, regarding the normalization procedure below: how are we dealing with the alternatives (3rd dimension) part of DamsDataMatrix?

Normalization procedure: get the maximum and minimum criteria score for each criterion and each dam, producing two 2D matrices [dams, max/min criteria]

for positive scores: norm = (f - f_min) / (f_max - f_min)
for negative scores (like cost): norm = (f - f_max) / (f_min - f_max)
the result is a 3D matrix with dam-specific criteria scores normalized by the min and max criteria sampled over all alternatives

The normalization is supposed to apply for each criterion across the set of alternatives for a single dam (2 dimensions: criteria, alternatives) repeated for the whole set of dams. I'm not sure that this is yet clear in the code. I'll take a look at it tomorrow and add some examples of what I'm trying to do.
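A sketch of that per-dam normalization, under stated assumptions: the input array is [dams, criteria, alternatives], min and max are taken over the alternatives dimension for each dam/criterion pair, and negative_criteria is a hypothetical index vector marking cost-type columns. The function and variable names are mine, not the repo's:

```r
# toy [dams, criteria, alternatives] array: 2 dams, 2 criteria, 2 alternatives
DamsDataMatrix <- array(c(1, 2, 3, 4, 5, 6, 7, 8), dim = c(2, 2, 2))

normalize_dams <- function(x, negative_criteria = integer(0)) {
  # per dam/criterion min and max, taken over the alternatives (3rd) dimension
  f_min <- apply(x, c(1, 2), min, na.rm = TRUE)
  f_max <- apply(x, c(1, 2), max, na.rm = TRUE)
  rng <- f_max - f_min
  rng[rng == 0] <- 1  # guard against 0/0 when a score never varies

  norm <- x
  for (a in seq_len(dim(x)[3])) {
    norm[, , a] <- (x[, , a] - f_min) / rng
  }
  # flip cost-type criteria so that 1 is always "best"
  norm[, negative_criteria, ] <- 1 - norm[, negative_criteria, ]
  norm
}

NormMatrix <- normalize_dams(DamsDataMatrix)
NormMatrix[1, 1, ]  # 0 1 : dam 1 / criterion 1, scaled across its two alternatives
```

The flip for negative criteria, 1 - (f - f_min)/(f_max - f_min), is algebraically the same as (f_max - f)/(f_max - f_min), so the worst (highest) cost maps to 0 and the best to 1.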

samGroy commented 4 years ago

Hi Emma, the version of WSM I put on GitHub uses pre-normalized data, so there’s no need for that step, only multiplication by preferences and summing of scores. Also, the data/maps should be the final version for the dams and criteria you and Sharon selected.

Best Sam

On Aug 13, 2019, at 6:48 PM, Emma Fox notifications@github.com wrote:

@elbfox, regarding the normalization procedure below: how are we dealing with the alternatives (3rd dimension) part of DamsDataMatrix?

Normalization procedure: get the maximum and minimum criteria score for each criterion and each dam, producing two 2D matrices [dams, max/min criteria]

for positive scores: norm = (f - f_min) / (f_max - f_min)
for negative scores (like cost): norm = (f - f_max) / (f_min - f_max)
result is a 3D matrix with dam-specific criteria scores normalized by the min and max criteria sampled over all alternatives

The normalization is supposed to apply for each criterion across the set of alternatives for a single dam (2 dimensions: criteria, alternatives), repeated for the whole set of dams. I'm not sure that this is yet clear in the code. I'll take a look at it tomorrow and add some examples of what I'm trying to do.


elbfox commented 4 years ago

Hi Sam, thanks! All good on the maps; I'm just working through the preferences/score-sum piece. We needed the raw data there too, so I'm still working out where to fit in what you shared, because the semantics are a little different from what I'm used to in R. I'm really behind on all of this. Billy, the files Sam is referring to should be in the WSMUpdate_Dams branch. I'm going to work on this shortly.

elbfox commented 4 years ago

I think it's important that the normalization calculation is included in the WSM script if possible.

elbfox commented 4 years ago

Ok @sythel, starting with your DamsDataMatrix

1) First step: preference data from the slider bars = RawCriteriaMatrix. Example: using fake data I made up at the top of the WSM script. The criteria weights don't sum to 1 for each row here, but they will have to, based on the constraints put on user data entry via the slider bars.

RawCriteriaMatrix

, , 1

      [,1] [,2]  [,3]  [,4] [,5]  [,6] [,7]  [,8] [,9] [,10] [,11] [,12] [,13] [,14]
[1,] 0.000  0.1 0.050 0.000 0.05 0.000  0.2 0.000  0.1 0.050 0.000  0.05 0.000   0.2
[2,] 0.200  0.2 0.025 0.000 0.75 0.000  0.1 0.200  0.2 0.025 0.000  0.75 0.000   0.1
[3,] 0.050  0.0 0.050 0.000 0.20 0.000  0.1 0.050  0.0 0.050 0.000  0.20 0.000   0.1
[4,] 0.025  0.0 0.750 0.000 0.10 0.200  0.2 0.025  0.0 0.750 0.000  0.10 0.200   0.2
[5,] 0.050  0.0 0.200 0.000 0.10 0.050  0.0 0.050  0.0 0.200 0.000  0.10 0.050   0.0
[6,] 0.750  0.0 0.100 0.200 0.20 0.025  0.0 0.750  0.0 0.100 0.200  0.20 0.025   0.0
[7,] 0.200  0.0 0.100 0.050 0.00 0.050  0.0 0.200  0.0 0.100 0.050  0.00 0.050   0.0
[8,] 0.100  0.2 0.200 0.025 0.00 0.750  0.0 0.100  0.2 0.200 0.025  0.00 0.750   0.0

(slices , , 2 through , , 5 are identical to , , 1 in this fake example)

2) DamsDataMatrix normalization should normalize each criterion to the [0, 1] range using max/min values, as in:

    WSMMaxVector <- vector("list", matrix_cols)
    for (k in 1:matrix_cols) {
        WSMMaxVector[[k]] <- max(DamsDataMatrix[, k, ], na.rm = FALSE)
    }
    WSMMaxVector <- unlist(WSMMaxVector)

    WSMMinVector <- vector("list", matrix_cols)
    for (k in 1:matrix_cols) {
        WSMMinVector[[k]] <- min(DamsDataMatrix[, k, ], na.rm = FALSE)
    }
    WSMMinVector <- unlist(WSMMinVector)
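If helpful, the two loops can also be collapsed with apply(): with MARGIN = 2 on a 3D array, each call sees the [dams, alternatives] slice for one criterion. A self-contained sketch with a toy stand-in array:

```r
# toy [dams, criteria, alternatives] array standing in for DamsDataMatrix
DamsDataMatrix <- array(1:8, dim = c(2, 2, 2))

# per-criterion max/min over all dams and alternatives, same as the loops above
WSMMaxVector <- apply(DamsDataMatrix, 2, max, na.rm = FALSE)
WSMMinVector <- apply(DamsDataMatrix, 2, min, na.rm = FALSE)
WSMMaxVector  # 6 8
WSMMinVector  # 1 3
```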

(SEE ATTACHED SPREADSHEET: DamsDataNormalized.xlsx)

3) Prefs matrix * DamsDataNormalized

4) weighted sum calculation for each dam across all alternatives (8 x 2D matrices) ***this is the single-dam result; each 2D matrix is the result shown on the individual dam results tabs.

5) then normalize the alternatives to the [0, 1] range, likewise using min/max values (back to the 3D matrix)

6) weighted sum calculation across all dam/alternative 'scenarios' (3D) ***this is the multi-dam result, where the dam/alternative 'scenarios' are all considered and optimized for the "best" result.
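Steps 3–6 might be sketched roughly like this, assuming RawCriteriaMatrix and DamsDataNormalized share the [dams, criteria, alternatives] shape (8 × 14 × 5 here). Everything beyond those two names is made up for illustration:

```r
# toy stand-ins with the shapes discussed above: 8 dams, 14 criteria, 5 alternatives
set.seed(1)
DamsDataNormalized <- array(runif(8 * 14 * 5), dim = c(8, 14, 5))
RawCriteriaMatrix  <- array(runif(8 * 14 * 5), dim = c(8, 14, 5))

# step 3: elementwise preference weighting
Weighted <- RawCriteriaMatrix * DamsDataNormalized

# step 4: weighted sum over criteria -> one score per dam per alternative
ScoreMatrix <- apply(Weighted, c(1, 3), sum)   # [dams, alternatives]

# step 5: re-normalize each dam's scores across alternatives to [0, 1]
ScoreNorm <- t(apply(ScoreMatrix, 1, function(s) (s - min(s)) / (max(s) - min(s))))

# step 6: best alternative per dam, feeding the multi-dam scenario result
BestAlternative <- apply(ScoreNorm, 1, which.max)
```

This only picks the per-dam winner; the full multi-dam optimization across scenario combinations would build on ScoreNorm.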

elbfox commented 4 years ago

OK, I'm somewhat confident that I actually have this all sorted out in the SamUpdate_WSM branch now. @sythel, I will make a pull request and assign you for review.

Up next: I will spend some time this weekend adjusting the existing graph-related issues to reflect the variable names I'm using in the WSM function so we can get those working. The function dynamically identifies which map we need to show on the multi-dam result page based on the multi-dam ranking (thanks @samGroy!!), but I'm not sure yet how to actually grab the map from the maps folder in a dynamic way.
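For the dynamic map lookup, one possible approach is to build the file path from the ranking result. The file-naming convention, score vector, and output ID below are all hypothetical, not the repo's actual names:

```r
# hypothetical scores from the multi-dam ranking; which.max() picks the winner
multi_dam_scores <- c(0.42, 0.77, 0.51, 0.63, 0.38)
best_scenario <- which.max(multi_dam_scores)

# assumed naming convention for the files in the maps folder
map_file <- file.path("maps", paste0("scenario_", best_scenario, ".png"))
map_file  # "maps/scenario_2.png"

# inside the Shiny server function, the image could then be served with:
# output$multiDamMap <- renderImage(
#   list(src = map_file, contentType = "image/png"),
#   deleteFile = FALSE
# )
```

deleteFile = FALSE matters here, since the maps are static assets that should survive rendering.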