nsidc / usaon-benefit-tool

Application for configuring USAON Benefit Tool value tree analysis surveys and gathering input from respondents
https://usaon-benefit-tool.readthedocs.io/
MIT License

Add additional fields to "add link" page #295

Closed: hazelshapiro closed this issue 1 day ago

hazelshapiro commented 5 months ago

The "add link" page should include three text description boxes (and corresponding new fields on the Link model):

For later:

See glossary for more info

Sherwin-14 commented 4 months ago

@mfisher87 Is this still relevant? Do you have any specific opinions regarding this?

mfisher87 commented 4 months ago

Hey @Sherwin-14 :) This is still relevant. Looks like we currently only have the performance and criticality ratings, but we need to add rationale and the others:

https://github.com/nsidc/usaon-benefit-tool/blob/7b8cec64605372c451d55af817607c0e47c5bf13/usaon_benefit_tool/models/tables.py#L398-L413

I am updating the issue description to include what I know. I'm thinking we could create a constant (if there isn't already one) so we can update all the large text fields at once, e.g. DB_TEXT_SIZE_LARGE = 8196, and similar constants for the other common string lengths we use. That way it will be easier to change them if we get them wrong :)
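A minimal sketch of what that might look like. The module path, the non-large names, and their values are assumptions for illustration, not from the codebase; only DB_TEXT_SIZE_LARGE = 8196 comes from the suggestion above:

```python
# Hypothetical module, e.g. usaon_benefit_tool/constants.py (path is an assumption)

# Shared column-size constants so every text field of a given "size class"
# can be adjusted in one place if we get the limits wrong.
DB_TEXT_SIZE_LARGE = 8196    # long free-text fields, e.g. rationale/description
DB_TEXT_SIZE_MEDIUM = 512    # assumed value for medium-length strings
DB_TEXT_SIZE_SMALL = 128     # assumed value for short strings/names
```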

@hazelshapiro I'm adding some "TBD" markers to the issue description. Can you fill those in and also verify the character limits?

hazelshapiro commented 3 months ago

Character limits above look good.

Variable/attribute: from the glossary, "If an observing system or data product contains many observable properties or variables, this allows a respondent to specify which field they used." Logistically it would be a text field with a ~300-character limit.

Rated by: allow the value to be selected from a dropdown.
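
A rough sketch of how these two fields might look on the Link model, assuming SQLAlchemy 2.x typed declarative mappings. The column names, the dropdown options, and the standalone Base class are illustrative assumptions, not the repository's actual code:

```python
import enum

from sqlalchemy import Enum, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

DB_TEXT_SIZE_VARIABLE = 300  # the ~300-character limit mentioned above


class RatedBy(enum.Enum):
    """Hypothetical dropdown options; the real choices are TBD."""

    RESPONDENT = "respondent"
    ANALYST = "analyst"


class Base(DeclarativeBase):
    pass


class Link(Base):
    """Sketch showing only the proposed new columns; existing columns omitted."""

    __tablename__ = "link"

    id: Mapped[int] = mapped_column(primary_key=True)

    # Free-text field letting a respondent specify which observable
    # property/variable of the observing system or data product they used.
    variable_or_attribute: Mapped[str | None] = mapped_column(
        String(DB_TEXT_SIZE_VARIABLE)
    )

    # "Rated by": selected from a dropdown rather than entered as free text.
    rated_by: Mapped[RatedBy | None] = mapped_column(Enum(RatedBy))
```

If "rated by" instead becomes an assessment-level detail (see the next comment), the rated_by column would live on the assessment model rather than on Link.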

hazelshapiro commented 3 months ago

Another option would be to have 'rated by' as an assessment-level detail, rather than associated with each individual rating. This is linked to: https://github.com/nsidc/usaon-benefit-tool/milestone/

Let me confirm with Sandy, but I think this approach would simplify things and be sufficient. As a best practice, we would also add a description at the assessment level that records who was responsible for which piece of the rating. Let's remember to add a note to the decision record.

hazelshapiro commented 3 months ago

Moving this to the response object milestone since it has to do with how analysts enter data. That way this milestone can stay focused on projects that are mostly on my list.