nih-cfde / training-and-engagement

Materials for the Training and Engagement Website
https://training.nih-cfde.org/

Acceptance testing of Epic 1 and 2 #326

Closed ACharbonneau closed 3 years ago

ACharbonneau commented 3 years ago

This testing has two big pieces:

We are doing all of our testing in Staging.


New functionality Tests:

Quick tests:

- Home page
- Dashboard
- DCC Review
- Registry
- Navbar
- Link to QA screens

Use Cases:


Previous functionality Tests*:

Use cases

We should be working to put these into something like Selenium for future tests
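To make that concrete, the quick tests above could eventually become a Selenium script along these lines. This is only an illustrative sketch: the link labels, browser choice, and function names are assumptions, not the portal's real values.

```python
# Hypothetical sketch of turning the manual "Quick Tests" into a Selenium run.
# The link names below are assumptions, not the portal's real navbar labels.
QUICK_TEST_LINKS = ["Home page", "Dashboard", "DCC Review", "Registry"]

def run_quick_tests(base_url: str) -> dict:
    """Visit the staging site and record whether each navbar link is present."""
    # Imported lazily so the checklist itself can be inspected without a browser.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    results = {}
    try:
        driver.get(base_url)
        for name in QUICK_TEST_LINKS:
            # find_elements returns an empty list (rather than raising)
            # when the link is absent.
            found = driver.find_elements(By.LINK_TEXT, name)
            results[name] = "present" if found else "missing"
    finally:
        driver.quit()
    return results
```

Each manual checkbox then becomes a repeatable assertion, which is what moving these into Selenium would buy us.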

Process for acceptance testing

I would like each use case checked by at least two people. Preferably with a mix of browsers and OS, so that we have a better chance of spotting potential bugs.

  1. Choose a use case that you will validate
  2. Copy the text from the next comment into a new document
  3. Follow the use case, filling out the document as you go
  4. If you encounter one of the Quick Tests, check that it is right and check it off. If you don't encounter it as part of your use case, leave it blank
  5. When you are done with your use case, post your filled form as a comment in this thread
ACharbonneau commented 3 years ago

Use Case Testing form

Reminders: Test on staging! Test in incognito browser!!

Reviewer:
Date of review:
Time of review (with timezone!!!):
OS (including version):
Browser (including version):
Use case:
Review type:
Role groups:

Use Case Test

Instructions

I would like each use case checked by at least two people, preferably with a mix of browsers and OS, so that we have a better chance of spotting potential bugs.

1. Choose a use case that you will validate
2. Copy the text from the next comment into a new document
3. Follow the use case, filling out the document as you go
4. If you encounter one of the Quick Tests, check that it is right and check it off. If you don't encounter it as part of your use case, leave it blank. If it doesn't work, add some text explaining the problem
5. When you are done with your use case, post your filled form as a comment in this thread

I recommend starting by looking at the Quick Tests section and seeing which ones will be part of your use case, so you can check them as you go instead of backtracking at the end.

Use Case Description

1. Evaluate the description.

2. Try to complete the steps as they are described for the persona in the use case.

Instructions

For each step record:

- the specific action you took, e.g. "I clicked on 'leg' in the 'anatomy' filter at [this web address]()"
- whether that action was possible/worked
- whether the *results* of that action are as described, and if they are not, how they differ
- any other comments you have, or things you were surprised about. Be specific!

Copy the lines below as many times as needed for your use case.

Action:

Action:

Testing user permissions functionality

If you are a Reviewer or Submitter, please try the following and document with screenshots (including computer clock):

Tasks for this use case:

  1. Based on the description you walked through, does this list of tasks make sense? If not, why not? Are there missing tasks? Unused tasks? Task descriptions that don't quite match the workflow? Be specific about which tasks have problems and what those problems are.

  2. OPTIONAL (if not already addressed above): Check whether each general task works, regardless of whether the specific instance described in the description works.

Instructions

For each task record:

- the specific action you took, e.g. "I clicked on 'leg' in the 'anatomy' filter at [this web address]()". Note that tasks are generally broader than the description, so you will likely need more than one action to test each
- whether that action was possible/worked, i.e. was it technically possible to do?
- whether the *results* of that action are what you expect, i.e. did it 'work' in the way a user would want, and if not, how they differ

Copy the lines below as many times as needed for your use case.

Action:

Action:

Requirements for this use case:

  1. Based on the description you walked through and its tasks, does this list of requirements make sense? If not, why not? Are there things you needed that are not listed as requirements? Unused requirements? Requirement descriptions that don't quite match the workflow? Be specific about which requirements have problems and what those problems are.

  2. OPTIONAL (if not already addressed above): Check whether each requirement works, if possible, regardless of whether the specific instance described in the description works.

Instructions

For each requirement record:

- the specific action you took, e.g. "I clicked on 'leg' in the 'anatomy' filter at [this web address]()". Note that requirements are very broad, so you may need to do more than one action to test each
- if you can't find a way to test the requirement, record that and why
- whether that action was possible/worked, i.e. was it technically possible to do?
- whether the *results* of that action are what you expect, i.e. did it 'work' in the way a user would want, and if not, how they differ

Action:

Action:

Overall

What difficulties did you encounter while completing your use case?

Did you see any spelling, grammar or similar mistakes on any resource you visited in completing your use case?

What other comments or questions do you have about your use case?

What other comments or questions do you have about any of the resources you visited?

What feedback do you have about this form/testing process?

Quick Tests

Complete test if it is encountered as part of your use case.

Link to QA screens for reference

- Home page
- Dashboard
- DCC Review
- Registry
- Navbar

marisalim commented 3 years ago

I'll be adding my reviews to this comment:

| Use case | Review |
| --- | --- |
| 1 | https://hackmd.io/QXwwSjowTVuykHoG9vyu9A?edit (now updated for review on staging) |
| 2 | https://hackmd.io/zIYOcnMeRcygH8l8mbx5OA?view (now updated for review on staging) |
| 8 | https://hackmd.io/klIOHOxGQK6n93JhhDog8g?view |
| 9 | https://hackmd.io/k6GupdmjQy-stCxn8ttuzw?edit |
ACharbonneau commented 3 years ago

Sorry, testing use case 8, 9 and 10 is going to be slightly more complicated than I thought. I need you to have different permissions for each one. So, below is a table of what your role will be for each use case, I need you to do them one at a time, and notify me so I can change your permissions before you start the next one. I also will need you to tell me when you're done so I can remove your permissions, because you don't actually work for any of these DCCs. @nih-cfde/training

When I give you permissions you will get an email, which you will have to open and accept the invite for before you have them. You will also need to refresh your portal session (log out, log back in) for the settings to take effect. You can check which role you have currently by logging into the portal, clicking your name in the top right corner and clicking "My profile". Roles are called "Groups". Be sure you are using the correct role for your test!! I have made a new line in the review form for you to put in your role(s)/group(s), be sure to actually check your log in to fill it and not just assume you have the right one.

| Use case | Role | DCC | Tester |
| --- | --- | --- | --- |
| 8 | Submitter | KF | Abhijna |
| 8 | Reviewer | GTEx | Marisa |
| 9 | Submitter | LINCS | Marisa |
| 9 | Reviewer | 4D | Saranya |
| 10 | Approver | HMP | Saranya |
| 10 | Approver | Metabolomics | Jose |
ACharbonneau commented 3 years ago

@marisalim It looks like you didn't test staging? That's where all the changes are. Sorry, but I'll need you to go through it again on the right server.

Everyone else @nih-cfde/training be sure you're looking at STAGING

ACharbonneau commented 3 years ago

Other notes that will hopefully help others (Thanks to @marisalim for being guinea pig and showing me the places that need clarification 🥇 )

> did not encounter these / know what color palette/wireframe is the updated one

In both the form and the first comment there is a link called "Link to QA screens for reference", which shows all of the wireframes the quick tests are talking about: the new button styles, etc. It does not show the new color palette, for long, uninteresting reasons, but the new palette is: [image]

The way I got the tables, it wasn’t necessary to be logged in to the portal.

Whenever you do log into the portal, I need you to note your group access. I added a new line in the form for this, I had not even considered the log in aspect yet.

josesanchez1815 commented 3 years ago

@ACharbonneau Hi, I would like to test the number 10 use case

josesanchez1815 commented 3 years ago

I'll be adding my reviews to these comments:

| Use case | Review |
| --- | --- |
| 1 | https://hackmd.io/src-seiVTwaNcgcGXmuizQ |
| 6 | https://hackmd.io/jPIdMIcfTn2Z28ayw3fSmA |
| 10 | https://hackmd.io/7CM9p-kMTyi8ts3JPflhxw |
josesanchez1815 commented 3 years ago

Amanda "Re: acceptance testing. Can all of you start with the previous functionality use cases and tell me when you have completed yours. Then I will give you the extra permissions to try the new functionality. I don't want you to test public stuff as admins I need it tested as the public."

marisalim commented 3 years ago

@ACharbonneau I've completed my review for UC8, ready to review UC9!

s-canchi commented 3 years ago

My individual use case reviews:

| Use case | Use case type | Review |
| --- | --- | --- |
| uc-0002 | Previous Functionality Test | https://hackmd.io/Tc5x7u8NRreu-MF2nfZ0Mw?view |
| uc-0007 | Previous Functionality Test | https://hackmd.io/Ihvvtdk8TkirRLYLuqOBjA?view |
| uc-0009 | New Functionality Test | https://hackmd.io/ylGbQtQgSVGfTo3kSF0rqA?view |
| uc-0010 | New Functionality Test | https://hackmd.io/QzI8jkg4QdG8XwhOL2MElw?view |
marisalim commented 3 years ago

@s-canchi @abhijna @josesanchez1815 I added a new section to the review form (also below) to help test the portal by breaking things! 🔨

Testing user permissions functionality

If you are a Reviewer or Submitter, please try the following and document steps with screenshots (including computer clock):

abhijna commented 3 years ago
| Use case | Review |
| --- | --- |
| uc-0008 | https://hackmd.io/vQCAThVbScaInbhXfVpaAQ?view |
| uc-0007 | https://hackmd.io/iQPIEsm9SkOs6NBGtxyAVA?view |
| uc-0006 | https://hackmd.io/EOkBXwktQAuPrhtyYP7k8A?view |
ACharbonneau commented 3 years ago

Please be sure you are doing your portal testing with permissions in an incognito browser. We’re having some problems with changed permissions not updating properly.

Can you all please include the clock in screenshots for testing? It helps me show that problems aren’t due to system time-out or doing steps in the wrong order.

ACharbonneau commented 3 years ago

- Reviewers: read only; can look but not change things
- Submitters: can use the submission tool and see things, but not change submission status
- Approvers: can’t submit, but can change existing submission statuses

If you notice that you can do something you don't think you should be able to based on those descriptions, please document it
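The three roles amount to a simple permission matrix. A minimal sketch of that matrix, with role and action names of my own choosing rather than the portal's actual API:

```python
# Minimal sketch of the role/permission matrix described above.
# Role and action names are illustrative, not the portal's actual API.
PERMISSIONS = {
    "Reviewer":  {"view"},
    "Submitter": {"view", "submit"},
    "Approver":  {"view", "change_status"},
}

def can(role: str, action: str) -> bool:
    """True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())
```

A tester who finds the real portal behaving as if `can("Reviewer", "submit")` were true has found exactly the kind of permission leak worth documenting.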

ACharbonneau commented 3 years ago

Hey @nih-cfde/training I'm reviewing your reviews as I see them appear, and you are finding some really useful things. But, I need you to explore the errors you find and document them a lot more. I have given you use cases as a guide so you have some specific steps to do, but we are testing this system before we send it to the public, and we are testing the whole thing. When you notice something is broken, we need to figure out how much is broken so we can have someone fix it.

I'm going to keep pulling out examples of things as I see them to get everyone on the same page about what we're trying to accomplish here. We're testing a bunch of new things, and I don't know upfront what's broken, so I can't write an all-knowing guide to how you should proceed with testing; it's really a matter of exploration and documentation.

A really good example I just found is in Saranya's Use Case 2:

[screenshot]

This is a great observation! There are clear instructions and a screenshot, and that's great. But when you find something like this, I need you to stop and explore it. Does it happen just once, or is it repeatable? Is this the only facet that does that? Test all of them, or at least test several of them and record which ones work as expected and which don't, so I can write a bug report. We're all busy, and I know that this takes a lot of time, but having them fix just this one facet when in fact two of them do this is not very useful for our end users. I just went and tested all the facets to find this, but I won't be able to go double-check every issue we find. Think about video game testers: they don't just play the game, they spend a lot of time running into walls and falling into pits, trying to make sure that walls actually stop you and pits are actually holes.

This extra information about a problem you encounter is exactly the sort of extra info I was hoping for in the "What difficulties did you encounter while completing your use case?" and "What other comments or questions do you have about any of the resources you visited?" sections. If you have suggestions about the form, and how to make it or its wording better for doing actual testing, please add them to your form or edit the form above.

ACharbonneau commented 3 years ago

Other things to add to this process:

Well that’s annoying. This is your notification, I guess :slightly_smiling_face: If you have a review without notes from me, but where you found problems, please tell me. I think I got them all, but I might have missed some.

We should probably put all the reviews into a spreadsheet or something where we can more easily track, for next time, which ones are done, have been checked, and have issues submitted.

It would also be good if the two people reviewing worked together/combined their reviews into one, where any problems were replicated by the other person.

s-canchi commented 3 years ago

I have updated my test forms to reply to your comments. The weird thing is that many of the bugs that we encountered during testing were fixed, which is making replicating the issues hard. Before the next iteration of tests we want to:

* refine the use case descriptions to eliminate some of the guesswork involved (and multiple interpretations)
* get selenium tests to work to ensure the focus of manual testing remains on details rather than overall function

ACharbonneau commented 3 years ago

The only things that were updated on the portal were the permissions for the edit button that I asked ya'll to retest as they fixed it, and they did the restart yesterday to fix a permissions issue from my use case. But, they aren't actively changing things or fixing bugs as you test. I'm not sure why there were so many things ya'll reported that you couldn't replicate, but it's not that they're changing things.

ctb commented 3 years ago

Hi @s-canchi and others,

> I have updated my test forms to reply to your comments. The weird thing is that many of the bugs that we encountered during testing were fixed, which is making replicating the issues hard. Before the next iteration of tests we want to:

> * refine the use case descriptions to eliminate some of the guesswork involved (and multiple interpretations)

This shifts a lot of the burden to the person writing the use case. I would suggest instead documenting what guesses you had to make, and what your alternative guesses were, when you were doing the testing. This would then help refine the use cases and/or generate new use cases, which is quite valuable.

I also note that overly precise use cases can be problematic. We're looking for some level of generality here, and "reasonable" behavior - as defined by "I thought it might work this way, and it didn't" - can be very hard to nail down in advance.

> * get selenium tests to work to ensure the focus of manual testing remains on details rather than overall function

+1

Another thought - if you're having trouble reproducing, you can maybe try doing screen recording? I've had good luck just using zoom -> recording in the cloud, so that it's there to back me up when I try to replicate problems.

josesanchez1815 commented 3 years ago

May I have the special permissions I need for use case 10? I have completed use cases 1 and 6! @ACharbonneau

10 Approver Metabolomics Jose
ACharbonneau commented 3 years ago

Since hackmd doesn't notify, perhaps it would be useful if the review templates were more like a guide to what to do, and issues actually went into a spreadsheet or similar, one problem per row. Then we could make sure that each one was checked and is a real issue, I could track which ones I've submitted as issues and mark them as they're fixed, and which ones need more info.
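The one-problem-per-row idea could be as simple as a shared CSV. A small sketch, where the column names are my own guesses at what such a tracker might record:

```python
import csv
import io

# Column names are assumptions about what a problem tracker might record.
FIELDS = ["use_case", "problem", "reported_by", "replicated_by", "issue_link", "status"]

def add_problem_row(buffer, row: dict) -> None:
    """Append one problem as one row, so each can be checked and tracked."""
    writer = csv.DictWriter(buffer, fieldnames=FIELDS)
    writer.writerow(row)

# Build an example tracker with a header and one (hypothetical) problem.
buffer = io.StringIO()
csv.DictWriter(buffer, fieldnames=FIELDS).writeheader()
add_problem_row(buffer, {
    "use_case": "uc-0002",
    "problem": "facet filter returns no results",
    "reported_by": "Saranya",
    "replicated_by": "",
    "issue_link": "",
    "status": "needs replication",
})
```

Columns like `replicated_by` and `issue_link` would make it easy to see at a glance which problems have been double-checked and which have been filed upstream.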

ctb commented 3 years ago

On Tue, Feb 23, 2021 at 08:48:39AM -0800, Amanda Charbonneau wrote:

> Since hackmd doesn't notify, perhaps it would be useful if the review templates were more like a guide to what to do, and issues actually went into a spreadsheet or similar, one problem per row. Then we could make sure that each one was checked and is a real issue, I could track which ones I've submitted as issues and mark them as they're fixed, and which ones need more info.

notifications CAN happen -

https://hackmd.io/settings#notification

josesanchez1815 commented 3 years ago

@ACharbonneau I can start the spreadsheet if you would like!

ACharbonneau commented 3 years ago

Nope. Like I said at the meeting last week, I want ya'll to do a retrospective of how this review process went and to decide how you want to do it next time. There is no reason to make new resources before you've decided what you're going to do.

ACharbonneau commented 3 years ago

We tested all the things! Issues for broken things were created in various repos with the label "testing" and either fixed or put in the backlog.

@nih-cfde/training if they don't already exist someone please make issues for:

and then close this issue.

thanks

marisalim commented 3 years ago

have made issues for the things listed in https://github.com/nih-cfde/training-and-engagement/issues/326#issuecomment-790006014

closing this issue now!