githubtraining / using-github-actions-for-ci

Course repository for "GitHub Actions: Continuous Integration"
https://lab.github.com/githubtraining/github-actions:-continuous-integration
Creative Commons Attribution 4.0 International

Auditing Course Prior to Universe #21

Closed mattdavis0351 closed 4 years ago

mattdavis0351 commented 4 years ago

Using GitHub Actions for CI

I have broken things down into multiple categories and have tried to keep notes at every step of the way as I took this course. Please forgive me for not breaking these out into steps directly; sometimes it's hard to tell which Learning Lab step I am in.

| Category | Description | Icon |
| -------- | ----------- | ---- |
| Flow | Changes that I think would improve the learner's ability to navigate the course along our intended path | :droplet: |
| Concern | Something jumped out at me and I wanted to bring it up for discussion | :thinking: |
| Bug | Something broke while I was following the steps of the course as intended | :bug: |
| Vulnerability | Something broke because of an action I took that wasn't intended. I'm labeling this a vulnerability in the sense that the course is vulnerable to this type of action happening and breaks because of it. | :warning: |

So let's get started!


:droplet:

Severity: Low

Problem: After joining the course I have no real indication of what the first step is. The only thing I am presented with when I visit my repository is this README.md file.

I know from experience that there is either an issue or a pull request open for me, but it's unfair to expect our learners to have that experience.

Screen Shot 2019-11-06 at 12.36.03 PM

When I click the link on the course steps page I am taken, as expected, to the issue where I begin taking this course. ⬇️

Screen Shot 2019-11-06 at 12.39.47 PM

Suggestions: Add a link here in the README.md that also points the learner to the first issue.


:thinking:

Severity: Low

The desired name for the first pull request is `CI for Node`, and the check is strictly case sensitive. As you can see, if I use a lowercase version of the same name, `ci for node`, the validation step fails.

Screen Shot 2019-11-06 at 1.34.45 PM

Can we standardize the input we are collecting from the user to be less case sensitive? Consider how form input on a webpage might be collected and then transformed to all lowercase on the backend for logic consistency.

JavaScript can do this by calling the `toLowerCase()` method:

```js
const userInput = "My User Does SiLLy ThInGs";

const ourInput = userInput.toLowerCase();

console.log(ourInput);
```

output:

```
my user does silly things
```

Having something like this in place would allow the learner to make minor mistakes without impeding the flow of the course.

This could also speed up course progression, since we wouldn't always be waiting on a Learning Lab response explaining to the user that they didn't use the proper case when entering the text for something.
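To make this concrete, here is a minimal sketch of what a more forgiving title check could look like. The function name and the whitespace trimming are my assumptions, not the course's actual validation code:

```javascript
// Hypothetical sketch of a forgiving title check -- the function name and
// the trim() normalization are assumptions, not the course's actual code.
function titlesMatch(expectedTitle, actualTitle) {
  // Normalize case and surrounding whitespace before comparing
  return expectedTitle.trim().toLowerCase() === actualTitle.trim().toLowerCase();
}

console.log(titlesMatch("CI for Node", "ci for node")); // true
console.log(titlesMatch("CI for Node", "  Ci For NODE ")); // true
console.log(titlesMatch("CI for Node", "Node CI")); // false
```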


:bug:

Severity: Moderate (prevents course progress)

Problem: If the user names the first pull request incorrectly they can end up in an infinite loop of being told to name it correctly. This prevents the course from progressing.

Steps to reproduce:

  1. Continue naming the pull request incorrectly Screen Shot 2019-11-06 at 1.50.42 PM
  2. This will continue forever as long as the user provides invalid names

Suggestions:
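One possible mitigation, sketched under my own assumptions about the bot's state (the shape of `state` and the three-attempt limit are placeholders, not Learning Lab internals):

```javascript
// Hedged sketch: cap the number of "rename your pull request" reminders so
// the learner never gets trapped in an endless loop. The state shape and
// the three-attempt limit are illustrative assumptions.
const MAX_ATTEMPTS = 3;

function nextAction(state, titleIsCorrect) {
  if (titleIsCorrect) {
    return { action: "continue", attempts: state.attempts };
  }
  const attempts = state.attempts + 1;
  if (attempts >= MAX_ATTEMPTS) {
    // After repeated misses, do the rename for the learner instead of nagging again
    return { action: "rename-for-learner", attempts };
  }
  return { action: "ask-again", attempts };
}

console.log(nextAction({ attempts: 0 }, false).action); // "ask-again"
console.log(nextAction({ attempts: 2 }, false).action); // "rename-for-learner"
console.log(nextAction({ attempts: 2 }, true).action); // "continue"
```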


:bug:

Severity: Moderate (prevents course from progressing)

Problem: If the check_suite finishes before the bot has a chance to listen for the payload the course does not continue unless the user manually triggers the check_suite again.

Steps to reproduce:

  1. Name the first pull request incorrectly (see other bug :point_up:)
  2. Wait for the check_suite to finish running
  3. Name the first pull request to the correct value
  4. Once the bot finishes explaining the workflow file it responds with the following: Screen Shot 2019-11-06 at 2.16.14 PM
  5. Refresh the page as you are instructed to by the bot
  6. Refreshing does not trigger the check_suite again
  7. The course has officially stalled

Probable Cause:

Suggestions:
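One idea worth discussing: instead of asking the learner to refresh, the bot could re-request the check suite itself through the Checks API. A sketch, where the owner/repo/id values are placeholders:

```javascript
// Hedged sketch: the Checks API exposes
//   POST /repos/{owner}/{repo}/check-suites/{check_suite_id}/rerequest
// which re-runs a suite without any learner action. With @octokit/rest this
// is octokit.checks.rerequestSuite({ owner, repo, check_suite_id }).
// The helper below only builds the endpoint URL; all values are placeholders.
function rerequestUrl(owner, repo, checkSuiteId) {
  return `https://api.github.com/repos/${owner}/${repo}/check-suites/${checkSuiteId}/rerequest`;
}

console.log(rerequestUrl("githubtraining", "course-repo", 42));
```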


:warning:

Severity: High (prevents course progress, creates rework for the learner)

When following the steps outlined in the first issue:

It is entirely possible to break this course by selecting the incorrect workflow template.

Steps to reproduce:

  1. Navigate to the Actions tab
  2. Select any workflow other than the one intended, in this case I chose the Node.js Package template Screen Shot 2019-11-06 at 12.49.04 PM
  3. Commit the workflow to a new branch
  4. Create a pull request titled CI for Node
  5. The Learning Lab bot will comment on our pull request with the intended response, but will never continue, even after following the suggestion to refresh: Screen Shot 2019-11-06 at 1.05.09 PM
  6. The following error is generated by Learning Lab:
    HttpError: {"message":"Validation Failed","errors":[{"resource":"PullRequestReviewComment","code":"invalid","field":"path"}],"documentation_url":"https://developer.github.com/v3/pulls/comments/#create-a-comment"}
       at /app/node_modules/@octokit/rest/lib/request/request.js:72:19
       at processTicksAndRejections (internal/process/task_queues.js:93:5)
       at async Context.runActions (/app/lib/context.js:216:24)
       at async Course.runHandler (/app/lib/course.js:184:32)

Probable Cause: The config.yml for this course is expecting a very specific file path after the templated workflow is committed. When the wrong template is used, the expected filename changes and thus breaks at this step:

Screen Shot 2019-11-06 at 1.00.18 PM

As we can see, the file parameter is looking for `.github/workflows/nodejs.yml`. Because our learner selected a different template workflow (as seen in step 2 above), the actual file that exists is `.github/workflows/npmpublish.yml`.

Further Findings:

Suggestions:
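One direction we could explore: rather than hard-coding `.github/workflows/nodejs.yml` in the config.yml, look up whichever workflow file the learner actually committed and validate against that. A sketch, where `files` stands in for a directory listing of `.github/workflows` (however the bot obtains it, e.g. via the repository contents API):

```javascript
// Hedged sketch: find whichever workflow file the learner committed instead
// of assuming a fixed name. `files` is a placeholder for a listing of
// .github/workflows.
function findWorkflowFile(files) {
  return files.find((name) => name.endsWith(".yml") || name.endsWith(".yaml")) || null;
}

console.log(findWorkflowFile(["npmpublish.yml"])); // "npmpublish.yml"
console.log(findWorkflowFile(["nodejs.yml", "README.md"])); // "nodejs.yml"
console.log(findWorkflowFile([])); // null
```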


:bug:

Severity: Low (course progresses, feels untested and rushed)

Problem: In the CI for Node pull request the user is prompted to enter the name of a failing test to have the course progress. The course progresses regardless of what is typed.

No actual learning is reinforced.

Steps to reproduce:

  1. When prompted to enter the name of a failing test type something other than what is asked. See below: Screen Shot 2019-11-06 at 2.38.33 PM
  2. I typed the word bread and the course progressed anyway.

Suggestions:
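A sketch of what real validation could look like here, where `failingTests` stands in for test names parsed out of the CI output (the sample name below is a placeholder, not one of the course's actual tests):

```javascript
// Hedged sketch: check the learner's answer against the actual failing test
// names instead of accepting any text. `failingTests` is a placeholder for
// names parsed from the CI log.
function isFailingTest(answer, failingTests) {
  const normalized = answer.trim().toLowerCase();
  return failingTests.some((name) => name.toLowerCase() === normalized);
}

const failingTests = ["Initializes with two players"]; // placeholder test name
console.log(isFailingTest("bread", failingTests)); // false
console.log(isFailingTest("initializes with two players", failingTests)); // true
```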


:bug:

Severity: Low (can break progression, easily avoidable)

Problem: Continual bad changes to game.js keep the learner in an endless loop of identical responses.

Steps to reproduce:

  1. Ignore the suggested changes from the Learning Lab bot
  2. Change the value of this.p2 to anything you want
  3. Watch the bot respond with help
  4. Change the value of this.p2 to another improper value
  5. Watch the bot respond with help
  6. Continue this loop forever Screen Shot 2019-11-06 at 2.48.45 PM

Suggestions:


:thinking:

Severity: Low (not tested, just guessing)

Problem: I am guessing that the Improve CI pull request will suffer from all of the same pitfalls the CI for Node pull request did.


:warning:

Severity: High (course progresses when it shouldn't, eventually breaks)

Problem: When editing the nodejs.yml workflow file in the Improve CI pull request, any change made to the file triggers a 'success' style response from the bot.

Steps to reproduce:

  1. Ignore the suggested changes to .github/workflows/nodejs.yml
  2. Edit .github/workflows/nodejs.yml by making any change you wish to Screen Shot 2019-11-06 at 3.15.01 PM
  3. Commit the changes to the branch the PR was opened from
  4. Watch the bot respond with success style message
  5. Finally, accept the suggested changes, ignoring what is currently being asked of you.
  6. The course is now entirely out of sync, since we are only listening for the completion of check_suites. We can no longer accept suggestions Screen Shot 2019-11-06 at 3.27.36 PM

Further findings: The edited workflow file executes, which is something we expect, but not something we are accounting for. Our learners could run out of usage quota without realizing it by making unexpected changes to these files. Although that's a risk to them, it's one we should do our best to mitigate by validating the body of these files more thoroughly than we currently do.

Take a look at the workflow running.

Screen Shot 2019-11-06 at 3.16.45 PM

As a new learner I might not expect this behavior, especially since the Learning Lab bot didn't inform me that the changes that I made were incorrect.

This poses a new challenge for us as course authors. We haven't had to think about how Actions and Learning Lab impact one another. For every good thing we can do when these two features are married together there are ten things we need to account for to provide a better experience.

At this point, having gotten this far in the course, my best option is to restart the entire thing.

Screen Shot 2019-11-06 at 3.32.21 PM

Suggestions:
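One mitigation to discuss: before responding with a success-style message, compare the committed workflow content against what was actually asked for, rather than reacting to any check_suite completion. A sketch with placeholder strings:

```javascript
// Hedged sketch: only respond with success when the committed workflow
// matches the expected content. Both strings here are placeholders.
function matchesExpected(committed, expected) {
  // Normalize line endings and surrounding whitespace before comparing
  const clean = (text) => text.replace(/\r\n/g, "\n").trim();
  return clean(committed) === clean(expected);
}

console.log(matchesExpected("on: [push]\r\n", "on: [push]")); // true
console.log(matchesExpected("on: [pull_request]", "on: [push]")); // false
```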


Biggest Takeaways

There are many things we need to consider from the course design perspective. Working to incorporate more thorough validation of inputs and exit strategies for infinite loops will improve the quality of the courses we create in the future.

Addressing these two issues alone will help us maintain the confidence people have in the Learning Lab platform. These changes will also dramatically improve the quality of our courses.

I ask: how can we not only guide the learner as they progress through the course, but also help them dislodge themselves when they get stuck on something they don't understand?

We cannot assume the learner knows anything about GitHub, writing code, working with branches, issues or pull requests. The moment we assume that anything is familiar to them is the moment we fail them.

The exception to that mindset is when we explicitly guide them down a learning path, without allowing access to assets before we know for certain they have the required familiarity.

brianamarie commented 4 years ago

This has long passed, so I am going to close this issue. Please reopen if necessary!