exercism / cfml

Exercism exercises in CFML.
https://exercism.org/tracks/cfml
MIT License

Proposed re-work of CFML exercises #26

Closed bdw429s closed 7 years ago

bdw429s commented 7 years ago

Ok, here's my proposal for reworking the CFML exercises to make them quick and easy to run either from the CLI or a web browser. I don't know where the README instructions for how to run the tests will eventually go, so I'll just put them here. You can clone my repo to give it a test locally.

  1. Install CommandBox CLI. On a Mac, this is as easy as brew install commandbox. For Windows and Linux, see the installation instructions here: https://ortus.gitbooks.io/commandbox-documentation/content/setup/installation.html
  2. cd into a specific exercise directory (in this case, the only one is exercises/leap) and run the command box task run TestRunner
  3. The results of your tests will display in the CLI
  4. Implement your exercise in the Leap.cfc class and re-run the test runner until the tests pass. (The full sequence of commands is summarized below.)
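
For copy-paste convenience, that whole CLI flow (on a Mac, assuming you've already cloned the repo) is just:

brew install commandbox    # Mac; see the docs above for Windows/Linux
cd exercises/leap
box task run TestRunner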

To run your tests from a browser:

  1. cd into the root of the exercises folder.
  2. Run box install in that folder to install dependencies
  3. Run box start, and after a few seconds a browser window will open with an HTML report of your tests. (These steps are summarized below.)
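
In other words, the browser-based flow boils down to:

cd exercises
box install
box start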

My new design has the following features:

TODO:

  1. Add a --watcher flag to the task runner that sits and watches the directory for changes and reruns the tests automatically.
  2. Put a nice directory listing in the root exercises folder that lists each of the exercises for people who prefer to start up a web server. This will be the first page that loads in their browser. It can be dynamic based on the file system and provide a link into each exercise.
bdw429s commented 7 years ago

Sweet, I see we have a fancy Travis build that's already provided some feedback. I've reviewed a bunch of docs floating around in the different repos about exactly what the anatomy of an exercise should be but I'm not clear on several things.

ilya-khadykin commented 7 years ago

The build is failing because of configlet lint verifications; you can read more about them here: https://github.com/exercism/configlet#lint
More info about config.json: https://github.com/exercism/problem-specifications/blob/master/CONTRIBUTING.md#track-configuration-file
Discussion about patterns in config.json: https://github.com/exercism/discussions/issues/192

The command "bin/fetch-configlet" exited with 0.
$ bin/configlet lint .
-> An implementation for '.meta' was found, but config.json does not reference this exercise.
-> The implementation for '.meta' is missing an example solution.
-> The implementation for 'leap' is missing an example solution.
The command "bin/configlet lint ." exited with 1.

Looks like configlet lint doesn't handle hidden directories correctly; it considers .meta an exercise and tries to find the required files. It probably deserves an issue in the corresponding repo. Or you can move the files related to TestRunner elsewhere.

It also can't find an example solution for leap, since you haven't provided "solution_pattern" in config.json.

You also removed "test_pattern", and travis.yml is not running actual tests in its current version, which doesn't seem right.
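
For illustration, those track-level patterns in config.json might end up looking something like this (the values are just placeholders, adjust them to your actual file naming):

{
    "solution_pattern": "Solution",
    "test_pattern": "Test"
}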

I also suggest adding a hello-world exercise.

Keep up the good work!

Insti commented 7 years ago

.meta should be a subdirectory of the leap exercise.

Anything in .meta will not be served to the student, so it is a good place for the solution file.
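
For example, an exercise layout along these lines would work (the exact file names are only illustrative):

exercises/
  leap/
    Leap.cfc          <- the student implements this
    LeapTest.cfc      <- the test spec
    TestRunner.cfc    <- CommandBox task that runs the tests
    .meta/
      Solution.cfc    <- example solution, not served to the student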

Insti commented 7 years ago

I've seen some notes here and there about having a test suite that confirms the provided example solution. How does that need to be set up? Do I need to provide a separate test runner that runs the test suite against the example solution?

Yes.

Does this happen in the Travis build for my repo?

Yes.

Do I need to modify the Travis build such that it downloads all the CLI tooling necessary to run the tests?

Yes.

I don't think that this is required but it is good to automatically ensure that your exercises are complete and working.

Insti commented 7 years ago

The most important thing is to make configlet happy.

bdw429s commented 7 years ago

Thanks for the replies @Insti and @m-a-ge, the additional links were helpful. Somehow, in my hour or so of poring over all the READMEs and opening all the markdown files I could find, I still hadn't come across some of that info.

Or you can move files related to TestRunner elsewhere

Where would be a safe location to move some boilerplate code that's going to be used for every exercise? Since the CLI fetch command seems to basically only grab the contents of the exercises folder, I assume it would need to be in there, but any folders I create are going to anger the linter if they're not actual exercises. Do I need to just give up on trying to be clever and paste the same boilerplate test runner code into every exercise?

The other thing I was trying to do was have the user install the TestBox testing library in the directory above the tests. That seemed better than installing the entire library inside each exercise, but I need to know if the box.json that I have in the exercises folder in my pull will be brought along with the CLI's fetch command.

It can't find example solution for leap as well, since you haven't provided "solution_pattern" in config.json

Good catch, I fixed that.

You also removed "test_pattern"

That's correct. The docs I found said that the default test pattern is test and I'm using that word in my unit test classes, so I removed the setting to use the default.

travis.yml is not running actual tests in its current version, which doesn't seem right

Is it not right because the ColdFusion repo has a really old template for the Travis build, or is it not right because I need to do some work to implement a runner to test them?

I also suggest adding hello-world exercise

Yes, that will probably be next. Honestly, I just grabbed the Leap exercise because it was already here in the repo from someone who apparently tried to get it off the ground a year or so ago.

.meta should be a subdirectory of the leap exercise.

Well, not if I want it to do what I need it to do, which is to store some reusable files that all exercises can include to reduce the amount of boilerplate copied into each exercise. It also means fewer touchpoints if I want to modify the test runners in the future. Perhaps I simply should have avoided the name .meta, as I didn't realize it was already a convention in use (despite the seeming lack of docs on it). I'll happily change the name to something else, but I think the issue is bigger than that. First, the linter isn't going to like that, and second, I'm not sure yet if the CLI's fetch command will pull it anyway. That's where I need some help.

Anything in .meta will not be served to the student, so it is a good place for the solution file.

Speaking of the solution file, I'm getting some mixed signals there. Is the idea that the solution will be served to the user for them to refer back to in case they need help, or would that be considered cheating and the user should not have access at all? I mean, the repos are public so it's not like we're really hiding anything...

Do I need to modify the Travis build such that it downloads all the CLI tooling necessary to run the tests?

Yes.

Are there any guides, conventions, expectations, or standard implementations for this or am I free to just manhandle the Travis build until it's running the tests? My first thought is to simply create a task runner that loops over all the exercise folders and calls each of their test runners with a flag that has it run against the solution. Also, I usually install CommandBox via the apt-get repo which requires sudo on Travis since Travis stopped whitelisting source repos. Will that be an issue?

Thanks again for all the help! :+1:

bdw429s commented 7 years ago

:point_up: I moved all the test running logic back inside the exercise folder for now just to get the test to pass. I don't have a way to test whether or not the CLI fetch will pull the box.json I have in the root of the exercises folder. That is what will allow a user to run box install in that folder to pull in the TestBox library used to run the tests.

bdw429s commented 7 years ago

New commits allow a user to start a test watcher in the exercise folder like so:

box task run testRunner --watcher

Any edits to .cfc files will rerun the tests.

Also, the solution can be tested via a new flag as well, which I'll use in the Travis build:

box task run testRunner --solution

ilya-khadykin commented 7 years ago

As I see it (if I understood everything correctly):

It's a good idea to add an INSTALLATION.md in the docs directory and describe how to prepare the environment before starting work on the track exercises. Example: https://github.com/exercism/groovy/blob/master/docs/INSTALLATION.md

bdw429s commented 7 years ago

Thanks for the input @m-a-ge, but I think we may be on different pages. The reusable test runner logic was to be used for the exercises that the end users would test, not for the Travis build. My understanding (and what I've seen from using the CLI to fetch a track) is that only the contents of the exercises folder is delivered to the end user. If that is the case, I cannot include default test runners that the end user will need at the top level of the repository, as those files will not be delivered when they fetch the exercises. Does that make sense?

Same with the box.json. The user will need that so they can run box install in the exercise directory and install the testing framework. (The testing framework consists of a folder of code that will be created right there in the current working directory.) Moving box.json out of the exercises folder also means the user would never receive it.

I'm also specifically looking for confirmation about whether or not files in the root of the exercises folder will actually get delivered. I don't know how to actually test it until this pull is merged. If they do get delivered along with the exercises like I'm hoping, I'll address my second TODO item above which is to place an index.cfm file in there so when a user starts up a web server in that folder, the default page that opens in their browser can have a nice little list of all the exercises for them to view.

Insti commented 7 years ago

You shouldn't need to provide the whole test framework with every exercise. That should be something the student does once. And the information about how to do this is provided as part of the installation setup instructions.

bdw429s commented 7 years ago

Thanks @Insti, but I don't think we're understanding each other. The test runner files I wanted to include from a directory higher up are not the test framework; they're just a simple task (think of it like an Ant task, but not XML) that uses the test framework to run the test suite. This is something I'm providing, but I was looking for a way to not have the code duplicated over and over in each exercise folder.

The actual test framework itself is a little different. I realize I can just put the steps in a readme for the user to go and install it, but I'm trying to make this as easy as possible, with as few steps as possible, and with the steps as simple as I can make them. The easiest way is to have a single box.json (which works like NPM's package.json) that already lists the dependencies that are needed, so they can just do box install and get them all. It seems that it would be best if this were a one-time operation and not something that needed to be installed for each exercise, which is why I would need to store the box.json outside of any particular exercise directory.

Insti commented 7 years ago

Have a look at the haskell track; I think they wanted to do something like this. I think the solution was just to duplicate the file in every exercise. There was some discussion at one point about common data directories, but I don't remember what happened with that; have a look at the csharp, java, and fsharp tracks for clues.

bdw429s commented 7 years ago

Ok, I have updated the Travis build and added a task runner to test all the possible solutions. The Travis build is now including this and passing.

Insti commented 7 years ago

It might be helpful to include this box.json somewhere so we could see what it looks like.

bdw429s commented 7 years ago

It might be helpful to include this box.json somewhere so we could see what it looks like.

@Insti It's right here in this pull request as part of the files. Do you see it there?

Insti commented 7 years ago

Ah OK I can see it now, I was looking at the master branch. :blush:

Insti commented 7 years ago

I don't understand why as a student I even need a box.json

From what I can figure out, I could run:

sudo apt-get install commandbox
box install testbox

To set up my environment, then when I wanted to run the tests, something like:

testbox run LeapTest

(caveat: I have no idea what I'm doing with coldfusion and maybe an environment that is complicated to configure is a language feature.)

bdw429s commented 7 years ago

I don't understand why as a student I even need a box.json

Why do cooks need recipes? After all, it's just a matter of finding all the right ingredients, right? :)

I see what you're saying, and to answer your question, you don't need it, but box.json is the standard way of defining the dependencies for a project. (I looked through the Haskell examples and they all have a package.yml.) Right now, TestBox is all that's needed to run the tests, but perhaps that will change in the future. The point is for the student to not need to run any more commands than necessary, so I'm trying to follow the idiomatic best practices of CFML developers with the setup here.
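
For reference, a minimal box.json along those lines is basically just this (the name and version shown here are only illustrative):

{
    "name" : "exercism-cfml",
    "dependencies" : {
        "testbox" : "^2.5.0"
    }
}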

Strictly speaking, CFML is not difficult to configure, and the box install step won't even be necessary when running the tests from the CLI. The task runner will do that for them, but I still need to define the dependencies required to run the code somewhere. I can keep everything inside the individual exercise folders, but that just seems like a lot of extra stuff repeated over and over, especially for the path where a user might want to run their tests through a web browser instead of the CLI. It seems like it would be a total pain to be moving between exercises and starting up a new web server in each one (and shutting down the other ones), which is why I set it up so a user who chose to start up a web server could just do so with the web root pointing at the root of the exercises folder. This would allow each exercise to easily share the TestBox framework, and the user could run all the exercises by just navigating into each folder in their web browser. (It may be important to note that CFML libraries are not installed into the engine like Ruby; they are downloaded into the working directory of the actual project itself.)

Honestly, maybe I'm confusing things by even bothering with the web-based path. CFML has historically been used for building web sites, so it's a natural tendency to do things in a browser, though not a requirement. Either way, it wouldn't change the fact that TestBox is a dependency of any app that wants to run tests, and the CF-centric way to describe that dependency is via box.json, in the same manner a Java dev might use pom.xml, a Node dev would use package.json, and a Haskell dev would use package.yml.

Insti commented 7 years ago

I'm trying to follow the idiomatic best practices of CFML developers with the setup here.

That's a good thing I guess.

maybe I'm confusing things by even bothering with the web based path.

Possibly, none of the exercism exercises are really web pages, they're all(?) just simple input/output mappings.

This would allow each exercise to easily share the TestBox framework and run all the exercises by just navigating into each folder in their web browser. (It may be important to note that CFML libraries are not installed into the engine like Ruby, but downloaded into the working directory of the actual project itself)

Each Exercism exercise is typically implemented as its own project; I think that trying to put them under a common "exercism" project is what is causing you trouble.

bdw429s commented 7 years ago

Each Exercism exercise is typically implemented as its own project; I think that trying to put them under a common "exercism" project is what is causing you trouble.

Fair enough. It just really seemed like a "code" smell, and like it would be multiplying the amount of setup to get each one working, but there's no sense in fighting against the system if this isn't really how Exercism works.

Possibly, none of the exercism exercises are really web pages, they're all(?) just simple input/output mappings.

Yep, that's correct. Running it in a web server would just allow you to view the test results in HTML, but perhaps that's just nonsense tainted by years (15!!) of CFML dev that were all web based. I suppose Exercism isn't really for existing CF devs but for new devs who are probably not steeped in any such precedent and totally just expect a CLI. I think I'll ditch the web-based approach and just keep it all simple and CLI-driven. Less is more :)

Insti commented 7 years ago

It's still a legitimate tactic to include instructions in the INSTALL.md describing how to set up the web-based multi-project way; you just won't be able to rely on the exercism client to serve the "umbrella level" project files for you. But downloading/copy-pasting from GitHub doesn't seem unreasonable for a manual one-off setup procedure.

tleen commented 7 years ago

.meta/ has become more prominent with configlet v3+. There are files in there now that can be used to modify the exercise's README.md file. I just spent some time documenting these files in the Go track README. They are used in concert with the configlet generate command to create a README using a combination of the exercise information in the problem-specifications repository, local overrides specific to the track, and local overrides specific to the track exercise.

bdw429s commented 7 years ago

Wow, thanks for the links @tleen; that's some of the best documentation I've found for this stuff. So far I haven't done any work to add any readmes, though I have a lot of ideas of what I'll put in them. I'm still just trying to get the exercise format nailed down. I like the idea of generating the tests from the in/out data, so I'll have to check out how Go is doing that and do something similar for the CFML track. I assume you're basically just templating out a test file that has the expectations/assertions dynamically generated?
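
Something like this back-of-the-napkin sketch is what I'm picturing, run as a CommandBox task (purely illustrative and not part of this PR; it assumes a flat cases array in the leap canonical-data.json, and the file and method names are placeholders):

component {
    function run() {
        // Illustrative only: read the canonical test data and template out a TestBox spec
        var data  = deserializeJSON( fileRead( resolvePath( "canonical-data.json" ) ) );
        var specs = "";
        for( var testCase in data.cases ) {
            specs &= '
                it( "#testCase.description#", function(){
                    expect( new Leap().isLeapYear( #testCase.input.year# ) ).toBe( #testCase.expected# );
                });';
        }
        // Wrap the generated specs in a TestBox bundle and write the test file out
        fileWrite(
            resolvePath( "LeapTest.cfc" ),
            'component extends="testbox.system.BaseSpec" {
                function run() {
                    describe( "Leap", function(){ #specs# } );
                }
            }'
        );
    }
}
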

I'm rearranging the directories (again) right now, but once I get this done, what's the process for getting this merged in? I assume I'll attack some of the readme stuff in a pull request as well, but I'm not sure if everything should just be lumped into this pull. Is there a particular person who needs to approve or merge this? @kytrinyx also mentioned getting rights to merge my own pulls at some point, but I'm not sure if that's something that happens right away or only once you're proven trustworthy :angel:

tleen commented 7 years ago

I assume you're basically just templating out a test file that has the expectations/assertions dynamically generated?

Sort of. A lot like the README generation, the problem-specifications repo has other common exercise information that may be helpful. One of these is a canonical JSON representation of test data that you can use for a problem. Some problems have it, some don't; it is a work in progress. But if they do have it, one of the things a track can figure out is how to get that data synchronized with the test expectations within their own track.

For example: if you look at Go, there is a utility that, along with code present in some exercises, will pull the canonical-data.json file for the exercise from the problem-specifications repo, turn the JSON into a representation Go can use, and use that representation in the test.

How the data gets from a problem-specifications canonical-data.json to an actual test in an exercise is so dependent on the language and how tests work in that language that it will be track-specific. The above is just how Go does it. I'm not sure how it would be done in CF.

You will often see in maintaining a track that Exercism tries to keep common reusable information together in problem-specifications but that in no way obligates you to use it. There is a constant push and pull between trying to keep exercises in sync between language tracks and doing what works best for each language in its own track.

For now I would ignore all the test-data import stuff and keep things simple until the track has its feet under it. This is entirely unnecessary and more of a nice-to-have. Just use whatever test data you think is good and embed it directly in the tests. After the track has a few exercises going, you will be in a much better position to evaluate how to automate syncing that data up with your own exercise implementations.

bdw429s commented 7 years ago

My latest commits have rearranged the folder structures. Everything is now kept inside of each exercise directory. All a student needs to do in order to test their code (after the one-time action of installing CommandBox) is this:

box task run TestRunner
# or start a test watcher
box task run TestRunner --watcher

That will take care of installing the testing framework and running any test specs.

The /tasks/TestAllSolutions.cfc runner will run the tests against the solution in every exercise. It is wired up in Travis and can also be run manually by executing this command from the root of the repo:

box task run tasks/TestAllSolutions
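
In essence, that task just loops over the exercise folders and invokes each exercise's TestRunner with the solution flag. A rough sketch of the idea (the CommandBox command DSL calls here are approximate, not a copy of the actual file):

component {
    function run() {
        // Loop over every exercise folder and run its TestRunner against the example solution
        var exerciseDirs = directoryList( resolvePath( "exercises" ), false, "path" );
        for( var dir in exerciseDirs ) {
            if( directoryExists( dir ) ) {
                command( "task run TestRunner" )
                    .flags( "solution" )
                    .inWorkingDirectory( dir )
                    .run();
            }
        }
    }
}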

A maintainer can also run the test suite against the example solution for a given exercise by running the exercise tests like so:

box task run TestRunner --solution

bdw429s commented 7 years ago

I have added a hello world exercise to this pull. What else should I work on? Should I just keep adding stuff to this pull or start creating new pulls? It feels kind of dirty to just be dropping everything here, but since there's so much stuff being added/changed it's going to be hard to start creating a bunch of branches if nothing is merged yet.

I noticed that there are a ton of missing markdown files in the docs folder that I can start stubbing out. I'm not 100% sure where they are all used, but I can guess by looking at the content in other repos like this one: https://github.com/exercism/ruby/tree/master/docs

Smarticles101 commented 7 years ago

@bdw429s I would start making more PRs. The trend seems to be to use one PR per issue, or per thing you're trying to implement. I'd probably work on some documentation (installation, contributing, about, etc.) and then implement more exercises!

Insti commented 7 years ago

I'd create separate PRs for things. You can create branches off this branch and then submit PRs that just contain those differences. That way everything works for testing locally, and once this PR is merged, all the subsequent PRs will be able to be merged.

Insti commented 7 years ago

The next thing to do is to update the documentation in the docs folder so students know how to use the track.

Then implement a bunch of exercises so they've got something to do.

Insti commented 7 years ago

Just wanted to say this is looking really good. I'm impressed with how much of this you've been able to figure out 👍 Asking good questions (which you have been doing) helps. 👍

Keep up the good work. :heart:

tleen commented 7 years ago

Then implement a bunch of exercises so they've got something to do.

Perhaps no more than 5; then it would be time to look at the track and start figuring out how to build in the tooling around including stuff from the problem-specifications repo. If you get too many exercises, it becomes a chore to try to retroactively update them all. Plus, you would like to have a solid strategy for other contributors to add exercises when the ball gets rolling.

tleen commented 7 years ago

I have added a hello world exercise to this pull. What else should I work on? Should I just keep adding stuff to this pull or start creating new pulls?

I would generally agree: add some documentation to the track, like all the stuff from the initial comment on this PR. I have a text editor; what else do I need to make this work? What are all these *box things, why do I need them, and how do I install them? Where do I find language references to read up on if I get stuck trying to do something?

Then I would make sure the hello-world exercise has a ton of easy walkthrough instructions on how to implement a .cf? file. I would stub that exercise for sure, just so absolute ColdFusion newbies can start piecing together what a solution would even look like in the track. hello-world exists as basically a ready-to-go solution where all a new user would need to do is change some text from "" to "hello world", save, and follow the instructions on how to run the tests. All the difficulty in that exercise should be about gaining familiarity with the environment.

This is a mega-commit, to be sure. After this, probably keep the PRs targeted to a single exercise implementation or piece of meta track work, for your own sake and for anyone else trying to assist with review. I would merge it if you think it all works. None of us but you can tell if the CF stuff works, and we can fix any of the Exercism stuff if it is broken.

This is also a really good experience for Exercism. Once the track gets going, you may want to think about contributing to a "how to get a track going" guide for anyone trying to boot up a new language track. I'm sure our documentation can always be improved. Try to keep note of anything that seems really confusing or counterintuitive about this process.

I'm excited to be a test newbie on this track. I last did CF in the 90s and am interested to see what has happened to it since!

bdw429s commented 7 years ago

@tleen I think this pull is ready to merge. I've just created a second pull, branched from this one, that has a bunch of the documentation updates: https://github.com/exercism/coldfusion/pull/27 This pull will need to be merged first.

I'll do a second pass later on the Hello World exercise to add more help, but I'm still sort of figuring out exactly which docs and markdown files end up visible to the user, so I'm not 100% sure yet where to put additional help. I'm sure this will make more sense once I can see it running.

kytrinyx commented 7 years ago

Thank you so much to everyone who has pitched in on this thread asking and answering questions!

A couple of quick notes:

  1. We have a fairly extensive launch guide that I'm sure can be dramatically improved based on all the questions and answers here. https://github.com/exercism/docs/blob/master/language-tracks/launch/README.md
  2. The exercism CLI for v2 is expected to have a prepare command that will be (automatically) called the first time someone downloads an exercise for a track, and any time the settings for a track have changed. I'm working on the detection logic for that. (see "setup_patterns" in https://github.com/exercism/discussions/issues/192)