richardartoul opened this pull request 9 years ago
hi @richardartoul! thanks for this pull. i wonder if "extra credit" is the term we want to use. this has been brought up in #129 and i wonder if we should use that thread to figure out exactly what our overall strategy is before including this.
i definitely think this is valuable and would like to integrate part, if not all. @jmeas @kadamwhite @rmurphey ?
Fixed the typo :) The discussion on extra-credit / advanced topics is interesting. Let me know if I can be of any assistance!
thanks! and re: the convo, please weigh in with your opinions/reasoning! the space is wide open right now :)
Hi. Long time follower here; first comment.
I'd prefer to leave tasks like this out, even if they're noted as extra credit. It's fun to know how the internals of these things work, but it's probably not a good practice to reimplement them.
If we're looking to make this test accessible to newer programmers, as a lot of the recent work on this code has noted, I think it is a part of our responsibility to practice some focus in terms of what we draw their attention to. For most developers, these things will rarely be practically relevant, and, when they are, there is existing documentation for them.
I would be surprised to see someone polyfill, say, Array.reduce within a codebase I was working on, or to be asked to write such an implementation in an interview. For me, the intent of js-assessment is more practical than theoretical, which is why I felt strongly enough to comment here. :) Please feel free to disagree, though.
Hey @7e7, welcome to the discussion :) I'm new to the project as well. I think the crux of the issue here is this: what problem is js-assessment designed to solve? According to the repository description, js-assessment is designed to "assess the skills of a candidate for a JavaScript position, or to improve one's own skills."
As it currently stands, I find that js-assessment is a great start towards accomplishing this goal, but it falls flat as a comprehensive solution. Let's take the arrays section as an example, because that's the section I'm proposing to expand with my pull request.
The latest version of js-assessment asks the user to implement, among others, these six functions: 1) indexOf, 2) append, 3) truncate, 4) prepend, 5) curtail, 6) concat.
Each of these functions can be implemented with a single line of code using the built-in array methods that correspond exactly to the desired functionality. From an employer's perspective, what has solving these tasks taught you about a potential hire? Practically nothing; all you know is that they are familiar with the most basic JavaScript array methods. Even worse, from a self-assessment perspective, successfully implementing all these functions lulls the user into thinking they know more about arrays than they really do.
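For example (these are my own illustrative one-liners, not the repo's reference solutions, and the real task signatures may differ slightly), the indexOf and concat tasks can be passed with nothing more than:

```js
// Illustrative one-line solutions; the actual js-assessment signatures may differ.
function indexOf(arr, item) {
  return arr.indexOf(item); // delegates straight to the built-in method
}

function concat(arr1, arr2) {
  return arr1.concat(arr2); // likewise
}
```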
Now let me address your other point: why make the user implement native array methods from scratch? You're right that it would be a poor idea to reimplement these methods on a real project when they already exist; however, implementing them from scratch demonstrates an understanding of a few things (there's a rough sketch after this list):
1) The programmer understands that "pushing" and "popping" items onto/off of the end of an array is computationally "cheap" (constant time), while "shifting" and "unshifting" items at the front of an array is computationally "expensive" (linear time), because every remaining element has to be moved. Similarly, implementing these methods demonstrates that the programmer is aware that, under the hood, indexOf and concat are not magical functions but in fact require iterating over entire arrays. This has consequences for how a programmer chooses to write their software.
2) Implementing these methods from scratch also demonstrates that the programmer is aware of the difference between a function that mutates the array it is passed (like push and pop) and a function that returns an entirely new array (like concat).
3) The programmer is aware that arrays in JavaScript are really just very specialized objects whose properties can be manipulated just like any other object's (the .length property can be assigned directly).
4) If the person performing the assessment happens to take a look at the mocha tests themselves, they might learn a little bit about subclassing, inheritance, and prototyping (another subject that js-assessment doesn't cover in enough depth)
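To make that concrete, here is a rough sketch of the kind of from-scratch implementations I have in mind (illustrative only; the names are mine and these aren't copied verbatim from the pull request):

```js
// From-scratch indexOf: makes the linear scan explicit rather than
// hiding it behind the built-in method.
function indexOfFromScratch(arr, item) {
  for (var i = 0; i < arr.length; i++) {
    if (arr[i] === item) {
      return i;
    }
  }
  return -1;
}

// From-scratch curtail: drops the last element by assigning .length
// directly, which only works because arrays are specialized objects.
function curtailFromScratch(arr) {
  if (arr.length > 0) {
    arr.length = arr.length - 1; // mutates the array that was passed in
  }
  return arr;
}

// From-scratch concat: returns a brand new array instead of mutating
// either input, which requires copying every element.
function concatFromScratch(arr1, arr2) {
  var result = [];
  for (var i = 0; i < arr1.length; i++) {
    result.push(arr1[i]);
  }
  for (var j = 0; j < arr2.length; j++) {
    result.push(arr2[j]);
  }
  return result;
}
```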
In summary, js-assessment has two stated goals:
1) Allow employers to assess the skills of a potential hire
2) Allow individuals to assess their own Javascript skills and learn where they need to improve
As it stands now, js-assessment is too simplistic to accomplish either of these goals to any degree of satisfaction by industry standards. Perhaps calling these problems extra credit is not the way to go; perhaps an entirely new repository needs to be made: js-assessment-intermediate or js-assessment-advanced. Exactly where to add new tests and what to call them I'm not sure about, but as it stands, the existing tests are too simplistic to accomplish js-assessment's stated goals.
I understand. Algorithmic optimization is important to some people and, at the same time, difficult for beginners to grasp.
At some point, though, you're right, this does come into the skills spectrum. Should js-assessment have progressive levels of difficulty like this? If so, how? Maybe some core contribs should weigh in here.
hey @7e7 and @richardartoul ! great discussion. i think moving forward, js-assessment wants a way to grow and stay relevant.
i've been considering a "package mgmt" solution of sorts where we have a core set of tests as well as a "-contrib" set, and you can generate your own "assessment" by listing the modules you'd like to include. that'd be a big refactor but would allow for more community-generated intermediate and advanced material without bloating or overcomplicating the core functionality. thoughts?
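for example (totally hypothetical module names and format, just to sketch the idea), a generated assessment might be described by something like:

```js
// purely hypothetical sketch -- none of these module names exist yet;
// it's just what "listing the modules you'd like to include" might look like
var myAssessment = {
  name: 'my-company-assessment',
  modules: [
    'js-assessment-core',
    'js-assessment-contrib-arrays-advanced',
    'js-assessment-contrib-async'
  ]
};
```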
I like this idea, @ashleygwilliams, because it allows you to easily build your own assessment if you wanted to use this in an educational setting. This would also allow you to split out these 'extra credit' questions into a different segment.
Definite +1 for allowing additional suites of tests to be "plugged in," but I want to propose that some sort of level system be built into the repository itself.
One of the hardest things as a beginner (I am speaking from my own personal experience) is knowing where to start. When I fired up js-assessment for the first time, shortly after it was released, I had to look through each suite before I found one I was comfortable starting with. I feel it would be great to provide more advanced tasks, but if they were just mixed in with the beginner tasks, that would increase the number of (potentially intimidating) tasks a beginner has to sift through, and that could be discouraging.
I'd like to see some sort of level namespacing, so that I could do npm run test:intermediate
or localhost:4444/intermediate
or some such, so that these tests can coexist with the existing ones but in a way that doesn't immediately make it harder to begin.
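I haven't dug into how index.js serves the runner today, so take this with a grain of salt, but assuming an Express-style server, the URL namespacing could be as simple as:

```js
// Rough sketch only: assumes the test runner is served by an Express-style
// server, which may not be how index.js actually works today. The
// runner-<level>.html files are hypothetical as well.
var express = require('express');
var path = require('path');

var app = express();
var levels = ['beginner', 'intermediate', 'advanced'];

app.get('/:level', function (req, res, next) {
  if (levels.indexOf(req.params.level) === -1) {
    return next(); // not a level we know about; fall through to other routes
  }
  // e.g. localhost:4444/intermediate serves only the intermediate suites
  res.sendFile(path.join(__dirname, 'runner-' + req.params.level + '.html'));
});

app.listen(4444);
```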
</$0.02>
I've been using a subset of js-assessment as part of a hiring process for a while now and have some thoughts on this. Overall, from the perspective of assessing the skills of a potential hire, it's a great tool for estimating skill level and a good starting point for a conversation with a candidate. Extra emphasis here on using the assessment as a starting point for a conversation with a candidate.
What have you learned if the candidate turns in all the reference answers? That they're good at Googling and/or copy/paste? More often, the power of js-assessment is in setting the lowest threshold for the people you want to talk to, and then looking at how candidates complete any given unit test. From there, you can discuss their answers or ask any other technical questions relevant to your hiring process.
As an aside, depending on how much your organization relies on Node and JS unit testing, these factors can be just as telling about a candidate as their actual assessment answers. You may want to consider offering candidates a primer on these before having them take the assessment, providing example unit tests, or setting up a controlled environment they can use for taking the assessment. Keep in mind that not all software is written by software companies; there are a lot of industries that rely on creating their own software but may not hold themselves to the same rigors that some software companies do.
As far as difficulty level goes, there is already a range of difficulty in the existing questions. I'm all for having a variety of difficulty levels, and adding additional questions is a good idea. But even better is the idea of modularly building out a subset of questions appropriate to the situation (e.g., would you ask a candidate for an internship/junior position about currying?). The js-assessment as it stands is fairly exhaustive and can take a candidate upwards of several hours to slog through. Candidates balk at taking a massive test as a screener/precursor to interviewing. So +1 on a level system/custom build system. In doing so, it would be a good idea to flag the difficulty level of each test.
Again, from the perspective of assessing the skills of a potential hire, the assessment works pretty well IMHO. I'm not concerned whether a candidate knows every esoteric detail of the language; I want them to know enough to have a larger discussion.
I'm all for modularity; however, I don't think that a package manager is the way to go. It adds (in my opinion) an unnecessary level of complexity and refactoring of the existing code base. The existing setup actually lends itself to namespacing and modularity quite well: different test suites can use different runner.html files, and if you take a look at the existing package.json:
"scripts": { "start": "node index.js", "test": "mocha -R spec 'tests/app'" },
it's easy to see how we could do something like this:
"scripts": { "start": "node index.js", "test-beginner": "mocha -R spec 'tests/app-beginner'" "test-intermediate": "mocha -R spec 'tests/app-intermediate'" },
In terms of js-assessment already being fairly long, well, I don't think it's too much to expect employers using js-assessment to pick and choose which tasks they want prospective hires to undergo. It's already easy to turn certain tests on and off by simply adding an "x" in front of the corresponding mocha describe statement.
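For example, with mocha's BDD interface, skipping an entire suite looks roughly like this (suite name invented for illustration):

```js
// Prefixing describe with "x" (an alias for describe.skip in mocha's BDD
// interface) marks every test in the suite as pending without deleting anything.
xdescribe('arrays: extra credit', function () {
  it('should implement indexOf from scratch', function () {
    // ...
  });
});
```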
I think one possible temporary solution (because it will take time to figure out which tests go together, what is intermediate, what is advanced, what we want to include in the core functionality, etc.) is to add an "additional tests" section. Here we can add more nuanced/advanced tests that we think are useful. If someone wants to use these tests, they can simply execute npm run test-additional, and people can also pick and choose which of the additional tests get run using the mocha "x" feature (we can default them all to off). This area could also serve as a testing ground to see which tests should make it into the core set of tests that js-assessment currently employs. Thoughts?
Added a set of extra credit array-related tasks that require the user to reimplement some of the native array methods. I added them because I was frustrated that some of the original arrays tasks (namely append, truncate, prepend, curtail, concat, and indexOf) could be solved with a single line of code. The previous tasks simply tested whether the user was aware of the array methods, whereas the extra credit tasks test whether the user understands the methods and could implement them from scratch.