pauldijou opened this issue 9 years ago
Hi @pauldijou, actually this is one of the aspects of the app that is tightly coupled with the yeoman-generator
implementation. Even though the CLI presents the questions one by one, in reality some of them are defined by the generator author as a single stack; usually when that happens it means the generator doesn't really need the previous answers in order to present the next ones.
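For context, a minimal sketch of what that looks like on the generator side, assuming a promise-based yeoman-generator and the standard Inquirer question format (the prompt names here are made up): the author hands the whole stack to a single `this.prompt()` call, and since none of these questions depends on a previous answer, a UI could safely render all of them at once.

```js
// Minimal sketch, assuming a promise-based yeoman-generator. The author
// defines the whole "stack" in one this.prompt() call; no question here
// depends on a previous answer.
const Generator = require('yeoman-generator');

module.exports = class extends Generator {
  async prompting() {
    this.answers = await this.prompt([
      { type: 'input',   name: 'appName', message: 'Your project name' },
      { type: 'confirm', name: 'useSass', message: 'Would you like to use Sass?' },
      {
        type: 'list',
        name: 'license',
        message: 'Which license do you want to use?',
        choices: ['MIT', 'Apache-2.0', 'GPL-3.0']
      }
    ]);
  }
};
```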
That said, I don't think the current implementation causes problems: in the example you gave, the other prompts would still be displayed to the user, as already happens in many generators we tested during development.
I still think this issue has a lot of value though, since asking the prompts one by one keeps the interface closer to what authors and users are used to in the CLI, and it would fix the annoying issue of the confirm button not being visible after scrolling down the page.
:+1: I give it my thumbs up and will happily review a PR, thanks for your thorough description :smile:
Hey @ruyadorno, I just pushed a fake generator with some funky, crazy prompts. If the Yeoman app can support them, I'm sure it can support nearly any generator. Right now, the HTML prompts are a total mess while the CLI works just fine. To be honest, I don't think I will have time to submit a PR for now.
Anyway, enjoy: generator-app-test
I didn't dive into the app code, but it looks like it tries to compute all the possible prompts and display them all at once. I think this solution cannot work and that the prompts should be asked one by one, for several reasons.
1) Nearly all properties of a prompt (the message, the choices, the validation, whether to display it at all) can depend on the previous answers.
2) You can have prompt loops. For example, if you need a list of inputs from the user, you might want to ask the same input prompt again and again until the user leaves it empty, concatenating all the answers.
3) Depending on the user's answers, the generator might compose with other generators, which might add new prompts of their own (all three cases are sketched below).
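To make those three cases concrete, here is a hedged sketch using standard Inquirer question properties (`when`, `message`, and `choices` accept functions of the previous answers) and Yeoman's `composeWith()`; the prompt names and the composed generator `generator-db:app` are hypothetical.

```js
// Hedged sketch of the three cases above. Prompt names and the composed
// generator ('generator-db:app') are made up for illustration.
const Generator = require('yeoman-generator');

module.exports = class extends Generator {
  async prompting() {
    // 1) Prompt properties computed from previous answers: the second
    //    question is only displayed when the first one was answered "yes".
    const answers = await this.prompt([
      { type: 'confirm', name: 'useDb', message: 'Do you need a database?' },
      {
        type: 'list',
        name: 'dbDriver',
        when: (previous) => previous.useDb,
        message: 'Which driver should be installed?',
        choices: ['postgres', 'mysql', 'sqlite']
      }
    ]);

    // 2) A prompt loop: ask the same input again and again until the user
    //    leaves it empty, concatenating all the answers.
    const deps = [];
    for (;;) {
      const { dep } = await this.prompt([{
        type: 'input',
        name: 'dep',
        message: 'Add a dependency (leave empty to stop)'
      }]);
      if (!dep) break;
      deps.push(dep);
    }

    // 3) Composition: depending on the answers, pull in another generator,
    //    which registers prompts of its own.
    if (answers.useDb) {
      this.composeWith('generator-db:app');
    }
  }
};
```

None of this can be computed up front: the loop in 2) has no fixed length, and the prompts contributed by `composeWith` in 3) only exist once the earlier answers are known.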
Obviously this has some problems compared to the CLI. What if the user changes an old answer? Resetting all the following prompts will not be enough, especially if we already started asking prompts from a composed generator which, according to the new answer, shouldn't have been composed in the first place. One solution is to freeze any answered prompts, just like the CLI does. Not really HTML friendly though.