tud-hri / joan

JOAN is a software package for performing human-in-the-loop experiments in the open-source driving simulator CARLA. JOAN facilitates communication between human input devices and CARLA, the implementation of haptic feedback, the systematic storage of experiment data, and the automatic execution of experiments with multiple experimental conditions.

[JOSS Review] Documentation & Functionality #10

Closed: humanfactors closed this issue 2 years ago

humanfactors commented 2 years ago

Main review issue: https://github.com/openjournals/joss-reviews/issues/4250

First of all, I want to say thank you for your efforts in developing and sharing this code. As someone who has worked with driving simulators in human factors research — good open source solutions are extremely rare to come by. Your work is exactly what is needed in the field at the moment, so thank you for your contributions!

Below, I provide feedback on a number of issues specifically related to the repository and documentation itself; this issue does not assess the software paper (I will create a second issue for that).

Apologies for the length of the review; I thought it was easier to keep everything in one place than to spread it across multiple issues. I intend all my feedback to be as constructive as possible, and I am aiming to improve your project and make it accessible to as many individuals as possible!

1. A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?

I thought the introduction you provide here https://joan.readthedocs.io/en/latest/introduction/ is the best description of JOAN. The GitHub repository README.md was quite vague, and due to the "high level" documentation home page, it was difficult for me, as a naive individual, to understand what your software did. I suggest including the description from the introduction on the README.md page of your repository to make your work clearer.

I would also like to emphasise an issue that I think underpins most of my comments below. I think you really need a description of what Carla doesn't offer, to make much clearer what JOAN does do. At the moment, I can get an understanding of the features JOAN has, but it's difficult to assess how useful these are without knowing the limitations in Carla. I note you hint at this in the second paragraph of the introduction page, but I suggest expanding on this to make it really easy for new users to get how it all works.

2. Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.

I did note I was unable to actually install Carla, so keep that in mind.

I note that you don't have an automated installation solution. I very much appreciate this is in part due to Carla requiring manual installation. However, I would like to ask whether you have any ideas for making it easier to validate the build context for Carla.

You have a lot of warnings about Python versions and versions of Visual Studio on the PATH, and I wondered if a simple script could be produced to check these. Again, not essential, but I'm wondering whether you had any thoughts in this regard.
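To illustrate the kind of pre-flight script I mean, here is a minimal sketch. The version bounds and the choice of checks are my assumptions, not taken from JOAN's docs; `cl` is MSVC's compiler driver, whose absence usually means the Visual Studio build tools are not on the PATH.

```python
import shutil
import sys

def check_build_context(min_python=(3, 7), max_python=(3, 9)):
    """Return a list of human-readable problems; an empty list means all good.

    The version bounds are placeholders -- substitute whatever the
    JOAN/CARLA build docs actually require.
    """
    problems = []
    if not (min_python <= sys.version_info[:2] <= max_python):
        problems.append(
            f"Python {sys.version_info[0]}.{sys.version_info[1]} found; "
            f"docs recommend {min_python[0]}.{min_python[1]}"
            f"-{max_python[0]}.{max_python[1]}"
        )
    # 'cl' is the MSVC compiler driver; if it is not on the PATH, the
    # Visual Studio build tools likely aren't set up for this shell.
    if shutil.which("cl") is None:
        problems.append("MSVC compiler 'cl' not found on PATH")
    return problems

if __name__ == "__main__":
    for problem in check_build_context():
        print("WARNING:", problem)
```

Running this before a multi-hour Carla compile could save users a failed build.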

I note that the dependencies are listed gradually, making it hard to know up-front what will be installed.

The instructions in the compilation pages depend on externally hosted URLs: https://www.dropbox.com/s/6v35q307dosin55/120222_JOAN_Assets.zip?dl=0. I appreciate that large asset files don't belong on GitHub. However, it would be more appropriate to use a service like FigShare (https://figshare.com/), where the materials are hosted on an archived, long-term storage solution with citable DOIs. This is particularly important for anything that is crucial to the installation process.

Currently, the documentation often doesn't describe why a user is being instructed to do a set of steps. For example, on the Windows Carla page https://joan.readthedocs.io/en/latest/setup-carla-windows/, I was initially confused as to why a user couldn't download the pre-compiled binary. It would be good to have a statement here https://joan.readthedocs.io/en/latest/setup-carla-windows/ like:

"In order to use the UE Editor which is required to run JOAN experiments, you must compile Carla, as the binary release does not contain the editor. Furthermore, manually compiling ensures you can install all JOAN assets in the binary."

Similarly, it's not stated anywhere on the page that "Unreal Engine 4.24 is required in order to build Carla". The user is just prompted to install it. These kinds of descriptions simply clarify why actions are being taken.

In summary, what I am suggesting is to look over the installation documentation and ensure that for each step you are clearly stating why the user is about to do this, and foreshadowing what is ahead.

Could you clarify where the .egg file should be included? ("Include the CARLA .egg file in the JOAN project")
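For context on why I'm asking: CARLA's own examples typically locate the `.egg` under `PythonAPI/carla/dist` and append it to `sys.path` before `import carla`. A sketch of that common pattern is below; the directory argument is an assumption, since where the egg should live inside the JOAN tree is exactly what's unclear.

```python
import glob
import os
import sys

def add_carla_egg(dist_dir):
    """Append the CARLA .egg matching this Python version to sys.path.

    `dist_dir` is an assumption -- in CARLA's own examples it is
    PythonAPI/carla/dist; where JOAN expects it is the open question.
    """
    pattern = os.path.join(
        dist_dir,
        f"carla-*{sys.version_info.major}.{sys.version_info.minor}*.egg",
    )
    eggs = sorted(glob.glob(pattern))
    if not eggs:
        raise FileNotFoundError(f"No CARLA egg matching {pattern}")
    sys.path.append(eggs[0])  # makes `import carla` resolvable
    return eggs[0]
```

Stating in the docs which directory this pattern (or JOAN's equivalent) expects would resolve the ambiguity.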

3. Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).

Excellent examples and animated gifs are provided in the documentation. I think users will appreciate the clarity of instructions provided for the software.

I did wonder, though, if a more complete example of the software and its use could be provided somewhere? That is, almost like a brief methods section where you describe a complete example use case of JOAN. This should address questions like: what variables were captured and why, what input devices were used, and how the map was developed. Then, connect these details with the features of JOAN. This kind of information can help an individual form a more complete mental model of your software (and hopefully make them more likely to use and cite it).

As I noted below, I expected the documentation for JOAN under "First Steps" to cover this kind of introductory description, but the information on the page was instead highly technical (https://joan.readthedocs.io/en/latest/firststeps-joan-overview/). Perhaps it is here you could introduce the kind of example I am talking about, shifting the technical overview information somewhere else?

To be clear, I don't think this needs to be very lengthy, but might help a lot in attracting new users.

4. Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?

Perhaps what is missing overall is a better description of what Carla would look like without JOAN.

In my view, it was quite difficult to ascertain what exactly the purpose of each JOAN module was. From a technical/developer perspective, I could see a lot of detail was put into describing the API: https://joan.readthedocs.io/en/latest/firststeps-joan-overview/. However, for the average or naive user, it is unclear exactly what the modules are used for or why they are needed. For example, looking at the data recorder module https://joan.readthedocs.io/en/latest/modules-datarecorder/, I was unsure what exactly each selectable variable would record. What would the data plotter be used for in terms of experiments?

Take for example the data recorder module. Perhaps you could extend it along the lines of:

"The data recorder module is used to record the data from an experiment run in JOAN (linked to Carla). Data available for capture include: driver interaction behaviour (e.g., steering wheel, braking, throttle), etc... The data recorder module enables you to export variables to a CSV document, and is essential if you plan to use the [other modules] or [purpose]." In addition to the other notes you have there.
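To make the suggested description concrete, a purely illustrative sketch of what such a CSV export might contain is below. The column names are invented for this review, not taken from JOAN; documenting the real ones per selectable variable is the point of my suggestion.

```python
import csv
import io

# Hypothetical rows a data-recorder export might contain; the column
# names (time_s, steering_angle_rad, throttle, brake) are invented for
# illustration, not JOAN's actual variable names.
rows = [
    {"time_s": 0.00, "steering_angle_rad": 0.00, "throttle": 0.0, "brake": 0.0},
    {"time_s": 0.02, "steering_angle_rad": 0.05, "throttle": 0.3, "brake": 0.0},
]

buffer = io.StringIO()  # stands in for the exported CSV file
writer = csv.DictWriter(buffer, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```

A table like this in the docs, mapping each checkbox in the recorder UI to its output column and units, would answer my question directly.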

5. Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?

I note that you provide some example use and manual tests here: https://joan.readthedocs.io/en/latest/example-usage-and-testing/

6. Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

7. Minor Assorted Issues


Again, I thank the authors for their hard work on an important package. I hope you find this review process constructive, and I hope I can use your package one day in the future.

niekbeckers commented 2 years ago

Hi @humanfactors,

Thanks for your comprehensive review, we very much appreciate it! We are working on your comments and will reply here on a rolling basis :-)

We've updated the statement of need in the readme.md and the documentation introduction (comments 1.1, 1.2), better indicating which gaps in Carla JOAN addresses.

OlgerSiebinga commented 2 years ago

Hi @humanfactors, As @niekbeckers already said, thanks for the many useful and constructive comments! By now, we have addressed most of them. I tried to address and explain the last three open checkboxes below. Let me know if you agree or have any other questions.

The Build Context

We agree that building Carla is a tedious process with a lot of warnings and conditions. The problem here is that we also don’t fully know to what extent these specific versions of dependencies really matter. The instructions to build Carla are heavily based on their own documentation, and instead of trying to also develop extra tools for building Carla we prefer to link users to their documentation.

Example of usage

I have updated the introduction page in the documentation to give more details on how we think JOAN can be best used. I’ve also added a link to a research paper that actually used JOAN. Is that what you had in mind?

Tests

The testing is a difficult part because we are trying to validate whether a human-in-the-loop experiment is working as intended, so comparing (human input) values to a pre-recorded CSV is not going to be of any help. What we are actually testing here boils down to three things:

  • Does the connection with Carla work?
  • Does the hardware input work?
  • Does JOAN’s underlying structure (with settings, news, and a bunch of separate Python processes) work?

If any of these things fail, the user will get an exception or clearly unexpected behavior (e.g., if the human input steering angles are larger than expected). So, we think comparing values is not really necessary to test whether JOAN works as expected.
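The three checks described here could be wrapped into a minimal smoke-test runner along the following lines. The check functions are placeholders standing in for whatever JOAN uses internally; this is a sketch of the idea, not JOAN's actual API.

```python
def smoke_test(connect_to_carla, read_hardware_input, start_core_processes):
    """Run each check and collect (name, error) pairs instead of stopping
    at the first failure.

    The three callables are hypothetical stand-ins for JOAN's real
    connection, hardware, and process-startup routines.
    """
    failures = []
    for name, check in [
        ("carla connection", connect_to_carla),
        ("hardware input", read_hardware_input),
        ("core processes", start_core_processes),
    ]:
        try:
            check()
        except Exception as exc:  # record the failure, keep checking the rest
            failures.append((name, repr(exc)))
    return failures
```

Even without value comparisons, a runner like this would let users verify all three points in one command and report exactly which layer failed.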

Thanks again for the extensive review of these topics!

humanfactors commented 2 years ago

Thanks for the update. I'm currently on leave and have limited reception, but will be able to finalise this next week.

Cheers.


humanfactors commented 2 years ago

A few quick follow-ups:

Otherwise changes look good!

niekbeckers commented 2 years ago

Hi @humanfactors, thanks for spotting these - we've converted all remaining Dropbox links to the 4TU data storage links and fixed the note overrun.

humanfactors commented 2 years ago

Thanks, all my concerns from there are resolved then :)