UBC-MDS / data-analysis-review-2023


Submission: Group 19: Speed Dating Analysis #20

Open · monazhu opened 7 months ago

monazhu commented 7 months ago

Submitting authors: @mishelly-h, @rorywhite200, @wenyunie, @monazhu

Repository: https://github.com/UBC-MDS/speed_dating_analysis Report link: https://ubc-mds.github.io/speed_dating_analysis/output/analysis_report.html Abstract/executive summary: This research delves into the dynamics of self-perceived attractiveness in the context of dating. We explore whether individuals accurately gauge their own appeal compared to external judgments. Analyzing data from speed dating studies, the findings reveal a systematic tendency for individuals to overestimate their attractiveness. While a significant correlation exists between self-ratings and others’ ratings, this research underscores the interplay between self-perception and external judgments in the realm of dating. The implications range from improved self-esteem for those perceiving themselves as more attractive to potential challenges in social interactions. Future research could investigate the influence of contemporary factors like social media on self-perception and exploring the multidimensional aspects of attractiveness.

Editor: @ttimbers

Reviewers: Celeste Zhao, Jing Wen, Merete Lutz, Orix Au Yeung

meretelutz commented 7 months ago

Data analysis review checklist

Reviewer: @meretelutz

Conflict of interest

Code of Conduct

General checks

Documentation

Code quality

Reproducibility

Analysis report

Estimated hours spent reviewing: 1.5

Review Comments:


  1. License - You missed a change when copying and pasting in the example license. Under attribution it says 'Copyright © Tiffany A. Timbers, Trevor Campbell, Melissa Lee' when it should list your team members.

  2. The instructions under 'Usage' in your README are exhaustive, but a little hard to follow. If the container method is the preferred one, it should come first in that section. Consider moving the troubleshooting notes into a separate .md file and keeping the instructions in the README simple.

  3. I think you could justify your methodology more, e.g. explain why you chose a t-test and why you chose a 95% confidence interval. You've declared your assumptions well, especially regarding the observations being correlated (not i.i.d.).

  4. I was unable to run your analysis on my Mac with an M2 chip: composing the Docker image failed. The exact error message is below.

     `analysis-env  The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested`

  5. Overall, great job guys! This was a super interesting project, really fun to learn about!
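Regarding the platform error in point 4: assuming the image is only published for linux/amd64, one common workaround on Apple-silicon hosts is to pin the platform explicitly so Docker Desktop runs the image under emulation. A sketch of what that might look like in the project's docker-compose.yml (the service name `analysis-env` is taken from the error message; the rest of the file is assumed):

```yaml
# Sketch: pin the service to the amd64 image so Docker Desktop on an
# arm64 (Apple silicon) host runs it under QEMU/Rosetta emulation
# instead of failing with a platform mismatch.
services:
  analysis-env:
    platform: linux/amd64
```

Equivalently, for a plain `docker run`, passing `--platform linux/amd64` should have the same effect, at the cost of slower emulated execution.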

Attribution

This was derived from the JOSE review checklist and the ROpenSci review checklist.

Jing-19 commented 7 months ago

Data analysis review checklist

Reviewer: @Jing-19

Conflict of interest

Code of Conduct

General checks

Code quality

Reproducibility

Analysis report

Estimated hours spent reviewing: 2

Review Comments:

Overall, the report is very well written, with plenty of literature cited to make your points sound and easy to follow. I was able to reproduce your analysis locally with very few commands! Here are some minor changes I suggest:

  1. README References: The About section currently contains only two in-text citations. It might be helpful to update the reference list to fully reflect the resources used in README specifically.

  2. Running Analyses on Your Local Environment under Usage Section: I think the information in the Usage section could be clearer. Perhaps separating the steps for setting up the environment from the notes (similar to what you have for Docker setup), or using different text styles for each, could improve readability.

  3. Your report has excellent flow! However, I would recommend adding some exploratory data analysis (EDA) before the analysis section (e.g. any outliers and data preprocessing) and explaining how you decided on the methodology. Also, stating the hypothesis at the beginning of the analysis would be helpful.
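On point 3, a minimal sketch of what stating the hypothesis and reporting the test might look like. The project itself is in R; this illustration is in Python with made-up ratings (not the project's data), and it assumes the analysis boils down to a paired comparison of self-ratings against partners' ratings:

```python
import numpy as np
from scipy import stats

# Hypothetical ratings on a 1-10 scale (illustrative only, not the study's data):
# H0: mean(self - other) == 0; HA: mean(self - other) != 0
self_ratings  = np.array([8, 7, 9, 6, 8, 7, 9, 8], dtype=float)
other_ratings = np.array([6, 7, 7, 5, 7, 6, 8, 7], dtype=float)

diff = self_ratings - other_ratings
n = len(diff)
mean_diff = diff.mean()
se = diff.std(ddof=1) / np.sqrt(n)

# Paired t-statistic and two-sided p-value with n-1 degrees of freedom
t_stat = mean_diff / se
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 1)

# 95% confidence interval for the mean difference
t_crit = stats.t.ppf(0.975, df=n - 1)
ci = (mean_diff - t_crit * se, mean_diff + t_crit * se)

print(f"t = {t_stat:.3f}, p = {p_value:.4f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

Framing the report this way (hypothesis first, then test statistic, p-value, and interval) makes the choice of test and confidence level easy to justify in one place.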

Attribution

This was derived from the JOSE review checklist and the ROpenSci review checklist.

SoloSynth1 commented 7 months ago

Data analysis review checklist

Reviewer: @SoloSynth1

Conflict of interest

Code of Conduct

General checks

Documentation

Code quality

Reproducibility

Analysis report

Estimated hours spent reviewing: 2

Review Comments:


Overall, a very clear and concise analysis. The R code is well documented and well tested. I could reproduce the pipeline and the final HTML report without any problem.

However, there are some minor issues that I have spotted so far:

Attribution

This was derived from the JOSE review checklist and the ROpenSci review checklist.

celestezhao commented 7 months ago

Data analysis review checklist

Reviewer: @celestezhao

Conflict of interest

Code of Conduct

General checks

Documentation

Code quality

Reproducibility

Analysis report

Estimated hours spent reviewing: 2.5

Review Comments:

This report is well-organized with clear conclusions, and the project's file structure makes the results easy to understand and reproduce. The conclusion is very interesting: "the findings reveal a systematic tendency for individuals to overestimate their attractiveness", which immediately caught my eye!

Here are my minor suggestions:

Overall I really like this project, well done guys!

Attribution

This was derived from the JOSE review checklist and the ROpenSci review checklist.