pyOpenSci / software-submission

Submit your package for review by pyOpenSci here! If you have questions please post them here: https://pyopensci.discourse.group/

Presubmission Inquiry: Visualization Tool: Sevivi, a python package and CLI tool to generate videos of sensor data graphs synchronized to a video of the sensor movement #47

Closed: enra64 closed this issue 2 years ago

enra64 commented 2 years ago

Submitting Author: Arne Herdick (@enra64)
Package Name: sevivi
One-Line Description of Package: sevivi is a Python package and command-line tool to generate videos of sensor data graphs synchronized to a video of the sensor movement.
Repository Link (if existing): https://github.com/hpi-ch/sevivi/


Description

Sevivi is designed to render plots of sensor data next to a video that was recorded at the same time, synchronizing the sensor data precisely to the video. It allows you to investigate why certain patterns occur in your sensor data by examining the exact moment in the video.
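The core rendering idea, plotting sensor data into an image that sits beside each video frame, can be sketched generically. This is not sevivi's actual code, just an illustrative composition step: render a matplotlib figure to an RGB array and stack it next to a (here, dummy) video frame. All names are hypothetical.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is needed
import matplotlib.pyplot as plt


def plot_to_frame(t, y, width, height, dpi=100):
    """Render a line plot of y(t) into an RGB uint8 array of roughly width x height."""
    fig, ax = plt.subplots(figsize=(width / dpi, height / dpi), dpi=dpi)
    ax.plot(t, y)
    fig.canvas.draw()
    # buffer_rgba gives (H, W, 4); drop the alpha channel.
    frame = np.asarray(fig.canvas.buffer_rgba())[..., :3].copy()
    plt.close(fig)
    return frame


t = np.linspace(0, 1, 200)
graph = plot_to_frame(t, np.sin(2 * np.pi * 5 * t), width=320, height=240)

# Stand-in for a decoded video frame; its height must match the plot's.
video_frame = np.zeros((graph.shape[0], 320, 3), dtype=np.uint8)

# One combined output frame: video on the left, sensor plot on the right.
combined = np.hstack([video_frame, graph])
print(combined.shape)
```

A real tool would repeat this per frame, advancing a cursor through the (synchronized) sensor data before writing the combined frames out with a video encoder.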


Scope

Sevivi renders plots of given data next to a given video.

The target audience is researchers working with motion data, e.g., predicting squat intensity from acceleration data. When these researchers have taken videos of their trials, they might want to see exactly what produced a certain pattern, helping them differentiate between noise and signal. Sevivi makes this easier by rendering the original video and the sensor data plots side by side, precisely synchronized. Synchronization can be done manually, by using an IMU on the camera (e.g., https://github.com/DavidGillsjo/VideoIMUCapture-Android/), or by using skeleton data from tracking software (we tested with an Azure Kinect).
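The IMU-based synchronization mentioned above can be illustrated generically: if both the camera's IMU and the body-worn sensor record the same motion, the time offset between the two streams can be estimated from the peak of their cross-correlation. This is a minimal sketch of that general technique, not sevivi's actual API; it assumes both signals are already resampled to a common rate, and all names are illustrative.

```python
import numpy as np


def estimate_offset(camera_imu: np.ndarray, sensor: np.ndarray, rate_hz: float) -> float:
    """Estimate the time offset in seconds between two 1-D motion signals
    sampled at the same rate, via the peak of their cross-correlation.
    A positive result means `camera_imu` is delayed relative to `sensor`."""
    a = (camera_imu - camera_imu.mean()) / camera_imu.std()
    b = (sensor - sensor.mean()) / sensor.std()
    corr = np.correlate(a, b, mode="full")
    lag = int(corr.argmax()) - (len(b) - 1)  # lag of the correlation peak, in samples
    return lag / rate_hz


# Toy example: the same burst of "motion", with the camera stream
# delayed by 50 samples at 100 Hz (i.e., 0.5 s).
rng = np.random.default_rng(0)
base = rng.normal(size=500)
sensor_signal = base[:500]
camera_signal = np.concatenate([np.zeros(50), base])[:500]

offset = estimate_offset(camera_signal, sensor_signal, rate_hz=100.0)
print(offset)  # 0.5
```

Manual synchronization amounts to supplying this offset by hand; skeleton-based synchronization replaces the camera IMU signal with motion derived from tracked joints.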

Our research indicates that no similar Python packages or other programs exist.

P.S. Have feedback/comments about our review process? Leave a comment here.

NickleDave commented 2 years ago

Hi @enra64 welcome to pyOpenSci!

Thank you for your detailed pre-submission inquiry.

I need to check in with other editors about where we are at with software reviews, but I did want to make sure you knew we saw this.

It's about to be a holiday where I am in the US, but we will get back to you by the end of next week.

NickleDave commented 2 years ago

Hi again @enra64 -- sorry for the delay in getting back to you.

This does appear to be within scope as a data visualization tool.

It's not clear to me who we would reach out to for reviews though.

Would you be able to point me to some researchers working in related areas, or developing related tools? I realize you did not find any similar programs for your specific tool, but if you could give me an idea of who the "target audience" is, that would really help, e.g., if you can think of 2 or 3 researchers in your area.

Thank you

arianesasso commented 2 years ago

Dear @NickleDave, I could volunteer to help review this package after the holidays :). Although I am not involved in its creation, I belong to the same organization as @enra64 so there could be some bias in that.

Another reviewer suggestion could be @edgarriba: https://github.com/edgarriba. Cheers!

NickleDave commented 2 years ago

Thank you @arianesasso that is helpful.

I think you are right that we should avoid anything that could appear biased if possible, but you have given me a much better idea of where to search for reviewers.

And @edgarriba if you are open to reviewing for us again, your expertise would be very welcome here.

lwasser commented 2 years ago

hi everyone - just checking in here!! This looks like a presubmission inquiry that has been approved for review. @enra64 can you please submit a NEW issue that uses the actual review template rather than the presubmission one? We will then update the reviewer information and so on there, as we do for other reviews. We can always link to this issue for reference in the actual review issue if need be, but this issue is missing the headers we use to keep tabs on the review process. Many thanks!

edgarriba commented 2 years ago

Hi! What are the timelines here?

lwasser commented 2 years ago

hi @edgarriba we are hitting the holiday season here in the US, so pyOpenSci will be taking a holiday break through the new year. Here is what I suggest before I sign off for the next week! First, please open a new submission for review if your package is ready.

Please do this anytime, but know that we won't be able to look at it until after January 3. Once you do that, I will close this issue and we can begin looking for reviewers. Until then, have a wonderful new year and we will check back in here after January 3. I am signing off now, so I won't see any additional messages until then either.

NickleDave commented 2 years ago

Hi @edgarriba thank you for your quick reply.

In our guide we ask for a three-week turnaround, but with everyone's schedules extra taxed right now I would say 1-2 months.

I am actually waiting to hear more from the author of the package, @enra64, about the intended audience -- I think that might have gotten lost in the conversation above.

Let me hear more from @enra64, do my own homework based on the related tools from @arianesasso, and then discuss with @lwasser after the holidays.

So, long story short: we are not asking you to review right now, but if we did, we'd hope to have the review back 1-2 months after the initial submission.

edgarriba commented 2 years ago

Perfect guys, no prob from my side

enra64 commented 2 years ago

Thanks everyone for checking this out. Hopefully my colleague will take over creating the review issue soon. Regarding the related researchers: I think the following papers from the exercise recognition/exercise intensity recognition space might be interesting:

Another area of interest for this package might be gait analysis, even though this could be difficult due to moving subjects.

NickleDave commented 2 years ago

Perfect, thank you @enra64 -- that helps me understand the research context much better.

Please do go ahead with creating the submission, and we will move forward on a review.

I will go ahead and close this.