CaptainEven/MCMOT

Real time one-stage multi-class & multi-object tracking based on anchor-free detection and ReID

Enhancement: Support for Additional Object Classes in MCMOT Tracking #113

Open · yihong1120 opened this issue 9 months ago

Dear MCMOT Contributors,

Firstly, I would like to extend my gratitude for your exceptional work on the MCMOT project. It's an impressive extension of FairMOT and has been instrumental in advancing multi-class multi-object tracking.

Context: I have been utilising the MCMOT system for a project that requires tracking a diverse set of objects beyond the current C5 and VisDrone datasets. While the current implementation superbly supports the predefined object classes, there is a growing need for a more flexible system that can adapt to new object classes without significant manual intervention.

Issue: The current architecture and pre-trained models are optimised for the predefined classes in C5 and VisDrone datasets. However, for users who wish to track additional or different object classes, the process of integrating new classes is not straightforward and lacks documentation.

Suggested Enhancement: It would be highly beneficial for the community if the project could include:

  1. A clear guide on how to train the model on new object classes, including data preparation, labelling, and training procedures.
  2. A more modular approach in the codebase that facilitates adding new object classes without altering the core functionality (a rough sketch of what this could look like follows this list).
  3. If possible, an update to the pre-trained models to support a wider array of object classes, or a tool to assist users in creating their own pre-trained models for custom object classes.
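
To make items 1 and 2 more concrete, here is a minimal Python sketch of one possible direction: class names are read from a small config file rather than from class-id dictionaries hardcoded in the source, and annotations are written out per frame in a FairMOT-style label format (`class_id track_id cx cy w h`, normalised to the image size), which I believe is close to what MCMOT expects. Everything here (`CustomClassRegistry`, `write_mcmot_label`, the JSON layout, the example paths) is hypothetical and not part of the existing codebase; the exact label format would need to be confirmed against the project's label-generation scripts.

```python
# Hypothetical sketch only: a data-driven class registry plus a FairMOT-style
# label writer. None of these names exist in the current MCMOT codebase.

import json
import os
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class TrackedBox:
    """One annotated object: class name, track id, and a pixel-space box (x1, y1, x2, y2)."""
    cls_name: str
    track_id: int
    bbox: Tuple[float, float, float, float]


class CustomClassRegistry:
    """Maps user-defined class names to contiguous integer ids (and back),
    so new classes can be added by editing a config file instead of code."""

    def __init__(self, class_names: List[str]):
        self.cls2id: Dict[str, int] = {name: i for i, name in enumerate(class_names)}
        self.id2cls: Dict[int, str] = {i: name for name, i in self.cls2id.items()}

    @classmethod
    def from_json(cls, path: str) -> "CustomClassRegistry":
        # Expected file content (assumed layout): {"classes": ["car", "person", ...]}
        with open(path, "r") as f:
            return cls(json.load(f)["classes"])

    @property
    def num_classes(self) -> int:
        return len(self.cls2id)


def write_mcmot_label(boxes: List[TrackedBox],
                      img_w: int,
                      img_h: int,
                      registry: CustomClassRegistry,
                      out_path: str) -> None:
    """Write one label .txt in the FairMOT-style format assumed here:
    'class_id track_id cx cy w h' per line, with the box normalised to [0, 1]."""
    lines = []
    for box in boxes:
        x1, y1, x2, y2 = box.bbox
        cx = (x1 + x2) / 2.0 / img_w
        cy = (y1 + y2) / 2.0 / img_h
        w = (x2 - x1) / img_w
        h = (y2 - y1) / img_h
        cls_id = registry.cls2id[box.cls_name]
        lines.append(f"{cls_id} {box.track_id} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}")
    out_dir = os.path.dirname(out_path)
    if out_dir:
        os.makedirs(out_dir, exist_ok=True)
    with open(out_path, "w") as f:
        f.write("\n".join(lines) + "\n")


if __name__ == "__main__":
    # Example: two objects from a hypothetical retail scene on a 1920x1080 frame.
    registry = CustomClassRegistry(["shopper", "cart"])
    boxes = [
        TrackedBox("shopper", track_id=1, bbox=(100, 200, 260, 640)),
        TrackedBox("cart", track_id=2, bbox=(400, 500, 700, 900)),
    ]
    write_mcmot_label(boxes, img_w=1920, img_h=1080, registry=registry,
                      out_path="labels_with_ids/seq01/000001.txt")
```

With something along these lines, supporting a new domain would reduce to editing the class list and re-running label generation, which is roughly what the documentation requested in item 1 would walk users through.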

Potential Impact: Implementing these enhancements could significantly broaden the applicability of MCMOT for various domains, such as retail analytics, wildlife monitoring, and other areas where object classes differ from those in road traffic or aerial surveillance scenarios.

Thank you for considering this suggestion. I look forward to your thoughts and would be happy to contribute to the development of these features if needed.

Best regards, yihong1120