A deep-sea creature detector for the 2022 MATE Machine Learning Satellite Challenge.
Developed by Peyton Lee, Neha Nagvekar, and Cassandra Lam as part of the Underwater Remotely Operated Vehicles Team (UWROV) at the University of Washington.
Deepsea-Detector is built on MBARI's Monterey Bay Benthic Object Detector, which can also be found in FathomNet's Model Zoo. The model is trained on data from NOAA Ocean Exploration and FathomNet, with assistance from WoRMS for organism classification. All the images and associated annotations we used can be found in our Roboflow project.
Deepsea-Detector uses YOLOv5 for detections and Norfair for tracking.
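For reference, here is a minimal sketch of how YOLOv5 detections can be handed to a Norfair tracker. This is not the project's actual pipeline code; the weights path, video path, and thresholds are placeholder values.

```python
# Minimal YOLOv5 + Norfair sketch (illustrative only; paths and thresholds are placeholders).
import cv2
import numpy as np
import torch
from norfair import Detection, Tracker

model = torch.hub.load("ultralytics/yolov5", "custom", path="weights/best.pt")  # placeholder path

def euclidean(detection, tracked_object):
    # Distance between a new detection and a tracked object's estimated position.
    return np.linalg.norm(detection.points - tracked_object.estimate)

tracker = Tracker(distance_function=euclidean, distance_threshold=30)

video = cv2.VideoCapture("dive_video.mp4")  # placeholder input video
while True:
    ok, frame = video.read()
    if not ok:
        break
    results = model(frame[..., ::-1])  # BGR -> RGB for YOLOv5
    # Each row of results.xyxy[0] is [x1, y1, x2, y2, confidence, class].
    detections = [
        Detection(points=np.array([[(x1 + x2) / 2, (y1 + y2) / 2]]), label=int(cls))
        for x1, y1, x2, y2, conf, cls in results.xyxy[0].tolist()
        if conf >= 0.25
    ]
    tracked_objects = tracker.update(detections=detections)
video.release()
```

Tracking detections across frames like this is what lets the tool report when an organism first appears in view and when it leaves.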
You'll need Python 3.8 or higher for this project. We use Git LFS to store some of our larger files, including the weights for our YOLOv5 model.
Open a command terminal and run the following lines:
git clone https://github.com/ShrimpCryptid/deepsea-detector.git
cd deepsea-detector
git lfs install; git lfs fetch; git lfs pull
pip install -r requirements.txt
This clones the project, pulls the Git LFS files (including the YOLOv5 model weights), and installs the Python dependencies.
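If the model weights seem to be missing, Git LFS may have left small pointer stubs instead of the real files. An optional sanity check along these lines can help; the weights path below is a placeholder, so point it at the actual weights file in the repository.

```python
# Optional post-install sanity check (sketch; the weights path is a placeholder).
from importlib.metadata import version
from pathlib import Path

weights = Path("models/yolov5_weights.pt")  # placeholder: use the repo's actual weights file
if weights.exists() and weights.stat().st_size < 1_000_000:
    print("Warning: weights file is very small; Git LFS may not have pulled it.")

# Confirm the core dependencies are importable and report their versions.
import torch, norfair  # noqa: F401
print("torch", version("torch"), "| norfair", version("norfair"))
```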
Run the following command to boot up the UI:
python src/deepsea_detector_ui.py
This launches the Deepsea-Detector UI! Follow the prompts to set up your output video and comma-separated value (CSV) files.
Deepsea-Detector outputs a CSV file that records the first timestamp/frame at which each organism appeared, its predicted classification, and the timestamp/frame at which it left view. It also outputs an MP4 video file showing localizations (and, optionally, classifications).
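If you want to analyze that CSV programmatically, a snippet like the following is one possible starting point. The column names used here are placeholders; check the header of your actual output file.

```python
# Sketch: summarize the detection CSV with pandas.
# Column names ("classification", "first_frame", "last_frame") are placeholders.
import pandas as pd

df = pd.read_csv("detections.csv")  # placeholder output path
df["frames_in_view"] = df["last_frame"] - df["first_frame"]
print(df.groupby("classification")["frames_in_view"].describe())
```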
Our UI is a wrapper around the Deepsea-Detector CLI. Running the CLI directly lets you process multiple videos in sequence and define an output directory for the resulting videos. To see the full list of options and arguments, run:
python src/detection.py --help
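For example, you could script the CLI over a folder of videos along these lines. This is only a sketch: how the input video and outputs are passed is an assumption, so substitute the real arguments from `--help`.

```python
# Sketch: batch-process a folder of videos by invoking the CLI.
# Passing the video path as a bare argument is an assumption; use the real
# arguments listed by `python src/detection.py --help`.
import subprocess
from pathlib import Path

for video in sorted(Path("videos").glob("*.mp4")):
    subprocess.run(["python", "src/detection.py", str(video)], check=True)
```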
You can also see an example in our Google Colab notebook, which gives you access to GPU resources for free!
Dataset: Roboflow project\
Model Training: Google Colab Python Notebook\
Additional Documentation: In-depth Project Explanation
We would like to thank the following people and organizations: