ocean-data-factory-sweden / kso

Notebooks to upload/download marine footage, connect to a citizen science project, train machine learning models and publish marine biological observations.
GNU General Public License v3.0

Issue in tut 6 with running the detection function #324

Closed Bergylta closed 7 months ago

Bergylta commented 7 months ago

🐛 Bug

Running the detection function in Tutorial 6 fails: after the model is fused and inference starts, a `KeyError: 3835` is raised inside Ultralytics' `Results.verbose()` while building the per-class log string.

To Reproduce (REQUIRED)

Input: Model: GU_crabs_gobies_wrasses1, conf: 0.5

mlp.detect_yolo(
    source=pp.movies_paths,
    conf_thres=conf_thres.value,
    artifact_dir=artifact_dir,
    save_output=True,
    project=mlp.project_name,
    name=exp_name.value,
    model=model.value,
)
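Before re-running, it may help to confirm that every class id the checkpoint can predict is covered by its `names` map. A minimal, dependency-free sketch of that check (the `names` dict below is a hypothetical stand-in for the real `model.names`):

```python
def unknown_class_ids(predicted_ids, names):
    """Return predicted class ids that the names map cannot label."""
    return sorted(set(predicted_ids) - set(names))

# Hypothetical class map for a 3-class crabs/gobies/wrasses model.
names = {0: "crab", 1: "goby", 2: "wrasse"}

print(unknown_class_ids([0, 1, 3835], names))  # → [3835]
```

Any id returned here would trigger the same `KeyError` seen below when Ultralytics formats its log string.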

Output:

wandb version 0.16.0 is available! To upgrade, please run: $ pip install wandb --upgrade
Tracking run with wandb version 0.15.11
Run data is saved locally in /mimer/NOBACKUP/groups/snic2021-6-9/wandb/run-20231129_172501-ot0qhsnp
Syncing run [exalted-shadow-614](https://wandb.ai/koster/model-evaluations/runs/ot0qhsnp) to [Weights & Biases](https://wandb.ai/koster/model-evaluations) ([docs](https://wandb.me/run))
View project at https://wandb.ai/koster/model-evaluations
View run at https://wandb.ai/koster/model-evaluations/runs/ot0qhsnp
Fusing layers... 
Model summary: 309 layers, 21068529 parameters, 0 gradients

WARNING ⚠️ inference results will accumulate in RAM unless `stream=True` is passed, causing potential out-of-memory
errors for large sources or long-running streams and videos. See https://docs.ultralytics.com/modes/predict/ for help.

Example:
    results = model(source=..., stream=True)  # generator of Results objects
    for r in results:
        boxes = r.boxes  # Boxes object for bbox outputs
        masks = r.masks  # Masks object for segment masks outputs
        probs = r.probs  # Class probabilities for classification outputs

WARNING ⚠️ NMS time limit 0.550s exceeded
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[12], line 1
----> 1 mlp.detect_yolo(
      2     source=pp.movies_paths,
      3     conf_thres=conf_thres.value,
      4     artifact_dir=artifact_dir,
      5     save_output=True,
      6     project=mlp.project_name,
      7     name=exp_name.value,
      8     model=model.value,
      9 )

File /usr/src/app/kso-dev/kso_utils/project.py:1728, in MLProjectProcessor.detect_yolo(self, project, name, source, conf_thres, artifact_dir, model, img_size, save_output, test)
   1726     project = Path(self.output_path, project)
   1727 self.eval_dir = increment_path(Path(project) / name, exist_ok=False)
-> 1728 model.predict(
   1729     project=project,
   1730     name=name,
   1731     source=source,
   1732     conf=conf_thres,
   1733     save_txt=True,
   1734     save_conf=True,
   1735     save=save_output,
   1736     imgsz=img_size,
   1737 )
   1738 self.save_detections(conf_thres, model.ckpt_path, self.eval_dir)

File /usr/local/lib/python3.8/dist-packages/ultralytics/engine/model.py:242, in Model.predict(self, source, stream, predictor, **kwargs)
    240 if prompts and hasattr(self.predictor, 'set_prompts'):  # for SAM-type models
    241     self.predictor.set_prompts(prompts)
--> 242 return self.predictor.predict_cli(source=source) if is_cli else self.predictor(source=source, stream=stream)

File /usr/local/lib/python3.8/dist-packages/ultralytics/engine/predictor.py:196, in BasePredictor.__call__(self, source, model, stream, *args, **kwargs)
    194     return self.stream_inference(source, model, *args, **kwargs)
    195 else:
--> 196     return list(self.stream_inference(source, model, *args, **kwargs))

File /usr/local/lib/python3.8/dist-packages/torch/utils/_contextlib.py:35, in _wrap_generator.<locals>.generator_context(*args, **kwargs)
     32 try:
     33     # Issuing `None` to a generator fires it up
     34     with ctx_factory():
---> 35         response = gen.send(None)
     37     while True:
     38         try:
     39             # Forward the response to our caller and get its next request

File /usr/local/lib/python3.8/dist-packages/ultralytics/engine/predictor.py:278, in BasePredictor.stream_inference(self, source, model, *args, **kwargs)
    275 p = Path(p)
    277 if self.args.verbose or self.args.save or self.args.save_txt or self.args.show:
--> 278     s += self.write_results(i, self.results, (p, im, im0))
    279 if self.args.save or self.args.save_txt:
    280     self.results[i].save_dir = self.save_dir.__str__()

File /usr/local/lib/python3.8/dist-packages/ultralytics/engine/predictor.py:166, in BasePredictor.write_results(self, idx, results, batch)
    164 log_string += '%gx%g ' % im.shape[2:]  # print string
    165 result = results[idx]
--> 166 log_string += result.verbose()
    168 if self.args.save or self.args.show:  # Add bbox to image
    169     plot_args = {
    170         'line_width': self.args.line_width,
    171         'boxes': self.args.boxes,
    172         'conf': self.args.show_conf,
    173         'labels': self.args.show_labels}

File /usr/local/lib/python3.8/dist-packages/ultralytics/engine/results.py:275, in Results.verbose(self)
    273     for c in boxes.cls.unique():
    274         n = (boxes.cls == c).sum()  # detections per class
--> 275         log_string += f"{n} {self.names[int(c)]}{'s' * (n > 1)}, "
    276 return log_string

KeyError: 3835
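The failing line is `self.names[int(c)]`: the predictor emitted class index 3835, which is not a key in the model's `names` dict (plausibly a mismatched or corrupted checkpoint/label mapping). A small reproduction of the lookup, with a guarded fallback instead of a hard `KeyError` (class ids and names here are hypothetical, not from the actual model):

```python
# Reproduces the failing lookup in Results.verbose(): a predicted class
# index missing from `names` raises KeyError. The guard below falls back
# to a placeholder label instead of crashing.
names = {0: "crab", 1: "goby", 2: "wrasse"}  # model class-id -> name map

def verbose_line(class_counts, names):
    """Build a log string like Ultralytics does, but tolerate unknown ids."""
    parts = []
    for cls_id, n in class_counts.items():
        label = names.get(cls_id, f"class_{cls_id}")  # no KeyError on 3835
        parts.append(f"{n} {label}{'s' * (n > 1)}")
    return ", ".join(parts)

print(verbose_line({1: 2, 3835: 1}, names))  # → "2 gobys, 1 class_3835"
```

This is only a sketch of the failure mode; the real fix is ensuring the weights passed as `model.value` carry a `names` map that matches the classes the model was trained on.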

Expected behavior

Detection runs to completion on the selected movies and the results are saved via `save_detections`, without raising a `KeyError`.

Environment

From the logs and traceback: Python 3.8 (`/usr/local/lib/python3.8/dist-packages`), Ultralytics engine API, PyTorch, wandb 0.15.11 (0.16.0 available); run from `/usr/src/app/kso-dev` on the cluster path `/mimer/NOBACKUP/groups/snic2021-6-9`.
