huggingface / blog

Public repo for HF blog posts
https://hf.co/blog

Fine-Tune a Semantic Segmentation Model with a Custom Dataset colab working fine? #531

Open nunezmatias opened 2 years ago

nunezmatias commented 2 years ago

related to this colab https://colab.research.google.com/drive/1BImTyBjW3KtvHGVcjGpYYFZdRGXzM3-j?usp=sharing&hl=en&authuser=1#scrollTo=7Up9QNqOWtSD which is the same that is in the blog https://huggingface.co/blog/fine-tune-segformer

Hi. I have been trying to reproduce the example in Colab, but I have not been able to yet. If I use the option

Use a dataset from the Hub

and change to hf_dataset_identifier = "segments/sidewalk-semantic"

(the default address does not exist anymore), then run all the cells as they are. When running the training cell I had to add `import numpy as np` first, otherwise there is an error related to `np`. After starting the training, the optimization has been running for hours and hours (with GPUs, high-RAM option)
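For context, the `NameError` disappears once NumPy is imported before the training cell. A minimal sketch of the kind of call the notebook's metrics code makes (the shapes and values below are illustrative, not taken from the notebook):

```python
# The training cell uses the "np" alias without importing it, raising
# "NameError: name 'np' is not defined". Adding this import first fixes it:
import numpy as np

# Dummy logits shaped (batch, num_classes, height, width)
logits = np.zeros((1, 3, 4, 4))

# Per-pixel predicted class ids, the typical argmax step in compute_metrics
pred = logits.argmax(axis=1)
```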

and it seems it would take 69 hours? (screenshot attached)

There is also a NaN issue; not sure if it is related. (screenshot attached)
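In case it helps with debugging: a common source of NaN in per-category segmentation metrics is a class with zero union in a batch (it appears in neither the prediction nor the ground truth), so its IoU is 0/0. A minimal sketch with made-up numbers:

```python
import numpy as np

# Per-category IoU is intersection / union; a class absent from both the
# prediction and the label has union 0, giving 0/0 = NaN.
# All values below are invented for illustration.
intersection = np.array([5.0, 0.0, 3.0])
union = np.array([10.0, 0.0, 6.0])

with np.errstate(invalid="ignore"):   # silence the 0/0 runtime warning
    iou = intersection / union        # [0.5, nan, 0.5]

# np.mean propagates the NaN; np.nanmean skips the absent class
mean_iou = np.nanmean(iou)
```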

I also could not use the other method, "Create your own dataset": I had errors when bringing the labelled pictures back from Segments.ai. I will reproduce them and post them here.

segments-tobias commented 1 year ago

Hi @nunezmatias, I'm just seeing your issue now (best to tag me or @NielsRogge in the future). Were you able to resolve your issues?