Hi @chandan-wiai , could you try testing the main branch of yolort? We cleaned up the implementation of NestedTensor at https://github.com/zhiqwang/yolov5-rt-stack/pull/482 .
I actually installed yolort with: pip install yolort. Now that the main branch has changed, how do I get the latest changes into my installed package? Is there a new version that I should update yolort to?
If I do this, would it overwrite the existing version with the updated main branch?
Or install from source:
# clone yolort repository locally
git clone https://github.com/zhiqwang/yolov5-rt-stack.git
cd yolov5-rt-stack
# install in editable mode
pip install -e .
Yep, and you can pip uninstall yolort first.
@zhiqwang I tried with the main branch. But it still gives the same error.
Thanks @chandan-wiai , got it. I guess that's caused by the dynamic shapes in YOLOTransform; maybe we should remove this module from the model builder. We provide a method to load the vanilla YOLO module as follows:
from yolort.models import YOLO

model = YOLO.load_from_yolov5(
    checkpoint_path,
    score_thresh=score_thresh,
    nms_thresh=nms_thresh,
    version="r6.0",
)
model = model.eval()
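For context, here is a minimal inference sketch with a model loaded this way. The input size, the [0, 1] scaling, and the output structure are my assumptions rather than something stated in this thread; the point is that without YOLOTransform, the caller is responsible for resizing and normalizing the images.

import torch

# hypothetical input: a batch of one 640x640 RGB image, already resized and
# scaled to [0, 1] -- preprocessing that YOLOTransform would normally handle
images = torch.rand(1, 3, 640, 640)

with torch.no_grad():
    detections = model(images)  # expected: per-image detections (boxes, scores, labels)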
Does this mean I can take my checkpoint trained with yolort.models.YOLOv5 and load it as shown above, and that model object won't have a transform module?
Hi @chandan-wiai , I guess not in this scenario. There are no parameters or buffers in the YOLOTransform module, so it should be easy in theory. Maybe we should build the model as follows with this API:
import torch
from yolort.models.yolo import yolov5_darknet_pan_s_r60  # aka yolov5s

# do not pass pretrained=True, i.e. do not load the default weights
model = yolov5_darknet_pan_s_r60()
model.load_state_dict(torch.load('checkpoint_from_yolort.pt'))
model.eval()
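Tying this back to the original problem: once the model is built without YOLOTransform, it should be possible to hand it to NNI's ModelSpeedup with a fixed-shape dummy input, which is what the dynamic shapes inside YOLOTransform were suspected of breaking. A hedged sketch, assuming the import path matches your NNI version and that the mask file was exported earlier by an NNI pruner (file names and input size are placeholders):

import torch
from nni.compression.pytorch import ModelSpeedup
from yolort.models.yolo import yolov5_darknet_pan_s_r60  # aka yolov5s

model = yolov5_darknet_pan_s_r60()
model.load_state_dict(torch.load('checkpoint_from_yolort.pt'))
model.eval()

# masks produced earlier by the NNI pruner (placeholder path)
masks = torch.load('masks.pth')

# fixed-shape dummy input; 640x640 is an assumption, use your own input size
dummy_input = torch.rand(1, 3, 640, 640)

ModelSpeedup(model, dummy_input, masks).speedup_model()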
We can also discuss this at https://github.com/zhiqwang/yolov5-rt-stack/issues/484 , so as not to disturb people with questions not related to NNI.
@chandan-wiai @zhiqwang Thanks for raising this issue and discussing it in such detail here. Has the problem finally been resolved? Could you close the issue? Looking forward to your reply.
Yes @Lijiaoa, this issue can be closed here.
Describe the bug: After pruning, I am trying to speed up the model using ModelSpeedup(model, dummy_input, masks).speedup_model(). The model class I am using is from yolort and has a transform attribute in it. It throws the following error:
To me, it looks like NNI can't support 'Transform'-type layers. First of all, is this observation correct? If so, is there a way I can bypass this error? Thanks for the help.
Reproduce the problem
How to reproduce:
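A hedged reproduction sketch of the failure described above. The pruner choice, sparsity config, and input size are my assumptions; the thread only states that a pruned yolort model containing a transform attribute was passed to ModelSpeedup.

import torch
from nni.compression.pytorch import ModelSpeedup
from nni.compression.pytorch.pruning import L1NormPruner
from yolort.models import yolov5s

# yolort model builder: this model still contains the YOLOTransform module
model = yolov5s(pretrained=True)
model.eval()

# prune all Conv2d layers to 50% sparsity (illustrative config)
config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]
pruner = L1NormPruner(model, config_list)
_, masks = pruner.compress()
pruner._unwrap_model()

dummy_input = torch.rand(1, 3, 640, 640)

# this call raises; per the discussion above, the dynamic shapes inside
# model.transform (YOLOTransform) are the suspected cause
ModelSpeedup(model, dummy_input, masks).speedup_model()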