Closed KennyTC closed 3 months ago
1) The author says here https://aws.amazon.com/jp/blogs/machine-learning/hosting-yolov8-pytorch-model-on-amazon-sagemaker-endpoints/ that "The model weights yolov8l.pt file must be outside the code/ directory and the main inference python script inference.py"
2) But according to the code:

```python
from ultralytics import YOLO
import os, sagemaker, subprocess, boto3
from datetime import datetime

## Choose a model:
model_name = 'yolov8l.pt'
YOLO(model_name)

os.system(f'mv {model_name} code/.')

bashCommand = "tar -cpzf model.tar.gz code/"
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
```
It seems that the model lies inside the `code` folder together with `inference.py` and `requirements.txt`:

```
code/
  inference.py
  requirements.txt
```

Is there any conflict?