Install the YOLOv8 package using pip:
pip install ultralytics
Start Jupyter Notebook while your environment is activated:
jupyter notebook
In the new notebook, write your YOLOv8 code. Below is an example to get you started:
# Import YOLO from the ultralytics package
from ultralytics import YOLO

# Load a pretrained YOLOv8 model (you can also train your own model)
model = YOLO('yolov8n.pt')  # Swap 'yolov8n.pt' for larger variants like 'yolov8s.pt', 'yolov8m.pt', etc.

# Path to the image to run inference on
img = 'path/to/your/image.jpg'  # Replace with your image path

# Perform inference; model() returns a list of Results objects, one per image
results = model(img)

# Print a summary of the detections
print(results[0])

# Display the annotated image
results[0].show()
Save the notebook as YOLOv8_Jupyter_Notebook.ipynb.
Add the notebook to your repository, commit, and push it:
git add YOLOv8_Jupyter_Notebook.ipynb
git commit -m "Add YOLOv8 Jupyter Notebook for issue #4"
git push origin main
To reduce the size of your Jupyter Notebook (.ipynb file), you can follow several strategies:
Clear the outputs of all cells in your notebook. This can significantly reduce the file size as images and large data outputs are removed.
In the menu, select Kernel -> Restart & Clear Output.
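Clearing outputs can also be scripted. An .ipynb file is plain JSON, so a minimal sketch using only the standard library (the function name clear_outputs is just an illustration) could look like:

```python
import json

def clear_outputs(path):
    """Remove stored outputs and execution counts from a notebook file."""
    with open(path, encoding='utf-8') as f:
        nb = json.load(f)
    for cell in nb.get('cells', []):
        if cell.get('cell_type') == 'code':
            cell['outputs'] = []           # drop stored outputs (images, tables, tracebacks)
            cell['execution_count'] = None
    with open(path, 'w', encoding='utf-8') as f:
        json.dump(nb, f, indent=1)
```

Call it as clear_outputs('your_notebook.ipynb'); the menu route above does the same thing interactively.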
If you want to keep a clean version of your code without outputs, you can convert your notebook to a Python script.
Using the Jupyter interface: File -> Download as -> Python (.py).
Using the command line:
jupyter nbconvert --to script your_notebook.ipynb
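Under the hood, a notebook is JSON and the script conversion essentially concatenates the code cells. A rough standard-library sketch of that idea (not a replacement for nbconvert, which also handles magics and metadata):

```python
import json

def notebook_to_script(nb_json):
    """Concatenate the source of all code cells into one Python script string."""
    nb = json.loads(nb_json)
    chunks = []
    for cell in nb.get('cells', []):
        if cell.get('cell_type') == 'code':
            src = cell.get('source', '')
            # 'source' may be stored as a list of lines or a single string
            if isinstance(src, list):
                src = ''.join(src)
            chunks.append(src)
    return '\n\n'.join(chunks)
```

Markdown cells are simply dropped, which is why the resulting .py file is usually much smaller than the notebook.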
There are tools designed to strip outputs and metadata from Jupyter Notebooks.
nbstripout:
Install nbstripout:
pip install nbstripout
Strip the notebook:
nbstripout your_notebook.ipynb
nbconvert:
Install nbconvert:
pip install nbconvert
Clean the notebook:
jupyter nbconvert --clear-output --inplace your_notebook.ipynb
If your notebook contains many images, consider compressing them before inserting into the notebook. Tools like Pillow can help with this:
from PIL import Image

# Open the original image
img = Image.open('path/to/image.jpg')

# Re-save as JPEG with reduced quality (lower values mean smaller files)
img.save('path/to/compressed_image.jpg', quality=85)
If your notebook generates large data, consider saving it to external files (e.g., CSV, JSON, HDF5) instead of displaying it directly in the notebook.
import pandas as pd
# Save DataFrame to CSV instead of displaying it
df.to_csv('data.csv')