🏀 Analyze basketball shots and shooting pose with machine learning!
This is an AI-powered application that uses object detection to analyze basketball shots. Users can upload basketball videos for analysis or submit POST requests to a REST API, and results include detailed shot and pose analysis derived from the detection data, with body keypoints and related metrics computed by OpenPose.
AI Basketball Analysis leverages artificial intelligence to break down basketball shots, detecting player movements, shot accuracy, and pose data with the popular OpenPose framework for human pose estimation. Whether you're a developer or a sports analyst, this project shows how AI can automate and enhance basketball analysis.
Important: Because this project depends on OpenPose, it is restricted to noncommercial research use under OpenPose's license. Please review the LICENSE for details.
If you're new to human pose estimation, check out this summary article that breaks down OpenPose's key concepts.
To get a copy of the project, run the following command:
```shell
git clone https://github.com/chonyy/AI-basketball-analysis.git
```
Before running the project, ensure all necessary dependencies are installed by running:
```shell
pip install -r requirements.txt
```
Note: This project requires a GPU with CUDA support to run OpenPose efficiently, especially for video analysis.
Once everything is set up, you can host the project locally with a simple command:
```shell
python app.py
```
This will launch the application locally, where you can upload basketball videos or images for analysis.
If you'd prefer not to run the project locally, you can try these alternatives:
Thanks to hardik0, you can experiment with AI Basketball Analysis in Google Colab without needing your own GPU:
This project is also available on Heroku, but note that heavy computation such as TensorFlow inference may exceed Heroku's request timeout given its limited resources. For best performance, run the app locally.
Here's a breakdown of the key components of the project:
Analyzes basketball shots in the input video, classifying each attempt as made or missed. Detected keypoints are drawn in different colors to indicate different shooting states.
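The project's actual detection pipeline is more involved, but the core idea of classifying a shot from object-detection output can be sketched as follows (the function name, the point/box formats, and the sampled-trajectory logic are illustrative assumptions, not the project's exact implementation):

```python
def classify_shot(ball_points, hoop_box):
    """Classify a shot as 'score' or 'miss' from detected ball positions.

    ball_points: list of (x, y) ball centers detected across frames.
    hoop_box: (x1, y1, x2, y2) bounding box of the detected hoop.
    A shot counts as a score if any sampled ball center falls inside
    the hoop box; a real pipeline would also check trajectory direction.
    """
    x1, y1, x2, y2 = hoop_box
    for x, y in ball_points:
        if x1 <= x <= x2 and y1 <= y <= y2:
            return "score"
    return "miss"

# Ball passes through the hoop region on its way down
print(classify_shot([(0, 0), (5, 5), (5, 9)], (4, 4, 6, 6)))  # → score
```

Because the ball is only observed at discrete frames, a production version would interpolate between detections so a fast-moving ball is not missed between samples.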
Using OpenPose, the project analyzes the player's elbow and knee angles during a shot, helping determine release angles and times.
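A joint angle like the elbow's can be computed from three OpenPose keypoints (e.g. shoulder, elbow, wrist) with basic vector math. The helper below is a minimal sketch, not the project's code; the keypoint triples you feed it depend on the OpenPose output format you use:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c.

    For an elbow angle, a = shoulder, b = elbow, c = wrist (x, y) keypoints.
    """
    ba = (a[0] - b[0], a[1] - b[1])
    bc = (c[0] - b[0], c[1] - b[1])
    dot = ba[0] * bc[0] + ba[1] * bc[1]
    norm = math.hypot(*ba) * math.hypot(*bc)
    # Clamp to [-1, 1] to guard against floating-point drift before acos
    cos_angle = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_angle))

# A right angle at the elbow:
print(joint_angle((0, 1), (0, 0), (1, 0)))  # → 90.0
```

Tracking this angle over the frames of a shot is what lets the app estimate the release angle and release time.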
This feature visualizes shot detection, showing confidence levels and coordinates for each detection.
The project includes a REST API for detection, allowing you to submit images via a POST request and receive a JSON response with detected keypoints and other data.
```
POST /detection_json
```
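With the app running locally, the endpoint can be exercised with a multipart POST. This is a sketch: the default Flask port (5000) and the form field name (`image`) are assumptions, so check `app.py` for the exact parameter names:

```shell
# Submit an image for detection and receive a JSON response with keypoints.
# Port 5000 and the "image" field name are assumptions; verify against app.py.
curl -X POST -F "image=@shot.jpg" http://localhost:5000/detection_json
```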
_The model is based on the Faster R-CNN architecture, trained on the COCO dataset. For more details, refer to the TensorFlow Model Zoo._
We welcome contributions from the community! Here's how you can get involved:
```shell
git checkout -b feature/your-feature-name
git commit -m 'Add some feature'
git push origin feature/your-feature-name
```
For more information on contributing, visit Make A Pull Request.