This repository contains Python code to connect to the GelSight Mini. The code enables basic functionality, such as viewing and saving data (images or video) from the device, and includes example code for deriving 3D point cloud data from 2D images.
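The 3D reconstruction step turns a 2D depth map, estimated from the sensor image, into a point cloud. A minimal sketch of that conversion is below; the function name and the pixel-to-millimeter scale are illustrative, not the repository's API:

```python
import numpy as np

def depth_to_pointcloud(depth, mm_per_pixel=0.065):
    """Convert an HxW depth map (mm) into an Nx3 array of XYZ points.

    mm_per_pixel is an illustrative scale factor; the real value
    depends on the sensor's calibration.
    """
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.column_stack([
        xs.ravel() * mm_per_pixel,   # X in mm
        ys.ravel() * mm_per_pixel,   # Y in mm
        depth.ravel(),               # Z is the estimated depth
    ])

# Example: a flat 4x5 depth map yields 20 points, all at Z = 1.0
cloud = depth_to_pointcloud(np.ones((4, 5)))
```

The show3d examples in this repository do the real version of this, using calibrated depth estimated from the 2D sensor images.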
On Windows, we recommend running the examples within the PyCharm development environment. Here are instructions on how to set up PyCharm.
Download and install Git for Windows
Download and install TortoiseGit. For most users, the correct version is 64-bit Windows. Run the First Start wizard after installation and choose the default options.
Download and install Python 3.10. More recent versions of Python might require additional steps to install the packages used by this codebase. For most users, the correct version is the Windows installer (64-bit).
Go to the PyCharm website and download the PyCharm Community installer.
Clone this repository
Git Clone using TortoiseGit (on Windows 11, you need to click Show More Options)
Clone the repository using TortoiseGit
Create the virtual environment in PyCharm
Run showimages.py in PyCharm
https://github.com/gelsightinc/gsrobotics/assets/44114954/85b3e123-730d-4dfa-a05a-983dfc1e5a78
Python 3.8 or above
pip3 install .
or
pip3 install . --upgrade
Note that this step does not yet install the ROS or ROS2 libraries required by the Python scripts in the examples/ros folder of this repository. Please follow the installation guides for those separately.
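After installing, a quick way to confirm the package is visible to your interpreter (assuming the importable module is named gelsight, matching the pip3 show command used below):

```python
import importlib.util

def is_installed(module_name):
    """Return True if the named module can be found on sys.path."""
    return importlib.util.find_spec(module_name) is not None

# After `pip3 install .`, this should report True for "gelsight":
print(is_installed("gelsight"))
```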
PYDIR=`pip3 show gelsight | grep -i location | cut -f2 -d" "`
export PYTHONPATH=$PYDIR/gelsight:$PYTHONPATH
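The export above makes the installed package importable from a shell session. The same effect can be achieved inside a script by editing sys.path at runtime (useful where export is unavailable); the fallback path below is illustrative:

```python
import os
import sys

# Illustrative: prepend the gelsight install location to the module
# search path, mirroring what the PYTHONPATH export does.
pydir = os.environ.get("PYDIR", "/usr/lib/python3/dist-packages")
sys.path.insert(0, os.path.join(pydir, "gelsight"))
```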
There are multiple marker tracking demos. Each uses a different marker tracking algorithm. You can find all of them in the demo directory.
marker_tracking: contains demos using a mean shift algorithm and an optical flow algorithm
cd demos/marker_tracking
python3 mean_shift_marker_tracking.py
or
python3 optical_flow_marker_tracking.py
mini_tracking_linux_V0: contains a demo using compiled depth-first-search code that runs on Linux. To run:
cd demos/mini_tracking_linux_V0
python3 tracking.py
mini_tracking_windows_V0: contains a demo using compiled depth-first-search code that runs on Windows. To run:
cd demos/mini_tracking_windows_V0
python3.exe tracking.py
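The demos above use mean shift, optical flow, or compiled depth-first search to follow the markers printed on the gel. The core idea, locating dark marker blobs and comparing their positions between frames, can be sketched with plain NumPy (the threshold, image sizes, and marker positions here are illustrative, not taken from the demos):

```python
import numpy as np

def marker_centroid(gray, threshold=50):
    """Return the (row, col) centroid of pixels darker than threshold."""
    ys, xs = np.nonzero(gray < threshold)
    return ys.mean(), xs.mean()

# Two synthetic 40x40 frames: a dark 4x4 marker shifts 3 px to the right.
frame0 = np.full((40, 40), 200, dtype=np.uint8)
frame1 = frame0.copy()
frame0[10:14, 10:14] = 0
frame1[10:14, 13:17] = 0

y0, x0 = marker_centroid(frame0)
y1, x1 = marker_centroid(frame1)
displacement = (y1 - y0, x1 - x0)  # no vertical motion, +3 px horizontal
```

The real demos track many markers at once and resolve correspondence between them, which is where the mean shift, optical flow, and depth-first-search algorithms come in.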
Install ROS or ROS2; see the ROS documentation for instructions. The example code uses cv_bridge, which can be installed using:
sudo apt-get install ros-${ROS_DISTRO}-cv-bridge
For example, on Ubuntu 20.04:
To install cv-bridge for ROS
sudo apt-get install ros-noetic-cv-bridge
To install cv-bridge for ROS2
sudo apt-get install ros-foxy-cv-bridge
The showimages examples publish to the ROS topic /gsmini_rawimg_0
The show3d examples publish to the ROS topic /pcd
They can be viewed in rviz or rviz2
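cv_bridge's job in these examples is to convert OpenCV images (NumPy arrays) into ROS Image messages for publishing. A rough sketch of the fields it fills in, using a plain dict in place of the real message type (field names mirror sensor_msgs/Image; real code should use cv_bridge.CvBridge().cv2_to_imgmsg()):

```python
import numpy as np

def cv2_to_imgmsg_sketch(img, encoding="bgr8"):
    """Pack a NumPy image into a dict mimicking a sensor_msgs/Image.

    Illustration only: the real conversion is done by cv_bridge.
    """
    height, width = img.shape[:2]
    return {
        "height": height,
        "width": width,
        "encoding": encoding,
        "step": width * img.shape[2],  # bytes per row for 8-bit channels
        "data": img.tobytes(),
    }

# A black 320x240 BGR frame, like one grabbed from the Mini's camera:
msg = cv2_to_imgmsg_sketch(np.zeros((240, 320, 3), dtype=np.uint8))
```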
source /opt/ros/noetic/setup.bash
cd examples/ros
roscore
python3 showimages_ros.py
python3 show3d_ros.py
rviz -d mini_ros_3d_config.rviz
source /opt/ros/foxy/setup.bash
cd examples/ros
python3 showimages_ros2.py
python3 show3d_ros2.py
rviz2 -d mini_ros2_3d_config.rviz
The camera on the GelSight Mini is a USB camera.
If you need to adjust the camera settings for your application, you can change the camera parameters using any app or library that can control UVC cameras.
A popular tool is v4l2-ctl, part of the v4l-utils package. To install it on Ubuntu, run:
sudo apt-get update
sudo apt-get -y install v4l-utils
Refer to the file config/mini_set_cam_params.sh in this repository to view and edit the available camera parameters. You can list the devices by running:
v4l2-ctl --list-devices
When a single Mini is connected to your computer, its device ID is usually 2, because the built-in webcam typically occupies device ID 0.
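v4l2-ctl --list-devices groups device nodes under each camera's name, so the Mini's node can be picked out automatically rather than assuming ID 2. A small parser sketch (the sample listing below is illustrative, not captured from a real device):

```python
def find_device(listing, name="GelSight Mini"):
    """Return the first /dev/video node listed under the given camera name."""
    current = None
    for line in listing.splitlines():
        if not line.startswith(("\t", " ")):
            current = line          # a camera-name header line
        elif name in (current or "") and line.strip().startswith("/dev/video"):
            return line.strip()
    return None

# Illustrative output in the format v4l2-ctl uses:
sample = """\
Integrated Camera (usb-0000:00:14.0-1):
\t/dev/video0
\t/dev/video1

GelSight Mini (usb-0000:00:14.0-2):
\t/dev/video2
\t/dev/video3
"""
device = find_device(sample)  # "/dev/video2"
```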
On Windows, you can use the AMCap app to configure the camera settings, see https://docs.arducam.com/UVC-Camera/Quick-Start-for-Different-Systems/Windows/
pip uninstall opencv-python-headless
sudo apt-get install libopenjp2-7
sudo apt-get install qt5-default
pip3 install opencv-python==4.1.2.30
sudo apt-get install libopenexr-dev
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
This package is under active development. Contact support@gelsight.com if you have any questions, comments, or suggestions.