-
### Search before asking
- [X] I have searched the YOLOv8 [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/ultralytics/ultralytics/discussions) and f…
-
1. After lidar-camera calibration, I need to publish the processed coloured point cloud and the lidar points projected onto the image as ROS sensor messages in real time. I am unable to find where it is publishi…
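Wherever the package ends up publishing, the projection step itself is standard pinhole geometry. A minimal sketch, assuming you have the lidar-to-camera extrinsic `T_cam_lidar` and intrinsic matrix `K` from the calibration (all names here are illustrative, not from any particular package):

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project (N, 3) lidar points into pixel coordinates.

    points_lidar: (N, 3) points in the lidar frame.
    T_cam_lidar:  (4, 4) extrinsic transform, lidar frame -> camera frame.
    K:            (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates for the points in front of the camera.
    """
    n = points_lidar.shape[0]
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])  # homogeneous coords
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]          # move into camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]                # keep points with z > 0
    uv = (K @ pts_cam.T).T                              # pinhole projection
    return uv[:, :2] / uv[:, 2:3]                       # perspective divide
```

In a ROS node these pixels would typically be drawn onto the image and republished as `sensor_msgs/Image`, and the coloured cloud as `sensor_msgs/PointCloud2`.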
-
Wu & Shen abstract from https://arxiv.org/abs/2308.15586:
> The Dark Energy Spectroscopic Instrument (DESI) survey will provide optical spectra for ∼3 million quasars. Accurate redshifts for these …
-
## Keyword: detection
### Neural Motion Fields: Encoding Grasp Trajectories as Implicit Value Functions
- **Authors:** Yun-Chun Chen, Adithyavairavan Murali, Balakumar Sundaralingam, Wei Ya…
-
Hi everyone,
I have an instrumented car with sensors installed on the roof. The sensors have a different relative location and orientation compared to the handle the r3live authors designed around. I added a Ve…
-
Hi,
I am planning to use your software in a real application. I want to use the single-input model with the uNetXST configuration and a single front-facing camera. Is there any way to train the mo…
-
Ubuntu 18.04, OpenCV 3.2, ROS Melodic.
I was trying to use this plugin to perform hand-eye calibration for a Realsense D435 mounted on a UR5e arm, all in simulation in Gazebo 9.0. My calibration pa…
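Independent of the plugin, one way to sanity-check an eye-in-hand result is to verify that the calibration target's pose in the robot base frame comes out (nearly) identical for every robot pose. A hedged sketch, with all matrix names as illustrative assumptions:

```python
import numpy as np

def hand_eye_residual(T_base_gripper_list, T_cam_target_list, T_gripper_cam):
    """Consistency check for an eye-in-hand calibration result.

    For each robot pose i,
        T_base_target = T_base_gripper[i] @ T_gripper_cam @ T_cam_target[i]
    should be (nearly) the same matrix, since the target is fixed relative
    to the robot base. Returns the max elementwise deviation from the
    estimate obtained at the first pose.
    """
    estimates = [Tbg @ T_gripper_cam @ Tct
                 for Tbg, Tct in zip(T_base_gripper_list, T_cam_target_list)]
    ref = estimates[0]
    return max(np.abs(est - ref).max() for est in estimates)
```

A large residual usually points at inconsistent frame conventions (e.g. inverted transforms) rather than at the solver itself. Note also that `cv2.calibrateHandEye` only appeared in OpenCV 4.1, so it is not available in a stock OpenCV 3.2 install.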
-
Following the [Conversion script](https://github.com/YanjieZe/3D-Diffusion-Policy/blob/master/scripts/convert_real_robot_data.py), is it strictly necessary to transform the point cloud to the world frame …
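The transform itself is a single rigid-body map, so the question mostly reduces to whether the training and deployment frames match. A minimal sketch of applying a 4x4 homogeneous transform to a cloud (frame names are illustrative assumptions, not the repo's API):

```python
import numpy as np

def to_world(points_cam, T_world_cam):
    """Apply a (4, 4) homogeneous transform to an (N, 3) point cloud.

    points_cam:  (N, 3) points in the camera frame.
    T_world_cam: (4, 4) rigid transform, camera frame -> world frame.
    """
    n = points_cam.shape[0]
    pts_h = np.hstack([points_cam, np.ones((n, 1))])  # homogeneous coords
    return (T_world_cam @ pts_h.T).T[:, :3]
```

Since the map is rigid and invertible, a policy could in principle be trained in either frame, as long as the same frame is used consistently at training and inference time.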
-
There are XR use cases (e.g., "audio AR") that could build on poses and other capabilities exposed by core WebXR (and future extensions). The current spec language, though, appears to require visual d…
-
Hello, I imitated KITTIOdomDataset to read my own video frames, but the output images of **test_simple.py** are all black or yellow. What is the problem? I tried the following pretrained models and ge…
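One common cause of all-black outputs, regardless of the model used, is writing a raw floating-point disparity map (values often well below 1.0) straight into an 8-bit image. A hedged sketch of normalizing before saving, assuming the prediction is a numpy array:

```python
import numpy as np

def disparity_to_uint8(disp):
    """Normalize a float disparity map to [0, 255] for 8-bit image export.

    Saving raw float disparities without rescaling typically produces an
    almost entirely black image, since most values round down to 0.
    """
    d = disp.astype(np.float64)
    d_min, d_max = d.min(), d.max()
    if d_max - d_min < 1e-12:          # flat map: avoid division by zero
        return np.zeros(d.shape, dtype=np.uint8)
    return ((d - d_min) / (d_max - d_min) * 255.0).astype(np.uint8)
```

If the outputs are uniformly yellow under a colormap instead, that often indicates the same issue (a near-constant map), which in turn points at input preprocessing (resize/normalization) not matching what the pretrained model expects.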