Closed 0707yiliu closed 4 years ago
Hi @tKsome
The filter needs to know the shape model of the ball that is provided in a file read at startup. To generate that file, you can use a python script following the instructions detailed in #18.
owww. lol.
I downloaded the MATLAB file for pf3dTracker today because of the tips in the pf3dBottom file, and you suggest using the Python script instead of MATLAB.
Can I still use the MATLAB file to produce the .csv file, or is there a version problem? Anyway, I will try both and post the results here. :)
I have another question...
When I get the .csv file from the MATLAB file or the Python script, should I just replace the trackObjectShapeTemplate in pf3dTracker.ini, or do other parameters need to be changed as well?
I used the Python script to generate the .csv file and set it as the trackObjectShapeTemplate in pf3dTracker.ini, but it doesn't work.
My ball's radius is 0.25mm.
my file:
```python
import argparse
import math

parser = argparse.ArgumentParser()
parser.add_argument("-r", "--radius", type=float, help="ball radius in millimeters (default: 25)", default=25)
parser.add_argument("-p", "--percentage", type=float, help="percentage in [0, 100] for inner and outer radii (default: 20)", default=20)
parser.add_argument("-n", "--points", type=int, help="number of points generated (default: 50)", default=50)
parser.add_argument("-f", "--file", type=str, help="name of the generated file (default: shape_model.csv)", default='shape_model.csv')
args = parser.parse_args()

print('using:')
print('radius = {} [mm]'.format(args.radius))
print('percentage = {} [%]'.format(args.percentage))
print('points = {}'.format(args.points))
print('file = "{}"'.format(args.file))

# inner and outer radii of the shape model
R_i = (1.0 - (args.percentage / 100.0)) * args.radius
R_o = (1.0 + (args.percentage / 100.0)) * args.radius

# sample one full circle in the y-z plane
x = []
y = []
z = []
t = 0.0
t_delta = (2.0 * math.pi) / args.points
for i in range(args.points):
    x.append(0.0)
    y.append(math.sin(t))
    z.append(math.cos(t))
    t += t_delta

fout = open(args.file, "w")
for i in range(args.points):
    fout.write('{0:.3f}\n'.format(x[i]))
for i in range(args.points):
    fout.write('{0:.3f}\n'.format(x[i]))
for i in range(args.points):
    fout.write('{0:.3f}\n'.format(R_i * y[i]))
for i in range(args.points):
    fout.write('{0:.3f}\n'.format(R_o * y[i]))
for i in range(args.points):
    fout.write('{0:.3f}\n'.format(R_i * z[i]))
for i in range(args.points):
    fout.write('{0:.3f}\n'.format(R_o * z[i]))
fout.close()
```
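As a quick sanity check on the script above, its math can be reproduced in memory to verify the expected output size: with the default 50 points, the file should contain 6 blocks of 50 values, i.e. 300 lines. The function name below is mine, not part of the original script.

```python
import math

def generate_shape_model(radius=25.0, percentage=20.0, points=50):
    """Recreate the script's output in memory (same math, no file I/O)."""
    r_i = (1.0 - percentage / 100.0) * radius
    r_o = (1.0 + percentage / 100.0) * radius
    t_delta = (2.0 * math.pi) / points
    x = [0.0] * points
    y = [math.sin(i * t_delta) for i in range(points)]
    z = [math.cos(i * t_delta) for i in range(points)]
    lines = []
    # same block order as the script: x, x, R_i*y, R_o*y, R_i*z, R_o*z
    for series in (x, x,
                   [r_i * v for v in y], [r_o * v for v in y],
                   [r_i * v for v in z], [r_o * v for v in z]):
        lines += ['{0:.3f}'.format(v) for v in series]
    return lines

lines = generate_shape_model()
print(len(lines))  # 300
```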
I put the ball at about hand depth, maybe 200mm? I don't know it exactly. The tracker output is:
1174 0.056 0.027 0.712 0.00071 1 180 131 0.121
The depth z is the 4th parameter, so it equals 0.712 m...
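For reference, the line quoted above can be split into whitespace-separated fields; reading x, y, z as the 2nd to 4th values is my interpretation of this thread, not an authoritative spec of the tracker's output format.

```python
# Tracker output line as quoted above; field layout assumed from the thread.
line = "1174 0.056 0.027 0.712 0.00071 1 180 131 0.121"
fields = line.split()
x, y, z = (float(v) for v in fields[1:4])
print(z)  # 0.712
```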
Does the camera need to be calibrated on my computer? I cannot find the calibration file.
Hi @tKsome
Please, use our official Python script rather than MATLAB.
Once you have obtained the CSV file representing the shape model, you need to put it in the correct location so that pf3dTracker
can find it at startup. The localization of the configuration files is based on the YARP ResourceFinder system. You can read about that at https://github.com/robotology/QA/issues/42.
Anyway, just do:
$ yarp resource --context pf3dTracker --from pf3dTracker.ini
to find out where the pf3dTracker.ini file
used at startup is located, and then put the CSV model in the models
subdir under that path.
Hope this helps out.
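To give an idea of what that `yarp resource` lookup does, here is a rough sketch in Python. The candidate paths below are common YARP defaults, but the real search order depends on `YARP_DATA_DIRS` and your installation, so treat this as an illustration rather than ResourceFinder's actual algorithm.

```python
import os

# Typical roots YARP searches for a context; assumed defaults, not exhaustive.
DEFAULT_ROOTS = [
    os.path.expanduser("~/.local/share/yarp"),
    "/usr/local/share/yarp",
    "/usr/share/yarp",
]

def find_context_dir(context, roots=None):
    """Return the first existing <root>/contexts/<context> directory, or None."""
    for root in (roots if roots is not None else DEFAULT_ROOTS):
        candidate = os.path.join(root, "contexts", context)
        if os.path.isdir(candidate):
            return candidate
    return None

print(find_context_dir("pf3dTracker"))
```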
I used the command
--from pf3dTracker.ini/in/my/path
to point to the correct path in the terminal. It worked on my computer.
The ball's depth is in the right direction, but it is still not the correct depth.
I suspect it is a camera calibration problem. I used the original cameraEyes.ini configuration from the Shenzhen01 documents.
Is the configuration file in Shenzhen01 correct?
I have found a way to calibrate the camera, https://github.com/robotology/QA/issues/342, but I don't know whether it is feasible yet.
I got the video from the left camera. I calibrated the camera with the approach in https://github.com/robotology/QA/issues/342, and compared with the simulator, the ball's position error is now smaller than before. I put the ball on iCub's right hand with the hand motors turned off; in the simulator, the ball's position is further to the right, beyond the palm of the hand.
Hi @tKsome
I'm not sure I got all the points you mentioned right. Anyway, I suspect you should play with the reaching offsets provided by the demo itself to adjust the relative position between the ball and the hands.
In the past, I prepared a kind of walkthrough for this that you can find at #17.
emmm...
That's right. We can get the 3D position from the tracker with small errors, and the hand position can be corrected via the offsets in the demoRedBall configuration file.
But this is not a long-term stable approach. We want to use an eye-hand calibration tracker to achieve accurate positioning. This approach would reduce manual intervention and bring iCub closer to real human behaviour.
Anyway, what is the most effective way of adjusting the offsets for actual use?
And I will look for and try a hand-eye calibration method~
Thanks, cheers, GrootLiu.
But this is not a long-term stable approach. We want to use an eye-hand calibration tracker to achieve accurate positioning. This approach would reduce manual intervention and bring iCub closer to real human behaviour.
You're definitely right @tKsome, but this red-ball demo is a very simple demonstrator that by no means aims to tackle the problem of eye-hand coordination in robotics. The literature in the field is quite large, and in the past we also contributed some works:
I will read these articles carefully.
Thanks cheers, GrootLiu
Sorry, I forgot to answer this:
Anyway, what is the most effective way of adjusting the offsets for actual use?
Well, the offsets are arm-dependent and can be tuned through these parameters. They represent the 3 offsets to be added to the nominal pose along the x, y, z axes with respect to the root frame attached to the robot's waist.
The ones that are applied first are relative to the reaching stage and aim to keep the hand on the correct side of the ball during tracking. Once the ball remains stationary for a few seconds (configurable), then the robot will attempt to grab it using the second set of offsets.
Typically, offsets are adjusted incrementally by visual inspection.
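As I understand the explanation above, the offsets act very simply: they are added per axis to the nominal target position expressed in the root frame. A minimal sketch, with made-up numbers rather than tuned values:

```python
def apply_offsets(target_xyz, offsets_xyz):
    """Add per-axis offsets [m] to a position in the robot's root frame."""
    return tuple(t + o for t, o in zip(target_xyz, offsets_xyz))

ball = (-0.35, 0.10, 0.05)          # hypothetical ball position [m]
reach_offset = (0.05, -0.02, 0.00)  # hypothetical per-arm offsets [m]
print(apply_offsets(ball, reach_offset))
```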
Hi, I have run demoRedBall successfully on my personal computer with a ball radius of 0.06. However, when I change my ball's radius, pf3dTracker and the robot don't work.
I changed demoRedBall's config.ini:

[grasp]
// ball radius [m] for still target detection
sphere_radius 0.05   (changed to my ball's radius)
Then I checked the code and found that demoRedBall's sphere_radius is just a radius used in the grasp-action computation.
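Going by the comment in config.ini ("ball radius [m] for still target detection"), my guess is that the ball counts as stationary when recent position estimates stay within sphere_radius of each other. This is only a sketch of that idea, not the demo's actual code:

```python
import math

def is_still(positions, sphere_radius):
    """True if every estimate lies within sphere_radius of the first one."""
    ref = positions[0]
    return all(math.dist(p, ref) <= sphere_radius for p in positions)

# hypothetical position estimates [m] in the root frame
track = [(-0.30, 0.10, 0.05), (-0.30, 0.11, 0.05), (-0.31, 0.10, 0.05)]
print(is_still(track, 0.05))  # True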
Finally, I think the red ball's radius should also be changeable in pf3dTracker. However, I cannot find any parameter for the ball's radius.
Is the ball's radius a fixed parameter that cannot be changed?