Closed changkon closed 9 years ago
I was just trying to figure out if the lasers are actually gathering data at the moment
I don't think we have subscribed any of the entities to the base scan yet, but once we do, I believe we should get the information
I might as well start on that then
Actually I'll be going to dinner soon, but I saw where the laserscan callback method is. I was thinking of a way to use the information. The robots are both 1.5m wide. We could code them so that if they see something in front of their front-left corner, they turn right a certain amount. That way they could dodge something that would otherwise clip their left side. We would do the same thing for the right side. How does that sound for obstacle avoidance?
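A rough sketch of that corner-dodge idea, just to make it concrete. Nothing here is from our code: the sector indices, clearance threshold, and function name are all made up, and it assumes a 180-reading scan with index 0 on the robot's right.

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the corner-dodge idea: the robot is 1.5 m wide,
// so check a sector ahead of each front corner and steer away from the
// closer obstruction. Sector bounds and threshold are invented for the sketch.
std::string avoidDirection(const std::vector<float>& ranges, float clearance) {
    // Assumes ranges has 180 elements (1 per degree), index 0 on the right.
    float rightMin = 1e9f, leftMin = 1e9f;
    for (int i = 45; i < 90; ++i)    // sector ahead of the front-right corner
        if (ranges[i] < rightMin) rightMin = ranges[i];
    for (int i = 90; i < 135; ++i)   // sector ahead of the front-left corner
        if (ranges[i] < leftMin) leftMin = ranges[i];

    if (leftMin < clearance && leftMin <= rightMin) return "turn_right";
    if (rightMin < clearance) return "turn_left";
    return "straight";
}
```

As you say, this probably won't handle turning around a static object like the end of a row; it only dodges things that would clip a side.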
But then when they need to turn around a static object (e.g. the end of a row of kiwifruit) I don't know if the above idea would help/work.
ok so i will finish all the movement stuff?
i will quickly finish alpha movement i guess then help with laser detection?
Sounds good
great stuff
I have found something about data collecting from laser scan. I am gonna work on it tonight.
It would be great if you guys worked on a separate branch like Nikhil is doing, I think?
lol what separate branch?
Jokes, it was Lennon. It'd be great if you guys could follow Lennon.
As soon as the robots are able to receive the data from their laser scanners I will start trying to implement the path finding.
I spent too much time on the laserScan publisher last night, which should be part of the communication requirement.
So the robots are receiving data from their laser scanners now?
I have added the code to receive the distance of the nearest obstacle but haven't tested it yet
I am adding code to receive the angle of the nearest obstacle.
That's how the laser works though. If you set the laser's field of view to 180 degrees at 1-degree resolution, the data returned by the laser should be an array with 180 elements, each corresponding to 1 degree. For example, if element 78 contains 4 (the array is 0-indexed, so element 78 corresponds to 78 degrees), it means that at 78 degrees there is an object 4m away from the laser.
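As a toy illustration of that indexing (not our actual code; the function name and the 10 m cutoff are made up), scanning the array and reporting anything closer than max range looks like this:

```cpp
#include <utility>
#include <vector>

// Toy example of reading a 180-element scan, one reading per degree
// (index 0 = 0 degrees): collect every reading closer than maxRange.
// Returns {degree, distance in metres} pairs.
std::vector<std::pair<int, float>> findObstacles(const std::vector<float>& ranges,
                                                 float maxRange) {
    std::vector<std::pair<int, float>> hits;
    for (int i = 0; i < (int)ranges.size(); ++i)
        if (ranges[i] < maxRange)
            hits.push_back({i, ranges[i]});
    return hits;
}
```

So a scan that is 10 m everywhere except element 78 set to 4 would report one obstacle: 4 m away at 78 degrees.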
Yeah, that's the way I did it. I'll push Entity.cpp to nightly soon
Ok great, as soon as you do that I'll start on obstacle avoidance.
I set the default obstacle angle to a value larger than 180, so if it stays larger than 180 it means there are no obstacles nearby.
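A minimal sketch of that sentinel convention, with made-up names (our actual member names may differ): the angle starts out above 180, and only a real detection writes a value in the valid 0–180 range.

```cpp
// Sketch of the sentinel convention: any angle above 180 means "no obstacle".
// NO_OBSTACLE_ANGLE and ObstacleState are hypothetical names for illustration.
const float NO_OBSTACLE_ANGLE = 999.0f;

struct ObstacleState {
    float obstacleAngle = NO_OBSTACLE_ANGLE;  // default: nothing detected yet
    bool seesObstacle() const { return obstacleAngle <= 180.0f; }
};
```

One thing to keep in mind with this pattern: every new scan should reset the angle back to the sentinel, otherwise a stale detection lingers forever.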
Some git command issues have come up for me again. I will push it immediately after lunch, sorry about that.
that's cool, Guyver has more or less implemented the movement that we need for Alpha so my stuff is for the final version. But if I can get it done for Alpha it'll be good.
```
parallels@ubuntu:~/indigo_workspace/rosws/src/se306project$ git push origin/nightly
warning: push.default is unset; its implicit value is changing in
Git 2.0 from 'matching' to 'simple'. To squelch this message
and maintain the current behavior after the default changes, use:

  git config --global push.default matching

To squelch this message and adopt the new behavior now, use:

  git config --global push.default simple

When push.default is set to 'matching', git will push local branches
to the remote branches that already exist with the same name.

In Git 2.0, Git will default to the more conservative 'simple'
behavior, which only pushes the current branch to the corresponding
remote branch that 'git pull' uses to update the current branch.

See 'git help config' and search for 'push.default' for further information.
(the 'simple' mode was introduced in Git 1.7.11. Use the similar mode
'current' instead of 'simple' if you sometimes use older versions of Git)

fatal: 'origin/nightly' does not appear to be a git repository
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
```
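The fatal error happens because `origin/nightly` is being passed as if it were a remote name; `git push` wants the remote and the branch as two separate arguments. A throwaway demo with a local bare repository (all the paths here are made up, only the remote/branch names match ours):

```shell
# Demo in throwaway directories; stand-in for pushing our nightly branch.
set -e
rm -rf /tmp/push-demo && mkdir -p /tmp/push-demo && cd /tmp/push-demo

git init --bare remote.git              # stand-in for the GitHub repo
git init work && cd work
git config user.email "dev@example.com" && git config user.name "dev"
git remote add origin ../remote.git

echo "laser scan code" > Entity.cpp
git add Entity.cpp && git commit -m "add laser scan code"
git checkout -b nightly

# Wrong: git push origin/nightly
#   -> fatal: 'origin/nightly' does not appear to be a git repository
# Right: remote and branch as separate arguments
git push origin nightly
```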
I don't know. When I do all the steps again I get an error like this
I am going to UG4 for help
Should be done. Just let me know if there is any problem.
Ok, there are some problems. Let me figure it out first, should be quick
I'm not sure what minDistance represents? Also, the two new variables minDistance and obstacleAngle should be declared in Entity.h
Yeah, we fixed it. The problem with minDistance was that we set our sensor in the middle of the robot instead of the front. We are working on this. For now, probably don't edit minDistance and obstacleAngle; if it becomes necessary, I'll edit the code.
Also, we're moving the position of the sensor on the robot. At the moment, the sensor is placed on top of the robot, so it will not detect any obstacles apart from the trunks. We are thinking of placing the sensor at the front of the robot at the bottom.
Yeah that's cool.
We have moved the laser sensor to the front of the robot and changed its height to make detection work
Ok brilliant. Another question though, can you explain what this code is doing? It's from Entity.cpp:

```cpp
int l = sizeof(msg.ranges) / sizeof(msg.ranges[0]);
for (int i = 0; i < l; i++) {
    if (msg.ranges[i] < minDistance) {
        minDistance = msg.ranges[i];
        obstacleAngle = (i / l) * msg.angle_increment + msg.angle_min;
    }
}
```
I can see you're looping through the array returned by the laser. I guess I'm not sure what "l" is? Why do you need to do that to calculate l? Also what is happening in that if block?
Yeah, sure. Because in C++ there's no method to get the length there directly, what I did is calculate it by dividing the size of the whole vector by the size of a single element.
If the robot is blocked, minDistance should be very small, but not one fixed value. This is because we have only one sensor at the front bottom of the robot, so the value will depend on whether the robot is blocked by an obstacle to the front-right or straight ahead.
The ranges vector starts from angle_min on the right-hand side and sweeps anticlockwise to the left
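Given that ordering (index 0 at angle_min on the robot's right, sweeping anticlockwise), a quick way to tell which side of the robot a reading is on. This is just a sketch with a made-up name, assuming an odd-or-even count where the middle index points roughly straight ahead:

```cpp
#include <string>

// Index 0 is at angle_min (robot's right), sweeping anticlockwise to the left,
// so low indices are to the right of centre and high indices to the left.
// sideOfIndex is a hypothetical helper, not from Entity.cpp.
std::string sideOfIndex(int i, int numReadings) {
    int mid = numReadings / 2;
    if (i < mid) return "right";
    if (i > mid) return "left";
    return "front";  // the middle reading points roughly straight ahead
}
```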
http://answers.ros.org/question/198843/need-explanation-on-sensor_msgslaserscanmsg/ This website is really useful
Oh okay, so when that loop finishes, minDistance will hold the distance to the CLOSEST object and obstacleAngle will be the angle that object is at from the centre?
Yes I believe that is how it works. Worked with Lennon a bit at ug4
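For what it's worth, the loop quoted earlier has two things worth double-checking: `sizeof()` on a `std::vector` gives the size of the vector object, not the element count (use `.size()`), and `(i / l)` is integer division, which is 0 for every `i < l`, so obstacleAngle would always come out as angle_min. A sketch of the fixed loop, using a minimal stand-in struct instead of the real `sensor_msgs::LaserScan` so it's self-contained:

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Minimal stand-in for sensor_msgs::LaserScan (only the fields used here).
struct LaserScan {
    float angle_min = 0.0f;        // angle of the first ray, radians
    float angle_increment = 0.0f;  // angular step between rays, radians
    std::vector<float> ranges;
};

// Find the nearest reading and its bearing. Fixes from the quoted loop:
//  - use ranges.size() rather than the sizeof() trick (that only works
//    for plain C arrays, not std::vector)
//  - the angle of ray i is angle_min + i * angle_increment; (i / l) was
//    integer division and always evaluated to 0
void nearestObstacle(const LaserScan& msg, float& minDistance, float& obstacleAngle) {
    minDistance = std::numeric_limits<float>::infinity();
    obstacleAngle = 0.0f;
    for (std::size_t i = 0; i < msg.ranges.size(); ++i) {
        if (msg.ranges[i] < minDistance) {
            minDistance = msg.ranges[i];
            obstacleAngle = msg.angle_min + i * msg.angle_increment;
        }
    }
}
```

For example, with angle_min = -1.5, angle_increment = 0.5 and ranges {5, 3, 4}, this picks the 3 m reading at index 1, so the angle comes out as -1.0.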
ok awesome, great stuff lads.
Actually, Chang Kon worked with me on all the stuff together this afternoon.
I'm a bit unsure of how to use the information to create a path plan. I'm planning on asking the TA tomorrow during the lab about the obstacle detection
I drew a sequence diagram if it will help anyone lol. I'll make a proper one when the obstacle stuff is sorted
nice
Discussion about how we should integrate movement and the laser sensor information.