Open SRHarrison opened 7 years ago
You likely need a normalized statistic. Can you use a pdf to find a threshold?
On Sep 26, 2017, at 9:45 AM, Shawn Harrison notifications@github.com wrote:
Hi All,
I have a video that was captured during full sun on a light colored sandy beach with white aerial targets. I'm finding that the intensity threshold value for individual reference points is struggling to discern the targets from the background beach over time. I think the light must be changing through time during the video, and the threshold value that I set to find the target in the first frame does not work at all times during the video. However, the intensity of the target and background sand is so close that I can't find a single value that is able to track the target through the entire video.
Is it possible to feed an expression into the threshold value, e.g. to find the 99%ile intensity in the search window, or similar so that the actual threshold value is changing with each frame as the light changes?
Many thanks!
If you do a cumulative pdf you can easily define a threshold at, say, the 80% level or whatever. But you are really looking for a break between two domains, something that should be obvious in the cdf.
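For example, a minimal sketch of a per-frame threshold from the cumulative distribution (an editorial example, not toolbox code; the 80% level is arbitrary), assuming I2 is the grayscale search window used later in this thread:

v = sort(double(I2(:)));                        % intensities in the search window
cdf = (1:numel(v))' / numel(v);                 % empirical cumulative distribution
thresh(i) = v(find(cdf >= 0.80, 1, 'first'));   % intensity at the 80% level
% the "break between two domains" is where the cdf flattens out between
% the background mode and the brighter target mode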
Hi Shawn, Eleonora has had good luck using the function imregister.m from the image processing toolbox to remove camera motion from images with different lighting. Maybe that would help?
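A minimal sketch of a typical call (an example only, not Eleonora's code), assuming g1 and gN are two grayscale frames from the same hovering sequence:

[optimizer, metric] = imregconfig('multimodal');          % config that tolerates lighting changes
gNreg = imregister(gN, g1, 'rigid', optimizer, metric);   % register frame N back onto frame 1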
Thanks for the helpful suggestions. Just to update this thread:
I made a quick attempt to implement a new threshold statistic in situations when the href objects' intensity value drops below the set threshold for identification.
In my video, the white objects are very small - only a few pixels (2-7). I edited the file 'findCOMRefObj.m' in the bottom for-loop to do a quick check for any 'good' pixels (above threshold) and, if there are none, to try a statistical approach (percentiles or similar) instead. Since my href objects are so small, I found the best results using the max of the search region, e.g.
good = find(I2 > thresh(i));    % pixels in the search window above the current threshold
if (length(good) < minNGood)    % too few hits: fall back to a per-frame statistic
    thresh(i) = max(I2(:));     % here, the brightest pixel in the search window
end
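A percentile version of the same fallback (a sketch only; 99.5 is an arbitrary cut-off) would replace the max with, e.g.:

    v = sort(double(I2(:)));
    thresh(i) = v(ceil(0.995*numel(v)));   % ~99.5th percentile of the search window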
When I get a chance, I'll look more closely at Eleonora's 'imregister.m' solution.
The specific routine is just meant to be an example. In some cases it would work better as a minimum threshold to find a dark object; they are often more visible. I also think there are normalized versions that should work, for example looking for kinks in the cumulative pdf or using some of the Matlab image processing segmentation tools. There should be plenty of options.
Does the max fallback actually work? It feels like if you set the thresh to the max, you will never have any pixels that exceed that value.
YHS
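One of the Image Processing Toolbox segmentation options Rob mentions is Otsu's method, which picks the split between the two intensity domains automatically. A minimal sketch on the search window (an editorial example, assuming uint8 frames):

level = graythresh(I2);        % Otsu threshold, normalized to [0,1]
thresh(i) = 255 * level;       % back to 0-255 intensity units for uint8 images
good = find(I2 > thresh(i));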
Automating a nice piece of code to include the many different options would be a welcome contribution to the repository if anyone ever feels driven to do so. In the meantime, I'll try to implement a few of your suggestions on my example video.
The max did work much better for my very small targets. BUT, the max intensity does shift between pixels in my href object during the video (due to changes in light reflection and/or any auto sharpening effects the camera might be doing), so the max still is not the best option for me.
However, I will mention that I found better href objects in my FOV (other than the GCP targets that I placed on the beach). And because I did an SfM survey and created a 3D DEM of the study area, I was able to pick out each href object in the DEM to determine the elevation to associate with it in the rectification. I suggest doing an SfM survey along with a UAV Argus mission whenever possible.
Are you mostly using a fixed view or are you moving through the scene? If not moving, DEM elevation doesn’t seem that important. If moving, SfM would certainly help.
While picking up reference points I'm getting an error:
'Subscripted assignment dimension mismatch.
Error in initUAVAnalysis1 (line 99) [betas(1,:),meta.refPoints] = initUAVAnalysis(I, gcp, inputs, meta);'
Could you suggest a solution for it, please?
Do you know how to use debug in Matlab? Put a breakpoint there and try executing the line manually, or just execute the right-hand side of the line.
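For reference, the usual way to do that (assuming the failure really is at line 99 of initUAVAnalysis1.m):

dbstop in initUAVAnalysis1 at 99      % pause just before the failing line
% at the breakpoint, run the right-hand side by itself, e.g.
%   [b, rp] = initUAVAnalysis(I, gcp, inputs, meta);
% then compare size(b) with size(betas(1,:)) to see which dimension disagrees
dbstop if error                       % or simply stop wherever the error is thrown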
Sir, thank you for considering and replying to my doubts. I have done the steps, but now I'm getting the message 'Subscripted assignment dimension mismatch.' at the command window. What should I do in order to avoid that?
Regards, Shimon Francis
You'll have to do some debugging yourself here. Put a breakpoint at the location where it fails, then examine the variables. My suspicion is that you have entered something incorrectly. It should be obvious.
YHS
Sir, I have done the steps and got the results. Please help me to select three pixel instruments in order to apply the DIWASP tool to obtain the directional spectrum of waves. How would we get pixel data from the rectified imagery?
Regards, Shimon Francis
I think there are examples in the UAV toolbox.
Use the script makePixelInstsDemo.m in the UAV Toolbox as an example of how to sample pixel data. You will also need to use the PIXel Toolbox from the repository.
Sir, now I'm trying to rectify my data and it's showing an error: 'Error using nlinfit (line 239) No usable observations after removing NaNs in Y and in the result of evaluating MODELFUN at the initial value BETA0.' What should I do in order to rectify my image?
Put a debug breakpoint in the code at the line where you call the nlinfit. Make sure that you are passing valid input data (look at the variables). If that looks right, try directly calling your model function (whatever it is called) to see if it works with those inputs. If unsure, put a breakpoint in your model function (whatever it is called) and see why it is failing.
Good luck…Rob
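A concrete way to do that check (a sketch based on the call that appears later in this thread at initUAVAnalysis line 42):

dbstop in initUAVAnalysis at 42     % pause just before the nlinfit call
% at the breakpoint, evaluate the model function at the starting guess the way
% nlinfit would, and look for NaNs in both the model output and the observations
% (findUVnDOF may rely on globals set up earlier, so run this at the breakpoint,
%  not from a fresh workspace):
UVtest = findUVnDOF(in.beta0, xyz);
any(isnan(UVtest(:)))
any(isnan([UV(:,1); UV(:,2)]))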
Sir, will I get the ll2Argus Matlab file?
Regards, Shimon Francis
That is a routine that is specific to your site. So you have to create this file yourself.
The idea is that you may have surveyed in lat-long but you need to analyze in local Argus coordinates. If you have already converted everything to local coordinates, you can dispense with it.
YHS
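The exact contents are site-specific, but a minimal sketch of the idea (all names, numbers and the projection helper below are hypothetical): project lat/long to a metric grid, then translate and rotate into the local cross-shore/along-shore frame.

function [x, y] = ll2Argus(lat, lon)
% hypothetical site-specific conversion: lat/long -> local Argus coordinates
E0 = 123456.0;  N0 = 654321.0;       % easting/northing of the local origin (from your survey)
theta = 20*pi/180;                   % rotation of the local x-axis from east, radians
[E, N] = myLatLonToUTM(lat, lon);    % placeholder for whatever projection you use
dx = E - E0;  dy = N - N0;
x =  cos(theta)*dx + sin(theta)*dy;  % cross-shore
y = -sin(theta)*dx + cos(theta)*dy;  % along-shore
end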
Sir, my error still can't be solved. I need to rectify a single image made from averaged frames. Will that be a problem, sir?
Regards
Hi @shimonfrancis, are you trying to process a video stream from a UAV or from a fixed video site, i.e. is there movement of the camera position between subsequent frames?
I suggest starting with the m-file 'sampleAerielleVideoDemoPIX.m' to get an idea of the workflow. It also shows how to use the PIXel-Toolbox to collect timestacks at pixel instruments.
Also, if you haven't already, consult the UAV-Processing-Toolbox WIKI, a useful resource that explains the concepts and how to use the toolbox. The manual is also helpful. Example data is available in the repository and can give you an idea of what the various data structures and files look like, e.g. pixel list definitions, GCP definitions, etc.
I hope these resources help. If they don't, perhaps you can post a few example images of what is happening on your side.
Kind regards, Shawn
@shimonfrancis if you use the GUI version (at the moment, the branch named kvosGUI), you will not need to create ll2Argus.m, as the GUI allows you to create your cross/along-shore coordinate system by providing the origin and direction of the axis in any local coordinate system (defined by an EPSG code). In addition to the references pointed out by Shawn, you can also look at the Appendix of my Master Thesis, where there is a step-by-step description of how to use the GUI version of the toolbox. Two examples at Narrabeen-Collaroy, Australia are provided.
All the best, Kilian
Sir, I am done with the example data in the repository. Now I'm working with my own data, an averaged image, but I'm getting an error:
'Error using nlinfit (line 239) No usable observations after removing NaNs in Y and in the result of evaluating MODELFUN at the initial value BETA0.
Error in initUAVAnalysis (line 42) beta = nlinfit(xyz,[UV(:,1); UV(:,2)],'findUVnDOF',in.beta0);
Error in rectification (line 98) [betas(1,:),meta.refPoints] = initUAVAnalysis(I, gcp, inputs, meta);'
What should I do in order to rectify the image?
Sir, how could I get the GUI? Could you send me the link to download it?
Regards, Shimon Francis
Hi @shimonfrancis, you can download the GUI here: kvosGUI. All the GUI-specific functions are located in the folder demoGUI. To get started, go into this folder and run the script GUIforUAVToolbox.m. You can use the two example inputsFiles to set it up for your specific location.
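To get it running, something along these lines should work (assuming you have downloaded the kvosGUI branch and are in its top-level folder):

addpath(genpath(pwd));    % make the toolbox and demoGUI functions visible
cd demoGUI
GUIforUAVToolbox          % start the GUI, then point it at your inputsFile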
Sir, are there any specifications needed for it, for example the MATLAB version, etc.?
Matlab 2016b or higher
2Hz video frames in .jpg format
Coordinates of at least 4 ground control points visible on the frames
Values for the intrinsic parameters and distortion coefficients of the camera: focal length (fu, fv), principal point offset (u0, v0), 2 radial distortion coefficients, 2 tangential distortion coefficients
A geotagged snapshot taken right before starting to record the video [optional]
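For the 2 Hz frame requirement above, one way to produce the .jpg frames in Matlab (a sketch; the file names are examples only):

vr = VideoReader('flight.MP4');
dt = 0.5;                                    % 2 Hz sampling interval
k = 0;
for t = 0 : dt : vr.Duration - dt
    vr.CurrentTime = t;
    k = k + 1;
    imwrite(readFrame(vr), sprintf('frame%05d.jpg', k));
end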
Sir, I'm using the UAV toolbox right now and I'm getting an error:
'Error using nlinfit (line 239) No usable observations after removing NaNs in Y and in the result of evaluating MODELFUN at the initial value BETA0.
Error in initUAVAnalysis (line 42) beta = nlinfit(xyz,[UV(:,1); UV(:,2)],'findUVnDOF',in.beta0);
Error in sampleAerielleVideoDemoPIX (line 93) [betas(1,:),meta.refPoints] = initUAVAnalysis(I, gcp, inputs, meta);'
Could you help me to solve this error?
Regards, Shimon Francis
A while back I mentioned that you need to use debug in Matlab to put a breakpoint in just before the nlinfit call and check that you are passing in proper arguments. If unsure, you can call findUVnDOF with the same arguments to ensure this works (which it apparently won't). If you don't know how to use debug in Matlab, find someone local who can help.
YHS…Rob
Sir, how could we add more than one camera in the rectification, such as in an Argus system, through the UAV Processing Toolbox?
There is a function called makeMerge that does this (along with some support routines) but we haven’t yet put it on GitHub. This is a topic we will likely broach in future bootcamps.
YHS
Sir, so we cannot combine data from more than one camera using the UAV toolbox, right sir?
In fact, the extension from one to multiple cameras is not difficult. But we haven’t made a version that works with independent information (i.e. not using the Argus database). It can become a higher priority if there is a need.
Sir, in Argus systems you take the height of the mounted camera as 40 meters. If I change the height of the camera to 10 meters or so, how could I change it in the UAV Processing Toolbox?
With regards, Shimon Francis
I'm not sure what you are looking at, but the position of the UAV camera should be input if you know it: so not just the height, but the x, y, z. If you are solving using the beta vector, these are the first three elements. They can change with every frame (because it is a UAV).
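For example, a 6-element beta for a camera at 10 m might look like this (the numbers are placeholders, and the ordering of the angle elements follows the findUVnDOF convention as I understand it):

beta0 = [ 50  -100  10  pi/2  70*pi/180  0 ];
%        xCam yCam zCam azim  tilt       roll   (angles in radians)
% the first three elements are the camera position in local coordinates;
% here the camera is at z = 10 m instead of the 40 m mentioned above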
Sir, when we give the z value in the inputs file, a value of z below about 40 gives us an error message, and at 40 or above it gives us results. And sir, we are using a camera at a fixed position.
Can you tell me the actual line where you are entering this? Cut and paste a segment of the file.
Sir, the line is 'inputs.zCam = 10;'. If I enter the value as 10, then after plotting the GCPs it gives an error; when I change the value to 40, the rectification runs smoothly.
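As a quick sanity check on that value, it can help to compare zCam with the GCP elevations, since zCam is presumably the camera height in the same vertical datum as the GCP z coordinates and seeds the initial geometry guess. The snippet below is a sketch under that assumption; the gcp structure and its z field are assumptions based on how the demo scripts pass GCP coordinates and may differ in your files.
% Sketch: compare the camera elevation guess with the GCP elevations.
% 'gcp' with a z field is an assumption; adjust to however your GCP
% coordinates are stored.
gcpZ = [gcp.z];
fprintf('inputs.zCam = %.1f m, GCP z range = [%.1f, %.1f] m\n', ...
    inputs.zCam, min(gcpZ), max(gcpZ));
% If the camera really sits roughly 40 m above GCPs near z = 0, that would
% explain why zCam = 40 converges while zCam = 10 starts the fit in an
% implausible geometry and the nlinfit call in initUAVAnalysis fails.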
Sorry, but I’m a bit lost. Can you attach the whole inputs m-file, or give more detail on what you are doing?
Sir, sorry for troubling you. I'm attaching the inputs file; please find it attached. Regards, Shimon Francis
I don’t see an attachment. Why don’t you send it directly to my email at holman@coas.oregonstate.edu?