Closed yousefayyash92 closed 2 years ago
@yousefayyash92 That's a great idea. Unfortunately, we are using one of Apple's APIs, and the tolerance has to be specified ahead of time; when objects are off by more than the tolerance, they simply aren't included in your Vision request results, so there is nothing to relay back to the user.
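To illustrate the behavior being described: if the tolerance in question is Vision's `quadratureTolerance` on `VNDetectRectanglesRequest` (an assumption based on the "degrees off 90" wording in this thread, not something stated explicitly above), the filtering happens inside the framework, before any results reach your completion handler. A minimal sketch:

```swift
import Vision

// Sketch only — assumes a CGImage named `cgImage` is available.
let request = VNDetectRectanglesRequest { request, error in
    guard let results = request.results as? [VNRectangleObservation] else { return }
    for observation in results {
        // Only rectangles whose corner angles are within the tolerance
        // ever appear here; rejected candidates are silently dropped.
        print(observation.topLeft, observation.topRight,
              observation.bottomLeft, observation.bottomRight)
    }
}

// Corner angles may deviate up to 35 degrees from 90.
// Anything further off is excluded before results are returned,
// which is why the deviation can't be reported after the fact.
request.quadratureTolerance = 35

let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
try? handler.perform([request])
```

Because the rejection is internal to the API, the only workaround for "almost detected" objects is to widen the tolerance and inspect the returned corner points yourself.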
One option, which I have used in my own projects, is to apply perspective correction to objects that look skewed. Take a look at Apple's Core Image perspective correction filter (`CIPerspectiveCorrection`) if that's something you need for a downstream task.
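For anyone landing here later, a sketch of that filter in use — the four corner points are assumed to come from a rectangle observation already converted into the image's pixel coordinates (the variable names here are placeholders, not part of any API):

```swift
import CoreImage

// Sketch: straighten a skewed region with CIPerspectiveCorrection.
// `inputImage`, `topLeft`, `topRight`, `bottomLeft`, and `bottomRight`
// are assumed to be supplied by the caller.
func perspectiveCorrected(_ inputImage: CIImage,
                          topLeft: CGPoint, topRight: CGPoint,
                          bottomLeft: CGPoint, bottomRight: CGPoint) -> CIImage? {
    guard let filter = CIFilter(name: "CIPerspectiveCorrection") else { return nil }
    filter.setValue(inputImage, forKey: kCIInputImageKey)
    // Corners are passed as CIVectors in image coordinates.
    filter.setValue(CIVector(cgPoint: topLeft), forKey: "inputTopLeft")
    filter.setValue(CIVector(cgPoint: topRight), forKey: "inputTopRight")
    filter.setValue(CIVector(cgPoint: bottomLeft), forKey: "inputBottomLeft")
    filter.setValue(CIVector(cgPoint: bottomRight), forKey: "inputBottomRight")
    return filter.outputImage
}
```

The filter warps the quadrilateral defined by the four corners into an upright rectangle, which is usually enough to feed a downstream task (OCR, classification, etc.) a de-skewed crop.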
Cheers, Jon
For example, if the degree tolerance is set to 35 and the object is more than 35 degrees off from 90, is it possible to relay that to the user, or at least get that information?