Pablo-Paniagua opened 3 years ago
Thanks a lot Pablo for reporting this issue. I already expected that the whole move_tolerance feature might have some weird results. I'll look into it later.
Just out of curiosity: are the compared images RGB or greyscale, how big are they, and is the identified difference close to the edge of the image?
And yeah - this damn parallel executor makes things faster. But also more complex 🙈
The compared images are all RGB. And I am working with small images (full size is 128x160 px), so the identified difference is rather close to the edge in all cases. I have some masking in there, and my images are so small that I had to go in and change the image placeholder start and end points to be exactly what I tell them to be in the json file. Otherwise the masks kept covering elements I wanted to keep.
I'll check the move tolerance issue with smaller images in the next days. I'm pretty sure that I "overstep" a boundary of the small image region somewhere.
I did a few tests with small images (150x150px) that have moved text blocks and also noticed that different images are considered to be equal.
Short version:
Try adding the argument ignore_watermarks=False to the Compare Images keyword and check if it has an effect on your results.
Long version: The root cause is a hidden "feature" that I embedded specifically for the outputs in our company. They have a small "watermark" in the middle of the document, which shows the test environment name on which it was created. By default, the library will ignore failed checks if the detected difference looks like such a watermark, i.e. a small area in the middle of the page. With ignore_watermarks=False you can disable this feature.
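For reference, a minimal sketch of driving the same check from plain Python instead of Robot Framework. It assumes the Compare Images keyword maps to VisualTest.compare_images() with these argument names, so verify the signature against your installed VisualTest.py:

from DocTest.VisualTest import VisualTest

# Sketch only: the method name, argument order and file names are
# assumptions mirroring the Robot Framework keyword usage in this thread.
visual_tester = VisualTest()
visual_tester.compare_images("reference.jpg", "candidate.jpg",
                             move_tolerance=5, ignore_watermarks=False)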
I ran several tests with small images after adding the argument ignore_watermarks=False to the Compare Images keyword. It still seems to ignore differences in the pictures as soon as I add the move tolerance to them. Compare Images is capable of finding differences with move_tolerance if the images are bigger (2 times the size I need). I added masks per section to try to ease the task, but that doesn't help in any significant way.
Hey Many.
I have been experimenting and playing with the library a bunch, and I believe the issue is not with the ignore_watermarks feature but in the move_tolerance section itself (line 358 in VisualTest.py). The error is not showing because of the parallel executor being used (the worker doing the task never finishes); the code actually gets stuck converting the image to grey in line 212. All of this happens because self.BORDER_FOR_MOVE_TOLERANCE_CHECK causes the cropped image margins to fall outside of the image itself, i.e. become smaller than 0 (which is frequent with such small images).
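To make the failure concrete, here is a minimal standalone sketch (with made-up values, not the library's actual code) of how a negative crop margin produces an empty image and trips cv2.cvtColor:

import numpy as np
import cv2

BORDER = 50                                    # stand-in for BORDER_FOR_MOVE_TOLERANCE_CHECK
candidate = np.zeros((128, 160, 3), np.uint8)  # small BGR image, as in this issue
y, h, x, w = 10, 30, 10, 30                    # difference rectangle near the top-left corner

# y - BORDER == -40: NumPy reads a negative start index as "from the end",
# so the row slice becomes 88:90 and the column slice 120:90 (zero columns).
crop = candidate[y - BORDER:y + h + BORDER, x - BORDER:x + w + BORDER]
print(crop.shape)  # (2, 0, 3): an empty crop

# An empty source makes OpenCV raise the exact error seen in this thread:
# cv2.error: ... (-215:Assertion failed) !_src.empty() in function 'cv::cvtColor'
gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)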
It can be easily fixed either by setting self.BORDER_FOR_MOVE_TOLERANCE_CHECK = 0 (which I am avoiding) or by changing line 358 to

search_area_candidate = candidate[
    (y - self.BORDER_FOR_MOVE_TOLERANCE_CHECK) if y >= self.BORDER_FOR_MOVE_TOLERANCE_CHECK else 0 : y + h + self.BORDER_FOR_MOVE_TOLERANCE_CHECK,
    (x - self.BORDER_FOR_MOVE_TOLERANCE_CHECK) if x >= self.BORDER_FOR_MOVE_TOLERANCE_CHECK else 0 : x + w + self.BORDER_FOR_MOVE_TOLERANCE_CHECK]

or something similar that sets the values to 0 when they become negative.
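The same guard can be written more readably with max(); a sketch reusing the names from VisualTest.py rather than a tested patch:

# Clamp the crop's start indices at 0 (equivalent to the guard above).
b = self.BORDER_FOR_MOVE_TOLERANCE_CHECK
search_area_candidate = candidate[max(y - b, 0):y + h + b,
                                  max(x - b, 0):x + w + b]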
I have not had time to check if the issue also arises when going outside of the image dimensions (falling outside of the original image area), so I will get back with those results when I can.
And... I am back.
I have now also tested what happens when going out of the reference image borders, and found that it causes the min_val of cv2.matchTemplate to tend to 0, detecting elements as present when they are not. This can also be fixed by constraining the limits to remain within the image. I have done this by getting the reference image height and width (hr and wr respectively) with

hr, wr, _ = reference.shape

and then adjusting line 358 to

search_area_candidate = candidate[
    (y - self.BORDER_FOR_MOVE_TOLERANCE_CHECK) if y >= self.BORDER_FOR_MOVE_TOLERANCE_CHECK else 0 : (y + h + self.BORDER_FOR_MOVE_TOLERANCE_CHECK) if hr >= (y + h + self.BORDER_FOR_MOVE_TOLERANCE_CHECK) else hr,
    (x - self.BORDER_FOR_MOVE_TOLERANCE_CHECK) if x >= self.BORDER_FOR_MOVE_TOLERANCE_CHECK else 0 : (x + w + self.BORDER_FOR_MOVE_TOLERANCE_CHECK) if wr >= (x + w + self.BORDER_FOR_MOVE_TOLERANCE_CHECK) else wr]

This seems to solve all the issues with small images and move_tolerance.
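The full clamping also reads more clearly with max() and min(); again a sketch reusing the names from VisualTest.py rather than a tested patch:

# Keep the search area inside the image on all four sides.
b = self.BORDER_FOR_MOVE_TOLERANCE_CHECK
hr, wr = reference.shape[:2]  # image height and width (channels ignored)
y0, y1 = max(y - b, 0), min(y + h + b, hr)
x0, x1 = max(x - b, 0), min(x + w + b, wr)
search_area_candidate = candidate[y0:y1, x0:x1]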
Amazing job, thanks a lot for your investigation. I'll check and implement your suggestion in the next days. And I guess it's also a good idea to add a unit test for that special scenario.
@manykarim Thank you for developing this library. I found it useful for our image comparison tests. However, we faced this move_tolerance issue in the latest release available now [v0.8.1], and because of it we are unable to use this library. Were you able to reproduce this issue? Are there any parameters or workarounds by which we can overcome it? Hope you can advise a solution for this. Thank you!
Issue Details: Image comparison passes when we specify the 'move_tolerance' argument even though the compared images are different. An example screenshot is given here. Comparison code used:
Compare Images images/annotations_without_text1.jpg images/no_annotations.jpg move_tolerance=1
Compare Images images/annotations_without_text1.jpg images/no_annotations.jpg move_tolerance=1 ignore_watermarks=False
Thank you for your feedback @surekhakv. That does not sound good.
Unfortunately I did not mark this issue as closed, even though I'm sure I fixed something a while back. I will look into it.
In the meantime:
If the comparison worked as expected in a previous release, you could try installing that specific release by
pip install -U robotframework-doctestlibrary==0.8.0
Also just to confirm:
Without the move_tolerance option, the test is FAILED, correct?
@manykarim
Without the move_tolerance option, the test is FAILED, correct?
Thank you for the quick response.
Yes. The test is FAILED without the move_tolerance option.
Though I checked old versions from 0.3.1 to the latest, the test result obtained is the same as in the latest version. I could see that v0.2.0 has a bug fix for "move_tolerance checks at the edge of image". However, I could not install that version because of the error "ERROR: No matching distribution found for robotframework-doctestlibrary==0.2.0".
Kindly let me know if you need any additional input on the test case/scenario I used. Thank you.
Hello. I am having some problems with the VisualTest part of the library. I am comparing two images without a move_tolerance and I get (as expected) that they are not the same. I tried adding some move_tolerance to see if that would affect the result, and it certainly does: even with move_tolerance=0 it detects that the images are the same, when they are not.

I tried going into the library to find out why, and saw that check_for_differences is running in a parallel executor. When checking where the issue is, I found that the library gets stuck without complaining when getting to find_partial_image_position: it does not get past img_gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), and the result shows the images are the same. I tried taking check_for_differences outside of the parallel executor and then got the error:

(-215:Assertion failed) !_src.empty() in function 'cv::cvtColor'

Any ideas of why this might be?
In case it is relevant: