Open paintedbicycle opened 6 years ago
Thanks for reporting this issue. Do you have some time to look into this and send a PR?
I don't know Swift/Objective-C at all, but is it in here?
I notice there is a boolean asking whether we should "scale if smaller". Normally, this would be a definite no (it would make the image look pixelated), but for a variance of, say, under 5 pixels it would be fine. Might this be our issue — that the cropper isn't zooming in slightly to enforce the developer's pixel size?
The image returned from the iCloud Photo Library should be very large in my case (20+ MP) and therefore bigger than the requested pixel dimensions, so I'm not 100% confident in that assessment.
My other thought is the floor calls below. Perhaps the rounding down is causing the pixel issue?
Remember that in my case a 1875x1275 crop is sometimes being returned as 1874x1275.
```objc
- (UIImage *)resizedImageToFitInSize:(CGSize)boundingSize scaleIfSmaller:(BOOL)scale
{
    // Get the image size (independent of imageOrientation).
    CGImageRef imgRef = self.CGImage;
    CGSize srcSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef)); // not equivalent to self.size (which depends on the imageOrientation)!

    // Adjust boundingSize to make it independent of imageOrientation too for further computations.
    UIImageOrientation orient = self.imageOrientation;
    switch (orient) {
        case UIImageOrientationLeft:
        case UIImageOrientationRight:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRightMirrored:
            boundingSize = CGSizeMake(boundingSize.height, boundingSize.width);
            break;
        default:
            // NOP
            break;
    }

    // Compute the target CGRect in order to keep the aspect ratio.
    CGSize dstSize;
    if (!scale && (srcSize.width < boundingSize.width) && (srcSize.height < boundingSize.height)) {
        //NSLog(@"Image is smaller, and we asked not to scale it in this case (scaleIfSmaller:NO)");
        dstSize = srcSize; // no resize (we could directly return 'self' here, but we draw the image anyway to take image orientation into account)
    } else {
        CGFloat wRatio = boundingSize.width / srcSize.width;
        CGFloat hRatio = boundingSize.height / srcSize.height;
        if (wRatio < hRatio) {
            //NSLog(@"Width imposed, Height scaled ; ratio = %f", wRatio);
            dstSize = CGSizeMake(boundingSize.width, floorf(srcSize.height * wRatio));
        } else {
            //NSLog(@"Height imposed, Width scaled ; ratio = %f", hRatio);
            dstSize = CGSizeMake(floorf(srcSize.width * hRatio), boundingSize.height);
        }
    }

    return [self resizedImageToSize:dstSize];
}
@end
```
Hey, have you resolved it?
No, it caused me to move to a different library.
@paintedbicycle Which library are you using now? Can you please share?
I have the same issue.
Version
Tell us which versions you are using:
react-native: 0.52.0
react-native-image-crop-picker: 0.19.1
Platform
Tell us to which platform this issue is related
Expected behaviour
Crop respects inputted dimensions.
Actual behaviour
Crop sometimes returns an image with slightly different dimensions
Steps to reproduce
Set pixel size to 1875 x 1275
Choose a landscape image taken with a DSLR (2:3 ratio)
Don't zoom in
Crop image
Description
The issue here is that the crop picker should zoom the image in before cropping to ensure the developer's selected crop size is respected. If the crop aspect ratio and the image aspect ratio are very close (a 2:3 image with a ~2:3 crop), it can return an image with slightly different dimensions.
This is causing unexpected behaviour with my print-on-demand service: the printing service rejects the image outright.