Closed cltnschlosser closed 3 years ago
Hello. I took a quick look at this from the Twitter app and did not see any stack frames with the term "jpeg" or "decode" in them. I've intentionally run this without symbols. Can you provide an Instruments trace or some screenshots with what you are seeing?
Hmm, yeah I can definitely get that. I'll do it tomorrow. One thing is these `UIImage`s are being served to `SwiftUI.Image`s. Maybe SwiftUI is missing an optimization. I can double check if that is the issue too.
Not pictured is the same stack trace you have. It's on both the main and background threads.
It looks like a tick of a CADisplayLink timer is creating a Core Animation transaction that's kicking this off. I've not used SwiftUI, but in UIKit terms, I suspect the frame of the view is being set to a size that does not match the underlying image that has been downloaded (with a contentMode of "fit").
Maybe you can set a breakpoint in `setFrame` on the view and see where it's being set?
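In UIKit that could look something like this (a hypothetical debugging sketch; `FrameLoggingImageView` is not part of TIP or the app):

```swift
import UIKit

// Hypothetical debugging aid: log every frame change on the image view so a
// breakpoint (or the printed backtrace) reveals who is resizing it.
final class FrameLoggingImageView: UIImageView {
    override var frame: CGRect {
        didSet {
            // Set a breakpoint on this line to inspect the call stack.
            print("frame changed: \(oldValue) -> \(frame)")
        }
    }
}
```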
Hmm, there is no real `setFrame` equivalent, but I just tried replacing `Image(uiImage: image)` with `_UIImageView(image: image)`, and there is no decoding on the main thread, so it seems like a bug in `SwiftUI.Image`.
```swift
private struct _UIImageView: UIViewRepresentable {
    private let image: UIImage

    init(image: UIImage) {
        self.image = image
    }

    func makeUIView(context: Context) -> UIImageView {
        return UIImageView()
    }

    func updateUIView(_ uiView: UIImageView, context: Context) {
        uiView.image = image
    }

    typealias UIViewType = UIImageView
}
```
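For reference, a wrapper like the one above would be dropped into a SwiftUI body roughly like this (a sketch; `AvatarView` and the fixed size are illustrative, not from the app):

```swift
import SwiftUI

// Hypothetical call site: use the UIKit-backed wrapper in place of Image.
struct AvatarView: View {
    let uiImage: UIImage

    var body: some View {
        _UIImageView(image: uiImage)
            // Unlike Image, the representable does not drive SwiftUI layout,
            // so give it an explicit frame.
            .frame(width: 48, height: 48)
    }
}
```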
Describe the bug Even though there exists code in TIP to decode the image on a background thread, Apple is still decoding the image again on the main thread. It seems some recent change caused the background decoding in TIP to be ineffective.
To Reproduce Load remote images under profiler and do a filter for jpeg.
Expected behavior Images are decompressed only on background thread.
Environment iOS 14 (a coworker claims it happens on iOS 13 as well)
Additional context So there are 2 weird things: 1) The image that is returned from the background render is never used as part of decoding; only the original UIImage instance is used. This does result in 2 different UIImage instances, and the original seems to remain unmodified; it's possible this was a more recent iOS change. 2) The image is rendered at 1x1. I assume this was a memory optimization, but it means the rendered image can't just be swapped in place of the original one.
I'll create a PR soon; I need to get more familiar with the code and with what the best practice is for decoding images off the main thread. I wanted to create the issue now to raise awareness among those who are more familiar with this code.
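One common approach to forcing decompression off the main thread (a sketch of the general technique, not necessarily what TIP does or what the PR will do) is to draw the image into a full-size bitmap context on a background queue and hand the decoded copy to the UI. Drawing at full size, rather than 1x1, is what makes the result swappable for the original:

```swift
import UIKit

// Sketch: force JPEG decompression by rendering the image at its full size.
// The returned UIImage is backed by an already-decoded bitmap.
func decodedCopy(of image: UIImage) -> UIImage {
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = image.scale
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { _ in
        image.draw(at: .zero)
    }
}

// Hypothetical call site: decode on a background queue, display on main.
func display(_ original: UIImage, in imageView: UIImageView) {
    DispatchQueue.global(qos: .userInitiated).async {
        let decoded = decodedCopy(of: original)
        DispatchQueue.main.async {
            imageView.image = decoded
        }
    }
}
```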