pointfreeco / swift-snapshot-testing

📸 Delightful Swift snapshot testing.
https://www.pointfree.co/episodes/ep41-a-tour-of-snapshot-testing
MIT License

How to snapshot a view on macOS with provided screen scale? #428

Open darrarski opened 3 years ago

darrarski commented 3 years ago

When snapshot testing views on macOS, the recorded image size depends on the main display of the machine on which the tests are run.


When snapshot testing views on iOS, I can provide a display-scale trait collection to force the output image scale (as described in #427). Unfortunately, I haven't found a way to do the same for macOS views, so the test results depend on the display I am using (MacBook-embedded or external monitor).
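For reference, the size mismatch is pure arithmetic: a view rendered on a 2x (Retina) display backs onto a bitmap with twice the pixel dimensions of the same view rendered on a 1x display. A minimal sketch of that relationship (`pixelSize` is a hypothetical helper, not part of SnapshotTesting):

```swift
import Foundation

// Hypothetical helper: the pixel dimensions of the backing store
// for a view of the given point size at a given backing scale factor.
func pixelSize(for pointSize: CGSize, scale: CGFloat) -> CGSize {
    CGSize(width: pointSize.width * scale, height: pointSize.height * scale)
}

let viewSize = CGSize(width: 400, height: 300)
let onRetinaDisplay = pixelSize(for: viewSize, scale: 2)   // 800 x 600 px
let onExternalDisplay = pixelSize(for: viewSize, scale: 1) // 400 x 300 px
print(onRetinaDisplay, onExternalDisplay)
```

So the same test produces reference images of different pixel sizes depending on which display happens to be the main one when the tests run.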

I tried to work around this issue with the code below:

import AppKit
import SnapshotTesting

extension Snapshotting where Value == NSViewController, Format == NSImage {
  static func unscaledImage(precision: Float = 1) -> Snapshotting {
    Snapshotting<NSView, NSImage>
      .unscaledImage(precision: precision)
      .pullback { $0.view }
  }
}

extension Snapshotting where Value == NSView, Format == NSImage {
  static func unscaledImage(precision: Float = 1) -> Snapshotting {
    SimplySnapshotting<NSImage>
      .image(precision: precision)
      .pullback { $0.toImage().unscaled() }
  }
}

private extension NSView {
  func toImage() -> NSImage {
    let cacheRep = bitmapImageRepForCachingDisplay(in: bounds)!
    cacheDisplay(in: bounds, to: cacheRep)
    let image = NSImage(size: bounds.size)
    image.addRepresentation(cacheRep)
    return image
  }
}

private extension NSImage {
  func unscaled() -> NSImage {
    let image = NSImage(size: size)
    image.addRepresentation(unscaledBitmapImageRep())
    return image
  }

  func unscaledBitmapImageRep() -> NSBitmapImageRep {
    let imageRep = NSBitmapImageRep(
      bitmapDataPlanes: nil,
      pixelsWide: Int(size.width),
      pixelsHigh: Int(size.height),
      bitsPerSample: 8,
      samplesPerPixel: 4,
      hasAlpha: true,
      isPlanar: false,
      colorSpaceName: .deviceRGB,
      bytesPerRow: 0,
      bitsPerPixel: 0
    )!
    imageRep.size = size
    NSGraphicsContext.saveGraphicsState()
    NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: imageRep)
    draw(at: .zero, from: .zero, operation: .sourceOver, fraction: 1)
    NSGraphicsContext.restoreGraphicsState()
    return imageRep
  }
}

And while this fixes the mismatch in output image size (with the code above it always equals the point size of the view), it does not work as expected. Scaling the image down introduces minor distortion that causes test failures (for example, text in the snapshot image is shifted by about 1 px).
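The 1 px shift is consistent with resampling: content that lands on half-pixel boundaries in the 2x backing store has no exact home in the 1x grid, so rounding decides which pixel it ends up on. A rough illustration of the effect, with `pixelColumn` as a hypothetical mapping rather than anything from AppKit:

```swift
import Foundation

// Hypothetical illustration: the pixel column a point-space x coordinate
// snaps to after rounding to the nearest whole pixel at a given scale.
func pixelColumn(atX x: Double, scale: Double) -> Int {
    Int((x * scale).rounded())
}

// A glyph origin at x = 10.25 pt:
let at2x = pixelColumn(atX: 10.25, scale: 2) // pixel 21, i.e. 10.5 pt
let at1x = pixelColumn(atX: 10.25, scale: 1) // pixel 10, i.e. 10.0 pt
// The two renders disagree by half a point; after anti-aliasing and
// downscaling, disagreements like this can surface as a ~1 px shift
// between the compared bitmaps.
print(at2x, at1x)
```

This is why pixel-exact comparison of a downscaled 2x render against a native 1x render tends to fail even when both look correct to the eye.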

Is there a way of forcing the scale of a snapshot image on macOS?

bzhoek commented 3 years ago

Hi @darrarski, did you get any further with this? I'm struggling with the same problem, but for me it's my laptop versus GitHub Actions.

darrarski commented 3 years ago

Unfortunately, I haven't had time to work on this issue or research a solution.

bzhoek commented 3 years ago

Scaling the images down seems to be a dead end: anti-aliasing and font smoothing produce inconsistent results.

As a workaround, I unscale the MacBook's Retina display with https://github.com/th507/screen-resolution-switcher before taking the snapshots and revert it afterwards. For my MacBook 16", it looks like this:

scres -s 3072
xcodebuild ... test
scres -r 1536

This works for my use case of GitHub Actions.
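One way to make that workaround a little more robust is to restore the resolution even when the test step fails, e.g. with a shell trap. A sketch under the same assumptions as above (scres on the PATH, resolution values for a MacBook 16"; the scheme name is a placeholder):

```shell
#!/bin/sh
set -e
# Switch to the resolution the snapshots were recorded at.
scres -s 3072
# Revert on exit, whether the tests pass or fail.
trap 'scres -r 1536' EXIT
# "MyScheme" is a placeholder; substitute your real xcodebuild invocation.
xcodebuild -scheme MyScheme test
```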

DanKorkelia commented 1 year ago

I wonder if anyone has had any luck making the output from macOS snapshots sharper?