I am making an app that uses CALayer to show content such as images in an NSWindow. The layer has a shadow styled with shadowPath to give a better appearance. Finally, to save/export the whole CALayer and its contents, the parent NSView is converted to an NSImage. The shadow in the NSImage is entirely different from the actual shadow of the CALayer. I can't figure out why this is happening. Is this normal behaviour in AppKit, or am I doing something wrong?
These are the differences in the shadows:
image(1) - CALayer with shadowPath (shadow along the bottom only).
image(2) - NSImage created from the superview (shadow on all four sides).
This is how the shadow is added in image(1):
layer.masksToBounds = false

// A wide, flat oval, offset away from the layer, used as the shadow path
// so that the shadow shows along one edge only.
let size: CGFloat = 100
let distance: CGFloat = 200
let rect = CGRect(
    x: -size,
    y: layer.frame.height - (size * 0.4) + distance,
    width: layer.frame.width + size * 2,
    height: size
)

layer.shadowColor = NSColor.black.cgColor
layer.shadowRadius = 100
layer.shadowOpacity = 1
layer.shadowPath = NSBezierPath(ovalIn: rect).cgPath // NSBezierPath.cgPath requires macOS 14+
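Note that NSBezierPath only gained a built-in cgPath property in macOS 14. On earlier deployment targets a small conversion helper is needed; a minimal sketch, where cgPathCompat is a made-up name:

import AppKit

extension NSBezierPath {
    // Builds a CGPath by walking the path's elements.
    // Only needed before macOS 14, where NSBezierPath has no cgPath property.
    var cgPathCompat: CGPath {
        let path = CGMutablePath()
        var points = [NSPoint](repeating: .zero, count: 3)
        for index in 0..<elementCount {
            switch element(at: index, associatedPoints: &points) {
            case .moveTo:
                path.move(to: points[0])
            case .lineTo:
                path.addLine(to: points[0])
            case .curveTo:
                path.addCurve(to: points[2], control1: points[0], control2: points[1])
            case .closePath:
                path.closeSubpath()
            default:
                break
            }
        }
        return path
    }
}

With such a helper, the last line above would read layer.shadowPath = NSBezierPath(ovalIn: rect).cgPathCompat instead.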
This is how the superview is converted to the NSImage shown in image(2):
// Ask the view to render itself into a cached bitmap representation.
let imageRep = view.bitmapImageRepForCachingDisplay(in: view.bounds)
view.cacheDisplay(in: view.bounds, to: imageRep!)

// Wrap the bitmap in an NSImage, then rebuild the image from its TIFF data.
let image = NSImage(size: view.bounds.size)
image.addRepresentation(imageRep!)
let imageData = image.tiffRepresentation
return NSImage(data: imageData!)!
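For the save/export step mentioned above, the returned NSImage can then be written to disk, for example as a PNG. A minimal sketch, where exportPNG(_:to:) and the output path are assumptions rather than part of the original code:

import AppKit

// Writes an NSImage to disk as a PNG file.
// `image` is assumed to be the snapshot returned by the code above.
func exportPNG(_ image: NSImage, to url: URL) throws {
    guard let tiff = image.tiffRepresentation,
          let rep = NSBitmapImageRep(data: tiff),
          let png = rep.representation(using: .png, properties: [:])
    else {
        throw CocoaError(.fileWriteUnknown)
    }
    try png.write(to: url)
}

// Example:
// try exportPNG(snapshotImage, to: URL(fileURLWithPath: "/tmp/layer-snapshot.png"))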

One option, which may or may not be suitable:
Here's an attempt. Note: I work with iOS, so there are lots of hard-coded values and possibly (likely) incorrect ways of doing this:
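A minimal sketch of the approach, assuming the idea is to render the view's backing layer directly with CALayer.render(in:) instead of going through cacheDisplay(in:to:); the helper name imageRepresentation() matches the call shown below, but the body here is an assumption:

import AppKit

extension NSView {
    // Sketch: composite the view's backing layer (including its shadowPath)
    // into a bitmap, instead of asking the view to redraw via cacheDisplay.
    func imageRepresentation() -> NSImage? {
        guard let layer = self.layer,
              let rep = bitmapImageRepForCachingDisplay(in: bounds),
              let context = NSGraphicsContext(bitmapImageRep: rep)
        else { return nil }

        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = context
        layer.render(in: context.cgContext)
        NSGraphicsContext.restoreGraphicsState()

        let image = NSImage(size: bounds.size)
        image.addRepresentation(rep)
        return image
    }
}

Depending on whether the view is layer-backed or layer-hosting, the context's coordinate flipping may also need adjusting before calling render(in:).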
Output when running, showing the result of let img = view.imageRepresentation():