I tried different solutions (for example this one: https://stackoverflow.com/questions/25146557/how-do-i-get-the-color-of-a-pixel-in-a-uiimage-with-swift), but the colors I get look slightly different from the actual image. I suspect this is because the image is RGB only, not RGBA. Could that be the problem?
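To check whether an alpha channel is actually present, I inspect the backing CGImage (a minimal sketch; the UIImage parameter stands in for whatever image is being sampled):

```swift
import UIKit

// Sketch: report whether a UIImage's backing CGImage stores an alpha channel.
func hasAlphaChannel(_ image: UIImage) -> Bool {
    guard let alphaInfo = image.cgImage?.alphaInfo else { return false }
    switch alphaInfo {
    case .none, .noneSkipFirst, .noneSkipLast:
        return false // RGB only, no usable alpha
    default:
        return true  // alpha is stored (straight or premultiplied)
    }
}
```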
A related question: if the UIImage has contentMode = .scaleAspectFill, do I have to recompute the image, or can I just use imageView.image?
EDIT:
I tried using this extension:
extension CALayer {
    func getPixelColor(point: CGPoint) -> CGColor {
        // Render one pixel of the layer into a 4-byte RGBA buffer
        var pixel: [CUnsignedChar] = [0, 0, 0, 0]
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8,
                                bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

        // Shift the layer so the requested point lands at the context's origin
        context!.translateBy(x: -point.x, y: -point.y)
        self.render(in: context!)

        let red   = CGFloat(pixel[0]) / 255.0
        let green = CGFloat(pixel[1]) / 255.0
        let blue  = CGFloat(pixel[2]) / 255.0
        let alpha = CGFloat(pixel[3]) / 255.0

        return UIColor(red: red, green: green, blue: blue, alpha: alpha).cgColor
    }
}
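For completeness, this is roughly how I call it (a sketch; imageView and the tap coordinates are stand-ins for my actual view and touch location):

```swift
// Hypothetical call site: sample the layer color at a given point.
let tapPoint = CGPoint(x: 10, y: 10)
let sampled = imageView.layer.getPixelColor(point: tapPoint)
print(sampled.components ?? [])
```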
But for some images the coordinate system seems to be flipped, and for others I get completely wrong values... What am I missing here?
EDIT 2:
I tried it with these images:
https://dl.dropboxusercontent.com/u/119600/gradient.png
https://dl.dropboxusercontent.com/u/119600/gradient@2x.png
but I do get wrong values. They are embedded in a UIImageView, but I convert the coordinates:
private func convertScreenPointToImage(point: CGPoint) -> CGPoint {
    let widthMultiplier = gradientImage.size.width / UIScreen.main.bounds.width
    let heightMultiplier = gradientImage.size.height / UIScreen.main.bounds.height
    return CGPoint(x: point.x * widthMultiplier, y: point.y * heightMultiplier)
}
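As a sanity check on the multiplier arithmetic (a standalone sketch with assumed sizes, not my real view hierarchy): a 750×1334 px image shown full-screen on a 375×667 pt screen should give a multiplier of exactly 2 in each direction:

```swift
import Foundation

// Assumed sizes: a 750x1334 px image on a 375x667 pt screen (iPhone 7-class).
let imageSize = CGSize(width: 750, height: 1334)
let screenSize = CGSize(width: 375, height: 667)

// Same math as convertScreenPointToImage above, with the sizes passed in.
func convert(_ point: CGPoint, image: CGSize, screen: CGSize) -> CGPoint {
    let widthMultiplier = image.width / screen.width
    let heightMultiplier = image.height / screen.height
    return CGPoint(x: point.x * widthMultiplier, y: point.y * heightMultiplier)
}

let mapped = convert(CGPoint(x: 100, y: 50), image: imageSize, screen: screenSize)
// Both multipliers are exactly 2 here, so (100, 50) in points
// should map to (200, 100) in image pixels.
print(mapped)
```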
This one gives me === Optional((51, 76, 184, 255)) when running on the iPhone 7 simulator, which is not correct...