I want a function that takes a CVPixelBufferRef, performs a few operations on it, and produces a cv::Mat:
- crop to a region of interest
- scale to a fixed size
- equalize the histogram
- convert to grayscale, 8 bits per pixel (CV_8UC1)

I'm not sure what the most efficient order is, but I do know that all of these operations are available on an OpenCV matrix, so I'd like to know how to do the conversion.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    cv::Mat frame = f(pixelBuffer); // how do I implement f()?
}
I found the answer in this excellent GitHub source: https://github.com/shihongzhi/CamShift-on-iOS/blob/05ab212064c697a96b77c0a3a313ec48f94d6a2c/CamShift/VideoCaptureViewController.mm. I've adapted it here for simplicity. It also takes care of the grayscale conversion for me.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);

// Set the following dict on AVCaptureVideoDataOutput's videoSettings to get YUV output
// @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) }
NSAssert(format == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, @"Only YUV is supported");

// The first plane / channel (at index 0) is the grayscale (luma) plane.
// See more information about the YUV format:
// http://en.wikipedia.org/wiki/YUV
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
void *baseaddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
size_t width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

// Pass the plane's stride explicitly: rows may be padded, so bytesPerRow
// is not necessarily equal to width. Note that the Mat only wraps the
// buffer's memory without copying; clone() it if you need the pixels
// after unlocking.
cv::Mat mat((int)height, (int)width, CV_8UC1, baseaddress, bytesPerRow);

// Use the mat here

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
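Putting the snippet above together, the f() from the question can be sketched roughly as follows (a sketch, not a tested implementation; matFromPixelBuffer is an illustrative name, and the clone() is there because the wrapped memory is only valid while the buffer is locked):

```
cv::Mat matFromPixelBuffer(CVPixelBufferRef pixelBuffer)
{
    // Assumes the YUV biplanar format asserted above.
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    void *base     = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t width   = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height  = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    size_t stride  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

    // Wrap the luma plane without copying, then clone so the result
    // remains valid after the buffer is unlocked.
    cv::Mat gray = cv::Mat((int)height, (int)width, CV_8UC1, base, stride).clone();

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return gray;
}
```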
I think the best order is:
- convert to grayscale (since, as above, it comes almost for free)
- crop (this should be a fast operation, and it reduces the number of pixels to work with)
- scale down
- equalize the histogram