AVPlayer not rendering to its AVPlayerLayer

I have an AVPlayerLayer (a subclass of CALayer) that I need to get into an image type I can pass to a QCRenderer (QCRenderer accepts NSImages and CIImages). I can convert the CALayer to a CGImageRef, and that to an NSImage, but the contents always come out clear (fully transparent).

I've narrowed it down to one of two causes:

  • I am not creating the NSImage correctly.
  • The AVPlayer is not rendering to the AVPlayerLayer.

I am not getting any errors, and I have found some documentation on converting CALayers. Also, I added the AVPlayerLayer to an NSView and it stays empty, so I believe 2 is the problem.

    I am using a modified version of the AVPlayerDemoPlaybackViewController from Apple's AVPlayerDemo. I turned it into an NSObject, since I stripped out all of the interface code.

    I create the AVPlayerLayer in the (void)prepareToPlayAsset:withKeys: method, where the AVPlayer is created (I am just adding the layer to an NSView to test whether it works):

    if (![self player])
    {
        /* Get a new AVPlayer initialized to play the specified player item. */
        [self setPlayer:[AVPlayer playerWithPlayerItem:self.mPlayerItem]];  
    
        /* Observe the AVPlayer "currentItem" property to find out when any 
         AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did 
         occur.*/
        [self.player addObserver:self 
                      forKeyPath:kCurrentItemKey 
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];
        mPlaybackView = [AVPlayerLayer playerLayerWithPlayer:self.player];
        [self.theView setWantsLayer:YES];
        [mPlaybackView setFrame:self.theView.layer.bounds];
        [self.theView.layer addSublayer:mPlaybackView];
    }
    

    Then I schedule an NSTimer on the current run loop to grab a frame from the AVPlayerLayer 30 times per second:

    // 1.0/30.0, not 1/30: integer division would give a time interval of 0
    framegrabTimer = [NSTimer timerWithTimeInterval:(1.0 / 30.0) target:self selector:@selector(grabFrameFromMovie) userInfo:nil repeats:YES];
    [[NSRunLoop currentRunLoop] addTimer:framegrabTimer forMode:NSDefaultRunLoopMode];
    

    Here is the code I use to grab the frame and pass it to the class that handles the QCRenderer:

    -(void)grabFrameFromMovie {
        // Render the layer into an offscreen bitmap context, wrap the result
        // in an NSImage, and broadcast it to whoever drives the QCRenderer.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
        CGContextRef theContext = CGBitmapContextCreate(NULL, mPlaybackView.frame.size.width, mPlaybackView.frame.size.height, 8, 4 * mPlaybackView.frame.size.width, colorSpace, kCGImageAlphaPremultipliedLast);
        [mPlaybackView renderInContext:theContext];
        CGImageRef CGImage = CGBitmapContextCreateImage(theContext);
        NSImage *image = [[NSImage alloc] initWithCGImage:CGImage size:NSMakeSize(mPlaybackView.frame.size.width, mPlaybackView.frame.size.height)];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"AVPlayerLoadedNewFrame" object:[image copy]];
        CGContextRelease(theContext);
        CGColorSpaceRelease(colorSpace);
        CGImageRelease(CGImage);
    }
    

    I can't figure out why all I get is a clear image. Any help is greatly appreciated, as there is not much AVFoundation documentation for OS X.


    This works for me:

    AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:[[[self player] currentItem] asset]];
    CGImageRef capture = [gen copyCGImageAtTime:self.player.currentTime actualTime:NULL error:NULL];
    NSImage *img = [[NSImage alloc] initWithCGImage:capture size:self.playerView.frame.size];
    CGImageRelease(capture); // copyCGImageAtTime: returns a +1 reference
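
    renderInContext: only captures content a layer draws itself; AVPlayerLayer composites its video frames outside that path, which is most likely why the bitmap comes out clear. AVAssetImageGenerator sidesteps the layer entirely. A version of the question's grab method built on it might look like the following sketch (not part of the original answer; it assumes the same self.player, self.playerView, and notification name as the question). Note that copyCGImageAtTime:actualTime:error: decodes synchronously, so calling it 30 times per second may be too slow for smooth playback:

    -(void)grabFrameFromMovie {
        // In real code, create one generator per asset rather than one per frame.
        AVAsset *asset = [[self.player currentItem] asset];
        AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        gen.requestedTimeToleranceBefore = kCMTimeZero; // ask for the exact frame,
        gen.requestedTimeToleranceAfter = kCMTimeZero;  // not the nearest keyframe
        NSError *error = nil;
        CGImageRef capture = [gen copyCGImageAtTime:self.player.currentTime actualTime:NULL error:&error];
        if (capture) {
            NSImage *image = [[NSImage alloc] initWithCGImage:capture size:self.playerView.frame.size];
            [[NSNotificationCenter defaultCenter] postNotificationName:@"AVPlayerLoadedNewFrame" object:image];
            CGImageRelease(capture);
        }
    }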
    

    You can add an AVPlayerItemVideoOutput to the AVPlayerItem, and then call copyPixelBufferForItemTime to get a CVPixelBufferRef containing the frame at the specified time. Here is some sample code:

    NSDictionary *pixBuffAttributes = @{
        (id)kCVPixelBufferWidthKey: @(nWidth),
        (id)kCVPixelBufferHeightKey: @(nHeight),
        (id)kCVPixelBufferCGImageCompatibilityKey: @YES,
    };
    m_output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    
    ...
    
    m_buffer = [m_output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    
    CVPixelBufferLockBaseAddress(m_buffer, 0);
    auto *buffer = CVPixelBufferGetBaseAddress(m_buffer);
    frame->width = CVPixelBufferGetWidth(m_buffer);
    frame->height = CVPixelBufferGetHeight(m_buffer);
    frame->widthbytes = CVPixelBufferGetBytesPerRow(m_buffer);
    frame->bufferlen = frame->widthbytes * (uint32)CVPixelBufferGetHeight(m_buffer);
    
    // m_info is the answerer's own structure; videoInfo.pixelbits is the bits
    // per pixel of the buffer (e.g. 32 for a BGRA pixel format).
    auto &videoInfo = m_info.video;
    // The data provider references the pixel buffer's memory directly, so the
    // buffer must stay locked while m_image is in use; unlock and release it
    // (CVPixelBufferUnlockBaseAddress / CVPixelBufferRelease) afterwards.
    CGDataProviderRef dp = CGDataProviderCreateWithData(nullptr, buffer, frame->bufferlen, nullptr);
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    m_image = CGImageCreate(frame->width,
                            frame->height,
                            8,
                            videoInfo.pixelbits,
                            frame->widthbytes,
                            cs,
                            kCGImageAlphaNoneSkipFirst,
                            dp,
                            nullptr,
                            true,
                            kCGRenderingIntentDefault);
    CGColorSpaceRelease(cs);
    CGDataProviderRelease(dp);
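
    The "..." above elides the wiring between the two snippets; presumably the output is attached to the player item and then polled until a frame is ready, along these lines (a sketch: addOutput:, itemTimeForHostTime:, and hasNewPixelBufferForItemTime: are standard AVFoundation API, but the surrounding structure and the self.player reference are assumptions):

    // Attach the output to the item being played (e.g. right after creating it):
    [self.player.currentItem addOutput:m_output];
    
    // Then, from a CVDisplayLink or timer callback:
    CMTime time = [m_output itemTimeForHostTime:CACurrentMediaTime()];
    if ([m_output hasNewPixelBufferForItemTime:time]) {
        m_buffer = [m_output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
        // ... proceed as above; when done, balance the lock and the copy:
        // CVPixelBufferUnlockBaseAddress(m_buffer, 0);
        // CVPixelBufferRelease(m_buffer);
    }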
    

    You can also check out Apple's official sample:

    Real-time Video Processing Using AVPlayerItemVideoOutput
