AVPlayer not rendering to its AVPlayerLayer

I have an AVPlayerLayer (a subclass of CALayer) and I need to get it into an image type that can be passed to a QCRenderer (QCRenderer accepts NSImages and CIImages). I can convert the CALayer to a CGImageRef, and that to an NSImage, but the contents are always clear.
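
For reference, feeding the NSImage to the QCRenderer is just a matter of setting a published image input and rendering; here is a minimal sketch, assuming a composition whose image input is published under the hypothetical key "videoFrame":

    #import <Quartz/Quartz.h>

    // Push one captured frame into the renderer and draw it.
    // "videoFrame" is a placeholder for whatever input key your composition publishes.
    - (void)renderImage:(NSImage *)image withRenderer:(QCRenderer *)renderer atTime:(NSTimeInterval)time
    {
        [renderer setValue:image forInputKey:@"videoFrame"];   // QCRenderer accepts NSImage/CIImage values here
        [renderer renderAtTime:time arguments:nil];
    }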

I've narrowed it down to one of two reasons:

  1. I am not creating the NSImage correctly.
  2. The AVPlayer is not rendering to the AVPlayerLayer.

I am not receiving any errors, and I have found some documentation on converting CALayers. I also added the AVPlayerLayer to an NSView, which stays empty, so I believe reason 2 is the problem.

I'm using a modified version of the AVPlayerDemoPlaybackViewController class from Apple's AVPlayerDemo sample. I turned it into an NSObject since I stripped all of the interface code out of it.

I create the AVPlayerLayer in the -prepareToPlayAsset:withKeys: method, where I create the AVPlayer (I'm only adding the layer to an NSView to test whether it is working):

    if (![self player])
    {
        /* Get a new AVPlayer initialized to play the specified player item. */
        [self setPlayer:[AVPlayer playerWithPlayerItem:self.mPlayerItem]];  
    
        /* Observe the AVPlayer "currentItem" property to find out when any 
         AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did 
         occur.*/
        [self.player addObserver:self 
                      forKeyPath:kCurrentItemKey 
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];
        mPlaybackView = [AVPlayerLayer playerLayerWithPlayer:self.player];
        [self.theView setWantsLayer:YES];
        [mPlaybackView setFrame:self.theView.layer.bounds];
        [self.theView.layer addSublayer:mPlaybackView];
    }
    

I then schedule an NSTimer on the current run loop to grab a frame of the AVPlayerLayer 30 times per second:

    framegrabTimer = [NSTimer timerWithTimeInterval:(1.0 / 30.0)   /* note: 1/30 is integer division and evaluates to 0 */
                                             target:self selector:@selector(grabFrameFromMovie) userInfo:nil repeats:YES];
    [[NSRunLoop currentRunLoop] addTimer:framegrabTimer forMode:NSDefaultRunLoopMode];
    

Here is the code I use to grab the frame and pass it to the class that handles the QCRenderer:

    - (void)grabFrameFromMovie
    {
        CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
        CGContextRef theContext = CGBitmapContextCreate(NULL, mPlaybackView.frame.size.width, mPlaybackView.frame.size.height,
                                                        8, 4 * mPlaybackView.frame.size.width, colorSpace, kCGImageAlphaPremultipliedLast);
        [mPlaybackView renderInContext:theContext];
        CGImageRef cgImage = CGBitmapContextCreateImage(theContext);
        NSImage *image = [[NSImage alloc] initWithCGImage:cgImage
                                                     size:NSMakeSize(mPlaybackView.frame.size.width, mPlaybackView.frame.size.height)];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"AVPlayerLoadedNewFrame" object:[image copy]];
        CGContextRelease(theContext);
        CGColorSpaceRelease(colorSpace);
        CGImageRelease(cgImage);
    }
    

I can't figure out why I'm only getting a clear image. Any help with this is greatly appreciated, as there is not much AVFoundation documentation for OS X.


This works for me:

    AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:[[[self player] currentItem] asset]];
    CGImageRef capture = [gen copyCGImageAtTime:self.player.currentTime actualTime:NULL error:NULL];
    NSImage *img = [[NSImage alloc] initWithCGImage:capture size:self.playerView.frame.size];
    CGImageRelease(capture);   // copyCGImageAtTime: returns a +1 CGImage
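
If you want to keep the question's 30-fps timer and notification flow, a minimal sketch of the timer callback built on AVAssetImageGenerator could look like the following (player, theView, and the notification name come from the question; creating the generator once and reusing it would be cheaper, and the zero tolerances are only needed if you want the exact current frame):

    - (void)grabFrameFromMovie
    {
        AVAsset *asset = [[[self player] currentItem] asset];
        AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        gen.requestedTimeToleranceBefore = kCMTimeZero;   // ask for the exact current time
        gen.requestedTimeToleranceAfter  = kCMTimeZero;

        NSError *error = nil;
        CGImageRef capture = [gen copyCGImageAtTime:self.player.currentTime
                                         actualTime:NULL
                                              error:&error];
        if (capture) {
            NSImage *image = [[NSImage alloc] initWithCGImage:capture
                                                         size:self.theView.frame.size];
            [[NSNotificationCenter defaultCenter] postNotificationName:@"AVPlayerLoadedNewFrame"
                                                                object:image];
            CGImageRelease(capture);   // copyCGImageAtTime: returns a +1 image
        }
    }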
    

You can add an AVPlayerItemVideoOutput to the AVPlayerItem and then call copyPixelBufferForItemTime:itemTimeForDisplay: to get a CVPixelBufferRef containing the frame at the specified time. Here is some sample code:

    NSDictionary *pixBuffAttributes = @{
        (id)kCVPixelBufferWidthKey                : @(nWidth),
        (id)kCVPixelBufferHeightKey               : @(nHeight),
        (id)kCVPixelBufferCGImageCompatibilityKey : @YES,
    };
    m_output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    
    ...
    
    m_buffer = [m_output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    
    CVPixelBufferLockBaseAddress(m_buffer, 0);
    auto *buffer = CVPixelBufferGetBaseAddress(m_buffer);
    frame->width = CVPixelBufferGetWidth(m_buffer);
    frame->height = CVPixelBufferGetHeight(m_buffer);
    frame->widthbytes = CVPixelBufferGetBytesPerRow(m_buffer);
    frame->bufferlen = frame->widthbytes * (uint32)CVPixelBufferGetHeight(m_buffer);
    
    auto &videoInfo = m_info.video;
    CGDataProviderRef dp = CGDataProviderCreateWithData(nullptr, buffer, frame->bufferlen, nullptr);
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    m_image = CGImageCreate(frame->width,
                            frame->height,
                            8,
                            videoInfo.pixelbits,
                            frame->widthbytes,
                            cs,
                            kCGImageAlphaNoneSkipFirst,
                            dp,
                            nullptr,
                            true,
                            kCGRenderingIntentDefault);
    CGColorSpaceRelease(cs);
    CGDataProviderRelease(dp);
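
For context, here is a minimal sketch of the setup the snippet above assumes: the output has to be attached to the AVPlayerItem, and hasNewPixelBufferForItemTime: lets you poll before copying (m_output is reused from the answer; the display-link/timer callback is an assumption about the surrounding code, and the CVPixelBufferLockBaseAddress call above should eventually be balanced with an unlock and the buffer released):

    // Attach the video output to the item being played, e.g. right after the
    // AVPlayerItem is created or becomes the player's current item.
    [self.player.currentItem addOutput:m_output];

    // Later, from a CVDisplayLink or timer callback, poll for a new frame:
    CMTime itemTime = [m_output itemTimeForHostTime:CACurrentMediaTime()];
    if ([m_output hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer = [m_output copyPixelBufferForItemTime:itemTime
                                                          itemTimeForDisplay:NULL];
        // ... lock, read the pixels / build the CGImage as shown above, unlock ...
        CVPixelBufferRelease(pixelBuffer);   // copyPixelBufferForItemTime: returns +1
    }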
    

You can also check out Apple's official sample:

Real-timeVideoProcessingUsingAVPlayerItemVideoOutput
