2015-09-24

iOS 9 - problems cropping a CVImageBuffer

I am having some cropping problems with the iOS 9 SDK.

I have the following code to scale an image (converting from 4:3 to 16:9 by cropping in the middle). This worked fine up to the iOS 8 SDK. With iOS 9, the bottom area of the output is blank.

- (CMSampleBufferRef)resizeImage:(CMSampleBufferRef)sampleBuffer { 
    { 
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
     CVPixelBufferLockBaseAddress(imageBuffer,0); 

     int target_width = (int)CVPixelBufferGetWidth(imageBuffer); 
     int target_height = (int)CVPixelBufferGetHeight(imageBuffer); 
     int height = (int)CVPixelBufferGetHeight(imageBuffer); 
     int width = (int)CVPixelBufferGetWidth(imageBuffer); 

     int x=0, y=0; 

     // Convert 4:3 to 16:9 
     if (((target_width*3)/target_height) == 4) 
     { 
      target_height = ((target_width*9)/16); 
      target_height = ((target_height + 15)/16) * 16; 
      y = (height - target_height)/2; 
     } 
     else 
     if ((target_width == 352) && (target_height == 288)) 
     { 
      target_height = ((target_width*9)/16); 
      target_height = ((target_height + 15)/16) * 16; 
      y = (height - target_height)/2; 
     } 
     else 
     if (((target_height*3)/target_width) == 4) 
     { 
      target_width = ((target_height*9)/16); 
      target_width = ((target_width + 15)/16) * 16; 
       x = ((width - target_width)/2); 
     } 
     else 
     if ((target_width == 288) && (target_height == 352)) 
     { 
      target_width = ((target_height*9)/16); 
      target_width = ((target_width + 15)/16) * 16; 
       x = ((width - target_width)/2); 
     } 

     CGRect cropRect; 

     NSLog(@"resizeImage x %d, y %d, target_width %d, target_height %d", x, y, target_width, target_height); 
     cropRect = CGRectMake(x, y, target_width, target_height); 
     CFDictionaryRef empty; // empty value for attr value. 
     CFMutableDictionaryRef attrs; 
     empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
            NULL, 
            NULL, 
            0, 
            &kCFTypeDictionaryKeyCallBacks, 
            &kCFTypeDictionaryValueCallBacks); 
     attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 
              1, 
              &kCFTypeDictionaryKeyCallBacks, 
              &kCFTypeDictionaryValueCallBacks); 

     CFDictionarySetValue(attrs, 
           kCVPixelBufferIOSurfacePropertiesKey, 
           empty); 

     OSStatus status; 
     CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer]; //options: [NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]]; 
     CVPixelBufferRef pixelBuffer; 
     status = CVPixelBufferCreate(kCFAllocatorSystemDefault, target_width, target_height, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, attrs, &pixelBuffer); 
     if (status != 0) 
     { 
      NSLog(@"CVPixelBufferCreate error %d", (int)status); 
     } 

     [ciContext render:ciImage toCVPixelBuffer:pixelBuffer bounds:cropRect colorSpace:nil]; 
     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0); 
     CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

     CMSampleTimingInfo sampleTime = { 
      .duration = CMSampleBufferGetDuration(sampleBuffer), 
      .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer), 
      .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer) 
     }; 

     CMVideoFormatDescriptionRef videoInfo = NULL; 
     status = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo); 
     if (status != 0) 
     { 
      NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer error %d", (int)status); 
     } 
     CMSampleBufferRef oBuf; 
     status = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &sampleTime, &oBuf); 
     if (status != 0) 
     { 
      NSLog(@"CMSampleBufferCreateForImageBuffer error %d", (int)status); 
     } 
     CFRelease(pixelBuffer); 
     ciImage = nil; 
     pixelBuffer = nil; 
     return oBuf; 
    } 
} 

Any ideas or suggestions on this? I have tried changing the rectangle, but with no effect.

Thanks


If I set x and y to zero, i.e. x = 0, y = 0, it works, but it does not crop from the center; instead it removes the top of the frame. – manishg

Answer


Are you aware of what the doc comment of [CIContext render:toCVPixelBuffer:bounds:colorSpace:] says about iOS 8 vs. iOS 9+? (I could not find an online resource to link to, though.)

/* Render 'image' to the given CVPixelBufferRef. 
* The 'bounds' parameter has the following behavior: 
* In OS X and iOS 9 and later: The 'image' is rendered into 'buffer' so that 
*  point (0,0) of 'image' aligns to the lower left corner of 'buffer'. 
*  The 'bounds' acts like a clip rect to limit what region of 'buffer' is modified. 
* In iOS 8 and earlier: The 'bounds' parameter acts to specify the region of 'image' to render. 
*  This region (regardless of its origin) is rendered at upper-left corner of 'buffer'. 
*/ 

Taking that into account, I solved my problem, which looks the same as yours.