2016-05-02

It's a well-documented problem on SO that AVAssets end up rotated after being written to a file, whether via AVAssetWriter or AVComposition. And there are solutions, such as inspecting the video track's transform to see how the asset is rotated, so that it can be rotated into the desired orientation for your particular use case: AVAsset Rotation

What I'd like to know, however, is why this happens and whether it can be prevented. I've run into it not only when writing custom video files, but also when converting videos to GIFs using CGImageDestination, where the output GIF looks fine except that it's rotated.
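From what I can tell, the underlying reason is this: the camera always stores pixel buffers in the sensor's native landscape orientation and records the intended display rotation as metadata in the video track's preferredTransform. Players apply that transform at display time, but AVAssetWriter and CGImageDestination consume the raw pixel buffers and do not carry the metadata over for you, so the output appears rotated unless you set a transform yourself. A minimal, platform-independent sketch of how the rotation is encoded in the transform's matrix entries (SimpleTransform is my stand-in for CGAffineTransform so the sketch runs anywhere; on iOS you would read videoTrack.preferredTransform):

```swift
import Foundation

// Stand-in for CGAffineTransform's matrix [a b; c d; tx ty].
struct SimpleTransform {
    var a, b, c, d, tx, ty: Double
}

// The display rotation a player applies is recoverable from the
// first row of the matrix: angle = atan2(b, a).
func rotationDegrees(of t: SimpleTransform) -> Double {
    return atan2(t.b, t.a) * 180.0 / .pi
}

// A portrait iPhone recording typically carries a 90-degree transform:
// (a: 0, b: 1, c: -1, d: 0) plus a translation by the buffer height.
let portrait = SimpleTransform(a: 0, b: 1, c: -1, d: 0, tx: 1080, ty: 0)
print(rotationDegrees(of: portrait))   // approximately 90
```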

For a quick reference point, here is my code that writes an asset to a file:

let destinationURL = url ?? NSURL(fileURLWithPath: "\(NSTemporaryDirectory())\(String.random()).mp4") 
     if let writer = try? AVAssetWriter(URL: destinationURL, fileType: AVFileTypeMPEG4), 
      videoTrack = self.asset.tracksWithMediaType(AVMediaTypeVideo).last, 
      firstBuffer = buffers.first { 
      let videoCompressionProps = [AVVideoAverageBitRateKey: videoTrack.estimatedDataRate] 
      let outputSettings: [String: AnyObject] = [ 
       AVVideoCodecKey: AVVideoCodecH264, 
       AVVideoWidthKey: width, 
       AVVideoHeightKey: height, 
       AVVideoCompressionPropertiesKey: videoCompressionProps 
      ] 
      let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings, sourceFormatHint: (videoTrack.formatDescriptions.last as! CMFormatDescription)) 
      writerInput.expectsMediaDataInRealTime = false 

      let rotateTransform = CGAffineTransformMakeRotation(Utils.degreesToRadians(-90)) 
      writerInput.transform = CGAffineTransformScale(rotateTransform, -1, 1) 

      let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput, sourcePixelBufferAttributes: nil) 
      writer.addInput(writerInput) 
      writer.startWriting() 
      writer.startSessionAtSourceTime(CMSampleBufferGetPresentationTimeStamp(firstBuffer)) 


      for (sample, newTimestamp) in Array(Zip2Sequence(buffers, timestamps)) { 
       if let imageBuffer = CMSampleBufferGetImageBuffer(sample) { 
        while !writerInput.readyForMoreMediaData { 
         NSThread.sleepForTimeInterval(0.1) 
        } 
        pixelBufferAdaptor.appendPixelBuffer(imageBuffer, withPresentationTime: newTimestamp) 
       } 
      } 
      writer.finishWritingWithCompletionHandler { 
       // completion code 
      } 
     } // close the outer if-let

As you can see above, a simple transform rotates the output video back to portrait. However, if I have a landscape video, that transform no longer works. And as I mentioned, converting the video to a GIF applies exactly the same 90-degree rotation to my asset.
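One way to avoid hardcoding the -90-degree case: AVAssetTrack exposes preferredTransform, and assigning it to the writer input (writerInput.transform = videoTrack.preferredTransform) preserves whatever orientation the source carried, portrait or landscape alike. If you do need to branch on orientation, it can be classified from the first row of that transform's matrix. A pure-Swift sketch of the classification (the enum and helper are my own names, not AVFoundation API):

```swift
import Foundation

// Hypothetical helper: classify display orientation from the first row
// (a, b) of a track's preferred transform matrix.
enum VideoOrientation { case landscape, landscapeFlipped, portrait, portraitFlipped }

func orientation(a: Double, b: Double) -> VideoOrientation {
    switch (a, b) {
    case (1, 0):   return .landscape         // 0 degrees: no display rotation
    case (-1, 0):  return .landscapeFlipped  // 180 degrees
    case (0, 1):   return .portrait          // 90 degrees: typical portrait capture
    default:       return .portraitFlipped   // -90 degrees
    }
}

print(orientation(a: 0, b: 1))   // portrait
```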

My feelings can be summed up by these two gifs:

http://giphy.com/gifs/jon-stewart-why-lYKvaJ8EQTzCU

http://giphy.com/gifs/the-office-no-steve-carell-12XMGIWtrHBl5e

Answer
I ran into the same problem; I fixed it by rotating my video by 90 degrees, and it works fine.

Here is the solution:
// in videoorientation.h

#import <UIKit/UIKit.h> 
#import <AVFoundation/AVFoundation.h> 

// Helper macros assumed by the code below (not shown in the original answer):
#define degreesToRadians(x) (M_PI * (x)/180.0) 
#define radiansToDegrees(x) ((x) * 180.0/M_PI) 
@interface videoorientationViewController : UIViewController 
@property AVMutableComposition *mutableComposition; 
@property AVMutableVideoComposition *mutableVideoComposition; 
@property AVMutableAudioMix *mutableAudioMix; 
@property AVAssetExportSession *exportSession; 
- (void)performWithAsset : (NSURL *)moviename; 
@end 

// in viewcontroller.m

- (void)performWithAsset : (NSURL *)moviename 
{ 

    self.mutableComposition=nil; 
    self.mutableVideoComposition=nil; 
    self.mutableAudioMix=nil; 


    AVAsset *asset = [[AVURLAsset alloc] initWithURL:moviename options:nil]; 

    AVMutableVideoCompositionInstruction *instruction = nil; 
    AVMutableVideoCompositionLayerInstruction *layerInstruction = nil; 
    CGAffineTransform t1; 
    CGAffineTransform t2; 

    AVAssetTrack *assetVideoTrack = nil; 
    AVAssetTrack *assetAudioTrack = nil; 
    // Check if the asset contains video and audio tracks 
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) { 
     assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0]; 
    } 
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) { 
     assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0]; 
    } 

    CMTime insertionPoint = kCMTimeZero; 
    NSError *error = nil; 


    // Step 1 
    // Create a composition with the given asset and insert audio and video tracks into it from the asset 
    if (!self.mutableComposition) { 

     // Check whether a composition has already been created, i.e, some other tool has already been applied 
     // Create a new composition 
     self.mutableComposition = [AVMutableComposition composition]; 

     // Insert the video and audio tracks from AVAsset 
     if (assetVideoTrack != nil) { 
      AVMutableCompositionTrack *compositionVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
      [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error]; 
     } 
     if (assetAudioTrack != nil) { 
      AVMutableCompositionTrack *compositionAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
      [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error]; 
     } 

    } 


    // Step 2 
    // Translate the composition to compensate the movement caused by rotation (since rotation would cause it to move out of frame) 
    t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0.0); 
     float width=assetVideoTrack.naturalSize.width; 
    float height=assetVideoTrack.naturalSize.height; 
    float toDiagonal=sqrt(width*width+height*height); 
    float toDiagonalAngle = radiansToDegrees(acosf(width/toDiagonal)); 
    float toDiagonalAngle2=90-radiansToDegrees(acosf(width/toDiagonal)); 

    float toDiagonalAngleComple; 
    float toDiagonalAngleComple2; 
    float finalHeight = 0.0; 
    float finalWidth = 0.0; 

    float degrees=90; 

    if(degrees>=0&&degrees<=90){ 

     toDiagonalAngleComple=toDiagonalAngle+degrees; 
     toDiagonalAngleComple2=toDiagonalAngle2+degrees; 

     finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple))); 
     finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2))); 

     t1 = CGAffineTransformMakeTranslation(height*sinf(degreesToRadians(degrees)), 0.0); 
    } 
    else if(degrees>90&&degrees<=180){ 


     float degrees2 = degrees-90; 

     toDiagonalAngleComple=toDiagonalAngle+degrees2; 
     toDiagonalAngleComple2=toDiagonalAngle2+degrees2; 

     finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2))); 
     finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple))); 

     t1 = CGAffineTransformMakeTranslation(width*sinf(degreesToRadians(degrees2))+height*cosf(degreesToRadians(degrees2)), height*sinf(degreesToRadians(degrees2))); 
    } 
    else if(degrees>=-90&&degrees<0){ 

     float degrees2 = degrees-90; 
     float degreesabs = ABS(degrees); 

     toDiagonalAngleComple=toDiagonalAngle+degrees2; 
     toDiagonalAngleComple2=toDiagonalAngle2+degrees2; 

     finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2))); 
     finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple))); 

     t1 = CGAffineTransformMakeTranslation(0, width*sinf(degreesToRadians(degreesabs))); 

    } 
    else if(degrees>=-180&&degrees<-90){ 

     float degreesabs = ABS(degrees); 
     float degreesplus = degreesabs-90; 

     toDiagonalAngleComple=toDiagonalAngle+degrees; 
     toDiagonalAngleComple2=toDiagonalAngle2+degrees; 

     finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple))); 
     finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2))); 

     t1 = CGAffineTransformMakeTranslation(width*sinf(degreesToRadians(degreesplus)), height*sinf(degreesToRadians(degreesplus))+width*cosf(degreesToRadians(degreesplus))); 

    } 


    // Rotate transformation 
    t2 = CGAffineTransformRotate(t1, degreesToRadians(degrees)); 
    //t2 = CGAffineTransformRotate(t1, -90); 


    // Step 3 
    // Set the appropriate render sizes and rotational transforms 
    if (!self.mutableVideoComposition) { 

     // Create a new video composition 
     self.mutableVideoComposition = [AVMutableVideoComposition videoComposition]; 
     // self.mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height,assetVideoTrack.naturalSize.width); 
     self.mutableVideoComposition.renderSize = CGSizeMake(finalWidth,finalHeight); 

     self.mutableVideoComposition.frameDuration = CMTimeMake(1,30); 

     // The rotate transform is set on a layer instruction 
     instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
     instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [self.mutableComposition duration]); 
     layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(self.mutableComposition.tracks)[0]]; 
     [layerInstruction setTransform:t2 atTime:kCMTimeZero]; 

    } else { 

     self.mutableVideoComposition.renderSize = CGSizeMake(self.mutableVideoComposition.renderSize.height, self.mutableVideoComposition.renderSize.width); 

     // Extract the existing layer instruction on the mutableVideoComposition 
     instruction = (self.mutableVideoComposition.instructions)[0]; 
     layerInstruction = (instruction.layerInstructions)[0]; 

     // Check if a transform already exists on this layer instruction, this is done to add the current transform on top of previous edits 
     CGAffineTransform existingTransform; 

     if (![layerInstruction getTransformRampForTime:[self.mutableComposition duration] startTransform:&existingTransform endTransform:NULL timeRange:NULL]) { 
      [layerInstruction setTransform:t2 atTime:kCMTimeZero]; 
     } else { 
      // Note: the point of origin for rotation is the upper left corner of the composition, t3 is to compensate for origin 
      CGAffineTransform t3 = CGAffineTransformMakeTranslation(-1*assetVideoTrack.naturalSize.height/2, 0.0); 
      CGAffineTransform newTransform = CGAffineTransformConcat(existingTransform, CGAffineTransformConcat(t2, t3)); 
      [layerInstruction setTransform:newTransform atTime:kCMTimeZero]; 
     } 

    } 


    // Step 4 
    // Add the transform instructions to the video composition 
    instruction.layerInstructions = @[layerInstruction]; 
    self.mutableVideoComposition.instructions = @[instruction]; 


    // Step 5 
    // Notify AVSEViewController about rotation operation completion 
    // [[NSNotificationCenter defaultCenter] postNotificationName:AVSEEditCommandCompletionNotification object:self]; 

    [self performWithAssetExport]; 
} 

- (void)performWithAssetExport 
{ 
    // Step 1 
    // Create an outputURL to which the exported movie will be saved 

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); 
    NSString *outputURL = paths[0]; 
    NSFileManager *manager = [NSFileManager defaultManager]; 
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil]; 
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mov"]; 
    // Remove Existing File 
    [manager removeItemAtPath:outputURL error:nil]; 


    // Step 2 
    // Create an export session with the composition and write the exported movie to the photo library 
    self.exportSession = [[AVAssetExportSession alloc] initWithAsset:[self.mutableComposition copy] presetName:AVAssetExportPreset1280x720]; 

    self.exportSession.videoComposition = self.mutableVideoComposition; 
    self.exportSession.audioMix = self.mutableAudioMix; 
    self.exportSession.outputURL = [NSURL fileURLWithPath:outputURL]; 
    self.exportSession.outputFileType=AVFileTypeQuickTimeMovie; 


    [self.exportSession exportAsynchronouslyWithCompletionHandler:^(void){ 
     switch (self.exportSession.status) { 
      case AVAssetExportSessionStatusCompleted: 

       //[self playfunction]; 

       [[NSNotificationCenter defaultCenter]postNotificationName:@"Backhome" object:nil]; 



       // Step 3 
       // Notify AVSEViewController about export completion 
       break; 
      case AVAssetExportSessionStatusFailed: 
       NSLog(@"Failed:%@",self.exportSession.error); 
       break; 
      case AVAssetExportSessionStatusCancelled: 
       NSLog(@"Canceled:%@",self.exportSession.error); 
       break; 
      default: 
       break; 
     } 
    }]; 



} 
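For what it's worth, the diagonal-and-angle arithmetic in Step 2 reduces to the standard bounding box of a rotated rectangle: for a rotation by theta, the new width is |w·cos(theta)| + |h·sin(theta)| and the new height is |w·sin(theta)| + |h·cos(theta)|. A compact restatement in Swift (my own sketch, not the answerer's code), which may be easier to verify:

```swift
import Foundation

// Bounding box of a width x height rectangle rotated by `degrees` --
// equivalent to the diagonal/angle computation in Step 2 above.
func rotatedRenderSize(width: Double, height: Double, degrees: Double) -> (width: Double, height: Double) {
    let r = degrees * .pi / 180
    let w = abs(width * cos(r)) + abs(height * sin(r))
    let h = abs(width * sin(r)) + abs(height * cos(r))
    return (w, h)
}

let size = rotatedRenderSize(width: 1920, height: 1080, degrees: 90)
// For a 90-degree rotation the dimensions simply swap:
// approximately 1080 x 1920, up to floating-point error.
print(size.width, size.height)
```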

I asked why it happens, not how to fix it. – barndog