Concatenating two audio files in Swift and playing them

I am trying to concatenate .wav audio files in Swift.

Here is my code:

func merge(audio1: NSURL,audio2:  NSURL) {


    var error:NSError?

    var ok1 = false
    var ok2 = false


    let paths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    var documentsDirectory:String = paths[0] as! String

    //Create an AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
    var composition = AVMutableComposition()
    var compositionAudioTrack1:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio,preferredTrackID: CMPersistentTrackID())
    var compositionAudioTrack2:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio,preferredTrackID: CMPersistentTrackID())

    //create new file to receive data
    var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory,inDomains: .UserDomainMask).first! as! NSURL
    var fileDestinationUrl = documentDirectoryURL.URLByAppendingPathComponent("resultmerge.wav")
    println(fileDestinationUrl)


    var url1 = audio1
    var url2 = audio2


    var avAsset1 = AVURLAsset(URL: url1,options: nil)
    var avAsset2 = AVURLAsset(URL: url2,options: nil)

    var tracks1 =  avAsset1.tracksWithMediaType(AVMediaTypeAudio)
    var tracks2 =  avAsset2.tracksWithMediaType(AVMediaTypeAudio)

    var assetTrack1:AVAssetTrack = tracks1[0] as! AVAssetTrack
    var assetTrack2:AVAssetTrack = tracks2[0] as! AVAssetTrack


    var duration1: CMTime = assetTrack1.timeRange.duration
    var duration2: CMTime = assetTrack2.timeRange.duration

    var timeRange1 = CMTimeRangeMake(kCMTimeZero,duration1)
    var timeRange2 = CMTimeRangeMake(duration1,duration2)


    ok1 = compositionAudioTrack1.insertTimeRange(timeRange1,ofTrack: assetTrack1,atTime: kCMTimeZero,error: nil)
    if ok1 {

        ok2 = compositionAudioTrack2.insertTimeRange(timeRange2,ofTrack: assetTrack2,atTime: duration1,error: nil)

        if ok2 {
            println("success")
        }
    }

    //AVAssetExportPresetPassthrough => concatenation
    var assetExport = AVAssetExportSession(asset: composition,presetName: AVAssetExportPresetPassthrough)
    assetExport.outputFileType = AVFileTypeWAVE
    assetExport.outputURL = fileDestinationUrl
    assetExport.exportAsynchronouslyWithCompletionHandler({
        switch assetExport.status{
        case  AVAssetExportSessionStatus.Failed:
            println("Failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            println("cancelled \(assetExport.error)")
        default:
            println("complete")
            var audioPlayer = AVAudioPlayer()
            audioPlayer = AVAudioPlayer(contentsOfURL: fileDestinationUrl,error: nil)
            audioPlayer.prepareToPlay()
            audioPlayer.play()
        }

    })

}

And when I run it I get this output in the console (running on an iPhone):

file:///var/mobile/Containers/Data/Application/3F49D360-B363-4600-B3BB-EE0810501910/Documents/resultmerge.wav

success

Failed Error Domain=AVFoundationErrorDomain Code=-11838 "Opération interrompue" UserInfo=0x174269ac0 {NSLocalizedDescription=Opération interrompue, NSLocalizedFailureReason=L'opération n'est pas prise en charge pour ce contenu multimédia.}

(In English: "Operation interrupted — the operation is not supported for this media.")

But I can't figure out why I'm getting this error. I'd really appreciate any help you can give me :)

I got your code working by changing two things:

> the preset name: from AVAssetExportPresetPassthrough to AVAssetExportPresetAppleM4A
> the output file type: from AVFileTypeWAVE to AVFileTypeAppleM4A

Modify your assetExport declaration like this:

var assetExport = AVAssetExportSession(asset: composition,presetName: AVAssetExportPresetAppleM4A)
assetExport.outputFileType = AVFileTypeAppleM4A

and it will then merge the files correctly.

It looks like AVAssetExportSession only exports the M4A format and ignores the other presets. There may be a way to make it export other formats (by subclassing it?), though I haven't explored that possibility.
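For reference, here is a minimal sketch of the same merge in current Swift (Swift 5, with the throwing insertTimeRange API). Two things differ from the question's code and are assumptions of this sketch: it uses a single composition track and inserts each source's full range starting at .zero (the question's timeRange2 starts at duration1, which points past the end of the second asset), and it writes to a resultmerge.m4a file to match the M4A output type. The class name and the stored player property are illustrative, not from the original.

```swift
import AVFoundation

final class AudioMerger {
    // Keep a strong reference so playback isn't stopped by deallocation.
    private var player: AVAudioPlayer?

    func merge(audio1: URL, audio2: URL) throws {
        let composition = AVMutableComposition()
        guard let track = composition.addMutableTrack(withMediaType: .audio,
                                                      preferredTrackID: kCMPersistentTrackID_Invalid)
        else { throw NSError(domain: "merge", code: -1) }

        // Append each source's full range one after the other on one track.
        var cursor = CMTime.zero
        for url in [audio1, audio2] {
            let asset = AVURLAsset(url: url)
            guard let source = asset.tracks(withMediaType: .audio).first else { continue }
            let range = CMTimeRange(start: .zero, duration: source.timeRange.duration)
            try track.insertTimeRange(range, of: source, at: cursor)
            cursor = CMTimeAdd(cursor, range.duration)
        }

        let documents = FileManager.default.urls(for: .documentDirectory,
                                                 in: .userDomainMask)[0]
        let destination = documents.appendingPathComponent("resultmerge.m4a")
        try? FileManager.default.removeItem(at: destination) // export fails if the file exists

        // AVAssetExportPresetAppleM4A + .m4a is the combination that works here.
        guard let export = AVAssetExportSession(asset: composition,
                                                presetName: AVAssetExportPresetAppleM4A)
        else { throw NSError(domain: "merge", code: -2) }
        export.outputFileType = .m4a
        export.outputURL = destination
        export.exportAsynchronously { [weak self] in
            guard export.status == .completed else {
                print("export failed:", export.error ?? "unknown error")
                return
            }
            self?.player = try? AVAudioPlayer(contentsOf: destination)
            self?.player?.play()
        }
    }
}
```

Deleting any stale output file before exporting matters in practice: AVAssetExportSession refuses to overwrite an existing file, which is an easy way to hit another opaque export failure.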
