ios – Passing CMSampleBufferRef data to the audio output jack

I am developing an application in which I need to route the captured audio through the audio output jack while simultaneously recording and saving video.

I have studied Apple's aurioTouch sample code and implemented the audio passthrough.

I have also implemented video recording via AVCaptureSession.
Each of these features works perfectly on its own; the capture side is set up roughly as sketched below.
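For reference, a minimal sketch of only the audio path of the capture setup (the captureSession property, the queue name, and the omitted video input/output are placeholders of mine). The sampleBuffer used further down is the one delivered by this delegate callback.

#import <AVFoundation/AVFoundation.h>

- (void)setupAudioCapture
{
    self.captureSession = [[AVCaptureSession alloc] init];

    // Microphone input.
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *micInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
    if ([self.captureSession canAddInput:micInput])
        [self.captureSession addInput:micInput];

    // Audio data output delivering CMSampleBufferRefs to the delegate.
    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    [audioOutput setSampleBufferDelegate:self
                                   queue:dispatch_queue_create("audio.capture.queue", DISPATCH_QUEUE_SERIAL)];
    if ([self.captureSession canAddOutput:audioOutput])
        [self.captureSession addOutput:audioOutput];

    [self.captureSession startRunning];
}

// Delegate callback; this is where the sampleBuffer used below comes from.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Hand sampleBuffer to the audio passthrough code shown later.
}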

But when I merge the two features, the audio passthrough stops working, because AVCaptureSession takes over the audio session.
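For completeness, here is a sketch of the kind of audio-session configuration the two features would have to share. This is an assumption on my part, not a confirmed fix: it relies on AVAudioSessionCategoryPlayAndRecord and on AVCaptureSession's iOS 7+ automaticallyConfiguresApplicationAudioSession property, and captureSession is again a placeholder.

// Sketch (assumption, not a confirmed fix): keep one PlayAndRecord audio session
// active and stop the capture session from replacing it.
NSError *sessionError = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
[audioSession setActive:YES error:&sessionError];

// iOS 7+: let the app-level audio session stay in charge instead of AVCaptureSession.
if ([self.captureSession respondsToSelector:@selector(setAutomaticallyConfiguresApplicationAudioSession:)]) {
    self.captureSession.usesApplicationAudioSession = YES;
    self.captureSession.automaticallyConfiguresApplicationAudioSession = NO;
}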

I have also tried to render the audio data that I receive from the AVCaptureSession delegate method. Below is my code:

OSStatus err = noErr;


AudioBufferList audioBufferList;
CMBlockBufferRef blockBuffer;
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);
CMItemCount numberOfFrames = CMSampleBufferGetNumSamples(sampleBuffer); // corresponds to the number of CoreAudio audio frames

currentSampleTime += (double)numberOfFrames;

AudioTimeStamp timeStamp;
memset(&timeStamp, 0, sizeof(AudioTimeStamp));
timeStamp.mSampleTime = currentSampleTime;
timeStamp.mFlags |= kAudioTimeStampSampleTimeValid;

AudioUnitRenderActionFlags flags = 0;
aurioTouchAppDelegate *THIS = (aurioTouchAppDelegate *)[[UIApplication sharedApplication]delegate];
 err = AudioUnitRender(self.rIoUnit, &flags, &timeStamp, 1, (UInt32)numberOfFrames, &audioBufferList);

if (err) { printf("PerformThru: error %d\n",(int)err); }

But it gives an error. Please advise what more can be done; I have gone through a lot of documentation and code but could not find any solution. Please help.

Solution

Here is some code with better error handling. What error is it returning? You can look up the error description in the documentation.
static void CheckError(OSStatus error, const char *operation) {
    if (error == noErr) return;

    char str[20] = {0};
    // see if it appears to be a 4-char code
    *(UInt32 *)(str + 1) = CFSwapInt32HostToBig(error);
    if (isprint(str[1]) && isprint(str[2]) && isprint(str[3]) && isprint(str[4])) {
        str[0] = str[5] = '\'';
        str[6] = '\0';
    } else {
        sprintf(str, "%d", (int)error);
    }

    fprintf(stderr, "Error: %s (%s)\n", operation, str);
    exit(1);
}

// 'sampleBuffer' is the CMSampleBufferRef delivered by the capture delegate callback.
- (void)yourFunction
{
    AudioBufferList audioBufferList;
    CMBlockBufferRef blockBuffer;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);
    CMItemCount numberOfFrames = CMSampleBufferGetNumSamples(sampleBuffer); // corresponds to the number of CoreAudio audio frames

    currentSampleTime += (double)numberOfFrames;

    AudioTimeStamp timeStamp;
    memset(&timeStamp, 0, sizeof(AudioTimeStamp));
    timeStamp.mSampleTime = currentSampleTime;
    timeStamp.mFlags |= kAudioTimeStampSampleTimeValid;

    AudioUnitRenderActionFlags flags = 0;
    aurioTouchAppDelegate *THIS = (aurioTouchAppDelegate *)[[UIApplication sharedApplication]delegate];
    CheckError(AudioUnitRender(self.rIoUnit, &flags, &timeStamp, 1, (UInt32)numberOfFrames, &audioBufferList),
               "Error with AudioUnitRender");

    // The block buffer returned above is retained; release it once rendering is done.
    if (blockBuffer) CFRelease(blockBuffer);
}
