I'm writing an app that records screen capture and audio using MediaCodec. I use MediaMuxer to mux the video and audio streams into an mp4 file. I can successfully write video and audio separately, but when I try muxing them together the result is unexpected: either audio plays without video, or video plays right after the audio has finished. My guess is that I'm doing something wrong with the timestamps, but I can't figure out what exactly. I already looked at these examples:
https://github.com/OnlyInAmerica/HWEncoderExperiments/tree/audiotest/HWEncoderExperiments/src/main/java/net/openwatch/hwencoderexperiments and the ones on bigflake.com, and couldn't find the answer.
Here is my media format configuration:

```java
mVideoFormat = createVideoFormat();

private static MediaFormat createVideoFormat() {
    MediaFormat format = MediaFormat.createVideoFormat(
            Preferences.MIME_TYPE, mScreenWidth, mScreenHeight);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, Preferences.BIT_RATE);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, Preferences.FRAME_RATE);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, Preferences.IFRAME_INTERVAL);
    return format;
}

mAudioFormat = createAudioFormat();

private static MediaFormat createAudioFormat() {
    MediaFormat format = new MediaFormat();
    format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
    format.setInteger(MediaFormat.KEY_AAC_PROFILE,
            MediaCodecInfo.CodecProfileLevel.AACObjectLC);
    format.setInteger(MediaFormat.KEY_SAMPLE_RATE, 44100);
    format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
    return format;
}
```
Audio and video encoders, and the muxer:

```java
mVideoEncoder = MediaCodec.createEncoderByType(Preferences.MIME_TYPE);
mVideoEncoder.configure(mVideoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mInputSurface = new InputSurface(mVideoEncoder.createInputSurface(), mSavedEglContext);
mVideoEncoder.start();

if (recordAudio) {
    audioBufferSize = AudioRecord.getMinBufferSize(44100,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    mAudioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
            audioBufferSize);
    mAudioRecorder.startRecording();

    mAudioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
    mAudioEncoder.configure(mAudioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mAudioEncoder.start();
}

try {
    String fileId = String.valueOf(System.currentTimeMillis());
    mMuxer = new MediaMuxer(dir.getPath() + "/Video" + fileId + ".mp4",
            MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
} catch (IOException ioe) {
    throw new RuntimeException("MediaMuxer creation failed", ioe);
}

mVideoTrackIndex = -1;
mAudioTrackIndex = -1;
mMuxerStarted = false;
```
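The `mVideoTrackIndex`/`mAudioTrackIndex`/`mMuxerStarted` fields hint at the usual MediaMuxer pattern: each track is added when its encoder reports `INFO_OUTPUT_FORMAT_CHANGED`, and `MediaMuxer.start()` is called only once both tracks exist; no samples may be written before that. The sketch below condenses that bookkeeping into plain Java (the class and method names are illustrative, not from the original code; the actual `MediaMuxer` calls are marked in comments):

```java
// Gating logic for a two-track muxer: writing is allowed only after
// both tracks have been added and the muxer has been started.
public class MuxerGate {
    private int videoTrackIndex = -1;
    private int audioTrackIndex = -1;
    private boolean started = false;

    // Call from the drain loop on INFO_OUTPUT_FORMAT_CHANGED, passing
    // the index returned by mMuxer.addTrack(encoder.getOutputFormat()).
    public void onVideoFormat(int trackIndex) { videoTrackIndex = trackIndex; maybeStart(); }
    public void onAudioFormat(int trackIndex) { audioTrackIndex = trackIndex; maybeStart(); }

    private void maybeStart() {
        if (!started && videoTrackIndex != -1 && audioTrackIndex != -1) {
            started = true; // this is where mMuxer.start() would go
        }
    }

    // Check before every mMuxer.writeSampleData(...) call.
    public boolean canWriteSamples() { return started; }
}
```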
I set the video timestamps with:

```java
mInputSurface.setPresentationTime(mSurfaceTexture.getTimestamp());
drainVideoEncoder(false);
```
And the audio timestamps with:

```java
lastQueuedPresentationTimeStampUs = getNextQueuedPresentationTimeStampUs();
if (endOfStream) {
    mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, audioBuffer.length,
            lastQueuedPresentationTimeStampUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
} else {
    mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, audioBuffer.length,
            lastQueuedPresentationTimeStampUs, 0);
}

mAudioBufferInfo.presentationTimeUs = getNextDeQueuedPresentationTimeStampUs();
mMuxer.writeSampleData(mAudioTrackIndex, encodedData, mAudioBufferInfo);
lastDequeuedPresentationTimeStampUs = mAudioBufferInfo.presentationTimeUs;

private static long getNextQueuedPresentationTimeStampUs() {
    long nextQueuedPresentationTimeStampUs =
            (lastQueuedPresentationTimeStampUs > lastDequeuedPresentationTimeStampUs)
                    ? (lastQueuedPresentationTimeStampUs + 1)
                    : (lastDequeuedPresentationTimeStampUs + 1);
    Log.i(TAG, "nextQueuedPresentationTimeStampUs: " + nextQueuedPresentationTimeStampUs);
    return nextQueuedPresentationTimeStampUs;
}

private static long getNextDeQueuedPresentationTimeStampUs() {
    Log.i(TAG, "nextDequeuedPresentationTimeStampUs: "
            + (lastDequeuedPresentationTimeStampUs + 1));
    lastDequeuedPresentationTimeStampUs++;
    return lastDequeuedPresentationTimeStampUs;
}
```
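A counter that just increments by 1 µs per buffer has no relation to real time: one buffer of audio covers far more than a microsecond. A common alternative (a sketch, not the original code) is to derive each audio presentation timestamp from the number of PCM samples already submitted, which matches the 16-bit mono 44100 Hz format configured above:

```java
// Derive audio PTS from the running PCM sample count rather than a
// +1 counter, so timestamps advance at the real rate of the audio.
public class AudioPtsCalculator {
    private final int sampleRate;
    private long totalSamples; // samples submitted to the encoder so far

    public AudioPtsCalculator(int sampleRate) {
        this.sampleRate = sampleRate;
    }

    // Call once per input buffer; sizeInBytes is the number of PCM bytes
    // queued. Assumes 16-bit mono PCM (2 bytes per sample).
    public long nextPtsUs(int sizeInBytes) {
        long ptsUs = totalSamples * 1_000_000L / sampleRate;
        totalSamples += sizeInBytes / 2;
        return ptsUs;
    }
}
```

The value returned here would be passed to `queueInputBuffer` in place of `lastQueuedPresentationTimeStampUs`.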
which I took from this example https://github.com/OnlyInAmerica/HWEncoderExperiments/blob/audiotest/HWEncoderExperiments/src/main/java/net/openwatch/hwencoderexperiments/AudioEncodingTest.java in order to avoid the "timestampUs XXX < lastTimestampUs XXX" error. Can someone help me figure out the problem?
Solution
It looks like you're using the system-provided timestamps for video, but a simple counter for audio. Unless the video timestamp is somehow being used to seed the audio timestamps every frame (and that's just not shown above), the two streams won't line up.
For audio and video to play back in sync, you need the same presentation timestamp on the audio and video frames that are expected to be presented at the same time.
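One way to satisfy this (a sketch under the assumption that both pipelines can share one clock; the class and method names are illustrative) is to measure every timestamp, audio and video alike, against the same origin captured when recording starts:

```java
// One shared clock for both tracks: every PTS is the elapsed time,
// in microseconds, since the same recording-start instant, so frames
// meant to appear together carry the same presentation time.
public class SharedMediaClock {
    private final long baseNanos;

    public SharedMediaClock(long baseNanos) {
        this.baseNanos = baseNanos; // e.g. System.nanoTime() at recording start
    }

    // Video: SurfaceTexture timestamps are in nanoseconds.
    public long videoPtsUs(long frameTimestampNanos) {
        return (frameTimestampNanos - baseNanos) / 1000L;
    }

    // Audio: timestamp taken when a PCM buffer is read from AudioRecord.
    public long audioPtsUs(long readTimeNanos) {
        return (readTimeNanos - baseNanos) / 1000L;
    }
}
```

Note this only works if the video timestamps and `System.nanoTime()` come from the same underlying clock, which is worth verifying on the target device.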
See also this related question.