Android: Using FFmpeg to Generate PCM Audio and Play It with AudioTrack


I. Scenario

   1. Pick any media file (avi, mp4, ts, mp3), extract and decode its audio track, save it as PCM, then play the PCM with AudioTrack.

   2. Main Java classes:

      a. PcmDecoder.java: the Java class that talks to the JNI layer; it declares three native methods: init (initialize), decode, and destroy.

      b. UserAudioTrackPlayPCMActivity.java: the Activity used to pick a media file, initialize the decoder, and play the resulting PCM file.

      c. AudioTrackPlayer.java: plays the PCM file.

  3. Main C++ files:

      a. pcm_decoder.cpp: its functions correspond to the native methods in PcmDecoder.java; this is the JNI entry point called from Java.

      b. pcm_decoder_controller.cpp: a wrapper class around the PCM decoder.

      c. pcm_real_decoder.cpp: the class that does the actual initialization and decoding.

  4. Code architecture diagram:

    

    5. Broadly speaking there are two parts. The first is decoding: the media file is decoded into PCM. The second is playback: the decoded PCM file is played with AudioTrack. Note that the decoded PCM is raw data; because no header is written for it, ordinary players cannot play it, which is why AudioTrack is used to verify the result (see the sketch below).
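
For orientation, here is a minimal sketch of that two-step flow, using the classes introduced below. The input path is a placeholder; the output path is the one used by the sample Activity later in this article.

PcmDecoder pcmDecoder = new PcmDecoder();
// Step 1: decode the audio track of the media file into a raw PCM file.
int ret = pcmDecoder.init("/mnt/sdcard/test.mp4", "/mnt/sdcard/110119.pcm");
if (ret >= 0) {
    pcmDecoder.decode();
    pcmDecoder.destroy();
}
// Step 2: play the raw PCM. AudioTrack is told the sample rate, channel layout
// and sample format explicitly, which is why it can play headerless data.
AudioTrackPlayer.getAudioTrackPlayer().start(new File("/mnt/sdcard/110119.pcm"));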

II. Code Walkthrough

  1.UserAudioTrackPlayPCMActivity.java 

package com.yw.ffmpeg;

import android.os.Bundle;
import android.widget.Button;
import android.widget.TextView;

import androidx.annotation.Nullable;

import com.yw.ffmpeg.decoder.PcmDecoder;
import com.yw.ffmpeg.player.AudioTrackPlayer;

import java.io.File;

/**
 * @ProjectName: AndroidFFMpeg
 * @Package: com.yw.ffmpeg
 * @ClassName: UserAudioTrackPlayPCMActivity
 * @Description: Plays a PCM file with AudioTrack
 * @Author: wei.yang
 * @CreateDate: 2021/8/20 10:02
 * @UpdateUser: Updated by: wei.yang
 * @UpdateDate: 2021/8/20 10:02
 * @UpdateRemark: Update notes:
 * @Version: 1.0
 */
public class UserAudioTrackPlayPCMActivity extends BaseActivity {
    //Button: pick a media file
    private Button btnChoiceVideo;
    //Path of the selected file
    private TextView tvVideoFilePath;
    //Button: generate the PCM file
    private Button btnGeneratePCM;
    //Path of the generated PCM file
    private TextView tvPcmPath;
    //Button: play the raw PCM data with AudioTrack
    private Button btnUseAudioTrackPaly;
    private PcmDecoder pcmDecoder;
    private String orgPath = null;

    /**
     * Test steps:
     * 1. Pick a video file
     * 2. Separate the audio from the video file
     * 3. Decode the separated audio and save it as PCM
     * 4. Play the PCM with AudioTrack
     *
     * @param savedInstanceState
     */
    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_user_audio_track_play_pcm);
        initViews();
        pcmDecoder = new PcmDecoder();
    }

    private void initViews() {
        btnChoiceVideo = findViewById(R.id.btnChoiceVideo);
        btnChoiceVideo.setOnClickListener(v -> {
            choiceVideo();
        });
        tvVideoFilePath = findViewById(R.id.tvVideoFilePath);
        btnGeneratePCM = findViewById(R.id.btnGeneratePCM);
        btnGeneratePCM.setOnClickListener(v -> {
            if (pcmDecoder != null && orgPath != null) {
                // Decoding is blocking work, so run it off the main thread.
                new Thread(() -> {
                    try {
                        int ret = pcmDecoder.init(orgPath, "/mnt/sdcard/110119.pcm");
                        if (ret >= 0) {
                            pcmDecoder.decode();
                            pcmDecoder.destroy();
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }).start();
            }
        });
        tvPcmPath = findViewById(R.id.tvPcmPath);
        btnUseAudioTrackPaly = findViewById(R.id.btnUseAudioTrackPaly);
        btnUseAudioTrackPaly.setOnClickListener(v -> {
            AudioTrackPlayer.getAudioTrackPlayer().start(new File("/mnt/sdcard/110119.pcm"));
        });

    }

    @Override
    public void videoPathCallback(String videoPath) {
        orgPath = videoPath;
    }

    static {
        System.loadLibrary("native-lib");
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (pcmDecoder != null) {
            pcmDecoder.destroy();
        }
        AudioTrackPlayer.getAudioTrackPlayer().release();
    }
}

  

  2.AudioTrackPlayer.java

package com.yw.ffmpeg.player;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

import com.yw.ffmpeg.utils.LogUtil;

import java.io.File;
import java.io.FileInputStream;

/**
 * @ProjectName: AndroidFFMpeg
 * @Package: com.yw.ffmpeg.player
 * @ClassName: AudioTrackPlayer
 * @Description: Plays audio with AudioTrack
 * @Author: wei.yang
 * @CreateDate: 2021/8/31 10:26
 * @UpdateUser: Updated by: wei.yang
 * @UpdateDate: 2021/8/31 10:26
 * @UpdateRemark: Update notes:
 * @Version: 1.0
 * <p>
 * Notes:
 * 1. AudioTrack is the lowest-level audio playback API in the Android SDK, so it only accepts raw (PCM) data.
 * <p>
 * AudioTrack workflow:
 * 1. Configure an AudioTrack instance from the audio parameters
 * 2. Call play() to switch the AudioTrack into the playing state
 * 3. Start a playback thread that keeps writing audio data into the AudioTrack buffer
 * 4. When the data runs out or playback is stopped, stop the playback thread and release all resources
 * <p>
 * AudioTrack constructor:
 * AudioTrack(int streamType, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes, int mode)
 * 1. streamType: audio stream type (volume/routing policy), e.g. AudioManager.STREAM_VOICE_CALL (calls), STREAM_SYSTEM (system sounds), STREAM_RING (ringtone), STREAM_MUSIC (music), STREAM_ALARM (alarms), STREAM_NOTIFICATION (notifications)
 * 2. sampleRateInHz: sample rate, i.e. how many samples are played per second, e.g. 8000, 16000, 22050, 24000, 44100, 48000
 * 3. channelConfig: channel configuration; for playback use AudioFormat.CHANNEL_OUT_MONO (mono) or CHANNEL_OUT_STEREO (stereo)
 * 4. audioFormat: sample format, ENCODING_PCM_16BIT (16-bit) or ENCODING_PCM_8BIT (8-bit); note that the former is supported on all Android devices
 * 5. bufferSizeInBytes: size of AudioTrack's internal audio buffer, determined with getMinBufferSize
 * 6. mode: playback mode. MODE_STATIC writes all data into the playback buffer at once; it is simple and efficient and is typically used for ringtones and short system prompts. MODE_STREAM keeps writing audio data at intervals and can in principle be used for any playback scenario.
 */
public class AudioTrackPlayer {
    private static final String TAG = "AudioTrackPlayer:";
    // 0xac44 == 44100 (Hz)
    private static final int HZ = 0xac44;
    private AudioTrack audioTrack;

    private AudioTrackPlayer() {
        // Query the minimum buffer size with the same channel config the track is created with (stereo here)
        int minBufferSize = AudioTrack.getMinBufferSize(HZ, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        LogUtil.log(TAG + "minBufferSize:" + minBufferSize);
        audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, HZ, AudioFormat.CHANNEL_OUT_STEREO,
                AudioFormat.ENCODING_PCM_16BIT, minBufferSize * 2, AudioTrack.MODE_STREAM);
        audioTrack.setStereoVolume(1.0f, 1.0f); // set the playback volume
        audioTrack.play(); // switch the AudioTrack into the playing state
    }

    private static AudioTrackPlayer instance = null;

    public static synchronized AudioTrackPlayer getAudioTrackPlayer() {
        if (instance == null) {
            instance = new AudioTrackPlayer();
        }
        return instance;
    }

    /**
     * Start playing a PCM file
     *
     * @param pcmFile raw PCM file to play
     */
    public void start(File pcmFile) {
        new Thread() {
            @Override
            public void run() {
                try {
                    FileInputStream fileInputStream = new FileInputStream(pcmFile);
                    // The decoder writes headerless PCM, so there is no header to skip before playing.
                    byte buffer[] = new byte[16 * 10000];
                    int readCount;
                    while ((readCount = fileInputStream.read(buffer)) > 0) {
                        System.out.println("write pcm data");
                        // Write only the bytes that were actually read from the file.
                        audioTrack.write(buffer, 0, readCount);
                    }
                    fileInputStream.close();
                    audioTrack.stop();
                    audioTrack.release();
                    audioTrack = null;
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }.start();
    }

    /**
     * Stop playback
     */
    public void stop() {
        if (audioTrack != null) {
            audioTrack.stop();
        }
    }

    /**
     * Release the player
     */
    public void release() {
        if (audioTrack != null) {
            audioTrack.stop();
            audioTrack.release();
            audioTrack = null;
        }
    }
}
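
The constructor and the setStereoVolume call above use the legacy AudioTrack API, which is deprecated on recent SDK levels. On API 23 and later the builder-based API can be used instead. The following is a minimal sketch (not part of the sample project), assuming the same 44.1 kHz / stereo / 16-bit PCM this article produces:

import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioTrack;

/** Sketch only: builds an AudioTrack equivalent to the one above with the non-deprecated builder API (API 23+). */
public class AudioTrackBuilderSketch {
    public static AudioTrack build() {
        // 44.1 kHz / stereo / 16-bit matches the PCM this article assumes.
        int minBufferSize = AudioTrack.getMinBufferSize(44100,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack.Builder()
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                        .build())
                .setAudioFormat(new AudioFormat.Builder()
                        .setSampleRate(44100)
                        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                        .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                        .build())
                .setBufferSizeInBytes(minBufferSize * 2)
                .setTransferMode(AudioTrack.MODE_STREAM)
                .build();
        track.play(); // switch into the playing state, same as the constructor above
        return track;
    }
}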

  

  3.PcmDecoder.java

public class PcmDecoder {
    /**
     * Initialize the decoder
     * @param mediaFilePath path of the source media file
     * @param pcmFilePath   path where the generated PCM file will be written
     * @return
     */
    public native int init(String mediaFilePath,String pcmFilePath);

    /**
     * Start decoding
     */
    public native void decode();

    /**
     * Destroy the decoder
     */
    public native void destroy();
}
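
The native library has to be loaded before any of these methods is called. The sample Activity above does this with System.loadLibrary("native-lib"); a common alternative is to load it in a static initializer of the class that declares the native methods, as in this sketch (same library name as the Activity):

public class PcmDecoder {
    // Load the JNI library once, when the class is first used.
    static {
        System.loadLibrary("native-lib");
    }

    public native int init(String mediaFilePath, String pcmFilePath);
    public native void decode();
    public native void destroy();
}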

  

  4.pcm_decoder.cpp

extern "C" {
/**
 * Initialization
 * @param env
 * @param obj
 * @param mediaFilePath path of the media file to decode
 * @param pcmFilePath   path where the generated PCM file will be written
 * @return
 */
PCMDecoderController *controller;
JNIEXPORT jint JNICALL
Java_com_yw_ffmpeg_decoder_PcmDecoder_init(JNIEnv *env, jobject obj, jstring mediaFilePath,
                                           jstring pcmFilePath) {
    //Convert the jni strings into C strings
    const char *pcmPath = env->GetStringUTFChars(pcmFilePath, NULL);
    const char *mediaPath = env->GetStringUTFChars(mediaFilePath, NULL);
    controller = new PCMDecoderController();
    controller->init(mediaPath, pcmPath);
    //Release the jni strings
    env->ReleaseStringUTFChars(mediaFilePath, mediaPath);
    env->ReleaseStringUTFChars(pcmFilePath, pcmPath);
    return 0;
}

/**
 * Decode
 * @param env
 * @param obj
 */
JNIEXPORT void JNICALL Java_com_yw_ffmpeg_decoder_PcmDecoder_decode(JNIEnv *env, jobject obj) {
    if (controller) {
        controller->decode();
    }
}
/**
 * Destroy
 * @param env
 * @param obj
 */
JNIEXPORT void JNICALL Java_com_yw_ffmpeg_decoder_PcmDecoder_destroy(JNIEnv *env, jobject obj) {
    if (controller) {
        controller->destroy();
        delete controller;
        controller = NULL;
    }
}
}

  

  5.pcm_decoder_controller.cpp

#include "pcm_decoder_controller.h"

PCMDecoderController::PCMDecoderController() {
    realDecoder = NULL;
    pcmFile = NULL;
}

PCMDecoderController::~PCMDecoderController() {

}

void PCMDecoderController::init(const char *orgFilePath, const char *pcmFilePath) {
    //Probe the file once to read its metadata (sample rate, bit rate)
    RealDecoder *tempDecoder = new RealDecoder();
    int metaData[2];
    tempDecoder->getMusicMeta(orgFilePath, metaData);
    delete tempDecoder;
    //Sample rate of the source audio
    sampleRate = metaData[0];
    //Bytes per second of 16-bit stereo PCM; one packet holds roughly 200 ms of samples
    int byteCountPreSec = sampleRate * 2 * 16 / 8;
    packetBufferSize = (int) ((byteCountPreSec / 2) * 0.2);
    realDecoder = new RealDecoder();
    realDecoder->init(orgFilePath, packetBufferSize);
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "init PCMDecoderController success");
    pcmFile = fopen(pcmFilePath, "wb+");
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "init pcmFile success");
}

void PCMDecoderController::decode() {
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "start decode audio data");
    while (true) {
        AudioPacket *packet = realDecoder->decodePacket();
        //A packet size of -1 signals the end of the stream
        if (-1 == packet->size) {
            break;
        }
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "write buffer to file");
        //packet->size counts 16-bit samples, hence sizeof(short)
        fwrite(packet->buffer, sizeof(short), packet->size, pcmFile);
    }
}

void PCMDecoderController::destroy() {
    if (NULL != realDecoder) {
        realDecoder->destroy();
        delete realDecoder;
        realDecoder = NULL;
    }
    if (NULL != pcmFile) {
        fclose(pcmFile);
        pcmFile = NULL;
    }
}
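
To make the packet sizing in init concrete, here is the same arithmetic as a small Java sketch, assuming a 44.1 kHz source (at runtime the sample rate comes from getMusicMeta):

// Worked example of PCMDecoderController's packet sizing, assuming sampleRate == 44100.
int sampleRate = 44100;
int byteCountPerSec = sampleRate * 2 * 16 / 8;              // 176400 bytes per second of 16-bit stereo PCM
int packetBufferSize = (int) ((byteCountPerSec / 2) * 0.2); // 17640 shorts, i.e. roughly 200 ms of stereo audio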

  

  6.pcm_real_decoder.cpp

#include "pcm_real_decoder.h"

RealDecoder::RealDecoder() {
    this->seek_seconds = 0.0f;
    this->seek_req = false;
    this->seek_resp = false;
    mediaFilePath = NULL;
}

RealDecoder::~RealDecoder() {
    if (NULL != mediaFilePath) {
        delete[] mediaFilePath;
        mediaFilePath = NULL;
    }

}

int RealDecoder::getMusicMeta(const char *fileString, int *metaData) {
    init(fileString);
    int sampleRate = avCodecContext->sample_rate;
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "sampleRate: %d", sampleRate);
    int bitRate = avCodecContext->bit_rate;
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "bitRate: %d", bitRate);
    destroy();
    metaData[0] = sampleRate;
    metaData[1] = bitRate;
    return 0;
}

void RealDecoder::init(const char *fileString, int packetBufferSizeParam) {
    init(fileString);
    packetBufferSize = packetBufferSizeParam;
}

int RealDecoder::init(const char *audioFile) {
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "init:");
    audioBuffer = NULL;
    position = -1.0f;
    audioBufferCursor = 0;
    audioBufferSize = 0;
    swrContext = NULL;
    swrBuffer = NULL;
    swrBufferSize = 0;
    seek_success_read_frame_success = true;
    isNeedFirstFrameCorrectFlag = true;
    firstFrameCorrectionInSecs = 0.0f;
    av_register_all();
    avFormatContext = avformat_alloc_context();
    //Open the input file
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "Open Input File");
    if (NULL == mediaFilePath) {
        int length = strlen(audioFile);
        mediaFilePath = new char[length + 1];
        //The length is set to length + 1 so the trailing '\0' is included
        //memset fills a block of memory with a given value; it is the fastest way to zero out a larger struct or array
        memset(mediaFilePath, 0, length + 1);
        memcpy(mediaFilePath, audioFile, length + 1);
    }

    int result = avformat_open_input(&avFormatContext, audioFile, NULL, NULL);
    if (result != 0) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "Open Input File Fail");
        return -1;
    } else {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "Open Input File success");
    }
    avFormatContext->max_analyze_duration = 50000;
    //Inspect the stream info in the file
    result = avformat_find_stream_info(avFormatContext, NULL);
    if (result < 0) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "find stream Fail");
        return -1;
    } else {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "find stream success");
    }
    stream_index = av_find_best_stream(avFormatContext, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
    if (stream_index == -1) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "find audio fail");
        return -1;
    }
    //Get the audio stream
    AVStream *audioStream = avFormatContext->streams[stream_index];
    if (audioStream->time_base.den && audioStream->time_base.num) {
        time_base = av_q2d(audioStream->time_base);
    } else if (audioStream->codec->time_base.den && audioStream->codec->time_base.num) {
        time_base = av_q2d(audioStream->codec->time_base);
    }
    //Get the audio stream's codec context
    avCodecContext = audioStream->codec;
    //Find the decoder from the codec context
    AVCodec *avCodec = avcodec_find_decoder(avCodecContext->codec_id);
    if (avCodec == NULL) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "find decode fail");
        return -1;
    } else {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "find decode success");
    }
    //Open the decoder
    result = avcodec_open2(avCodecContext, avCodec, NULL);
    if (result < 0) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "open decode fail");
        return -1;
    } else {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "open decode success");
    }
    //Check whether resampling (swresample) is needed
    if (!audioCodecIsSupported()) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder",
                            "because of audio Codec Is Not Supported so we will init swresampler...");
        /**
          * 初始化resampler
          * @param s               Swr context, can be NULL
          * @param out_ch_layout   output channel layout (AV_CH_LAYOUT_*)
          * @param out_sample_fmt  output sample format (AV_SAMPLE_FMT_*).
          * @param out_sample_rate output sample rate (frequency in Hz)
          * @param in_ch_layout    input channel layout (AV_CH_LAYOUT_*)
          * @param in_sample_fmt   input sample format (AV_SAMPLE_FMT_*).
          * @param in_sample_rate  input sample rate (frequency in Hz)
          * @param log_offset      logging level offset
          * @param log_ctx         parent logging context, can be NULL
          */
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "start init swrContext");
        swrContext = swr_alloc_set_opts(NULL, av_get_default_channel_layout(OUT_PUT_CHANNELS),
                                        AV_SAMPLE_FMT_S16, avCodecContext->sample_rate,
                                        av_get_default_channel_layout(avCodecContext->channels),
                                        avCodecContext->sample_fmt, avCodecContext->sample_rate, 0,
                                        NULL);
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "end init swrContext");
        if (!swrContext || swr_init(swrContext)) {
            if (swrContext)
                swr_free(&swrContext);
            avcodec_close(avCodecContext);
            __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "init resampler failed...");
            return -1;
        }


    }
    //Allocate the AVFrame
    audioFrame = av_frame_alloc();
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "open create audioFrame success");
    return 0;
}

/**
 * Check whether the codec already outputs S16 samples (so no resampling is needed)
 * @return
 */
bool RealDecoder::audioCodecIsSupported() {
    if (avCodecContext->sample_fmt == AV_SAMPLE_FMT_S16) {
        return true;
    }
    return false;
}

/**
 * Decode one packet of samples
 * @return
 */
AudioPacket *RealDecoder::decodePacket() {
    short *samples = new short[packetBufferSize];
    int stereoSampleSize = readSamples(samples, packetBufferSize);
    AudioPacket *samplePacket = new AudioPacket();
    if (stereoSampleSize > 0) {
        //Wrap the decoded samples in a packet
        samplePacket->buffer = samples;
        samplePacket->size = stereoSampleSize;
        /** Packet sizes vary (roughly 200 ms each), so position may be slightly inaccurate **/
        samplePacket->position = position;
    } else {
        //Nothing was decoded; free the temporary buffer and mark the end of the stream
        delete[] samples;
        samplePacket->size = -1;
    }
    return samplePacket;
}

int RealDecoder::readSamples(short *samples, int size) {
    if (seek_req) {
        audioBufferCursor = audioBufferSize;
    }
    int sampleSize = size;
    while (size > 0) {
        if (audioBufferCursor < audioBufferSize) {
            int audioBufferDataSize = audioBufferSize - audioBufferCursor;
            int copySize = MIN(size, audioBufferDataSize);
            memcpy(samples + (sampleSize - size), audioBuffer + audioBufferCursor, copySize * 2);
            size -= copySize;
            audioBufferCursor += copySize;
        } else {
            if (readFrame() < 0) {
                break;
            }
        }
//		LOGI("size is %d", size);
    }
    int fillSize = sampleSize - size;
    if (fillSize == 0) {
        return -1;
    }
    return fillSize;
}

void RealDecoder::seek_frame() {
    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder",
                        "\n seek frame firstFrameCorrectionInSecs is %.6f, seek_seconds=%f, position=%f \n",
                        firstFrameCorrectionInSecs, seek_seconds, position);
    float targetPosition = seek_seconds;
    float currentPosition = position;
    float frameDuration = duration;
    if (targetPosition < currentPosition) {
        this->destroy();
        this->init(mediaFilePath);
        //In testing this can be off by about 25 ms, but the error does not accumulate
        currentPosition = 0.0;
    }
    int readFrameCode = -1;
    while (true) {
        av_init_packet(&packet);
        readFrameCode = av_read_frame(avFormatContext, &packet);
        if (readFrameCode >= 0) {
            currentPosition += frameDuration;
            if (currentPosition >= targetPosition) {
                break;
            }
        }
//		LOGI("currentPosition is %.3f", currentPosition);
        av_free_packet(&packet);
    }
    seek_resp = true;
    seek_req = false;
    seek_success_read_frame_success = false;
}

/**
 * Read and decode the next frame from the stream
 * @return
 */
int RealDecoder::readFrame() {
    if (seek_req) {
        this->seek_frame();
    }
    int ret = 1;
    av_init_packet(&packet);
    int gotFrame = -1;
    int readFrameCode = -1;
    while (true) {
        readFrameCode = av_read_frame(avFormatContext, &packet);
        if (readFrameCode >= 0) {
            if (packet.stream_index == stream_index) {//this packet belongs to the audio stream
                int len = avcodec_decode_audio4(avCodecContext, audioFrame, &gotFrame, &packet);
                if (len < 0) {
                    __android_log_print(ANDROID_LOG_ERROR, "RealDecoder",
                                        "decode audio error, skip packet");
                }
                if (gotFrame) {
                    int numChannels = OUT_PUT_CHANNELS;
                    int numFrames = 0;
                    void *audioData;
                    if (swrContext) {
                        const int ratio = 2;
                        const int bufSize = av_samples_get_buffer_size(NULL,
                                                                       numChannels,
                                                                       audioFrame->nb_samples *
                                                                       ratio,
                                                                       AV_SAMPLE_FMT_S16, 1);
                        if (!swrBuffer || swrBufferSize < bufSize) {
                            swrBufferSize = bufSize;
                            swrBuffer = realloc(swrBuffer, swrBufferSize);
                        }
                        byte *outbuf[2] = {(byte *) swrBuffer, NULL};
                        numFrames = swr_convert(swrContext, outbuf,
                                                audioFrame->nb_samples * ratio,
                                                (const uint8_t **) audioFrame->data,
                                                audioFrame->nb_samples);
                        if (numFrames < 0) {
                            __android_log_print(ANDROID_LOG_ERROR, "RealDecoder",
                                                "fail resample audio");
                            ret = -1;
                            break;
                        }
                        audioData = swrBuffer;
                    } else {
                        if (avCodecContext->sample_fmt != AV_SAMPLE_FMT_S16) {
                            __android_log_print(ANDROID_LOG_ERROR, "RealDecoder",
                                                "bucheck, audio format is invalid");
                            ret = -1;
                            break;
                        }
                        audioData = audioFrame->data[0];
                        numFrames = audioFrame->nb_samples;
                    }
                    if (isNeedFirstFrameCorrectFlag && position >= 0) {
                        float expectedPosition = position + duration;
                        float actualPosition =
                                av_frame_get_best_effort_timestamp(audioFrame) * time_base;
                        firstFrameCorrectionInSecs = actualPosition - expectedPosition;
                        isNeedFirstFrameCorrectFlag = false;
                    }
                    duration = av_frame_get_pkt_duration(audioFrame) * time_base;
                    position = av_frame_get_best_effort_timestamp(audioFrame) * time_base -
                               firstFrameCorrectionInSecs;
                    if (!seek_success_read_frame_success) {
                        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "position is %.6f",
                                            position);
                        actualSeekPosition = position;
                        seek_success_read_frame_success = true;
                    }
                    audioBufferSize = numFrames * numChannels;
//					LOGI(" \n duration is %.6f position is %.6f audioBufferSize is %d\n", duration, position, audioBufferSize);
                    audioBuffer = (short *) audioData;
                    audioBufferCursor = 0;
                    break;
                }
            }
        } else {
            ret = -1;
            break;
        }
    }
    av_free_packet(&packet);
    return ret;
}


void RealDecoder::destroy() {
//	LOGI("start destroy!!!");
    if (NULL != swrBuffer) {
        free(swrBuffer);
        swrBuffer = NULL;
        swrBufferSize = 0;
    }
    if (NULL != swrContext) {
        swr_free(&swrContext);
        swrContext = NULL;
    }
    if (NULL != audioFrame) {
        av_free(audioFrame);
        audioFrame = NULL;
    }
    if (NULL != avCodecContext) {
        avcodec_close(avCodecContext);
        avCodecContext = NULL;
    }
    if (NULL != avFormatContext) {
        __android_log_print(ANDROID_LOG_ERROR, "RealDecoder", "leave LiveReceiver::destory");
        avformat_close_input(&avFormatContext);
        avFormatContext = NULL;
    }
//	LOGI("end destroy!!!");
}

  

III. Source Code Download

  Gitee

