
Adding Custom GPU Filter Support to IjkPlayer

2019-10-22 18:16:20
Source: reprinted, contributed by a reader

Recently, for work, I needed to provide an example of plugging our AiyaEffectsSDK into IjkPlayer, so I had to take a close look at the IjkPlayer code. IjkPlayer provides no interface for setting a custom GPU filter, so in the end I had to roll up my sleeves and add one myself. I have to say that this IjkPlayer player open-sourced by Bilibili really is powerful, and its code is very cleanly designed; reading it carefully teaches you quite a lot.

Getting and Building the IjkPlayer Source

Source repository below; to build, just follow the README:

# fetch the ijkplayer source
git clone https://github.com/Bilibili/ijkplayer.git ijkplayer-android
# enter the source directory
cd ijkplayer-android
# check out the latest release
git checkout -B latest k0.8.0
# run this script; it downloads ijkplayer's dependencies, such as ffmpeg
./init-android.sh
# build ffmpeg; "all" can be replaced with a specific ABI such as armv7a
cd android/contrib
./compile-ffmpeg.sh clean
./compile-ffmpeg.sh all
# build ijkplayer; "all" can be replaced with a specific ABI such as armv7a
cd ..
./compile-ijk.sh all

Analyzing and Modifying IjkPlayer

In the Android IjkPlayer sample project, the video playback screen is tv.danmaku.ijk.media.example.activities.VideoActivity. VideoActivity plays video through an IjkVideoView, which lives in the tv.danmaku.ijk.media.example.widget.media package.

IjkVideoView is used much like Android's VideoView. In IjkVideoView, the video source is set via setVideoURI, which in turn calls the private openVideo method. Inside openVideo, an IMediaPlayer is created according to the value of mSettings.getPlayer():

public IMediaPlayer createPlayer(int playerType) {
    IMediaPlayer mediaPlayer = null;
    switch (playerType) {
        case Settings.PV_PLAYER__IjkExoMediaPlayer: {
            IjkExoMediaPlayer IjkExoMediaPlayer = new IjkExoMediaPlayer(mAppContext);
            mediaPlayer = IjkExoMediaPlayer;
        }
        break;
        case Settings.PV_PLAYER__AndroidMediaPlayer: {
            AndroidMediaPlayer androidMediaPlayer = new AndroidMediaPlayer();
            mediaPlayer = androidMediaPlayer;
        }
        break;
        case Settings.PV_PLAYER__IjkMediaPlayer:
        default: {
            IjkMediaPlayer ijkMediaPlayer = null;
            if (mUri != null) {
                ijkMediaPlayer = new IjkMediaPlayer();
                ijkMediaPlayer.native_setLogLevel(IjkMediaPlayer.IJK_LOG_DEBUG);
                if (mSettings.getUsingMediaCodec()) {
                    ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec", 1);
                    if (mSettings.getUsingMediaCodecAutoRotate()) {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-auto-rotate", 1);
                    } else {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-auto-rotate", 0);
                    }
                    if (mSettings.getMediaCodecHandleResolutionChange()) {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-handle-resolution-change", 1);
                    } else {
                        ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec-handle-resolution-change", 0);
                    }
                } else {
                    ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "mediacodec", 0);
                }
                // other option-setting code omitted
            }
            mediaPlayer = ijkMediaPlayer;
        }
        break;
    }
    if (mSettings.getEnableDetachedSurfaceTextureView()) {
        mediaPlayer = new TextureMediaPlayer(mediaPlayer);
    }
    return mediaPlayer;
}

As the code above shows, IjkVideoView uses mSettings in many places. The values in mSettings are mostly user-chosen preferences persisted via SharedPreferences, covering audio/video decoder selection, whether to use OpenSL ES, which render view to use, and so on; see the SettingsActivity screen.

createPlayer branches on playerType: the first two cases create Google's ExoPlayer and Android's built-in MediaPlayer; only the remaining default case actually creates an IjkPlayer.

The other parameters in mSettings are eventually converted and applied through IjkMediaPlayer's setOption method, and IjkMediaPlayer.setOption calls straight into a native method. Looking inside IjkMediaPlayer, you will find that many of its methods are native methods, or delegate to native methods.
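To make the settings-to-options conversion concrete, here is a minimal, self-contained sketch. The Option class and buildOptions helper are hypothetical illustrations (not part of ijkplayer); the category constant's value and the option names mirror the real setOption calls shown in createPlayer above, but verify them against the actual IjkMediaPlayer source.

```java
import java.util.ArrayList;
import java.util.List;

public class OptionMapping {
    // mirrors IjkMediaPlayer.OPT_CATEGORY_PLAYER (numeric value assumed here)
    static final int OPT_CATEGORY_PLAYER = 4;

    // hypothetical holder for one (category, name, value) triple
    static class Option {
        final int category; final String name; final long value;
        Option(int category, String name, long value) {
            this.category = category; this.name = name; this.value = value;
        }
    }

    // hypothetical helper: translate two boolean settings into the option
    // triples that createPlayer above would pass to setOption
    static List<Option> buildOptions(boolean useMediaCodec, boolean autoRotate) {
        List<Option> opts = new ArrayList<>();
        opts.add(new Option(OPT_CATEGORY_PLAYER, "mediacodec", useMediaCodec ? 1 : 0));
        if (useMediaCodec) {
            // secondary options only matter when hardware decoding is on
            opts.add(new Option(OPT_CATEGORY_PLAYER, "mediacodec-auto-rotate", autoRotate ? 1 : 0));
        }
        return opts;
    }

    public static void main(String[] args) {
        for (Option o : buildOptions(true, false)) {
            System.out.println(o.name + "=" + o.value);
        }
    }
}
```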

Adding a setGLFilter Interface

A global search under the ijkmedia folder for one of those native methods, _setDataSource, turns up roughly the following:

F:/cres/C/ijkplayer-android/ijkmedia/ijkplayer/android/ijkplayer_jni.c:

static void IjkMediaPlayer_setDataSourceAndHeaders(
    JNIEnv *env, jobject thiz, jstring path,
    jobjectArray keys, jobjectArray values)
...
static void
IjkMediaPlayer_setDataSourceFd(JNIEnv *env, jobject thiz, jint fd)
{
    MPTRACE("%s\n", __func__);
...
static void
IjkMediaPlayer_setDataSourceCallback(JNIEnv *env, jobject thiz, jobject callback)
{
    MPTRACE("%s\n", __func__);
...
static JNINativeMethod g_methods[] = {
    { "_setDataSource",        "(Ljava/lang/String;[Ljava/lang/String;[Ljava/lang/String;)V", (void *) IjkMediaPlayer_setDataSourceAndHeaders },
    { "_setDataSourceFd",      "(I)V",                                                        (void *) IjkMediaPlayer_setDataSourceFd },
    { "_setDataSource",        "(Ltv/danmaku/ijk/media/player/misc/IMediaDataSource;)V",      (void *) IjkMediaPlayer_setDataSourceCallback },
    { "_setAndroidIOCallback", "(Ltv/danmaku/ijk/media/player/misc/IAndroidIO;)V",            (void *) IjkMediaPlayer_setAndroidIOCallback },

From this it is clear that, in the Android build of ijkplayer, the JNI entry point is ijkmedia/ijkplayer/android/ijkplayer_jni.c.

Add a setGLFilter method in both IjkMediaPlayer.java and ijkplayer_jni.c:

// add a setGLFilter method
static void IjkMediaPlayer_native_setGLFilter(JNIEnv *env, jclass clazz, jobject filter)
{
}

// ----------------------------------------------------------------------------
static JNINativeMethod g_methods[] = {
    { "_setDataSource",   "(Ljava/lang/String;[Ljava/lang/String;[Ljava/lang/String;)V", (void *) IjkMediaPlayer_setDataSourceAndHeaders },
    { "_setDataSourceFd", "(I)V",                                                        (void *) IjkMediaPlayer_setDataSourceFd },
    { "_setDataSource",   "(Ltv/danmaku/ijk/media/player/misc/IMediaDataSource;)V",      (void *) IjkMediaPlayer_setDataSourceCallback },
    // ... other methods omitted
    { "_setGLFilter",     "(Ltv/danmaku/ijk/media/player/IjkFilter;)V",                  (void *) IjkMediaPlayer_native_setGLFilter },
};
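Only the C side is shown here; the Java side can be inferred from the JNI signature "(Ltv/danmaku/ijk/media/player/IjkFilter;)V". The sketch below is an assumption based on that signature and on the onCreated/onSizeChanged/onDrawFrame callbacks invoked later from ijkplayer_jni.c, not code copied from the repository:

```java
// Hypothetical Java-side counterpart of the new JNI method. In the real
// project, IjkFilter would live in package tv.danmaku.ijk.media.player and
// IjkMediaPlayer would declare "private native void _setGLFilter(IjkFilter f);".
public class IjkFilterDemo {

    // inferred from the JNI registration: three callbacks driven by the renderer
    public interface IjkFilter {
        void onCreated();
        void onSizeChanged(int width, int height);
        int onDrawFrame(int textureId); // returns the texture id to display
    }

    // a trivial pass-through filter: remembers the size, returns its input texture
    static class PassThroughFilter implements IjkFilter {
        int width, height;
        public void onCreated() {}
        public void onSizeChanged(int w, int h) { width = w; height = h; }
        public int onDrawFrame(int textureId) { return textureId; }
    }

    public static void main(String[] args) {
        PassThroughFilter f = new PassThroughFilter();
        f.onCreated();
        f.onSizeChanged(1280, 720);
        System.out.println(f.onDrawFrame(7)); // pass-through hands back its input
    }
}
```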

Invoking the User's GLFilter During Rendering

In ijkmedia/ijksdl/gles2/internal.h, add the following members to the IJK_GLES2_Renderer struct:

typedef struct IJK_GLES2_Renderer
{
    //...
    GLuint frame_buffers[1];
    GLuint frame_textures[1];
    int hasFilter;  /* zeroed when the renderer is allocated */
    void (* func_onCreated)(void);
    void (* func_onSizeChanged)(int width, int height);
    int  (* func_onDrawFrame)(int textureId);
    //...
} IJK_GLES2_Renderer;

These three function pointers correspond to the three methods of IjkFilter; in the JNI layer they will be wired up to the methods of the IjkFilter object set from Java.

A global search for glDraw finds a single call site, in IJK_GLES2_Renderer_renderOverlay in renderer.c; that method is where the actual rendering happens. When IjkPlayer renders with OpenGL ES it chooses a path based on the pixel format decoded from the video, such as yuv420p, yuv420sp, rgb565, and so on. The per-format renderers live under ijkmedia/ijksdl/gles2/, for example renderer_rgb.c and renderer_yuv420p.c.

When the user sets a GLFilter from the Java layer, its three methods should be called back from C at the right moments. As the names suggest, these three methods correspond one-to-one with the three methods of the GLSurfaceView.Renderer interface.

The concrete changes inside IJK_GLES2_Renderer_renderOverlay are as follows:

GLboolean IJK_GLES2_Renderer_renderOverlay(IJK_GLES2_Renderer *renderer, SDL_VoutOverlay *overlay)
{
    if (!renderer || !renderer->func_uploadTexture)
        return GL_FALSE;

    /* If the user set a filter and no framebuffer has been created yet, create
       one. The original IjkPlayer pipeline is kept as-is: the yuv/rgb frame is
       rendered, but into a texture, and that texture is then handed to the
       Java layer as the source image for processing. */
    if (renderer->hasFilter && !renderer->frame_buffers[0] &&
        renderer->frame_width > 0 && renderer->frame_height > 0) {
        // create a texture that will receive the video frame, whatever its source format
        glGenTextures(1, renderer->frame_textures);
        glBindTexture(GL_TEXTURE_2D, renderer->frame_textures[0]);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, renderer->frame_width, renderer->frame_height,
                     0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glBindTexture(GL_TEXTURE_2D, 0);
        // create a framebuffer and attach the texture, so video frames render into it
        glGenFramebuffers(1, renderer->frame_buffers);
        glBindFramebuffer(GL_FRAMEBUFFER, renderer->frame_buffers[0]);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                               renderer->frame_textures[0], 0);
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        // invoke the onCreated and onSizeChanged callbacks of the filter set from Java
        renderer->func_onCreated();
        renderer->func_onSizeChanged(renderer->frame_width, renderer->frame_height);
        ALOGE("wuwang: create frame_buffers and textures %d,%d", renderer->frame_width, renderer->frame_height);
    }

    // if a filter is set, bind the framebuffer; otherwise render to screen as before
    if (renderer->hasFilter && renderer->frame_buffers[0]) {
        GLint bindFrame;
        glGetIntegerv(GL_FRAMEBUFFER_BINDING, &bindFrame);
        ALOGE("wuwang: default frame binding %d", bindFrame);
        glBindFramebuffer(GL_FRAMEBUFFER, renderer->frame_buffers[0]);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                               renderer->frame_textures[0], 0);
        /* This call is essential: once the filter has activated another GL
           program, the original video can no longer be rendered into the
           texture unless the renderer's own program is re-used here. */
        IJK_GLES2_Renderer_use(renderer);
    }

    glClear(GL_COLOR_BUFFER_BIT);
    IJK_GLES2_checkError_TRACE("glClear");
    ALOGE("wuwang: frame buffer id:%d", renderer->frame_buffers[0]);

    GLsizei visible_width  = renderer->frame_width;
    GLsizei visible_height = renderer->frame_height;
    if (overlay) {
        visible_width  = overlay->w;
        visible_height = overlay->h;
        if (renderer->frame_width   != visible_width   ||
            renderer->frame_height  != visible_height  ||
            renderer->frame_sar_num != overlay->sar_num ||
            renderer->frame_sar_den != overlay->sar_den) {
            renderer->frame_width   = visible_width;
            renderer->frame_height  = visible_height;
            renderer->frame_sar_num = overlay->sar_num;
            renderer->frame_sar_den = overlay->sar_den;
            renderer->vertices_changed = 1;
        }
        renderer->last_buffer_width = renderer->func_getBufferWidth(renderer, overlay);
        if (!renderer->func_uploadTexture(renderer, overlay)) {
            return GL_FALSE;
        }
    } else {
        // NULL overlay means force reload vertices
        renderer->vertices_changed = 1;
    }

    GLsizei buffer_width = renderer->last_buffer_width;
    if (renderer->vertices_changed ||
        (buffer_width > 0 &&
         buffer_width > visible_width &&
         buffer_width != renderer->buffer_width &&
         visible_width != renderer->visible_width)) {
        if (renderer->hasFilter && renderer->frame_buffers[0]) {
            renderer->func_onSizeChanged(renderer->frame_width, renderer->frame_height);
        }
        renderer->vertices_changed = 0;
        IJK_GLES2_Renderer_Vertices_apply(renderer);
        IJK_GLES2_Renderer_Vertices_reset(renderer);
        IJK_GLES2_Renderer_Vertices_reloadVertex(renderer);
        renderer->buffer_width  = buffer_width;
        renderer->visible_width = visible_width;
        GLsizei padding_pixels     = buffer_width - visible_width;
        GLfloat padding_normalized = ((GLfloat)padding_pixels) / buffer_width;
        IJK_GLES2_Renderer_TexCoords_reset(renderer);
        IJK_GLES2_Renderer_TexCoords_cropRight(renderer, padding_normalized);
        IJK_GLES2_Renderer_TexCoords_reloadVertex(renderer);
    }

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    IJK_GLES2_checkError_TRACE("glDrawArrays");

    // if a filter is set, unbind the framebuffer and call the filter's onDrawFrame
    if (renderer->hasFilter && renderer->frame_buffers[0]) {
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        renderer->func_onDrawFrame(renderer->frame_textures[0]);
    }
    return GL_TRUE;
}

From setGLFilter to the Renderer: Wiring It Up

The interface and the render-time hooks are now both in place. What remains is to connect the three methods of the GLFilter passed in through the interface to the three function pointers used at render time, so the user's filter actually takes effect.

IJK_GLES2_Renderer is created from an SDL_VoutOverlay. Tracing where the SDL_VoutOverlay comes from leads to func_create_overlay in ijkmedia/ijksdl/android/ijksdl_vout_android_nativewindow.c:

static SDL_VoutOverlay *func_create_overlay(int width, int height, int frame_format, SDL_Vout *vout)
{
    SDL_LockMutex(vout->mutex);
    SDL_VoutOverlay *overlay = func_create_overlay_l(width, height, frame_format, vout);
    SDL_UnlockMutex(vout->mutex);
    return overlay;
}

Creating an SDL_VoutOverlay involves an SDL_Vout. Tracing the SDL_Vout in turn leads to SDL_VoutAndroid_CreateForANativeWindow, also in ijkmedia/ijksdl/android/ijksdl_vout_android_nativewindow.c:

SDL_Vout *SDL_VoutAndroid_CreateForANativeWindow()
{
    SDL_Vout *vout = SDL_Vout_CreateInternal(sizeof(SDL_Vout_Opaque));
    if (!vout)
        return NULL;
    SDL_Vout_Opaque *opaque = vout->opaque;
    opaque->native_window = NULL;
    if (ISDL_Array__init(&opaque->overlay_manager, 32))
        goto fail;
    if (ISDL_Array__init(&opaque->overlay_pool, 32))
        goto fail;
    opaque->egl = IJK_EGL_create();
    if (!opaque->egl)
        goto fail;
    vout->opaque_class    = &g_nativewindow_class;
    vout->create_overlay  = func_create_overlay;
    vout->free_l          = func_free_l;
    vout->display_overlay = func_display_overlay;
    return vout;
fail:
    func_free_l(vout);
    return NULL;
}

Its initialization has no further upstream state to chase, so the next step is to find its callers. A search shows that SDL_VoutAndroid_CreateForANativeWindow is called only from SDL_VoutAndroid_CreateForAndroidSurface in ijkmedia/ijksdl/android/ijksdl_vout_android_surface.c:

SDL_Vout *SDL_VoutAndroid_CreateForAndroidSurface()
{
    return SDL_VoutAndroid_CreateForANativeWindow();
}

That is just a thin wrapper, so continue searching for SDL_VoutAndroid_CreateForAndroidSurface; its caller is ijkmp_android_create in ijkmedia/ijkplayer/android/ijkplayer_android.c:

IjkMediaPlayer *ijkmp_android_create(int (*msg_loop)(void*))
{
    IjkMediaPlayer *mp = ijkmp_create(msg_loop);
    if (!mp)
        goto fail;
    mp->ffplayer->vout = SDL_VoutAndroid_CreateForAndroidSurface();
    if (!mp->ffplayer->vout)
        goto fail;
    mp->ffplayer->pipeline = ffpipeline_create_from_android(mp->ffplayer);
    if (!mp->ffplayer->pipeline)
        goto fail;
    ffpipeline_set_vout(mp->ffplayer->pipeline, mp->ffplayer->vout);
    return mp;
fail:
    ijkmp_dec_ref_p(&mp);
    return NULL;
}

ijkmp_android_create returns an IjkMediaPlayer, and a class of the same name exists on the Java side, so the end of the chain is in sight.

Searching for ijkmp_android_create shows that its only caller is IjkMediaPlayer_native_setup in ijkmedia/ijkplayer/android/ijkplayer_jni.c. From this point, the IjkFilter can be passed down.

To recap the chain from renderer.c back up to the JNI layer: IJK_GLES2_Renderer is created from an SDL_VoutOverlay; the SDL_VoutOverlay is created from an SDL_Vout; the SDL_Vout is a member of FFPlayer, which in turn is a member of the native IjkMediaPlayer; and the Java-level IjkMediaPlayer holds a reference to that native IjkMediaPlayer.

From GLFilter to IJK_GLES2_Renderer

Based on this analysis, the setGLFilter method added in the JNI layer can hand the GLFilter's methods down through IjkMediaPlayer to FFPlayer to SDL_Vout, then on to SDL_VoutOverlay, and finally from SDL_VoutOverlay to IJK_GLES2_Renderer. This adds the filter capability without disturbing IjkPlayer's existing flow, and the same approach should let iOS add GPU filters just as quickly.

First, add the same members introduced in IJK_GLES2_Renderer to the SDL_VoutOverlay and SDL_Vout struct definitions (in ijkmedia/ijksdl/ijksdl_vout.h):

struct SDL_VoutOverlay {
    //...
    int hasFilter;
    void (* func_onCreated)(void);
    void (* func_onSizeChanged)(int width, int height);
    int  (* func_onDrawFrame)(int textureId);
    //...
};

struct SDL_Vout {
    //...
    int hasFilter;
    void (* func_onCreated)(void);
    void (* func_onSizeChanged)(int width, int height);
    int  (* func_onDrawFrame)(int textureId);
    //...
};

Then, in the functions identified above, copy these members from SDL_Vout to SDL_VoutOverlay, and from SDL_VoutOverlay to IJK_GLES2_Renderer. Specifically:

In IJK_GLES2_Renderer_create in ijkmedia/ijksdl/gles2/renderer.c:

IJK_GLES2_Renderer *IJK_GLES2_Renderer_create(SDL_VoutOverlay *overlay)
{
    if (!overlay)
        return NULL;
    // ... intermediate code omitted
    renderer->format = overlay->format;
    // added members
    renderer->hasFilter          = overlay->hasFilter;
    renderer->func_onCreated     = overlay->func_onCreated;
    renderer->func_onSizeChanged = overlay->func_onSizeChanged;
    renderer->func_onDrawFrame   = overlay->func_onDrawFrame;
    return renderer;
}

In func_create_overlay in ijksdl/android/ijksdl_vout_android_nativewindow.c:

static SDL_VoutOverlay *func_create_overlay(int width, int height, int frame_format, SDL_Vout *vout)
{
    SDL_LockMutex(vout->mutex);
    SDL_VoutOverlay *overlay = func_create_overlay_l(width, height, frame_format, vout);
    // added members
    overlay->hasFilter          = vout->hasFilter;
    overlay->func_onCreated     = vout->func_onCreated;
    overlay->func_onSizeChanged = vout->func_onSizeChanged;
    overlay->func_onDrawFrame   = vout->func_onDrawFrame;
    SDL_UnlockMutex(vout->mutex);
    return overlay;
}

Finally, assign these SDL_Vout members and call the corresponding methods of the GLFilter object passed in from Java (in ijkplayer_jni.c):

static JNIEnv   *mEnv;
static jobject   mFilter;
static jmethodID onCreatedMethod;
static jmethodID onSizeChangedMethod;
static jmethodID onDrawFrameMethod;

void onCreated()
{
    if (!mEnv) {
        (*g_jvm)->AttachCurrentThread(g_jvm, &mEnv, NULL);
        jclass filterClass  = (*mEnv)->GetObjectClass(mEnv, mFilter);
        onCreatedMethod     = (*mEnv)->GetMethodID(mEnv, filterClass, "onCreated", "()V");
        onSizeChangedMethod = (*mEnv)->GetMethodID(mEnv, filterClass, "onSizeChanged", "(II)V");
        onDrawFrameMethod   = (*mEnv)->GetMethodID(mEnv, filterClass, "onDrawFrame", "(I)I");
        (*g_jvm)->DetachCurrentThread(g_jvm);
    }
    if (onCreatedMethod) {
        (*g_jvm)->AttachCurrentThread(g_jvm, &mEnv, NULL);
        (*mEnv)->CallVoidMethod(mEnv, mFilter, onCreatedMethod);
        (*g_jvm)->DetachCurrentThread(g_jvm);
    }
}

void onSizeChanged(int width, int height)
{
    if (onSizeChangedMethod) {
        (*g_jvm)->AttachCurrentThread(g_jvm, &mEnv, NULL);
        (*mEnv)->CallVoidMethod(mEnv, mFilter, onSizeChangedMethod, width, height);
        (*g_jvm)->DetachCurrentThread(g_jvm);
    }
}

int onDrawFrame(int textureId)
{
    if (onDrawFrameMethod) {
        (*g_jvm)->AttachCurrentThread(g_jvm, &mEnv, NULL);
        int ret = (*mEnv)->CallIntMethod(mEnv, mFilter, onDrawFrameMethod, textureId);
        (*g_jvm)->DetachCurrentThread(g_jvm);
        return ret;
    }
    return textureId;
}

/* Note: the JNIEnv and the filter jobject cannot simply be cached at
   setGLFilter time and used directly in onDrawFrame etc., because those three
   callbacks run on a different thread from setGLFilter. */
static void IjkMediaPlayer_native_setGLFilter(JNIEnv *env, jobject clazz, jobject filter)
{
    if (mFilter) {
        (*env)->DeleteGlobalRef(env, mFilter);
    }
    IjkMediaPlayer *mp = jni_get_media_player(env, clazz);
    if (filter != NULL) {
        mFilter = (*env)->NewGlobalRef(env, filter);
        mp->ffplayer->vout->hasFilter          = 1;
        mp->ffplayer->vout->func_onCreated     = onCreated;
        mp->ffplayer->vout->func_onSizeChanged = onSizeChanged;
        mp->ffplayer->vout->func_onDrawFrame   = onDrawFrame;
    } else {
        mp->ffplayer->vout->hasFilter = 0;
    }
}

With that, the path from Java all the way down to the sdl renderer is connected. On the Java side, usage is now much like setting a Renderer on a GLSurfaceView. The difference is that the GLFilter added to IjkPlayer already receives the original video frame as the texture argument to onDrawFrame, so the filter only needs to process that texture and render the result.
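For instance, the black-and-white filter demonstrated below could be driven by a fragment shader along these lines. This is only a sketch of the shader sources, kept as Java string constants in the usual Android style; the attribute and uniform names are this sketch's own choices, and the GLES program setup, vertex buffers, and draw call inside onDrawFrame are omitted.

```java
// Illustrative shader sources for a grayscale (black-and-white) filter.
// The attribute/uniform names here are assumptions, not ijkplayer identifiers.
public class GrayscaleShaders {
    // trivial vertex shader: pass position and texture coordinate through
    public static final String VERTEX =
            "attribute vec4 aPosition;\n" +
            "attribute vec2 aTexCoord;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_Position = aPosition;\n" +
            "  vTexCoord = aTexCoord;\n" +
            "}\n";

    // fragment shader: sample the texture handed to onDrawFrame and
    // collapse RGB to a single gray level using BT.601 luma weights
    public static final String FRAGMENT =
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform sampler2D uTexture;\n" +
            "void main() {\n" +
            "  vec4 c = texture2D(uTexture, vTexCoord);\n" +
            "  float gray = dot(c.rgb, vec3(0.299, 0.587, 0.114));\n" +
            "  gl_FragColor = vec4(vec3(gray), c.a);\n" +
            "}\n";
}
```

Inside onDrawFrame, a filter would compile and link these sources once, bind the incoming texture id to uTexture, and draw a full-screen quad.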

Filter Example

Rebuild the modified code; the built libraries are automatically copied into the IjkPlayer Android project. After setting a custom filter, barring surprises, you should see the effect. Below are, in order, the original video, the video through a black-and-white filter, and the video with AiyaEffectsSDK effects applied; because of image-size limits on the original site, only short clips were captured:

[Screenshots: original video / black-and-white filter / AiyaEffectsSDK effects]

The project is large and the changes are few, so the modified code is not uploaded here. If you need it, download the ijkplayer source and apply the steps above yourself; reading the IjkPlayer source along the way is well worth it.

That is everything in this guide to adding custom GPU filter support to IjkPlayer. I hope it serves as a useful reference, and thanks for supporting VEVB武林網.

