Introduction:

Android exposes audio and video playback to applications through a complete set of interfaces on the MediaPlayer class. But everything is worth tracing to its roots, so today we will look at how Android's MediaPlayer framework (based on Android 8.0) is actually implemented. Since the framework layer is written entirely in C/C++, some background in those languages will help, but even without it you should be able to follow along.

Overview:

First, here is the MediaPlayer state diagram (borrowed from the web); the methods it shows are essentially all of MediaPlayer's commonly used API.

Audio/video decoding, demuxing, and rendering are computationally heavy, so Google implemented these operations in native C/C++ code. But how does Java call into C++? That is where JNI (Java Native Interface) comes in. JNI lets Java code interoperate with code written in other languages. It was originally designed for natively compiled languages, C and C++ in particular, but nothing stops you from using other languages as long as the calling convention is supported. Note that mixing Java with natively compiled code usually sacrifices platform portability.

With that, we can draw the layered architecture of the MediaPlayer framework:

Code analysis:

Any Java program that plays audio or video will use MediaPlayer, and a typical call sequence looks like this:

mediaPlayer = MediaPlayer.create(MainActivity.this, R.raw.test);
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDisplay(surfaceHolder);
mediaPlayer.setLooping(true);
mediaPlayer.start();

Look familiar? Create a MediaPlayer with create(), set the audio stream type, set the display surface, set looping, and call start(). The steps are almost always the same, but how does it all work underneath? Let's analyze it, starting with create() and following it down.

create()

Naturally, we start by looking in MediaPlayer.java.

Path: xref: /frameworks/base/media/java/android/media/MediaPlayer.java

// Delegates downward
public static MediaPlayer create(Context context, int resid) {
    int s = AudioSystem.newAudioSessionId();
    return create(context, resid, null, s > 0 ? s : 0);
}

// Still news a MediaPlayer itself, then calls setDataSource()
public static MediaPlayer create(Context context, int resid,
        AudioAttributes audioAttributes, int audioSessionId) {
    try {
        AssetFileDescriptor afd = context.getResources().openRawResourceFd(resid);
        if (afd == null) return null;

        MediaPlayer mp = new MediaPlayer();

        final AudioAttributes aa = audioAttributes != null ? audioAttributes :
            new AudioAttributes.Builder().build();
        mp.setAudioAttributes(aa);
        mp.setAudioSessionId(audioSessionId);

        mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        afd.close();
        mp.prepare();
        return mp;
    } catch (IOException ex) {
        Log.d(TAG, "create failed:", ex);
        // fall through
    } catch (IllegalArgumentException ex) {
        Log.d(TAG, "create failed:", ex);
        // fall through
    } catch (SecurityException ex) {
        Log.d(TAG, "create failed:", ex);
        // fall through
    }
    return null;
}

public void setDataSource(FileDescriptor fd, long offset, long length)
        throws IOException, IllegalArgumentException, IllegalStateException {
    _setDataSource(fd, offset, length);
}

private native void _setDataSource(FileDescriptor fd, long offset, long length)
        throws IOException, IllegalArgumentException, IllegalStateException;

I have chained these related functions together so you can see the layered calls. The many overloads Java offers at the top all call one another and ultimately funnel into one or two core functions. When I first learned MediaPlayer, tutorials said you could either new a MediaPlayer yourself or skip new and call create() directly, which puzzled me; now the mystery is solved: create() news a MediaPlayer internally, then calls that MediaPlayer's setDataSource() to set the playback resource. setDataSource() is our path-breaker for studying the framework; it is the most important function here. Note the native keyword on the last declaration: it marks a function implemented in the native layer, so you will not find its definition in this directory. That is JNI.

Before reading the native code, notice the static block that runs before MediaPlayer.java's constructor:

static {
    System.loadLibrary("media_jni");
    native_init();
}

private static native final void native_init();

So the native layer's native_init() is called up front. That is not all: the MediaPlayer.java constructor also contains a native call:

native_setup(new WeakReference<MediaPlayer>(this));

private native final void native_setup(Object mediaplayer_this);

These two native functions run whenever a MediaPlayer is constructed. Don't let them slip by; they matter, and we will look at them shortly.

JNI:

JNI stands for Java Native Interface: the bridge through which Java and C/C++ communicate, letting multiple languages coexist in one project. When do you need JNI?

1. To call platform-dependent functionality that the Java language itself does not support;

2. To integrate existing systems written in non-Java languages, such as C and C++;

3. To save run time by dropping down to lower-level languages.

For more on using JNI, see: JNI usage

MediaPlayer.java calls into the JNI layer; following JNI's naming convention, we should look at android_media_MediaPlayer.cpp. We saw the Java layer call native_init(), native_setup(), and _setDataSource(), but searching android_media_MediaPlayer.cpp eagerly turns up no definitions with those names. What is going on?

No panic: a careful reader will spot a block of code in the JNI file that remaps the names!

xref: /frameworks/base/media/jni/android_media_MediaPlayer.cpp
static const JNINativeMethod gMethods[] = {
    {"nativeSetDataSource",
        "(Landroid/os/IBinder;Ljava/lang/String;[Ljava/lang/String;"
        "[Ljava/lang/String;)V",
        (void *)android_media_MediaPlayer_setDataSourceAndHeaders},
    {"_setDataSource",      "(Ljava/io/FileDescriptor;JJ)V",     (void *)android_media_MediaPlayer_setDataSourceFD},
    {"_setDataSource",      "(Landroid/media/MediaDataSource;)V",(void *)android_media_MediaPlayer_setDataSourceCallback},
    {"_setVideoSurface",    "(Landroid/view/Surface;)V",         (void *)android_media_MediaPlayer_setVideoSurface},
    {"getDefaultBufferingParams", "()Landroid/media/BufferingParams;", (void *)android_media_MediaPlayer_getDefaultBufferingParams},
    {"getBufferingParams",  "()Landroid/media/BufferingParams;", (void *)android_media_MediaPlayer_getBufferingParams},
    {"setBufferingParams",  "(Landroid/media/BufferingParams;)V",(void *)android_media_MediaPlayer_setBufferingParams},
    {"_prepare",            "()V",                               (void *)android_media_MediaPlayer_prepare},
    {"prepareAsync",        "()V",                               (void *)android_media_MediaPlayer_prepareAsync},
    {"_start",              "()V",                               (void *)android_media_MediaPlayer_start},
    {"_stop",               "()V",                               (void *)android_media_MediaPlayer_stop},
    {"native_init",         "()V",                               (void *)android_media_MediaPlayer_native_init},
    {"native_setup",        "(Ljava/lang/Object;)V",             (void *)android_media_MediaPlayer_native_setup},
};
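To make the mapping concrete, here is a minimal sketch, in plain C++ rather than real JNI, of how a JNINativeMethod-style table resolves a Java-side method name to a native function pointer. The method names, return values, and the dispatch() helper are all invented for illustration:

```cpp
#include <string>

// Hypothetical stand-ins for the native implementations (not the AOSP code).
static int native_init_impl()  { return 1; }
static int native_setup_impl() { return 2; }

// Mirrors the shape of JNINativeMethod: Java-side name, JNI signature, function pointer.
struct NativeMethod {
    const char* javaName;
    const char* signature;
    int (*fnPtr)();
};

static const NativeMethod gToyMethods[] = {
    {"native_init",  "()V",                   native_init_impl},
    {"native_setup", "(Ljava/lang/Object;)V", native_setup_impl},
};

// What RegisterNatives conceptually enables: resolving a Java method name
// to its C++ entry point via the table.
int dispatch(const std::string& javaName) {
    for (const NativeMethod& m : gToyMethods) {
        if (javaName == m.javaName) return m.fnPtr();
    }
    return -1;  // no such native method registered
}
```

In the real file, AndroidRuntime::registerNativeMethods hands gMethods to the JVM once, and from then on the VM performs this lookup itself.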

So the upper-layer names are remapped here in one table (if JNI's syntax rules are unfamiliar, look them up yourself; we do not need to dig into them here). Mystery solved. android_media_MediaPlayer_native_init() merely sets up JNI-layer state, so check it yourself if interested; let's look at android_media_MediaPlayer_native_setup():

// Path: xref: /frameworks/base/media/jni/android_media_MediaPlayer.cpp
static void
android_media_MediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
    ALOGV("native_setup");
    sp<MediaPlayer> mp = new MediaPlayer();
    if (mp == NULL) {
        jniThrowException(env, "java/lang/RuntimeException", "Out of memory");
        return;
    }

    // create new listener and give it to MediaPlayer
    sp<JNIMediaPlayerListener> listener = new JNIMediaPlayerListener(env, thiz, weak_this);
    mp->setListener(listener);

    // Stow our new C++ MediaPlayer in an opaque field in the Java object.
    setMediaPlayer(env, thiz, mp);
}

It first creates a MediaPlayer, but this one is the C++ version, which will talk to MediaPlayerService. It then calls setMediaPlayer() to associate this mp with the upper Java layer (remember this spot; it comes up again later).

Good. Next we track down android_media_MediaPlayer_setDataSourceFD(); here is the code:

static void
android_media_MediaPlayer_setDataSourceFD(JNIEnv *env, jobject thiz, jobject fileDescriptor, jlong offset, jlong length)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL ) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }
    if (fileDescriptor == NULL) {
        jniThrowException(env, "java/lang/IllegalArgumentException", NULL);
        return;
    }
    int fd = jniGetFDFromFileDescriptor(env, fileDescriptor);
    ALOGV("setDataSourceFD: fd %d", fd);
    process_media_player_call( env, thiz, mp->setDataSource(fd, offset, length),
            "java/io/IOException", "setDataSourceFD failed." );
}

A few notes are in order. JNIEnv is used for calling back up into the Java layer, jobject is the Java object, and the remaining three parameters speak for themselves. What is sp<>? Relax: it is a strong pointer, and its sibling wp<> is a weak pointer. Both are C++ template classes that wrap raw pointers; for our purposes we can read them as plain pointers, and AOSP uses them everywhere. The difference is that an sp<> holds a strong reference that keeps the object alive, while a wp<> does not keep the object alive and must be promoted to an sp<> before use.
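AOSP's sp<>/wp<> are closely analogous to std::shared_ptr/std::weak_ptr: a strong reference keeps the object alive, while a weak one must be promoted (wp<T>::promote() in AOSP, lock() in the standard library) before the object can be touched. A sketch of that behavior using the standard library (the Player type is invented):

```cpp
#include <memory>

struct Player { int id = 42; };

// A weak reference must be "promoted" to a strong one before use,
// much like wp<T>::promote() returning an sp<T> in AOSP.
int readThroughWeak(const std::weak_ptr<Player>& weak) {
    if (std::shared_ptr<Player> strong = weak.lock()) {  // promote
        return strong->id;
    }
    return -1;  // the object has already been destroyed
}
```

Once the last strong reference drops, the object is gone and every weak reference fails to promote, which is exactly why AOSP hands wp<Client> into mClients while the caller keeps the sp<Client>.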

Back to the function: it calls getMediaPlayer() to fetch the C++ MediaPlayer we new'ed above, then calls that C++ MediaPlayer's setDataSource(); finally process_media_player_call() reports the result back up to the Java layer. With that we formally enter the MediaPlayer framework layer. Here is the class diagram:

The MediaPlayer framework uses a client/server architecture spanning two processes, with the Binder mechanism for inter-process communication. Most native classes come in IXXX, BpXXX, BnXXX flavors. In the MediaPlayer framework, the three pillars IMediaPlayer, IMediaPlayerService, and IMediaPlayerClient form the skeleton, glued together by IBinder, BBinder (which might more aptly be named BnBinder), and BpBinder. An IXXX class contains only pure virtual functions with no definitions; BpXXX and BnXXX inherit from it, with BpXXX acting as the client-side proxy that issues service requests, while the service implementations live uniformly on the BnXXX side.

MediaPlayer.cpp:

setDataSource():

Continuing from above, here is MediaPlayer's setDataSource():

// Path: xref: /frameworks/av/media/libmedia/mediaplayer.cpp
status_t MediaPlayer::setDataSource(int fd, int64_t offset, int64_t length)
{
    ALOGV("setDataSource(%d, %" PRId64 ", %" PRId64 ")", fd, offset, length);
    status_t err = UNKNOWN_ERROR;
    const sp<IMediaPlayerService> service(getMediaPlayerService());
    if (service != 0) {
        sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
        if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
            (NO_ERROR != player->setDataSource(fd, offset, length))) {
            player.clear();
        }
        err = attachNewPlayer(player);
    }
    return err;
}

getMediaPlayerService(): one glance tells you this is a service request. MediaPlayer.cpp has no implementation of it, so we look in its parent class IMediaDeathNotifier, and there it is!

// Path: /frameworks/av/media/libmedia/IMediaDeathNotifier.cpp
/*static*/const sp<IMediaPlayerService>
IMediaDeathNotifier::getMediaPlayerService()
{
    ALOGV("getMediaPlayerService");
    Mutex::Autolock _l(sServiceLock);
    if (sMediaPlayerService == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.player"));
            if (binder != 0) {
                break;
            }
            ALOGW("Media player service not published, waiting...");
            usleep(500000); // 0.5 s
        } while (true);

        if (sDeathNotifier == NULL) {
            sDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(sDeathNotifier);
        sMediaPlayerService = interface_cast<IMediaPlayerService>(binder);
    }
    ALOGE_IF(sMediaPlayerService == 0, "no media player service!?");
    return sMediaPlayerService;
}
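The publish-and-wait loop above boils down to "look the name up; if it is not there yet, wait and retry." A minimal sketch with a plain in-memory map standing in for ServiceManager (the registry type, return codes, and attempt counting are invented for illustration; the real code sleeps 0.5 s between attempts and never gives up):

```cpp
#include <map>
#include <string>

// Hypothetical service registry standing in for the Binder ServiceManager.
using Registry = std::map<std::string, int>;

// Mirrors the shape of getMediaPlayerService(): poll until the service is
// published, recording how many lookups were made. Returns the "binder
// handle" on success, -1 if the service never appears within maxAttempts.
int waitForService(const Registry& registry, const std::string& name,
                   int maxAttempts, int* attempts) {
    for (*attempts = 1; *attempts <= maxAttempts; ++(*attempts)) {
        auto it = registry.find(name);
        if (it != registry.end()) return it->second;  // found: stop polling
        // real code: ALOGW("... not published, waiting..."); usleep(500000);
    }
    return -1;  // not published in time
}
```

The caching of sMediaPlayerService plus linkToDeath() means later callers skip this loop entirely until the service process dies.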

This is the client requesting the service: defaultServiceManager() returns the IServiceManager, whose getService() checks whether "media.player" has been registered; if so, it returns the corresponding IBinder for the client to communicate with. Then interface_cast converts that IBinder into a pointer to the service interface IMediaPlayerService. But what is this interface_cast? A forced type conversion? No, no, that is missing the forest for a leaf. Here is its definition:

Path: frameworks/native/include/binder/IInterface.h
template<typename INTERFACE>
inline sp<INTERFACE> interface_cast(const sp<IBinder>& obj)
{
    return INTERFACE::asInterface(obj);
}

So it simply forwards to the interface's own asInterface(), i.e. IMediaPlayerService::asInterface(). Chasing further (I will spare you the code), you will find IMediaPlayerService contains no definition of this function either. What gives? Look in the parent class, and a comparison reveals the trick:

/frameworks/native/include/binder/IInterface.h
// ----------------------------------------------------------------------
#define DECLARE_META_INTERFACE(INTERFACE)                               \
    static const ::android::String16 descriptor;                        \
    static ::android::sp<I##INTERFACE> asInterface(                     \
            const ::android::sp<::android::IBinder>& obj);              \
    virtual const ::android::String16& getInterfaceDescriptor() const;  \
    I##INTERFACE();                                                     \
    virtual ~I##INTERFACE();                                            \

#define IMPLEMENT_META_INTERFACE(INTERFACE, NAME)                       \
    const ::android::String16 I##INTERFACE::descriptor(NAME);           \
    const ::android::String16&                                          \
            I##INTERFACE::getInterfaceDescriptor() const {              \
        return I##INTERFACE::descriptor;                                \
    }                                                                   \
    ::android::sp<I##INTERFACE> I##INTERFACE::asInterface(              \
            const ::android::sp<::android::IBinder>& obj)               \
    {                                                                   \
        ::android::sp<I##INTERFACE> intr;                               \
        if (obj != NULL) {                                              \
            intr = static_cast<I##INTERFACE*>(                          \
                obj->queryLocalInterface(                               \
                        I##INTERFACE::descriptor).get());               \
            if (intr == NULL) {                                         \
                intr = new Bp##INTERFACE(obj);                          \
            }                                                           \
        }                                                               \
        return intr;                                                    \
    }                                                                   \
    I##INTERFACE::I##INTERFACE() { }                                    \
    I##INTERFACE::~I##INTERFACE() { }                                   \

#define CHECK_INTERFACE(interface, data, reply)                         \
    if (!(data).checkInterface(this)) { return PERMISSION_DENIED; }     \

IInterface contains this odd-looking block; look closely and it turns out to be a matching pair of macros, one declaring and one implementing, and IMediaPlayerService happens to invoke exactly these two macros! Everything falls into place. Substitute IMediaPlayerService into the macros and you can see the full IBinder-to-IMediaPlayerService conversion; I will not paste the expansion here.
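The key trick is the ## token-pasting operator: passing MediaPlayerService as INTERFACE glues together names like IMediaPlayerService and BpMediaPlayerService, so one macro invocation per interface generates all the boilerplate. A toy macro pair (not the real Binder macros; the names and the asInterfaceName() helper are invented) shows the mechanics:

```cpp
#include <string>

// Toy versions of DECLARE/IMPLEMENT_META_INTERFACE demonstrating ## pasting
// and # stringizing: INTERFACE is glued into class and member names.
#define TOY_DECLARE_META_INTERFACE(INTERFACE)            \
    struct I##INTERFACE {                                \
        static std::string descriptor;                   \
        static std::string asInterfaceName();            \
    };

#define TOY_IMPLEMENT_META_INTERFACE(INTERFACE, NAME)    \
    std::string I##INTERFACE::descriptor = NAME;         \
    std::string I##INTERFACE::asInterfaceName() {        \
        return "Bp" #INTERFACE;                          \
    }

// One invocation per interface, just as IMediaPlayerService does with the
// real macros; this expands into a complete IMediaPlayerService struct.
TOY_DECLARE_META_INTERFACE(MediaPlayerService)
TOY_IMPLEMENT_META_INTERFACE(MediaPlayerService, "android.media.IMediaPlayerService")
```

The real IMPLEMENT_META_INTERFACE additionally defines asInterface() itself, which either finds a local (same-process) implementation via queryLocalInterface() or wraps the remote IBinder in a new Bp##INTERFACE proxy.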

We digressed a bit. Through getMediaPlayerService() we obtained the service registered under the name "media.player" and converted it via interface_cast into an IMediaPlayerService pointer. Onward:

sp<IMediaPlayer> player(service->create(this, mAudioSessionId));

So it calls IMediaPlayerService's create(), through the Bp proxy. Let's look:

Path: /frameworks/av/media/libmedia/IMediaPlayerService.cpp
virtual sp<IMediaPlayer> create(const sp<IMediaPlayerClient>& client,
        audio_session_t audioSessionId) {
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
    data.writeStrongBinder(IInterface::asBinder(client));
    data.writeInt32(audioSessionId);

    remote()->transact(CREATE, data, &reply);
    return interface_cast<IMediaPlayer>(reply.readStrongBinder());
}

asBinder() converts the client directly into a Binder object without going through the ServiceManager intermediary, which makes it an anonymous Binder, usable only between these two processes. Take a look:

// static
sp<IBinder> IInterface::asBinder(const sp<IInterface>& iface)
{
    if (iface == NULL) return NULL;
    return iface->onAsBinder();
}

template<typename INTERFACE>
inline IBinder* BpInterface<INTERFACE>::onAsBinder()
{
    return remote();
}

remote() returns the BpBinder, the proxy for the remote end. remote()->transact() deserves a closer look:

1. BpBinder and BBinder are Android's abstractions over the Binder mechanism; both derive from IBinder.

2. remote() is implemented in BpRefBase, the base of the Bp classes, and returns a BpBinder.

3. BpBinder's transact() implementation simply calls IPCThreadState::self()->transact() to send the data.

4. After the service side receives the client's request via IPCThreadState, it first calls BBinder's transact() method.

5. BBinder's transact() in turn calls the virtual method onTransact(), which is implemented in BnXXXService.
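Steps 1 through 5 can be sketched with plain virtual dispatch, leaving out the Binder driver round trip entirely; every class, opcode, and "parcel" here is an invented stand-in:

```cpp
#include <string>

// Opcode shared by proxy and stub, like CREATE in IMediaPlayerService.
enum { CREATE = 1 };

// Stand-in for BBinder: transact() hands the request to the virtual
// onTransact(), which the Bn-side subclass overrides (steps 4 and 5).
struct BnService {
    int transact(int code, const std::string& data, std::string* reply) {
        return onTransact(code, data, reply);
    }
    virtual int onTransact(int code, const std::string& data, std::string* reply) = 0;
    virtual ~BnService() {}
};

// Stand-in for BnMediaPlayerService: the actual service logic lives here.
struct BnMediaPlayerServiceLike : BnService {
    int onTransact(int code, const std::string& data, std::string* reply) override {
        if (code == CREATE) {
            *reply = "client:" + data;  // pretend we created a Client
            return 0;                   // NO_ERROR
        }
        return -1;                      // unknown transaction
    }
};

// Stand-in for the Bp proxy: packs arguments and calls remote()->transact().
// Here "remote" is a direct pointer instead of a trip through the kernel.
std::string proxyCreate(BnService* remote, const std::string& clientName) {
    std::string reply;
    remote->transact(CREATE, clientName, &reply);
    return reply;
}
```

In the real system, the proxy and the stub live in different processes and the std::string "parcels" are Parcel objects copied across the Binder driver, but the dispatch shape is the same.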

So we go straight to BnMediaPlayerService and find the CREATE branch of onTransact():

xref: /frameworks/av/media/libmedia/IMediaPlayerService.cpp
status_t BnMediaPlayerService::onTransact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch (code) {
        case CREATE: {
            CHECK_INTERFACE(IMediaPlayerService, data, reply);
            sp<IMediaPlayerClient> client =
                interface_cast<IMediaPlayerClient>(data.readStrongBinder());
            audio_session_t audioSessionId = (audio_session_t) data.readInt32();
            sp<IMediaPlayer> player = create(client, audioSessionId);
            reply->writeStrongBinder(IInterface::asBinder(player));
            return NO_ERROR;
        } break;
        ...
    }
}

It first converts the received Binder back into an sp<IMediaPlayerClient>, then calls create(). But BnMediaPlayerService only implements onTransact(), so this create() must live in a subclass, and sure enough it is in MediaPlayerService:

xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp
sp<IMediaPlayer> MediaPlayerService::create(const sp<IMediaPlayerClient>& client,
        audio_session_t audioSessionId)
{
    pid_t pid = IPCThreadState::self()->getCallingPid();
    int32_t connId = android_atomic_inc(&mNextConnId);

    sp<Client> c = new Client(this, pid, connId, client, audioSessionId,
            IPCThreadState::self()->getCallingUid());

    ALOGD("Create new client(%d) from pid %d, uid %d, ", connId, pid,
            IPCThreadState::self()->getCallingUid());

    wp<Client> w = c;
    {
        Mutex::Autolock lock(mLock);
        mClients.add(w);
    }
    return c;
}

The code is easy to follow: it creates an instance of its own inner class Client and returns the pointer for the remote end to call; this Client exposes most of the interfaces the Java layer uses. Good, now back to where we started:

// Path: /frameworks/av/media/libmedia/mediaplayer.cpp
status_t MediaPlayer::setDataSource(int fd, int64_t offset, int64_t length)
{
    ALOGV("setDataSource(%d, %" PRId64 ", %" PRId64 ")", fd, offset, length);
    status_t err = UNKNOWN_ERROR;
    const sp<IMediaPlayerService> service(getMediaPlayerService()); // obtain the remote service via IPC
    if (service != 0) {
        sp<IMediaPlayer> player(service->create(this, mAudioSessionId)); // have MediaPlayerService create a Client
        if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
            (NO_ERROR != player->setDataSource(fd, offset, length))) { // call the Client's setDataSource()
            player.clear();
        }
        err = attachNewPlayer(player);
    }
    return err;
}

All of this involved the Binder mechanism, a long detour, but we have found our way back.

From here things get much simpler: it directly calls the Client's setDataSource(). Let's look:

MediaPlayerService::Client::setDataSource():

// Path: xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp
status_t MediaPlayerService::Client::setDataSource(int fd, int64_t offset, int64_t length)
{
    struct stat sb;
    int ret = fstat(fd, &sb);
    if (ret != 0) {
        ALOGE("fstat(%d) failed: %d, %s", fd, ret, strerror(errno));
        return UNKNOWN_ERROR;
    }
    if (offset >= sb.st_size) {
        ALOGE("offset error");
        return UNKNOWN_ERROR;
    }
    if (offset + length > sb.st_size) {
        length = sb.st_size - offset;
        ALOGV("calculated length = %lld", (long long)length);
    }

    player_type playerType = MediaPlayerFactory::getPlayerType(this, fd, offset, length);
    sp<MediaPlayerBase> p = setDataSource_pre(playerType);
    if (p == NULL) {
        return NO_INIT;
    }

    // now set data source
    mStatus = setDataSource_post(p, p->setDataSource(fd, offset, length));
    ALOGD("[%d] %s done", mConnId, __FUNCTION__);
    return mStatus;
}

It first calls MediaPlayerFactory::getPlayerType() to determine the player type. Here is the implementation:

// Path: xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerFactory.cpp
#define GET_PLAYER_TYPE_IMPL(a...)                      \
    Mutex::Autolock lock_(&sLock);                      \
                                                        \
    player_type ret = STAGEFRIGHT_PLAYER;               \
    float bestScore = 0.0;                              \
                                                        \
    for (size_t i = 0; i < sFactoryMap.size(); ++i) {   \
                                                        \
        IFactory* v = sFactoryMap.valueAt(i);           \
        float thisScore;                                \
        CHECK(v != NULL);                               \
                                                        \
        thisScore = v->scoreFactory(a, bestScore);      \
        if (thisScore > bestScore) {                    \
            ret = sFactoryMap.keyAt(i);                 \
            bestScore = thisScore;                      \
        }                                               \
    }                                                   \
                                                        \
    if (0.0 == bestScore) {                             \
        ret = getDefaultPlayerType();                   \
    }                                                   \
                                                        \
    return ret;

player_type MediaPlayerFactory::getPlayerType(const sp<IMediaPlayer>& client,
                                              int fd,
                                              int64_t offset,
                                              int64_t length) {
    GET_PLAYER_TYPE_IMPL(client, fd, offset, length);
}
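Stripped of macros and locking, GET_PLAYER_TYPE_IMPL is a best-score election: every registered factory bids a score and the highest bidder wins, with a fallback when nobody bids above zero. A sketch with a hypothetical score map standing in for sFactoryMap (the map type and function name are invented):

```cpp
#include <map>
#include <string>

// Hypothetical per-factory scores keyed by player type name.
using ScoreMap = std::map<std::string, float>;

// Mirrors GET_PLAYER_TYPE_IMPL: pick the highest-scoring type; if every
// factory scored 0.0, fall back to the default player type.
std::string pickPlayer(const ScoreMap& scores, const std::string& defaultType) {
    std::string best = defaultType;
    float bestScore = 0.0f;
    for (const auto& kv : scores) {
        if (kv.second > bestScore) {
            bestScore = kv.second;
            best = kv.first;
        }
    }
    return best;  // defaultType when all scores were 0.0
}
```

In the real code each factory inspects the actual data source (fd, offset, length) to compute its bid, as the NuPlayerFactory code below shows.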

This is a scoring mechanism: sFactoryMap stores the factory class for each player type, and scoreFactory() is used to find the factory with the highest score for the given source, which is then returned. In practice, both network streams and local files usually end up with NuPlayerFactory; here is its scoring logic:

virtual float scoreFactory(const sp<IMediaPlayer>& /*client*/,
                           int fd,
                           int64_t offset,
                           int64_t length,
                           float /*curScore*/) {
    char buf[20];
    lseek(fd, offset, SEEK_SET);
    read(fd, buf, sizeof(buf));
    lseek(fd, offset, SEEK_SET);

    uint32_t ident = *((uint32_t*)buf);

    // Ogg vorbis?
    if (ident == 0x5367674f) // 'OggS': use NuPlayer to play OGG clips
        return 1.0;

    if (ident == 0xc450fbff && length == 40541) // FIXME: for CTS testGapless1 test work around
        return 1.0;

    if (!memcmp(buf+3, " ftypM4A ", 9) && (length == 46180 || length == 83892)) // FIXME: for CTS testGapless2&3 test work around
        return 1.0;

    // FIXME: for AN7 CTS closed caption tests work around
    //        testChangeSubtitleTrack
    //        testDeselectTrackForSubtitleTracks
    //        testGetTrackInfoForVideoWithSubtitleTracks
    if (length == 210936)
        return 1.0;

    // FIXME: for AN7 CTS NativeDecoderTest#testMuxerVp8
    if (length == 2829176)
        return 1.0;

    // FIXME: for AN8 CTS android.media.cts.MediaPlayerDrmTest
    if (length == 3012095) {
        return 1.0;
    }

    // FIXME: for AN8 GTS MediaPlayerTest
    if (length == 1481047 || length == 8127587)
        return 1.0;

    return 0.0;
}

It assigns a score per recognized type and returns it. I will not explain every branch, because some of those special cases baffle me too ^_^. With the player type determined, the player itself is created by the call:

sp<MediaPlayerBase> p = setDataSource_pre(playerType)

Let's look at the definition of setDataSource_pre():

sp<MediaPlayerBase> MediaPlayerService::Client::setDataSource_pre(player_type playerType)
{
    ALOGV("player type = %d", playerType);

    // MStar Android Patch Begin
    if (playerType == MST_PLAYER) {
        mIsMstplayer = true;
    }
    // MStar Android Patch End

    // create the right type of player
    sp<MediaPlayerBase> p = createPlayer(playerType);
    if (p == NULL) {
        return p;
    }

    sp<IServiceManager> sm = defaultServiceManager();
    sp<IBinder> binder = sm->getService(String16("media.extractor"));
    if (binder == NULL) {
        ALOGE("extractor service not available");
        return NULL;
    }
    sp<ServiceDeathNotifier> extractorDeathListener =
            new ServiceDeathNotifier(binder, p, MEDIAEXTRACTOR_PROCESS_DEATH);
    binder->linkToDeath(extractorDeathListener);

    sp<ServiceDeathNotifier> codecDeathListener;
    if (property_get_bool("persist.media.treble_omx", true)) {
        // Treble IOmx
        sp<IOmx> omx = IOmx::getService();
        if (omx == nullptr) {
            ALOGE("Treble IOmx not available");
            return NULL;
        }
        codecDeathListener = new ServiceDeathNotifier(omx, p, MEDIACODEC_PROCESS_DEATH);
        omx->linkToDeath(codecDeathListener, 0);
    } else {
        // Legacy IOMX
        binder = sm->getService(String16("media.codec"));
        if (binder == NULL) {
            ALOGE("codec service not available");
            return NULL;
        }
        codecDeathListener = new ServiceDeathNotifier(binder, p, MEDIACODEC_PROCESS_DEATH);
        binder->linkToDeath(codecDeathListener);
    }

    Mutex::Autolock lock(mLock);
    clearDeathNotifiers_l();
    mExtractorDeathListener = extractorDeathListener;
    mCodecDeathListener = codecDeathListener;

    if (!p->hardwareOutput()) {
        mAudioOutput = new AudioOutput(mAudioSessionId, IPCThreadState::self()->getCallingUid(),
                mPid, mAudioAttributes);
        static_cast<MediaPlayerInterface*>(p.get())->setAudioSink(mAudioOutput);
    }

    return p;
}

That is a lot of code, so focus on the essential part: what is the returned p? sp<MediaPlayerBase> p = createPlayer(playerType). So let's jump there:

sp<MediaPlayerBase> MediaPlayerService::Client::createPlayer(player_type playerType)
{
    // determine if we have the right player type
    sp<MediaPlayerBase> p = mPlayer;
    if ((p != NULL) && (p->playerType() != playerType)) {
        ALOGV("delete player");
        p.clear();
    }
    if (p == NULL) {
        p = MediaPlayerFactory::createPlayer(playerType, this, notify, mPid);
    }

    if (p != NULL) {
        p->setUID(mUid);
    }

    return p;
}

mPlayer is the "player" used by the previous playback, if any. The key line is p = MediaPlayerFactory::createPlayer(playerType, this, notify, mPid): another call into MediaPlayerFactory. Does it create a player from the player type? Let's see:

sp<MediaPlayerBase> MediaPlayerFactory::createPlayer(player_type playerType,
                                                     void* cookie,
                                                     notify_callback_f notifyFunc,
                                                     pid_t pid) {
    sp<MediaPlayerBase> p;
    IFactory* factory;
    status_t init_result;
    Mutex::Autolock lock_(&sLock);

    factory = sFactoryMap.valueFor(playerType);
    CHECK(NULL != factory);
    p = factory->createPlayer(pid);
    //....
    return p;
}

The code is simple: look up the matching player factory in the factory map, then have the factory create a "player". Here are the registered factory types:

typedef KeyedVector<player_type, IFactory*> tFactoryMap;

void MediaPlayerFactory::registerBuiltinFactories() {
    Mutex::Autolock lock_(&sLock);

    if (sInitComplete)
        return;

    registerFactory_l(new NuPlayerFactory(), NU_PLAYER);
    registerFactory_l(new TestPlayerFactory(), TEST_PLAYER);
    // MStar Android Patch Begin
#ifdef BUILD_WITH_MSTAR_MM
    registerFactory_l(new MstPlayerFactory(), MST_PLAYER);
#ifdef SUPPORT_IMAGEPLAYER
    registerFactory_l(new ImagePlayerFactory(), IMG_PLAYER);
#endif
#ifdef SUPPORT_MSTARPLAYER
    registerFactory_l(new MstarPlayerFactory(), MSTAR_PLAYER);
#endif
#endif
    // MStar Android Patch End

    sInitComplete = true;
}

So five player types are registered in this (vendor-patched) build, but the one we use most is NuPlayer. Continuing, NuPlayerFactory's createPlayer() is called:

virtual sp<MediaPlayerBase> createPlayer(pid_t pid) {
    ALOGV(" create NuPlayer");
    return new NuPlayerDriver(pid);
}

How perfunctory: it just news a NuPlayerDriver. Where is our NuPlayer player? Relax, look at NuPlayerDriver's constructor:

NuPlayerDriver::NuPlayerDriver(pid_t pid)
    : mState(STATE_IDLE),
      mLooper(new ALooper),
      mPlayer(new NuPlayer(pid)),
      mAutoLoop(false) {
}
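The shape here is a thin driver that owns the real engine, constructing it in the initializer list. A minimal sketch with invented names (EnginePlayer and Driver stand in for NuPlayer and NuPlayerDriver; the real classes are far richer):

```cpp
#include <memory>

// Stand-in for NuPlayer: the engine that does the real work.
struct EnginePlayer {
    explicit EnginePlayer(int pid) : mPid(pid) {}
    int pid() const { return mPid; }
private:
    int mPid;
};

// Stand-in for NuPlayerDriver: a thin adapter whose constructor creates the
// engine in its initializer list, the same shape as above.
struct Driver {
    explicit Driver(int pid)
        : mState(0),                                               // STATE_IDLE
          mPlayer(std::unique_ptr<EnginePlayer>(new EnginePlayer(pid)))
    {}
    int enginePid() const { return mPlayer->pid(); }
    int mState;
private:
    std::unique_ptr<EnginePlayer> mPlayer;  // the driver owns the engine
};
```

The payoff of this split is that the driver can present MediaPlayerBase's synchronous interface while the engine behind it runs asynchronously on its own looper.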

So the constructor's initializer list creates the actual player. Now it is clear: setDataSource_pre() returned a NuPlayerDriver. Back to the Client's setDataSource():

// Path: xref: /frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp
status_t MediaPlayerService::Client::setDataSource(int fd, int64_t offset, int64_t length)
{
    //...
    player_type playerType = MediaPlayerFactory::getPlayerType(this, fd, offset, length);
    sp<MediaPlayerBase> p = setDataSource_pre(playerType);
    if (p == NULL) {
        return NO_INIT;
    }

    // now set data source
    mStatus = setDataSource_post(p, p->setDataSource(fd, offset, length));
    ALOGD("[%d] %s done", mConnId, __FUNCTION__);
    return mStatus;
}

It calls NuPlayerDriver's setDataSource(), so we are about to enter Android's built-in player:

// Path: xref: /frameworks/av/media/libmediaplayerservice/nuplayer/NuPlayerDriver.cpp
status_t NuPlayerDriver::setDataSource(int fd, int64_t offset, int64_t length) {
    ALOGV("setDataSource(%p) file(%d)", this, fd);
    Mutex::Autolock autoLock(mLock);

    if (mState != STATE_IDLE) {
        return INVALID_OPERATION;
    }

    mState = STATE_SET_DATASOURCE_PENDING;

    mPlayer->setDataSourceAsync(fd, offset, length);

    while (mState == STATE_SET_DATASOURCE_PENDING) {
        mCondition.wait(mLock);
    }

    return mAsyncResult;
}

As we saw earlier, mPlayer is a NuPlayer, so NuPlayerDriver calls NuPlayer's setDataSourceAsync(). With that we are inside the built-in player. Other calls follow much the same route, though none are as involved as setDataSource(); hold on to this one thread and the framework's call relationships fall into place.
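The blocking pattern in setDataSource() above, set the state to PENDING, kick off the async work, then sleep on a condition variable until the state changes, can be sketched with the standard library. The state values and error codes here are invented, and a std::thread stands in for the message NuPlayer posts back on its looper:

```cpp
#include <condition_variable>
#include <mutex>
#include <thread>

enum State { STATE_IDLE, STATE_SET_DATASOURCE_PENDING, STATE_INITIALIZED };

// Sketch of NuPlayerDriver::setDataSource()'s shape: mark the state PENDING,
// fire the asynchronous work, then wait on a condition variable until the
// "player" side flips the state and signals completion.
struct AsyncDriver {
    std::mutex mLock;
    std::condition_variable mCondition;
    State mState = STATE_IDLE;
    int mAsyncResult = -1;

    int setDataSource() {
        std::unique_lock<std::mutex> lock(mLock);
        if (mState != STATE_IDLE) return -2;  // like INVALID_OPERATION
        mState = STATE_SET_DATASOURCE_PENDING;

        // The real code posts a message to NuPlayer's looper; a thread
        // stands in for the asynchronous completion callback here.
        std::thread worker([this] {
            std::lock_guard<std::mutex> g(mLock);
            mState = STATE_INITIALIZED;
            mAsyncResult = 0;  // success
            mCondition.notify_all();
        });

        // wait() releases mLock while sleeping, letting the worker proceed;
        // the while loop guards against spurious wakeups.
        while (mState == STATE_SET_DATASOURCE_PENDING) {
            mCondition.wait(lock);
        }
        worker.join();
        return mAsyncResult;
    }
};
```

This is how the driver offers a synchronous MediaPlayerBase API on top of NuPlayer's fully asynchronous, message-driven internals.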


NuPlayer:
