How can I play multiple videos in a ListView using MediaPlayer?

I am trying to implement a ListView with videos as its elements. I am using this project to display the videos on a TextureView; it uses MediaPlayer underneath. When two videos are loaded at the same time, it fails (most of the time).

The error I get is:

TextureVideoView error. File or network related operation errors. MediaPlayer: error (1, -2147479551) 

This also happens when the files are loaded from disk.

In the error handling section I try to set the URL again. Then I mostly get the

 E/BufferQueueProducer: [unnamed-30578-12] disconnect(P): connected to another API (cur=0 req=3) 

error. What is unclear to me is that setting some arbitrary video from the network will work, but retrying the same URL will fail.

So in the OnErrorListener:

 textureView.setVideo(item.getUriMp4(),MediaFensterPlayerController.DEFAULT_VIDEO_START); 

will fail, but:

 textureView.setVideo("http://different.video" ... ) 

will work just fine.
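For context, the listener is wired up roughly like this. This is only a sketch: it assumes the TextureVideoView exposes the standard MediaPlayer.OnErrorListener, and the setVideo call is the one from the snippets above.

    textureView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
        @Override
        public boolean onError(MediaPlayer mp, int what, int extra) {
            // Retrying the same URI here fails with the BufferQueueProducer error above,
            // while pointing the view at a different URL works.
            textureView.setVideo(item.getUriMp4(), MediaFensterPlayerController.DEFAULT_VIDEO_START);
            return true;   // signal that the error was handled
        }
    });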

It is also not an issue with a particular file, since scrolling through different video files fails as well. Sometimes the ones that failed will work the next time.

I also tried a MediaCodec/MediaExtractor combination instead of the MediaPlayer approach, but I ran into what look like device-specific platform bugs.

Any hints? Any suggestions?

Thanks,

W.

Instead of a library, you could try this; it is taken from Google's samples on GitHub:

It decodes two video streams simultaneously into two TextureViews.

One key feature is that the video decoders are not stopped when the Activity is restarted due to an orientation change. This is intended to simulate playback of a real-time video stream. If the Activity is paused because it is finishing (indicating that we are leaving the Activity for a while), the video decoders are shut down.

TODO: consider shutting down when the screen is turned off, to save battery.
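One possible way to handle that TODO (not part of the sample below) is to listen for ACTION_SCREEN_OFF and stop playback when the screen turns off. A sketch, placed inside the Activity; the stopAllPlayback() helper is hypothetical and would correspond to calling stopPlayback() on each VideoBlob:

    private final BroadcastReceiver mScreenOffReceiver = new BroadcastReceiver() {
        @Override
        public void onReceive(Context context, Intent intent) {
            if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
                stopAllPlayback();   // hypothetical helper
            }
        }
    };

    @Override
    protected void onStart() {
        super.onStart();
        registerReceiver(mScreenOffReceiver, new IntentFilter(Intent.ACTION_SCREEN_OFF));
    }

    @Override
    protected void onStop() {
        super.onStop();
        unregisterReceiver(mScreenOffReceiver);
    }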

Java:

DoubleDecodeActivity.java

    public class DoubleDecodeActivity extends Activity {
        private static final String TAG = MainActivity.TAG;

        private static final int VIDEO_COUNT = 2;       // How many videos to play simultaneously.

        // Must be static storage so they'll survive Activity restart.
        private static boolean sVideoRunning = false;
        private static VideoBlob[] sBlob = new VideoBlob[VIDEO_COUNT];

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_double_decode);

            if (!sVideoRunning) {
                sBlob[0] = new VideoBlob((TextureView) findViewById(R.id.double1_texture_view),
                        ContentManager.MOVIE_SLIDERS, 0);
                sBlob[1] = new VideoBlob((TextureView) findViewById(R.id.double2_texture_view),
                        ContentManager.MOVIE_EIGHT_RECTS, 1);
                sVideoRunning = true;
            } else {
                sBlob[0].recreateView((TextureView) findViewById(R.id.double1_texture_view));
                sBlob[1].recreateView((TextureView) findViewById(R.id.double2_texture_view));
            }
        }

        @Override
        protected void onPause() {
            super.onPause();

            boolean finishing = isFinishing();
            Log.d(TAG, "isFinishing: " + finishing);
            for (int i = 0; i < VIDEO_COUNT; i++) {
                if (finishing) {
                    sBlob[i].stopPlayback();
                    sBlob[i] = null;
                }
            }
            sVideoRunning = !finishing;
            Log.d(TAG, "onPause complete");
        }

        /**
         * Video playback blob.
         * <p>
         * Encapsulates the video decoder and playback surface.
         * <p>
         * We want to avoid tearing down and recreating the video decoder on orientation changes,
         * because it can be expensive to do so. That means keeping the decoder's output Surface
         * around, which means keeping the SurfaceTexture around.
         * <p>
         * It's possible that the orientation change will cause the UI thread's EGL context to be
         * torn down and recreated (the app framework docs don't seem to make any guarantees here),
         * so we need to detach the SurfaceTexture from EGL on destroy, and reattach it when
         * the new SurfaceTexture becomes available. Happily, TextureView does this for us.
         */
        private static class VideoBlob implements TextureView.SurfaceTextureListener {
            private final String LTAG;
            private TextureView mTextureView;
            private int mMovieTag;

            private SurfaceTexture mSavedSurfaceTexture;
            private PlayMovieThread mPlayThread;
            private SpeedControlCallback mCallback;

            /**
             * Constructs the VideoBlob.
             *
             * @param view The TextureView object we want to draw into.
             * @param movieTag Which movie to play.
             * @param ordinal The blob's ordinal (only used for log messages).
             */
            public VideoBlob(TextureView view, int movieTag, int ordinal) {
                LTAG = TAG + ordinal;
                Log.d(LTAG, "VideoBlob: tag=" + movieTag + " view=" + view);
                mMovieTag = movieTag;

                mCallback = new SpeedControlCallback();

                recreateView(view);
            }

            /**
             * Performs partial construction. The VideoBlob is already created, but the Activity
             * was recreated, so we need to update our view.
             */
            public void recreateView(TextureView view) {
                Log.d(LTAG, "recreateView: " + view);
                mTextureView = view;
                mTextureView.setSurfaceTextureListener(this);
                if (mSavedSurfaceTexture != null) {
                    Log.d(LTAG, "using saved st=" + mSavedSurfaceTexture);
                    view.setSurfaceTexture(mSavedSurfaceTexture);
                }
            }

            /**
             * Stop playback and shut everything down.
             */
            public void stopPlayback() {
                Log.d(LTAG, "stopPlayback");
                mPlayThread.requestStop();
                // TODO: wait for the playback thread to stop so we don't kill the Surface
                //       before the video stops

                // We don't need this any more, so null it out. This also serves as a signal
                // to let onSurfaceTextureDestroyed() know that it can tell TextureView to
                // free the SurfaceTexture.
                mSavedSurfaceTexture = null;
            }

            @Override
            public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
                Log.d(LTAG, "onSurfaceTextureAvailable size=" + width + "x" + height + ", st=" + st);

                // If this is our first time though, we're going to use the SurfaceTexture that
                // the TextureView provided. If not, we're going to replace the current one with
                // the original.

                if (mSavedSurfaceTexture == null) {
                    mSavedSurfaceTexture = st;

                    File sliders = ContentManager.getInstance().getPath(mMovieTag);
                    mPlayThread = new PlayMovieThread(sliders, new Surface(st), mCallback);
                } else {
                    // Can't do it here in Android <= 4.4. The TextureView doesn't add a
                    // listener on the new SurfaceTexture, so it never sees any updates.
                    // Needs to happen from activity onCreate() -- see recreateView().
                    //Log.d(LTAG, "using saved st=" + mSavedSurfaceTexture);
                    //mTextureView.setSurfaceTexture(mSavedSurfaceTexture);
                }
            }

            @Override
            public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) {
                Log.d(LTAG, "onSurfaceTextureSizeChanged size=" + width + "x" + height + ", st=" + st);
            }

            @Override
            public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
                Log.d(LTAG, "onSurfaceTextureDestroyed st=" + st);
                // The SurfaceTexture is already detached from the EGL context at this point, so
                // we don't need to do that.
                //
                // The saved SurfaceTexture will be null if we're shutting down, so we want to
                // return "true" in that case (indicating that TextureView can release the ST).
                return (mSavedSurfaceTexture == null);
            }

            @Override
            public void onSurfaceTextureUpdated(SurfaceTexture st) {
                //Log.d(TAG, "onSurfaceTextureUpdated st=" + st);
            }
        }

        /**
         * Thread object that plays a movie from a file to a surface.
         * <p>
         * Currently loops until told to stop.
         */
        private static class PlayMovieThread extends Thread {
            private final File mFile;
            private final Surface mSurface;
            private final SpeedControlCallback mCallback;
            private MoviePlayer mMoviePlayer;

            /**
             * Creates thread and starts execution.
             * <p>
             * The object takes ownership of the Surface, and will access it from the new thread.
             * When playback completes, the Surface will be released.
             */
            public PlayMovieThread(File file, Surface surface, SpeedControlCallback callback) {
                mFile = file;
                mSurface = surface;
                mCallback = callback;

                start();
            }

            /**
             * Asks MoviePlayer to halt playback. Returns without waiting for playback to halt.
             * <p>
             * Call from UI thread.
             */
            public void requestStop() {
                mMoviePlayer.requestStop();
            }

            @Override
            public void run() {
                try {
                    mMoviePlayer = new MoviePlayer(mFile, mSurface, mCallback);
                    mMoviePlayer.setLoopMode(true);
                    mMoviePlayer.play();
                } catch (IOException ioe) {
                    Log.e(TAG, "movie playback failed", ioe);
                } finally {
                    mSurface.release();
                    Log.d(TAG, "PlayMovieThread stopping");
                }
            }
        }
    }

XML:

activity_double_decode.xml

    <?xml version="1.0" encoding="utf-8"?>
    <!-- portrait layout -->
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:baselineAligned="false"
        android:orientation="vertical" >

        <LinearLayout
            android:layout_width="fill_parent"
            android:layout_height="fill_parent"
            android:orientation="horizontal"
            android:layout_weight="1"
            android:layout_marginBottom="8dp" >

            <TextureView
                android:id="@+id/double1_texture_view"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content" />
        </LinearLayout>

        <LinearLayout
            android:layout_width="fill_parent"
            android:layout_height="fill_parent"
            android:orientation="horizontal"
            android:layout_weight="1" >

            <TextureView
                android:id="@+id/double2_texture_view"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content" />
        </LinearLayout>
    </LinearLayout>

Add all the video paths to an array or ArrayList and set an OnCompletionListener on the MediaPlayer (MediaPlayer.setOnCompletionListener). When the current media finishes playing, that callback is invoked; from there, initialize the player with the next media item and call start().

I am only giving you the logic, but I hope it works; a rough sketch follows.
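A minimal sketch of that logic, assuming videoPaths holds plain file paths or URLs, that a Surface/SurfaceHolder for display is attached elsewhere, and that the usual android.media, android.util and java.io imports are present. Here the same MediaPlayer instance is reused via reset() rather than creating a fresh one each time:

    final List<String> videoPaths = new ArrayList<>();   // fill with your video paths/URLs
    final int[] current = {0};                           // index of the video currently playing
    final MediaPlayer player = new MediaPlayer();

    player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer mp) {
            current[0]++;
            if (current[0] >= videoPaths.size()) {
                mp.release();                            // nothing left to play
                return;
            }
            try {
                mp.reset();                              // back to idle so a new source can be set
                mp.setDataSource(videoPaths.get(current[0]));
                mp.prepare();
                mp.start();
            } catch (IOException e) {
                Log.e("SequentialPlayback", "failed to start next video", e);
            }
        }
    });

    try {
        player.setDataSource(videoPaths.get(current[0]));
        player.prepare();
        player.start();
    } catch (IOException e) {
        Log.e("SequentialPlayback", "failed to start first video", e);
    }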

Using a VideoView instead, inside the ListView, might work. Have a look here: http://developer.android.com/reference/android/widget/VideoView.html
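A rough, untested sketch of what that could look like as a ListView adapter; R.layout.video_row and R.id.row_video_view are hypothetical resources:

    import android.content.Context;
    import android.net.Uri;
    import android.view.LayoutInflater;
    import android.view.View;
    import android.view.ViewGroup;
    import android.widget.ArrayAdapter;
    import android.widget.VideoView;
    import java.util.List;

    public class VideoAdapter extends ArrayAdapter<Uri> {

        public VideoAdapter(Context context, List<Uri> videoUris) {
            super(context, 0, videoUris);
        }

        @Override
        public View getView(int position, View convertView, ViewGroup parent) {
            if (convertView == null) {
                // R.layout.video_row / R.id.row_video_view are placeholder resources
                convertView = LayoutInflater.from(getContext())
                        .inflate(R.layout.video_row, parent, false);
            }
            VideoView videoView = (VideoView) convertView.findViewById(R.id.row_video_view);
            videoView.setVideoURI(getItem(position));
            videoView.start();   // note: every visible row will try to decode at the same time
            return convertView;
        }
    }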

This question has already received several answers here: stackoverflow.com/questions/31532893/i-want-to-display-multiple-video-in-listview-using-video-but-not-able-to-do-do-this . Unless your question is different or more specific, this thread will be flagged as a duplicate.

Current solution: I suggest JavaCV / OpenCV for playing several videos at once in Java. It supports a range of formats.

Tutorial – http://ganeshtiwaridotcomdotnp.blogspot.co.nz/search/label/OpenCV-JavaCV
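A hedged desktop-Java sketch of this idea using JavaCV's FFmpegFrameGrabber and CanvasFrame (org.bytedeco.javacv), with one grabber, window and thread per video; the file names are placeholders:

    import org.bytedeco.javacv.CanvasFrame;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;

    public class MultiVideoPlayer {

        public static void main(String[] args) {
            // placeholder paths; replace with your own videos
            String[] files = {"video1.mp4", "video2.mp4"};
            for (final String file : files) {
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        play(file);
                    }
                }).start();
            }
        }

        private static void play(String file) {
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(file);
            CanvasFrame canvas = new CanvasFrame(file);   // one window per video
            try {
                grabber.start();
                Frame frame;
                while (canvas.isVisible() && (frame = grabber.grabImage()) != null) {
                    canvas.showImage(frame);
                }
                grabber.stop();
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                canvas.dispose();
            }
        }
    }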

JavaFX can also play some .MP4 video formats.
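For completeness, a minimal JavaFX sketch that plays an MP4 in a MediaView (javafx.scene.media); "video.mp4" is a placeholder path, and support depends on the codecs (typically H.264/AAC):

    import java.io.File;
    import javafx.application.Application;
    import javafx.scene.Scene;
    import javafx.scene.layout.StackPane;
    import javafx.scene.media.Media;
    import javafx.scene.media.MediaPlayer;
    import javafx.scene.media.MediaView;
    import javafx.stage.Stage;

    public class FxVideoDemo extends Application {

        @Override
        public void start(Stage stage) {
            Media media = new Media(new File("video.mp4").toURI().toString());
            MediaPlayer player = new MediaPlayer(media);
            stage.setScene(new Scene(new StackPane(new MediaView(player)), 640, 360));
            stage.show();
            player.play();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }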

Old solution: even though JMF can play multiple videos at the same time, it is outdated and no longer maintained.