Closed: aklisanic closed this issue 8 years ago.
Found it. Updated code to use remote URL.
Could you maybe show what was needed to support this? It would be helpful for others if we could merge it into this framework.
Well, it is just a small change. Instead of a file, I pass a URL to MediaPlayer's player.setDataSource() (around line 170 in VideoPlayerAndroid). So I basically created the same play() method, but it takes a String URL as a parameter instead of a FileHandle. The setOnCompletionListener needs to be changed too.
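Roughly like this (just a sketch of my change; the field and listener names won't match the actual VideoPlayerAndroid code exactly):
// Same idea as play(FileHandle), but taking a remote URL.
// MediaPlayer.setDataSource(String) accepts http(s) URLs directly.
public boolean play(String url) throws IOException {
    if (player == null) {
        player = new MediaPlayer();
    } else {
        player.reset();
    }
    player.setDataSource(url); // remote URL instead of a file descriptor
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // start playback once the player is ready
        }
    });
    player.setOnCompletionListener(completionListener); // adjusted listener, as mentioned above
    player.prepareAsync(); // prepare asynchronously, since the source is a network stream
    return true;
}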
One other question though: how can I set the position of the video player? I need it to cover only part of the screen, with a background texture behind it. Right now, when I set the dimensions, the video gets scaled but stays stuck in the lower left corner of the screen.
Ok, so the change was only for Android? The reason this wasn't implemented yet is the amount of work required for desktop.
About the positioning, if you want it on only part of the screen, you'll have to use VideoPlayer.createVideoPlayer (Viewport viewport) or the one which accepts a custom mesh. The resize method won't do anything in this case, and the video will simply be played on the viewport or the mesh.
Thanks. I will look into how to do streaming on the desktop too, since I really need that feature. If I find something, I will post it here.
Libavcodec doesn't really care where the data comes from, so it would probably need a cross-platform way of connecting over HTTP, reading the stream into a buffer of some size, and supplying that data to libavcodec. Unfortunately, I don't have any time to help with this.
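Something along these lines, purely as a sketch of the idea (the URL and buffer size are placeholders, and handing the bytes over to the native libavcodec wrapper is not shown):
// Cross-platform: read the remote stream into a buffer, then feed the bytes onward.
java.io.InputStream in = new java.net.URL("http://example.com/video.ogv").openStream();
byte[] buffer = new byte[64 * 1024];
int bytesRead;
while ((bytesRead = in.read(buffer)) != -1) {
    // pass 'bytesRead' bytes from 'buffer' to whatever wraps libavcodec here
}
in.close();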
Good luck!
One more thing, Robbie, regarding this: "About the positioning, if you want it on only part of the screen, you'll have to use VideoPlayer.createVideoPlayer (Viewport viewport) or the one which accepts a custom mesh. The resize method won't do anything in this case, and the video will simply be played on the viewport or the mesh." Is there any chance you have some code examples for this, or some code examples for this library in general?
I found this one somewhere on my disk. I'm not sure how up to date it is, but it should work. It creates a simple cube with a video playing on each side.
/*******************************************************************************
* Copyright 2011 See AUTHORS file.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
******************************************************************************/
package com.badlogic.gdx.tests.extensions;
import java.io.FileNotFoundException;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.InputMultiplexer;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Mesh;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.VertexAttributes.Usage;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Material;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.environment.DirectionalLight;
import com.badlogic.gdx.graphics.g3d.utils.CameraInputController;
import com.badlogic.gdx.graphics.g3d.utils.DefaultShaderProvider;
import com.badlogic.gdx.graphics.g3d.utils.MeshBuilder;
import com.badlogic.gdx.graphics.g3d.utils.MeshPartBuilder.VertexInfo;
import com.badlogic.gdx.graphics.g3d.utils.ModelBuilder;
import com.badlogic.gdx.math.Vector3;
import com.badlogic.gdx.tests.utils.GdxTest;
import com.badlogic.gdx.video.VideoPlayer;
import com.badlogic.gdx.video.VideoPlayerCreator;
public class GdxVideoTest extends GdxTest {
public PerspectiveCamera cam;
public CameraInputController inputController;
public ModelInstance instance;
public Environment environment;
public VideoPlayer videoPlayer;
public Mesh mesh;
private final Vector3 tmpV1 = new Vector3();
private final Vector3 target = new Vector3();
@Override
public void create () {
environment = new Environment();
environment.set(new ColorAttribute(ColorAttribute.AmbientLight, .4f, .4f, .4f, 1f));
environment.add(new DirectionalLight().set(0.8f, 0.8f, 0.8f, -1f, -0.8f, -0.2f));
cam = new PerspectiveCamera(67, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
cam.position.set(10f, 10f, 10f);
cam.lookAt(0, 0, 0);
cam.near = 0.1f;
cam.far = 300f;
cam.update();
MeshBuilder meshBuilder = new MeshBuilder();
// Build a cube mesh with position and texture coordinate attributes; the video frames are rendered onto its faces.
meshBuilder.begin(Usage.Position | Usage.TextureCoordinates, GL20.GL_TRIANGLES);
// @formatter:off
meshBuilder.box(5, 5, 5);
// @formatter:on
mesh = meshBuilder.end();
// Create a player that renders the video onto the custom mesh, viewed through the given camera.
videoPlayer = VideoPlayerCreator.createVideoPlayer(cam, mesh, GL20.GL_TRIANGLES);
try {
videoPlayer.play(Gdx.files.internal("data/testvideo.ogv"));
} catch (FileNotFoundException e) {
e.printStackTrace();
}
Gdx.input.setInputProcessor(new InputMultiplexer(this, inputController = new CameraInputController(cam)));
Gdx.gl.glEnable(GL20.GL_CULL_FACE);
Gdx.gl.glCullFace(GL20.GL_BACK);
}
@Override
public void render () {
inputController.update();
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
final float delta = Gdx.graphics.getDeltaTime();
tmpV1.set(cam.direction).crs(cam.up).y = 0f;
cam.rotateAround(target, tmpV1.nor(), delta * 20);
cam.rotateAround(target, Vector3.Y, delta * -30);
cam.update();
if (!videoPlayer.render()) { // As soon as the video is finished, we start the file again using the same player.
try {
videoPlayer.play(Gdx.files.internal("data/testvideo.ogv"));
} catch (FileNotFoundException e) {
e.printStackTrace();
}
}
}
@Override
public void dispose () {
}
public boolean needsGL20 () {
return true;
}
public void resume () {
}
public void resize (int width, int height) {
}
public void pause () {
}
}
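For the viewport variant I mentioned earlier, the usage is roughly like this (only a sketch: the FitViewport size and screen bounds are placeholder values, Viewport and FitViewport come from com.badlogic.gdx.utils.viewport, and it assumes the createVideoPlayer(Viewport) overload on VideoPlayerCreator):
// in create(): give the player its own viewport so the video covers only part of the screen
Viewport videoViewport = new FitViewport(320, 240);
videoViewport.setScreenBounds(50, 50, 320, 240); // x, y, width, height in screen pixels
videoPlayer = VideoPlayerCreator.createVideoPlayer(videoViewport);
try {
    videoPlayer.play(Gdx.files.internal("data/testvideo.ogv"));
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
// in render(): draw your background texture first, then let the player render into its viewport
videoPlayer.render();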
Thanks for the example.
Since there are no plans to support this (because of the amount of work required to get it working on desktop), I'm going to close this issue.
If you happen to get it to work on desktop as well, I would love to see a PR.
Your example looks great, but it doesn't compile. I have corrected some functions so that it builds and runs, but I only get a black screen with no audio or video.
package com.mygdx.game;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.InputMultiplexer;
import com.badlogic.gdx.InputProcessor;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Mesh;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.VertexAttributes.Usage;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.environment.DirectionalLight;
import com.badlogic.gdx.graphics.g3d.utils.CameraInputController;
import com.badlogic.gdx.graphics.g3d.utils.MeshBuilder;
import com.badlogic.gdx.math.Vector3;
import com.badlogic.gdx.video.VideoPlayer;
import com.badlogic.gdx.video.VideoPlayerCreator;
import java.io.FileNotFoundException;
public class MyGdxGame extends ApplicationAdapter implements InputProcessor {
public PerspectiveCamera cam;
public CameraInputController inputController;
public ModelInstance instance;
public Environment environment;
public VideoPlayer videoPlayer;
public Mesh mesh;
private final Vector3 tmpV1 = new Vector3();
private final Vector3 target = new Vector3();
@Override
public void create() {
environment = new Environment();
environment.set(new ColorAttribute(ColorAttribute.AmbientLight, .4f, .4f, .4f, 1f));
environment.add(new DirectionalLight().set(0.8f, 0.8f, 0.8f, -1f, -0.8f, -0.2f));
cam = new PerspectiveCamera(67, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
cam.position.set(10f, 10f, 10f);
cam.lookAt(0, 0, 0);
cam.near = 0.1f;
cam.far = 300f;
cam.update();
MeshBuilder meshBuilder = new MeshBuilder();
meshBuilder.begin(Usage.Position | Usage.TextureCoordinates, GL20.GL_TRIANGLES);
// @formatter:off
meshBuilder.box(5, 5, 5);
// @formatter:on
mesh = meshBuilder.end();
videoPlayer = VideoPlayerCreator.createVideoPlayer(cam, mesh, GL20.GL_TRIANGLES);
try {
videoPlayer.play(Gdx.files.internal("data/myconverted.ogv"));
} catch (FileNotFoundException e) {
e.printStackTrace();
}
Gdx.input.setInputProcessor(new InputMultiplexer(this, inputController = new CameraInputController(cam)));
Gdx.gl.glEnable(GL20.GL_CULL_FACE);
Gdx.gl.glCullFace(GL20.GL_BACK);
}
@Override
public void render() {
inputController.update();
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
final float delta = Gdx.graphics.getDeltaTime();
tmpV1.set(cam.direction).crs(cam.up).y = 0f;
cam.rotateAround(target, tmpV1.nor(), delta * 20);
cam.rotateAround(target, Vector3.Y, delta * -30);
cam.update();
if (!videoPlayer.render()) { // As soon as the video is finished, we start the file again using the same player.
try {
videoPlayer.play(Gdx.files.internal("data/myconverted.ogv"));
} catch (FileNotFoundException e) {
e.printStackTrace();
}
}
}
@Override
public void dispose () {
}
public boolean needsGL20 () {
return true;
}
public void resume () {
}
public void resize (int width, int height) {
}
public void pause () {
}
@Override
public boolean keyDown(int keycode) {
return false;
}
@Override
public boolean keyUp(int keycode) {
return false;
}
@Override
public boolean keyTyped(char character) {
return false;
}
@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {
return false;
}
@Override
public boolean touchUp(int screenX, int screenY, int pointer, int button) {
return false;
}
@Override
public boolean touchDragged(int screenX, int screenY, int pointer) {
return false;
}
@Override
public boolean mouseMoved(int screenX, int screenY) {
return false;
}
@Override
public boolean scrolled(int amount) {
return false;
}
}
Hello. I am interested in using gdx-video for playing streaming video. The video is received from the internet and should be played in a libgdx game. Is there a way to extend this code to support that? I have working code that plays .mp4 video and would like to modify it to play streaming video. It would be enough if you could provide a few tips on how to do this.