Closed leios closed 7 years ago
Oh, it's not, yet :( You'll need png2yuv and vpxenc. They were a bit confusing to install... On Ubuntu it's straightforward, though:
sudo apt-get install mjpegtools
sudo apt-get install vpx-tools
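For context, create_video presumably drives these two tools as a pipeline, roughly like the following sketch (the frame filename pattern and the 30 fps rate here are illustrative assumptions, not necessarily what create_video actually uses):

```shell
# Hypothetical pipeline: png2yuv decodes a numbered PNG sequence into a
# YUV4MPEG stream on stdout, and vpxenc encodes that stream to a WebM (VP8)
# video. frame_%04d.png and -f 30 are placeholders for this sketch.
png2yuv -I p -f 30 -j frame_%04d.png | vpxenc --good -o test.webm -
```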
using GLVisualize
window = glscreen()
_view(visualize(loadasset("cat.obj")))
frames = []
while isopen(window)
GLWindow.render_frame(window)
GLWindow.swapbuffers(window)
GLWindow.pollevents()
push!(frames, GLWindow.screenbuffer(window))
yield()
end
name = "test"
folder = "/home/s/"
resampling = 0 # no resizing
cd(folder) # optional; makes sure we're in a valid location
create_video(frames, name, folder, resampling)
destroy!(window)
This is really not optimal, since all frames are held in RAM, but saving them with ImageMagick while recording completely destroys interactivity. This recording software manages that feat with open-source libraries, though, so it should be possible. When I have some time, I'll look into how they do it and see if I can do something similar.
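As a possible middle ground, each frame could be written straight into an encoder's stdin as it is captured, so that only one frame at a time sits in RAM. A rough, untested sketch, assuming ffmpeg is on the PATH, the framebuffer is 800x600, and that screenbuffer returns an image whose raw bytes are tightly packed 8-bit RGB (all of these are assumptions):

```julia
using GLVisualize

# Sketch: pipe each frame into ffmpeg as it is captured, instead of
# collecting everything in a `frames` array. The window size and the raw
# byte layout of GLWindow.screenbuffer's return value are assumptions here.
window = glscreen()
_view(visualize(loadasset("cat.obj")))
w, h = 800, 600 # must match the actual framebuffer size
io = open(`ffmpeg -y -f rawvideo -pixel_format rgb24 -video_size $(w)x$(h) -framerate 30 -i pipe:0 test.webm`, "w")
while isopen(window)
    GLWindow.render_frame(window)
    GLWindow.swapbuffers(window)
    GLWindow.pollevents()
    write(io, GLWindow.screenbuffer(window)) # raw frame bytes go straight to the encoder
    yield()
end
close(io) # closing ffmpeg's stdin lets it finalize the file
destroy!(window)
```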
Hey Simon,
Thanks for the prompt response / great work! It works exactly as expected.
Out of curiosity, is there any chance we can do the visualization without pulling up a window? I don't know too much about OpenGL, so this might be impossible, but I figured I would ask anyway.
Thanks again, James
Thanks a lot! :) Yeah, that's straightforward. I'll add a function for that. For now, you can use this:
window = GLVisualize.current_screen() # if you don't already have a handle to the window, as is the case in Plots.jl
GLFW.HideWindow(GLWindow.nativewindow(window))
Great! To be clear: the HideWindow(...) function should be placed after a call to glscreen(), right? Something like:
using GLVisualize
window = glscreen()
#window = GLVisualize.current_screen()
GLFW.HideWindow(GLWindow.nativewindow(window))
_view(visualize(loadasset("cat.obj")))
frames = []
GLWindow.render_frame(window)
GLWindow.swapbuffers(window)
GLWindow.pollevents()
push!(frames, GLWindow.screenbuffer(window))
yield()
name = "test"
folder = "foldername"
resampling = 0 # no resizing
cd(folder) # optional; makes sure we're in a valid location
create_video(frames, name, folder, resampling)
If so, I still need an X11 environment to do the rendering. Is there any way to do the rendering without a window at all?
Yes, that's the correct place.
No, not really... It doesn't need to be X11, but I do need an OpenGL context ;) I'm planning to factor out the OpenGL part and allow for different backends, but that needs to be done first, and then I'd need to write a non-OpenGL backend. (Actually, there is a branch with these changes, so it's slowly moving forward.)
There is: https://developer.nvidia.com/nvemulate. Not sure how that would work out, though. What are you trying to do, anyway? Does the system have a GPU?
Hey Simon,
That's great news (to your previous comment). I tried running the code on a system without a GUI and received the following error:
ERROR: LoadError: InitError: X11: The DISPLAY environment variable is missing
in (::GLFW.##5#6)(::Int32, ::String) at /.../.julia/v0.5/GLFW/src/GLFW.jl:29
in ##ErrorCallbackWrapper#270(::Int32, ::Cstring) at /.../.julia/v0.5/GLFW/src/callback.jl:50
in Init() at /.../.julia/v0.5/GLFW/src/glfw3.jl:297
in __init__() at /.../.julia/v0.5/GLFW/src/GLFW.jl:30
during initialization of module GLFW
while loading /../test.jl, in expression starting on line 1
This happens on the first line of the above script when
using GLVisualize
It's obviously not so much a problem with GLVisualize as it is with GLFW. I'll look into it again when I have time. I'll also look into the NVEmulate option soon (sorry, I'm in Japan, so timezone-wise it's a little late in the day). Thanks for letting me know about it!
As to my intent: I run a lot of CUDA code on a local cluster (ssh, no gui typically) and was hoping to do the visualization there, that way I don't have to pass large data files back and forth to a local machine. There are a number of other cases where I simply don't have access to a graphical display, but the most common one is when I am ssh'd into other machines.
That said, there could be an obvious solution to this problem that I am missing.
Thanks again, James
The problem with that is that I don't work with those machines at all, so my knowledge is limited :( Are these Tesla GPUs or Quadros? In other words, are they even usable for graphics? I found this PDF, which seems to be very elaborate: http://www.nvidia.com/content/pdf/remote-viz-tesla-gpus.pdf This library is mentioned as well: http://www.virtualgl.org/About/Background So it seems possible! It just requires some work to find the right approach and set it up!
One possibility might be to use EGL, which doesn't need an X-Server. https://devblogs.nvidia.com/parallelforall/egl-eye-opengl-visualization-without-x-server/
(and it seems one would only need to use EGL to set up the OpenGL context and then attach a FBO to it?)
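For anyone who wants to try the EGL route, the linked NVIDIA post boils down to something like the following minimal sketch of context creation (error checking omitted; it needs EGL headers and an EGL-capable driver to build and run, so treat it as illustrative rather than tested):

```c
#include <EGL/egl.h>

/* Minimal headless OpenGL setup via EGL, following the linked NVIDIA post:
 * no X server involved -- a pbuffer surface backs the context instead. */
int main(void) {
    static const EGLint cfg_attribs[] = {
        EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_DEPTH_SIZE, 8,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
        EGL_NONE
    };
    static const EGLint pbuf_attribs[] = {
        EGL_WIDTH, 800, EGL_HEIGHT, 600, EGL_NONE
    };

    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    EGLint major, minor;
    eglInitialize(dpy, &major, &minor);

    EGLConfig cfg;
    EGLint ncfg;
    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &ncfg);

    EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbuf_attribs);
    eglBindAPI(EGL_OPENGL_API);
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
    eglMakeCurrent(dpy, surf, surf, ctx);

    /* ... regular OpenGL rendering here; results can be read back into an
     * attached FBO / with glReadPixels, as suggested above ... */

    eglTerminate(dpy);
    return 0;
}
```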
Thanks @vchuravy, I completely missed your answer! That's also what I found in the linked PDF... I just need to get my hands on such a setup, and some time, to try things out.
If you are on Ubuntu, mesa-utils-extra should have what you are looking for.
Any distro that can run Wayland should be fine (or a modern NVIDIA driver; I don't know the situation on the AMD side, though).
I have recently gotten GLVisualize working on my machine(s) and have plotted the examples; however, I cannot seem to find an example using the create_video() function and am not sure of its usage. It seems to concatenate a series of images into a video, but how do you get the series of images in the first place?
I apologize if this is described somewhere and I missed it.
Thanks, James