Closed blenderman94 closed 5 months ago
Here's how you could use Pass:fill:
```lua
sbsShader = lovr.graphics.newShader('fill', [[
  vec4 lovrmain() {
    vec2 newUV = clamp(UV, 0., 1.) * vec2(.5, 1.) + vec2(ViewIndex) * vec2(.5, 0.);
    // Use this instead for top-bottom stereo
    // vec2 newUV = clamp(UV, 0., 1.) * vec2(1., .5) + vec2(ViewIndex) * vec2(0., .5);
    return getPixel(ColorTexture, newUV);
  }
]])

-- pass:setShader(sbsShader)
-- pass:fill(sbsTexture)
```
(The fragment shader is the same, but `'fill'` is passed for the vertex stage.) Pass:fill uses a special vertex shader, so a custom shader used with it should use that vertex shader too.
Regarding streaming: networking is possible with either enet (included with lovr) or luasocket (https://github.com/brainrom/lovr-luasocket). However, streaming full images and sound at 90+ FPS is a lot of data! Some compression needs to be applied, but video and audio compression are complicated topics. There are native libraries that can do it (ffmpeg can compress video, opus can compress audio), but those would be additional plugins you'd need to use, and nobody has created such plugins yet...
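As a rough sketch of the networking half, here is a minimal enet client that connects to a streaming server and uploads each received packet into a texture. The server address, port, and frame format (raw 1024x1024 rgba8) are assumptions for illustration; a real stream would need compression and a matching decoder to fit on the network:

```lua
local enet = require 'enet'

local host, server, frameTexture

function lovr.load()
  host = enet.host_create()
  server = host:connect('192.168.1.100:6789') -- hypothetical streaming PC
  frameTexture = lovr.graphics.newTexture(1024, 1024, { mipmaps = false })
end

function lovr.update()
  -- Drain all pending packets; keep only the newest frame
  local event = host:service(0)
  while event do
    if event.type == 'receive' then
      local blob = lovr.data.newBlob(event.data, 'frame')
      local image = lovr.data.newImage(1024, 1024, 'rgba8', blob)
      frameTexture:setPixels(image)
    end
    event = host:service(0)
  end
end

function lovr.draw(pass)
  pass:setShader(sbsShader)
  pass:fill(frameTexture)
end
```

This only covers transport; without compression, one uncompressed 1024x1024 rgba8 frame is 4 MB, so at 90 FPS that is far more than a typical Wi-Fi link can carry.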
P.S. You can wrap your code in ``` so it's easier to read
Good day. I'm working on a replacement for Oculus Link and SteamVR. I'm rendering a 3D image, which works just fine, but I don't really know how to pull an image and sound through my local network from my computer to the headset. I'm doing this because I got tired of how difficult it can be to display stuff on the Quest, plus I just like legacy OpenGL for VR ^^ (P.S. This thing is actually already usable up to a point if you use a RAM disk to store your buffer and headphones to get the sound, but that's not so elegant; it would be nicer to stream both over the local network.)
Here is the code prototype. I'm using a plane because pass:fill for some reason doesn't like the shader, and I found it easier. This way you can also use whatever custom renderer you want, or even your old legacy OpenGL, for VR.
```lua
local sbsTexture, sbsShader

function lovr.load()
  sbsTexture = lovr.graphics.newTexture('frame.png', { mipmaps = false })
  sbsShader = lovr.graphics.newShader([[
    vec4 lovrmain() {
      return DefaultPosition;
    }
  ]], [[
    vec4 lovrmain() {
      vec2 newUV = clamp(UV, 0., 1.) * vec2(.5, 1.) + vec2(ViewIndex) * vec2(.5, 0.);
      return getPixel(ColorTexture, newUV);
    }
  ]])
  lovr.graphics.setBackgroundColor(0, 1, 0)
end

function lovr.update()
  -- Reload the frame written by the external renderer
  sbsTexture = lovr.graphics.newTexture('frame.png', { mipmaps = false })
end

function lovr.draw(pass)
  pass:setMaterial(sbsTexture)
  pass:setShader(sbsShader)
  local angle, ax, ay, az = lovr.headset.getOrientation('head')
  local q = lovr.math.newQuat(angle, ax, ay, az)
  local x, y, z = lovr.headset.getPosition('head')
  local direction = quat(lovr.headset.getOrientation('head')):direction()
  local v = lovr.math.newVec3(x, y, z)
  local l = v + direction
  local scale = lovr.math.newVec3(1.9, 2.1, 1)
  local m = lovr.math.newMat4(l, scale, q)
  pass:plane(m)
end
```