ValveSoftware / openvr

OpenVR SDK
http://steamvr.com
BSD 3-Clause "New" or "Revised" License

HTC Vive: Display static image at infinity with OpenVR? #393


fairymistress commented 7 years ago

I am a beginner with the HTC Vive and VR as a whole, and I have the following very simple code for interacting with the HMD:

#include "HMD.h"

#include <cstdint>   // uintptr_t, used for the texture handle casts below

HMD::HMD() {
    Resolution(Vector2i(0, 0));
}

void HMD::render(GLuint leftTextureHandle, GLuint rightTextureHandle) {
    if (vrEnabled) {
        // Block until the compositor says it is time to present the next frame
        // and fetch the predicted poses for all tracked devices.
        vr::TrackedDevicePose_t pose[vr::k_unMaxTrackedDeviceCount];
        vr::VRCompositor()->WaitGetPoses(pose, vr::k_unMaxTrackedDeviceCount, NULL, 0);

        // Hand the already-rendered eye textures to the compositor. The GL
        // texture name goes into the handle field; casting through uintptr_t
        // avoids the int-to-pointer size warning on 64-bit builds.
        vr::Texture_t leftEyeTexture = { (void*)(uintptr_t)leftTextureHandle, vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Left, &leftEyeTexture);
        vr::Texture_t rightEyeTexture = { (void*)(uintptr_t)rightTextureHandle, vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Right, &rightEyeTexture);

        // Make sure all GL work on the submitted textures has completed.
        glFinish();
    }
}

void HMD::init() {
    if (vr::VR_IsHmdPresent()) {

        // Initialize OpenVR as a scene (fully rendering) application.
        vr::EVRInitError vrError = vr::VRInitError_None;
        vrSystem = vr::VR_Init(&vrError, vr::VRApplication_Scene);

        if (vrError != vr::VRInitError_None) {
            vrSystem = 0;
        }
        else {
            vrEnabled = true;

            // Ask the runtime for the per-eye render target size it recommends.
            uint32_t renderWidth;
            uint32_t renderHeight;
            vrSystem->GetRecommendedRenderTargetSize(&renderWidth, &renderHeight);

            Resolution(Vector2i(renderWidth, renderHeight));
        }
    }
}

void HMD::shutdown() {
    if (vrSystem) {
        vr::VR_Shutdown();
        vrSystem = 0;
    }
}

HMD::init() is called once, and HMD::render() is called on every rendering cycle.
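Roughly, this is how the class is driven (running, leftTex, and rightTex are placeholders for the rest of my render loop, which is not shown here):

HMD hmd;
hmd.init();                         // once, at startup

while (running) {
    // ... render or upload the stereo pair into leftTex / rightTex
    //     at the resolution reported by init() ...
    hmd.render(leftTex, rightTex);  // every frame: WaitGetPoses + Submit
}

hmd.shutdown();                     // once, before exit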

My ultimate goal is to display two images which represent stereo content rendered with another application and exported as stereo image pairs. However, for now leftTextureHandle and rightTextureHandle are two different textures that contain exactly the same 2D image.

Theoretically, if the center of my eye, the center of the lens, and the center of the display are aligned on the same axis for each eye, my eyes should not converge, and the result should be as if I am looking at something positioned at infinity. Not exactly, of course, since the focus won't be at infinity, but I hope you understand what I mean.
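The way I reason about it: for a point straight ahead at distance Z, the convergence angle between the two eyes is roughly

    angle ≈ IPD / Z    (radians, small-angle approximation)

which goes to 0 as Z goes to infinity, i.e. the two view rays become parallel. So two identical images with zero disparity should read as "at infinity", but only if each image is presented centered on that eye's optical axis.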

This theory worked for the Oculus DK2 with the Oculus C++ SDK (passing the same texture handles to their API); however, on the HTC Vive I can clearly see that my eyes are actually diverging and that I cannot merge the inputs from my left and right eye.

If I shift the left texture ~30 px to the right and the right texture ~30 px to the left, I start merging the content and it looks OK. But it bothers me immensely that I don't understand what is going on and why it happens like this. The HTC Vive is set up correctly, my IPD is measured correctly, and I see no visible artifacts, from which I assume my eyes and the lenses are well aligned.

Is it that the screen and the lens are not centered with respect to each other? How could one properly align the two images? Using an overlay is not an option for me.

spayne commented 7 years ago

The lenses are off-center. This link has some more details: https://www.gamedev.net/topic/683698-projection-matrix-model-of-the-htc-vive/
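A quick way to see this in your own code is to query the raw (asymmetric) projection with IVRSystem::GetProjectionRaw and work out where the eye's view axis lands in the eye texture. This is only a sketch: it reuses vrSystem and the render target width from your init() snippet, and the exact numbers depend on your headset and runtime:

float tanLeft, tanRight, tanTop, tanBottom;
vrSystem->GetProjectionRaw(vr::Eye_Left, &tanLeft, &tanRight, &tanTop, &tanBottom);

// A horizontal tangent t maps to u = (t - tanLeft) / (tanRight - tanLeft) in
// [0,1] texture space, so the view axis (t = 0) lands at:
float uCenter = -tanLeft / (tanRight - tanLeft);

// With a symmetric projection this would be exactly half the texture width.
// On the Vive it is noticeably off-center, and the difference from the middle
// should be in the same ballpark as the ~30 px shift you found.
float xCenterPixels = uCenter * renderWidth;   // renderWidth from GetRecommendedRenderTargetSize
float offsetPixels  = xCenterPixels - 0.5f * renderWidth;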