When I bought my Cardboard case a few months ago, it came with a special QR code label that let me configure the Cardboard demo app provided by Google, and which can also be used to configure other GVR SDK-based applications.
Do you guys have a URL to some documentation explaining how the QR data affects the calculations the SDK performs for the various head, eye, and projection matrices that developers use to create their own GVR-enabled apps?
As a matter of fact, do you also have some documentation on how the GVR SDK actually creates those very same matrices and the viewport angles?