Open marcusx2 opened 1 year ago
@marcusx2 this is a known problem, I will try to get to it when I have a moment.
Hi @nickw1 , thanks for the reply. If it helps at all, MindAR uses aframe as well and it doesn't have this issue.
@marcusx2 thanks for the tip re. MindAR.
In the `ar-threex-location-only` file and the `aframe-ar` file (why both files?), there's this line: `navigator.mediaDevices&&navigator.mediaDevices.getUserMedia){const t={video:{width:1280,height:720,facingMode:"environment"}}` That seems to be the issue: it sets the width and height to specific values...
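De-minified for readability, that fragment corresponds roughly to the following (a sketch only; the variable name is illustrative and the real build inlines this differently):

```javascript
// Rough, readable reconstruction of the minified fragment quoted above
// (illustrative; not the actual AR.js source layout).
const hardcodedConstraints = {
  video: { width: 1280, height: 720, facingMode: "environment" },
};

if (typeof navigator !== "undefined" && navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
  // getUserMedia is then called with these constraints, pinning the feed
  // to 1280x720 regardless of the device's native camera aspect ratio.
  navigator.mediaDevices.getUserMedia(hardcodedConstraints);
}
```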
In this file https://github.com/hiukim/mind-ar-js/blob/master/src/image-target/aframe.js this code snippet seems to be the solution:
navigator.mediaDevices.getUserMedia({audio: false, video: {
  facingMode: 'environment',
}}).then((stream) => {
  this.video.addEventListener('loadedmetadata', () => {
    //console.log("video ready...", this.video);
    this.video.setAttribute('width', this.video.videoWidth);
    this.video.setAttribute('height', this.video.videoHeight);
    this._startAR();
  });
  this.video.srcObject = stream;
}).catch((err) => {
  console.log("getUserMedia error", err);
  this.el.emit("arError", {error: 'VIDEO_FAIL'});
});
@nickw1 If I use `videoTexture: false` the problem goes away. But then the cube stretches, so another bug comes up ;_; damn it. Should I create another issue for this one? The scale of the cube is `1 1 1`; it shows up correctly if I set `videoTexture: true` on the `a-scene`'s `arjs` component, but then the video feed is the one that stretches lol.
Thanks for that. I think what I'll probably do is incorporate fixes for all your issues into a new version 3.5, with the agreement of @kalwalt, as the world-origin-as-original-GPS-position will be a breaking change. I'll also try and add the multiple cameras feature.
> with the agreement of @kalwalt, as the world-origin-as-original-GPS-position will be a breaking change
It doesn't have to be breaking if it's optional! It can default to false to keep the functionality as it is.
> I'll also try and add the multiple cameras feature.
Multiple cameras? Don't know anything about that. Must be some other feature request.
I just thought that, as you have made quite a few requests, it would be best to collate them and include them in a new release. And if the initial position as world origin is the preferred behaviour anyway, maybe it's best to make that the default by switching to that with a 3.5 release. It won't affect most use cases anyhow. (This will also fit in with some other work I'm doing to try and combine AR.js with the SLAM library AlvaAR).
Sorry, I lost track of who requested the multiple cameras feature!
> And if the initial position as world origin is the preferred behaviour anyway, maybe it's best to make that the default by switching to that with a 3.5 release.
It is better for other frameworks (PlayCanvas and Unity at least, probably more) that want to use AR.js, as I explained, because of the floating point issue. I also think it semantically makes sense that the experience starts at the world origin.
> (This will also fit in with some other work I'm doing to try and combine AR.js with the SLAM library AlvaAR)
AlvaAR integration would be amazing for sure! AlvaAR alone without geolocation would already be awesome: world tracking on the web!
Just to summarize, this issue includes 2 bug reports:
1. Camera feed stretching when `videoTexture: true`. No entity stretching.
2. Entity stretching when `videoTexture: false`. No camera feed stretching.
@marcusx2 I have implemented your suggested fix for stretching for the `videoTexture: true` mode (note that `videoTexture: false` is deprecated for location-based AR in any case).
Do you want to try testing it? You need to check out the `stretched-video-fix` branch of AR.js and include the built files from that branch into your project, e.g.
<script type='text/javascript' src='LOCATION_OF_ARJS_ON_YOUR_SYSTEM/three.js/build/ar-threex-location-only.js'></script>
<script type='text/javascript' src='LOCATION_OF_ARJS_ON_YOUR_SYSTEM/aframe/build/aframe-ar.js'></script>
I simply got the files ar-threex-location-only and aframe-ar from the branch and added them to the same folder as the index.html. Like this:
<!DOCTYPE html>
<html>
  <head>
    <title>AR.js A-Frame Location-based</title>
    <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, minimum-scale=1.0, user-scalable=no" />
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
    <script type='text/javascript' src='./ar-threex-location-only.js'></script>
    <script type='text/javascript' src='./aframe-ar.js'></script>
  </head>
  <body>
    <a-scene vr-mode-ui='enabled: false' arjs='sourceType: webcam; videoTexture: true; debugUIEnabled: false' renderer='antialias: true; alpha: true'>
      <a-camera look-controls-enabled='false' arjs-device-orientation-controls='smoothingFactor: 0.1' gps-new-camera='positionMinAccuracy: 100; gpsMinDistance: 5; simulateLatitude: 51.049; simulateLongitude: -0.723; simulateAltitude: 0; gpsTimeInterval: 0;' position='0 10 0'></a-camera>
      <a-entity material='color: red' geometry='primitive: box' gps-new-entity-place="latitude: -21.1873371; longitude: -47.7996175" scale="1 1 1"></a-entity>
    </a-scene>
    <script>
      const entity = document.querySelector("[gps-new-entity-place]");
      console.log(entity.components['gps-new-entity-place'].distance); // returns undefined
      setTimeout(() => {
        console.log(entity.components['gps-new-entity-place'].distance); // returns distance
      }, 0);
    </script>
  </body>
</html>
Should be correct, right? The fix didn't work here though, unless I missed something.
> I have implemented your suggested fix for stretching for the videoTexture: true mode (note that videoTexture: false is deprecated for location-based AR in any case).
Please don't deprecate/remove it. videoTexture: false is better for projects that use AR.js as an API to get the information about positioning and apply it elsewhere. For example, for PlayCanvas I don't need the videoTexture to be true because I just get the information of the camera and entities, and use it to place the objects in PlayCanvas, which draws the scene in its own Canvas which overlays the video element. The threejs canvas is completely useless for cases like this, just consuming unnecessary resources, because I don't need aframe/threejs to actually draw anything.
If it helps at all, you can take a look at this issue from another repo. I tried the fix provided there but it didn't work here. Maybe the JeelizResizer and/or the JeelizThreeHelper can give some clues.
I tried checking out the new branch as well, just in case. Still didn't work =/. The problem happens on my Android Chrome and on iOS Safari, and also on Chrome for desktop (stretched horizontally).
OK, this may be a tricky one rather than a quick fix. I'm not sure it will make it to the next bugfix release in that case, but I will include PRs #507 and #508.
Does the bug not happen with you? Hopefully it's something that you can debug.
I know, but it might be something that requires some research to fix. This is very much dependent on my time availability, which is a bit restricted right now - whereas the other two problems are already fixed. Hence, as soon as #507 and #508 have been reviewed, I can make a bugfix release for those two issues, at least.
Can this bug be worked around in the current version of AR.js?
bump
Hey @nickw1 , I recently had this stretching issue with another app, and I fixed it with something like this
myclass.windowResizeEvent = function () {
  if (window.innerHeight > window.innerWidth) {
    if (myclass.firstResize !== "portrait" && myclass.firstResize !== "landscape") {
      video.videoWidthbk = video.videoWidth;
      video.videoHeightbk = video.videoHeight;
      myclass.firstResize = "portrait";
    } else {
      if (myclass.firstResize == "portrait") {
        video.videoWidthbk = video.videoWidth;
        video.videoHeightbk = video.videoHeight;
      } else {
        video.videoWidthbk = video.videoHeight;
        video.videoHeightbk = video.videoWidth;
      }
    }
    dynCall_vii(callback, video.videoWidthbk, video.videoHeightbk);
  } else {
    if (myclass.firstResize !== "portrait" && myclass.firstResize !== "landscape") {
      video.videoWidthbk = video.videoHeight;
      video.videoHeightbk = video.videoWidth;
      myclass.firstResize = "landscape";
    } else {
      if (myclass.firstResize == "landscape") {
        video.videoWidthbk = video.videoHeight;
        video.videoHeightbk = video.videoWidth;
      } else {
        video.videoWidthbk = video.videoWidth;
        video.videoHeightbk = video.videoHeight;
      }
    }
    dynCall_vii(callback, video.videoHeightbk, video.videoWidthbk);
  }
};
window.addEventListener("resize", myclass.windowResizeEvent);
window.dispatchEvent(new Event("orientationchange"));
I don't know if this applies to AR.js or not, but the problem I was having in that app was that, depending on whether it started in landscape or portrait mode, I had to swap the video width with the video height (and vice versa) so that the aspect ratio was always correct and the feed didn't show up stretched. I had to check for the first resize event and detect if the app started in portrait or landscape. Anyway, I don't know if this helps, but I'm throwing it here just in case; who knows.
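The swap rule described above can be distilled into a tiny pure helper (a sketch only, not AR.js code; the function name and parameters are made up for illustration):

```javascript
// Sketch of the swap rule above (illustrative, not AR.js code): the camera
// reports its dimensions relative to the orientation the app started in,
// so if the device has rotated since startup, width and height must be
// swapped to keep the aspect ratio correct.
function effectiveVideoSize(videoWidth, videoHeight, startedPortrait, nowPortrait) {
  if (startedPortrait === nowPortrait) {
    // Same orientation as at startup: use the reported values as-is.
    return { width: videoWidth, height: videoHeight };
  }
  // Orientation changed since startup: swap width and height.
  return { width: videoHeight, height: videoWidth };
}

// Started in landscape, now rotated to portrait: dimensions get swapped.
console.log(effectiveVideoSize(1280, 720, false, true)); // { width: 720, height: 1280 }
```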
I really need this `videoTexture: false` mode to work properly.
> Can this bug be worked around in current version of AR.js?
@kbs1 I've found a workaround:
</a-scene>
<button id="start" onclick="start()" style="position: absolute; top: 0; width: 100%; height: 100%; z-index: 10;">Start</button>
</body>
</html>
<script>
  var elem = document.documentElement;
  /* View in fullscreen */
  function start() {
    if (elem.requestFullscreen) {
      elem.requestFullscreen();
    } else if (elem.webkitRequestFullscreen) { /* Safari */
      elem.webkitRequestFullscreen();
    } else if (elem.msRequestFullscreen) { /* IE11 */
      elem.msRequestFullscreen();
    }
    document.querySelector("#start").remove();
    document.querySelector('video').setAttribute('style', 'width: 100vw; height: 100vh; object-fit: cover; position: absolute; top: 0; left: 0;');
  }
</script>
I'm full-screening because I want to, but the only important bit is the button removal and then setting the style attribute.
I found the style doesn't work if set directly, probably something to do with setting the style before AR.js is done setting up the scene. So I imagine you can get this to work without the 'start' button by just getting the start function to run after the first GPS camera position update, although I haven't tried it:
let testEntityAdded = false;
document
  .querySelector("#camera")
  .addEventListener("gps-camera-update-position", (e) => {
    if (!testEntityAdded) {
      start();
      testEntityAdded = true;
    }
  });
Never mind, actually; I just noticed that for some reason changing the video's CSS seems to destroy all entities, as does setting a-scene to embedded.
@marcusx2 @Platform-Group apologies for the late reply, I've had very little time to look at AR.js lately.
@marcusx2 could you submit your fix as a PR? Please provide a sample which works with the fix and doesn't work without the fix. This will ease the integration into the main repo.
@nickw1 I didn't apply anything to ARJS. It's just that I had a similar problem with something else and posted my solution here, which maybe can be used by ARJS with the same idea or something similar.
I've found that the issue arises when the camera aspect ratio is not the same as the device (mobile phone) aspect ratio, or more specifically, the usable viewport aspect ratio. AR.js creates a PlaneBufferGeometry of width 1 and height 1; if you tweak these values to have the same aspect ratio as the viewport, you can unstretch the video feed (video texture) on that particular device.
@kbs1 is this tweaking done by modifying ar.js's source code?
@Platform-Group yes, you can search the source for PlaneBufferGeometry; there will be 2 instances. You can find the active one based on your use case and tweak the values there. For my use case, I added reading from window (global) so the correct values can be calculated before initialising AR.js itself.
@kbs1 thanks for that. Do you want to submit a PR to fix the issue?
@nickw1 the fix I used was not proper, and it required reading from magic window..... variables at the patched PlaneBufferGeometry creation. It also assumed AR.js would be full-width and full-height in the viewport, which might not be the case. It also did not work on iOS devices, so I added a workaround so that it would not activate on those.
I'm not familiar enough with the AR.js / aframe / three.js codebases to develop a proper fix, and the fix itself needs further research. The way I understand it, AR.js creates a plane that is filled with the video texture (the camera feed). This plane is 1x1 in dimensions. Somehow, this plane is stretched to "fit" the scene, and this creates the distortion (it might be due to the OrthographicCamera being used).
The code in question is here: https://github.com/AR-js-org/AR.js/blob/3.4.5/aframe/src/location-based/arjs-webcam-texture.js#L6-L20
The proper fix would be to emulate behaviour like CSS's background-size: cover; background-position: center for the video feed. That way, it would not matter what the viewport aspect ratio is and what the video feed aspect ratio is. It might, however, be necessary to adjust the position of AR objects so they stay true to the shown camera feed, but that's outside of my ability to judge.
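To illustrate the cover idea (a sketch under my own naming, not AR.js code): instead of shrinking the plane to fit inside the viewport, scale it up along one axis so the feed always covers the viewport and the overflow is cropped:

```javascript
// Sketch of background-size: cover math (illustrative, not AR.js code).
// Given the camera feed and viewport dimensions, return the plane scale
// factors that crop the feed instead of stretching it: the plane always
// covers the viewport, and the overflowing dimension is clipped.
function coverScale(feedWidth, feedHeight, viewWidth, viewHeight) {
  const feedAspect = feedWidth / feedHeight;
  const viewAspect = viewWidth / viewHeight;
  if (viewAspect > feedAspect) {
    // Viewport is wider than the feed: fill the width, overflow the height.
    return { x: 1, y: viewAspect / feedAspect };
  }
  // Viewport is taller than the feed: fill the height, overflow the width.
  return { x: feedAspect / viewAspect, y: 1 };
}

// e.g. a 1280x720 landscape feed in a 720x1280 portrait viewport:
// the plane keeps full height and is widened, so the sides are cropped.
console.log(coverScale(1280, 720, 720, 1280)); // { x: ~3.16, y: 1 }
```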
@kbs1 I think you can still contribute your modifications as a PR, we will review it, and if your code is not compatible stylistically or otherwise with AR.js, we will modify it.
Even if it's not a full fix, if it partly solves some problems, it's still worth incorporating.
@nickw1 here, I had a crack at it; works for me:
const constraints = {
  video: {
    facingMode: "environment",
    width: { ideal: 1920 },
    height: { ideal: 1080 }
  },
};

navigator.mediaDevices.getUserMedia(constraints).then(stream => {
  let streamSettings = stream.getVideoTracks()[0].getSettings();
  console.log('navigator by geom ', streamSettings);
  let sourceAspectRatio = streamSettings.width / streamSettings.height;
  let displayAspectRatio = window.innerWidth / window.innerHeight;
  let geomX = 1;
  let geomY = 1;
  if (displayAspectRatio > sourceAspectRatio) {
    // Display is wider than source
    geomX = sourceAspectRatio / displayAspectRatio;
  } else {
    // Display is taller than source
    geomY = displayAspectRatio / sourceAspectRatio;
  }
  console.log('geomY ', geomY, ' geomX ', geomX);
  this.geom = new THREE.PlaneBufferGeometry(geomX, geomY);
  this.texture = new THREE.VideoTexture(this.video);
  this.material = new THREE.MeshBasicMaterial({ map: this.texture });
  const mesh = new THREE.Mesh(this.geom, this.material);
  this.texScene.add(mesh);
});
It still has problems though:
@Platform-Group thank you, that is essentially exactly the same workaround I have used. The outstanding issues are well described.
Bad news: I found an issue with the fix, @kbs1.
This also made me realise that it's using the 0.5x zoom camera for everything, which is weird.
@Platform-Group I am using this code before initialising AR.js:
if (window.navigator.mediaDevices && window.navigator.mediaDevices.getUserMedia)
  window.navigator.mediaDevices.getUserMedia({audio: false, video: {facingMode: {exact: 'environment'}}}).then(async stream => {
    stream.getTracks().forEach(track => {
      let trackSettings = track.getSettings();
      let ios = () => {
        if (typeof window === 'undefined' || typeof navigator === 'undefined') return false;
        return /iPhone|iPad|iPod/i.test(navigator.userAgent || navigator.vendor || (window.opera && opera.toString() === '[object Opera]'));
      };
      if (ios()) {
        this.deviceCameraInitialised = true;
      } else if (trackSettings.width && trackSettings.height) {
        console.log('webcam width ' + trackSettings.width);
        console.log('webcam height ' + trackSettings.height);
        console.log('webcam aspect ratio ' + (trackSettings.width / trackSettings.height));
        window.ARJS_WEBCAM_ASPECT_RATIO = (trackSettings.width / trackSettings.height);
        this.deviceCameraInitialised = true;
      }
      track.stop();
    });
  }).catch(e => console.log(e));
Then, on the AR.js side, I have the following code:
var w = 1;
var h = 1;
if (window.ARJS_WEBCAM_ASPECT_RATIO) {
  if (window.ARJS_WEBCAM_ASPECT_RATIO < 1) {
    // portrait (width < height) => stretch width
    w = 1 / window.ARJS_WEBCAM_ASPECT_RATIO * 1.1;
  } else {
    // landscape (width >= height) => stretch height
    h = 1 * window.ARJS_WEBCAM_ASPECT_RATIO * 1.3;
  }
}
this.geom = new I.PlaneBufferGeometry(w, h);
This involves some magic constants, so the video feed is "as unstretched as possible". My primary goal was to make this work on a particular android device.
I have also added a window resize event listener, which re-initialises AR.js upon window resize (or device orientation change, e.g. portrait => landscape).
I am using Vue 3 in the project, so all aframe elements such as <a-scene> are bound to a v-if condition that becomes false upon resize, and the whole init flow is restarted after 500ms. This way, after the resize (or orientation change), the whole aframe DOM is wiped and re-inserted. The camera aspect ratio detection code and the modified AR.js aspect ratio code run again, producing proper results (at least in my use case). If I remember correctly, the camera feed aspect ratio is reported differently in portrait vs landscape mode.
@kbs1 How did you get around this issue? I've changed over to Vue 3 too, but this particular bug is rather annoying. I'm thinking it doesn't really matter, as I'll add a high z-index border around the video anyway, but still.
@Platform-Group I've just hidden all video elements using CSS display: none !important. The video element doesn't have to be visible; all rendering is done via the a-scene.
Working workaround
I fixed the camera stretching by putting the whole AR.js scene inside a separate HTML file and then using an iframe to embed that within another page. From there I can set the iframe width and height to match the camera's aspect ratio. Here's the function I used to calculate the needed iframe width and height; it doesn't, however, take into account rotating devices or window size adjustments:
// returns promise
function calculateEmbedDimensions() {
  // Setting ideal width and height seems to actually screw up the aspect ratio, so it isn't the true camera resolution.
  // And if you set it to a high value (around 4k or higher) it seems to break the video stream completely on some devices.
  // So just using a min and max to try and figure out what the default aspect ratio is, is our best bet I believe.
  const constraints = {
    video: {
      facingMode: "environment",
      width: { min: 100, max: 4096 },
      height: { min: 100, max: 2160 },
    },
  };
  console.log('arjs webcam texture ', this);
  navigator.mediaDevices.enumerateDevices().then(devices => {
    console.log('devices available ', devices);
  });
  return navigator.mediaDevices.getUserMedia(constraints).then(stream => {
    console.log('video tracks ', stream.getVideoTracks());
    let streamSettings = stream.getVideoTracks()[0].getSettings();
    console.log('stream settings ', streamSettings);
    // listFunctionsInObject is a custom debug helper, not a browser API
    console.log('video track functions ', listFunctionsInObject(stream.getVideoTracks()[0]));
    // console.log('capabilities ', stream.getVideoTracks()[0].getCapabilities())
    let sourceAspectRatio = streamSettings.width / streamSettings.height;
    let displayAspectRatio = window.innerWidth / window.innerHeight;
    // A value of 1 represents using the full width (geomX) or height (geomY) of the display for the embed
    let geomX = 1;
    let geomY = 1;
    if (displayAspectRatio > sourceAspectRatio) {
      // Display is wider than source
      geomX = sourceAspectRatio / displayAspectRatio;
    } else {
      // Display is taller than source
      geomY = displayAspectRatio / sourceAspectRatio;
    }
    return { width: window.innerWidth * geomX, height: window.innerHeight * geomY };
  });
}
function startViewingEmbed() {
  // If the iframe isn't yet embedded, embed it
  if (!document.querySelector('#arjs_embed')) {
    calculateEmbedDimensions().then((embedDimensions) => {
      console.log('embed dimensions are: ', embedDimensions);
      let embed = document.createElement('iframe');
      embed.setAttribute('id', 'arjs_embed');
      embed.setAttribute('src', '/arembed.html?' + 'cachebuster=' + Date.now());
      embed.setAttribute('width', embedDimensions.width);
      embed.setAttribute('height', embedDimensions.height);
      embed.classList.add('centered');
      let viewing = document.querySelector("#viewing");
      viewing.appendChild(embed);
      viewing.classList.remove('hidden');
    });
  }
}
Bump!
Is this getting fixed anytime soon? T_T
@nickw1 I'm going to pay @DuncanPodmore to fix this properly and he is going to make a pull request. I need to know if you will be available to review it as soon as he makes the pull request, to know if the solution is proper. The pull request should be there sometime next month.
Hello, I used the solution kbs1 commented on Jul 26, 2023, and it worked like a charm for arjs='sourceType: webcam;'. But when I run the Ionic Angular app on my phone, I get an error: "Sorry, media devices API not supported." How can I make the resize a general solution, not just for webcam?
Hello, I got stuck with this problem: apart from the fact that the video image did not look as it should, the resolution was also horrible. I looked for solutions everywhere and even tried each of the solutions proposed here, but absolutely nothing worked.
After trial and error, I arrived at the following solution. When using AR.js, you must import some libraries, including 'aframe.min.js' and 'aframe-ar.js', as you can see in the screenshots I am attaching. Within the aframe-ar.js file, there is a fragment within all the obfuscated code where the video settings ("constraints") are established.
There, the constraints are only { video: { facingMode: 'environment' } }; you must modify them so they read { video: { facingMode: 'environment', width: { ideal: 4096 }, height: { ideal: 2190 } } }, which, apart from improving the horrible resolution that the camera has with the library, solves the error.
As the code is obfuscated, you can find this fragment in VSCode using CTRL+F (search: "mediadevices.getusermedia").
Perhaps in a future version of AR.js it would be useful to add a little more customization (perhaps it already exists and I'm just speaking out of ignorance). I have already had to modify the library's code directly to solve other kinds of things that are not customizable.
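For reference, the change described above amounts to replacing the shipped constraints object with one that adds ideal resolution hints (the values are the ones suggested in this comment, not something mandated by AR.js):

```javascript
// Constraints as shipped in the build (rear camera, default resolution):
const originalConstraints = { video: { facingMode: 'environment' } };

// Modified constraints, as described above. `ideal` is only a hint: the
// browser picks the closest resolution the camera supports, so this
// degrades gracefully on devices that can't deliver 4096x2190.
const modifiedConstraints = {
  video: {
    facingMode: 'environment',
    width: { ideal: 4096 },
    height: { ideal: 2190 },
  },
};
```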
I will attach screenshots to clarify the situation:
From this:
To this:
I hope my answer can help resolve the error, greetings!
Try adding aspect-ratio in CSS to make the height and width match the resolution of the screen.
@marcusx2 apologies for late reply on this. To be honest I don't really have much available time to spend on AR.js these days (other than quick checks of PRs) and haven't looked on here for a while; hope this is fixed though.
Apologies for this, but please do bear in mind that the nature of open source projects is that contributors and maintainers are often doing this on a voluntary basis.
The video camera feed is stretched on my Android Chrome (AR.js geolocation). I took a photo with AR.js, and then one with the camera app of my phone, so you can see the difference.
ARJS:
Camera app