ericblade / quagga2

An advanced barcode-scanner written in Javascript and TypeScript - Continuation from https://github.com/serratus/quaggajs
MIT License

Unable to scan using iPhone 12 Pro Max #419

Open Shefali-Upadhyay opened 2 years ago

Shefali-Upadhyay commented 2 years ago

The scanner works perfectly on every device except one particular device (iPhone 12 Pro Max). When the scanner opens, the camera is unable to capture the barcode value. The device belongs to a client. I tried debugging by checking the scanner settings, but was unable to identify the issue. It is really strange, because the same device can scan the UPC barcode when using a different application. Quagga2 is integrated into a website, and I haven't seen this issue on any other device. The iOS version is 15.4.1. Are there any changes required in the device settings? I need help, as this is really confusing and the issue needs to be resolved as soon as possible.

github-actions[bot] commented 2 years ago

Thank you for filing an issue! Please be patient. :-)

Lordodson commented 2 years ago

I'm seeing the same thing with a project I've been using Quagga with. The scanner works on all other devices, but is not able to capture the barcode value on an iPhone 13 Pro Max or an M1 MacBook.

ericblade commented 2 years ago

Does the iPhone 12 pro report multiple camera devices available if you use enumerateDevices() to get a list of available inputs?

On most Android devices these days, there are multiple selections for different types of shots (wide angle, close up, etc.), and selecting a wide-angle one will generally prevent detection because the image is fisheyed. I haven't heard of that occurring on iOS, but it might be new to the 12 Pro? Something to look into.

I unfortunately don't have any iOS devices available, so all I can do from right here right now is suggest ideas.
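For anyone who wants to check quickly, something like this is what I mean -- a rough sketch using the plain browser API (Quagga exposes a similar helper as `Quagga.CameraAccess.enumerateVideoDevices()`):

```ts
// Rough sketch -- plain browser API, not specific to Quagga.
async function listVideoInputs(): Promise<MediaDeviceInfo[]> {
  // Ask for permission first so device labels are populated, then release the camera.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  stream.getTracks().forEach((t) => t.stop());
  const devices = await navigator.mediaDevices.enumerateDevices();
  return devices.filter((d) => d.kind === 'videoinput');
}

listVideoInputs().then((cams) => {
  cams.forEach((c) => console.log(c.deviceId, c.label));
});
```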

Lordodson commented 2 years ago

I'm using enumerateDevices() and it only shows front or back camera. Inspect console shows: Error starting Quagga: TypeError: null is not an object (evaluating 'n.x')

ericblade commented 2 years ago

aahh that's at least helpful. @Lordodson would you be able to swap in the unminified version (from dist/quagga.js instead of dist/quagga.min.js) and perhaps provide any more detail that might give?
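If it helps, and assuming a webpack-based build (just a guess about your setup), an exact-match alias is usually the least invasive way to do the swap -- sketch only:

```ts
// webpack.config.ts -- the trailing '$' makes this an exact-match alias, so only
// the bare package import gets redirected to the unminified bundle named above.
import type { Configuration } from 'webpack';

const config: Configuration = {
  resolve: {
    alias: {
      '@ericblade/quagga2$': '@ericblade/quagga2/dist/quagga.js',
    },
  },
};

export default config;
```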

also @Shefali-Upadhyay and @Lordodson could you share your quagga init config?

Lordodson commented 2 years ago

Currently have both dist/quagga.js & dist/quagga.min.js. I won't be able to delete dist/quagga.min.js and test it for another ~11hrs.

import { useCallback, useLayoutEffect, useRef, useState, useEffect } from 'react';
import Quagga, { QuaggaJSResultObject } from "@ericblade/quagga2";
import * as React from 'react';
import { faGear } from "@fortawesome/free-solid-svg-icons";
import { FontAwesomeIcon } from "@fortawesome/react-fontawesome";
import Button from "react-bootstrap/esm/Button";
import ButtonGroup from "react-bootstrap/esm/ButtonGroup";
import ToggleButton from "react-bootstrap/esm/ToggleButton";

function getMedian(arr: any[]) {
  arr.sort((a, b) => a - b);
  const half = Math.floor(arr.length / 2);
  if (arr.length % 2 === 1) return arr[half];
  return (arr[half - 1] + arr[half]) / 2;
}

function getMedianOfCodeErrors(decodedCodes: { error?: number; code: number; start: number; end: number; }[]) {
  const errors = decodedCodes.filter((x) => x.error !== undefined).map((x) => x.error);
  return getMedian(errors);
}

function InnerScanner(props: { cameraId: string; onDetected: (result: string) => void; scannerRef: React.MutableRefObject<any>; }): null {
  const errorCheck = useCallback(
    (result) => {
      if (!props.onDetected) return;
      const err = getMedianOfCodeErrors(result.codeResult.decodedCodes);
      if (err < 0.25) props.onDetected(result.codeResult.code);
    },
    [props]
  );

  const handleProcessed = (result: QuaggaJSResultObject) => {
    const drawingCtx = Quagga.canvas.ctx.overlay;
    drawingCtx.font = "18px Arial";
    if (result) {
      if (result.box)
        Quagga.ImageDebug.drawPath(result.box, { x: 0, y: 1 }, drawingCtx, { color: "white", lineWidth: 2 });
      if (result.codeResult && result.codeResult.code)
        drawingCtx.fillText(
          result.codeResult.code,
          result.box[0][0],
          result.box[0][1] + 25
        );
    }
  };

  useLayoutEffect(() => {
    Quagga.init({
      inputStream: {
        constraints: {
          width: props.scannerRef.current.offsetWidth,
          height: props.scannerRef.current.offsetHeight,
          ...(props.cameraId ? { deviceId: props.cameraId } : { facingMode: "environment" })
        },
        target: props.scannerRef.current,
      },
      decoder: { readers: ["upc_reader", "ean_reader"] },
    },
    (err) => {
      Quagga.onProcessed(handleProcessed);
      if (err) return console.log("Error starting Quagga:", err);
      Quagga.start();
    });

    Quagga.onDetected(errorCheck);
    return () => {
      Quagga.offDetected(errorCheck);
      Quagga.offProcessed(handleProcessed);
      Quagga.stop();
    };
  }, [errorCheck, props.cameraId, props.scannerRef]);

  return null;
}

export default function Scanner(props: { onResult: (result: string) => void, active?: boolean }) {
  const [devices, setDevices] = useState<MediaDeviceInfo[] | null>(null);
  const [cameraId, setCameraId] = useState<string | null>(localStorage.getItem("cameraId"));
  const [showOptions, setShowOptions] = useState(false);
  const [scanning, setScanning] = useState(false);
  const [mediaPerms, setMediaPerms] = useState(false);
  const scannerRef = useRef(null);

  useEffect(() => {
    navigator.mediaDevices.getUserMedia({ video: true }).then((e) => {
      setMediaPerms(true);
    });
  }, []);

  if (!devices && mediaPerms) {
    Quagga.CameraAccess.enumerateVideoDevices().then((results) => setDevices(results));
  }

  useEffect(() => {
    if (!cameraId) setShowOptions(true);
    if (cameraId) setScanning(true);
  }, [cameraId]);

  const handleResult = (result: string) => {
    setScanning(false);
    props.onResult(result);
  };

  const handleSelect = (e: React.FormEvent<HTMLInputElement>) => {
    localStorage.setItem("cameraId", e.currentTarget.value);
    setCameraId(e.currentTarget.value);
    setScanning(true);
  };

  return (
    <div className="ratio ratio-16x9">
      <div>
        <div style={{ position: "absolute", top: "0px", bottom: "0px", left: "0px", right: "0px" }} ref={scannerRef}>
          <canvas className="drawingBuffer" style={{ position: "absolute", top: "0px", bottom: "0px", left: "0px", right: "0px" }} />
          {(props.active ?? true) && scanning && cameraId ? (<InnerScanner cameraId={cameraId} scannerRef={scannerRef} onDetected={handleResult} />) : null}
        </div>
        {showOptions ? <div style={{ position: "absolute", top: "20px", bottom: "20px", right: "0px", width: "100%", height: "100%" }} className="justify-content-center">
          <ButtonGroup vertical className="d-grid">
            {devices ? devices.map(device => (
              <ToggleButton 
                key={device.deviceId}
                id={`radio-${device.deviceId}`}
                type='radio'
                variant="outline-primary"
                name="radio"
                value={device.deviceId}
                checked={cameraId === device.deviceId}
                onChange={handleSelect}>
                  {device.label}
                </ToggleButton>
            )) : null}
          </ButtonGroup>
        </div> : null}
        <Button variant="link" onClick={() => setShowOptions(!showOptions)} style={{ zIndex: "1" }} className="position-absolute top-0 end-0"><FontAwesomeIcon icon={faGear} /></Button>
      </div>
    </div>
  );
}
ericblade commented 2 years ago

OK, that does basically look like you're using a variant of my React demo, so I'd also expect that my application would fail. I'm going to ping my users and see if anyone has an iPhone 12 Pro Max.

VVlasy commented 2 years ago

I would also like to note that my users with the iPhone 13 Pro are having this issue with scanning. I will also try adjusting some settings, but thought to let you know that it's definitely an issue.

RobertDickey commented 2 years ago

I don't know if this helps, but I love your fork and use it for hundreds of thousands of scans a day. One thing I've noticed, particularly on the newest iPhone or iPad devices, is that the camera sensors are changing from individual sensors to an all-in-one FaceTime camera called "Center Stage". It seems like this sensor is a fully wide-angle sensor that uses software to crop, and the autofocus seems to be almost constantly looking for faces rather than allowing it to focus on a barcode. I don't know how accurate that is, but it's particularly bad on the 13 Pro and Pro Max, and the latest iPad mini release. I've managed to tweak the focus a little bit to let it work for about 30-40 scans... but then it randomly crashes and requires a refresh of the page, or sometimes closing and reopening the browser. Very interesting. Let me know if this sounds plausible.

ericblade commented 2 years ago

@RobertDickey Interesting that you seem to be having success with scanning at least, while others are hitting some sort of software failure that seems to be unique to the device. Considering that Apple drastically changed the meaning of the aspectRatio parameter a couple of major releases ago, it wouldn't surprise me if they are making a lot of undocumented changes that we now need to figure out how to work with. Unfortunately, just buying high-end Apple devices to have something to debug with is rather cost-prohibitive. :|

My first thought on that would be to try to tell it very specific constraints about what you want. The more you leave up to the OS to decide, the more likely you are to get something that makes it rather difficult to work with.
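Something like this is roughly what I mean by "very specific" -- example values only, not a recommendation:

```ts
// Spell out everything instead of letting the OS pick -- values here are just examples.
const explicitConstraints: MediaTrackConstraints = {
  width: { ideal: 1280 },
  height: { ideal: 720 },
  aspectRatio: { ideal: 16 / 9 },
  facingMode: { ideal: 'environment' },
};
```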

Apple may have implemented (or maybe should implement) some way of turning off face detection, or manipulating focus. There are proposals for manipulating auto-focus in JavaScript, but as far as I'm aware, none have been implemented anywhere yet. Even if/when they do get implemented in Chromium, Apple's on-device WebKit tends to be months or years behind on some things, if they decide to expose it to users at all.
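If/when one of those proposals does land, I'd expect usage to look very roughly like this -- treat `focusMode` as a guess at the proposed constraint name, not something you can rely on in any shipping browser today:

```ts
// Purely speculative sketch: applyConstraints() is real, but 'focusMode' is only a
// proposed constraint and the published typings don't know about it, hence the loose type.
type FocusConstraints = MediaTrackConstraints & { focusMode?: string };

async function tryContinuousFocus(stream: MediaStream): Promise<void> {
  const [track] = stream.getVideoTracks();
  const wanted: FocusConstraints = { focusMode: 'continuous' };
  try {
    await track.applyConstraints(wanted);
  } catch (err) {
    console.warn('focusMode not honored on this browser/device', err);
  }
}
```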

Also glad to hear of people outside of Grocy making extensive use of it :)

@VVlasy are you able to confirm that you're also hitting some sort of initialization error?

@all I am definitely concerned about this, especially since, if there is an OS-wide issue on Apple, it will eventually affect a lot more users than just the subset of "12 Pro Max" users. Apple's processes, changes, and so on are not exactly easy to see into -- whereas we can pretty much see everything in Chromium if we have to dig down that far, iOS layers a lot of question marks all around their public WebKit source. As with the aspectRatio behavior change, there were no corresponding changes that I could find in WebKit that would cause an invalid aspectRatio to suddenly produce valid but wildly undesired results. (For those not familiar: a couple of iOS versions ago, passing an aspectRatio that makes no sense, like 9999, stopped giving you the maximum sensible aspect ratio; instead it would try to give you what you asked for, resulting in visuals that were completely broken.)

... but I'm going to need help from developers who know how to debug on device to be able to track down exactly what the source of `TypeError: null is not an object (evaluating 'n.x')` is. And hopefully that will at least get us on the road to figuring out any further issues.
justanthonylee commented 2 years ago

I have a 13 Pro Max; I suspect it's due to the main camera being used, and they now use a third camera as a macro or close-up camera. I was also looking for ways to switch, but the device list only shows the back and front cameras.

RobertDickey commented 2 years ago

My iPhone 12 mini works just fine for scanning. So does my M1 iPad Pro. The new iPad mini is unreliable, but I think that is a separate issue.

@anthonyrossbach @Lordodson A hack I've used was to change the resolution in the code to exclude certain cameras. I wonder if you can set min/max values to only use the camera you want. Maybe it can skip whichever camera is causing it to fail and at least let you use it. I have mine set to a weird solution of 900/600 and a 1:1 aspect ratio (roughly like the sketch below).
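For what it's worth, the shape of what I have is roughly this -- treat the numbers as a starting point, not a recommendation:

```ts
// Roughly the 900/600 + 1:1 setup mentioned above.
const cameraConstraints: MediaTrackConstraints = {
  width: { ideal: 900 },   // add min/max here if you want to steer away from certain cameras
  height: { ideal: 600 },
  aspectRatio: { ideal: 1 },
  facingMode: 'environment', // or a specific deviceId
};
```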

@ericblade I spoke to a Chromium dev about the possibility of adding more camera control. From what they knew, there were no intentions to add any additional controls anytime soon; if anything, they may become more restricted and forced into an app setting for security purposes. That was a little discouraging. I doubt that will happen anytime soon either, though... but I have noticed Apple changing the user agent to hide what type of device is browsing, for privacy reasons, which sometimes breaks any code used to set a specific resolution per device.

Just for a note: we use Apple MDM. I don't think that changes the camera at all, but it's something to note.

Lordodson commented 2 years ago

@RobertDickey I'm using the iPhone 13 Pro Max. I'll see if I can modify the resolution. It seems to be using the wide lens, which makes it too blurry at its closest focusing distance. If we could adjust the resolution to make it use the telephoto lens, it might work. But we also discovered there might be a bug with Center Stage, or with the sensor it uses for Center Stage, that makes it look for human faces instead of barcodes.

brainytwoo commented 2 years ago

I was working with @Lordodson today and we figured out that we could change up the constraints a bit more and force the iPhone 13 Pro Max to use the camera we wanted. We removed the width and height constraints and instead use a mask and center the video in it. This cleaned up the video feed on the iPhone 13 Pro Max specifically. The last thing we did, which probably fixed everything, was specifying an aspect ratio of 1. It all seems to work now.

inputStream: {
  constraints: {
    aspectRatio: { ideal: 1 },
    ...(cameraId === 'default' ? { facingMode: 'environment' } : { deviceId: cameraId })
  },
  target: containerRef.current,
}
ericblade commented 2 years ago

That's actually probably a pretty good solution -- I wonder what the results are like if you use 1.333 (4:3) or 1.777 (16:9) and just drop the resolution constraints entirely. I also wonder if that whole approach works similarly on Android, too.

RobertDickey commented 2 years ago

@Lordodson @brainytwoo Glad you guys are making some progress. The resolution I use seems to work for all iOS-related cameras, but I'm still having issues on the new iPad mini. Is there a way to force the camera to reload completely? It's like the focus gets stuck. What is everyone using as a solution for the new Androids? I've actually mandated that people use iOS devices, because users were having to select a second camera on newer Androids. Is there a way to automatically find the correct one? I haven't spent a ton of time on it.

@ericblade I've got two employees with a 13 Pro and a 13 Pro Max, so I'll try to get some test time in today. I'll pick up a new one over the weekend, since it seems like this will affect future phone versions. I'll run it in debug mode in Safari and post the results. Hopefully there is an elegant solution.

ericblade commented 2 years ago

It's Quagga-adjacent, but if there's some way to tell which camera you actually get when you put in different constraints on Android, it wouldn't be too difficult to write a quick web app that would just let us play with various constraints and see what gets returned. One of the things that annoys me is that you can't supply constraints to enumerateDevices. Something like the sketch below is roughly what I have in mind.
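A rough sketch of that idea -- open a stream with whatever constraints you want to test, then read back what the browser actually settled on:

```ts
// getSettings() reports the deviceId, width, height, and aspectRatio the browser
// actually picked for the constraints you handed it.
async function probeCamera(constraints: MediaTrackConstraints): Promise<MediaTrackSettings> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: constraints });
  const [track] = stream.getVideoTracks();
  const settings = track.getSettings();
  track.stop(); // release the camera again
  return settings;
}

// e.g. probeCamera({ aspectRatio: { ideal: 4 / 3 } }).then(console.log);
```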

In the context of what we discovered when Apple made the aspect ratio changes, the picture starts to become a little clearer now (har har) -- old examples of Quagga used to use an aspectRatio of min 1, max 99 or some such, which doesn't make sense in any context, but suddenly Apple started honoring it. So I suspect that if you don't specify something that actually expresses your desire for a traditional aspect ratio, you're going to end up with whatever it comes up with, probably on the assumption that the biggest, highest-end thing is what you want.

I do wonder whether specifying in the constraints that you want an aspectRatio of 1, 1.333, or 1.777 would end up with something that works on iOS, as well as selecting the 'correct' camera automatically on Android.

RobertDickey commented 2 years ago

@Shefali-Upadhyay Is this still an issue for you? After changing to the resolutions I posted above, I've tested several Pro Max devices in the wild and have no issues. I am still unable to get stability on the new iPad mini with Center Stage, though.

shefaliupadhyay commented 2 years ago

Actually, the project has been handed over to another company, so I was not able to check whether the resolution change works or not. But thanks a lot for the quick resolution.

argia-andreas commented 2 years ago

I can also confirm that setting the aspectRatio to 1 as a constraint got the camera selection/autofocus to work on an iOS Pro device we tested with.

erikmillergalow commented 2 years ago

Anyone have any other ideas to resolve this? I'm still unable to scan on my iPhone 13 mini after fiddling with aspectRatio/facingMode.

macrozone commented 1 year ago

Also works if I set aspectRatio to 1; unfortunately this is missing in the TypeScript types.

ericblade commented 1 year ago

Also works if I set aspectRatio to 1; unfortunately this is missing in the TypeScript types.

So, the constraints are a bit more complex a type than just supplying "aspectRatio: 1"...

The constraints object has changed a LOT over time in the browsers, and so the browsers will understand a lot of malformed or older versions.

The constraints object should look something like this:

{
    video: {
        aspectRatio: {
            ideal: 1920 / 1080,
        }
    }
}
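In Quagga's config that nests under inputStream.constraints. Until the published typings pick up aspectRatio, a cast should get it past the compiler -- sketch below, and the exact cast you need may differ depending on which version of the types you're on:

```ts
import Quagga from '@ericblade/quagga2';

Quagga.init({
  inputStream: {
    constraints: {
      facingMode: 'environment',
      aspectRatio: { ideal: 1 },
      // cast because aspectRatio is missing from the published constraint typings;
      // at runtime the browser just receives a normal MediaTrackConstraints object
    } as MediaTrackConstraints,
    target: document.querySelector('#scanner') as HTMLElement, // '#scanner' is a placeholder
  },
  decoder: { readers: ['upc_reader', 'ean_reader'] },
}, (err) => {
  if (err) return console.error('Error starting Quagga:', err);
  Quagga.start();
});
```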