ganeshv / egarim

Lenovo Mirage VR180 Camera API client - remote control and custom live streaming
MIT License

Error #2

Closed m6m112 closed 3 years ago

m6m112 commented 4 years ago

Hello. I'm a student studying your project.

I typed " python3 bluestrap.py pair " in terminal. The result was an error, "cannot unpack non-iterable NoneType object". I don't know why this error is happening. need your helps _(

ganeshv commented 4 years ago

Unfortunately, I've lost access to both the laptop and the camera due to the covid lockdown, so there's little I can do to help. Can you paste all the messages that appear on the screen? Do you have the camera running and in pairing mode?

m6m112 commented 4 years ago

I've just solved the problem. Actually, the problem was not about the camera mode; I had forgotten to turn Bluetooth on in Ubuntu. Thank you for the response. :)

However, I have another question. How do you make a connection between the camera and the laptop? I have my camera in pairing mode, but I cannot connect it to other devices, including my laptop and phone. My devices detect the camera but cannot connect to it. I found out that the camera only accepts connections from the 'VR180' application. I need the camera-laptop connection for my project. Please let me know if you have any ideas. Thank you very much.

ganeshv commented 4 years ago

I think the README has all the necessary instructions, can you check again please?

There are two modes of communicating with the camera:

  1. Pairing, which happens only over Bluetooth. Once pairing succeeds, the file me_cam.skey is generated; this is a shared key used for all further communication.
  2. API calls, which can be made over Bluetooth or wifi (the latter is preferred). API calls need to be signed with the shared key generated above. Note that you can issue API calls from any laptop (not just the one that did the pairing), as long as you have the me_cam.skey file. Phones are going to be difficult, as they would need to run this Python program.

The typical sequence of operations is: first pair over Bluetooth (bluestrap.py pair), then set the wifi access point for the camera using bluestrap.py config_wifi, then get the IP address using bluestrap.py status. This caches the IP address in a local file (~/.egarim-status). After that, you should use egarim.py status or other API calls. The sequence is sketched below.
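A rough sketch of that sequence as terminal commands (the arguments to config_wifi are not shown in this thread, so they are left out here; see the README for the exact invocation):

```sh
# One-time pairing over Bluetooth (camera must be in pairing mode);
# this generates the me_cam.skey shared key file.
python3 bluestrap.py pair

# Tell the camera which wifi access point to join
# (arguments omitted -- see the README).
python3 bluestrap.py config_wifi ...

# Query status over Bluetooth; this reports the camera's IP address
# and caches it in ~/.egarim-status.
python3 bluestrap.py status

# From here on, API calls go over wifi, signed with me_cam.skey.
python3 egarim.py status
```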

If you can provide a brief idea about your project, I can tell you whether it is likely to work with this code.

Did pairing work? Did you manage to successfully run all the commands listed under the Pairing and Bluetooth API calls sections of the README?

m6m112 commented 4 years ago

I'm making a VR platform that connects my VR headset and a robot in real time. When I use the VR headset, I am connected to the robot and the robot moves just like I do. The camera acts as the eye of the robot: it sends video of the space the robot is looking at to my VR headset in real time.

As you can see in the screenshots attached below, I tried to pair, but my laptop couldn't find the camera even though it is in pairing mode. I'm stuck here, so I haven't run any other commands. Thank you for the response. :)

[screenshots 111 and 222 attached]

ganeshv commented 4 years ago

Please check that the camera is in pairing mode before running the bluestrap.py pair command. Power on the camera. Once the shutter LED is in the solid blue state, hold the shutter button down for 5 seconds, until the LED starts flashing alternately green and blue - this is the pairing mode.
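Also, since the earlier problem turned out to be Bluetooth being switched off on the laptop, it may be worth confirming that the laptop can actually see the camera before running the pair command. A quick check with bluetoothctl (standard on Ubuntu; I can't confirm what name the camera advertises, so just look for a new device appearing while it is in pairing mode):

```sh
# Confirm the adapter is not blocked, then scan interactively.
rfkill list bluetooth    # "Soft blocked" and "Hard blocked" should both be "no"
bluetoothctl             # opens an interactive prompt, then type:
#   power on
#   scan on              # wait a few seconds with the camera in pairing mode
#   devices              # a new entry should appear for the camera
```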

I did not get the deprecation messages, perhaps your Ubuntu is a newer version - mine was 18.04.

As for your use case - I had pretty much the same idea, telepresence robot with VR (Oculus Quest + Roomba). However, there are several issues.

  1. High resolution livestreaming has latency issues. In the best case the latency is at least 500 ms, and usually it is more than 2000 ms. Camera -> wifi router -> livestreaming server -> wifi router -> Oculus is a long path, and livestreaming servers add considerable delays.
  2. There is a low latency WebRTC "viewfinder" mode. But that is not stereo (the API allows stereo, but the camera doesn't implement it) and the resolution is low. I could get this as a live feed within a webpage in the Oculus browser, but the video quality was quite poor.

m6m112 commented 4 years ago

Thank you for your help. I finished my project. But, like you said, the data transmission has considerable delays, so I will do a new experiment related to that. See you some day :)