Closed: adeebshihadeh closed this issue 1 year ago
I took a swing at MetaDrive. I can only get about 10 FPS on WSL2 when copying the panda3d screenshot to VisionIPC. Maybe I will try running it natively.
I added this function to the metadrive lib, converted the returned screenshot to YUV format, and reused the CARLA bridge code.
```python
def get_screenshot(self):
    if self.engine.episode_step <= 1:
        self.engine.graphicsEngine.renderFrame()
    origin_img = self.cam.node().getDisplayRegion(0).getScreenshot()
    return origin_img
```
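The YUV conversion mentioned above could be sketched like this. This is a minimal NumPy version assuming an HxWx3 uint8 RGB array and an NV12-style layout (full Y plane followed by interleaved, 2x2-subsampled UV); the actual bridge code in openpilot may use different coefficients or a different buffer layout.

```python
import numpy as np

def rgb_to_nv12(rgb):
    """Convert an HxWx3 uint8 RGB image to an NV12-style buffer
    (Y plane, then interleaved UV). BT.601 coefficients; a sketch,
    not the exact openpilot bridge implementation."""
    h, w, _ = rgb.shape
    r = rgb[:, :, 0].astype(np.float32)
    g = rgb[:, :, 1].astype(np.float32)
    b = rgb[:, :, 2].astype(np.float32)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128
    # 2x2 subsample chroma, then interleave U and V per NV12
    u_sub = u.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    v_sub = v.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    uv = np.empty((h // 2, w), dtype=np.uint8)
    uv[:, 0::2] = np.clip(u_sub, 0, 255).astype(np.uint8)
    uv[:, 1::2] = np.clip(v_sub, 0, 255).astype(np.uint8)
    y_plane = np.clip(y, 0, 255).astype(np.uint8)
    return np.concatenate([y_plane.reshape(-1), uv.reshape(-1)])
```

Doing this per frame on the CPU is part of why the screenshot path is slow; a GPU-side conversion would avoid most of the cost.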
MetaDrive is missing IMU sensor inputs, so those would need to be added eventually. MetaDrive also has a few compatibility issues on macOS.
Here is a little video: https://youtu.be/P-ti8c1esiY. I could probably make the performance stable if I work on it a bit more. I had to make a few changes to OP to get it running without segfaulting. I whipped this code together quickly so it's pretty rotten.
Cool! No accelerometer/gyro should be fine, with changes to the localizer.
I found out there's a fork of TORCS that can be controlled with CAN messages, which could be interesting (albeit it probably doesn't make sense in the context of this bounty since it wouldn't already be in package repos): https://github.com/epozzobon/torcs-1.3.7
So I ended up getting pretty stable performance with MetaDrive (400x300x3x1 @ 20 FPS). One command to launch! However, I am having other issues getting the model to run reliably: I am getting an assert error in onnx_runner.py, and it only works about 10% of the time.
```python
import os
import sys
import numpy as np

def read(sz, tf8=False):
    dd = []  # data chunks
    gt = 0   # bytes read so far
    szof = 1 if tf8 else 4  # bytes per element (uint8 vs float32)
    print("reading %d bytes" % (sz * szof), file=sys.stderr)
    while gt < sz * szof:  # read until we have enough
        st = os.read(0, sz * szof - gt)
        assert len(st) > 0  # we fail here 90% of the time reading stdout?? dup2(pipeout[1], 1); ??
        dd.append(st)
        gt += len(st)
    r = np.frombuffer(b''.join(dd), dtype=np.uint8 if tf8 else np.float32)
    if tf8:
        r = r / 255.
    return r
```
I also tried to compile OP with --pc-thneed --snpe, but the model drops many frames; it takes about 800 ms to process a frame. Not sure what I'm doing wrong.
Running on Windows 11 WSL2 (Ubuntu 20.04).
Edit: I fixed the ONNX issue; I needed to import torch. I basically have it working, and I am going to polish it up for the first PR.
I have it working at 1928x1208 on a Ryzen 6850H (laptop) iGPU:
Can definitely go much faster if you have a decent GPU.
@MoreTore could you post your updated results on YouTube again? I think we should optimize for better performance on lower-spec devices (e.g. iGPUs).
I ensured that we are getting a similar field of view to the comma device cameras and that we are running at the intended resolution, 1928x1208.
I should have unit tests and PR by EOD, e2e tests per https://github.com/commaai/openpilot/issues/26215 by tmr.
You can't really get high resolution on a low-spec device with the current game engine. We need to copy the VRAM to the IPC, do a conversion, then put it back for inference and display. This takes a long time and there is no way around it with the current game engine API. I am using an RTX 3070 Ti with a 12700H, and I get around 30 FPS sending the wide and road cams at high resolution. Since you already got the PR in, I won't keep going.
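A back-of-envelope number makes the readback cost above concrete. Assuming uncompressed RGB frames at the stated resolution (actual buffers may use stride padding or YUV, so treat this as an estimate):

```python
# Rough cost of the VRAM -> CPU readback path for two cameras.
W, H, BPP = 1928, 1208, 3            # camera resolution, RGB bytes per pixel
frame_bytes = W * H * BPP            # ~7 MB per frame
readback_bps = frame_bytes * 2 * 30  # two cameras at 30 FPS
print(f"{frame_bytes / 1e6:.1f} MB/frame, {readback_bps / 1e6:.0f} MB/s")
```

Roughly 7 MB per frame, or ~420 MB/s sustained for two cameras at 30 FPS, all of it crossing the PCIe/readback path plus a format conversion: that is why high resolution is hard without a strong GPU.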
https://github.com/metadriverse/metadrive/issues/290#issuecomment-1432996599
@jon-chuang I just tested your branch and on my PC I only get 5 fps? What am I missing?
> What am I missing?
Some settings changes. I will push the update soon. It's running smoothly at 33 and 50 frames per second (3 and 2 ticks_per_frame).
It is running slower than expected on my laptop with an iGPU (around 10 FPS), but if you have a proper GPU you should be able to set ticks_per_frame=2 or 3.
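The quoted frame rates are consistent with rendering one frame per `ticks_per_frame` physics steps at a ~100 Hz physics rate (an assumption inferred from the numbers, not a documented MetaDrive constant):

```python
PHYSICS_HZ = 100  # assumed physics step rate; matches the quoted FPS numbers

def fps_for(ticks_per_frame):
    # one rendered frame every `ticks_per_frame` physics steps
    return PHYSICS_HZ / ticks_per_frame
```

So ticks_per_frame=2 gives 50 FPS and ticks_per_frame=3 gives ~33 FPS, matching the numbers above; a weaker GPU falls behind the physics clock and lands nearer 10 FPS.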
no problem, let me know how it goes.
> let me know how it goes.
Seems to work at 3 tpf. The bridge terminal control is really bad, though; we probably shouldn't use it for MetaDrive.
> The bridge terminal control is really bad.
Could you elaborate?
Make it like this https://youtu.be/ic1IKOWRzwQ
Yeah, if you're talking about the manual controls, I fully agree. You can't hold W while steering with A/D.
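The W+A limitation comes from reading one character at a time from a blocking terminal. Polling a set of currently-held keys (which game-window input like MetaDrive's provides) supports chords naturally. A minimal pure-Python illustration of the mapping (names and key set are hypothetical, not the bridge's API):

```python
def controls_from_keys(held):
    """Map a set of currently-held keys to (throttle, steer) in [-1, 1].
    Unlike blocking terminal reads, which deliver one character at a time,
    a polled held-key set supports chords like W+A."""
    throttle = (1.0 if 'w' in held else 0.0) - (1.0 if 's' in held else 0.0)
    steer = (1.0 if 'd' in held else 0.0) - (1.0 if 'a' in held else 0.0)
    return throttle, steer
```

With terminal input, `held` can only ever contain one key, so throttle and steer are mutually exclusive; with window-event polling both can be nonzero at once.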
Use MetaDrive for the inputs; it's already configured, and I made it very simple to do. As a side effect of how I did it, you can run openpilot on the comma device (or any openpilot env) and MetaDrive on a separate PC.
Anyway, I made this branch to make modifying MetaDrive in openpilot easier. Give it a try: https://github.com/MoreTore/openpilot/tree/meta
I am curious how it will perform on other PCs
Simple to use:
1. Install openpilot like normal.
2. `pip install torch`
3. From the openpilot directory: `./tools/streamer/launch_openpilot.sh`
4. Press `h` on the game window for controls.
EDIT: I also fixed the controls lagging issue and the comms issue.
There is another simulator I was looking into. It should be available on Linux in a few months. It's high fidelity and it has a shared GPU memory feature built in. I am talking about BeamNG.drive: https://beamng.tech/ Here is a sample script to get camera streams: stream.zip
There is also this sim: https://www.youtube.com/watch?v=Ucr0aM334_k. It is no longer maintained, but the community forks are still very active. I believe there are commercial restrictions, but OP is open source so I think it's OK to use.
IMO these are what high-quality sims look like.
@adeebshihadeh Care to try this? I cleaned it up a bit. https://github.com/commaai/openpilot/pull/27892
Clone openpilot, run the setup scripts, then run a single command to get dropped into the openpilot UI with it driving in the simulator. This will be the new default way to run openpilot on PC. Ideally, this experience runs on something like a recent MacBook Air (though fixing up macOS support in openpilot is a separate project).
Some possible simulator options:
It's best to do this in small chunks. A good first PR proposes a simulator with a small POC. The bounty will be considered locked after that first PR is merged.