jzymessi opened this issue 2 years ago
VITIS-AI is already integrated with XRM. Would you let us know:
@bttu Hi:
1. I got Vitis-AI 2.5 from GitHub, and I have already built the Vitis AI docker.
2. I use a VCK5000, and I can run the Vitis-AI-Library demo on it.
3. I run on Ubuntu 18.04.
Thanks!
@jzymessi have you run the application on the host (instead of inside docker)?
To run Vitis AI with XRM on the host:
1) build and install the XRM package on the host; after installation, the XRM daemon will be started on the host;
2) follow the Vitis AI documentation to run it.
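As a rough illustration of step 1), the build-and-install flow on the host looks roughly like the sketch below. It assumes the Xilinx/XRM GitHub repo's top-level build.sh and a Debian-based host such as Ubuntu 18.04; the script and the generated package path may differ between releases.

```bash
# Sketch of installing XRM on the host (assumes Xilinx/XRM's build.sh and a
# Debian-based system; the generated package name/path may vary by release).
git clone https://github.com/Xilinx/XRM.git
cd XRM
./build.sh                              # build XRM and generate an installable package
sudo apt install ./Release/xrm_*.deb    # install the generated .deb (path/name may vary)
sudo systemctl status xrmd              # the XRM daemon should be running after installation
```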
If you want to run Vitis AI in docker, one solution is to run the XRM daemon on the host and run Vitis AI in docker with a suitable configuration:
1) install XRM on the host;
2) install XRM in docker, and install Vitis AI in docker;
3) start docker like:
    docker run --rm \
        --pid host \
        --network host \
        --device=/dev --privileged \
        -it docker_image bash
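To sanity-check from inside the container that the card is visible, something like the following can be used. This is only a sketch: it assumes XRT is installed in the image at the standard /opt/xilinx/xrt path, and the xbutil sub-command depends on the XRT version.

```bash
# Quick check inside the container (a sketch; sub-commands depend on the XRT version).
source /opt/xilinx/xrt/setup.sh    # standard XRT environment setup path
xbutil examine                     # newer XRT releases; older ones use `xbutil scan`
```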
Is this still an issue?
Hi @bttu:
Thanks for your support.
I haven't tried it recently. I can close this issue, and if there are any further questions, I'll bring them back here.
Hi @bttu, building XRM on the host succeeds, but when I try to build it in docker I get an error:
How can I solve it?
@jzymessi will you assign the FPGA device to the container? If not, then you should not run xrmd in the container.
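If the device does need to be visible inside the container, a narrower alternative to `--device=/dev --privileged` is to pass through only the Xilinx device nodes. This is just a sketch: the renderD/xclmgmt node names below are typical for XRT but hypothetical for any given system, so check the actual nodes on the host first.

```bash
# Sketch: expose only the Xilinx nodes instead of the whole /dev tree.
# renderD<N> is the user function, xclmgmt<id> the management function;
# exact names differ per system (check `ls /dev/dri /dev/xclmgmt*` on the host).
docker run --rm \
    --network host \
    --device /dev/dri/renderD128 \
    --device /dev/xclmgmt256 \
    -it docker_image bash
```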
If you want to run xrmd in the container, then you will need to download the systemctl replacement (https://github.com/gdraheim/docker-systemctl-replacement/blob/master/files/docker/systemctl.py) to start xrmd in the container, since systemctl is disabled in containers:
./systemctl.py start xrmd
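Putting that together, a minimal sketch of fetching the replacement and starting xrmd inside the container (XRM itself must already be installed in the image; the raw-file URL is just the raw.githubusercontent.com form of the blob link above, so verify it before use):

```bash
# Sketch: download the systemctl replacement and start/verify the XRM daemon.
wget https://raw.githubusercontent.com/gdraheim/docker-systemctl-replacement/master/files/docker/systemctl.py
chmod +x systemctl.py
./systemctl.py start xrmd     # start the XRM daemon inside the container
./systemctl.py status xrmd    # confirm it is running
```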
@bttu :
I'm a little confused right now. I have successfully installed XRM and run its tests on the host.
I have two VCK5000 cards. Now I need to run different models on the two cards through Vitis_ai_Library in Docker. What should I do?