Xilinx / Vitis-AI

Vitis AI is Xilinx’s development stack for AI inference on Xilinx hardware platforms, including both edge devices and Alveo cards.
https://www.xilinx.com/ai
Apache License 2.0

Can Vitis-AI support AWS F1? #2

Open zyxcambridge opened 4 years ago

zyxcambridge commented 4 years ago

Please update ML-Suite and provide an AMI.

wilderfield commented 4 years ago

Stay tuned. We will build an AMI shortly.

zyxcambridge commented 4 years ago

XILINX_XRT : /opt/xilinx/xrt
PATH : /opt/xilinx/xrt/bin:/root/miniconda3/bin:/root/miniconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
LD_LIBRARY_PATH : /opt/xilinx/xrt/lib:
PYTHONPATH : /opt/xilinx/xrt/python:
WARNING: The xbutil sub-command flash has been deprecated. Please use the xbmgmt utility with flash sub-command for equivalent functionality.
Card not found!
WARNING: The xbutil sub-command flash has been deprecated. Please use the xbmgmt utility with flash sub-command for equivalent functionality.
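(Note: the deprecation warning and the "Card not found!" message are related. A minimal sketch of how one might re-check the device with the tools XRT now recommends, assuming a default install under /opt/xilinx/xrt; the last command comes from the AWS FPGA SDK, not XRT.)

```sh
# Minimal sketch: re-check the card with the tools XRT now recommends.
# Assumes a default XRT install under /opt/xilinx/xrt.
source /opt/xilinx/xrt/setup.sh

# List the devices XRT can see (newer XRT releases use `xbutil examine`).
xbutil scan

# The warning suggests xbmgmt for flash operations:
sudo xbmgmt flash --scan

# On AWS F1 the management function is owned by AWS, so flash commands
# are expected to fail; the AWS FPGA SDK is used to inspect the slot:
# sudo fpga-describe-local-image -S 0 -H
```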

zyxcambridge commented 4 years ago

Verifying XILINX_XRT
XILINX_XRT : /opt/xilinx/xrt
PATH : /opt/xilinx/xrt/bin:/root/miniconda3/bin:/root/miniconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
LD_LIBRARY_PATH : /opt/xilinx/xrt/lib:
PYTHONPATH : /opt/xilinx/xrt/python:
--- Using System XCLBIN ---
Couldn't find the VAI compiler. Exiting ...

Environment: AWS F1, Ubuntu 16.04.
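(Note: "Couldn't find the VAI compiler" usually means the compiler front-ends are not on PATH. A small sketch of a check, assuming the Vitis AI docker image and its `vitis-ai-tensorflow` conda environment; environment and tool names may differ between releases.)

```sh
# Small check: is the Vitis AI compiler reachable?
# Assumes the Vitis AI docker container with its bundled conda envs.
conda activate vitis-ai-tensorflow
which vai_c_tensorflow \
  || echo "VAI compiler not found - activate a Vitis AI conda env or re-pull the docker image"
```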

imyoungyang commented 4 years ago

Any update on this ticket?

It looks like the XRT runtime tools xbutil and xbmgmt can't communicate with the AWS F1 card.

Second, the demo code only covers the ZCU102, ZCU104, and U50 boards. Could you provide support for the VU9P, i.e. the Virtex UltraScale+ VU9P used by the AWS F1 acceleration platform?
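(Note: on F1 the FPGA is managed through the AWS FPGA SDK rather than xbmgmt, which is why the flash-style commands report no card. A rough sketch of inspecting a slot; the AGFI ID is a placeholder, since no public Vitis-AI overlay for F1 existed at the time.)

```sh
# Rough sketch: inspect the F1 FPGA slot via the AWS FPGA SDK.
git clone https://github.com/aws/aws-fpga.git
source aws-fpga/sdk_setup.sh

# Show what is currently loaded in slot 0.
sudo fpga-describe-local-image -S 0 -H

# Loading an overlay needs an AGFI ID for a DPU build; the ID below is
# a placeholder, not a real image.
# sudo fpga-load-local-image -S 0 -I agfi-xxxxxxxxxxxxxxxxx
```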

rodrigomelo9 commented 4 years ago

Any news? The AMI is not strictly needed, but what about VU9P support? Version 1.2 was recently released without this addition :-(

wilderfield commented 4 years ago

We are looking at enabling AWS EC2 F1 by Vitis AI 1.3

Bryan

rodrigomelo9 commented 4 years ago

Is the idea to support the VU9P, or only to provide an AMI to replace ML-Suite?

acastanedam commented 3 years ago

Hi,

Are there still plans to support AWS's FPGAs? Maybe there is something I am overlooking, but I cannot use it on the VU9P, even with the most recent version. It would be great to be able to use Vitis-AI on AWS.

Regards,

A. Castaneda

HasinduKariyawasam commented 2 years ago

After setting up Vitis AI on an AWS f1.2xlarge instance, I tried to run an example. It includes steps for setting up Alveo accelerator cards, but the f1.2xlarge instance has an UltraScale+ VU9P card. Do I need to set that up, and if so, are there any tutorials or guides on how to do it?

fanz-xlnx commented 2 years ago

To start from an F1 instance, you don't need to worry about the shell or the xclbin, but you do need to install XRT and XRM yourself. The version on AWS is different from the release version; you may refer to the guidance here.
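(Note: in case it helps others, a rough sketch of what "install XRT and XRM yourself" can look like on an Ubuntu instance. The .deb names are placeholders; use the packages the AWS/Vitis-AI documentation points to for your release.)

```sh
# Rough sketch (Ubuntu): install XRT and XRM from prebuilt packages.
# The package names/versions below are placeholders.
sudo apt-get install -y ./xrt_<version>_amd64.deb
sudo apt-get install -y ./xrm_<version>_amd64.deb

# Set up the environment for both.
source /opt/xilinx/xrt/setup.sh
source /opt/xilinx/xrm/setup.sh

# XRM runs as a daemon that the Vitis AI runtime talks to.
sudo systemctl start xrmd
```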

WWWindrunner commented 1 year ago

> To start from an F1 instance, you don't need to worry about the shell or the xclbin, but you do need to install XRT and XRM yourself. The version on AWS is different from the release version; you may refer to the guidance here.

For cloud users, do I need to set the xclbin? If I don't set it, an error is reported. There is an xclbin for AWS, but I don't see one for Aliyun, e.g. https://www.aliyun.com/product/ecs/fpga.
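(Note: on Alveo/cloud targets the Vitis AI runtime is normally pointed at a DPU xclbin explicitly. A minimal sketch, where the path is a placeholder for whatever overlay your provider or the Vitis-AI setup scripts install.)

```sh
# Minimal sketch: tell the Vitis AI runtime (VART) which xclbin to load.
# The path is a placeholder for the DPU overlay installed on the instance.
export XLNX_VART_FIRMWARE=/opt/xilinx/overlaybins/<dpu_name>/dpu.xclbin

# Alternatively, the same path can be set persistently in /etc/vart.conf:
# firmware: /opt/xilinx/overlaybins/<dpu_name>/dpu.xclbin
```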

WWWindrunner commented 1 year ago

> To start from an F1 instance, you don't need to worry about the shell or the xclbin, but you do need to install XRT and XRM yourself. The version on AWS is different from the release version; you may refer to the guidance here.

On AWS, the documentation shows that xbutil can find the AWS platform, but on an Aliyun FPGA cloud server xbutil cannot find the corresponding platform, and the error message says the device cannot be found. Why?
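(Note: for a "device cannot be found" error, the first thing to check is whether the instance exposes a Xilinx PCIe device at all and whether the installed XRT recognises its shell. A small diagnostic sketch; 10ee is the Xilinx PCIe vendor ID.)

```sh
# Small diagnostic sketch for a cloud instance where xbutil sees no device.
lspci -d 10ee:                    # any Xilinx PCIe functions visible?

source /opt/xilinx/xrt/setup.sh
xbutil examine                    # newer XRT; older releases use `xbutil scan`
```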