Hello,
I am trying to run CIFAR-10 classification in real time on hardware (PYNQ-Z2): capture a frame from a USB webcam, classify it, and pass it to HDMI out (a monitor). For this, I took the PYNQ base overlay (which has HDMI-in and HDMI-out) and manually added the CNV IP core, i.e. BlackBoxJam_0, to it, so that I can generate my own overlay with both CNV and HDMI using the Vivado IP Integrator.
The block design builds successfully and generates a bitstream. I have uploaded the .bit, .tcl, and .hwh files to the Jupyter notebook environment, and the hardware .so file is also generated successfully. I then capture images from the USB camera and classify each frame using bnn.CnvClassifier, but the Jupyter notebook kernel dies every time I apply bnn.CnvClassifier to a frame.
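To make this concrete, here is a simplified sketch of what I run in the notebook. The bitstream path is a placeholder for my generated overlay, the bnn calls follow my reading of the BNN-PYNQ Cifar10 notebook (the exact API may differ between BNN-PYNQ versions), and the HDMI-out part is omitted here for brevity:

```python
import cv2
import bnn
from PIL import Image
from pynq import Overlay

# Load my custom overlay generated from the modified base design
# (placeholder file name; the .hwh/.tcl sit next to the .bit)
overlay = Overlay("/home/xilinx/jupyter_notebooks/cnv_hdmi.bit")

# Instantiate the CIFAR-10 CNV classifier with the hardware runtime,
# as in the BNN-PYNQ Cifar10 example notebook
classifier = bnn.CnvClassifier(bnn.NETWORK_CNVW1A1, "cifar10", bnn.RUNTIME_HW)

# Grab one frame from the USB webcam
cap = cv2.VideoCapture(0)
ret, frame = cap.read()
cap.release()

if ret:
    # Convert the OpenCV BGR array to a PIL RGB image for the classifier
    im = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    class_idx = classifier.classify_image(im)   # <-- the kernel dies around here
    print("Class name:", classifier.class_name(class_idx))
```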
Is this the correct way to manually add the BlackBoxJam_0 IP? Please give suggestions.
The IP block design is attached below: