High-performance, cross-platform inference engine. You can run Anakin on x86 CPU, ARM, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices.
532 stars · 135 forks
Can we build and run Anakin with MKLDNN on CPU for CNN models? #513
Open
avinashcpandey opened 5 years ago
We have integrated MKLDNN in our latest code, but MKLDNN ops will NOT be used when running a model; our own JIT code is used instead. MKLDNN ops will be enabled in a future version of the code.