YonghaoHe / LFFD-A-Light-and-Fast-Face-Detector-for-Edge-Devices

A light and fast one-class detection framework for edge devices. We provide a face detector, head detector, pedestrian detector, vehicle detector, and more.
MIT License

How to set FP16 in deploy_tensorrt? #64

Closed: congphase closed this issue 3 years ago

congphase commented 4 years ago

Hello sir,

I want to try FP16 instead of the original FP32 in deploy_tensorrt to reduce inference time. How can I do that? Do I first change numpy.float32 to numpy.float16 in this line: https://github.com/YonghaoHe/A-Light-and-Fast-Face-Detector-for-Edge-Devices/blob/c33d364ea01a1fb2b1adf6c590eeb79c4b72200d/face_detection/deploy_tensorrt/to_onnx.py#L28 and then change every occurrence of "float32" to "float16" in predict_tensorrt.py before running it?
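
For context: with the TensorRT Python API, FP16 is usually enabled at engine build time rather than by changing the dtype used for the ONNX export; the ONNX model itself can stay in FP32. Below is a minimal sketch assuming a TensorRT 7.x-style Python API; `build_fp16_engine` and `model.onnx` are placeholder names for illustration, not code from this repository.

```python
# Hedged sketch: enabling FP16 when building a TensorRT engine from an ONNX file.
# Assumes a TensorRT 7.x-style Python API; the ONNX model produced by to_onnx.py
# can remain FP32, and TensorRT selects FP16 kernels internally.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_fp16_engine(onnx_path):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # Parse the FP32 ONNX model; the weights do not need to be cast to
    # float16 beforehand.
    with open(onnx_path, 'rb') as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30  # 1 GB of builder scratch space
    if builder.platform_has_fast_fp16:
        # Ask TensorRT to use FP16 kernels wherever it can.
        config.set_flag(trt.BuilderFlag.FP16)

    return builder.build_engine(network, config)

engine = build_fp16_engine('model.onnx')  # placeholder path
```

With FP16 enabled this way, the engine's input and output bindings typically remain FP32, so the numpy.float32 host buffers in predict_tensorrt.py can usually stay unchanged; TensorRT inserts the necessary casts internally.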

congphase commented 4 years ago

@YonghaoHe Can you help me with this, sir? I would really appreciate it.