SokPhanith closed this issue 1 year ago
Hi @SokPhanith, jetson-inference doesn't support loading the mean.binaryproto file - if needed, you can change the average mean pixel value here: https://github.com/dusty-nv/jetson-inference/blob/19ed62150b3e9499bad2ed6be1960dd38002bb7d/c/imageNet.cpp#L449
However, this does not seem necessary if you are still getting 98% accuracy.
Thank you @dusty-nv. Sorry, can you give me an example with some more code to load the mean.binaryproto file?
Sorry, I don't have the code for doing that or for using the per-pixel mean data in the preprocessing.
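For anyone who lands here: jetson-inference doesn't load mean.binaryproto, but the subtraction step itself is simple to sketch. The block below is a minimal illustration (not jetson-inference code) of how a per-channel mean (like `mean_value: 104/117/123`) and a per-pixel mean (the CHW array stored in mean.binaryproto) would each be subtracted from a CHW image; decoding the binaryproto itself needs Caffe's protobuf definitions, which are omitted here, so a constant array stands in for it.

```python
import numpy as np

def subtract_mean(image, mean):
    """Subtract a mean from a CHW image, returning float32.

    `mean` may be per-channel (shape (C, 1, 1)) or per-pixel
    (same CHW shape as the image, like the array stored in a
    Caffe mean.binaryproto); numpy broadcasting covers both.
    """
    return image.astype(np.float32) - np.asarray(mean, dtype=np.float32)

# a dummy 3x224x224 image with all pixels at 128
img = np.full((3, 224, 224), 128, dtype=np.uint8)

# per-channel mean, equivalent to mean_value: 104 / 117 / 123
per_channel = np.array([104, 117, 123], dtype=np.float32).reshape(3, 1, 1)
out = subtract_mean(img, per_channel)
print(out[0, 0, 0], out[1, 0, 0], out[2, 0, 0])  # 24.0 11.0 5.0

# per-pixel mean: stand-in for the CHW array decoded from mean.binaryproto
per_pixel = np.full((3, 224, 224), 120.0, dtype=np.float32)
out2 = subtract_mean(img, per_pixel)
print(out2.mean())  # 8.0
```

To wire something like this into jetson-inference you would need to port the subtraction into the CUDA preprocessing kernel that imageNet.cpp uses, which is beyond this sketch.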
Thank you.
Hello @dusty-nv, I trained an image-classification model with 4 classes on my own custom dataset with Caffe, starting from the files I copied from jetson-inference/build/aarch64/bin/networks/ResNet-18/. 1- First training:
```
transform_param {
  mirror: true
  crop_size: 224
  mean_value: 104
  mean_value: 117
  mean_value: 123
}
```
I built the TensorRT engine and inference worked well. But 2- second training:
```
transform_param {
  mirror: true
  crop_size: 224
  mean_file: "/home/phanith/Desktop/resnet18/board/mean.binaryproto"
}
```

I computed the mean file with the Caffe tool on my custom dataset.
I want to ask: when TensorRT runs image-classification inference with a Caffe model, what does the default transform_param look like? And if I have a mean.binaryproto file computed by the Caffe tool, how can I set its path for inference? I am really not sure whether TensorRT inference needs it or not. By the way, for both trainings I tested on my validation set and got the same accuracy, around 98%.
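Since jetson-inference only supports fixed mean pixel values (the spot in imageNet.cpp linked above), one possible workaround, assuming you can decode the binaryproto into a CHW array with Caffe's own tools, is to collapse the per-pixel mean into one value per channel and use those in place of `mean_value: 104/117/123`. A minimal sketch with a stand-in array:

```python
import numpy as np

# stand-in for the CHW mean array decoded from mean.binaryproto;
# in practice you would fill this from Caffe's BlobProto.
mean_blob = np.zeros((3, 224, 224), dtype=np.float32)
mean_blob[0] = 104.0
mean_blob[1] = 117.0
mean_blob[2] = 123.0

# average over height and width -> one scalar per channel
channel_means = mean_blob.mean(axis=(1, 2))
print(channel_means)  # [104. 117. 123.]
```

Given that both runs reached about 98% validation accuracy, the per-channel approximation is unlikely to change results noticeably, which matches dusty-nv's comment that loading the full mean file does not seem necessary here.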