Closed: gemaizi closed this issue 4 years ago
By the way, today I used the new GitHub code!
The detector model works well; only the classifier model gets this error.
Yes, the converter code was recently updated to support proper data preprocessing. What is the structure of your dataset?
Structure of the dataset: five folders named "annimal", "guitar", "flower", "house", "plane"; each folder contains some jpg images. The shape (width and height) of each image is different. The only difference between the detector images and the classifier images is that the detector images all have the same shape, 640*480!
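For reference, that layout corresponds to the usual folder-per-class classifier dataset (folder names taken from the comment above; file names are only placeholders):

```
dataset/
├── annimal/   img_001.jpg, img_002.jpg, ...
├── guitar/    ...
├── flower/    ...
├── house/     ...
└── plane/     ...
```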
Different image sizes shouldn't present a problem, since it automatically resizes images to the network input size before calibration. I just checked on my local computer and everything seems to be normal. Are you running locally or in Colab? If you're using aXeleRate on a local computer, can you run this script https://github.com/AIWintermuteAI/aXeleRate/blob/master/tests_training_and_inference.py and see if there are any errors?
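For context, the pre-calibration resize amounts to something like the minimal sketch below (function and variable names are illustrative, not the actual aXeleRate code, and the 224x224 input size is only an example):

```python
import cv2

def prepare_calibration_image(image_path, input_size=(224, 224)):
    """Load a dataset image and resize it to the network input size
    before it is used for quantization calibration."""
    img = cv2.imread(image_path)          # original resolution, BGR
    img = cv2.resize(img, input_size)     # match the network input size
    return img
```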
I use aXeleRate on a local computer (Ubuntu 18.04). I will try your script to see what happens, thanks!
I ran the script and it can generate the kmodel, but at the end there are some errors:
1. Import graph...
2. Optimize Pass 1...
3. Optimize Pass 2...
4. Quantize...
   4.1. Add quantization checkpoints...
   4.2. Get activation ranges...
   Plan buffers...
   Run calibration...
   [==================================================] 100% 5.364s
   4.5. Quantize graph...
5. Lowering...
6. Generate code...
   Plan buffers...
   Emit code...
Main memory usage: 109200 B

SUMMARY
INPUTS
0   Input_0           1x3x240x320
OUTPUTS
0   dense_1/Softmax   1x5
Folder projects/classifier/2020-07-13_18-00-27/Inference_results is created.
Classifier
Loading pre-trained weights for the whole model: projects/classifier/2020-07-13_18-00-27/Classifier_best_val_accuracy.h5
(227, 300, 3)
Traceback (most recent call last):
File "tests_training_and_inference.py", line 127, in
Okay, so at least the model is converted properly. The error has something to do with OpenCV; what version of OpenCV are you using? For your original problem, Fatal: Invalid dataset, file size should be 602112B, but got 1806336B: would you be willing to share your dataset, or the part of it that you can confirm causes the problem, in a private message? Or openly, if your dataset is not confidential.
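(For reference, a quick way to check which opencv-python build is installed; this is a generic snippet, not something from aXeleRate:)

```python
import cv2
# Prints e.g. "4.3.0"; the full pip wheel version such as 4.3.0.36
# can be seen with: pip show opencv-python
print(cv2.__version__)
```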
The OpenCV version is 4.3.0.36. What is your email address? I will send you my dataset.
The file is too large (180 MB); GitHub cannot upload it.
Can you try 4.2.0.34? dmitrywat@gmail.com. Alternatively, you can share it on Google Drive.
I will try 4.2.0.34.
Here is the Google Drive link to my dataset: https://drive.google.com/file/d/1BBYCTtd8-nva5WtM2KheysB3xdhwirWb/view?usp=sharing
Found the issue. Please change https://github.com/AIWintermuteAI/aXeleRate/blob/cf365c2ed9f51b39d48f104ebc43c72a0fd697d8/axelerate/networks/common_utils/convert.py#L80 to with open(os.path.join(temp_folder, bin_filename), "wb") as f:
It should work fine after that! I'll fix it in the next commit. Also, since you have a lot of images in the validation dataset, you can change https://github.com/AIWintermuteAI/aXeleRate/blob/cf365c2ed9f51b39d48f104ebc43c72a0fd697d8/axelerate/networks/common_utils/convert.py#L64 to num_imgs = 200. 200 images should be enough for proper calibration.
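A rough sketch of what that part of convert.py might look like after both edits; only the "wb" open mode and num_imgs = 200 come from this thread, while the surrounding code (function name, image packing, dtype) is reconstructed here purely for illustration:

```python
import os
import numpy as np

num_imgs = 200  # 200 calibration images should be enough

def write_calibration_dataset(images, temp_folder, bin_filename):
    # "wb" opens the file for binary writing and truncates any existing
    # contents, so the resulting file size matches what nncase expects.
    with open(os.path.join(temp_folder, bin_filename), "wb") as f:
        for img in images[:num_imgs]:
            # dtype is illustrative; the point is writing raw bytes, not text
            f.write(np.asarray(img, dtype=np.uint8).tobytes())
```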
Good!! Now it works, thanks!!
Great!
Two months ago I trained a classifier model and it worked well, but today I retrained the same model with the same dataset and got an error: Fatal: Invalid dataset, file size should be 602112B, but got 1806336B. I don't know why converting to a K210 model gives this error; 1806336B is three times 602112B.