Closed: LxYuan-Handshakes closed this issue 2 years ago.
I have tried downgrading numpy to 1.17.4 and 1.19.3, but it didn't work. The error message I got:
(env)$ python main.py --data_type detection --config_file detection_config.json --data_root_dir ~/../pubtables/PubTables1M-Detection-PASCAL-VOC/
Traceback (most recent call last):
  File "/home/lxyuan/playground/table-transformer/src/main.py", line 16, in <module>
    from engine import evaluate, train_one_epoch
  File "/home/lxyuan/playground/table-transformer/src/../detr/engine.py", line 13, in <module>
    from datasets.coco_eval import CocoEvaluator
  File "/home/lxyuan/playground/table-transformer/src/../detr/datasets/__init__.py", line 5, in <module>
    from .coco import build as build_coco
  File "/home/lxyuan/playground/table-transformer/src/../detr/datasets/coco.py", line 12, in <module>
    from pycocotools import mask as coco_mask
  File "/home/lxyuan/playground/table-transformer/env/lib64/python3.9/site-packages/pycocotools/mask.py", line 3, in <module>
    import pycocotools._mask as _mask
  File "pycocotools/_mask.pyx", line 1, in init pycocotools._mask
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 80 from PyObject
Solution:
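A commonly suggested remedy for this class of "binary incompatibility" error (offered here as a general sketch, not necessarily the exact fix adopted in this issue) is to force pip to rebuild pycocotools from source, so that its C extension is compiled against the numpy that is actually installed. Downgrading numpy alone leaves the previously built pycocotools extension unchanged, which is consistent with the downgrades above not helping:

(env)$ pip install --upgrade numpy
(env)$ pip install --force-reinstall --no-cache-dir --no-binary=pycocotools pycocotools

Rebuilding from source requires a C compiler in the environment; alternatively, upgrading to a newer pycocotools release whose wheel was built against a recent numpy can resolve the mismatch without a local build.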
Hi,
I encountered this numpy binary-incompatibility error during the evaluation phase. Any idea how to fix this?
It seems like I was able to complete one training epoch, but I got the numpy error message when evaluating model performance on the validation set (i.e., src/main:L317). I got a similar error when I tried to use main.py to evaluate model performance directly.
NOTE: I am using numpy==1.23.2 and python3.9.
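As a quick way to confirm that the failure is independent of the table-transformer code, importing the pycocotools extension on its own reproduces the same error in an affected environment (a minimal check, assuming the same virtualenv as above):

(env)$ python -c "from pycocotools import mask"

If this one-liner raises the same "numpy.ndarray size changed" ValueError, the problem lies in the pycocotools build rather than in main.py or the dataset configuration.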