nka77 / DAHiTra

DAHiTra: UNET Architecture with Hierarchical Transformers for Automated Building Damage Assessment Using Satellite Imagery
MIT License

Training data filtering in xBDatasetMulti class and selection #9

Open mmuneeburahman opened 1 year ago

mmuneeburahman commented 1 year ago

Why is the data selected from only these five disasters?

https://github.com/nka77/DAHiTra/blob/fffeb3f94f7a9d5af47f1e9ffcacb9affcad3474/datasets/CD_dataset.py#L229

Are the results in the paper obtained with only this data? Secondly, the complete xBD dataset is provided as hold, test, train, and tier3 splits. Was all of the data combined into a single train directory?
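For reference, the linked line appears to restrict the training file list to a handful of disaster events. Below is a minimal, hypothetical sketch of that kind of filename-based filtering; the disaster names and function name are placeholders, not the repo's actual code.

    # Hypothetical sketch (not the repo's exact code): restrict an xBD file list
    # to a fixed set of disaster events by matching substrings in the filenames.
    # The disaster names are placeholders; the actual list is in
    # datasets/CD_dataset.py near the linked line.
    import os

    SELECTED_DISASTERS = ["disaster-a", "disaster-b", "disaster-c", "disaster-d", "disaster-e"]

    def filter_by_disaster(image_dir, selected=SELECTED_DISASTERS):
        """Keep only filenames that contain one of the selected disaster names."""
        return [f for f in sorted(os.listdir(image_dir))
                if any(name in f for name in selected)]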

mmuneeburahman commented 1 year ago

With these configs, I am getting the following evaluation scores:

Begin evaluation...
Is_training: False. [153,199][1,55], imps: 21.15, est: 4.82h, G_loss: 0.78621, running_mf1: 0.75423
Is_training: False. Epoch 153 / 199, epoch_mF1= 0.84416
acc: 0.96060 miou: 0.73796 mf1: 0.84416 iou_0: 0.95911 iou_1: 0.72390 iou_2: 0.63657 iou_3: 0.73862 iou_4: 0.63162 F1_0: 0.97913 F1_1: 0.83984 F1_2: 0.77793 F1_3: 0.84966 F1_4: 0.77422 precision_0: 0.97705 precision_1: 0.84950 precision_2: 0.83857 precision_3: 0.84658 precision_4: 0.78685 recall_0: 0.98121 recall_1: 0.83040 recall_2: 0.72547 recall_3: 0.85277 recall_4: 0.76200 

Lastest model updated. Epoch_acc=0.8442, Historical_best_acc=0.8378 (at epoch 151)

These are the per-class F1 scores:

F1_0: 0.97913 F1_1: 0.83984 F1_2: 0.77793 F1_3: 0.84966 F1_4: 0.77422 
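As a sanity check on how these numbers relate, here is a small, hypothetical Python snippet (not part of the repo) that recomputes the per-class F1 from the reported precision/recall values and the mean F1 from the per-class F1 scores:

    # Hypothetical sanity check (not repo code): F1 per class from the reported
    # precision/recall, and mF1 as the mean of the per-class F1 scores.
    precision = [0.97705, 0.84950, 0.83857, 0.84658, 0.78685]
    recall    = [0.98121, 0.83040, 0.72547, 0.85277, 0.76200]

    f1 = [2 * p * r / (p + r) for p, r in zip(precision, recall)]
    mf1 = sum(f1) / len(f1)

    print([round(x, 5) for x in f1])  # ~[0.97913, 0.83984, 0.77793, 0.84966, 0.77422]
    print(round(mf1, 5))              # ~0.84416, matching epoch_mF1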
XiaoBaiHhy commented 6 months ago

@mmuneeburahman Hello, I would like to ask how the xBD dataset is finally composed. Is it used after combining hold, test, train, and tier3 into one directory? If not, please tell me how it is composed. Thank you very much.
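For context, "combining" the xBD splits usually means copying the imagery (and matching targets) from train, tier3, test, and hold into one training folder. A rough, hypothetical sketch under the assumption that each split contains an images/ subfolder; all paths are placeholders and should be adapted to your local layout.

    # Hypothetical sketch (paths and folder layout are assumptions, not repo code):
    # copy images from the xBD splits into a single combined training folder.
    import shutil
    from pathlib import Path

    XBD_ROOT = Path("/path/to/xBD")                 # placeholder root
    MERGED = XBD_ROOT / "train_combined" / "images"
    MERGED.mkdir(parents=True, exist_ok=True)

    for split in ("train", "tier3", "test", "hold"):
        src = XBD_ROOT / split / "images"
        if not src.is_dir():
            continue                                # skip splits that are absent locally
        for img in src.glob("*.png"):
            # Prefix with the split name to avoid any filename collisions.
            shutil.copy(img, MERGED / f"{split}_{img.name}")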