cleverhans-lab / machine-unlearning

MIT License

Results on different slices #6

Open Shrimp-99 opened 1 year ago

Shrimp-99 commented 1 year ago

I ran the purchase dataset with shard=1, slice=1, and epoch=1; the output model is trained on 0.25 million samples for 1 epoch and reaches an accuracy of around 0.955. Then I set shard=1, slice=16, and epoch=1. According to the computed 'avg_epochs_per_slice' and 'slice_epochs', training always takes place on the 8th slice, whose epoch count is 1 while every other slice gets 0. But the output model still uses only 0.14 million samples, and its accuracy is around 0.945. I don't see the significant difference between the two settings that Fig. 5 in the paper shows. Is there any mistake in the settings above?
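For context, the behaviour described above is consistent with a cumulative-budget allocation of epochs over slices. Below is a minimal sketch, assuming the allocation follows the 'avg_epochs_per_slice' / 'slice_epochs' quantities mentioned in the comment; the function name `epochs_per_slice` and the exact constants are my own illustration, not necessarily the repository's code.

```python
# Minimal sketch (assumption: mirrors the avg_epochs_per_slice / slice_epochs
# logic referenced above) of how epochs are spread over slices in SISA-style
# training. With slices=16 and epochs=1, only one slice gets a nonzero count.

def epochs_per_slice(slices: int, epochs: int) -> list[int]:
    # Average epoch budget per incremental slice model.
    avg = (2 * slices / (slices + 1)) * epochs / slices
    # Integer epoch counts come from differencing the cumulative budget,
    # so rounding never loses or duplicates an epoch.
    return [int((i + 1) * avg) - int(i * avg) for i in range(slices)]

if __name__ == "__main__":
    counts = epochs_per_slice(slices=16, epochs=1)
    print(counts)  # only index 8 is 1; every other entry is 0
    # Slice model i is trained on slices 0..i, i.e. (i + 1)/slices of the shard.
    trained_fraction = (counts.index(1) + 1) / 16
    print(trained_fraction * 250_000)  # ~140,625 samples, the "0.14 million" above
```

Under this assumption, the slice model that receives the single epoch is trained on the cumulative first nine slices of the shard, i.e. roughly 9/16 of 0.25 million ≈ 0.14 million samples, which would explain why its accuracy stays close to the single-slice run after only one epoch.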

cnhbww commented 5 months ago


Hello, I am a newcomer to this field. I tried to clone the authors' project directly and run the experiments, but ran into many problems. Could you share your experimental setup with me? Thank you very much.

Sushant-kumar-pal commented 4 months ago


Did you get the experimental results? Could you share them with me, please?

cnhbww commented 3 months ago


No, I haven't even gotten the code to run. Can you give me some advice or help? Please; I will also share my progress with you later.