elvisyjlin / AttGAN-PyTorch

AttGAN PyTorch Arbitrary Facial Attribute Editing: Only Change What You Want

Testing on Custom Images #10

Closed Jnasic closed 5 years ago

Jnasic commented 5 years ago

Hello, thank you for your great implementation. Can you kindly tell me how I can use the pretrained model to test on my own images? Thanks.

elvisyjlin commented 5 years ago

Hi @Jnasic,

It's a great idea, but my code only supports testing on CelebA and CelebA-HQ at the moment.

If you'd like to do it yourself, you need to write a custom dataset class and replace test_dataset with it in this line. The custom dataset is similar to the CelebA dataset (see here), except that it reads the paths of your own testing images instead of the CelebA files. You also need appropriate transforms when loading images: resize them to args.img_size and normalize them to [-1, 1].
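
Roughly, it could look like the sketch below (the class name, arguments, and 0/1 label convention here are only illustrative, not the exact code in this repo):

```python
import os

import torch
from PIL import Image
from torch.utils import data
from torchvision import transforms


class CustomTestDataset(data.Dataset):
    """Reads images from a folder plus an attribute list written in the
    same format as ./data/list_attr_celeba.txt (names here are assumptions)."""

    def __init__(self, image_dir, attr_path, image_size, selected_attrs):
        with open(attr_path) as f:
            lines = f.read().splitlines()
        # Line 0: image count, line 1: attribute names,
        # remaining lines: "<filename> ±1 ±1 ..."
        attr_names = lines[1].split()
        indices = [attr_names.index(a) for a in selected_attrs]
        self.items = []
        for line in lines[2:]:
            parts = line.split()
            filename, values = parts[0], parts[1:]
            # Map the {-1, 1} flags to {0, 1} labels.
            attrs = [(int(values[i]) + 1) // 2 for i in indices]
            self.items.append((filename, attrs))
        self.image_dir = image_dir
        # Resize to the model input size and normalize to [-1, 1];
        # unlike the CelebA loader, there is no 170x170 crop here.
        self.tf = transforms.Compose([
            transforms.Resize((image_size, image_size)),
            transforms.ToTensor(),
            transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
        ])

    def __getitem__(self, index):
        filename, attrs = self.items[index]
        img = Image.open(os.path.join(self.image_dir, filename)).convert('RGB')
        return self.tf(img), torch.tensor(attrs, dtype=torch.float)

    def __len__(self):
        return len(self.items)
```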

I'll try to make a custom image loader if I have time this weekend. If you are not in a hurry, you can just wait for a couple of days.

elvisyjlin commented 5 years ago

I forgot to mention that you will also need to build an attribute list for your custom images, like ./data/list_attr_celeba.txt. AttGAN encodes an image (roughly, it disentangles its characteristics) and then decodes it with a set of given attributes, so the model needs to know which attributes the image currently has as well as the target attributes you want to change it into.

So you probably have to

  1. Create a file like ./data/list_attr_celeba.txt containing the ids and attributes.
  2. Name your testing images as 000000.jpg, 000001.jpg, etc.
  3. Read them with a dataset class like the CelebA one, but without cropping the images to 170x170.

It requires a lot of work though.
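
For reference, such an attribute file follows the list_attr_celeba.txt layout: the first line is the number of images, the second line lists the attribute names, and every following line has a filename plus one ±1 flag per attribute. A trimmed, made-up example (the real file has 40 attribute columns):

```
2
Bald Bangs Black_Hair Blond_Hair ... Young
000000.jpg -1  1 -1 -1 ...  1
000001.jpg -1 -1  1 -1 ... -1
```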

elvisyjlin commented 5 years ago

Hi, I've added a custom image dataset in 2f8cb3e. You can test on your own images after preparing the images and the attribute list in ./data. Please read the instructions here.
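
For example, the layout under ./data could look roughly like this (folder and file names are only illustrative; the linked instructions have the exact names the loader expects):

```
data/
  custom/                 # your test images
    000000.jpg
    000001.jpg
    ...
  list_attr_custom.txt    # attribute list in the list_attr_celeba.txt format
```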

Jnasic commented 5 years ago

Thanks, I will take a look. Thank you very much!