Hello!
First of all I want to thank you for this very interesting work and of course for sharing your implementation!
I was trying to reproduce your season translation results (Yosemite dataset) but can't get it working.
I wrote a separate dataset class, modeled on the existing CelebA one, to load the Yosemite data, set the att_dict to {'summer': 0, 'winter': 1}, and assigned a label pair to each image (e.g. [1, -1] for summer, [-1, 1] for winter).
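For reference, here is a minimal sketch of how I construct the labels (the function name and structure are my own; only att_dict and the [1, -1] / [-1, 1] convention are taken from the setup described above):

```python
# Hypothetical helper illustrating my label setup for the Yosemite data.
# att_dict maps each season to its position in the attribute vector.
att_dict = {'summer': 0, 'winter': 1}

def season_to_label(season):
    """Return a +1/-1 attribute vector with +1 at the season's index."""
    label = [-1] * len(att_dict)
    label[att_dict[season]] = 1
    return label

# summer -> [1, -1], winter -> [-1, 1]
print(season_to_label('summer'))
print(season_to_label('winter'))
```

Please let me know if this label convention differs from what you used in the paper's experiment.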
I tried training STGAN with both the difference and the target label parameter settings, but neither helped: the generated images during training (sample_training) and from the test function are always identical to the input images.
Is a special parameter configuration needed to achieve image-to-image translation results?
Would you mind sharing your implementation for this experiment too?
Thank you in advance!