harvardnlp / NeuralSteganography

STEGASURAS: STEGanography via Arithmetic coding and Strong neURAl modelS
https://steganography.live/

Live demo site produces different cover text compared to source code #5

Open arxivcrawler opened 5 years ago

arxivcrawler commented 5 years ago

My secret message is "Hello, this is my message." When I put this into the live demo site, the cover text produced is:

" His resignation was held on March 1, 1788, so that he may return to the presidency of the United States in January 1820.

The year 1788 marks the year of Washington's retirement from the presidency. In 1788"

When I run the same secret message through the source code, I get this text instead:

" His resignation was declared the official declaration of war in the Union and he called it quits.

He left Arlington Cemetery to pursue his education at the University of Virginia before entering the Navy"

The source code properly decodes it back into the original message when the model used is 'gpt2', which I presume is the small model. I am also presuming the live website uses gpt2 as the parameter, since the PPL is measured by gpt2.

The results don't match when I use 'gpt2-medium', and 'gpt2-large' does not seem to work (it says it's an invalid parameter). When I upgrade pytorch_transformers to the latest version, it returns the BPE error.
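For context on why a model or version mismatch breaks decoding: the cover text is derived from the model's per-step token probabilities, so the decoder can only invert it if it sees the exact same distributions. The toy sketch below (not the STEGASURAS algorithm; the 4-token vocabulary, rank-based bit embedding, and both "models" are invented for illustration) hides 2 bits per step by picking the token at that rank in the model's probability ordering, and shows that a decoder with a differently ranked model recovers garbage.

```python
# Toy illustration (NOT the real arithmetic coder): encoder and decoder
# must use the *identical* model, or the recovered bits are wrong.

def rank_tokens(model, context):
    # Sort the vocabulary by this model's probability for the current
    # context, highest first (ties broken alphabetically).
    probs = model(context)
    return sorted(probs, key=lambda t: (-probs[t], t))

def encode(bits, model, steps):
    context, tokens = [], []
    for i in range(steps):
        rank = int(bits[2 * i:2 * i + 2], 2)   # next 2 bits -> rank 0..3
        tok = rank_tokens(model, context)[rank]
        tokens.append(tok)
        context.append(tok)
    return tokens

def decode(tokens, model):
    context, bits = [], ""
    for tok in tokens:
        rank = rank_tokens(model, context).index(tok)
        bits += format(rank, "02b")            # rank -> 2 recovered bits
        context.append(tok)
    return bits

# Two hypothetical "models" over a 4-token vocab that rank tokens
# differently -- standing in for gpt2 vs. gpt2-medium, or two library
# versions with different tokenization.
model_a = lambda ctx: {"the": 0.4, "a": 0.3, "his": 0.2, "one": 0.1}
model_b = lambda ctx: {"a": 0.4, "the": 0.3, "one": 0.2, "his": 0.1}

secret = "0110"
cover = encode(secret, model_a, steps=2)
print(decode(cover, model_a) == secret)   # same model: round-trips
print(decode(cover, model_b) == secret)   # different model: fails
```

The same logic applies to the demo site: if it runs a different model (or a different pytorch_transformers version with different BPE behavior) than your local checkout, the two will produce different cover text from the same secret, and neither can decode the other's output.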

flarn2006 commented 5 years ago

I noticed this as well; I wasn't sure what was going on.

vijeyanidhi commented 4 years ago

@arxivcrawler Was the environment you used for gpt2 pytorch_transformers==1.1.0, torch==1.0.1, and bitarray==1.0.1,

or was pytorch_transformers at version 1.4.0?

I am not able to decode the message back to the original message when I try it with gpt2 (not the medium or large version, just gpt2).

I also tried the workaround mentioned in #1, but nothing seems to work for me.

zhangwei1992-s commented 4 years ago

pytorch_transformers==1.1.0 torch==1.0.1 bitarray==1.0.1

I just ran the model using the environment described in your reply and it worked. When I update pytorch-transformers to 1.2.0, I am not able to decode the message back.
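For anyone hitting the same mismatch, pinning the exact versions reported to work in this thread might look like the following (a sketch, assuming these three packages are the only version-sensitive dependencies; a fresh virtual environment is optional but avoids clobbering an existing install):

```shell
# Create an isolated environment (optional but recommended)
python -m venv stegasuras-env
. stegasuras-env/bin/activate

# Pin the versions reported to round-trip correctly with gpt2;
# pytorch_transformers >= 1.2.0 reportedly changes BPE behavior
# and breaks decoding of previously encoded messages.
pip install pytorch_transformers==1.1.0 torch==1.0.1 bitarray==1.0.1
```

Note that messages encoded under one pytorch_transformers version may not decode under another, so encoder and decoder environments need to match, not just be individually self-consistent.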