CuriousAI / mean-teacher

A state-of-the-art semi-supervised method for image recognition
https://arxiv.org/abs/1703.01780

How to unpack training.msgpack and show the training logs? #21

Closed zhe-meng closed 5 years ago

zhe-meng commented 5 years ago

Thanks for your inspiring idea and the corresponding code.

I have run the CIFAR-10 experiments from your code on AWS. After training the network, the logs were saved in the cloud. I then downloaded the result files, such as training.msgpack, but I don't know how to unpack it to view the training logs.

I have googled and searched Stack Overflow, but I still haven't found a way to show the logs.

Would you please show me how to unpack the .msgpack file and display the logs?

Thanks.

tarvaina commented 5 years ago

https://pandas.pydata.org/pandas-docs/stable/generated/pandas.read_msgpack.html
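For anyone landing here later: `pandas.read_msgpack` was deprecated in pandas 0.25 and removed in 1.0, so reading these files requires a pre-1.0 pandas. A minimal sketch (the helper name `load_training_log` is mine, not from the repo):

```python
import pandas as pd

def load_training_log(path):
    """Load a training log written with DataFrame.to_msgpack."""
    if not hasattr(pd, "read_msgpack"):
        # read_msgpack was removed in pandas 1.0; pin pandas<1.0 to read these files
        raise RuntimeError("this pandas has no read_msgpack; install pandas<1.0")
    return pd.read_msgpack(path)

# usage, with an old enough pandas:
#   df = load_training_log("training.msgpack")
#   print(df.tail())
```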


zhe-meng commented 5 years ago

@tarvaina Thank you for the response! I found the problem in `run_context.py`: `compress` was set to the string `'None'`, so I could not open the file. Thank you.

```python
def save(self):
    df = self._as_dataframe()
    df.to_msgpack(self.log_file_path, compress='None')
```
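For completeness, a hedged sketch of a save that avoids the problem (assuming a pandas version that still ships `to_msgpack`; the `save_log` helper is mine, not the repo's code). In that API, `compress` accepts the constant `None`, `'zlib'`, or `'blosc'`; the string `'None'` is not a compressor name:

```python
import pandas as pd

def save_log(df, path):
    """Write a DataFrame log as an uncompressed msgpack file."""
    if not hasattr(df, "to_msgpack"):
        # to_msgpack was removed in pandas 1.0; pin pandas<1.0 to use it
        raise RuntimeError("this pandas has no to_msgpack; install pandas<1.0")
    # compress=None (the Python constant, not the string 'None') writes an
    # uncompressed file that pandas.read_msgpack can open directly
    df.to_msgpack(path, compress=None)
```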