tangbing9922 closed this issue 3 years ago.
Leave your e-mail and I will send it to you later.
btang3538@gmail.com, thanks a lot!
Excuse me, when I run train_muInfo.py it reports "No such file or directory: 'mutual data/x1/12382.npy'". Where can I get that data, or how should I generate it myself? Thanks a lot :)
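In case it helps anyone hitting the same error, below is a minimal sketch of how such files could be created, assuming train_muInfo.py simply loads one NumPy array per sample index from 'mutual data/x1/<idx>.npy' and from a matching second folder (here assumed to be 'mutual data/x2'). The second folder name, the feature dimension, and the random placeholder values are all assumptions; in the real setup the saved arrays would be the encoder outputs and channel outputs produced while running the DeepSC model, not random numbers.

```python
# Hypothetical sketch only: creates placeholder .npy files in the layout
# suggested by the error message ('mutual data/x1/12382.npy').
# FEATURE_DIM, NUM_SAMPLES and the 'mutual data/x2' folder are assumptions.
import os
import numpy as np

FEATURE_DIM = 16      # assumed size of each saved feature vector
NUM_SAMPLES = 20000   # large enough to cover indices such as 12382

for folder in ("mutual data/x1", "mutual data/x2"):
    os.makedirs(folder, exist_ok=True)

for idx in range(NUM_SAMPLES):
    # Placeholders: in practice x1/x2 would be the transmitted and received
    # feature vectors captured from the DeepSC encoder and channel.
    x1 = np.random.randn(FEATURE_DIM).astype(np.float32)
    x2 = x1 + 0.1 * np.random.randn(FEATURE_DIM).astype(np.float32)
    np.save(os.path.join("mutual data/x1", f"{idx}.npy"), x1)
    np.save(os.path.join("mutual data/x2", f"{idx}.npy"), x2)
```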
Leave your e-mail and I will send it to you later.
Dear Owner, I would like to get the data used by data_process.py too; my e-mail is 875048655@qq.com.
Thank you very much!
Dear Owner, I would like to get the data used by data_process.py too; my e-mail is 1023365168@qq.com. Thank you very much!
I want the data used in the data_process.py file; my e-mail is 2414727785@qq.com. Thank you very much!
I want the data used in the data_process.py file too; my e-mail is w1739426792@163.com. Thank you very much!
Dear author, I would like to get the data used by data_process.py too; my e-mail is 503989460@qq.com. Thank you very much! Best wishes!
Dear author, I would like to get the data used by data_process.py too; my e-mail is 851022055@qq.com. Thank you very much! Best wishes!
Dear Author,
Sorry to disturb you. I am a new learner of semantic communication. Would you please send me the data involved in data_process.py to my email pawn0603@gmail.com?
Thank you for your help.
Best wishes.
Dear author, I would like to get the data used by data_process.py too; my e-mail is 965582246@qq.com. Thank you very much!
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear Author, I learned a lot from your code, which is very clear. But when I ran it, I ran into a problem. Would you please send me the data used in data_process.py to my email (1052211929@qq.com)? I would really appreciate your help. Best wishes!
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear author,
I have just noticed that the file "predict_deepSC_without_MI.py" also needs some data files I don't have, namely id_dic_10w.pkl, word_dic_10w.pkl, corpus_10w.txt, and corpus_10w_train.txt.
Would you please send them all to me, or could you tell me how to build these files? I would really appreciate your help. By the way, I will use the dataset only for study. My email: 1052211929@qq.com
Best wishes!
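For the files mentioned above, here is a rough sketch of how such preprocessing outputs are commonly built, assuming word_dic_10w.pkl maps words to integer ids, id_dic_10w.pkl is the inverse mapping, and corpus_10w.txt / corpus_10w_train.txt are plain-text files with one tokenized sentence per line. These assumptions, including the special tokens, may not match the repository's actual format; this is not the author's code.

```python
# Hypothetical sketch: building word_dic_10w.pkl / id_dic_10w.pkl from a
# plain-text corpus with one sentence per line. The pickle contents
# (word -> id and id -> word dicts) and the special tokens are assumptions;
# the repository's real format may differ.
import pickle
from collections import Counter

SPECIAL_TOKENS = ["<PAD>", "<START>", "<END>", "<UNK>"]  # assumed special tokens

with open("corpus_10w.txt", encoding="utf-8") as f:
    sentences = [line.strip().split() for line in f if line.strip()]

# Count word frequencies and assign ids, most frequent words first.
counts = Counter(word for sent in sentences for word in sent)

word_dic = {tok: i for i, tok in enumerate(SPECIAL_TOKENS)}
for word, _ in counts.most_common():
    word_dic[word] = len(word_dic)
id_dic = {i: w for w, i in word_dic.items()}

with open("word_dic_10w.pkl", "wb") as f:
    pickle.dump(word_dic, f)
with open("id_dic_10w.pkl", "wb") as f:
    pickle.dump(id_dic, f)
```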
Dear author,
Sorry to disturb you. I am a first-year PhD student working on semantic communication. Thank you very much for sharing the original code of the DeepSC model; reading it greatly helps me learn how the model actually works. I would like to ask whether I could also have the datasets used by data_process.py and predict_deepSC_without_MI.py. These datasets will be used only to help me better understand the DeepSC model, and will not be used for other purposes. My email is huangxinysu@gmail.com.
I sincerely appreciate your help! Thank you very much!
Sorry, I do not have the dataset.
Dear author, I want to get the dataset just for study purposes; my e-mail is 925166869@qq.com. Thank you very much!
Dear Author,
I'd like to have a copy of the file. My e-mail is 1198584684@qq.com. For friends who already have a copy of the file, I would appreciate it very much if you could share it with me.
Thank you!
Several classmates who, like me, have not received the emails have contacted me. I believe other problems that are hard to solve alone will come up during the learning process. At present I have not found an open discussion group related to semantic communication, so I have created a WeChat group dedicated to discussing semantic communication and task-oriented communication. I believe we will all benefit from the discussion. You can add my WeChat (bdhzb1997) and I will invite you to join the group. Members of the group are interested in semantic communication research; everyone interested is welcome to join.
Dear Author,
Sorry to disturb you. I am a new learner of semantic communication. Would you please send me the data involved in data_process.py to my email (2587349837@qq.com)?
Thank you for your help.
Best wishes.
Hello, I have received your email and will reply as soon as possible. Thank you.
Sorry to disturb you. I'd like to have a copy of the file. My e-mail is 2764113962@qq.com. Thank you very much!
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear Author, would you please send me the data to my email (zjjslch.00@outlook.com)? Thank you. Best wishes.
Hello, I have received your email and will reply as soon as possible. Thank you.
Leave your e-mail and I will send it to you later.
Dear Owner, I would like to get the data used by data_process.py too; my e-mail is 2265612002@qq.com. Thank you very much!
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear Author, could you please send me the data used in data_process.py to my email (2740574341@qq.com)? Thank you for your help.
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear Author, Could you please send me the datasets to my email (840196927@qq.com)? Thank you very much for your help.
Dear Author, Could you please send me the datasets to my email (1739756418@qq.com)? Thank you very much for your help.
Dear Author,
Sorry to disturb you. I am a new learner of semantic communication. Would you please send me the data involved in data_process.py to my email (1071253148@qq.com)?
Thank you for your help.
Best wishes.
Dear Author,
Sorry to disturb you. I am a new learner of semantic communication. I searched many corpus datasets and found none that match what data_process.py expects. Would you please send me the data involved in data_process.py to my email (3249723967@qq.com)?
Thank you for your help.
Best wishes.
Would you please send me the data involved in data_process.py to my email (2908202015@qq.com)? Thanks!
Hello, I have received your email and will reply as soon as possible. Thank you.
Would you please send me the data involved in data_process.py to my email (2825539093@qq.com)? Thanks!
Dear Author, could you please send me the data used in data_process.py to my email (2454720607@qq.com)? Thank you for your help.
Hello, I have received your email and will reply as soon as possible. Thank you.
Could anybody send me a copy of the dataset, please? My email address is xbb@xbb.moe. Thanks a lot!
Hello, I have received your email and will reply as soon as possible. Thank you.
Leave your e-mail and I will send it to you later.
Dear Owner, I would like to get the data used by data_process.py too; my e-mail is qianchengnju@126.com. Thanks a lot!
Dear Author, could you please send me the data used in data_process.py to my email (1134602569@qq.com)? Thank you for your help.
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear Author, could you please send me the datasets to my email (1833211401@qq.com)? Thank you very much for your help!
Leave your e-mail and I will send it to you later.
Dear Azul-9,
Sorry to disturb you after such a long time. I have just started learning semantic communication. Would you please send me the data involved in data_process.py to my email yfzhang109@tju.edu.cn? It would mean a lot to me.
Thank you sincerely for your help!
Best wishes!
Hello, I have received your email and will reply as soon as possible. Thank you.
Dear author, I want to get the dataset for research purposes only; my email is 878230743@qq.com.
Hello, I have received your email and will reply as soon as possible. Thank you.
Where can I get the data used in the data_process.py file?