FC-Li / CloudSimPy

CloudSimPy: Datacenter job scheduling simulation framework
MIT License

Processes cannot be started after being created #7

Closed silkyrose closed 4 years ago

silkyrose commented 4 years ago

The main script main-makespan.py is run on Windows. I have already added: if __name__ == '__main__': freeze_support(). I can see that 13 Process objects are created, but the very first one fails on start(), with an error saying it cannot be converted to a numeric value.

 for i in range(n_episode):
     algorithm = RLAlgorithm(agent, reward_giver, features_extract_func=features_extract_func,
                             features_normalize_func=features_normalize_func)
     episode = Episode(machine_configs, jobs_configs, algorithm, None)
     algorithm.reward_giver.attach(episode.simulation)
     p = Process(target=multiprocessing_run,
                 args=(episode, trajectories, makespans, average_completions, average_slowdowns))
     p.start()
     p.join()

WARNING:tensorflow:From C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\training\checkpointable\util.py:1858: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
Traceback (most recent call last):
  File "E:/DPL/CloudSimPy-master/CloudSimPy-master/playground/Non_DAG/launch_scripts/main-makespan.py", line 93, in <module>
    p.start()
  File "C:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\ProgramData\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\ProgramData\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 745, in __reduce__
    return (convert_to_tensor, (self.numpy(),))
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 724, in numpy
    raise ValueError("Resource handles are not convertible to numpy.")
ValueError: Resource handles are not convertible to numpy.
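
For context: on Windows, multiprocessing uses the "spawn" start method, so every argument handed to Process must be pickled before the child starts. The episode object carries the RLAlgorithm and its agent, and the agent's eager TensorFlow variables hold resource handles that the pickler cannot serialize, which is where the ForkingPickler frames in the traceback fail. A minimal sketch of the same class of failure, not taken from the repo and assuming an eager-mode TensorFlow install (the exact error text depends on the TensorFlow version):

    # Minimal, hypothetical reproduction -- not code from the repo.
    # Under the Windows "spawn" start method, Process arguments are pickled;
    # an eager tf.Variable owns a resource-handle tensor, and pickling it
    # typically fails the same way the traceback above does.
    import multiprocessing as mp
    import tensorflow as tf

    def worker(v):
        print(v)

    if __name__ == '__main__':
        mp.freeze_support()
        var = tf.Variable(1.0)                      # eager variable -> resource handle
        p = mp.Process(target=worker, args=(var,))  # args must be picklable under "spawn"
        p.start()                                   # fails while pickling on Windows
        p.join()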

HaodaY commented 4 years ago

I've run into this problem too. Have you solved it yet?

silkyrose commented 4 years ago

Hello: I haven't solved it yet, how about you? p.start() now fails with:

    return convert_to_tensor, (self._numpy(),)
    ValueError: Cannot convert a Tensor of dtype resource to a NumPy array.

HaodaY commented 4 years ago

I switched to running it in a single process instead.

silkyrose commented 4 years ago

I know where the problem is: the episode object cannot be passed into the Process target as an argument. But I don't know how to fix it. I tried converting episode into a JSON string and turning it back into an object inside the multiprocessing function, but that didn't work.
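
One possible direction, sketched below under assumptions rather than as a tested fix: instead of serializing episode (JSON cannot capture its live TensorFlow and simulation state either), pass only plain, picklable data to the child and rebuild the TensorFlow-backed objects inside the child process. This would replace the loop shown earlier in main-makespan.py. build_agent is a hypothetical factory, and the sketch assumes reward_giver, features_extract_func and features_normalize_func are module-level names that the child re-creates when it re-imports main-makespan.py:

    # Hedged sketch, not code from the repo: rebuild the heavy objects in the child
    # so nothing holding a TensorFlow resource handle is ever pickled.
    def run_one_episode(machine_configs, jobs_configs, trajectories, makespans,
                        average_completions, average_slowdowns):
        agent = build_agent()  # hypothetical factory; recreate the agent after the process starts
        algorithm = RLAlgorithm(agent, reward_giver,
                                features_extract_func=features_extract_func,
                                features_normalize_func=features_normalize_func)
        episode = Episode(machine_configs, jobs_configs, algorithm, None)
        algorithm.reward_giver.attach(episode.simulation)
        multiprocessing_run(episode, trajectories, makespans,
                            average_completions, average_slowdowns)

    for i in range(n_episode):
        p = Process(target=run_one_episode,
                    args=(machine_configs, jobs_configs, trajectories, makespans,
                          average_completions, average_slowdowns))
        p.start()
        p.join()

Whether this works also depends on what multiprocessing_run appends to trajectories: if those entries contain live tensors, they would hit the same pickling wall on the way back to the parent.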



silkyrose commented 4 years ago

The author has surely seen this but still won't come out and say a few words......

Mikey2266 commented 4 years ago

    for i in range(n_episode):
        algorithm = RLAlgorithm(agent, reward_giver, features_extract_func=features_extract_func,
                                features_normalize_func=features_normalize_func)
        episode = Episode(machine_configs, jobs_configs, algorithm, None)
        algorithm.reward_giver.attach(episode.simulation)

        multiprocessing_run(episode, trajectories, makespans, average_completions, average_slowdowns)
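        # note: calling multiprocessing_run directly keeps everything in the main
        # process, so the episode (and its TensorFlow state) is never pickled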
    #     p = Process(target=multiprocessing_run,
    #                 args=(episode, trajectories, makespans, average_completions, average_slowdowns))
    # 
    #     processes.append(p)
    # 
    # for p in processes:
    #     p.start()
    # 
    # for p in processes:
    #     p.join()

taichanghong commented 2 years ago

I switched to running it in a single process instead.

How exactly did you do that?