ray-project / ray

Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
https://ray.io
Apache License 2.0

Error when running examples/autoregressive_action_dist.py #5848

Closed toanngosy closed 4 years ago

toanngosy commented 4 years ago

System information

Describe the problem

I ran the autoregressive policy head example and got an error from the logger, caused by mixing eager and non-eager (graph) execution. I tried to disable eager execution by adding "eager": False to the config, but that does not solve the error.
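For reference, this is roughly how I launch the run with eager execution disabled. It is only a minimal sketch: the custom autoregressive model/distribution registration and the environment from the actual example script are omitted, and CartPole-v0 is just a stand-in here.

    # Minimal sketch, not the exact example script.
    import ray
    from ray import tune

    ray.init()

    tune.run(
        "PPO",
        stop={"training_iteration": 5},
        config={
            "env": "CartPole-v0",  # stand-in; the real example uses its own env + custom model
            "num_workers": 0,
            "eager": False,        # attempted fix: force graph (non-eager) execution
        },
    )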

Source code / logs

Traceback (most recent call last):
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/ops/gen_summary_ops.py", line 851, in write_summary
    tensor, tag, summary_metadata)
tensorflow.python.eager.core._FallbackException: This function does not handle the case of the path where all inputs are not already EagerTensors.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/ray/tune/trial_runner.py", line 540, in _process_trial
    result, terminate=(decision == TrialScheduler.STOP))
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/ray/tune/trial.py", line 386, in update_last_result
    self.result_logger.on_result(self.last_result)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/ray/tune/logger.py", line 333, in on_result
    _logger.on_result(result)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/ray/tune/logger.py", line 193, in on_result
    "/".join(path + [attr]), value, step=step)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorboard/plugins/scalar/summary_v2.py", line 65, in scalar
    metadata=summary_metadata)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/ops/summary_ops_v2.py", line 646, in write
    _should_record_summaries_v2(), record, _nothing, name="summary_cond")
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/framework/smart_cond.py", line 54, in smart_cond
    return true_fn()
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/ops/summary_ops_v2.py", line 640, in record
    name=scope)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/ops/gen_summary_ops.py", line 856, in write_summary
    writer, step, tensor, tag, summary_metadata, name=name, ctx=_ctx)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/ops/gen_summary_ops.py", line 893, in write_summary_eager_fallback
    attrs=_attrs, ctx=_ctx, name=name)
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/eager/execute.py", line 76, in quick_execute
    raise e
  File "/home/toanngo/anaconda3/envs/pycigar/lib/python3.6/site-packages/tensorflow_core/python/eager/execute.py", line 61, in quick_execute
    num_outputs)
TypeError: An op outside of the function building code is being passed a "Graph" tensor. It is possible to have Graph tensors leak out of the function building context by including a tf.init_scope in your function building code. For example, the following function will fail:

  @tf.function
  def has_init_scope():
    my_constant = tf.constant(1.)
    with tf.init_scope():
      added = my_constant * 2

The graph tensor has name: create_file_writer/SummaryWriter:0

toanngosy commented 4 years ago

It has been solved with the latest update to the logger.