ray-project / ray

Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
https://ray.io
Apache License 2.0

[RLlib] tf.Variable error when running regression tests with dreamerV3 #48610

Open shawn9995 opened 2 weeks ago

shawn9995 commented 2 weeks ago

What happened + What you expected to happen

I am trying to run a regression test on the CartPole example (tuned_examples/dreamerv3/cartpole.py) and am running into the issue below.

(rayvenv) shpa7847@UCB-TDLQ372645 bsk_rl % python run_regression_tests.py --dir /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/tuned_examples/dreamerv3/cartpole.py
rllib dir=/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab
Will run the following regression tests:
-> /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/tuned_examples/dreamerv3/cartpole.py
/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/gymnasium/spaces/box.py:130: UserWarning: WARN: Box bound precision lowered by casting to float32
  gym.logger.warn(f"Box bound precision lowered by casting to {self.dtype}")
/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/gymnasium/utils/passive_env_checker.py:164: UserWarning: WARN: The obs returned by the `reset()` method was expecting numpy array dtype to be float32, actual type: float64
  logger.warn(
/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/gymnasium/utils/passive_env_checker.py:188: UserWarning: WARN: The obs returned by the `reset()` method is not within the observation space.
  logger.warn(f"{pre} is not within the observation space.")
2024-11-06 13:57:40,067 INFO worker.py:1816 -- Started a local Ray instance.
╭─────────────────────────────────────────────────────────────────────────────╮
│ Configuration for experiment     default_1c4e66587d8e427a8111f55e7954e792   │
├─────────────────────────────────────────────────────────────────────────────┤
│ Search algorithm                 BasicVariantGenerator                      │
│ Scheduler                        FIFOScheduler                              │
│ Number of trials                 1                                          │
╰─────────────────────────────────────────────────────────────────────────────╯

View detailed results here: /Users/shpa7847/ray_results/default_1c4e66587d8e427a8111f55e7954e792
To visualize your results with TensorBoard, run: `tensorboard --logdir /tmp/ray/session_2024-11-06_13-57-37_753686_95652/artifacts/2024-11-06_13-57-40/default_1c4e66587d8e427a8111f55e7954e792/driver_artifacts`

Trial status: 1 PENDING
Current time: 2024-11-06 13:57:41. Total running time: 0s
Logical resource usage: 1.0/10 CPUs, 0/0 GPUs
╭──────────────────────────────────────────────╮
│ Trial name                          status   │
├──────────────────────────────────────────────┤
│ DreamerV3_CartPole-v1_c30ac_00000   PENDING  │
╰──────────────────────────────────────────────╯
(pid=96069) /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/gymnasium/spaces/box.py:130: UserWarning: WARN: Box bound precision lowered by casting to float32
(pid=96069)   gym.logger.warn(f"Box bound precision lowered by casting to {self.dtype}")
(pid=96069) /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/gymnasium/utils/passive_env_checker.py:164: UserWarning: WARN: The obs returned by the `reset()` method was expecting numpy array dtype to be float32, actual type: float64
(pid=96069)   logger.warn(
(pid=96069) /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/gymnasium/utils/passive_env_checker.py:188: UserWarning: WARN: The obs returned by the `reset()` method is not within the observation space.
(pid=96069)   logger.warn(f"{pre} is not within the observation space.")
(DreamerV3 pid=96069) 2024-11-06 13:57:48,398   WARNING deprecation.py:50 -- DeprecationWarning: `RLModule(config=[RLModuleConfig object])` has been deprecated. Use `RLModule(observation_space=.., action_space=.., inference_only=.., model_config=.., catalog_class=..)` instead. This will raise an error in the future!
(DreamerV3 pid=96069) /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/keras/src/layers/layer.py:391: UserWarning: `build()` was called on layer 'dreamer_model', however the layer does not have a `build()` method implemented and it looks like it has unbuilt state. This will cause the layer to be marked as built, despite not being actually built, which may cause failures down the line. Make sure to implement a proper `build()` method.
(DreamerV3 pid=96069)   warnings.warn(
2024-11-06 13:58:00,502 ERROR tune_controller.py:1331 -- Trial task failed for trial DreamerV3_CartPole-v1_c30ac_00000
Traceback (most recent call last):
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/air/execution/_internal/event_manager.py", line 110, in resolve_future
    result = ray.get(future)
             ^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/_private/auto_init_hook.py", line 21, in auto_init_wrapper
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/_private/client_mode_hook.py", line 103, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/_private/worker.py", line 2745, in get
    values, debugger_breakpoint = worker.get_objects(object_refs, timeout=timeout)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/_private/worker.py", line 903, in get_objects
    raise value
ray.exceptions.ActorDiedError: The actor died because of an error raised in its creation task, ray::DreamerV3.__init__() (pid=96069, ip=127.0.0.1, actor_id=46355e95cbb5951113bab35201000000, repr=DreamerV3)
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/algorithms/algorithm.py", line 584, in __init__
    super().__init__(
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/tune/trainable/trainable.py", line 158, in __init__
    self.setup(copy.deepcopy(self.config))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/algorithms/dreamerv3/dreamerv3.py", line 492, in setup
    super().setup(config)
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/algorithms/algorithm.py", line 802, in setup
    self.learner_group = self.config.build_learner_group(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/algorithms/algorithm_config.py", line 1237, in build_learner_group
    learner_group = LearnerGroup(config=self.copy(), module_spec=rl_module_spec)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/core/learner/learner_group.py", line 130, in __init__
    self._learner.build()
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/core/learner/tf/tf_learner.py", line 275, in build
    super().build()
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/core/learner/learner.py", line 313, in build
    self.configure_optimizers()
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/core/learner/learner.py", line 426, in configure_optimizers
    self.configure_optimizers_for_module(module_id=module_id, config=config)
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/algorithms/dreamerv3/tf/dreamerv3_tf_learner.py", line 57, in configure_optimizers_for_module
    self.register_optimizer(
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/core/learner/learner.py", line 359, in register_optimizer
    self._check_registered_optimizer(optimizer, params)
  File "/Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/core/learner/tf/tf_learner.py", line 180, in _check_registered_optimizer
    raise ValueError(
ValueError: One of the parameters (<KerasVariable shape=(4, 256), dtype=float32, path=dreamer_model/vector_encoder/dense/kernel>) in the registered optimizer is not a tf.Variable!
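
For what it's worth, the check that raises this is `_check_registered_optimizer` in RLlib's tf_learner.py, which requires every registered parameter to be a `tf.Variable`. With the versions listed below (tensorflow 2.18.0 / keras 3.6.0), `tf.keras` resolves to Keras 3, whose model weights are `KerasVariable` objects rather than `tf.Variable`s, which would explain the failure. A possible workaround sketch (not verified against this exact setup) is to force the legacy Keras 2 implementation from the separately installed tf-keras package before TensorFlow is imported:

import os

# Force tf.keras to resolve to the legacy Keras 2 implementation shipped in the
# separately installed `tf-keras` package. Must run before `import tensorflow`.
os.environ["TF_USE_LEGACY_KERAS"] = "1"

import tensorflow as tf
print(tf.keras.__version__)  # expect a 2.x version if the override took effect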

Versions / Dependencies

absl-py 2.1.0
aiosignal 1.3.1
astunparse 1.6.3
attrs 24.2.0
certifi 2024.8.30
charset-normalizer 3.4.0
click 8.1.7
cloudpickle 3.1.0
decorator 5.1.1
dm-tree 0.1.8
Farama-Notifications 0.0.4
filelock 3.16.1
flatbuffers 24.3.25
frozenlist 1.5.0
fsspec 2024.10.0
gast 0.6.0
google-pasta 0.2.0
grpcio 1.67.1
gymnasium 0.28.1
h5py 3.12.1
idna 3.10
imageio 2.36.0
jax-jumpy 1.0.0
jsonschema 4.23.0
jsonschema-specifications 2024.10.1
keras 3.6.0
lazy_loader 0.4
libclang 18.1.1
lz4 4.3.3
Markdown 3.7
markdown-it-py 3.0.0
MarkupSafe 3.0.2
mdurl 0.1.2
ml-dtypes 0.4.1
msgpack 1.1.0
namex 0.0.8
networkx 3.4.2
numpy 2.0.2
opt_einsum 3.4.0
optree 0.13.0
packaging 24.1
pandas 2.2.3
pillow 11.0.0
pip 24.1.2
protobuf 5.28.3
pyarrow 18.0.0
Pygments 2.18.0
python-dateutil 2.9.0.post0
pytz 2024.2
PyYAML 6.0.2
ray 2.38.0
referencing 0.35.1
requests 2.32.3
rich 13.9.4
rpds-py 0.21.0
scikit-image 0.24.0
scipy 1.14.1
setuptools 70.3.0
shellingham 1.5.4
six 1.16.0
tensorboard 2.18.0
tensorboard-data-server 0.7.2
tensorboardX 2.6.2.2
tensorflow 2.18.0
tensorflow-io-gcs-filesystem 0.37.1
tensorflow-probability 0.24.0
termcolor 2.5.0
tf_keras 2.18.0
tifffile 2024.9.20
typer 0.12.5
typing_extensions 4.12.2
tzdata 2024.2
urllib3 2.2.3
Werkzeug 3.1.2
wheel 0.44.0
wrapt 1.16.0

Reproduction script

  1. Create new virtual environment
  2. pip install ray[rllib]
  3. pip install tensorflow
  4. pip install tensorflow-probability
  5. pip install tf-keras
  6. python run_regression_tests.py --dir /Users/shpa7847/Library/CloudStorage/OneDrive-UCB-O365/Documents/AVSLab/bsk_rl/rayvenv/lib/python3.11/site-packages/ray/rllib/tuned_examples/dreamerv3/cartpole.py
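
For convenience, here is a minimal standalone sketch that should hit the same code path without run_regression_tests.py (assuming ray[rllib] 2.38.0 with tensorflow 2.18.0 as above; the training values are illustrative, not copied from the tuned cartpole.py):

from ray.rllib.algorithms.dreamerv3 import DreamerV3Config

config = (
    DreamerV3Config()
    .environment("CartPole-v1")
    # Illustrative settings; the tuned example may use different values.
    .training(model_size="XS", training_ratio=1024)
)

# Building the algorithm runs LearnerGroup -> TfLearner.build() ->
# configure_optimizers(), which is where the tf.Variable check above fails.
algo = config.build()
algo.train()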

Issue Severity

None

Mark2000 commented 2 weeks ago

For some additional context, I have managed to produce a different error that occurs farther along in execution in an existing venv, but I haven't been able to replicate it in a fresh venv. Notably, it occurs after the dreamer_model has been created:

(DreamerV3 pid=16827) Install gputil for GPU system monitoring.
(DreamerV3 pid=16827) Model: "dreamer_model"
(DreamerV3 pid=16827) _________________________________________________________________
(DreamerV3 pid=16827)  Layer (type)                Output Shape              Param #   
(DreamerV3 pid=16827) =================================================================
(DreamerV3 pid=16827)  world_model (WorldModel)    multiple                  0 (unused)
(DreamerV3 pid=16827) |¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯|
(DreamerV3 pid=16827) | vector_encoder (MLP)       multiple                  1536     |
(DreamerV3 pid=16827) ||¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯||
(DreamerV3 pid=16827) || dense (Dense)             multiple                  1024    ||
(DreamerV3 pid=16827) ||                                                             ||
...
...
(DreamerV3 pid=16827) | reward_layer_255buckets (  multiple                  65535    |
(DreamerV3 pid=16827) | RewardPredictorLayer)                                         |
(DreamerV3 pid=16827) ¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯
(DreamerV3 pid=16827) =================================================================
(DreamerV3 pid=16827) Total params: 3551238 (13.55 MB)
(DreamerV3 pid=16827) Trainable params: 3157509 (12.04 MB)
(DreamerV3 pid=16827) Non-trainable params: 393729 (1.50 MB)
(DreamerV3 pid=16827) _________________________________________________________________

Trial status: 1 RUNNING
Current time: 2024-11-06 14:09:41. Total running time: 30s
Logical resource usage: 1.0/12 CPUs, 0/0 GPUs
╭──────────────────────────────────────────────╮
│ Trial name                          status   │
├──────────────────────────────────────────────┤
│ DreamerV3_CartPole-v1_5e43c_00000   RUNNING  │
╰──────────────────────────────────────────────╯
Trial status: 1 RUNNING
Current time: 2024-11-06 14:10:11. Total running time: 1min 0s
Logical resource usage: 1.0/12 CPUs, 0/0 GPUs
╭──────────────────────────────────────────────╮
│ Trial name                          status   │
├──────────────────────────────────────────────┤
│ DreamerV3_CartPole-v1_5e43c_00000   RUNNING  │
╰──────────────────────────────────────────────╯
2024-11-06 14:10:30,548 ERROR tune_controller.py:1331 -- Trial task failed for trial DreamerV3_CartPole-v1_5e43c_00000
Traceback (most recent call last):
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/air/execution/_internal/event_manager.py", line 110, in resolve_future
    result = ray.get(future)
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/_private/auto_init_hook.py", line 21, in auto_init_wrapper
    return fn(*args, **kwargs)
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/_private/client_mode_hook.py", line 103, in wrapper
    return func(*args, **kwargs)
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/_private/worker.py", line 2661, in get
    values, debugger_breakpoint = worker.get_objects(object_refs, timeout=timeout)
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/_private/worker.py", line 871, in get_objects
    raise value.as_instanceof_cause()
ray.exceptions.RayTaskError(KeyError): ray::DreamerV3.train() (pid=16827, ip=127.0.0.1, actor_id=725f37650ad104326a90233e01000000, repr=DreamerV3)
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/tune/trainable/trainable.py", line 331, in train
    raise skipped from exception_cause(skipped)
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/tune/trainable/trainable.py", line 328, in train
    result = self.step()
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/rllib/algorithms/algorithm.py", line 969, in step
    self.env_runner_group.sync_env_runner_states(
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/rllib/env/env_runner_group.py", line 395, in sync_env_runner_states
    self.local_env_runner.set_state(
  File "/Users/markstephenson/avslab/.venv/lib/python3.10/site-packages/ray/rllib/algorithms/dreamerv3/utils/env_runner.py", line 555, in set_state
    self.module.set_state(state[COMPONENT_RL_MODULE][DEFAULT_MODULE_ID])
KeyError: 'rl_module'
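
For reference, the failing line indexes a nested state dict roughly like the sketch below (the top-level key name is taken from the traceback; the module-id literal is my assumption, not confirmed from this log). The KeyError suggests the state produced by sync_env_runner_states() has no 'rl_module' entry at this point:

# Rough sketch of the state layout that env_runner.set_state() indexes into:
expected_state = {
    "rl_module": {               # COMPONENT_RL_MODULE; missing here -> KeyError
        "default_policy": {},    # module id (assumed) -> RLModule weights
    },
}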

Associated package versions are here:

keras                         2.15.0
ray                           2.35.0
tensorflow                    2.15.0
tensorflow-estimator          2.15.0
tensorflow-io-gcs-filesystem  0.37.1
tensorflow-macos              2.15.0
tensorflow-probability        0.23.0

Mark2000 commented 1 week ago

The latter issue I encountered appears to be the same as this unresolved one: https://github.com/ray-project/ray/issues/47527