PKU-Alignment / omnisafe

JMLR: OmniSafe is an infrastructural framework for accelerating SafeRL research.
https://www.omnisafe.ai
Apache License 2.0

[Question] How can I run an OmniSafe agent with a safety_gymnasium env? #132

Closed. zlpiscoming closed this issue 1 year ago.

zlpiscoming commented 1 year ago

Questions

I want to run OmniSafe with a safety_gymnasium environment (or some other environment). How can I train and evaluate it? Something like this:

```python
import safety_gymnasium
import omnisafe

if __name__ == '__main__':
    env = safety_gymnasium.make('SafetyCarPush2-v0')
    agent = omnisafe.Agent('PPOLag', env)
    agent.learn()

    obs, info = env.reset()
    ep_reward, ep_cost = 0, 0
    for i in range(1000):
        action, states = agent.predict(obs, deterministic=True)
        obs, reward, cost, done, _, info = env.step(action)
        ep_reward += reward
        ep_cost += cost
        env.render()
        if done:
            print(ep_reward, ep_cost)
            obs, info = env.reset()
            ep_reward, ep_cost = 0, 0
    env.close()
```

Gaiejj commented 1 year ago

You can run the following command line:

```bash
cd omnisafe/examples
python train_policy.py --algo ALGO --env-id ENV
```

where ALGO can be any algorithm implemented in OmniSafe, e.g. PPO or TRPO, and ENV can be SafetyPointGoal1-v0 or any other supported environment id.
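
If you prefer the Python API over the command line, the same training run can be launched from a script. Here is a minimal sketch, assuming the `omnisafe.Agent(algo, env_id, custom_cfgs=...)` interface shown in the README; the environment id and config overrides below are just examples:

```python
# Minimal training sketch via OmniSafe's Python API (see the project README).
# The env id and config values below are examples, not recommendations.
import omnisafe

if __name__ == '__main__':
    env_id = 'SafetyCarPush2-v0'
    custom_cfgs = {
        'train_cfgs': {'total_steps': 1_000_000},
        'logger_cfgs': {'use_wandb': False},
    }
    agent = omnisafe.Agent('PPOLag', env_id, custom_cfgs=custom_cfgs)
    agent.learn()
```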

Then you can check your experiment results under omnisafe/examples/runs. Copy the run's path into examples/evaluate_saved_policy.py and run

```bash
cd omnisafe/examples
python evaluate_saved_policy.py
```

to evaluate your policy.
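
Equivalently, the evaluation can be scripted. A rough sketch of what `examples/evaluate_saved_policy.py` does, assuming the `omnisafe.Evaluator` class with `load_saved` and `evaluate` methods; the log directory below is a placeholder to replace with your own run path:

```python
# Rough evaluation sketch mirroring examples/evaluate_saved_policy.py.
# LOG_DIR is a placeholder; paste the run directory produced by training.
import os
import omnisafe

LOG_DIR = './examples/runs/<your-run-directory>'  # placeholder

evaluator = omnisafe.Evaluator()
for item in os.scandir(os.path.join(LOG_DIR, 'torch_save')):
    # Each .pt file under torch_save is a model checkpoint saved during training.
    if item.is_file() and item.name.endswith('.pt'):
        evaluator.load_saved(save_dir=LOG_DIR, model_name=item.name)
        evaluator.evaluate(num_episodes=1)
```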

zlpiscoming commented 1 year ago

Can I use an OmniSafe agent with an environment that I create myself?

friedmainfunction commented 1 year ago

> Can I use an OmniSafe agent with an environment that I create myself?

Sorry for the late reply. The answer is yes. Take a look at omnisafe/envs: the wrappers and env_register mechanism hide the differences between environments, so you can train a policy on your own env. You may also want to look at omnisafe/adapter, which handles the rollout process. A sketch of what registration can look like follows below.
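
For anyone landing here later, here is a rough skeleton of a registered custom environment. It is only a sketch assuming the `CMDP` base class and `env_register` decorator exposed in `omnisafe/envs`; the exact set of required methods, class attributes, and return types should be checked against `omnisafe/envs/core.py` and the built-in environments:

```python
# Rough sketch of a custom environment registration; attribute names, wrapper
# flags, and required methods are assumptions modeled on the built-in envs and
# should be verified against omnisafe/envs/core.py.
import random

import torch
from gymnasium import spaces
from omnisafe.envs.core import CMDP, env_register


@env_register
class MyCustomEnv(CMDP):
    _support_envs = ['MyCustom-v0']      # env ids this class claims to handle
    need_auto_reset_wrapper = True       # let OmniSafe reset automatically on episode end
    need_time_limit_wrapper = False      # set True if the env has no internal time limit

    def __init__(self, env_id, **kwargs):
        super().__init__(env_id)
        self._count = 0
        self._observation_space = spaces.Box(low=-1.0, high=1.0, shape=(3,))
        self._action_space = spaces.Box(low=-1.0, high=1.0, shape=(2,))

    def reset(self, seed=None, options=None):
        if seed is not None:
            self.set_seed(seed)
        self._count = 0
        return torch.zeros(3), {}

    def step(self, action):
        # Return the CMDP tuple: observation, reward, cost, terminated, truncated, info.
        self._count += 1
        obs = torch.zeros(3)
        reward = torch.as_tensor(0.0)
        cost = torch.as_tensor(0.0)      # constraint-violation signal alongside the reward
        terminated = torch.as_tensor(False)
        truncated = torch.as_tensor(self._count >= 100)
        return obs, reward, cost, terminated, truncated, {}

    def set_seed(self, seed):
        random.seed(seed)

    def sample_action(self):
        return torch.as_tensor(self._action_space.sample())

    def render(self):
        raise NotImplementedError

    def close(self):
        pass
```

Once registered, the new id can in principle be trained like any built-in environment, e.g. `omnisafe.Agent('PPOLag', 'MyCustom-v0').learn()`.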

zmsn-2077 commented 1 year ago

Feel free to ask us to reopen this issue if you have more questions.