PPO on continuous actions #77

@zaksemenov

Description

I noticed that the PPO agent initialization forces `is_action_continuous=False`, whereas the PPO algorithm itself, and other libraries implementing it, supports continuous action spaces. Can continuous-action support be added to Pearl as well?

https://github.com/facebookresearch/Pearl/blob/main/pearl/policy_learners/sequential_decision_making/ppo.py#L99
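For reference, the usual way PPO handles continuous actions is to replace the categorical actor head with a diagonal-Gaussian one and sum log-probabilities over action dimensions in the clipped objective. A minimal PyTorch sketch of that idea (the names `GaussianPolicy` and `ppo_clip_loss` are illustrative, not part of Pearl's API):

```python
import torch
import torch.nn as nn

class GaussianPolicy(nn.Module):
    """Diagonal-Gaussian actor head for continuous-action PPO (illustrative)."""
    def __init__(self, obs_dim: int, act_dim: int, hidden: int = 64) -> None:
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        self.mean = nn.Linear(hidden, act_dim)
        # State-independent log-std parameter, a common choice for PPO actors.
        self.log_std = nn.Parameter(torch.zeros(act_dim))

    def dist(self, obs: torch.Tensor) -> torch.distributions.Normal:
        h = self.body(obs)
        return torch.distributions.Normal(self.mean(h), self.log_std.exp())

def ppo_clip_loss(policy, obs, actions, old_log_probs, advantages, clip=0.2):
    # Log-prob of each action: sum over dimensions of the diagonal Gaussian.
    new_log_probs = policy.dist(obs).log_prob(actions).sum(-1)
    ratio = (new_log_probs - old_log_probs).exp()
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip, 1.0 + clip) * advantages
    # PPO's pessimistic (clipped) surrogate objective, negated for minimization.
    return -torch.min(unclipped, clipped).mean()

policy = GaussianPolicy(obs_dim=4, act_dim=2)
obs = torch.randn(8, 4)
actions = policy.dist(obs).sample()
old_log_probs = policy.dist(obs).log_prob(actions).sum(-1).detach()
advantages = torch.randn(8)
loss = ppo_clip_loss(policy, obs, actions, old_log_probs, advantages)
```

The value/critic side of PPO is unchanged; only the actor's distribution (and hence the log-prob and entropy computations) differs between the discrete and continuous cases.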

Metadata

Assignees

Labels

enhancement (New feature or request)
