How to use PPO to train in psro_scenario #59

@donotbelieveit

Description

I cannot find an implementation of PPO in this project. From the docs I know the policy is compatible with Tianshou, but what about the trainer? How can I use PPO to train in psro_scenario? I would appreciate it if you could answer my question.
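For context on what such a trainer would need to optimize: PPO updates a policy by maximizing a clipped surrogate objective. The sketch below is a framework-free, per-sample illustration of that objective (the function name and the standalone setup are my own for illustration; this is not this project's or Tianshou's implementation):

```python
import math

def ppo_clip_loss(logp_new, logp_old, advantage, clip_eps=0.2):
    """Per-sample PPO clipped surrogate loss (negated objective).

    ratio = pi_new(a|s) / pi_old(a|s), recovered from log-probabilities.
    The objective is min(ratio * A, clip(ratio, 1 - eps, 1 + eps) * A);
    the loss is its negation, since optimizers minimize.
    """
    ratio = math.exp(logp_new - logp_old)
    clipped = max(min(ratio, 1.0 + clip_eps), 1.0 - clip_eps)
    return -min(ratio * advantage, clipped * advantage)

# With identical old/new log-probs the ratio is 1, so the loss is just -A.
print(ppo_clip_loss(0.0, 0.0, advantage=1.0))  # → -1.0
```

In a Tianshou-compatible setup, this kind of loss would be computed inside the policy's learn step over collected batches; how to wire that into psro_scenario's training loop is exactly the open question here.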
