# Clash Royale Env

|                   |                                                     |
|-------------------|-----------------------------------------------------|
| Action Space      | `Discrete(2304)`                                    |
| Observation Shape | `(128, 128, 3)`                                     |
| Observation High  | `255`                                               |
| Observation Low   | `0`                                                 |
| Import            | `import clash_royale`<br>`gymnasium.make("clash-royale", render_mode="rgb_array")` |

## Description

This package provides Clash Royale as a Gymnasium environment. It supports Python 3.10 and above.

## Installation

```bash
pip install git+https://github.com/MSU-AI/clash-royale-rl.git@0.0.1
```

## Usage

1. Import it to train your RL model:

```python
import clash_royale
import gymnasium

env = gymnasium.make("clash-royale", render_mode="rgb_array")
```

The package relies on import side effects to register the environment name with Gymnasium: even though `clash_royale` is never referenced directly after the import, importing it is required before `gymnasium.make("clash-royale", ...)` can resolve the environment id.
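
As a quick sanity check, you can look the id up after importing the package. This is a minimal sketch, assuming a recent Gymnasium version that exposes `gymnasium.spec` and that the environment is registered under the exact id `"clash-royale"`:

```python
import clash_royale  # imported only for its registration side effect
import gymnasium

# Assumption: the id "clash-royale" is what the package registers.
# gymnasium.spec raises an error if the id is unknown.
print(gymnasium.spec("clash-royale"))
```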

2. Some sample code:
```python
# WARNING: This code is subject to change and may be OUTDATED!
import clash_royale
import gymnasium

env = gymnasium.make("clash-royale", render_mode="rgb_array")

obs, _ = env.reset()
while True:
    # Next action:
    # (feed the observation to your agent here)
    action = env.action_space.sample()

    # Processing:
    obs, reward, terminated, truncated, info = env.step(action)

    # Stop once the episode has ended
    if terminated or truncated:
        break

env.close()
```

## Action Space

Clash Royale has the action space Discrete(2304).

| Variable | Meaning            |
|----------|--------------------|
| x        | Card x-coordinate  |
| y        | Card y-coordinate  |
| z        | Card index in hand |

Each action is a single index that encodes the triple (x, y, z); the size of the action space, 2304, is the product of the ranges of x, y, and z.
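
For illustration only, here is a sketch of how such a flattened index could be decoded back into a placement. The grid dimensions (18 × 32) and hand size (4) are assumptions chosen only because 18 × 32 × 4 = 2304; the environment's actual ordering of x, y, and z may differ, so check the source before relying on this.

```python
import numpy as np

# Assumed decomposition: 18 x-columns, 32 y-rows, 4 cards in hand (18 * 32 * 4 = 2304).
GRID_X, GRID_Y, HAND_SIZE = 18, 32, 4

def decode_action(index: int) -> tuple[int, int, int]:
    """Unflatten a Discrete(2304) index into (x, y, z)."""
    x, y, z = np.unravel_index(index, (GRID_X, GRID_Y, HAND_SIZE))
    return int(x), int(y), int(z)

def encode_action(x: int, y: int, z: int) -> int:
    """Flatten (x, y, z) back into a single Discrete(2304) index."""
    return int(np.ravel_multi_index((x, y, z), (GRID_X, GRID_Y, HAND_SIZE)))

assert encode_action(*decode_action(1234)) == 1234
```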

## Observation Space

The observation is the RGB image displayed to a human player, with observation space `Box(low=0, high=255, shape=(128, 128, 3), dtype=np.uint8)`.
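
A minimal sketch of verifying that an observation matches this specification, using only the standard Gymnasium API shown above:

```python
import clash_royale  # registers the environment
import gymnasium
import numpy as np

env = gymnasium.make("clash-royale", render_mode="rgb_array")
obs, _ = env.reset()

# The observation should be an 8-bit RGB frame matching the declared Box space.
assert obs.shape == (128, 128, 3)
assert obs.dtype == np.uint8
assert env.observation_space.contains(obs)
```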

## Version History

- v0.0.1: initial release with mock API calls for internal testing