
Commit e7f73df

update batch size in D3IL so it works with the new form of gradient update
allenzren committed Dec 24, 2024
1 parent 1d04211 commit e7f73df
Showing 3 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion cfg/d3il/finetune/avoid_m1/ft_ppo_diffusion_mlp.yaml
@@ -76,7 +76,7 @@ train:
   reward_scale_running: True
   reward_scale_const: 1.0
   gae_lambda: 0.95
-  batch_size: ${eval:'round(${train.n_steps} * ${env.n_envs} / 2)'}
+  batch_size: ${eval:'round(${train.n_steps} * ${env.n_envs} * ${ft_denoising_steps} / 2)'}
   update_epochs: 10
   vf_coef: 0.5
   target_kl: 1
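The only change in each file is the batch_size expression, which now multiplies in ft_denoising_steps: with the reworked gradient update, each fine-tuned denoising step presumably contributes its own sample to the PPO minibatch, so the batch is half of n_steps * n_envs * ft_denoising_steps rather than half of n_steps * n_envs. Below is a minimal sketch of how such an expression resolves, assuming the training script registers an eval resolver with OmegaConf; the registration line and the numeric values are illustrative assumptions, not taken from this commit.

from omegaconf import OmegaConf

# Assumption: the repo registers an "eval" resolver roughly like this so that
# ${eval:'...'} expressions in the YAML configs are evaluated as Python.
OmegaConf.register_new_resolver("eval", eval)

# Hypothetical values standing in for the D3IL avoid_m1 config.
cfg = OmegaConf.create(
    {
        "ft_denoising_steps": 10,
        "env": {"n_envs": 50},
        "train": {
            "n_steps": 100,
            # New expression from the diff: the denoising dimension now
            # enters the sample count before dividing by 2.
            "batch_size": "${eval:'round(${train.n_steps} * ${env.n_envs} * ${ft_denoising_steps} / 2)'}",
        },
    }
)

# Old form: round(100 * 50 / 2) = 2500
# New form: round(100 * 50 * 10 / 2) = 25000
print(cfg.train.batch_size)  # -> 25000 with the assumed values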
2 changes: 1 addition & 1 deletion cfg/d3il/finetune/avoid_m2/ft_ppo_diffusion_mlp.yaml
@@ -76,7 +76,7 @@ train:
   reward_scale_running: True
   reward_scale_const: 1.0
   gae_lambda: 0.95
-  batch_size: ${eval:'round(${train.n_steps} * ${env.n_envs} / 2)'}
+  batch_size: ${eval:'round(${train.n_steps} * ${env.n_envs} * ${ft_denoising_steps} / 2)'}
   update_epochs: 10
   vf_coef: 0.5
   target_kl: 1
2 changes: 1 addition & 1 deletion cfg/d3il/finetune/avoid_m3/ft_ppo_diffusion_mlp.yaml
@@ -76,7 +76,7 @@ train:
   reward_scale_running: True
   reward_scale_const: 1.0
   gae_lambda: 0.95
-  batch_size: ${eval:'round(${train.n_steps} * ${env.n_envs} / 2)'}
+  batch_size: ${eval:'round(${train.n_steps} * ${env.n_envs} * ${ft_denoising_steps} / 2)'}
   update_epochs: 10
   vf_coef: 0.5
   target_kl: 1
