chore: Add whisper dropout to large turbo, up lr to 3e-5
saattrupdan committed Oct 29, 2024
1 parent 0594341 commit 6443f2e
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions config/model/whisper-large-turbo.yaml
@@ -9,8 +9,8 @@ clean_text: true

# Model hyperparameters
sampling_rate: 16_000
-dropout: 0.0
-activation_dropout: 0.1
+dropout: 0.1
+activation_dropout: 0.0
attention_dropout: 0.0
mask_time_prob: 0.5
mask_time_length: 10
@@ -20,4 +20,4 @@ layerdrop: 0.1 # NOTE: This will automatically be set to 0 in a multi-gpu setting
max_length: 225

# Model-specific optimisation hyperparameters
-learning_rate: 1e-5
+learning_rate: 3e-5
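In effect, the commit moves the dropout mass from the activation sites to the model-wide dropout slot and triples the learning rate. A minimal sketch of the before/after values (the `WhisperHyperparams` dataclass below is a hypothetical stand-in for the YAML fields, not code from the repository):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class WhisperHyperparams:
    # Hypothetical stand-in for the fields in whisper-large-turbo.yaml
    dropout: float = 0.0
    activation_dropout: float = 0.1
    attention_dropout: float = 0.0
    learning_rate: float = 1e-5

before = WhisperHyperparams()
# Values after this commit: general dropout enabled, activation
# dropout disabled, learning rate raised from 1e-5 to 3e-5.
after = replace(
    before,
    dropout=0.1,
    activation_dropout=0.0,
    learning_rate=3e-5,
)
```

Note that the total dropout probability applied per layer is not simply conserved: `dropout` and `activation_dropout` act at different points in the network, so this is a qualitative change in regularisation, not a relabelling.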
