
Commit

extends attention algorithm with masking
JakubSchwenkbeck authored Dec 7, 2024
1 parent e3c8aff commit f80fa3a
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion src/attention/scaled_dot_attention.rs
@@ -11,7 +11,6 @@ pub fn scaled_dot_product_attention(
 ) -> Array3<f32> {
     let scores = scaled_dot_product(q, k, v.clone(), mask);
     let sm_scores = softmax_3d(&scores);
-    // TODO implement masking
     tensor_product(&sm_scores, &v)
 }

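The changed function composes scaled dot-product attention as softmax(QKᵀ/√d_k + mask)·V, with the mask applied inside `scaled_dot_product` before the softmax (which is why the TODO comment could be dropped). As a minimal self-contained sketch of that computation, here is a dependency-free Rust version using `Vec<Vec<f32>>` matrices instead of the repo's ndarray `Array3<f32>`; the function and helper names here are illustrative, not the repository's actual API:

```rust
// Sketch of scaled dot-product attention with an additive mask.
// Mask entries are added to the scores before the softmax; a value of
// f32::NEG_INFINITY blocks attention to that position entirely.

type Mat = Vec<Vec<f32>>;

/// Row-wise, numerically stable softmax.
fn softmax(rows: &Mat) -> Mat {
    rows.iter()
        .map(|row| {
            let max = row.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
            let exps: Vec<f32> = row.iter().map(|&x| (x - max).exp()).collect();
            let sum: f32 = exps.iter().sum();
            exps.iter().map(|&e| e / sum).collect()
        })
        .collect()
}

/// Naive matrix product: (n x k) * (k x m) -> (n x m).
fn matmul(a: &Mat, b: &Mat) -> Mat {
    let (n, k, m) = (a.len(), b.len(), b[0].len());
    (0..n)
        .map(|i| (0..m).map(|j| (0..k).map(|p| a[i][p] * b[p][j]).sum()).collect())
        .collect()
}

fn transpose(a: &Mat) -> Mat {
    let (n, m) = (a.len(), a[0].len());
    (0..m).map(|j| (0..n).map(|i| a[i][j]).collect()).collect()
}

/// softmax((Q K^T) / sqrt(d_k) + mask) * V
fn scaled_dot_product_attention(q: &Mat, k: &Mat, v: &Mat, mask: Option<&Mat>) -> Mat {
    let d_k = k[0].len() as f32;
    let mut scores = matmul(q, &transpose(k));
    for row in scores.iter_mut() {
        for s in row.iter_mut() {
            *s /= d_k.sqrt(); // scale before masking and softmax
        }
    }
    if let Some(m) = mask {
        for (i, row) in scores.iter_mut().enumerate() {
            for (j, s) in row.iter_mut().enumerate() {
                *s += m[i][j]; // additive mask: -inf removes a position
            }
        }
    }
    matmul(&softmax(&scores), v)
}

fn main() {
    let q = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let k = q.clone();
    let v = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    // Causal mask: position 0 may not attend to position 1.
    let mask = vec![vec![0.0, f32::NEG_INFINITY], vec![0.0, 0.0]];
    // With position 1 masked out, row 0 of the output equals row 0 of V.
    let out = scaled_dot_product_attention(&q, &k, &v, Some(&mask));
    println!("{:?}", out);
}
```

An additive mask (rather than a multiplicative one) is the usual choice because adding negative infinity before the softmax drives the corresponding attention weight to exactly zero while keeping the remaining weights a valid probability distribution.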
