feat: jl projection #2
Conversation
labrador/src/jl_projection.rs
Outdated
let mut projected_data = Vec::with_capacity(data.len());
for data_vec in data.clone() {
    let mut projected_vector: Vec<RingGoldilock256> = Vec::with_capacity(target_dim);
    for projection_row in projection_matrix.clone() {
        let projection_value = RingGoldilock256::dot_product(&data_vec, &projection_row);
        projected_vector.push(projection_value);
    }
    projected_data.push(projected_vector);
}
Probably using references instead of cloning would be better here. I am not completely sure, but we can discuss it here.
Suggested change:

let mut projected_data = Vec::with_capacity(data.len());
for data_vec in &data {
    let mut projected_vector: Vec<RingGoldilock256> = Vec::with_capacity(target_dim);
    for projection_row in &projection_matrix {
        let projection_value = RingGoldilock256::dot_product(&data_vec, &projection_row);
        projected_vector.push(projection_value);
    }
    projected_data.push(projected_vector);
}
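For comparison, here is a minimal iterator-based sketch of the same loop that borrows instead of cloning; it assumes the `RingGoldilock256::dot_product` signature used in the diff above and is only an illustration, not code from the PR:

// Sketch only: iterates `data` and `projection_matrix` by reference instead of cloning;
// assumes `dot_product` accepts references, as in the diff above.
let projected_data: Vec<Vec<RingGoldilock256>> = data
    .iter()
    .map(|data_vec| {
        projection_matrix
            .iter()
            .map(|projection_row| RingGoldilock256::dot_product(data_vec, projection_row))
            .collect()
    })
    .collect();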
And I reckon the result of the projection should be a vector of pure numbers. For example, given some input data and a randomly sampled projection vector, I think in the end we will get a 256-length vector [9, 2, 13, ...].
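To make that reading concrete, here is a minimal, self-contained sketch over plain i64 coefficients (toy dimensions; the names are illustrative and not from the PR): a target_dim x n matrix with small entries maps an n-length coefficient vector to target_dim plain integers.

// Sketch only: JL-style projection over plain integers, not the repo's ring types.
fn jl_project_coeffs(projection: &[Vec<i64>], coeffs: &[i64]) -> Vec<i64> {
    projection
        .iter()
        .map(|row| row.iter().zip(coeffs).map(|(p, c)| p * c).sum())
        .collect()
}

fn main() {
    // Toy example: project a 4-entry coefficient vector down to 3 numbers.
    let projection = vec![
        vec![1, -1, 0, 1],
        vec![0, 1, 1, -1],
        vec![-1, 0, 1, 0],
    ];
    let coeffs = vec![3, 5, 2, 7];
    // In the actual scheme the output would be a 256-length vector of numbers.
    println!("{:?}", jl_project_coeffs(&projection, &coeffs)); // [5, 0, -1]
}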
I have a different understanding here: since the given data is a polynomial ring matrix, the result should be a polynomial ring matrix of lower dimension.
I think @FoodChain1028 is correct; refer to page 35 of https://eprint.iacr.org/2024/311.pdf#page=34.19 and you will find that the result of the JL projection lies in Z_q^256, i.e., a 256-length vector of plain integers.
In the Greyhound paper (https://eprint.iacr.org/2024/1293.pdf#page=28.41), they made some changes to the details of the JL projection. I think we should follow those changes. But I'm not sure what the probability distribution over {1, -1} is; maybe 1/2 each?
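For the sampling question, a hypothetical sketch of drawing the matrix entries uniformly from {1, -1} with probability 1/2 each, using the rand crate's thread_rng/gen_bool; whether this matches the distribution actually used in the Greyhound paper is exactly the open question above.

use rand::Rng;

// Sketch only: fills a target_dim x n projection matrix with entries drawn
// uniformly from {1, -1} (probability 1/2 each). The function name and the
// choice of distribution are assumptions for illustration, not the PR's API.
fn sample_pm1_matrix(target_dim: usize, n: usize) -> Vec<Vec<i64>> {
    let mut rng = rand::thread_rng();
    (0..target_dim)
        .map(|_| (0..n).map(|_| if rng.gen_bool(0.5) { 1 } else { -1 }).collect())
        .collect()
}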
Closing the PR for now, as we will be focusing on the PoC prover implementation and will likely use the implementation from the math library.
Summary
This PR implements the JL projection method, which projects a given polynomial ring matrix to a polynomial ring matrix of smaller dimension.
Example
Given a data matrix D_{2,256} and a target dimension of 128, the expected result is a data matrix D_{2,128}.
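As a rough, self-contained illustration of the shape change only (i64 stand-ins instead of the repo's ring type; `jl_project` is a hypothetical name here, not the PR's actual API):

// Sketch only: demonstrates D_{2,256} -> D_{2,128} using plain i64 entries.
fn jl_project(data: &[Vec<i64>], projection: &[Vec<i64>]) -> Vec<Vec<i64>> {
    data.iter()
        .map(|row| {
            projection
                .iter()
                .map(|p_row| p_row.iter().zip(row).map(|(p, x)| p * x).sum())
                .collect()
        })
        .collect()
}

fn main() {
    let data = vec![vec![1i64; 256]; 2];         // D_{2,256}: 2 rows of 256 entries
    let projection = vec![vec![1i64; 256]; 128]; // 128 x 256 projection matrix
    let projected = jl_project(&data, &projection);
    assert_eq!(projected.len(), 2);              // still 2 rows
    assert_eq!(projected[0].len(), 128);         // each row projected down to 128 entries
}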