I tried to use fxhash to distribute objects between buckets, using `buckets[fxhash(value) % buckets.len()]`.
However, I noticed that all objects ended up in the same bucket, because the low-order bits (in fact, the entire lower half of the digest) were constant.
This example shows the issue:
```rust
for &value in &[b"00000001", b"00000002", b"00000003"] {
    let mut h = FxHasher::default();
    h.write(value);
    let r: u64 = h.finish();
    println!("{:016x}", r);
}
```
Output:
```
8eb24dfba44debf0
23b24dfba44debf0
b8b24dfba44debf0
```
Is this a known issue? Is there a recommended way to truncate the digest without losing too much entropy? This happens with a much larger number of test values too (I tried 100,000 values formatted the same way).
fnv gives a better result in this case:
```
5bedc133e5925f92
5bedc033e5925ddf
5bedbf33e5925c2c
```
This might warrant a warning in the docs.
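Not part of the original report, but one common mitigation when the low half of a digest is weak: derive the bucket index from the *high* bits via a multiply-high range reduction instead of `hash % len`. A minimal sketch using only the standard library (the function name `bucket_of` is hypothetical, and the inputs are the three digests printed above):

```rust
/// Maps `hash` into 0..num_buckets using the high-order bits of the
/// 128-bit product, so constant low-order bits don't collapse buckets.
fn bucket_of(hash: u64, num_buckets: usize) -> usize {
    ((hash as u128 * num_buckets as u128) >> 64) as usize
}

fn main() {
    // The three FxHasher digests reported in the issue:
    let hashes = [0x8eb24dfba44debf0u64, 0x23b24dfba44debf0, 0xb8b24dfba44debf0];
    for h in hashes {
        // With 8 buckets, the index is just the top 3 bits of the hash,
        // which do vary in these digests.
        println!("{:016x} -> bucket {}", h, bucket_of(h, 8));
    }
}
```

With `hash % 8` all three digests would land in bucket 0 (they share the same low bits); taking the high bits instead spreads them across distinct buckets.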