Fast conversion of arbitrary protobuf data to JSON
- No schema required
- Uses field numbers as JSON keys
- Configurable bytes encoding (base64, hex, byte array, etc.)
- Automatically guesses length-delimited value types (string, nested message, bytes)
- The type of a length-delimited value is guessed from its content, so the guess may not always be correct (see the sketch after this list)
- Repeated fields (fields with the same field number) may not be grouped into an array when only one occurrence is parsed
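The wire format does not distinguish strings, bytes, and nested messages, so the type of a length-delimited value has to be inferred from its content. The sketch below shows one plausible heuristic for that guess; it is an illustration only, not this crate's actual implementation, and all names in it are made up:

```rust
// Illustration only: one possible heuristic for guessing the type of a
// length-delimited value. Names and rules here are assumptions, not the
// crate's actual code.
#[derive(Debug)]
enum Guess<'a> {
    Message(&'a [u8]), // payload scans as a well-formed nested message
    Str(&'a str),      // payload is valid UTF-8
    Bytes(&'a [u8]),   // fallback: opaque bytes
}

fn guess_len_delimited(payload: &[u8]) -> Guess<'_> {
    if !payload.is_empty() && scans_as_message(payload) {
        Guess::Message(payload)
    } else if let Ok(s) = std::str::from_utf8(payload) {
        Guess::Str(s)
    } else {
        Guess::Bytes(payload)
    }
}

/// Walk the payload as a sequence of (tag, value) records and require that
/// every record is well formed and the payload is consumed exactly.
fn scans_as_message(mut buf: &[u8]) -> bool {
    while !buf.is_empty() {
        let Some((tag, rest)) = read_varint(buf) else { return false };
        if tag >> 3 == 0 {
            return false; // field number 0 is invalid
        }
        buf = match tag & 0x7 {
            0 => match read_varint(rest) { // varint
                Some((_, r)) => r,
                None => return false,
            },
            1 if rest.len() >= 8 => &rest[8..], // 64-bit
            2 => match read_varint(rest) { // length-delimited
                Some((len, r)) if r.len() as u64 >= len => &r[len as usize..],
                _ => return false,
            },
            5 if rest.len() >= 4 => &rest[4..], // 32-bit
            _ => return false, // groups and unknown wire types rejected
        };
    }
    true
}

/// Decode a base-128 varint, returning the value and the remaining bytes.
fn read_varint(buf: &[u8]) -> Option<(u64, &[u8])> {
    let mut value = 0u64;
    for (i, &b) in buf.iter().enumerate().take(10) {
        value |= u64::from(b & 0x7f) << (7 * i);
        if b & 0x80 == 0 {
            return Some((value, &buf[i + 1..]));
        }
    }
    None
}
```

A payload that happens to scan as a well-formed message (for example, a short ASCII string whose bytes also form valid tags) can still be misclassified, which is why such a guess can never be guaranteed correct.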
Add it to your `Cargo.toml`:
```toml
[dependencies]
protobuf-to-json = "0.1"
```
```rust
use protobuf_to_json::Parser;
use serde_json::json;

fn main() {
    // Raw protobuf wire data: field 1 = fixed32 28, field 2 = "You",
    // field 3 = "Me", field 4 = varint 43,
    // field 5 = nested message { 1: "abc123", 2: "" }.
    let data = [
        0x0d, 0x1c, 0x00, 0x00, 0x00, 0x12, 0x03, 0x59, 0x6f, 0x75, 0x1a, 0x02, 0x4d, 0x65,
        0x20, 0x2b, 0x2a, 0x0a, 0x0a, 0x06, 0x61, 0x62, 0x63, 0x31, 0x32, 0x33, 0x12, 0x00,
    ];

    let parser = Parser::new();
    let json = parser.parse(&data).unwrap();
    println!("{}", json);

    let expected = json!({
        "1": 28,
        "2": "You",
        "3": "Me",
        "4": 43,
        "5": {
            "1": "abc123",
            "2": ""
        }
    });
    assert_eq!(json, expected);
}
```
When parsing raw protobuf data without a known schema, performance compared with protofish is as follows:
| crate | time | throughput |
| --- | --- | --- |
| protofish (with context parsing\*) | 3.7243 µs | 101.15 MiB/s |
| protofish (without context parsing) | 317.92 ns | 1.1571 GiB/s |
| protobuf-to-json | 172.50 ns | 2.1326 GiB/s |
\* protofish parses the `.proto` file using pest, whose performance is not particularly good.
For detailed benchmark information, see `parse_once.rs`.
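For illustration, a Criterion benchmark along these lines could look like the sketch below. The actual `parse_once.rs` in this repository may be set up differently; the function name and benchmark label here are made up, and only `Parser::new()` and `parse()` are taken from the usage example above:

```rust
// Hypothetical benchmark sketch using Criterion; the real parse_once.rs may differ.
use criterion::{black_box, criterion_group, criterion_main, Criterion};
use protobuf_to_json::Parser;

fn parse_once(c: &mut Criterion) {
    // Same sample message as in the usage example above.
    let data = [
        0x0d, 0x1c, 0x00, 0x00, 0x00, 0x12, 0x03, 0x59, 0x6f, 0x75, 0x1a, 0x02, 0x4d, 0x65,
        0x20, 0x2b, 0x2a, 0x0a, 0x0a, 0x06, 0x61, 0x62, 0x63, 0x31, 0x32, 0x33, 0x12, 0x00,
    ];
    let parser = Parser::new();
    c.bench_function("protobuf-to-json", |b| {
        b.iter(|| parser.parse(black_box(&data)).unwrap())
    });
}

criterion_group!(benches, parse_once);
criterion_main!(benches);
```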
This project is licensed under either of
- Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.