Interoperability with tonic #141

aliaksei-imi opened this issue Jan 21, 2025 · 4 comments

@aliaksei-imi

Can prost-reflect be used together with tonic to inspect messages?

@andrewhickman
Owner

Yes, this library can be used with tonic.

You can convert the request message to a DynamicMessage using transcode_to_dynamic or transcode_from.
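
For example, a minimal sketch of inspecting a request inside a tonic server handler, assuming the generated type implements ReflectMessage (e.g. via prost-reflect-build); MyRequest and the field name here are placeholders:

    use prost_reflect::{DynamicMessage, ReflectMessage};

    // MyRequest stands in for any prost-generated message that implements ReflectMessage.
    fn inspect(request: &tonic::Request<MyRequest>) {
        // Transcode the concrete prost message into a DynamicMessage.
        let dynamic: DynamicMessage = request.get_ref().transcode_to_dynamic();

        // Fields can now be inspected generically by name.
        if let Some(value) = dynamic.get_field_by_name("some_field") {
            println!("some_field = {:?}", value);
        }
    }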

You can also use DynamicMessage directly as a request type, by implementing a custom Codec. You can see an example of this here:
https://github.com/andrewhickman/lanquetta/blob/main/src/grpc/codec.rs
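
Roughly, such a codec might look like this (a sketch only, not the linked code; tonic's Codec / Decoder / Encoder trait signatures can differ slightly between versions):

    use prost::Message;
    use prost_reflect::{DynamicMessage, MessageDescriptor};
    use tonic::codec::{Codec, DecodeBuf, Decoder, EncodeBuf, Encoder};
    use tonic::Status;

    // Decodes requests and encodes responses as DynamicMessage, using a
    // descriptor captured when the codec is constructed.
    struct DynamicCodec {
        request_desc: MessageDescriptor,
    }

    impl Codec for DynamicCodec {
        type Encode = DynamicMessage;
        type Decode = DynamicMessage;
        type Encoder = DynamicEncoder;
        type Decoder = DynamicDecoder;

        fn encoder(&mut self) -> Self::Encoder {
            DynamicEncoder
        }

        fn decoder(&mut self) -> Self::Decoder {
            DynamicDecoder { desc: self.request_desc.clone() }
        }
    }

    struct DynamicEncoder;

    impl Encoder for DynamicEncoder {
        type Item = DynamicMessage;
        type Error = Status;

        fn encode(&mut self, item: Self::Item, dst: &mut EncodeBuf<'_>) -> Result<(), Self::Error> {
            // DynamicMessage implements prost::Message, so it encodes like any other message.
            item.encode(dst).map_err(|e| Status::internal(e.to_string()))
        }
    }

    struct DynamicDecoder {
        desc: MessageDescriptor,
    }

    impl Decoder for DynamicDecoder {
        type Item = DynamicMessage;
        type Error = Status;

        fn decode(&mut self, src: &mut DecodeBuf<'_>) -> Result<Option<Self::Item>, Self::Error> {
            // Decode the frame straight into a DynamicMessage, no intermediate prost struct.
            let msg = DynamicMessage::decode(self.desc.clone(), src)
                .map_err(|e| Status::internal(e.to_string()))?;
            Ok(Some(msg))
        }
    }

A codec like this then has to be wired in through tonic's lower-level client/server types (e.g. tonic::client::Grpc or tonic::server::Grpc) rather than the default prost-generated stubs.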

@aliaksei-imi
Author

aliaksei-imi commented Jan 25, 2025

Oh, great,

dm.transcode_from(request.get_ref())

indeed works, thank you!
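
For context, roughly how that call sits in my handler (the pool lookup is simplified and MyRequest is a placeholder for the generated request type):

    use prost_reflect::{DescriptorPool, DynamicMessage};

    fn to_dynamic(
        pool: &DescriptorPool,
        request: &tonic::Request<MyRequest>,
    ) -> Result<DynamicMessage, prost::DecodeError> {
        let desc = pool
            .get_message_by_name("reflect.Request")
            .expect("message not found in descriptor pool");

        // Build an empty DynamicMessage for the descriptor, then transcode the
        // concrete prost message into it.
        let mut dm = DynamicMessage::new(desc);
        dm.transcode_from(request.get_ref())?;
        Ok(dm)
    }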

To clarify the goal: I have a server that accepts messages with hundreds of fields of different types. These messages need to be inspected, and latency kept minimal. So as far as I can see, transcode_from might add the overhead of encoding/transcoding each request.

I'm not sure this is the right place to ask, but is there a way to add DynamicMessage to tonic's code generation, or some other way to decode a client's request straight into a DynamicMessage? Your example seems to be for a gRPC client, not a server.

@aliaksei-imi
Author

I tried implementing a custom codec, but there's a trap: it requires Default to be implemented (so DynamicMessage can't be used directly), and ReflectMessage doesn't allow reflecting over field values.

@aliaksei-imi
Author

aliaksei-imi commented Jan 26, 2025

The best I could come up with is a TypeId -> message name HashMap:

    fn decoder(&mut self) -> Self::Decoder {
        // Map each concrete request/response type to its fully-qualified
        // protobuf message name, so decode() can look up the right descriptor.
        let req_tpid = TypeId::of::<Request>();
        let resp_tpid = TypeId::of::<Response>();

        Decoder {
            ph: PhantomData,
            names: HashMap::from([
                (req_tpid, "reflect.Request".into()),
                (resp_tpid, "reflect.Response".into()),
            ]),
        }
    }

and then using the TypeId of the Decode item type:

    fn decode(&mut self, buf: &mut DecodeBuf<'_>) -> Result<Option<Self::Item>, Self::Error> {
        if !buf.has_remaining() {
            return Ok(None);
        }

        // Look up the fully-qualified protobuf name for the item type being decoded.
        let tpid = TypeId::of::<Self::Item>();

        let name = self
            .names
            .get(&tpid)
            .ok_or(Status::internal("unknown message type"))?;

        let desc = DESCRIPTOR_POOL
            .get_message_by_name(name)
            .ok_or(Status::internal("unknown message type"))?;

        // Decode the wire bytes straight into a DynamicMessage.
        let msg = DynamicMessage::decode(desc, buf).map_err(|e| Status::internal(e.to_string()))?;

        // Wrap it in the item type, which exists only to satisfy the Default bound.
        let mut u: U = U::default();
        u.set_message(msg);

        Ok(Some(u))
    }
This requires wrapping DynamicMessage in a new type just to add the set_message method, so it all looks pretty dirty. But having read numerous articles about prost's serialization slowness, this avoids double de/serialization on each request, so it might be fine.
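
A hypothetical shape for that wrapper (the names here are made up):

    use prost_reflect::DynamicMessage;

    // Exists only to satisfy the Default bound required by the codec; the
    // decoder fills it in via set_message after decoding.
    #[derive(Default)]
    struct DynamicWrapper {
        message: Option<DynamicMessage>,
    }

    impl DynamicWrapper {
        fn set_message(&mut self, msg: DynamicMessage) {
            self.message = Some(msg);
        }
    }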
