Merge pull request #44 from Anders429/master
Merge master into dev
Anders429 authored Dec 18, 2024
2 parents f4fc1d3 + b2b9f43 commit e784031
Showing 5 changed files with 67 additions and 14 deletions.
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,9 @@
# Changelog

## 0.8.0 - 2024-06-27
### Added
- `Deserializer::deserialize_identifier()` now deserializes `Token::Bytes` along with `Token::Str` and `Token::Field`.

## 0.7.1 - 2023-12-26
### Changed
- `PartialEq` implementation for `Tokens` now avoids unnecessary iterator cloning when checking against `Unordered` `Token`s.
2 changes: 1 addition & 1 deletion Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "serde_assert"
version = "0.7.1"
version = "0.8.0"
authors = ["Anders Evensen"]
edition = "2021"
rust-version = "1.56.0"
61 changes: 55 additions & 6 deletions src/de.rs
@@ -59,14 +59,14 @@ use serde::{
/// The following options can be configured on the [`Builder`]:
///
/// - [`is_human_readable()`]: Determines whether the deserializer will interpret the input tokens
-/// in a readable or compact format. Useful for complicated structs wishing to provide different
-/// outputs depending on the readability of the serialization type.
+///   in a readable or compact format. Useful for complicated structs wishing to provide different
+///   outputs depending on the readability of the serialization type.
/// - [`self_describing()`]: Determines whether the deserialization should interpret the input
-/// tokens as self-describing, meaning the type the tokens should deserialize to can be discerned
-/// directly from the tokens themselves. If this is set to `false`, calls to [`deserialize_any()`]
-/// will result in an error.
+///   tokens as self-describing, meaning the type the tokens should deserialize to can be discerned
+///   directly from the tokens themselves. If this is set to `false`, calls to [`deserialize_any()`]
+///   will result in an error.
/// - [`zero_copy()`]: Defines whether zero-copy deserialization should be permitted by the
-/// `Deserializer`, allowing deserializations of strings and byte sequences to avoid allocations.
+///   `Deserializer`, allowing deserializations of strings and byte sequences to avoid allocations.
///
/// # Example
/// ``` rust
@@ -695,6 +695,7 @@ impl<'a, 'de> de::Deserializer<'de> for &'a mut Deserializer<'de> {
let token = self.next_token()?;
match token {
CanonicalToken::Str(v) => visitor.visit_str(v),
CanonicalToken::Bytes(v) => visitor.visit_bytes(v),
CanonicalToken::Field(v) => visitor.visit_str(v),
_ => Err(Self::Error::invalid_type((token).into(), &visitor)),
}
@@ -3398,6 +3399,54 @@ mod tests {
);
}

#[test]
fn deserialize_struct_string_fields() {
let mut deserializer = Deserializer::builder([
Token::Struct {
name: "Struct",
len: 2,
},
Token::Str("foo".to_owned()),
Token::U32(42),
Token::Str("bar".to_owned()),
Token::Bool(false),
Token::StructEnd,
])
.build();

assert_ok_eq!(
Struct::deserialize(&mut deserializer),
Struct {
foo: 42,
bar: false,
}
);
}

#[test]
fn deserialize_struct_byte_fields() {
let mut deserializer = Deserializer::builder([
Token::Struct {
name: "Struct",
len: 2,
},
Token::Bytes(b"foo".to_vec()),
Token::U32(42),
Token::Bytes(b"bar".to_vec()),
Token::Bool(false),
Token::StructEnd,
])
.build();

assert_ok_eq!(
Struct::deserialize(&mut deserializer),
Struct {
foo: 42,
bar: false,
}
);
}

#[test]
fn deserialize_struct_error_name() {
let mut deserializer = Deserializer::builder([
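The `Builder` options listed in the `src/de.rs` doc comment above can be exercised roughly as follows. This is a minimal sketch, not taken from the diff: it assumes `is_human_readable()`, `self_describing()`, and `zero_copy()` are `Builder` methods that each take a `bool`; only the option names themselves come from the hunk.

```rust
use serde::Deserialize;
use serde_assert::{Deserializer, Token};

fn main() {
    // Assumed builder signatures; only the option names appear in the doc comment.
    let mut deserializer = Deserializer::builder([Token::Str("hello".to_owned())])
        .is_human_readable(false)
        .self_describing(false)
        .zero_copy(true)
        .build();

    // `String::deserialize` drives `deserialize_string`, so it does not need the
    // tokens to be self-describing; a call to `deserialize_any` would error here.
    let value = String::deserialize(&mut deserializer).unwrap();
    assert_eq!(value, "hello");
}
```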
12 changes: 6 additions & 6 deletions src/ser.rs
@@ -111,13 +111,13 @@ pub enum SerializeStructAs {
/// # Configuration
/// The following options can be configured on the [`Builder`]:
///
-/// - [`is_human_readable()`]: Determines whether the serializer will serialize values in a
-/// readable format or a compact format. Useful for complicated structs wishing to provide
-/// different outputs depending on the readability of the serialization type.
+/// - [`is_human_readable()`]: Determines whether the serializer will serialize values in a readable
+///   format or a compact format. Useful for complicated structs wishing to provide different
+///   outputs depending on the readability of the serialization type.
/// - [`serialize_struct_as()`]: Specifies how the serializer should serialize structs. Compact
-/// formats often serialize structs as sequences. By enabling this setting, tokens can be produced
-/// in this format, and can then be deserialized to ensure structs deserialized as sequences are
-/// deserialized correctly.
+///   formats often serialize structs as sequences. By enabling this setting, tokens can be produced
+///   in this format, and can then be deserialized to ensure structs deserialized as sequences are
+///   deserialized correctly.
///
/// # Example
///
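Similarly, the `Serializer` options described in the `src/ser.rs` doc comment could be configured along these lines. This sketch is not part of the diff: `serialize_struct_as()` and the `SerializeStructAs` enum are named in the hunk, but the `Seq` variant, the import path, and the exact builder signatures are assumptions.

```rust
use serde::Serialize;
use serde_assert::{ser::SerializeStructAs, Serializer};

#[derive(Serialize)]
struct Example {
    foo: u32,
    bar: bool,
}

fn main() {
    // Assumed variant name and signatures; only `serialize_struct_as` and
    // `SerializeStructAs` appear in the diff above.
    let serializer = Serializer::builder()
        .is_human_readable(false)
        .serialize_struct_as(SerializeStructAs::Seq)
        .build();

    // With this setting the struct should serialize as a sequence of values
    // rather than as field/value pairs.
    let tokens = Example { foo: 42, bar: false }.serialize(&serializer).unwrap();
    println!("{tokens:?}");
}
```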
2 changes: 1 addition & 1 deletion src/token.rs
@@ -1376,7 +1376,7 @@ impl TryFrom<Context> for Split {
fn try_from(value: Context) -> Result<Self, Self::Error> {
if let Ok(mut split) = Split::try_from(value.remaining.as_slice()) {
for context in &mut split.contexts {
-                context.nested_context = value.nested_context.clone();
+                context.nested_context.clone_from(&value.nested_context);
}
Ok(split)
} else if let Some(nested_context) = value.nested_context {
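The `src/token.rs` change replaces an assignment of a fresh clone with `Clone::clone_from`, which lets the destination reuse its existing allocation when the type supports it (the pattern behind clippy's `assigning_clones` lint). A standalone illustration of the difference, not taken from the crate:

```rust
// `clone_from` writes into the existing value, so types like `Vec` can reuse
// their current heap allocation instead of building a new one.
fn assign(dst: &mut Vec<u8>, src: &Vec<u8>) {
    *dst = src.clone(); // drops dst's buffer and allocates a fresh one
}

fn assign_in_place(dst: &mut Vec<u8>, src: &Vec<u8>) {
    dst.clone_from(src); // same result, but can reuse dst's existing capacity
}
```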
