update: chunk size limitation #171

Merged: 8 commits into main from update/limit-size on Nov 17, 2024

Conversation

clostao (Member) commented on Nov 15, 2024

The IPLD node byte length limit was not strict, since extrinsics end up well over 64 KiB when data is sent as a hexadecimal string. Because IPLD nodes are intended to be published in binary format (limited to 65,535 bytes), a strict limit is needed.

The calculated chunk limit (and maxLinksPerNode) comes from the 64 KiB limit minus the maximum size of the IPLD metadata.
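
A rough sketch of that arithmetic, with hypothetical names and placeholder values (the actual constants are defined in the package's source):

const MAX_NODE_SIZE = 65_535     // binary IPLD node limit, just under 64 KiB
const MAX_METADATA_SIZE = 1_024  // assumed upper bound on encoded node metadata
const LINK_SIZE = 64             // assumed encoded size of a single link

const maxChunkSize = MAX_NODE_SIZE - MAX_METADATA_SIZE       // 64_511 with these values
const maxLinksPerNode = Math.floor(maxChunkSize / LINK_SIZE) // 1_007 with these values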


PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 4 🔵🔵🔵🔵⚪
🧪 PR contains tests
🔒 No security concerns identified
⚡ Recommended focus areas for review

API Change
The function signatures for chunk and root have been modified to include new parameters such as maxNodeSize, and size has changed to bigint. This could affect existing codebases where these methods are used. Ensure backward compatibility or provide a migration path.
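
For orientation, a sketch of the changed surface. The chunk signature matches the diff quoted in the suggestions below; root is only an assumed shape based on this description:

import type { PBNode } from '@ipld/dag-pb'
import type { CID } from 'multiformats/cid'

interface Builders {
  // taken from the diff quoted below
  chunk: (data: Buffer, maxNodeSize?: number) => PBNode
  // assumed shape: size widened to bigint, maxNodeSize added
  root: (links: CID[], size: bigint, maxNodeSize?: number) => PBNode
}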

Constants Adjustment
New constants for node size calculations and default values have been introduced. Review these values to ensure they align with system requirements and constraints.

Error Handling
Added filename length validation which throws an error if the filename exceeds the maximum allowed size. This could impact user experience and should be handled gracefully.
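
Illustratively, such a guard might look like the following (hypothetical constant and function names; the real check lives in the package):

const MAX_NAME_SIZE = 255 // assumed limit in bytes

const validateFilename = (name: string): void => {
  if (Buffer.byteLength(name, 'utf8') > MAX_NAME_SIZE) {
    throw new Error(`Filename exceeds the maximum allowed size of ${MAX_NAME_SIZE} bytes`)
  }
}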

New Utility Function
A new utility function stringifyMetadata has been introduced to handle bigint conversion. Ensure this function is robust and handles all edge cases.
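
A minimal sketch of such a utility, assuming the usual JSON.stringify replacer pattern (consistent with the snippet quoted in the suggestions below):

const stringifyMetadata = (metadata: unknown): string =>
  JSON.stringify(metadata, (_, v) => (typeof v === 'bigint' ? v.toString() : v))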


PR Code Suggestions ✨

Explore these optional code suggestions:

Possible bug
Add validation for totalSize to prevent negative values

Validate the totalSize parameter in the processFileToIPLDFormat function to ensure
it is not negative, which could lead to logical errors in processing.

packages/auto-dag-data/src/ipld/chunker.ts [51]

 totalSize: bigint,
+// in the function body, before any processing:
+if (totalSize < 0n) throw new Error('totalSize cannot be negative');
Suggestion importance[1-10]: 8

Why: Validating totalSize to ensure it is not negative is crucial as negative sizes could lead to logical errors in data processing, potentially affecting the entire file handling logic.

Add a null check for the data parameter to enhance robustness

In the createFileChunkIpldNode function, check if data is not null or undefined
before proceeding to prevent runtime errors.

packages/auto-dag-data/src/ipld/nodes.ts [10]

 data: Buffer,
+// in the function body, before the buffer is used:
+if (!data) throw new Error('Data buffer cannot be null or undefined');
Suggestion importance[1-10]: 7

Why: Adding a null or undefined check for the data parameter in createFileChunkIpldNode function prevents runtime errors, enhancing the function's reliability when handling file data.

Best practice
Add a default value for the optional parameter to ensure stability

Ensure that the optional parameter maxNodeSize in the chunk method of the Builders
interface has a default value so behavior is predictable when it is not provided. Since
TypeScript interfaces cannot declare default parameter values, keep the optional
parameter in the interface and apply the default in the implementing function.

packages/auto-dag-data/src/ipld/builders.ts [17]

 chunk: (data: Buffer, maxNodeSize?: number) => PBNode
+// in the implementing builder (interfaces cannot carry defaults):
+chunk: (data, maxNodeSize = DEFAULT_MAX_CHUNK_SIZE) => ...
Suggestion importance[1-10]: 7

Why: Adding a default value for maxNodeSize in the chunk method improves the function's robustness by ensuring it behaves predictably when the parameter is not provided.

Enhancement
Implement error handling in JSON operations to prevent unhandled exceptions

Consider adding error handling for JSON parsing in stringifyMetadata to manage cases
where metadata might not be serializable.

packages/auto-dag-data/src/utils/metadata.ts [3-5]

-JSON.stringify(metadata, (_, v) => (typeof v === 'bigint' ? v.toString() : v))
+try {
+  return JSON.stringify(metadata, (_, v) => (typeof v === 'bigint' ? v.toString() : v))
+} catch (error) {
+  throw new Error('Failed to stringify metadata: ' + (error instanceof Error ? error.message : String(error)))
+}
Suggestion importance[1-10]: 6

Why: Implementing error handling for JSON stringification in stringifyMetadata is beneficial to manage serialization errors gracefully, especially when the metadata might contain non-serializable values.


clostao merged commit fd4b400 into main on Nov 17, 2024. 1 check passed.
clostao deleted the update/limit-size branch on Nov 17, 2024 at 13:51.