Merge pull request #144 from autonomys/update/separate-drive-pkg
Update pkg name and add encryption & compression logic
clostao authored Nov 4, 2024
2 parents a4dfc6b + beeaf59 commit a106417
Showing 36 changed files with 383 additions and 27 deletions.
4 changes: 2 additions & 2 deletions .vscode/auto.code-workspace
@@ -13,8 +13,8 @@
"path": "../packages/auto-consensus",
},
{
"name": "Auto-Drive",
"path": "../packages/auto-drive",
"name": "auto-dag-data",
"path": "../packages/auto-dag-data",
},
{
"name": "Auto-ID",
10 changes: 5 additions & 5 deletions README.md
@@ -15,7 +15,7 @@ This monorepo contains multiple packages, each serving a specific purpose. All p

- **[`@autonomys/auto-utils`](https://www.npmjs.com/package/@autonomys/auto-utils)**: Core utility functions for interacting with the Autonomys Network.
- **[`@autonomys/auto-consensus`](https://www.npmjs.com/package/@autonomys/auto-consensus)**: Functions for interacting with the Consensus Layer.
-- **[`@autonomys/auto-drive`](https://www.npmjs.com/package/@autonomys/auto-drive)**: Tools for preparing and managing data for on-chain storage.
+- **[`@autonomys/auto-dag-data`](https://www.npmjs.com/package/@autonomys/auto-dag-data)**: Tools for preparing and managing data for on-chain storage.
- **[`@autonomys/auto-id`](https://www.npmjs.com/package/@autonomys/auto-id)**: Functions for generating, renewing, and revoking Decentralized Identities (Auto IDs).

## Installation
@@ -109,15 +109,15 @@ import { transfer } from '@autonomys/auto-consensus'
})()
```

-### 2. Using `@autonomys/auto-drive`
+### 2. Using `@autonomys/auto-dag-data`

-The `@autonomys/auto-drive` package provides utilities for creating and managing IPLD DAGs (InterPlanetary Linked Data Directed Acyclic Graphs) for files and folders.
+The `@autonomys/auto-dag-data` package provides utilities for creating and managing IPLD DAGs (InterPlanetary Linked Data Directed Acyclic Graphs) for files and folders.

#### **Creating an IPLD DAG from a File**

```typescript
// Import necessary functions
-import { createFileIPLDDag } from '@autonomys/auto-drive'
+import { createFileIPLDDag } from '@autonomys/auto-dag-data'
import fs from 'fs'

const fileBuffer = fs.readFileSync('path/to/your/file.txt')
@@ -135,7 +135,7 @@ console.log(`Total nodes in DAG: ${dag.nodes.size}`)

```typescript
// Import necessary functions
-import { createFolderIPLDDag } from '@autonomys/auto-drive'
+import { createFolderIPLDDag } from '@autonomys/auto-dag-data'
import { CID } from 'multiformats'
import fs from 'fs'
import path from 'path'
File renamed without changes.
28 changes: 14 additions & 14 deletions packages/auto-drive/README.md → packages/auto-dag-data/README.md
@@ -1,14 +1,14 @@
-# Auto-Drive
+# Auto DAG Data

![Autonomys Banner](https://github.com/autonomys/auto-sdk/blob/main/.github/images/autonomys-banner.webp)

[![Latest Github release](https://img.shields.io/github/v/tag/autonomys/auto-sdk.svg)](https://github.com/autonomys/auto-sdk/tags)
[![Build status of the main branch on Linux/OSX](https://img.shields.io/github/actions/workflow/status/autonomys/auto-sdk/build.yaml?branch=main&label=Linux%2FOSX%20build)](https://github.com/autonomys/auto-sdk/actions/workflows/build.yaml)
-[![npm version](https://badge.fury.io/js/@autonomys%2Fauto-drive.svg)](https://badge.fury.io/js/@autonomys/auto-drive)
+[![npm version](https://badge.fury.io/js/@autonomys%2Fauto-dag-data.svg)](https://badge.fury.io/js/@autonomys/auto-dag-data)

## Overview

-The **Autonomys Auto Drive SDK** (`@autonomys/auto-drive`) provides utilities for creating and managing IPLD DAGs (InterPlanetary Linked Data Directed Acyclic Graphs) for files and folders. It facilitates chunking large files, handling metadata, and creating folder structures suitable for distributed storage systems like IPFS.
+The **Autonomys Auto DAG Data SDK** (`@autonomys/auto-dag-data`) provides utilities for creating and managing IPLD DAGs (InterPlanetary Linked Data Directed Acyclic Graphs) for files and folders. It facilitates chunking large files, handling metadata, and creating folder structures suitable for distributed storage systems like IPFS.

## Features

Expand All @@ -20,16 +20,16 @@ The **Autonomys Auto Drive SDK** (`@autonomys/auto-drive`) provides utilities fo

## Installation

-You can install Auto-Drive using npm or yarn:
+You can install Auto-DAG-Data using npm or yarn:

```bash
-npm install @autonomys/auto-drive
+npm install @autonomys/auto-dag-data
```

or

```bash
-yarn add @autonomys/auto-drive
+yarn add @autonomys/auto-dag-data
```

## Usage
@@ -39,7 +39,7 @@ yarn add @autonomys/auto-drive
To create an IPLD DAG from a file, you can use the `createFileIPLDDag` function:

```typescript
-import { createFileIPLDDag } from '@autonomys/auto-drive'
+import { createFileIPLDDag } from '@autonomys/auto-dag-data'
import fs from 'fs'

const fileBuffer = fs.readFileSync('path/to/your/file.txt')
@@ -52,7 +52,7 @@ const dag = createFileIPLDDag(fileBuffer, 'file.txt')
To create an IPLD DAG from a folder, you can use the `createFolderIPLDDag` function:

```typescript
-import { createFolderIPLDDag } from '@autonomys/auto-drive'
+import { createFolderIPLDDag } from '@autonomys/auto-dag-data'
import { CID } from 'multiformats'

// Example child CIDs and folder information
@@ -70,7 +70,7 @@ const folderDag = createFolderIPLDDag(childCIDs, folderName, folderSize)
You can use functions from the `cid` module to work with CIDs:

```typescript
-import { cidOfNode, cidToString, stringToCid } from '@autonomys/auto-drive'
+import { cidOfNode, cidToString, stringToCid } from '@autonomys/auto-dag-data'

// Create a CID from a node
const cid = cidOfNode(dag.head)
@@ -87,7 +87,7 @@ const parsedCID = stringToCid(cidString)
You can encode and decode IPLD nodes:

```typescript
-import { encodeNode, decodeNode } from '@autonomys/auto-drive'
+import { encodeNode, decodeNode } from '@autonomys/auto-dag-data'

// Encode a node
const encodedNode = encodeNode(dag.head)
@@ -101,7 +101,7 @@ const decodedNode = decodeNode(encodedNode)
To add metadata to a node, you can create a metadata node:

```typescript
-import { createMetadataNode } from '@autonomys/auto-drive'
+import { createMetadataNode } from '@autonomys/auto-dag-data'

const metadata = {
name: 'My File',
@@ -115,7 +115,7 @@ const metadataNode = createMetadataNode(metadata)
### Example: Creating a File DAG and Converting to CID

```typescript
-import { createFileIPLDDag, cidOfNode, cidToString } from '@autonomys/auto-drive'
+import { createFileIPLDDag, cidOfNode, cidToString } from '@autonomys/auto-dag-data'
import fs from 'fs'

const fileBuffer = fs.readFileSync('path/to/your/file.txt')
@@ -136,7 +136,7 @@ import {
cidOfNode,
cidToString,
type OffchainMetadata,
-} from '@autonomys/auto-drive'
+} from '@autonomys/auto-dag-data'
import fs from 'fs'

const metadata: OffchainMetadata = fs.readFileSync('path/to/your/metadata.json')
@@ -156,7 +156,7 @@ This project is licensed under the MIT License. See the [LICENSE](LICENSE) file
## Additional Resources

- **Autonomys Academy**: Learn more at [Autonomys Academy](https://academy.autonomys.xyz).
-- **Auto-Utils Package**: Utility functions used alongside `auto-drive` can be found in [`@autonomys/auto-utils`](../Auto-Utils/README.md).
+- **Auto-Utils Package**: Utility functions used alongside `auto-dag-data` can be found in [`@autonomys/auto-utils`](../Auto-Utils/README.md).

## Contact

File renamed without changes.
@@ -1,5 +1,5 @@
{
"name": "@autonomys/auto-drive",
"name": "@autonomys/auto-dag-data",
"packageManager": "yarn@4.1.1",
"version": "0.7.4",
"license": "MIT",
@@ -36,6 +36,7 @@
"dependencies": {
"@ipld/dag-pb": "^4.1.2",
"blake3": "1.1.0",
"fflate": "^0.8.2",
"multiformats": "^13.2.2",
"protobufjs": "^7.4.0",
"protons": "^7.6.0",
File renamed without changes.
84 changes: 84 additions & 0 deletions packages/auto-dag-data/src/compression/index.ts
@@ -0,0 +1,84 @@
import { Unzlib, Zlib } from 'fflate'
import type { AwaitIterable } from 'interface-store'
import { CompressionAlgorithm } from '../metadata/index.js'
import { asyncByChunk } from '../utils/async.js'
import type { PickPartial } from '../utils/types.js'
import { CompressionOptions } from './types.js'

export const COMPRESSION_CHUNK_SIZE = 1024 * 1024

export async function* compressFile(
file: AwaitIterable<Buffer>,
{
level = 9,
chunkSize = COMPRESSION_CHUNK_SIZE,
algorithm,
}: PickPartial<CompressionOptions, 'algorithm'>,
): AsyncIterable<Buffer> {
if (algorithm !== CompressionAlgorithm.ZLIB) {
throw new Error('Unsupported compression algorithm')
}
if (level < 0 || level > 9) {
throw new Error('Invalid compression level')
}
if (chunkSize <= 0) {
throw new Error('Invalid chunk size')
}

const zlib = new Zlib({ level })
const compressedChunks: Buffer[] = []

zlib.ondata = (chunk) => {
compressedChunks.push(Buffer.from(chunk))
}

for await (const chunk of asyncByChunk(file, chunkSize)) {
zlib.push(chunk, false)
while (compressedChunks.length > 0) {
yield compressedChunks.shift()!
}
}

zlib.push(new Uint8Array(), true)
while (compressedChunks.length > 0) {
yield compressedChunks.shift()!
}
}

export async function* decompressFile(
compressedFile: AwaitIterable<Buffer>,
{
chunkSize = COMPRESSION_CHUNK_SIZE,
algorithm = CompressionAlgorithm.ZLIB,
level = 9,
}: PickPartial<CompressionOptions, 'algorithm'>,
): AsyncIterable<Buffer> {
if (algorithm !== CompressionAlgorithm.ZLIB) {
throw new Error('Unsupported compression algorithm')
}
if (chunkSize <= 0) {
throw new Error('Invalid chunk size')
}
if (level < 0 || level > 9) {
throw new Error('Invalid compression level')
}

const unzlib = new Unzlib()
const decompressedChunks: Buffer[] = []

unzlib.ondata = (chunk) => {
decompressedChunks.push(Buffer.from(chunk))
}

for await (const chunk of asyncByChunk(compressedFile, chunkSize)) {
unzlib.push(chunk, false)
while (decompressedChunks.length > 0) {
yield decompressedChunks.shift()!
}
}

unzlib.push(new Uint8Array(), true)
while (decompressedChunks.length > 0) {
yield decompressedChunks.shift()!
}
}
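
Taken together, `compressFile` and `decompressFile` form a streaming round trip: each accepts an (async) iterable of `Buffer`s and yields zlib output chunks as they become available, so the two generators can be chained directly. Below is a minimal round-trip sketch, assuming these functions and `CompressionAlgorithm` are re-exported from the package root (as the `export *` diff further down suggests); the file path is illustrative.

```typescript
import { CompressionAlgorithm, compressFile, decompressFile } from '@autonomys/auto-dag-data'
import fs from 'fs'

;(async () => {
  // A Node read stream yields Buffers, so it can feed compressFile directly.
  const source = fs.createReadStream('path/to/your/file.txt')

  // Chain the generators: bytes are compressed and decompressed as they stream through.
  const roundTrip = decompressFile(
    compressFile(source, { algorithm: CompressionAlgorithm.ZLIB }),
    { algorithm: CompressionAlgorithm.ZLIB },
  )

  const restored: Buffer[] = []
  for await (const chunk of roundTrip) restored.push(chunk)
  console.log(`restored ${Buffer.concat(restored).length} bytes`)
})()
```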
11 changes: 11 additions & 0 deletions packages/auto-dag-data/src/compression/types.ts
@@ -0,0 +1,11 @@
import { CompressionAlgorithm } from '../metadata/index.js'

export type CompressionLevel = 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9

export type ZLibOptions = {
algorithm: CompressionAlgorithm.ZLIB
level: CompressionLevel
chunkSize: number
}

export type CompressionOptions = ZLibOptions
99 changes: 99 additions & 0 deletions packages/auto-dag-data/src/encryption/index.ts
@@ -0,0 +1,99 @@
import { Crypto } from '@peculiar/webcrypto'
import { randomBytes } from 'crypto'
import { EncryptionAlgorithm, EncryptionOptions } from '../metadata/index.js'
import { asyncByChunk } from '../utils/async.js'
import type { PickPartial } from '../utils/types.js'
import { PasswordGenerationOptions } from './types.js'

const crypto = new Crypto()

export const ENCRYPTING_CHUNK_SIZE = 1024 * 1024
const IV_SIZE = 16
const TAG_SIZE = 16
const ENCRYPTED_CHUNK_SIZE = ENCRYPTING_CHUNK_SIZE + IV_SIZE + TAG_SIZE
const SALT_SIZE = 32

export const getKeyFromPassword = async ({ password, salt }: PasswordGenerationOptions) => {
const encoder = new TextEncoder()
const saltHash =
typeof salt === 'string' ? await crypto.subtle.digest('SHA-256', encoder.encode(salt)) : salt

const keyMaterial = await crypto.subtle.importKey(
'raw',
encoder.encode(password),
'PBKDF2',
false,
['deriveBits', 'deriveKey'],
)

return crypto.subtle.deriveKey(
{
name: 'PBKDF2',
salt: saltHash,
iterations: 100000,
hash: 'SHA-256',
},
keyMaterial,
{ name: 'AES-GCM', length: 256 },
false,
['encrypt', 'decrypt'],
)
}

export const encryptFile = async function* (
file: AsyncIterable<Buffer>,
password: string,
{ chunkSize = ENCRYPTING_CHUNK_SIZE, algorithm }: PickPartial<EncryptionOptions, 'algorithm'>,
): AsyncIterable<Buffer> {
if (algorithm !== EncryptionAlgorithm.AES_256_GCM) {
throw new Error('Unsupported encryption algorithm')
}

const salt = randomBytes(SALT_SIZE)
const key = await getKeyFromPassword({ password, salt })

yield salt

for await (const chunk of asyncByChunk(file, chunkSize)) {
const iv = crypto.getRandomValues(new Uint8Array(IV_SIZE))
const encrypted = await crypto.subtle.encrypt({ name: 'AES-GCM', iv }, key, chunk)
yield Buffer.concat([Buffer.from(iv), Buffer.from(encrypted)])
}
}

export const decryptFile = async function* (
file: AsyncIterable<Buffer>,
password: string,
{ chunkSize = ENCRYPTED_CHUNK_SIZE, algorithm }: PickPartial<EncryptionOptions, 'algorithm'>,
): AsyncIterable<Buffer> {
if (algorithm !== EncryptionAlgorithm.AES_256_GCM) {
throw new Error('Unsupported encryption algorithm')
}

let key: CryptoKey | undefined = undefined
let chunks = Buffer.alloc(0)
for await (const chunk of file) {
chunks = Buffer.concat([chunks, chunk])

if (chunks.length >= SALT_SIZE && !key) {
const salt = chunks.subarray(0, SALT_SIZE)
key = await getKeyFromPassword({ password, salt })
chunks = chunks.subarray(SALT_SIZE)
}

while (key && chunks.length >= chunkSize) {
const iv = chunks.subarray(0, IV_SIZE)
const encryptedChunk = chunks.subarray(IV_SIZE, chunkSize)
const decrypted = await crypto.subtle.decrypt({ name: 'AES-GCM', iv }, key, encryptedChunk)
chunks = chunks.subarray(chunkSize)
yield Buffer.from(decrypted)
}
}

if (key && chunks.length > 0) {
const iv = chunks.subarray(0, IV_SIZE)
const encryptedChunk = chunks.subarray(IV_SIZE, chunkSize)
const decrypted = await crypto.subtle.decrypt({ name: 'AES-GCM', iv }, key, encryptedChunk)
yield Buffer.from(decrypted)
}
}
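
`encryptFile` and `decryptFile` mirror the same streaming shape: encryption yields the random salt first, then fixed-size frames of IV + ciphertext (with the GCM auth tag appended), and decryption reads the salt back off the front of the stream to re-derive the key before consuming the frames. A minimal round-trip sketch under the same assumptions as above (package-root re-exports; illustrative path and password):

```typescript
import { decryptFile, encryptFile, EncryptionAlgorithm } from '@autonomys/auto-dag-data'
import fs from 'fs'

;(async () => {
  const password = 'a-strong-password'
  const source = fs.createReadStream('path/to/your/file.txt')

  // encryptFile emits: salt, then iv + ciphertext (+ tag) frames;
  // decryptFile consumes exactly that framing with the matching default chunk size.
  const roundTrip = decryptFile(
    encryptFile(source, password, { algorithm: EncryptionAlgorithm.AES_256_GCM }),
    password,
    { algorithm: EncryptionAlgorithm.AES_256_GCM },
  )

  const restored: Buffer[] = []
  for await (const chunk of roundTrip) restored.push(chunk)
  console.log(`decrypted ${Buffer.concat(restored).length} bytes`)
})()
```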
4 changes: 4 additions & 0 deletions packages/auto-dag-data/src/encryption/types.ts
@@ -0,0 +1,4 @@
export type PasswordGenerationOptions = {
password: string
salt: string | Uint8Array
}
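
These options feed `getKeyFromPassword` above, which runs PBKDF2 (100,000 iterations, SHA-256) and returns a non-extractable AES-256-GCM `CryptoKey`. A brief sketch, again assuming a package-root re-export; the password and salt values are placeholders:

```typescript
import { getKeyFromPassword } from '@autonomys/auto-dag-data'

;(async () => {
  // A string salt is first hashed with SHA-256; a Uint8Array salt is used as-is.
  const key = await getKeyFromPassword({ password: 'a-strong-password', salt: 'my-salt' })
  console.log(key.algorithm) // { name: 'AES-GCM', length: 256 }
})()
```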
@@ -1,3 +1,5 @@
export * from './cid/index.js'
+export * from './compression/index.js'
+export * from './encryption/index.js'
export * from './ipld/index.js'
export * from './metadata/index.js'
@@ -1,6 +1,6 @@
import { CID } from 'multiformats/cid'
-import { PBNode } from '../ipld/index.js'
import { FileUploadOptions } from '../metadata/index.js'
+import { PBNode } from './index.js'
import {
createChunkedFileIpldNode,
createChunkedMetadataIpldNode,
File renamed without changes.
File renamed without changes.
@@ -1,8 +1,8 @@
import { CID } from 'multiformats/cid'
-import { createNode, PBNode } from '../ipld/index.js'
import { FileUploadOptions, OffchainMetadata } from '../metadata/index.js'
import { encodeIPLDNodeData, MetadataType } from '../metadata/onchain/index.js'
import { DEFAULT_MAX_CHUNK_SIZE, ensureNodeMaxSize } from './chunker.js'
+import { createNode, PBNode } from './index.js'

/// Creates a file chunk ipld node
export const createFileChunkIpldNode = (data: Buffer): PBNode =>
Expand Down
File renamed without changes.
File renamed without changes.
