diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/README.md b/appsync-lambda-s3-presigned-urls-pipeline-resolver/README.md
new file mode 100644
index 000000000..b3be7ad41
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/README.md
@@ -0,0 +1,212 @@
+# AppSync Pipeline Resolvers with S3 Presigned URLs
+
+This pattern creates an AWS AppSync GraphQL API with pipeline resolvers that orchestrate Lambda functions, DynamoDB, and S3 to enable secure file uploads and downloads through presigned URLs.
+
+Learn more about this pattern at Serverless Land Patterns: https://serverlessland.com/patterns/appsync-lambda-s3-presigned-cdk-python
+
+**Important:** this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.
+
+## Requirements
+
+* [Create an AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
+* [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) installed and configured
+* [Git Installed](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
+* [AWS Cloud Development Kit](https://docs.aws.amazon.com/cdk/latest/guide/cli.html) (AWS CDK) installed
+* [Python 3.9+](https://www.python.org/downloads/) installed
+
+## Deployment Instructions
+
+1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:
+    ```bash
+    git clone https://github.com/aws-samples/serverless-patterns
+    ```
+
+2. Change directory to the pattern directory:
+    ```bash
+    cd serverless-patterns/appsync-lambda-s3-presigned-urls-pipeline-resolver
+    ```
+
+3. 
Create a virtual environment for Python: + ```bash + python3 -m venv .venv + ``` + +4. Activate the virtual environment: + ```bash + source .venv/bin/activate + ``` + For Windows: + ```bash + .venv\Scripts\activate.bat + ``` + +5. Install the Python dependencies: + ```bash + pip install -r requirements.txt + ``` + +6. Bootstrap your AWS account for CDK (if you haven't done this already): + ```bash + cdk bootstrap + ``` + +7. Deploy the stack: + ```bash + cdk deploy + ``` + +8. Note the outputs from the CDK deployment process. These contain the GraphQL API endpoint, API key, S3 bucket name, and DynamoDB table name which are used for testing. + +## How it works + +This pattern demonstrates AWS AppSync pipeline resolvers that chain multiple operations together: + +### Architecture + +The stack deploys: +- **AWS AppSync GraphQL API** with API Key authentication +- **AWS Lambda function** (Python 3.12) for generating S3 presigned URLs +- **Amazon DynamoDB table** for storing note metadata +- **Amazon S3 bucket** with CORS configuration for file attachments + +### Pipeline Resolvers + +**saveNote Mutation Pipeline:** +1. Lambda function generates a presigned upload URL for S3 +2. DynamoDB saves the note with the attachment key and upload URL +3. Returns note data including the temporary upload URL to the client + +**getNote Query Pipeline:** +1. DynamoDB retrieves the note data +2. Lambda conditionally generates a presigned download URL (only if attachment exists) +3. Returns note data with download URL if applicable + +The pipeline resolvers use VTL (Velocity Template Language) mapping templates to pass data between stages using `$ctx.prev.result`, enabling complex orchestration without additional code. + +## Testing + +### Using GraphQL Queries + +1. After deployment, copy the **GraphQLApiEndpoint** and **APIKey** from the CDK outputs. + +2. Use a GraphQL client (like Postman, Insomnia, or the AppSync console) to test the API. + +3. 
Set the authorization header:
+    - Header name: `x-api-key`
+    - Value: [Your API Key from outputs]
+
+4. **Create a note with file attachment:**
+
+    ```graphql
+    mutation CreateNote {
+      saveNote(
+        NoteId: "note-001"
+        title: "My First Note"
+        content: "This note has an attachment"
+        fileName: "document.pdf"
+      ) {
+        NoteId
+        title
+        content
+        attachmentKey
+        uploadUrl
+      }
+    }
+    ```
+
+    The response includes an `uploadUrl` field - a presigned S3 URL valid for 1 hour.
+
+5. **Upload a file using the presigned URL:**
+
+    ```bash
+    curl -X PUT \
+      -H "Content-Type: application/pdf" \
+      --data-binary @/path/to/your/document.pdf \
+      "[uploadUrl from previous response]"
+    ```
+
+6. **Retrieve the note with download URL:**
+
+    ```graphql
+    query GetNote {
+      getNote(NoteId: "note-001") {
+        NoteId
+        title
+        content
+        attachmentKey
+        downloadUrl
+      }
+    }
+    ```
+
+    The response includes a `downloadUrl` field - a presigned S3 URL for downloading the file.
+
+7. **Download the file:**
+
+    ```bash
+    curl "[downloadUrl from previous response]" -o downloaded-file.pdf
+    ```
+
+8. **Query all notes:**
+
+    ```graphql
+    query ListNotes {
+      allNotes(limit: 10) {
+        notes {
+          NoteId
+          title
+          content
+        }
+        nextToken
+      }
+    }
+    ```
+
+9. **Delete a note:**
+
+    ```graphql
+    mutation DeleteNote {
+      deleteNote(NoteId: "note-001") {
+        NoteId
+        title
+      }
+    }
+    ```
+
+### Automated Testing
+
+Update the `APPSYNC_URL` and `API_KEY` values in `test_notes_api.py`, then run the included test suite:
+
+```bash
+python test_notes_api.py
+```
+
+## Cleanup
+
+1. Delete the stack:
+    ```bash
+    cdk destroy
+    ```
+
+2. Confirm the stack has been deleted:
+    ```bash
+    aws cloudformation list-stacks --query "StackSummaries[?contains(StackName,'NotesAppSyncStack')].StackStatus"
+    ```
+
+**Note:** The S3 bucket and DynamoDB table are configured with `DESTROY` removal policy and will be automatically deleted. 
If you changed the removal policy to `RETAIN`, you'll need to manually delete these resources. + +## Documentation + +* [AWS AppSync Pipeline Resolvers](https://docs.aws.amazon.com/appsync/latest/devguide/pipeline-resolvers.html) +* [Amazon S3 Presigned URLs](https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html) +* [VTL Mapping Templates for AppSync](https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference.html) +* [AWS AppSync - Managed GraphQL Service](https://aws.amazon.com/appsync/) +* [AWS CDK Python Reference](https://docs.aws.amazon.com/cdk/api/v2/python/) + +---- +Copyright 2025 Amazon.com, Inc. or its affiliates. All Rights Reserved. + +SPDX-License-Identifier: MIT-0 diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/example-pattern.json b/appsync-lambda-s3-presigned-urls-pipeline-resolver/example-pattern.json new file mode 100644 index 000000000..72327a417 --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/example-pattern.json @@ -0,0 +1,68 @@ +{ + "title": "AppSync Pipeline Resolvers with S3 Presigned URLs", + "description": "Create an AppSync GraphQL API with pipeline resolvers that generate S3 presigned URLs for secure file uploads and downloads.", + "language": "Python", + "level": "300", + "framework": "AWS CDK", + "introBox": { + "headline": "How it works", + "text": [ + "This sample project demonstrates how to use AWS AppSync pipeline resolvers to orchestrate multiple data sources in a single GraphQL operation. The pattern integrates AppSync with Lambda, DynamoDB, and S3 to create a notes API with secure file attachment capabilities.", + "When a note is created via the saveNote mutation, the pipeline resolver first invokes a Lambda function to generate a presigned S3 upload URL, then saves the note metadata (including the attachment key and upload URL) to DynamoDB. 
The client receives both the note data and a temporary upload URL in a single response, enabling immediate file upload without additional API calls.",
+      "When retrieving a note via the getNote query, the pipeline resolver first fetches the note from DynamoDB, then conditionally invokes Lambda to generate a presigned download URL if an attachment exists. This demonstrates how pipeline resolvers can execute conditional logic and pass data between stages using $ctx.prev.result.",
+      "This pattern deploys one AppSync GraphQL API, one Lambda function, one DynamoDB table, and one S3 bucket with CORS configuration."
+    ]
+  },
+  "gitHub": {
+    "template": {
+      "repoURL": "https://github.com/aws-samples/serverless-patterns/tree/main/appsync-lambda-s3-presigned-urls-pipeline-resolver",
+      "templateURL": "serverless-patterns/appsync-lambda-s3-presigned-urls-pipeline-resolver",
+      "projectFolder": "appsync-lambda-s3-presigned-urls-pipeline-resolver",
+      "templateFile": "src/app.py"
+    }
+  },
+  "resources": {
+    "bullets": [
+      {
+        "text": "AWS AppSync Pipeline Resolvers",
+        "link": "https://docs.aws.amazon.com/appsync/latest/devguide/pipeline-resolvers.html"
+      },
+      {
+        "text": "Amazon S3 Presigned URLs",
+        "link": "https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html"
+      },
+      {
+        "text": "AWS AppSync - Build data-driven apps with fully managed GraphQL APIs",
+        "link": "https://aws.amazon.com/appsync/"
+      },
+      {
+        "text": "VTL Mapping Templates for AppSync",
+        "link": "https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference.html"
+      }
+    ]
+  },
+  "deploy": {
+    "text": [
+      "cdk deploy"
+    ]
+  },
+  "testing": {
+    "text": [
+      "See the GitHub repo for detailed testing instructions."
+    ]
+  },
+  "cleanup": {
+    "text": [
+      "Delete the stack: cdk destroy."
+    ]
+  },
+  "authors": [
+    {
+      "name": "Matia Rašetina",
+      "image": "link-to-your-photo.jpg",
+      "bio": "Your bio.",
+      "linkedin": "linked-in-ID",
+      "twitter": "twitter-handle"
+    }
+  ]
+}
diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/PresignerLambda/handler.py b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/PresignerLambda/handler.py
new file mode 100644
index 000000000..6c53acfdf
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/PresignerLambda/handler.py
@@ -0,0 +1,204 @@
+import json
+import os
+import boto3
+from botocore.exceptions import ClientError
+from typing import Dict, Any
+import logging
+
+# Configure logging
+logger = logging.getLogger()
+logger.setLevel(logging.INFO)
+
+# Initialize AWS clients
+# The regional S3 endpoint is set explicitly so presigned URLs point at the
+# bucket's region rather than the global endpoint
+s3_client = boto3.client('s3', endpoint_url=f'https://s3.{os.environ.get("AWS_REGION")}.amazonaws.com')
+dynamodb = boto3.resource('dynamodb')
+
+# Environment variables to fetch bucket and table names
+BUCKET_NAME = os.environ['BUCKET_NAME']
+TABLE_NAME = os.environ['TABLE_NAME']
+
+# Presigned URL expiration time (in seconds)
+# Both upload and download URLs will expire after 1 hour
+UPLOAD_URL_EXPIRATION = 3600
+DOWNLOAD_URL_EXPIRATION = 3600
+
+
+def handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
+    """
+    Main Lambda handler for S3 operations.
+
+    Supports two operations:
+    1. generateUploadUrl: Creates a presigned URL for uploading files to S3
+    2. 
generateDownloadUrl: Creates a presigned URL for downloading files from S3 + """ + try: + logger.info(f"Received event: {json.dumps(event)}") + + operation = event.get('operation') + + if operation == 'generateUploadUrl': + return generate_upload_url(event) + elif operation == 'generateDownloadUrl': + return generate_download_url(event) + else: + raise ValueError(f"Unknown operation: {operation}") + + except Exception as e: + logger.error(f"Error processing request: {str(e)}", exc_info=True) + return { + 'error': str(e), + 'message': 'Failed to process request' + } + + +def generate_upload_url(event: Dict[str, Any]) -> Dict[str, Any]: + """ + Generate a presigned URL for uploading a file to S3. + + Args: + event: Contains 'noteId' and optional 'fileName' + + Returns: + Dict containing 'uploadUrl' and 'attachmentKey' + """ + note_id = event.get('noteId') + file_name = event.get('fileName', 'attachment') + + if not note_id: + raise ValueError("noteId is required") + + # Sanitize filename and create S3 key + sanitized_filename = sanitize_filename(file_name) + attachment_key = f"notes/{note_id}/{sanitized_filename}" + + try: + # Generate presigned URL for PUT operation + upload_url = s3_client.generate_presigned_url( + 'put_object', + Params={ + 'Bucket': BUCKET_NAME, + 'Key': attachment_key, + 'ContentType': get_content_type(sanitized_filename), + }, + ExpiresIn=UPLOAD_URL_EXPIRATION, + HttpMethod='PUT' + ) + + logger.info(f"Generated upload URL for note {note_id}, key: {attachment_key}") + + return { + 'uploadUrl': upload_url, + 'attachmentKey': attachment_key, + 'expiresIn': UPLOAD_URL_EXPIRATION + } + + except ClientError as e: + logger.error(f"Error generating upload URL: {str(e)}") + raise Exception(f"Failed to generate upload URL: {str(e)}") + + +def generate_download_url(event: Dict[str, Any]) -> Dict[str, Any]: + """ + Generate a presigned URL for downloading a file from S3. 
+ + Args: + event: Contains 'attachmentKey' + + Returns: + Dict containing 'downloadUrl' if attachment exists, empty dict otherwise + """ + attachment_key = event.get('attachmentKey') + + # If no attachment key, return empty result (note has no attachment) + if not attachment_key: + logger.info("No attachment key provided, skipping download URL generation") + return {} + + try: + # Check if object exists + try: + s3_client.head_object(Bucket=BUCKET_NAME, Key=attachment_key) + except ClientError as e: + if e.response['Error']['Code'] == '404': + logger.warning(f"Attachment not found: {attachment_key}") + return {} + raise + + # Generate presigned URL for GET operation + download_url = s3_client.generate_presigned_url( + 'get_object', + Params={ + 'Bucket': BUCKET_NAME, + 'Key': attachment_key, + }, + ExpiresIn=DOWNLOAD_URL_EXPIRATION, + HttpMethod='GET' + ) + + logger.info(f"Generated download URL for key: {attachment_key}") + + return { + 'downloadUrl': download_url, + 'expiresIn': DOWNLOAD_URL_EXPIRATION + } + + except ClientError as e: + logger.error(f"Error generating download URL: {str(e)}") + raise Exception(f"Failed to generate download URL: {str(e)}") + + +def sanitize_filename(filename: str) -> str: + """ + Sanitize filename to prevent path traversal and invalid characters. + + Args: + filename: Original filename + + Returns: + Sanitized filename safe for S3 + """ + # Remove path components + filename = os.path.basename(filename) + + # Replace spaces with underscores + filename = filename.replace(' ', '_') + + # Remove any potentially dangerous characters + allowed_chars = set('abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789-_.') + filename = ''.join(c for c in filename if c in allowed_chars) + + # Ensure filename is not empty + if not filename: + filename = 'attachment' + + return filename + + +def get_content_type(filename: str) -> str: + """ + Determine content type based on file extension. 
+
+    Args:
+        filename: Name of the file
+
+    Returns:
+        MIME type string
+    """
+    extension = filename.lower().split('.')[-1] if '.' in filename else ''
+
+    content_types = {
+        'txt': 'text/plain',
+        'pdf': 'application/pdf',
+        'png': 'image/png',
+        'jpg': 'image/jpeg',
+        'jpeg': 'image/jpeg',
+        'gif': 'image/gif',
+        'json': 'application/json',
+        'csv': 'text/csv',
+        'doc': 'application/msword',
+        'docx': 'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
+        'xls': 'application/vnd.ms-excel',
+        'xlsx': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
+    }
+
+    return content_types.get(extension, 'application/octet-stream')
\ No newline at end of file
diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/app.py b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/app.py
new file mode 100644
index 000000000..87161b2fd
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/app.py
@@ -0,0 +1,200 @@
+from aws_cdk import (
+    App,
+    Stack,
+    CfnOutput,
+    Duration,
+    RemovalPolicy,
+    aws_dynamodb as ddb,
+    aws_appsync as appsync,
+    aws_lambda as lambda_,
+    aws_s3 as s3,
+)
+from constructs import Construct
+
+class NotesAppSyncStack(Stack):
+    def __init__(self, scope: Construct, id: str, **kwargs) -> None:
+        super().__init__(scope, id, **kwargs)
+
+        # S3 bucket
+        attachments_bucket = s3.Bucket(
+            self, "NoteAttachmentsBucket",
+            versioned=True,
+            removal_policy=RemovalPolicy.DESTROY,
+            auto_delete_objects=True,
+            public_read_access=False,
+            # Block all public access; presigned URLs are signed requests
+            # and do not require any public bucket access
+            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
+            cors=[
+                s3.CorsRule(
+                    allowed_headers=["*"],
+                    allowed_methods=[
+                        s3.HttpMethods.PUT,
+                        s3.HttpMethods.POST,
+                        s3.HttpMethods.DELETE,
+                        s3.HttpMethods.GET
+                    ],
+                    allowed_origins=["*"],
+                    max_age=3000
+                )
+            ]
+        )
+
+        # DynamoDB table
+        table = ddb.Table(
+            self,
+            "DynamoDBNotesTable",
+            partition_key=ddb.Attribute(name="NoteId", type=ddb.AttributeType.STRING),
+            billing_mode=ddb.BillingMode.PAY_PER_REQUEST,
+            # DESTROY so `cdk destroy` removes the table, as stated in the README
+            removal_policy=RemovalPolicy.DESTROY,
+        )
+
+        # Lambda function for S3 operations
+        s3_lambda = lambda_.Function(
+            self,
+            "S3OperationsFunction",
+            runtime=lambda_.Runtime.PYTHON_3_12,
+            handler="handler.handler",
+            code=lambda_.Code.from_asset("PresignerLambda"),
+            timeout=Duration.seconds(30),
+            environment={
+                "BUCKET_NAME": attachments_bucket.bucket_name,
+                "TABLE_NAME": table.table_name,
+            },
+        )
+
+        # Grant Lambda permissions
+        attachments_bucket.grant_read_write(s3_lambda)
+        table.grant_read_write_data(s3_lambda)
+
+        # AppSync GraphQL API
+        api = appsync.GraphqlApi(
+            self,
+            "Api",
+            name="MyAppSyncApi",
+            definition=appsync.Definition.from_file("graphql/schema.graphql"),
+            authorization_config=appsync.AuthorizationConfig(
+                default_authorization=appsync.AuthorizationMode(
+                    authorization_type=appsync.AuthorizationType.API_KEY
+                )
+            ),
+        )
+
+        # API Key for Output
+        api_key = appsync.CfnApiKey(self, "AppSyncApiKey", api_id=api.api_id)
+
+        # Data Sources
+        ddb_ds = appsync.DynamoDbDataSource(
+            self,
+            "AppSyncNotesTableDataSource",
+            api=api,
+            table=table,
+            description="The Notes Table AppSync Data Source",
+        )
+
+        lambda_ds = appsync.LambdaDataSource(
+            self,
+            "S3OperationsDataSource",
+            api=api,
+            lambda_function=s3_lambda,
+            description="Lambda for S3 presigned URL operations",
+        )
+
+        # Simple resolvers for allNotes and deleteNote
+        ddb_ds.create_resolver(
+            "AppSyncAllNotesQueryResolver",
+            type_name="Query",
+            field_name="allNotes",
+            request_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/Query.allNotes.req.vtl"),
+            response_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/Query.allNotes.res.vtl"),
+        )
+
+        ddb_ds.create_resolver(
+            "AppSyncDeleteNoteMutationResolver",
+            type_name="Mutation",
+            field_name="deleteNote",
+            request_mapping_template=appsync.MappingTemplate.dynamo_db_delete_item("NoteId", "NoteId"),
+            response_mapping_template=appsync.MappingTemplate.dynamo_db_result_item(),
+        )
+
+        # Pipeline Function: Generate presigned URL for upload
+        generate_upload_url_fn = appsync.AppsyncFunction(
+            self,
+            "GenerateUploadUrlFunction",
+            name="GenerateUploadUrlFunction",
+            api=api,
+            data_source=lambda_ds,
+            request_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/generateUploadUrl.req.vtl"),
+            response_mapping_template=appsync.MappingTemplate.lambda_result(),
+        )
+
+        # Pipeline Function: Save note to DynamoDB
+        save_note_to_ddb_fn = appsync.AppsyncFunction(
+            self,
+            "SaveNoteToDynamoDBFunction",
+            name="SaveNoteToDynamoDBFunction",
+            api=api,
+            data_source=ddb_ds,
+            request_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/saveNoteToDynamoDB.req.vtl"),
+            response_mapping_template=appsync.MappingTemplate.dynamo_db_result_item(),
+        )
+
+        # Pipeline Function: Get note from DynamoDB
+        get_note_from_ddb_fn = appsync.AppsyncFunction(
+            self,
+            "GetNoteFromDynamoDBFunction",
+            name="GetNoteFromDynamoDBFunction",
+            api=api,
+            data_source=ddb_ds,
+            request_mapping_template=appsync.MappingTemplate.dynamo_db_get_item("NoteId", "NoteId"),
+            response_mapping_template=appsync.MappingTemplate.dynamo_db_result_item(),
+        )
+
+        # Pipeline Function: Generate presigned URL for download (conditional)
+        generate_download_url_fn = appsync.AppsyncFunction(
+            self,
+            "GenerateDownloadUrlFunction",
+            name="GenerateDownloadUrlFunction",
+            api=api,
+            data_source=lambda_ds,
+            request_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/generateDownloadUrl.req.vtl"),
+            response_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/generateDownloadUrl.res.vtl"),
+        )
+
+        # Pipeline Resolver: saveNote (generate URL -> save to DDB)
+        appsync.Resolver(
+            self,
+            "SaveNotePipelineResolver",
+            api=api,
+            type_name="Mutation",
+            field_name="saveNote",
+            pipeline_config=[generate_upload_url_fn, save_note_to_ddb_fn],
+            request_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/saveNotePipeline.req.vtl"),
+            response_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/saveNotePipeline.res.vtl"),
+        )
+
+        # Pipeline Resolver: getNote (get from DDB -> generate download URL if attachment exists)
+        appsync.Resolver(
+            self,
+            "GetNotePipelineResolver",
+            api=api,
+            type_name="Query",
+            field_name="getNote",
+            pipeline_config=[get_note_from_ddb_fn, generate_download_url_fn],
+            request_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/getNotePipeline.req.vtl"),
+            response_mapping_template=appsync.MappingTemplate.from_file("graphql/resolvers/functions/getNotePipeline.res.vtl"),
+        )
+
+        # Outputs
+        CfnOutput(self, "GraphQLApiEndpoint", value=api.graphql_url, description="GraphQL endpoint URL")
+        CfnOutput(self, "APIKey", value=api_key.attr_api_key, description="API Key")
+        CfnOutput(self, "BucketName", value=attachments_bucket.bucket_name, description="S3 Bucket for attachments")
+        CfnOutput(self, "TableName", value=table.table_name, description="DynamoDB Table name")
+
+
+app = App()
+NotesAppSyncStack(app, "NotesAppSyncStack")
+app.synth()
\ No newline at end of file
diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/Query.allNotes.req.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/Query.allNotes.req.vtl
new file mode 100644
index 000000000..22811b0e8
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/Query.allNotes.req.vtl
@@ -0,0 +1,6 @@
+{
+    "version": "2017-02-28",
+    "operation": "Scan",
+    "limit": $util.defaultIfNull($ctx.args.limit, 20)
+    #if($ctx.args.nextToken) ,"nextToken": "$ctx.args.nextToken" #end
+}
\ No newline at end of file
diff --git 
a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/Query.allNotes.res.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/Query.allNotes.res.vtl
new file mode 100644
index 000000000..535249275
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/Query.allNotes.res.vtl
@@ -0,0 +1,4 @@
+{
+    "notes": $util.toJson($ctx.result.items),
+    "nextToken": $util.toJson($ctx.result.nextToken)
+}
\ No newline at end of file
diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateDownloadUrl.req.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateDownloadUrl.req.vtl
new file mode 100644
index 000000000..ceeb6c396
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateDownloadUrl.req.vtl
@@ -0,0 +1,20 @@
+## Only pass attachmentKey to Lambda if it exists
+#if($ctx.prev.result.attachmentKey)
+{
+    "version": "2018-05-29",
+    "operation": "Invoke",
+    "payload": {
+        "operation": "generateDownloadUrl",
+        "attachmentKey": $util.toJson($ctx.prev.result.attachmentKey)
+    }
+}
+#else
+## No attachmentKey: the Lambda handler returns an empty result for this payload
+{
+    "version": "2018-05-29",
+    "operation": "Invoke",
+    "payload": {
+        "operation": "generateDownloadUrl"
+    }
+}
+#end
\ No newline at end of file
diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateDownloadUrl.res.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateDownloadUrl.res.vtl
new file mode 100644
index 000000000..f1c9d4cab
--- /dev/null
+++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateDownloadUrl.res.vtl
@@ -0,0 +1,5 @@
+## Merge download URL into previous result if it exists
+#if($ctx.result && $ctx.result.downloadUrl)
+    #set($ctx.prev.result.downloadUrl = $ctx.result.downloadUrl)
+#end
+$util.toJson($ctx.prev.result) \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateUploadUrl.req.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateUploadUrl.req.vtl new file mode 100644 index 000000000..1b41ca12c --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/generateUploadUrl.req.vtl @@ -0,0 +1,9 @@ +{ + "version": "2018-05-29", + "operation": "Invoke", + "payload": { + "operation": "generateUploadUrl", + "noteId": $util.toJson($ctx.args.NoteId), + "fileName": $util.toJson($util.defaultIfNull($ctx.args.fileName, "attachment")) + } +} \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/getNotePipeline.req.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/getNotePipeline.req.vtl new file mode 100644 index 000000000..9e26dfeeb --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/getNotePipeline.req.vtl @@ -0,0 +1 @@ +{} \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/getNotePipeline.res.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/getNotePipeline.res.vtl new file mode 100644 index 000000000..634741579 --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/getNotePipeline.res.vtl @@ -0,0 +1 @@ +$util.toJson($ctx.prev.result) \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNotePipeline.req.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNotePipeline.req.vtl new file mode 100644 index 000000000..9e26dfeeb --- /dev/null +++ 
b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNotePipeline.req.vtl @@ -0,0 +1 @@ +{} \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNotePipeline.res.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNotePipeline.res.vtl new file mode 100644 index 000000000..634741579 --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNotePipeline.res.vtl @@ -0,0 +1 @@ +$util.toJson($ctx.prev.result) \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNoteToDynamoDB.req.vtl b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNoteToDynamoDB.req.vtl new file mode 100644 index 000000000..c1b9c9176 --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/resolvers/functions/saveNoteToDynamoDB.req.vtl @@ -0,0 +1,13 @@ +{ + "version": "2018-05-29", + "operation": "PutItem", + "key": { + "NoteId": $util.dynamodb.toDynamoDBJson($ctx.args.NoteId) + }, + "attributeValues": { + "title": $util.dynamodb.toDynamoDBJson($ctx.args.title), + "content": $util.dynamodb.toDynamoDBJson($ctx.args.content), + "attachmentKey": $util.dynamodb.toDynamoDBJson($ctx.prev.result.attachmentKey), + "uploadUrl": $util.dynamodb.toDynamoDBJson($ctx.prev.result.uploadUrl) + } +} \ No newline at end of file diff --git a/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/schema.graphql b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/schema.graphql new file mode 100644 index 000000000..28f853ab8 --- /dev/null +++ b/appsync-lambda-s3-presigned-urls-pipeline-resolver/src/graphql/schema.graphql @@ -0,0 +1,28 @@ +type Note { + NoteId: ID! 
+ title: String + content: String + attachmentKey: String + uploadUrl: String + downloadUrl: String +} + +type PaginatedNotes { + notes: [Note!]! + nextToken: String +} + +type Query { + allNotes(limit: Int, nextToken: String): PaginatedNotes! + getNote(NoteId: ID!): Note +} + +type Mutation { + saveNote(NoteId: ID!, title: String!, content: String!, fileName: String): Note + deleteNote(NoteId: ID!): Note +} + +schema { + query: Query + mutation: Mutation +} \ No newline at end of file
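
The README's curl-based testing flow can also be scripted end to end. The sketch below is illustrative and is not part of the patch above: `APPSYNC_URL` and `API_KEY` are placeholders for the `GraphQLApiEndpoint` and `APIKey` CDK outputs, and the helper names (`build_graphql_request`, `upload_attachment`, `save_note_with_attachment`) are invented here for demonstration.

```python
import json
import urllib.request

# Hypothetical placeholders: substitute the GraphQLApiEndpoint and APIKey
# values from your CDK outputs before running against a deployed stack.
APPSYNC_URL = "https://example1234.appsync-api.us-east-1.amazonaws.com/graphql"
API_KEY = "da2-examplekey"

SAVE_NOTE = """
mutation SaveNote($NoteId: ID!, $title: String!, $content: String!, $fileName: String) {
  saveNote(NoteId: $NoteId, title: $title, content: $content, fileName: $fileName) {
    NoteId
    attachmentKey
    uploadUrl
  }
}
"""


def build_graphql_request(url: str, api_key: str, query: str, variables: dict) -> urllib.request.Request:
    """Build a POST request for AppSync using the x-api-key auth mode."""
    body = json.dumps({"query": query, "variables": variables}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )


def upload_attachment(upload_url: str, data: bytes, content_type: str) -> int:
    """PUT the file bytes to the presigned URL and return the HTTP status."""
    req = urllib.request.Request(
        upload_url,
        data=data,
        method="PUT",
        headers={"Content-Type": content_type},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


def save_note_with_attachment(note_id: str, title: str, content: str, file_path: str) -> dict:
    """Call saveNote, then upload the local file (assumed PDF here) to the
    presigned uploadUrl returned by the pipeline resolver."""
    variables = {
        "NoteId": note_id,
        "title": title,
        "content": content,
        "fileName": file_path.rsplit("/", 1)[-1],
    }
    req = build_graphql_request(APPSYNC_URL, API_KEY, SAVE_NOTE, variables)
    with urllib.request.urlopen(req) as resp:
        note = json.loads(resp.read())["data"]["saveNote"]
    with open(file_path, "rb") as f:
        upload_attachment(note["uploadUrl"], f.read(), "application/pdf")
    return note
```

Note that the `Content-Type` sent on the PUT must match the `ContentType` the Lambda passed to `generate_presigned_url`, since that header is part of the signature; a mismatch causes S3 to reject the upload.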