A production-ready Serverless architecture utilizing TypeScript, Middy, and DynamoDB, following Onion/Clean Architecture principles.
This project is a production-grade infrastructure template built with the Serverless Framework. It demonstrates how to build maintainable, scalable, and testable cloud-native applications on AWS.
Beyond a simple Lambda function, this repository showcases a robust enterprise-grade structure focusing on separation of concerns, runtime validation, and automated testing.
For AWS fundamental concepts, please refer to aws-serverless-toturial.
- Onion Architecture: Strict separation between Handlers, Services, and Repositories.
- Type Safety: End-to-end TypeScript implementation with Zod for runtime schema validation.
- Robust Middleware: Powered by Middy for centralized JSON parsing, input validation, and standardized error handling.
- Comprehensive Testing: Full test suite using Vitest with local DynamoDB mocking.
- Local DynamoDB: Full offline development support via `serverless-dynamodb`.
- OpenAPI Integration: Schema-first API documentation auto-generated from Zod schemas.
The project follows the Onion/Clean Architecture pattern to ensure the business logic remains independent of external infrastructure (AWS SDK, Database).
API Gateway ➔ Middy Middleware ➔ Lambda Handler ➔ Service Layer ➔ Repository Layer ➔ DynamoDB
- Handler Layer: Pure entry point. Adapts API Gateway events and manages middleware.
- Service Layer: The "Brain." Contains core business logic and rules.
- Repository Layer: Data access abstraction. Interacts with the DynamoDB client.
- Middleware Stack: Handles cross-cutting concerns (Validation, HTTP Error Formatting).
```
before: httpJsonBodyParser → zodValidation → handler
after:  zodValidationResponse → responseMiddleware (reverse registration order)
error:  errorHandler
```
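The before/after ordering above can be sketched with a minimal hand-rolled onion. This is not `@middy/core` itself, only an illustration of its ordering rule: `before` hooks run in registration order, `after` hooks in reverse registration order.

```typescript
// Minimal sketch of Middy-style onion ordering (illustrative, not @middy/core).
type Hook = (trace: string[]) => void;
interface Middleware {
  before?: Hook;
  after?: Hook;
}

function run(middlewares: Middleware[], handler: Hook): string[] {
  const trace: string[] = [];
  for (const m of middlewares) m.before?.(trace); // registration order
  handler(trace);
  for (const m of [...middlewares].reverse()) m.after?.(trace); // reverse order
  return trace;
}

// Hypothetical middlewares mirroring the stack above.
const httpJsonBodyParser: Middleware = { before: (t) => t.push("parse") };
const responseMiddleware: Middleware = { after: (t) => t.push("formatResponse") };
const zodValidation: Middleware = {
  before: (t) => t.push("validate"),
  after: (t) => t.push("validateResponse"),
};

const trace = run([httpJsonBodyParser, responseMiddleware, zodValidation], (t) =>
  t.push("handler"),
);
// trace: ["parse", "validate", "handler", "validateResponse", "formatResponse"]
```

Note how response validation runs before the response formatter, exactly because `after` hooks execute in reverse registration order.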
| Category | Technology |
|---|---|
| Language | TypeScript |
| Framework | Serverless Framework v4 |
| Middleware | Middy (@middy/core) |
| Validation | Zod |
| Database | AWS DynamoDB (@aws-sdk/lib-dynamodb) |
| AI Enrichment | AWS Bedrock (Claude 3 Haiku) |
| Testing | Vitest, Faker.js |
| Local DynamoDB | serverless-dynamodb |
| Logging | AWS Lambda Powertools Logger |
| Tracing | AWS X-Ray + Lambda Powertools Tracer |
| Docs | Redocly + zod-to-openapi |
```
src/
├── clients/
│   └── aws.client.ts              # All AWS clients: DynamoDB, S3, SNS, Bedrock
├── docs/                          # OpenAPI documentation
│   ├── openapi.ts                 # Central aggregator for all routes
│   ├── gen-docs.ts                # CLI script for file generation
│   └── registry.ts                # Global OpenAPI registry singleton
├── handlers/                      # Lambda entry points
│   └── user/
│       ├── index.ts               # CRUD handler exports
│       └── user.serverless.ts     # Serverless function definitions
├── middleware/                    # Middy middleware stack
│   └── api.ts                     # restApiHandler, zodValidation, errorHandler
├── services/                      # Business logic layer (framework-agnostic)
│   ├── ai/
│   │   ├── BedrockService.ts      # Interface
│   │   └── BedrockServiceImpl.ts  # AWS Bedrock implementation
│   └── user/
│       ├── userService.ts         # Interface
│       └── userServiceImpl.ts     # Implementation
├── repositories/                  # Data access layer (persistence logic)
├── schemas/                       # Zod validation schemas
├── utils/                         # Helpers and error definitions
└── tests/                         # Unit & integration test suites

requests/                          # Bruno/REST Client API request collections
├── user.http
└── .env.example
```
- Node.js >= 20.x
- pnpm >= 9.x
- Docker (for local DynamoDB)
- AWS CLI configured (`aws configure`)
```sh
# Install pnpm if not already installed
npm install -g pnpm

# Verify
pnpm --version

# Install dependencies
pnpm install

# Start local DynamoDB
pnpm db:start

# Create tables in local DynamoDB
pnpm db:create-tables
```

This project uses `.env` files to manage environment-specific variables. The Serverless Framework reads `.env` and injects values into Lambda via `serverless.ts`.
.env → serverless.ts (${env:KEY}) → Lambda process.env
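A sketch of what that injection might look like in `serverless.ts` (the variable names match this project's `.env` keys; the surrounding structure is a typical Serverless Framework config, not copied from the repo):

```typescript
// Illustrative fragment: ${env:KEY} placeholders are resolved by the
// Serverless Framework from .env and become Lambda process.env values.
const serverlessConfiguration = {
  service: "aws-serverless-infrastructure",
  provider: {
    name: "aws",
    runtime: "nodejs20.x",
    environment: {
      IS_OFFLINE: "${env:IS_OFFLINE, 'false'}",
      DYNAMODB_ENDPOINT: "${env:DYNAMODB_ENDPOINT, ''}",
      // AWS_REGION is provided by the Lambda runtime itself.
    },
  },
};

export default serverlessConfiguration;
```

The `${env:KEY, 'default'}` syntax falls back to the default when the variable is absent, which keeps deployments working when an optional key is left out of `.env`.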
| File | Purpose | Git |
|---|---|---|
| `.env` | Local development | ❌ Never commit |
| `.env.dev` | AWS dev deployment | ❌ Never commit |
| `.env.prod` | AWS prod deployment | ❌ Never commit |
| `.env.example` | Template for team | ✅ Commit |
```sh
cp ./src/requests/.env.example .env
```

```sh
# .env — Local development
IS_OFFLINE=true
DYNAMODB_ENDPOINT=http://localhost:8000
AWS_REGION=us-east-1
```

⚠️ `DYNAMODB_ENDPOINT` must be set for local DynamoDB (Docker). In AWS deployments, leave it empty — Lambda uses IAM Role credentials automatically.
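As a sketch of how the client config might branch on `DYNAMODB_ENDPOINT` (the env var names are this project's; the factory itself is illustrative, not the repo's exact `aws.client.ts`):

```typescript
// Illustrative config factory: local Docker DynamoDB gets an explicit endpoint
// and dummy credentials; on AWS the SDK resolves IAM Role credentials itself.
interface DynamoClientConfig {
  region: string;
  endpoint?: string;
  credentials?: { accessKeyId: string; secretAccessKey: string };
}

function dynamoConfig(env: Record<string, string | undefined>): DynamoClientConfig {
  const config: DynamoClientConfig = { region: env.AWS_REGION ?? "us-east-1" };
  if (env.DYNAMODB_ENDPOINT) {
    // Local mode: DynamoDB Local accepts any non-empty credentials.
    config.endpoint = env.DYNAMODB_ENDPOINT;
    config.credentials = { accessKeyId: "local", secretAccessKey: "local" };
  }
  // Deployed mode: endpoint stays unset, default credential chain applies.
  return config;
}
```

The resulting object is the shape accepted by `new DynamoDBClient(...)` from `@aws-sdk/client-dynamodb`.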
| Variable | Local | Dev | Prod | Description |
|---|---|---|---|---|
| `IS_OFFLINE` | `true` | `false` | `false` | Enables local DynamoDB mode |
| `DYNAMODB_ENDPOINT` | `http://localhost:8000` | (empty) | (empty) | Local DynamoDB endpoint |
| `AWS_REGION` | `us-east-1` | `us-east-1` | `us-east-1` | AWS region |
| `API_URL_LOCAL` | `http://localhost:3000/dev` | (empty) | (empty) | Local API base URL for OpenAPI docs |
| `API_URL_DEV` | (empty) | `https://xxxxxx.execute-api.us-east-1.amazonaws.com/dev` | (empty) | AWS dev API base URL for OpenAPI docs |
| `API_URL_PROD` | (empty) | (empty) | `https://xxxxxx.execute-api.us-east-1.amazonaws.com/prod` | AWS prod API base URL for OpenAPI docs |
```sh
# Start locally
pnpm dev
```

On startup you should see:

```
DynamoDB Local Started on port 8000
serverless-dynamodb: Migration ran for table: aws-serverless-infrastructure-users-dev ✓
offline: POST   http://localhost:3000/dev/users
offline: GET    http://localhost:3000/dev/users/{id}
offline: GET    http://localhost:3000/dev/users
offline: PATCH  http://localhost:3000/dev/users/{id}
offline: DELETE http://localhost:3000/dev/users/{id}
```
Option A — VS Code JavaScript Debug Terminal (Recommended, zero config)
1. Open Run and Debug panel
2. Click "JavaScript Debug Terminal"
3. Run: pnpm dev
4. Set breakpoints → send a request → execution pauses automatically
```sh
# List tables
AWS_ACCESS_KEY_ID=local AWS_SECRET_ACCESS_KEY=local \
aws dynamodb list-tables \
  --endpoint-url http://localhost:8000 \
  --region us-east-1

# Expected
# { "TableNames": ["aws-serverless-infrastructure-users-dev"] }
```

```sh
# Deploy to dev
pnpm deploy

# Deploy to production
pnpm deploy:prod
```

```sh
# Generate spec → openapi.yaml
pnpm docs:gen

# Live preview at http://localhost:4000
pnpm docs:preview

# Build standalone index.html
pnpm docs:build

# Deploy index.html to S3 (dev)
pnpm docs:deploy:dev

# Get the public website URL
pnpm docs:url
```

The `docs:deploy:dev` command builds the docs and uploads `index.html` to the `aws-serverless-infrastructure-docs-dev` S3 bucket, which is provisioned by `serverless.ts` as a public static website.
| Method | Path | Description |
|---|---|---|
| POST | `/users` | Create user — AI-enriched bio/tags via Bedrock |
| GET | `/users/{id}` | Get user by ID |
| GET | `/users` | List users |
| PATCH | `/users/{id}` | Update user |
| DELETE | `/users/{id}` | Delete user |
| GET | `/users/{id}/portrait/upload-url` | Get presigned S3 URL to upload portrait |
| POST | `/users/{id}/verify/send` | Send email verification code via SNS |
| POST | `/users/{id}/verify/confirm` | Confirm verification code |
1. Get presigned URL — `GET /users/{id}/portrait/upload-url`

```json
{
  "success": true,
  "data": {
    "uploadUrl": "https://s3.amazonaws.com/...",
    "portraitKey": "portraits/{id}.jpg"
  }
}
```

2. Upload directly from the client (the URL expires in 5 minutes):

```sh
curl -X PUT "<uploadUrl>" \
  -H "Content-Type: image/jpeg" \
  --data-binary @portrait.jpg
```

3. Save the key — `PATCH /users/{id}` with `{ "portraitKey": "portraits/{id}.jpg" }`
When a user is created (POST /users), Bedrock (Claude 3 Haiku) automatically generates a bio and tags based on the user's name and email domain:
```json
{
  "success": true,
  "data": {
    "id": "...",
    "name": "Jane Smith",
    "email": "jane@acme.com",
    "createdAt": "...",
    "bio": "A professional at acme.com with expertise in their field.",
    "tags": ["acme", "professional", "tech"]
  }
}
```

If Bedrock is unavailable, the user is still created — enrichment failure is non-blocking.
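The non-blocking behavior can be sketched like this (illustrative code; `BedrockService` here is a stand-in interface, not the repo's exact one):

```typescript
// Sketch: enrichment failure is swallowed so user creation always succeeds.
interface Enrichment {
  bio: string;
  tags: string[];
}
interface BedrockService {
  enrich(name: string, email: string): Promise<Enrichment>;
}

async function createUser(
  input: { name: string; email: string },
  ai: BedrockService,
): Promise<{ name: string; email: string; bio?: string; tags?: string[] }> {
  let enrichment: Enrichment | undefined;
  try {
    enrichment = await ai.enrich(input.name, input.email);
  } catch {
    // Non-blocking: log the failure and continue without bio/tags.
  }
  return { ...input, ...enrichment };
}
```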
1. Send code — `POST /users/{id}/verify/send`

```json
{ "email": "user@example.com" }
```

Publishes a 6-digit code to the SNS VerificationTopic. Subscribe an email address to the topic in the AWS console after deploying.

2. Confirm code — `POST /users/{id}/verify/confirm`

```json
{ "code": "123456" }
```

Validates the code (10-minute expiry) and marks the user as `EmailVerified = true` in DynamoDB.
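The code generation and expiry check can be sketched as follows (illustrative logic only; the real implementation persists the code in DynamoDB alongside the user):

```typescript
// Sketch of the 6-digit code + 10-minute expiry flow described above.
const CODE_TTL_MS = 10 * 60 * 1000;

function generateCode(): string {
  // Zero-padded 6-digit code, e.g. "042317".
  return Math.floor(Math.random() * 1_000_000)
    .toString()
    .padStart(6, "0");
}

function isCodeValid(
  stored: { code: string; issuedAt: number },
  submitted: string,
  now: number = Date.now(),
): boolean {
  return stored.code === submitted && now - stored.issuedAt <= CODE_TTL_MS;
}
```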
All error responses follow a consistent structure:
```json
{
  "success": false,
  "error": {
    "code": "VALIDATION_ERROR",
    "message": "Validation failed",
    "details": [...]
  }
}
```

| Status | Code | Description |
|---|---|---|
| 400 | `VALIDATION_ERROR` | Zod schema violation |
| 400 | `INVALID_JSON` | Malformed request body |
| 404 | `RESOURCE_NOT_FOUND` | Entity does not exist |
| 500 | `INTERNAL_ERROR` | Unexpected server error |
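A centralized error-to-response mapping that produces this envelope might look like the following sketch (illustrative; the project's real middleware lives in `src/middleware/api.ts`):

```typescript
// Sketch: known AppErrors map to their status/code; anything else becomes a 500.
class AppError extends Error {
  constructor(
    public readonly statusCode: number,
    public readonly code: string,
    message: string,
    public readonly details?: unknown[],
  ) {
    super(message);
  }
}

function toErrorResponse(err: unknown): { statusCode: number; body: string } {
  const appErr =
    err instanceof AppError
      ? err
      : new AppError(500, "INTERNAL_ERROR", "Unexpected server error");
  return {
    statusCode: appErr.statusCode,
    body: JSON.stringify({
      success: false,
      error: { code: appErr.code, message: appErr.message, details: appErr.details },
    }),
  };
}
```

Because every thrown error funnels through one mapper, handlers never build error bodies by hand and the response shape stays consistent across endpoints.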
```sh
pnpm test          # Watch mode
pnpm test:run      # CI mode (run once and exit)
pnpm test:coverage # With coverage report
pnpm test:ui       # Visual UI
```

We employ a dual-layered testing strategy:

- Unit Tests: Focus on the Service Layer. Dependencies like DynamoDB are isolated using `vi.mock`.
- Integration Tests: Test the Lambda handler end-to-end, including Middy middleware behavior (validation, parsing, error interception).
- Mocking: Utilizes Faker.js for realistic data generation and `vi.spyOn` for dependency tracking.
This project implements a Schema-First (Zod) development workflow. API definitions, request validations, and documentation are synchronized using Zod and @asteasolutions/zod-to-openapi.
This ensures 100% consistency between the TypeScript source code and the generated OpenAPI specification.
Documentation is automatically generated from the same Zod schemas used for runtime validation. There is no manual OpenAPI YAML to maintain.
```
Zod Schema (single source of truth)
        ↓
restApiHandler({ body, response, openapi })
        ↓
registry.registerPath()  ← auto-called at module load time
        ↓
pnpm docs:gen → openapi.yaml
```
1. Define schemas in `src/schemas/`:

```typescript
export const createUserSchema = z
  .object({
    name: z.string(),
    email: z.string().email(),
  })
  .openapi("CreateUserRequest");

export const userResponseSchema = z
  .object({
    id: z.string().uuid(),
    name: z.string(),
    email: z.string().email(),
    createdAt: z.string(),
  })
  .openapi("UserResponse");
```

2. Register in the handler with `openapi` metadata:

```typescript
// src/handlers/user/index.ts
export const create = restApiHandler({
  body: createUserSchema,
  response: userResponseSchema,
  openapi: { // ← add this block
    method: "post",
    path: "/users",
    summary: "Create user",
    tags: ["User"],
  },
}).handler(async ({ body }) => service.create(body));
```

3. Import the handler in `src/docs/openapi.ts`:

```typescript
import "@@handlers/user/index"; // ← triggers auto-registration
// import '@@handlers/order/index' ← add new resources here
```

4. Run:

```sh
pnpm docs:preview
```

| File | Responsibility |
|---|---|
| `src/middleware/api.ts` | `restApiHandler` — calls `registerOpenApiRoute` at load time |
| `src/docs/registry.ts` | Global OpenAPI registry singleton |
| `src/docs/openapi.ts` | Aggregates all handlers, generates the spec |
| `src/docs/gen-docs.ts` | CLI script — writes `openapi.yaml` to disk |
| `src/docs/common.errors.ts` | Shared error response schemas (400/401/404/500) |
- Single source of truth — Zod schemas drive both runtime validation and API docs simultaneously.
- Zero drift — the docs cannot fall out of sync with the actual request/response validation.
- Zero boilerplate — no separate `.docs.ts` files to maintain; adding `openapi: {}` to a handler is all that's needed.
- Zero runtime cost — `registerOpenApiRoute` executes once at module load time, never per-request.
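The load-time registration pattern can be sketched like this (illustrative; the real registry in `src/docs/registry.ts` wraps `OpenAPIRegistry` from `@asteasolutions/zod-to-openapi`):

```typescript
// Sketch of a module-level registry: routes are collected as a side effect of
// importing handler modules, never during request handling.
interface RouteMeta {
  method: string;
  path: string;
  summary?: string;
}

const registeredRoutes: RouteMeta[] = [];

function registerOpenApiRoute(meta: RouteMeta): void {
  // Runs once when the handler module is imported — zero per-request cost.
  registeredRoutes.push(meta);
}

// This call would normally sit inside restApiHandler({ openapi: {...} }) and
// fire when `import "@@handlers/user/index"` is evaluated.
registerOpenApiRoute({ method: "post", path: "/users", summary: "Create user" });
```

The spec generator then reads `registeredRoutes` after all handler imports have executed, which is why step 3 above (importing each handler in `openapi.ts`) is required.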
Logging uses @aws-lambda-powertools/logger. A Logger instance is created per handler with a serviceName for correlation.
```typescript
import { Logger } from "@aws-lambda-powertools/logger";

const logger = new Logger({ serviceName: "myService" });

logger.info("user created", { userId: result.id });
logger.warn("something unexpected", { detail });
logger.error("operation failed", { error });
```

Log output is structured JSON, automatically enriched with Lambda context (request ID, cold start, etc.) when running on AWS.

Viewing logs:

```sh
# Tail logs for a deployed handler
pnpm logs user-create --tail

# Filter for errors only
aws logs tail /aws/lambda/aws-serverless-infrastructure-dev-user-create \
  --follow \
  --filter-pattern "ERROR"
```

X-Ray tracing is enabled at both the API Gateway and Lambda levels via `serverless.ts`. Each handler invocation is automatically wrapped in an X-Ray subsegment by the `tracerMiddleware` in `src/middleware/api.ts`.
The tracer uses @aws-lambda-powertools/tracer.
What is traced automatically:

- Every Lambda invocation (segment created by the X-Ray daemon)
- Each handler execution (subsegment via `tracerMiddleware`)
- Errors, which are recorded on the subsegment automatically

Viewing traces:

```sh
# Open AWS Console → X-Ray → Traces
# Filter by service name: aws-serverless-infrastructure
```

Or via CLI:

```sh
aws xray get-trace-summaries \
  --time-range-type TraceId \
  --start-time $(date -u -v-1H +%s) \
  --end-time $(date -u +%s) \
  --region us-east-1
```

X-Ray tracing is only active in deployed AWS environments. It is a no-op locally (`serverless-offline` does not emulate X-Ray).
1. Define Schemas: Create Zod models in `src/schemas/`.
2. Register Routes: Define API metadata (Method, Path, Tags) in `src/docs/`.
3. Generate Spec: Run the export script to convert TypeScript definitions into `openapi.yaml`.
4. Preview/Build: Render the YAML into interactive HTML documentation via Redocly.

- Create a `.docs.ts` file in `src/docs/` (e.g., `user.docs.ts`).
- Use `registry.registerPath({...})` to define the endpoint's metadata.
- Crucial: Import this file in `src/docs/openapi.ts` to include it in the final bundle.
Leverage Zod's `extend` or `partial` patterns combined with `.openapi('Name')` to ensure schemas are reusable and named correctly in the documentation:

```typescript
// Base Model
const BaseUser = z
  .object({
    name: z.string(),
    email: z.string().email(),
  })
  .openapi("BaseUser");

// Response Model (inherits fields and displays as a reference)
const UserResponse = BaseUser.extend({
  id: z.string().uuid(),
}).openapi("UserResponse");
```

- Onion/Clean Architecture
- Zod runtime validation (request + response)
- Centralized error handling middleware
- Standardized error response schemas
- Local DynamoDB via Docker
- OpenAPI documentation auto-generated from Zod schemas
- Auto-register OpenAPI routes in `restApiHandler`
- Multi-server OpenAPI spec with env-based URLs
- OpenAPI deploy to S3 static html
- Logging — Structured logging with AWS Lambda Powertools
- Observability — AWS X-Ray distributed tracing
- Portrait upload — Dedicated presigned URL endpoint (`GET /users/{id}/portrait/upload-url`)
- Email verification — SNS-based 6-digit code flow
- AI enrichment — Bedrock (Claude 3 Haiku) generates bio/tags on user creation
- Security — AWS Secrets Manager integration
- SQS — Async decoupling for background tasks
- Lambda Power Tuning — Memory/cost benchmarking