50 changes: 50 additions & 0 deletions README.md
@@ -85,6 +85,7 @@ This plugin is updated by its users, I just do maintenance and ensure that PRs a
- [Lambda and Lambda Proxy Integrations](#lambda-and-lambda-proxy-integrations)
- [HTTP Proxy](#http-proxy)
- [Response parameters](#response-parameters)
- [Streaming responses](#streaming-responses)
- [WebSocket](#websocket)
- [Debug process](#debug-process)
- [Resource permissions and AWS profile](#resource-permissions-and-aws-profile)
@@ -749,6 +750,55 @@ Example response velocity template:
},
```

### Streaming responses

[AWS doc](https://docs.aws.amazon.com/lambda/latest/dg/configuration-response-streaming.html)

You can enable streaming responses for your Lambda functions by setting `response.transferMode` to `STREAM` in your HTTP event configuration. This lets your handler stream data to clients in chunks rather than buffering the entire response.

**Configuration:**

```yml
functions:
  stream:
    handler: handler.stream
    events:
      - http:
          path: stream
          method: get
          response:
            transferMode: STREAM # Enable streaming (default is BUFFERED)
```

**Handler Implementation:**

```javascript
/// <reference types="aws-lambda" />

export const stream = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    // Set content type
    responseStream.setContentType('text/plain')

    // Write chunks
    responseStream.write('Hello ')
    responseStream.write('World!')

    // End the stream
    responseStream.end()
  }
)
```

**Important Notes:**

- Streaming is only supported with `AWS_PROXY` integration (the default)
- Currently only Node.js runtimes support streaming in serverless-offline
- Use the native `awslambda.streamifyResponse` API (works both locally and in AWS)
- The `awslambda` global is provided by serverless-offline to simulate the AWS Lambda runtime

See the [examples/streaming-response](examples/streaming-response) directory for a complete working example.
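
For a quick unit test outside serverless-offline, the `awslambda` global can be stubbed by hand. A minimal sketch, assuming ESM and a hand-rolled `sink` object (the stub below is illustrative, not the plugin's actual simulation):

```javascript
// Illustrative pass-through stub of the `awslambda` global, enough to
// exercise a streaming handler in plain Node.js.
globalThis.awslambda = {
  streamifyResponse: (fn) => fn,
}

const stream = awslambda.streamifyResponse(
  async (event, responseStream) => {
    responseStream.write('Hello ')
    responseStream.write('World!')
    responseStream.end()
  }
)

// Hand-rolled sink that records everything the handler writes
const chunks = []
const sink = {
  write: (chunk) => chunks.push(String(chunk)),
  end: (chunk) => { if (chunk !== undefined) chunks.push(String(chunk)) },
  setContentType: () => {}, // no-op; present so handlers that set it still run
}

await stream({}, sink)
console.log(chunks.join('')) // "Hello World!"
```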

## WebSocket

Usage in order to send messages back to clients:
113 changes: 113 additions & 0 deletions examples/streaming-response/README.md
@@ -0,0 +1,113 @@
# API Gateway Streaming Response Example

This example demonstrates how to use API Gateway streaming responses with serverless-offline.

## Overview

API Gateway streaming responses allow you to send data to clients in chunks rather than buffering the entire response. This is useful for:
- Large file transfers
- Real-time data streaming
- Server-sent events
- Reducing memory usage and latency
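
For the server-sent-events case above, each event is framed as a `data:` line followed by a blank line, and a streaming handler would write one frame per event (with `Content-Type: text/event-stream`). A minimal framing sketch, where the `frame` helper is illustrative:

```javascript
// Illustrative: SSE wire framing — one "data: <payload>\n\n" frame per event.
const frame = (payload) => `data: ${JSON.stringify(payload)}\n\n`

let body = ''
for (let i = 0; i < 3; i++) {
  body += frame({ tick: i }) // a handler would responseStream.write() each frame
}
console.log(body) // three "data:" frames, each terminated by a blank line
```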

## Configuration

To enable streaming responses in your `serverless.yml`, add the `response.transferMode` property to your HTTP event:

```yaml
functions:
  stream:
    handler: handler.stream
    events:
      - http:
          path: stream
          method: get
          response:
            transferMode: STREAM # Enable streaming (default is BUFFERED)
```

## Important Notes

1. **Integration Type**: Streaming is only supported with `AWS_PROXY` integration (the default)
2. **Runtime Support**: Currently only Node.js runtimes support streaming in serverless-offline
3. **Handler Pattern**: Use the native `awslambda.streamifyResponse` wrapper (works both locally and in production)

## Usage

### Install Dependencies

```bash
npm install
```

### Start Serverless Offline

```bash
npm start
# or
serverless offline
```

### Test the Endpoints

```bash
# Test streaming text response
curl http://localhost:3000/dev/stream

# Test streaming JSON response
curl http://localhost:3000/dev/stream-json

# Test regular (non-streaming) response
curl http://localhost:3000/dev/regular
```
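
The `/dev/stream-json` endpoint assembles one JSON document chunk by chunk, so the concatenation of all chunks parses as a single object. A sketch of that framing (chunk values are illustrative; the example handler adds timestamps):

```javascript
// Illustrative: emit a JSON document piecewise so that joining all chunks
// yields valid JSON — the framing used by the stream-json handler.
const chunks = []
chunks.push('{"message": "Starting stream...", "data": [')
for (let i = 0; i < 3; i++) {
  const item = JSON.stringify({ item: i })
  chunks.push(i === 2 ? item : item + ',') // comma between items, none after the last
}
chunks.push(']}')

const body = chunks.join('')
console.log(JSON.parse(body).data.length) // 3
```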

## Handler Implementation

```javascript
/// <reference types="aws-lambda" />

export const handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    // Set content type
    responseStream.setContentType('text/plain')

    // Write chunks
    responseStream.write('Hello ')
    responseStream.write('World!')

    // End the stream
    responseStream.end()
  }
)
```

## Response Stream API

The `responseStream` object provides the following methods:

- `write(chunk)` - Write a chunk to the stream (string or Buffer)
- `end([chunk])` - End the stream, optionally writing a final chunk
- `setContentType(contentType)` - Set the Content-Type header
- `setStatusCode(code)` - Set the HTTP status code (default: 200)
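
A sketch of how these methods compose; the `makeRecordingStream` stub below is illustrative (it only mirrors the method surface listed above so the call sequence is visible), not the plugin's implementation:

```javascript
// Illustrative stub mirroring the responseStream method surface.
function makeRecordingStream() {
  const state = { statusCode: 200, contentType: null, body: '', ended: false }
  return {
    state,
    setStatusCode(code) { state.statusCode = code },
    setContentType(type) { state.contentType = type },
    write(chunk) { state.body += String(chunk) },
    end(chunk) {
      if (chunk !== undefined) state.body += String(chunk)
      state.ended = true
    },
  }
}

const rs = makeRecordingStream()
rs.setContentType('text/plain')
rs.write('partial ')
rs.end('response') // end() may carry a final chunk
console.log(rs.state.body) // "partial response"
```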

## Production Deployment

The handler code works identically in both development (serverless-offline) and production (AWS Lambda) - no changes needed! The `awslambda` global is provided by:
- **serverless-offline**: Simulated in the local runtime
- **AWS Lambda**: Native runtime global

```javascript
/// <reference types="aws-lambda" />

// This works both locally and in production
export const handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    // Your code here
  }
)
```

## Limitations

- Streaming is not yet supported for Docker, Python, Ruby, Go, or Java runtimes in serverless-offline
- The response is buffered in serverless-offline (not truly streamed), but the API is compatible for development
75 changes: 75 additions & 0 deletions examples/streaming-response/handler.js
@@ -0,0 +1,75 @@
/// <reference types="aws-lambda" />

/**
 * Example handler demonstrating API Gateway streaming response support
 *
 * To test this example:
 * 1. cd examples/streaming-response
 * 2. npm install or yarn install
 * 3. serverless offline
 * 4. curl http://localhost:3000/dev/stream
 */

// Helper function to create delays
const delay = (ms) => new Promise(resolve => setTimeout(resolve, ms))

export const stream = awslambda.streamifyResponse(
  async (event, responseStream, _context) => {
    // Return HTTP status code 200 and set content type to text/plain
    responseStream = awslambda.HttpResponseStream.from(responseStream, {
      statusCode: 200,
      headers: {
        "Content-Type": "text/plain",
        "X-Custom-Header": "streaming-enabled"
      }
    });

    // Stream text data, one chunk per second
    console.log(`Sending first chunk`);
    responseStream.write("Will send more text every second...");
    for (let i = 0; i < 5; i++) {
      await delay(1000);
      console.log(`Waited 1 second, sending more text...`);
      responseStream.write("more text...");
      console.log(`Sent more text...`);
    }
    responseStream.write("Done!");
    responseStream.end();
  }
);

export const streamJson = awslambda.streamifyResponse(
  async (event, responseStream, _context) => {
    // Return HTTP status code 200 and set content type to application/json
    responseStream = awslambda.HttpResponseStream.from(responseStream, {
      statusCode: 200,
      headers: {
        "Content-Type": "application/json",
        "X-Custom-Header": "streaming-enabled"
      }
    });

    // Stream JSON data, one array item per second
    responseStream.write("{\"message\": \"Starting stream...\", \"data\": [");
    for (let i = 0; i < 5; i++) {
      await delay(1000);
      console.log(`Waited 1 second, sending json data...`);
      const chunk = JSON.stringify({ item: i, timestamp: new Date().toISOString() });
      responseStream.write(i === 4 ? chunk : chunk + ",");
      console.log(`Sent data ${i}`);
    }
    responseStream.write("]}");
    responseStream.end();
  }
);

// Regular non-streaming handler for comparison
export const regular = async (event) => {
  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'text/plain',
    },
    body: 'Hello from regular API!',
  }
}