AWS Lambda Function - Local Development Setup

This project provides a complete setup for developing and testing an AWS Lambda function locally using Python. The function exposes an OpenAI-compatible chat completions API backed by a multimodal Amazon Bedrock imported model.

🚀 Quick Start

Prerequisites

  • Python 3.9 or higher
  • AWS CLI (for deployment)
  • AWS SAM CLI (for local development)

Installation

  1. Clone and navigate to the project:

    git clone https://github.com/bitlab-experiments/lambda-bedrock.git
    cd lambda-bedrock
  2. Install dependencies:

    make install
    # or
    pip install -r requirements.txt
  3. Run local tests:

    make test
    # or
    python test_lambda.py

πŸ“ Project Structure

lambda-bedrock/
├── lambda_function.py      # Main Lambda function handler
├── test_lambda.py          # Local testing script
├── test_events.json        # Sample test events
├── requirements.txt        # Python dependencies
├── template.yaml           # SAM template for deployment
├── samconfig.toml          # SAM configuration
├── Makefile                # Development commands
├── .gitignore              # Git ignore rules
└── README.md               # This file

🧪 Local Testing

Method 1: Direct Python Testing

python test_lambda.py

This will run the Lambda function with sample events and display the results. The script includes mocked Bedrock responses for testing.
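
If you want to reproduce the mocking in a scratch script of your own, a minimal sketch looks like the following. The patched attribute name (bedrock_client) and the mocked response shape are assumptions; check test_lambda.py for what the project actually stubs.

import json
from unittest.mock import MagicMock, patch

import lambda_function

# Fake the streaming body that bedrock-runtime's invoke_model returns.
mock_body = MagicMock()
mock_body.read.return_value = json.dumps({"generation": "Hello!"}).encode()

# "bedrock_client" is an assumed module-level attribute; adjust it to
# match the real handler module.
with patch.object(lambda_function, "bedrock_client", create=True) as client:
    client.invoke_model.return_value = {"body": mock_body}
    event = {
        "httpMethod": "POST",
        "path": "/chat/completions",
        "body": json.dumps({"messages": [{"role": "user", "content": "Hi"}]}),
    }
    print(lambda_function.lambda_handler(event, None))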

Method 2: Pytest Unit Tests

# Run all tests
make test-pytest

# Run specific test categories
make test-unit        # Unit tests only
make test-integration # Integration tests only
make test-bedrock     # Bedrock-related tests only

# Run tests with coverage
make test-coverage
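
A typical new test case might look like this, assuming the markers behind the make targets above are registered pytest marks and the handler returns an API Gateway-style response (both are assumptions; see the existing tests for the real conventions):

import json

import pytest

from lambda_function import lambda_handler

@pytest.mark.unit  # assumed marker backing `make test-unit`
def test_empty_messages_rejected():
    event = {
        "httpMethod": "POST",
        "path": "/chat/completions",
        "body": json.dumps({"messages": []}),
    }
    response = lambda_handler(event, None)
    assert response["statusCode"] == 400  # assumed validation behavior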

Method 3: Using AWS SAM CLI

# Build the function
make local-build

# Start local API Gateway
make local-start

Then test the endpoint:

  • POST http://localhost:3000/chat/completions

Method 4: Direct Lambda Invocation

sam local invoke LambdaFunction --event test_events.json
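
The event file holds API Gateway proxy events; an entry along these lines exercises the chat endpoint (the exact structure of the shipped test_events.json may differ, and the values here are illustrative):

{
  "httpMethod": "POST",
  "path": "/chat/completions",
  "headers": {"Content-Type": "application/json"},
  "body": "{\"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}], \"model\": \"custom-model\"}"
}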

🔧 Available Commands

Command                 Description
make install            Install Python dependencies
make test               Run local tests with mocked Bedrock
make test-pytest        Run pytest test suite
make test-unit          Run unit tests only
make test-integration   Run integration tests only
make test-bedrock       Run Bedrock-related tests only
make test-coverage      Run tests with coverage report
make local-build        Build Lambda function locally
make local-start        Start local API Gateway
make local-invoke       Invoke Lambda function locally
make clean              Clean build artifacts
make format             Format code with black
make lint               Lint code with flake8
make type-check         Type check with mypy
make deploy             Deploy to AWS

πŸ“ Lambda Function Features

The Lambda function includes (a rough handler sketch follows this list):

  • Chat Completions API: OpenAI-compatible chat completions endpoint
  • AWS Bedrock Integration: Uses custom Bedrock model for text generation
  • Image Processing: Support for image inputs with base64 encoding
  • Conversation History: Maintains context across multiple messages
  • Error Handling: Comprehensive error handling and logging
  • Local Testing: Built-in mock context and Bedrock mocking for local development
  • Multiple Response Formats: Handles various Bedrock response structures
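
As a sketch of how these pieces fit together: the client variable, the MODEL_ID environment variable, and the Bedrock response key ("generation") are assumptions, and lambda_function.py remains the source of truth.

import json
import logging
import os

import boto3

logger = logging.getLogger()
logger.setLevel(os.environ.get("LOG_LEVEL", "INFO"))

bedrock_client = boto3.client("bedrock-runtime")

def lambda_handler(event, context):
    try:
        request = json.loads(event.get("body") or "{}")
        # Fold the OpenAI-style message history into one model input;
        # the real handler also folds base64 image parts in here.
        prompt = "\n".join(
            f"{m['role']}: {m['content']}"
            for m in request.get("messages", [])
            if isinstance(m.get("content"), str)
        )
        response = bedrock_client.invoke_model(
            modelId=os.environ.get("MODEL_ID", ""),  # imported-model ARN
            contentType="application/json",
            body=json.dumps({"prompt": prompt}),
        )
        result = json.loads(response["body"].read())
        completion = {
            "object": "chat.completion",
            "model": request.get("model", "custom-model"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant",
                            "content": result.get("generation", "")},  # assumed key
                "finish_reason": "stop",
            }],
        }
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json",
                        "Access-Control-Allow-Origin": "*"},
            "body": json.dumps(completion),
        }
    except Exception:
        logger.exception("chat completion request failed")
        return {"statusCode": 500, "body": json.dumps({"error": "internal error"})}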

🌐 API Endpoints

Chat Completions

curl -X POST http://localhost:3000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ],
    "model": "custom-model",
    "max_tokens": 100,
    "temperature": 0.7
  }'

Response:

{
  "id": "chatcmpl-abc12345",
  "object": "chat.completion",
  "created": 1672531200,
  "model": "custom-model",
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "Hello! I'm doing well, thank you for asking."
    },
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 10,
    "completion_tokens": 8,
    "total_tokens": 18
  }
}

Chat with Image

curl -X POST http://localhost:3000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": [
          {"type": "text", "text": "What do you see in this image?"},
          {"type": "image_url", "image_url": {"url": "data:image/jpeg;base64,..."}}
        ]
      }
    ],
    "model": "custom-model"
  }'
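
To build that data URL from a local file, something along these lines works; the endpoint and field names match the curl call above, while the file path is illustrative:

import base64
import json
import urllib.request

with open("photo.jpg", "rb") as f:  # illustrative path
    data_url = "data:image/jpeg;base64," + base64.b64encode(f.read()).decode()

payload = {
    "model": "custom-model",
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "What do you see in this image?"},
            {"type": "image_url", "image_url": {"url": data_url}},
        ],
    }],
}
req = urllib.request.Request(
    "http://localhost:3000/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(req).read().decode())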

🚀 Deployment to AWS

Prerequisites for Deployment

  1. AWS CLI configured with appropriate credentials
  2. S3 bucket for deployment artifacts
  3. AWS SAM CLI installed

Deploy Steps

  1. Update samconfig.toml with your S3 bucket name (see the snippet after these steps)
  2. Deploy using SAM:
    make deploy
    # or
    sam deploy --guided
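
The relevant samconfig.toml block looks roughly like this; s3_bucket is the value to change, while the stack name and region are examples:

[default.deploy.parameters]
stack_name = "lambda-bedrock"
s3_bucket = "your-deployment-artifacts-bucket"
region = "us-east-1"
capabilities = "CAPABILITY_IAM"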

Manual Deployment

# Build
sam build

# Deploy
sam deploy --guided

🛠️ Development

Adding New Endpoints

  1. Modify lambda_function.py to handle new routes (a route sketch follows these steps)
  2. Add test events to test_events.json
  3. Update the SAM template if needed
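
For example, a health-check route could slot into the same dispatch; the names here are assumed, so adapt them to the actual handler structure:

# Inside lambda_handler, next to the existing /chat/completions branch:
if event.get("path") == "/health" and event.get("httpMethod") == "GET":
    return {"statusCode": 200, "body": json.dumps({"status": "ok"})}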

Environment Variables

The function supports these environment variables:

  • ENVIRONMENT: Environment name (dev/staging/prod)
  • LOG_LEVEL: Logging level (INFO/DEBUG/ERROR)

Testing New Features

  1. Add test cases to test_events.json
  2. Run make test to verify locally
  3. Use make local-start to test with API Gateway

📊 Monitoring and Logging

The function includes structured logging (sketched after this list) that works with:

  • AWS CloudWatch Logs
  • Local development console
  • Custom log aggregators
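
A minimal JSON formatter in the same spirit looks like this; the function's actual log format may differ:

import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logging.getLogger().addHandler(handler)
logging.getLogger().setLevel("INFO")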

🔒 Security Considerations

  • CORS headers are configured for cross-origin requests
  • Input validation should be added for production use
  • Sensitive configuration is supplied through environment variables
  • IAM roles follow the principle of least privilege

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Run make test to ensure everything works
  6. Submit a pull request

πŸ› Troubleshooting

Common Issues

  1. Import errors: Ensure all dependencies are installed with make install
  2. Port conflicts: Change the port in make local-start if 3000 is occupied (see the example below)
  3. AWS credentials: Ensure AWS CLI is configured for deployment
  4. SAM build errors: Try make clean then make local-build
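
For the port conflict in item 2, SAM accepts an explicit port:

sam local start-api --port 3001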

Getting Help

  • Check the AWS Lambda documentation
  • Review the SAM CLI documentation
  • Ensure Python version compatibility (3.9+)
