Testune-AI/express-template


⚡ Bun + Express Streaming Chat Template

A lightweight Bun + Express template that connects to the Testune AI API and streams chat responses in real time using Server-Sent Events (SSE).

Use this project as a starting point for building AI apps, proxies, or backend services that need live, token-by-token responses from an AI model.


✨ Features

  • ⚡ Built with Bun + Express
  • 🔌 Connects to the Testune AI API for LLM interactions
  • 📡 Supports SSE streaming (just like OpenAI’s streaming responses)
  • 💬 Example /chat endpoint you can call from your frontend
  • 🛠 Easy to extend with your own routes, auth, or business logic
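The SSE wire format itself is simple: each chunk arrives as a `data:` line followed by a blank line, and OpenAI-style streams finish with a `data: [DONE]` sentinel. A minimal parser sketch in TypeScript — the exact payload shape of the Testune API is an assumption here, not something this template documents:

```typescript
// Minimal SSE frame parser (sketch). Assumes OpenAI-style framing:
// each event is a `data: ...` line, events are separated by a blank
// line, and the stream ends with a `data: [DONE]` sentinel.
export function parseSSE(text: string): string[] {
  const payloads: string[] = [];
  for (const frame of text.split("\n\n")) {
    for (const line of frame.split("\n")) {
      if (!line.startsWith("data: ")) continue;
      const payload = line.slice("data: ".length);
      if (payload === "[DONE]") return payloads; // end-of-stream marker
      payloads.push(payload);
    }
  }
  return payloads;
}
```

On the frontend you would accumulate chunks from a `fetch("/chat")` response body and feed them through a parser like this (or use `EventSource` if you expose the endpoint over GET).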

📂 Project Structure

.
├── src/
│   └── index.ts       # Express server with streaming proxy
├── package.json
├── tsconfig.json
└── README.md
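The streaming proxy in src/index.ts boils down to forwarding the upstream SSE body to the client chunk by chunk, so tokens reach the browser as soon as the model emits them. A hedged sketch of that core — the `pipeSSE` helper, the route shape, and the `TESTUNE_API_URL` / request body are illustrative assumptions, not the template's exact code:

```typescript
import type { Writable } from "node:stream";

// Forward an upstream SSE response body to the client response,
// chunk by chunk, without buffering the whole answer.
export async function pipeSSE(upstream: Response, res: Writable): Promise<void> {
  const reader = upstream.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    res.write(decoder.decode(value, { stream: true }));
  }
  res.end();
}

// Inside the Express server this would be wired up roughly like so
// (route and env var names are hypothetical):
//
//   app.post("/chat", async (req, res) => {
//     res.setHeader("Content-Type", "text/event-stream");
//     res.setHeader("Cache-Control", "no-cache");
//     res.setHeader("Connection", "keep-alive");
//     const upstream = await fetch(TESTUNE_API_URL, {
//       method: "POST",
//       headers: { Authorization: `Bearer ${process.env.TESTUNE_API_KEY}` },
//       body: JSON.stringify({ messages: req.body.messages, stream: true }),
//     });
//     await pipeSSE(upstream, res);
//   });
```

Keeping the piping logic separate from the route handler makes it easy to reuse for additional streaming endpoints you add later.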
