
Simple LLM API created with Express, LangChain.js and OpenAI


m1ckc3b/simple-llm-api


Simple LLM API

In this project, I created a simple Prompt + LLM + Parser app.

To install dependencies:

bun install

Create an .env file:

PORT=3000
OPENAI_API_KEY="<your-api-key>"
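The server needs both values at startup. As a minimal sketch of how they might be read and validated (the `loadConfig` helper is illustrative, not part of the project; Bun loads `.env` into `process.env` automatically):

```typescript
// Illustrative helper (not from the repo): read and validate the two
// variables defined in the .env file above.
function loadConfig(env: Record<string, string | undefined>): { port: number; apiKey: string } {
  const port = Number(env.PORT ?? 3000) // default to 3000 when PORT is unset
  const apiKey = env.OPENAI_API_KEY
  if (!apiKey) throw new Error("OPENAI_API_KEY must be set")
  return { port, apiKey }
}

const config = loadConfig({ PORT: "3000", OPENAI_API_KEY: "sk-example" })
console.log(config.port) // 3000
```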

Update the test.rest file:

POST http://localhost:3000
content-type: application/json

{
  "input": "<your-question>"
}

To run:

bun run start

How it works:

// PromptTemplate and StringOutputParser ship with LangChain.js
// (import paths shown for the @langchain/core packages).
import { PromptTemplate } from "@langchain/core/prompts"
import { StringOutputParser } from "@langchain/core/output_parsers"
// `model` (a chat model instance) and PROMPT_TEMPLATE are defined
// elsewhere in the project.

/**
 * Get an answer from the LLM for the given input
 *
 * @param {string} input - The question
 * @return {Promise<string>} - The answer
 */
export async function getResponseFromLLM(input: string): Promise<string> {
  // Create a prompt template
  const prompt = PromptTemplate.fromTemplate(PROMPT_TEMPLATE)

  // Create a string output parser
  const parser = new StringOutputParser()

  // Create a chain: prompt -> model -> parser
  const chain = prompt.pipe(model).pipe(parser)

  // Call the chain
  const response = await chain.invoke({ question: input })
  return response
}
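A sketch of how an Express route might call this function. The handler core is factored into a plain async function (the names `handleAsk` and `AskFn` are illustrative, not from the repo) so the request validation can be seen without a running server:

```typescript
// Illustrative wiring (handler and type names are assumptions, not from the repo).
// The route body accepts any (input: string) => Promise<string>,
// such as getResponseFromLLM.
type AskFn = (input: string) => Promise<string>

export async function handleAsk(
  body: unknown,
  ask: AskFn
): Promise<{ status: number; payload: unknown }> {
  // Mirror the test.rest contract: the body must be { "input": "<question>" }
  if (
    typeof body !== "object" ||
    body === null ||
    typeof (body as { input?: unknown }).input !== "string"
  ) {
    return { status: 400, payload: { error: 'expected { "input": string }' } }
  }
  const answer = await ask((body as { input: string }).input)
  return { status: 200, payload: { answer } }
}
```

In the Express app this would sit behind `app.post("/", ...)`, with `getResponseFromLLM` passed as the `ask` function.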

This project was created with Bun, Express, LangChain.js and OpenAI.
