SimpleChat

This folder provides a complete implementation of a simple chat app based on WebLLM. To try it out, run the following commands in this folder:

npm install
npm start
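
For orientation, the core of such a chat app comes down to loading a model with WebLLM in the browser (via WebGPU) and requesting chat completions from it. The sketch below is a minimal illustration only, assuming the current @mlc-ai/web-llm API (CreateMLCEngine and the OpenAI-style engine.chat.completions.create); the model id is a placeholder and the actual code in this repo may use a different or older WebLLM interface.

```typescript
import { CreateMLCEngine, type MLCEngine } from "@mlc-ai/web-llm";

// Placeholder model id; any model listed in WebLLM's prebuilt config can be used.
const MODEL_ID = "Llama-3.1-8B-Instruct-q4f32_1-MLC";

async function runChat(userMessage: string): Promise<string> {
  // Downloads the model weights and compiles WebGPU kernels on first run.
  const engine: MLCEngine = await CreateMLCEngine(MODEL_ID, {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion against the locally loaded model.
  const reply = await engine.chat.completions.create({
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: userMessage },
    ],
  });

  return reply.choices[0].message.content ?? "";
}

runChat("Hello from WebGPU!").then(console.log);
```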

Note: if you would like to hack on the WebLLM core package, you can change the web-llm dependency to "file:../.." and follow the build-from-source instructions in the project to build WebLLM locally. This option is only recommended if you want to modify the WebLLM core itself.
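
Concretely, the swap might look like the fragment below in this folder's package.json. The "@mlc-ai/web-llm" package name and the "file:../.." relative path are assumptions based on the upstream WebLLM examples layout; adjust them to match how this repo actually declares the dependency, and re-run npm install afterwards so the local build is linked in.

```jsonc
{
  "dependencies": {
    // Assumed package name; the file: path points npm at a local WebLLM
    // checkout two directories up, built from source.
    "@mlc-ai/web-llm": "file:../.."
  }
}
```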

If you are using the Local Server option, first start the local server using the steps outlined here.


Artificial Intelligence Hackathons, Tutorials and Boilerplates

Join the LabLab Discord

On the lablab Discord, we discuss this repo and many other topics related to artificial intelligence! Check out upcoming Artificial Intelligence Hackathon events.

Accelerating innovation through acceleration
