This folder provides a complete implementation of a simple chat app based on WebLLM. To try it out, run the following steps from this folder:
```bash
npm install
npm start
```
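
For reference, the core WebLLM calls that a chat app like this builds on look roughly like the sketch below. This is an illustration only, assuming a recent `@mlc-ai/web-llm` release; the model ID is a placeholder, and the actual code in this folder differs.

```typescript
import { CreateMLCEngine, MLCEngineInterface } from "@mlc-ai/web-llm";

async function chatOnce() {
  // Illustrative model ID; pick any entry from WebLLM's prebuilt model list.
  const selectedModel = "Llama-3.1-8B-Instruct-q4f32_1-MLC";

  // Download (and cache) the model artifacts, then initialize the in-browser engine.
  const engine: MLCEngineInterface = await CreateMLCEngine(selectedModel, {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion, running entirely in the browser via WebGPU.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0].message.content);
}

chatOnce();
```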
Note: if you would like to hack on the WebLLM core package, you can change the web-llm dependency to `"file:../.."` and follow the build-from-source instructions in the project to build WebLLM locally. This option is only recommended if you intend to modify the WebLLM core package.
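
For illustration, a minimal `package.json` dependency entry after this change might look like the following (other fields are omitted; the published package name `@mlc-ai/web-llm` is assumed):

```json
{
  "dependencies": {
    "@mlc-ai/web-llm": "file:../.."
  }
}
```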
If you are using the Local Server option, first start the local server using the steps outlined here.
On the lablab Discord, we discuss this repo and many other topics related to artificial intelligence! Check out upcoming Artificial Intelligence Hackathon events.