Originally posted by hswlab February 2, 2025
Has anyone gotten the latest version of the web project (0.21.0) to work yet? I downloaded the master repository in Visual Studio 2022; various files were downloaded when I rebuilt it, including some models in the LLama.Unittest\Models folder.
The web project starts for me, but when I click on Begin Session, nothing happens; only the loading circle is visible. At first glance, there are no errors in the console output.
I tried a slightly older version a few months ago and it worked straight away. Is it possible that some of the settings in appsettings.json are no longer up to date? For example, the preconfigured "llama-2-7b-chat.Q4_0.gguf" was not downloaded at all. Only these models are in my folder after the project build.
Edit: I'm using a Win10 machine with an NVIDIA GeForce GTX 1070 and a 4 GHz i7-6700K CPU.
...
Edit:
Oh, there seems to be a problem in a SignalR function.
This error happens in wwwroot\js\sessionConnectionChat.js when the following row is called:
I suppose that normally the following Task in Hubs\SessionConnectionHub.cs should be called, but that is not happening for some reason. I can't tell why this SignalR call is not working.
Are the parameters in "connection.invoke('LoadModel', sessionParams, sessionParams)" correct? It looks strange that the same "sessionParams" argument is passed twice.
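To see what the duplicated argument means on the hub side, here is a minimal sketch with a stub standing in for the SignalR HubConnection (the stub and the placeholder session object are illustrative, not taken from the LLamaSharp source):

```javascript
// Minimal stub standing in for a SignalR HubConnection; it only records
// the arguments passed to invoke() so we can inspect them.
function makeStubConnection() {
  const calls = [];
  return {
    calls,
    invoke(method, ...args) {
      calls.push({ method, args });
      return Promise.resolve(); // the real invoke() also returns a Promise
    },
  };
}

// Placeholder session parameters; the real object is built from the form.
const sessionParams = { model: "llama-2-7b-chat.Q4_0.gguf" };
const connection = makeStubConnection();

// The call as it appears in sessionConnectionChat.js:
connection.invoke('LoadModel', sessionParams, sessionParams);

const [call] = connection.calls;
// Both hub parameters receive the very same object, so whatever the hub
// binds as its second parameter is just the session config again.
console.log(call.args[0] === call.args[1]); // true
```

One thing worth checking in the real code: invoke() returns a Promise, and an error raised by the hub surfaces only through that rejected Promise, so attaching a .catch handler to the call may reveal what is going wrong.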
Edit:
I think someone forgot to set the second parameter "inferenceConfig" correctly. This would also explain the TODO comment. If you compare the SessionConfig and InferenceOptions data types, some of their properties are similar. The developer probably intended to fill an InferenceOptions object at this point but never did.
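If that reading is right, the client-side fix would be to build a separate inference object and pass it as the second argument. A hedged sketch, assuming the hub expects a (SessionConfig, InferenceOptions) pair; the helper and its property names (maxTokens, temperature, topP) are illustrative guesses, not taken from the LLamaSharp source:

```javascript
// Hypothetical helper: copy only the inference-related properties out of
// the session form values. The property names here are illustrative.
function buildInferenceConfig(sessionParams) {
  return {
    maxTokens: sessionParams.maxTokens,
    temperature: sessionParams.temperature,
    topP: sessionParams.topP,
  };
}

// The invoke call would then pass two distinct objects:
//   connection.invoke('LoadModel', sessionParams, buildInferenceConfig(sessionParams));

// Example: session-only fields (here a hypothetical "prompt") stay out of
// the inference config.
const cfg = buildInferenceConfig({ maxTokens: 512, temperature: 0.7, topP: 0.9, prompt: "Hi" });
console.log('prompt' in cfg); // false
```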
Does anyone know if this is still being developed? Unfortunately, the current state of the implementation prevents you from using the web application.
I have updated the following functions to get it working again, but I'm not sure if this is a proper solution. Perhaps someone with a little more experience in this project can find a proper fix in due course :)
LLama.Web\wwwroot\js\sessionConnectionChat.js
LLama.Web\Hubs\SessionConnectionHub.cs
In addition, I downloaded "llama-2-7b-chat.Q4_0.gguf" manually and dropped the model into the LLama.Unittest/Models folder, which is configured in appsettings.json.
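For reference, the kind of appsettings.json entry meant here looks roughly like the following; the section and property names ("LLamaOptions", "Models", "ModelPath") are assumptions and may differ from the actual LLama.Web schema, so check the repository for the exact shape:

```json
{
  "LLamaOptions": {
    "Models": [
      {
        "Name": "llama-2-7b-chat.Q4_0.gguf",
        "ModelPath": "..\\LLama.Unittest\\Models\\llama-2-7b-chat.Q4_0.gguf"
      }
    ]
  }
}
```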
Edit: There is no PR for this change because it is just a workaround to get the chatbot answering questions again; I have no clue how to set the InferenceOptions parameter correctly.
hswlab changed the title from "LLamaSharp Web stopped working" to "[BUG]: LLamaSharp Web stopped working" on Feb 3, 2025.
Discussed in #1078