Additionally, there was a closed issue related to parallelism:
Add workflow multitasking parallel support: this issue is closed; the team indicated that parallelism is on the roadmap, but that they are currently prioritizing resolving bottlenecks in the existing process [4].
For handling parallel requests, you can adjust the configuration options in Dify:
SERVER_WORKER_AMOUNT: Configure the number of API server workers.
SERVER_WORKER_AMOUNT: <number_of_workers>
CELERY_WORKER_AMOUNT: Configure the number of Celery workers.
CELERY_WORKER_AMOUNT: <number_of_workers>
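For a self-hosted Docker deployment, these variables are typically set in the `.env` file read by docker-compose. A minimal sketch (the worker counts here are illustrative, not recommendations; size them to your CPU cores and workload):

```shell
# .env fragment for a Dify docker-compose deployment (values are illustrative)
SERVER_WORKER_AMOUNT=5    # API server (gunicorn) workers handling HTTP requests
CELERY_WORKER_AMOUNT=3    # Celery workers handling background/async tasks
```

After changing these values, restart the containers so the new worker counts take effect.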
These settings can help manage parallel processing more efficiently [5].
For the 'Query or prefix prompt is too long' error, consider the following optimizations:
Reduce the prefix prompt length.
Adjust the max_tokens parameter to a smaller value.
Use a model with a larger token limit, such as GPT-4.
Optimize the prompt structure by separating context, pre-prompt, history, and query [6][7][8][9].
Regarding the current state of multimodal model support, Dify supports ASR models and rich-text models up to GPT-4 specifications. The documentation lists various model providers and their supported capabilities, including speech-to-text and text-to-speech functionalities [10][11].
@dosu
Uploading pictures via URL triggers another error:
[ollama] Error: API request failed with status code 400: {"error":"illegal base64 data at input byte 5"}
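This error message is consistent with the raw URL string being sent where base64-encoded image bytes are expected: in a URL such as `https://...`, byte index 5 is the `:` character, which is outside the base64 alphabet. A small sketch demonstrating the mismatch (the helper name `looks_like_base64` is hypothetical, for illustration only):

```python
import base64
import binascii

def looks_like_base64(data: str) -> bool:
    """Return True only if `data` is strictly valid base64."""
    try:
        base64.b64decode(data, validate=True)
        return True
    except binascii.Error:
        return False

# A raw URL fails base64 validation: the ':' at byte 5 is not in the
# base64 alphabet, matching the "illegal base64 data at input byte 5" error.
print(looks_like_base64("https://example.com/cat.png"))  # → False

# Actual image bytes, base64-encoded, validate cleanly.
encoded = base64.b64encode(b"raw image bytes").decode()
print(looks_like_base64(encoded))  # → True
```

In other words, for an image supplied as a URL, the image would need to be downloaded and base64-encoded before being passed to the Ollama API, rather than forwarding the URL string itself.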
Self Checks
Dify version
0.8.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
ollama log:
Dify:
✔️ Expected Behavior
No response
❌ Actual Behavior
No response