diff --git a/_posts/2023-12-10-MoE.md b/_posts/2023-12-10-MoE.md
index 574788da588..41a04edf7fc 100644
--- a/_posts/2023-12-10-MoE.md
+++ b/_posts/2023-12-10-MoE.md
@@ -291,6 +291,14 @@ Before Mistral released this open-source 7B×8E MoE, NVIDIA and Google had also released
 
 #### Trying out Mistral-MoE
 
+Try it with ollama:
+- [dolphin-mixtral](https://ollama.ai/library/dolphin-mixtral/tags)
+
+```sh
+ollama run dolphin-mixtral
+# see the link above for more versions and models
+```
+
 Web demo:
 - [mixtral-8x7b-instruct](https://app.fireworks.ai/models/fireworks/mixtral-8x7b-instruct)
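
Besides the web UI, Fireworks also serves models over an OpenAI-compatible HTTP API, so the same Mixtral instruct model can be queried from the shell. A minimal sketch, assuming the model id `accounts/fireworks/models/mixtral-8x7b-instruct` and an API key exported as `FIREWORKS_API_KEY` (both should be verified against the Fireworks docs):

```sh
# Sketch: OpenAI-compatible chat completion against Fireworks.
# Endpoint path and model id are assumptions -- check the Fireworks docs.
curl https://api.fireworks.ai/inference/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $FIREWORKS_API_KEY" \
  -d '{
    "model": "accounts/fireworks/models/mixtral-8x7b-instruct",
    "messages": [{"role": "user", "content": "Why is MoE inference cheaper per token?"}]
  }'
```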