Step-by-Step Guide for Setting Up Ragflow on Google Cloud + Ollama Integration #2403
Chenjie669 started this conversation in Show and tell
Hello everyone!
I've recently put together a detailed tutorial for setting up Ragflow on Google Cloud and running models with Ollama, including deploying llama3.1:70b. If you're looking to integrate large language models into a cloud environment, this guide is for you!
The tutorial covers:
- Setting up a Google Cloud Compute Engine VM
- Installing Docker and the necessary dependencies
- Configuring Ragflow to work with locally deployed models
- Running Ollama with the llama3.1:70b model
Rough command sketches for these steps are included below.
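To give a flavor of the first two steps, here is a minimal sketch. The instance name, zone, machine type, and disk size are placeholders rather than the tutorial's exact values (a 70B model needs on the order of 64+ GB of RAM, or a large GPU, to run at a usable speed):

```bash
# Hypothetical GCE VM sized for a large model (names, zone, and sizes are placeholders).
gcloud compute instances create ragflow-ollama \
  --zone=us-central1-a \
  --machine-type=n2-highmem-16 \
  --image-family=ubuntu-2204-lts \
  --image-project=ubuntu-os-cloud \
  --boot-disk-size=200GB

# On the VM: install Docker via the official convenience script, then let
# the current user run containers without sudo (log out and back in afterwards).
curl -fsSL https://get.docker.com | sudo sh
sudo usermod -aG docker "$USER"
```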
Whether you're new to cloud environments or looking to optimize your setup for large models, this guide can help streamline the process.
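For the model-serving side, here is a rough sketch of how Ollama and Ragflow are commonly wired together. It follows the standard quickstarts for both projects (the Ragflow compose path and the Ollama endpoint are assumptions, and the exact steps in the tutorial may differ):

```bash
# Install Ollama natively and pull the 70B model (the download is roughly 40 GB).
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.1:70b

# Start Ragflow from its official docker-compose setup (the file layout can
# differ between Ragflow versions).
git clone https://github.com/infiniflow/ragflow.git
cd ragflow/docker
docker compose up -d

# Sanity-check that Ollama is serving. When adding Ollama as a model provider
# in the Ragflow UI, use an address reachable from inside the Ragflow container
# (e.g. the VM's internal IP on port 11434) rather than "localhost", since
# localhost inside the container refers to the container itself.
curl http://localhost:11434/api/tags
```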
Feel free to check it out here: https://github.com/Chenjie669/ragflow-google-cloud-setup/tree/main
I'd love to hear your feedback or answer any questions you might have!
Replies: 1 comment
- Appreciate that much!