- In your Google Colab notebook, install the necessary Python packages.
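A typical set for this walkthrough (assuming the xterm-plus-LangChain route used here; exact package names can vary between LangChain releases) is:
!pip install colab-xterm langchain langchain-community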
- Load the xterm extension to use a terminal within a Colab notebook.
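Assuming the colab-xterm package installed in the previous step, the extension is loaded with:
%load_ext colabxterm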
- Open the terminal inside the Colab cell by running:
%xterm
- Inside the xterm terminal (opened within the Colab cell):
- Type the following command to install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
- Type the following command to start the Ollama server in the background and run the Llama 3 model:
ollama serve & ollama run llama3
- After leaving the xterm terminal, import the Ollama class from the LangChain community library.
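A minimal sketch, assuming a recent LangChain where the wrapper lives in langchain_community.llms (older releases exposed it from langchain.llms) and that the model name matches the one pulled above:
from langchain_community.llms import Ollama
# Connects to the local Ollama server started in the xterm terminal.
llm = Ollama(model="llama3")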
- Use the llm.invoke() method to prompt the model and receive its response.
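For example (the prompt here is purely illustrative):
# Send a prompt to the model and print the generated text.
response = llm.invoke("Why is the sky blue?")
print(response)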
Reference: https://www.youtube.com/watch?v=LN9rlGNaXUA&ab_channel=AkashDawari