
Function LLM for Python


Use local LLMs in your Python apps, with GPU acceleration and zero dependencies. This package patches the OpenAI and Anthropic clients so that inference runs locally, using predictors hosted on Function.

Tip

We offer a similar package for use in the browser and Node.js. Check out fxn-llm-js.

Important

This package is still a work-in-progress, so the API can change drastically between releases.

Installing Function LLM

Function LLM is distributed on PyPI. To install it, open a terminal and run the following command:

# Install Function LLM
$ pip install --upgrade fxn-llm
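
You can confirm the install succeeded by asking pip for the package metadata:

# Verify the installed package and version
$ pip show fxn-llm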

Note

Function LLM requires Python 3.10+

Important

Make sure to create an access key by signing in to Function. You'll need it to fetch predictors at runtime.
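
Function's Python SDK conventionally reads the access key from the FXN_ACCESS_KEY environment variable. Assuming fxn-llm follows the same convention (check the package docs for the exact mechanism), you can export the key before running your app:

# Assumption: fxn-llm picks up the key from FXN_ACCESS_KEY, like Function's Python SDK
$ export FXN_ACCESS_KEY="<your-access-key>"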

Using the OpenAI Client Locally

To run text generation and embedding models locally using the OpenAI client, patch your OpenAI instance with the locally function:

from openai import OpenAI
from fxn_llm import locally

# 💥 Create your OpenAI client
openai = OpenAI()

# 🔥 Make it local
openai = locally(openai)

# 🚀 Generate embeddings
embeddings = openai.embeddings.create(
    model="@nomic/nomic-embed-text-v1.5-quant",
    input="search_query: Hello world!"
)

Warning

Currently, only openai.embeddings.create is supported. Text generation is coming soon!
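
The patched call returns a standard OpenAI embeddings response, so each vector is available at response.data[0].embedding. As a minimal sketch, using only the single-string embeddings.create call shown above, here is how you might compare a query against a document with cosine similarity in pure Python; the search_query:/search_document: prefixes are the task prefixes the Nomic embedding models expect:

from math import sqrt

from openai import OpenAI
from fxn_llm import locally

openai = locally(OpenAI())

# Embed a query and a candidate document separately
query = openai.embeddings.create(
    model="@nomic/nomic-embed-text-v1.5-quant",
    input="search_query: What is the capital of France?"
).data[0].embedding
document = openai.embeddings.create(
    model="@nomic/nomic-embed-text-v1.5-quant",
    input="search_document: Paris is the capital of France."
).data[0].embedding

# Cosine similarity: dot product divided by the product of magnitudes
dot = sum(a * b for a, b in zip(query, document))
norm = sqrt(sum(a * a for a in query)) * sqrt(sum(b * b for b in document))
print(f"similarity: {dot / norm:.3f}")

A score near 1 means the query and document are semantically close; scores near 0 mean they are unrelated.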


Useful Links

Function is a product of NatML Inc.