SambaNova API QuickStart Guide

This guide walks through setting up an API key, performing a few sample queries with and without LangChain, and points to example applications that bootstrap development for common AI use cases, all with open-source Python code on the SambaNova GitHub page. Let's get started!

Setting up SambaNova API Key

  1. Create an account on the SambaNova Developer Portal to get an API key.
  2. Once logged in, navigate to the API section and generate a new key.
  3. Set your API key as an environment variable:
    export SAMBANOVA_API_KEY=<your-api-key-here>
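
As a quick sanity check before running the examples below, you can confirm the key is visible from Python. A minimal check that only reads the variable you just exported:

import os

# Fails fast if the environment variable was not exported in this shell
assert os.environ.get("SAMBANOVA_API_KEY"), "SAMBANOVA_API_KEY is not set"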

Supported Models

| Model | Context Length (tokens) | Output Length (tokens) | Dtype / Precision |
| --- | --- | --- | --- |
| Meta-Llama-3.1-8B-Instruct | 8192 | 1000 | BF16 |
| Meta-Llama-3.1-70B-Instruct | 8192 | 1000 | BF16 |
| Meta-Llama-3.1-405B-Instruct | 4096 | 1000 | BF16 |

Query the API

Install the OpenAI Python library:

pip install openai

Perform a chat completion:

import os

from openai import OpenAI

# Read the API key set earlier from the environment
api_key = os.environ.get("SAMBANOVA_API_KEY")

# Point the OpenAI client at the SambaNova endpoint
client = OpenAI(
    base_url="https://fast-api.snova.ai/v1/",
    api_key=api_key,
)

model = "llama3-405b"
prompt = "Tell me a joke about artificial intelligence."

# Request a streamed chat completion
completion = client.chat.completions.create(
    model=model,
    messages=[
        {
            "role": "user", 
            "content": prompt,
        }
    ],
    stream=True,
)

response = ""
for chunk in completion:
    response += chunk.choices[0].delta.content or ""

print(response)
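
If you don't need token-by-token streaming, you can omit stream=True and read the full message directly. A minimal variant, reusing the client, model, and prompt defined above:

completion = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": prompt}],
)

# The complete reply is available on the first (and only) choice
print(completion.choices[0].message.content)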

Using SambaNova APIs with LangChain

Here's an example of using SambaNova's APIs with the LangChain library (this assumes the langchain-openai package is installed, e.g. pip install langchain-openai):

import os
from langchain_openai import ChatOpenAI

api_key = os.environ.get("SAMBANOVA_API_KEY")

# Configure LangChain's ChatOpenAI wrapper to use the SambaNova endpoint
llm = ChatOpenAI(
    base_url="https://fast-api.snova.ai/v1/",  
    api_key=api_key,
    streaming=True,
    model="llama3-70b",
)

response = llm.invoke("What is the capital of France?")
print(response.content)

This code snippet demonstrates how to set up a LangChain ChatOpenAI instance against SambaNova's API by specifying the API key, base URL, streaming option, and model. You can then use the llm object to generate completions by passing in prompts, as shown in the sketch below.
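
For example, the same llm can be composed into a simple prompt-and-model chain. A minimal sketch, assuming the langchain-core package is installed alongside langchain-openai:

from langchain_core.prompts import ChatPromptTemplate

# Build a small chain: prompt template -> SambaNova-backed chat model
prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | llm

# Because streaming=True was set above, chunks arrive as they are generated
for chunk in chain.stream({"question": "What is the capital of France?"}):
    print(chunk.content, end="", flush=True)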

Starter Applications

SambaNova AI Starter Kits help you build quickly, bootstrapping application development for common AI use cases with open-source Python code in the SambaNova GitHub repository. They let you see how the code works and customize it to your needs, so you can prove the business value of AI. Here are some of the most popular kits:

| Application | Description | Demo | Source Code |
| --- | --- | --- | --- |
| Enterprise Knowledge Retrieval Chatbot | Build a retrieval-augmented generation (RAG) chatbot using your enterprise documents | Live Demo | Source Code |
| Conversational Search Assistant | Semantic search using search engine snippets | Live Demo | Source Code |
| Financial Assistant | Agentic finance assistant built on our API | - | Source Code |
| Function Calling | Tool-calling implementation and generic function-calling module | - | Source Code |
| Benchmarking Kit | Evaluates performance of multiple LLM models in SambaStudio | Live Demo | Source Code |

Get Help

  • Check out the SambaNova support documentation for additional help
  • Find answers and post questions in the SambaNova Community
  • Join our SambaNova Discord for discussions and support
  • Let us know your most wanted features and challenges via the channels above
  • More inference models, longer context lengths, and embeddings models are coming soon!

Contribute

Building something cool? We welcome contributions to the SambaNova Quickstarts repository! If you have ideas for new quickstart projects or improvements to existing ones, please open an issue or submit a pull request and we'll respond right away.