
Commit 30139c4

update
1 parent 089f5e6 commit 30139c4

11 files changed: +250 -76 lines changed

docs/docs/integrations/llms/rwkv.mdx

Lines changed: 72 additions & 0 deletions
@@ -0,0 +1,72 @@
# RWKV-4

>[RWKV](https://www.rwkv.com/) (pronounced RwaKuv) is an RNN language model
> with GPT-level LLM performance, and it can also be directly trained
> like a GPT transformer (parallelizable).
>
>It combines the best of the RNN and the transformer: great performance, fast inference,
> fast training, VRAM savings, "infinite" context length, and free text embeddings.
> Moreover, it is 100% attention-free and an LFAI project.


## Installation and Setup

- Install the Python `rwkv` and `tokenizer` packages

```bash
pip install rwkv tokenizer
```

- Download an [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
- Download a [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)

### RWKV-4 models: recommended VRAM

| Model | 8bit  | bf16/fp16 | fp32  |
|-------|-------|-----------|-------|
| 14B   | 16GB  | 28GB      | >50GB |
| 7B    | 8GB   | 14GB      | 28GB  |
| 3B    | 2.8GB | 6GB       | 12GB  |
| 1b5   | 1.3GB | 3GB       | 6GB   |

See the [rwkv pip](https://pypi.org/project/rwkv/) page for more information about strategies,
including streaming and CUDA support.

## Usage

### RWKV

To use the RWKV wrapper, you need to provide the path to the pre-trained model file and the tokenizer's configuration.

```python
from langchain_community.llms import RWKV
```

Test the model:

```python
def generate_prompt(instruction, input=None):
    if input:
        return f"""Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

# Instruction:
{instruction}

# Input:
{input}

# Response:
"""
    else:
        return f"""Below is an instruction that describes a task. Write a response that appropriately completes the request.

# Instruction:
{instruction}

# Response:
"""


model = RWKV(model="./models/RWKV-4-Raven-3B-v7-Eng-20230404-ctx4096.pth", strategy="cpu fp32", tokens_path="./rwkv/20B_tokenizer.json")
response = model.invoke(generate_prompt("Once upon a time, "))
```
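The `strategy` shown above runs everything on the CPU in fp32. As an illustrative sketch (assuming a CUDA-capable GPU with enough VRAM for the offloaded layers), the same wrapper can use a split GPU/CPU strategy string as documented by the rwkv package:

```python
from langchain_community.llms import RWKV

# Offload the first 8 layers to the GPU in fp16 and keep the rest on the CPU in fp32.
# Adjust the layer count and precision to your hardware; see the rwkv pip page for the full strategy syntax.
gpu_model = RWKV(
    model="./models/RWKV-4-Raven-3B-v7-Eng-20230404-ctx4096.pth",
    strategy="cuda fp16 *8 -> cpu fp32",
    tokens_path="./rwkv/20B_tokenizer.json",
)
print(gpu_model.invoke("Once upon a time, "))
```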

docs/docs/integrations/platforms/microsoft.mdx

Lines changed: 18 additions & 23 deletions
@@ -237,6 +237,8 @@ See a [usage example](/docs/integrations/document_loaders/microsoft_onenote).
 from langchain_community.document_loaders.onenote import OneNoteLoader
 ```
 
+## Vectorstores
+
 ### Playwright URL Loader
 
 >[Playwright](https://github.com/microsoft/playwright) is an open-source automation tool
@@ -271,8 +273,6 @@ Below are two available Azure Cosmos DB APIs that can provide vector store funct
 > You can apply your MongoDB experience and continue to use your favorite MongoDB drivers, SDKs, and tools by pointing your application to the API for MongoDB vCore account's connection string.
 > Use vector search in Azure Cosmos DB for MongoDB vCore to seamlessly integrate your AI-based applications with your data that's stored in Azure Cosmos DB.
 
-#### Installation and Setup
-
 See [detail configuration instructions](/docs/integrations/vectorstores/azure_cosmos_db).
 
 We need to install `pymongo` python package.
@@ -281,14 +281,6 @@ We need to install `pymongo` python package.
 pip install pymongo
 ```
 
-#### Deploy Azure Cosmos DB on Microsoft Azure
-
-Azure Cosmos DB for MongoDB vCore provides developers with a fully managed MongoDB-compatible database service for building modern applications with a familiar architecture.
-
-With Cosmos DB for MongoDB vCore, developers can enjoy the benefits of native Azure integrations, low total cost of ownership (TCO), and the familiar vCore architecture when migrating existing applications or building new ones.
-
-[Sign Up](https://azure.microsoft.com/en-us/free/) for free to get started today.
-
 See a [usage example](/docs/integrations/vectorstores/azure_cosmos_db).
 
 ```python
@@ -299,12 +291,7 @@ from langchain_community.vectorstores import AzureCosmosDBVectorSearch
 
 >[Azure Cosmos DB for NoSQL](https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/vector-search) now offers vector indexing and search in preview.
 This feature is designed to handle high-dimensional vectors, enabling efficient and accurate vector search at any scale. You can now store vectors
-directly in the documents alongside your data. This means that each document in your database can contain not only traditional schema-free data,
-but also high-dimensional vectors as other properties of the documents. This colocation of data and vectors allows for efficient indexing and searching,
-as the vectors are stored in the same logical unit as the data they represent. This simplifies data management, AI application architectures, and the
-efficiency of vector-based operations.
-
-#### Installation and Setup
+directly in the documents alongside your data.
 
 See [detail configuration instructions](/docs/integrations/vectorstores/azure_cosmos_db_no_sql).
 
@@ -314,20 +301,14 @@ We need to install `azure-cosmos` python package.
 pip install azure-cosmos
 ```
 
-#### Deploy Azure Cosmos DB on Microsoft Azure
-
-Azure Cosmos DB offers a solution for modern apps and intelligent workloads by being very responsive with dynamic and elastic autoscale. It is available
-in every Azure region and can automatically replicate data closer to users. It has SLA guaranteed low-latency and high availability.
-
-[Sign Up](https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/quickstart-python?pivots=devcontainer-codespace) for free to get started today.
-
 See a [usage example](/docs/integrations/vectorstores/azure_cosmos_db_no_sql).
 
 ```python
 from langchain_community.vectorstores import AzureCosmosDBNoSQLVectorSearch
 ```
 
 ## Retrievers
+
 ### Azure AI Search
 
 >[Azure AI Search](https://learn.microsoft.com/en-us/azure/search/search-what-is-azure-search) (formerly known as `Azure Search` or `Azure Cognitive Search` ) is a cloud search service that gives developers infrastructure, APIs, and tools for building a rich search experience over private, heterogeneous content in web, mobile, and enterprise applications.
@@ -445,6 +426,20 @@ See a [usage example](/docs/integrations/tools/playwright).
 from langchain_community.agent_toolkits import PlayWrightBrowserToolkit
 ```
 
+## Memory
+
+### Azure CosmosDB Chat Message History
+
+We need to install a python package.
+
+```bash
+pip install azure-cosmos
+```
+
+```python
+from langchain_community.chat_message_histories import CosmosDBChatMessageHistory
+```
+
 ## Graphs
 
 ### Azure Cosmos DB for Apache Gremlin
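A minimal sketch of wiring up the newly documented chat message history, assuming the constructor takes the endpoint, database, container, session, user, and credential arguments shown below (all account values are placeholders):

```python
from azure.identity import DefaultAzureCredential  # pip install azure-identity
from langchain_community.chat_message_histories import CosmosDBChatMessageHistory

# Placeholder Cosmos DB account details; replace with your own.
history = CosmosDBChatMessageHistory(
    cosmos_endpoint="https://<your-account>.documents.azure.com:443/",
    cosmos_database="chat_db",
    cosmos_container="chat_histories",
    session_id="session-1",
    user_id="user-1",
    credential=DefaultAzureCredential(),
)
history.prepare_cosmos()  # creates the database and container if they do not exist
history.add_user_message("Hello!")
history.add_ai_message("Hi, how can I help?")
print(history.messages)
```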

docs/docs/integrations/platforms/openai.mdx

Lines changed: 8 additions & 0 deletions
@@ -92,6 +92,14 @@ See a [usage example](/docs/integrations/tools/dalle_image_generator).
 from langchain_community.utilities.dalle_image_generator import DallEAPIWrapper
 ```
 
+### ChatGPT Plugins
+
+See a [usage example](/docs/integrations/tools/chatgpt_plugins).
+
+```python
+from langchain_community.tools import AIPluginTool
+```
+
 ## Adapter
 
 See a [usage example](/docs/integrations/adapters/openai).
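As an illustrative sketch, `AIPluginTool` can be built from a plugin manifest URL; the URL below is a placeholder for a real `.well-known/ai-plugin.json` file:

```python
from langchain_community.tools import AIPluginTool

# Placeholder manifest URL; point this at a real ai-plugin.json so the spec can be fetched.
tool = AIPluginTool.from_plugin_url("https://example.com/.well-known/ai-plugin.json")
print(tool.name)
print(tool.description[:200])  # the description embeds the plugin's OpenAPI spec for the agent
```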
Lines changed: 24 additions & 0 deletions
@@ -0,0 +1,24 @@
# Aerospike

>[Aerospike Vector Search](https://aerospike.com/docs/vector) (AVS) is an extension to
> the `Aerospike Database` that enables searches across very large datasets stored in `Aerospike`.
> This new service lives outside of `Aerospike` and builds an index to perform those searches.


## Installation and Setup

You need a running `AVS` instance. Use one of the [installation methods](https://aerospike.com/docs/vector/install).

You also need to install the `aerospike-vector-search` python package.

```bash
pip install aerospike-vector-search
```

## Vectorstore

See a [usage example](/docs/integrations/vectorstores/aerospike).

```python
from langchain_community.vectorstores import Aerospike
```
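A rough, untested sketch of how the vectorstore might be constructed; the client import path, the constructor argument names, and all host, namespace, and index values are assumptions, so check the usage example for the exact API:

```python
from aerospike_vector_search import Client, types  # assumed AVS client API
from langchain_community.embeddings import FakeEmbeddings
from langchain_community.vectorstores import Aerospike

# Placeholder connection to a locally running AVS instance.
client = Client(seeds=types.HostPort(host="localhost", port=5000))

# Assumed argument names; the namespace and index are expected to exist in AVS already.
store = Aerospike(
    client=client,
    embedding=FakeEmbeddings(size=512),
    namespace="test",
    index_name="langchain-demo",
)
store.add_texts(["Aerospike Vector Search integrates with LangChain."])
print(store.similarity_search("vector search", k=1))
```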

docs/docs/integrations/providers/ai21.mdx

Lines changed: 7 additions & 0 deletions
@@ -34,6 +34,13 @@ serving as a context, and a question and return an answer based entirely on this
 from langchain_ai21 import AI21ContextualAnswers
 ```
 
+### AI21 Community
+
+```python
+from langchain_community.llms import AI21
+```
+
+
 
 ## Chat models
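A minimal sketch of the community `AI21` LLM, assuming the wrapper reads the `AI21_API_KEY` environment variable and accepts a `model` argument (both values below are placeholders):

```python
import os

from langchain_community.llms import AI21

os.environ["AI21_API_KEY"] = "your-api-key"  # placeholder key

llm = AI21(model="j2-ultra")  # assumed Jurassic-2 model name
print(llm.invoke("Write a one-line tagline for a coffee shop."))
```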
docs/docs/integrations/providers/ainetwork.mdx

Lines changed: 16 additions & 0 deletions
@@ -13,6 +13,22 @@ You need to install `ain-py` python package.
 pip install ain-py
 ```
 You need to set the `AIN_BLOCKCHAIN_ACCOUNT_PRIVATE_KEY` environmental variable to your AIN Blockchain Account Private Key.
+
+## Tools
+
+Tools that help you interact with the `AINetwork` blockchain. They are all included
+in the `AINetworkToolkit` toolkit.
+
+See a [usage example](/docs/integrations/toolkits/ainetwork).
+
+```python
+from langchain_community.tools import AINAppOps
+from langchain_community.tools import AINOwnerOps
+from langchain_community.tools import AINRuleOps
+from langchain_community.tools import AINTransfer
+from langchain_community.tools import AINValueOps
+```
+
 ## Toolkit
 
 See a [usage example](/docs/integrations/tools/ainetwork).
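As a short sketch, the individual tools are usually obtained through the toolkit rather than constructed one by one; this assumes `ain-py` is installed and the private-key environment variable is set:

```python
from langchain_community.agent_toolkits.ainetwork.toolkit import AINetworkToolkit

# Assumes AIN_BLOCKCHAIN_ACCOUNT_PRIVATE_KEY is set in the environment.
toolkit = AINetworkToolkit()
for tool in toolkit.get_tools():  # AINAppOps, AINOwnerOps, AINRuleOps, AINTransfer, AINValueOps
    print(tool.name, "-", tool.description[:60])
```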
Lines changed: 34 additions & 0 deletions
@@ -0,0 +1,34 @@
# Amadeus

>[Amadeus Travel APIs](https://developers.amadeus.com/) give instant access to over 400 airlines, 150,000 hotels, and 300,000 tours & activities.

## Installation and Setup

To use the `Amadeus` integration, you need an `API key` from `Amadeus`.
See the [instructions here](https://developers.amadeus.com/get-started/get-started-with-self-service-apis-335).

We also have to install the `amadeus` python package:

```bash
pip install amadeus
```

## Tools

Tools that help you interact with the `Amadeus travel APIs`. They are all included
in the `Amadeus` toolkit.

See a [usage example](/docs/integrations/toolkits/amadeus).

```python
from langchain_community.tools.amadeus import AmadeusClosestAirport
from langchain_community.tools.amadeus import AmadeusFlightSearch
```

## Toolkit

See a [usage example](/docs/integrations/toolkits/amadeus).

```python
from langchain_community.agent_toolkits.amadeus.toolkit import AmadeusToolkit
```
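A minimal sketch of the toolkit, assuming it reads `AMADEUS_CLIENT_ID` and `AMADEUS_CLIENT_SECRET` from the environment (placeholder values below); depending on the installed version, the toolkit may also accept an `llm` argument used by the closest-airport tool:

```python
import os

from langchain_community.agent_toolkits.amadeus.toolkit import AmadeusToolkit

# Placeholder credentials for the Amadeus self-service APIs.
os.environ["AMADEUS_CLIENT_ID"] = "your-client-id"
os.environ["AMADEUS_CLIENT_SECRET"] = "your-client-secret"

toolkit = AmadeusToolkit()
print([tool.name for tool in toolkit.get_tools()])  # closest-airport and flight-search tools
```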
Lines changed: 27 additions & 0 deletions
@@ -0,0 +1,27 @@
# MongoDB Motor

>[MongoDB](https://www.mongodb.com/) is a source-available, cross-platform, document-oriented
> database program. Classified as a `NoSQL` database product, `MongoDB` utilizes JSON-like
> documents with optional schemas.
>
> [Motor](https://pypi.org/project/motor/) is a full-featured, non-blocking `MongoDB` driver
> for Python `asyncio` and `Tornado` applications. `Motor` presents a coroutine-based
> API for non-blocking access to MongoDB.

## Installation and Setup

We need to set up the configuration parameters for the MongoDB database. See instructions [here](/docs/integrations/document_loaders/mongodb/).

We also need to install the `motor` python package.

```bash
pip install motor
```

## Document Loader

See a [usage example](/docs/integrations/document_loaders/mongodb/).

```python
from langchain_community.document_loaders.mongodb import MongodbLoader
```
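A minimal sketch of the loader, assuming the constructor takes a connection string, database, and collection name as shown; the connection details and the optional filter are placeholders:

```python
from langchain_community.document_loaders.mongodb import MongodbLoader

# Placeholder connection details for a local MongoDB instance.
loader = MongodbLoader(
    connection_string="mongodb://localhost:27017/",
    db_name="sample_restaurants",
    collection_name="restaurants",
    filter_criteria={"borough": "Queens"},  # optional; availability may depend on the installed version
)

docs = loader.load()
print(len(docs), docs[0].page_content[:100])
```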
Lines changed: 13 additions & 53 deletions
@@ -1,65 +1,25 @@
 # RWKV-4
 
-This page covers how to use the `RWKV-4` wrapper within LangChain.
-It is broken into two parts: installation and setup, and then usage with an example.
+>[RWKV](https://www.rwkv.com/) (pronounced RwaKuv) language model is an RNN
+> with GPT-level LLM performance,
+> and it can also be directly trained like a GPT transformer (parallelizable).
 
 ## Installation and Setup
-- Install the Python package with `pip install rwkv`
-- Install the tokenizer Python package with `pip install tokenizer`
-- Download a [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
-- Download the [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)
-
-## Usage
-
-### RWKV
-
-To use the RWKV wrapper, you need to provide the path to the pre-trained model file and the tokenizer's configuration.
-```python
-from langchain_community.llms import RWKV
-
-# Test the model
-
-```python
-
-def generate_prompt(instruction, input=None):
-    if input:
-        return f"""Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
 
-# Instruction:
-{instruction}
+- Install the Python `rwkv` and `tokenizer` packages
 
-# Input:
-{input}
-
-# Response:
-"""
-    else:
-        return f"""Below is an instruction that describes a task. Write a response that appropriately completes the request.
-
-# Instruction:
-{instruction}
-
-# Response:
-"""
-
-
-model = RWKV(model="./models/RWKV-4-Raven-3B-v7-Eng-20230404-ctx4096.pth", strategy="cpu fp32", tokens_path="./rwkv/20B_tokenizer.json")
-response = model.invoke(generate_prompt("Once upon a time, "))
+```bash
+pip install rwkv tokenizer
 ```
-## Model File
+- Download a [RWKV model](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) and place it in your desired directory
+- Download a [tokens file](https://raw.githubusercontent.com/BlinkDL/ChatRWKV/main/20B_tokenizer.json)
 
-You can find links to model file downloads at the [RWKV-4-Raven](https://huggingface.co/BlinkDL/rwkv-4-raven/tree/main) repository.
+## LLMs
 
-### Rwkv-4 models -> recommended VRAM
+### RWKV
 
+See a [usage example](/docs/integrations/llms/rwkv).
 
+```python
+from langchain_community.llms import RWKV
 ```
-RWKV VRAM
-Model | 8bit | bf16/fp16 | fp32
-14B | 16GB | 28GB | >50GB
-7B | 8GB | 14GB | 28GB
-3B | 2.8GB| 6GB | 12GB
-1b5 | 1.3GB| 3GB | 6GB
-```
-
-See the [rwkv pip](https://pypi.org/project/rwkv/) page for more information about strategies, including streaming and cuda support.
