Update LLMInterface to restore LC compatibility #416
Draft
stellasia wants to merge 19 commits into neo4j:main from stellasia:feature/improved-llm-interface
Changes from all commits (19 commits):
4766d90 Update LLMInterface to restore LC compatibility (stellasia)
014af4e Update AnthropicLLM (stellasia)
eb9c91c Update MistralAILLM (stellasia)
dbc2090 Update OllamaLLM (stellasia)
dcae75d Update CohereLLM (stellasia)
5695237 Mypy / ruff (stellasia)
422fc11 Update VertexAILLM (stellasia)
b524b1a Update (a)invoke_with_tools methods in the same way (stellasia)
0aa7cee Rename method and return directly list[LLMMessage] (stellasia)
7537643 Update GraphRAG to restore full LC compatibility (stellasia)
eeefa8a Test for the utils functions (stellasia)
13bb7b7 WIP: update tests (stellasia)
5ea2f37 Improve test coverage for utils and base modules (stellasia)
9196764 Fix UT OpenAILLM (stellasia)
8a94f1a Update Ollama tests (stellasia)
7c22e73 Update Ollama/Anthropic (stellasia)
8e72335 WIP update cohere (stellasia)
a34c760 CHANGELOG.md (stellasia)
7a4d4a0 Ruff after rebase (stellasia)
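Based on the commit messages ("Rename method and return directly list[LLMMessage]") and the updated examples below, the PR appears to make `invoke` accept a plain string, a `list[LLMMessage]`, or a message-history object, normalized internally to a `list[LLMMessage]`. A minimal sketch of that normalization, with hypothetical names and stand-in classes (not the library's actual API):

```python
from typing import TypedDict, Union


class LLMMessage(TypedDict):
    # Same shape as neo4j_graphrag.types.LLMMessage in the examples below
    role: str
    content: str


class MessageHistory:
    """Stand-in for a message-history container such as InMemoryMessageHistory."""

    def __init__(self, messages: list[LLMMessage]) -> None:
        self.messages = messages


def normalize_input(
    inp: Union[str, list[LLMMessage], MessageHistory],
) -> list[LLMMessage]:
    """Hypothetical helper: coerce any accepted input into a list of messages."""
    if isinstance(inp, str):
        # A bare string becomes a single user message
        return [{"role": "user", "content": inp}]
    if isinstance(inp, MessageHistory):
        # A history object contributes its stored messages
        return list(inp.messages)
    # Already a list of LLMMessage dicts
    return inp
```

With this shape, every backend implementation can be written against a single `list[LLMMessage]` type, which is what lets both the legacy string calls and the new message-based calls keep working.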
File renamed without changes.
Updated Anthropic example (final state after the diff):

```python
from neo4j_graphrag.llm import AnthropicLLM, LLMResponse
from neo4j_graphrag.types import LLMMessage

# set api key here or in the ANTHROPIC_API_KEY env var
api_key = None

messages: list[LLMMessage] = [
    {
        "role": "system",
        "content": "You are a seasoned actor and expert performer, renowned for your one-man shows and comedic talent.",
    },
    {
        "role": "user",
        "content": "say something",
    },
]

llm = AnthropicLLM(
    model_name="claude-3-opus-20240229",
    model_params={"max_tokens": 1000},  # max_tokens must be specified
    api_key=api_key,
)
# invoke() still accepts a plain string, e.g. llm.invoke("say something")
res: LLMResponse = llm.invoke(messages)
print(res.content)
```
Updated Cohere example (final state after the diff):

```python
from neo4j_graphrag.llm import CohereLLM, LLMResponse
from neo4j_graphrag.types import LLMMessage

# set api key here or in the CO_API_KEY env var
api_key = None

messages: list[LLMMessage] = [
    {
        "role": "system",
        "content": "You are a seasoned actor and expert performer, renowned for your one-man shows and comedic talent.",
    },
    {
        "role": "user",
        "content": "say something",
    },
]

llm = CohereLLM(
    model_name="command-r",
    api_key=api_key,
)
# invoke() still accepts a plain string, e.g. llm.invoke("say something")
res: LLMResponse = llm.invoke(input=messages)
print(res.content)
```
This file was deleted.
Updated MistralAI example (final state after the diff):

```python
from neo4j_graphrag.llm import MistralAILLM, LLMResponse
from neo4j_graphrag.message_history import InMemoryMessageHistory
from neo4j_graphrag.types import LLMMessage

# set api key here or in the MISTRAL_API_KEY env var
api_key = None

messages: list[LLMMessage] = [
    {
        "role": "system",
        "content": "You are a seasoned actor and expert performer, renowned for your one-man shows and comedic talent.",
    },
    {
        "role": "user",
        "content": "say something",
    },
]

llm = MistralAILLM(
    model_name="mistral-small-latest",
    api_key=api_key,
)
# invoke() also accepts a plain string or a list[LLMMessage] directly
res: LLMResponse = llm.invoke(
    InMemoryMessageHistory(
        messages=messages,
    )
)
print(res.content)
```
Updated OpenAI example (final state after the diff):

```python
from neo4j_graphrag.llm import LLMResponse, OpenAILLM
from neo4j_graphrag.message_history import InMemoryMessageHistory
from neo4j_graphrag.types import LLMMessage

# set api key here or in the OPENAI_API_KEY env var
api_key = None

messages: list[LLMMessage] = [
    {
        "role": "system",
        "content": "You are a seasoned actor and expert performer, renowned for your one-man shows and comedic talent.",
    },
    {
        "role": "user",
        "content": "say something",
    },
]

llm = OpenAILLM(model_name="gpt-4o", api_key=api_key)
# invoke() also accepts a plain string or a list[LLMMessage] directly
res: LLMResponse = llm.invoke(
    InMemoryMessageHistory(
        messages=messages,
    )
)
print(res.content)
```
Review comment: the above comments on how to customise a rate limit handler are still valid, no?