
[Samples] merge LLM samples to "text_generation" folder #1411

Open. Wants to merge 9 commits into base: master.

Conversation

@olpipi (Collaborator) commented Dec 19, 2024

No description provided.

@github-actions github-actions bot added category: GHA CI based on Github actions category: cmake / build Cmake scripts category: samples GenAI samples labels Dec 19, 2024
@olpipi olpipi force-pushed the samples_movement branch 2 times, most recently from 9e7f861 to 3901fbb Compare December 19, 2024 11:22
Review threads on samples/cpp/text_generation/CMakeLists.txt and samples/cpp/text_generation/README.md (all resolved).
@ilya-lavrenov ilya-lavrenov changed the title Samples movement [Samples] merge LLM samples to "text_generation" folder Dec 20, 2024
Review threads on samples/cpp/text_generation/CMakeLists.txt, samples/CMakeLists.txt, and samples/cpp/text_generation/README.md (all resolved).
github-merge-queue bot pushed a commit to openvinotoolkit/openvino that referenced this pull request Jan 3, 2025
connected to:
openvinotoolkit/openvino.genai#1411

Co-authored-by: Andrzej Kopytko <andrzejx.kopytko@intel.com>
@olpipi (Collaborator, Author) commented Jan 3, 2025

If there are no more major comments, I will make similar changes to the Python samples.

@Wovchena (Collaborator) left a comment

You have a merge conflict

@olpipi olpipi enabled auto-merge January 9, 2025 12:26
@olpipi olpipi disabled auto-merge January 9, 2025 12:27
@olpipi (Collaborator, Author) commented Jan 9, 2025

@ilya-lavrenov, please re-review. The PR cannot be merged without a +1 from you.

@@ -231,7 +231,7 @@ custom_streamer = CustomStreamer()
 pipe.generate("The Sun is yellow because", max_new_tokens=15, streamer=custom_streamer)
 ```
-For fully implemented iterable CustomStreamer please refer to [multinomial_causal_lm](https://github.com/openvinotoolkit/openvino.genai/tree/releases/2024/3/samples/python/multinomial_causal_lm/README.md) sample.
+For fully implemented iterable CustomStreamer please refer to [multinomial_causal_lm](https://github.com/openvinotoolkit/openvino.genai/tree/releases/2024/3/samples/python/text_generation/README.md) sample.
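The diff above only moves the link to the iterable CustomStreamer sample. As a rough, self-contained illustration of the pattern that sample implements (this is a hypothetical sketch, not the actual openvino.genai API or the sample's code; class and method names are illustrative), a queue-backed iterable streamer can look like:

```python
import queue

class IterableStreamer:
    """Hypothetical streamer: a producer pushes decoded chunks,
    a consumer iterates over them as they arrive."""

    def __init__(self):
        self._queue = queue.Queue()

    def put(self, chunk: str) -> bool:
        # Called by the generation loop for each decoded chunk.
        self._queue.put(chunk)
        return False  # False means "do not stop generation"

    def end(self):
        # A None sentinel marks the end of generation.
        self._queue.put(None)

    def __iter__(self):
        return self

    def __next__(self):
        item = self._queue.get()
        if item is None:
            raise StopIteration
        return item

# Minimal usage: feed chunks, then consume them as an iterable.
streamer = IterableStreamer()
for chunk in ["The ", "Sun ", "is ", "yellow"]:
    streamer.put(chunk)
streamer.end()
print("".join(streamer))  # → The Sun is yellow
```

In the real sample, `put` is invoked from the generation thread while the consumer iterates on the main thread; the blocking `queue.Queue.get` provides the synchronization.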
A Contributor commented:

I have also found that OpenVINO docs refer to GenAI samples locations


Please prepare PRs to the OpenVINO repo as well:

  • releases/2024/6 branch to point to GenAI's releases/2024/6
  • master branch to point to GenAI's master with updated locations.

@olpipi (Collaborator, Author) replied:

Model examples to use for different samples:
  • chat_sample: meta-llama/Llama-2-7b-chat-hf
  • speculative_decoding_lm: meta-llama/Llama-2-13b-hf as main model and TinyLlama/TinyLlama-1.1B-Chat-v1.0 as draft model
  • other samples: meta-llama/Llama-2-7b-hf
@ilya-lavrenov (Contributor) commented Jan 10, 2025


It is rendered as plain text, which is hard to read.

Maybe we can show at the beginning how to convert the model, while the recommendations about suggested models go directly in each sample's section?

E.g. in the common part:

optimum-cli export openvino --model <xx> <output_folder>

and in the per-sample section:

chat sample:
recommended models: meta-llama/Llama-2-7b-chat-hf, etc

The main idea is to keep locality: users should not need to scroll up and down to find the information about how to run each sample.

@olpipi (Collaborator, Author) replied:

updated

@ilya-lavrenov ilya-lavrenov added this to the 2025.0 milestone Jan 10, 2025
Labels: category: GHA (CI based on Github actions), category: cmake / build (Cmake scripts), category: samples (GenAI samples), no-match-files
4 participants