Commit 5a14db0

Doc formatting draft 1 (#1286)

* initial commit to migrate the mtllm documentation into jac-lang.org
* Doc formatting draft 1
* Pre commit done and draft mtllm documentation updated
* Formatting update

Co-authored-by: Thamirawaran <Thamirawaran@users.noreply.github.com>
Co-authored-by: savini98 <savikashmira@gmail.com>
1 parent 72ed186 commit 5a14db0

File tree

15 files changed: +1417 -7 lines changed

Lines changed: 74 additions & 0 deletions

# <span style="color: orange">Multimodality

For MTLLM to have true neurosymbolic power, it needs to handle multimodal inputs and outputs: it should understand text, images, and video. In this section, we discuss how MTLLM handles multimodal inputs.

## <span style="color: orange">Image

MTLLM can accept images as inputs. You can pass an image to an MTLLM function or method using mtllm's `Image` format. Here is an example:

```jac
import:py from mtllm.llms, OpenAI;
import:py from mtllm, Image;

glob llm = OpenAI(model_name="gpt-4o");

enum Personality {
    INTROVERT: 'Person who is shy and reticent' = "Introvert",
    EXTROVERT: 'Person who is outgoing and socially confident' = "Extrovert"
}

obj 'Person'
Person {
    has full_name: str,
        yod: 'Year of Death': int,
        personality: 'Personality of the Person': Personality;
}

can get_person_info(img: 'Image of Person': Image) -> Person
by llm();

with entry {
    person_obj = get_person_info(Image("person.png"));
    print(person_obj);
}
```

Input Image (person.png):

![person.png](https://preview.redd.it/g39au73fdir01.jpg?auto=webp&s=cef8394b639af82ba92d6ab084935f7adc8e841d)

```python
# Output
Person(full_name='Albert Einstein', yod=1955, personality=Personality.INTROVERT)
```

In the above example, we pass an image of a person (Albert Einstein) to the `get_person_info` function, which extracts the person's information from the image. The output is a `Person` object with the full name, year of death, and personality of the person in the image.

## <span style="color: orange">Video

Similarly, MTLLM can accept videos as inputs. You can pass a video to an MTLLM function or method using mtllm's `Video` format. Here is an example:

```jac
import:py from mtllm.llms, OpenAI;
import:py from mtllm, Video;

glob llm = OpenAI(model_name="gpt-4o");

can is_aligned(video: Video, text: str) -> bool
by llm(method="Chain-of-Thoughts", context="Mugen is the moving character");

with entry {
    video = Video("mugen.mp4", 1);
    text = "Mugen jumps off and collects few coins.";
    print(is_aligned(video, text));
}
```

Input Video (mugen.mp4):

[mugen.mp4](https://github.com/Jaseci-Labs/jaseci/blob/main/jac-mtllm/examples/vision/mugen.mp4)

```python
# Output
True
```

In the above example, we pass a video of a character (Mugen) to the `is_aligned` function, which checks whether the text is aligned with the video. The output is a boolean value indicating whether the text and the video agree.

Lines changed: 159 additions & 0 deletions

# <span style="color: orange">Functions and Methods

Functions and methods play a crucial role in implementing various functionalities in a traditional GenAI application. In jaclang, these functions and methods are highly flexible and powerful; thanks to the MTLLM `by <your_llm>` syntax, they don't even require a body. This section will guide you on how to effectively utilize functions and methods in jaclang using MTLLM.

## <span style="color: orange">Functions

Functions (abilities) in jaclang are defined using the `can` keyword and describe a set of actions. A normal function looks like this in jaclang:

```jac
can <function_name>(<parameter : parameter_type>, ..) -> <return_type> {
    <function_body>;
}
```

In a traditional GenAI application, you would make API calls inside the function body to perform the desired action. In jaclang, however, you can declare the function with the `by <your_llm>` syntax, leave out the body entirely, and let the MTLLM model handle the implementation. Here is an example:

```jac
can greet(name: str) -> str by <your_llm>();
```

In the above example, the `greet` function takes a `name` parameter of type `str` and returns a `str`. It is defined using the `by <your_llm>` syntax, so its implementation is handled by the MTLLM.
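
Here is a minimal, runnable sketch of calling such a function (it assumes an OpenAI client named `llm`, as in the examples that follow; the name "Alice" is illustrative):

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

# Declaration only; the LLM supplies the behavior at call time.
can greet(name: str) -> str by llm();

with entry {
    # Called like any normal function.
    print(greet("Alice"));
}
```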

Below is an example where we define two functions. `get_expert` takes a question and returns, as a string, the best expert to answer it, using mtllm with an OpenAI model and the `Reason` method. `get_answer` takes a question and an expert and returns the answer, using mtllm with an OpenAI model and no method. We can then call these functions like normal functions.

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can get_expert(question: str) -> 'Best Expert to Answer the Question': str by llm(method='Reason');
can get_answer(question: str, expert: str) -> str by llm();

with entry {
    question = "What are Large Language Models?";
    expert = get_expert(question);
    answer = get_answer(question, expert);
    print(f"{expert} says: '{answer}' ");
}
```

Here's another example:

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can 'Get a Joke with a Punchline'
get_joke() -> tuple[str, str] by llm();

with entry {
    (joke, punchline) = get_joke();
    print(f"{joke}: {punchline}");
}
```

In the above example, the `get_joke` function returns a tuple of two strings: a joke and its punchline. The function is defined using the `by <your_llm>` syntax, which means the implementation is handled by the MTLLM. You can add a semstr (a semantic string, like `'Get a Joke with a Punchline'` above) to the function to make its intent more specific.
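
As a sketch of how semstrs can be layered on (this hypothetical variant adds a topic parameter, with each element annotated by its own semstr):

```jac
can 'Get a Joke with a Punchline about the given topic'
get_joke(topic: 'Topic for the Joke': str) -> tuple[str, str] by llm();
```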

## <span style="color: orange">Methods

Methods in jaclang are also defined using the `can` keyword, and describe a set of actions specific to a class. A normal method looks like this in jaclang:

```jac
obj ClassName {
    has parameter: parameter_type;

    can <method_name>(<parameter : parameter_type>, ..) -> <return_type> {
        <method_body>;
    }
}
```

In a traditional GenAI application, you would make API calls inside the method body to perform the desired action, using the `self` keyword to reach the necessary information. In jaclang, however, you can define the method using the `by <your_llm>` syntax, omit the body, and let the MTLLM model handle the implementation. Here is an example:

```jac
obj Person {
    has name: str;

    can greet() -> str by <your_llm>(incl_info=(self));
}
```

In the above example, the `greet` method returns a `str`. It is defined using the `by <your_llm>` syntax, so its implementation is handled by the MTLLM. The `incl_info=(self)` parameter includes the `Person` object's attributes (here, `name`) as an information source for the MTLLM.
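
A minimal usage sketch (assuming the OpenAI client from the other examples; the name "Ada" is illustrative):

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

obj Person {
    has name: str;

    # The LLM sees the object's fields via incl_info,
    # so it can personalize the greeting.
    can greet() -> str by llm(incl_info=(self));
}

with entry {
    person = Person("Ada");
    print(person.greet());
}
```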

In the example below, we define an `Essay` class with three methods. `get_essay_judgement` takes a criterion and returns a judgement of the essay against that criterion; `get_reviewer_summary` takes a dictionary of judgements and returns a reviewer's summary based on them; and `give_grade` takes that summary and returns a grade for the essay. Each method uses mtllm with an OpenAI model, and we can call them like normal methods.

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

obj Essay {
    has essay: str;

    can get_essay_judgement(criteria: str) -> str by llm(incl_info=(self.essay));
    can get_reviewer_summary(judgements: dict) -> str by llm(incl_info=(self.essay));
    can give_grade(summary: str) -> 'A to D': str by llm();
}

with entry {
    essay = "With a population of approximately 45 million Spaniards and 3.5 million immigrants,"
        "Spain is a country of contrasts where the richness of its culture blends it up with"
        "the variety of languages and dialects used. Being one of the largest economies worldwide,"
        "and the second largest country in Europe, Spain is a very appealing destination for tourists"
        "as well as for immigrants from around the globe. Almost all Spaniards are used to speaking at"
        "least two different languages, but protecting and preserving that right has not been"
        "easy for them.Spaniards have had to struggle with war, ignorance, criticism and the governments,"
        "in order to preserve and defend what identifies them, and deal with the consequences.";
    essay = Essay(essay);
    criterias = ["Clarity", "Originality", "Evidence"];
    judgements = {};
    for criteria in criterias {
        judgement = essay.get_essay_judgement(criteria);
        judgements[criteria] = judgement;
    }
    summary = essay.get_reviewer_summary(judgements);
    grade = essay.give_grade(summary);
    print("Reviewer Notes: ", summary);
    print("Grade: ", grade);
}
```

## <span style="color: orange">Ability to Understand Typed Inputs and Outputs

MTLLM is able to represent typed inputs in a way that is understandable to the model. At the same time, this enables the model to generate outputs of the expected output type without any additional information. Here is an example:

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

enum 'Personality of the Person'
Personality {
    INTROVERT: 'Person who is shy and reticent' = "Introvert",
    EXTROVERT: 'Person who is outgoing and socially confident' = "Extrovert"
}

obj 'Person'
Person {
    has full_name: 'Fullname of the Person': str,
        yod: 'Year of Death': int,
        personality: 'Personality of the Person': Personality;
}

can 'Get Person Information use common knowledge'
get_person_info(name: 'Name of the Person': str) -> 'Person': Person by llm();

with entry {
    person_obj = get_person_info('Martin Luther King Jr.');
    print(person_obj);
}
```

```python
# Output
Person(full_name='Martin Luther King Jr.', yod=1968, personality=Personality.INTROVERT)
```

In the above example, the `get_person_info` function takes a `name` parameter of type `str` and returns a `Person` object. The `Person` object has three attributes: `full_name` of type `str`, `yod` of type `int`, and `personality` of type `Personality`. The `Personality` enum has two values: `INTROVERT` and `EXTROVERT`. The function is defined using the `by <your_llm>` syntax, which means the implementation is handled by the MTLLM. The model is able to understand the typed inputs and outputs and generate the output in the expected type.
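
As a further sketch (a hypothetical function, shown only to illustrate the idea), typed outputs extend to container types as well:

```jac
can 'Get notable figures of a field'
get_figures(field: 'Field of Study': str) -> 'List of Persons': list[Person] by llm();
```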
Lines changed: 106 additions & 0 deletions

# <span style="color: orange">Language Models

Language models are the most important building block of MTLLM; without them, we can't achieve neurosymbolic programming.

Let's first make sure you can set up your language model. MTLLM supports clients for many remote and local LMs, and you can even create your own very easily if you want to.

## <span style="color: orange">Setting up an LM client

In this section, we will go through the process of setting up a client for OpenAI's `GPT-4o` language model. First, make sure that you have installed the necessary dependencies by running `pip install mtllm[openai]`.

```jac
import:py from mtllm.llms.openai, OpenAI;

my_llm = OpenAI(model_name="gpt-4o");
```

Make sure to set the `OPENAI_API_KEY` environment variable with your OpenAI API key.

## <span style="color: orange">Directly calling the LM

You can also call the LM directly by giving it raw prompts.

```jac
my_llm("What is the capital of France?");
```

You can also pass `max_tokens`, `temperature`, and other parameters to the LM.

```jac
my_llm("What is the capital of France?", max_tokens=10, temperature=0.5);
```

## <span style="color: orange">Using the LM with MTLLM

The intended way to use MTLLM's LMs is through jaclang's `by llm()` feature.

### <span style="color: orange">With Abilities and Methods

```jac
can function(arg1: str, arg2: str) -> str by llm();
```

### <span style="color: orange">With Classes

```jac
new_object = MyClass(arg1: str by llm());
```

### <span style="color: orange">You can pass the following attributes to the `by llm()` feature:

- `method` (default: `Normal`): Reasoning method to use. Can be `Normal`, `Reason`, or `Chain-of-Thoughts`.
- `tools` (default: `None`): Tools to use. This is a list of abilities to use with the ReAct prompting method.
- Model-specific parameters: You can also pass model-specific parameters such as `max_tokens` and `temperature`; a combined sketch follows this list.
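
For instance, a sketch combining these attributes on one call site (the ability and the parameter values are illustrative, not part of the library):

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

# Reasoning method plus model-specific parameters.
can summarize(text: str) -> 'Short Summary': str
by llm(method="Reason", temperature=0.3, max_tokens=128);
```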

## <span style="color: orange">Enabling Verbose Mode

You can enable verbose mode to see the internal workings of the LM.

```jac
import:py from mtllm.llms, OpenAI;

my_llm = OpenAI(model_name="gpt-4o", verbose=True);
```

## <span style="color: orange">Remote LMs

These language models are provided as managed services. To access them, simply sign up and obtain an API key.

> <span style="color: orange">**NOTICE**
>
> Before calling any of the remote language models listed below, make sure to set the corresponding environment variable with your API key. Use chat models for better performance.

```jac
llm = mtllm.llms.{provider_listed_below}(model_name="your model", verbose=True/False);
```

1. `OpenAI` - OpenAI's gpt-3.5-turbo, gpt-4, gpt-4-turbo, gpt-4o [[model zoo]](https://platform.openai.com/docs/models)
2. `Anthropic` - Anthropic's Claude 3 & Claude 3.5 - Haiku, Sonnet, Opus [[model zoo]](https://docs.anthropic.com/en/docs/about-claude/models)
3. `Groq` - Groq's fast inference models [[model zoo]](https://console.groq.com/docs/models)
4. `Together` - Together's hosted open-source models [[model zoo]](https://docs.together.ai/docs/inference-models)
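
For example, a sketch instantiating one of these clients (the model name is illustrative, so check the linked model zoo for current names; it also assumes `ANTHROPIC_API_KEY` is set in the environment):

```jac
import:py from mtllm.llms, Anthropic;

llm = Anthropic(model_name="claude-3-5-sonnet-20240620", verbose=False);
```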

## <span style="color: orange">Local LMs

### <span style="color: orange">Ollama

Start an Ollama server by following the tutorial [here](https://github.com/ollama/ollama). Then you can use it as follows:

```jac
import:py from mtllm.llms.ollama, Ollama;

llm = Ollama(host="ip:port of the ollama server", model_name="llama3", verbose=True/False);
```

### <span style="color: orange">HuggingFace

You can also use any of HuggingFace's text-generation [models](https://huggingface.co/models?pipeline_tag=text-generation):

```jac
import:py from mtllm.llms.huggingface, HuggingFace;

llm = HuggingFace(model_name="microsoft/Phi-3-mini-4k-instruct", verbose=True/False);
```

> <span style="color: orange">**NOTICE**
>
> We are constantly adding new LMs to the library. If you want to add a new LM, please open an issue [here](https://github.com/Jaseci-Labs/jaseci/issues).
