# <span style="color: orange">Functions and Methods</span>

Functions and methods play a crucial role in implementing the functionality of a traditional GenAI application. In jaclang, these functions and methods are designed to be highly flexible and powerful; thanks to the MTLLM `by <your_llm>` syntax, they don't even require a function or method body. This section will guide you on how to effectively use functions and methods in jaclang with MTLLM.

## <span style="color: orange">Functions</span>

Functions (abilities) in jaclang are defined using the `can` keyword and describe a set of actions. A normal function looks like this in jaclang:

```jac
can <function_name>(<parameter : parameter_type>, ..) -> <return_type> {
    <function_body>;
}
```

In a traditional GenAI application, you would make API calls inside the function body to perform the desired action. In jaclang, however, you can declare the function with the `by <your_llm>` syntax, leaving out the body entirely and letting the MTLLM model handle the implementation. Here is an example:

```jac
can greet(name: str) -> str by <your_llm>();
```

In the above example, the `greet` function takes a `name` parameter of type `str` and returns a `str`. The function is defined using the `by <your_llm>` syntax, which means the implementation of the function is handled by the MTLLM.
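
As a minimal sketch of how such a declaration is used (assuming the OpenAI model from the later examples stands in for `<your_llm>`), the body-less function is called like any ordinary function:

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

# No body: the LLM produces the greeting at call time
can greet(name: str) -> str by llm();

with entry {
    # Called like a normal function; returns an LLM-generated string
    print(greet("Sunny"));
}
```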

The example below defines two functions using MTLLM with an OpenAI model. `get_expert` takes a question as input and, using the `Reason` method, returns the best expert to answer the question as a string. `get_answer` takes a question and an expert as input and returns the answer, without any method. Both can then be called like normal functions.

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can get_expert(question: str) -> 'Best Expert to Answer the Question': str by llm(method='Reason');
can get_answer(question: str, expert: str) -> str by llm();

with entry {
    question = "What are Large Language Models?";
    expert = get_expert(question);
    answer = get_answer(question, expert);
    print(f"{expert} says: '{answer}' ");
}
```

Here's another example:

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

can 'Get a Joke with a Punchline'
get_joke() -> tuple[str, str] by llm();

with entry {
    (joke, punchline) = get_joke();
    print(f"{joke}: {punchline}");
}
```

In the above example, the `get_joke` function returns a tuple of two strings: the joke and its punchline. The function is defined using the `by <your_llm>` syntax, which means the implementation is handled by the MTLLM. You can add a semstr (a semantic string, such as `'Get a Joke with a Punchline'` above) to the function to make its intent more specific.

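A semstr can also be attached to the return type instead of the function name. As a hypothetical sketch (the `get_quote` function is illustrative, and `llm` is assumed to be defined as in the examples above), the output semstr tells the model what each element of the tuple should contain:

```jac
# The semstr on the return type describes the expected tuple contents
can get_quote() -> 'Famous Quote and the Person who said it': tuple[str, str] by llm();

with entry {
    (quote, author) = get_quote();
    print(f'"{quote}" - {author}');
}
```
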
## <span style="color: orange">Methods</span>

Methods in jaclang are also defined using the `can` keyword and describe a set of actions specific to an object. A normal method looks like this in jaclang:

```jac
obj ClassName {
    has parameter: parameter_type;
    can <method_name>(<parameter : parameter_type>, ..) -> <return_type> {
        <method_body>;
    }
}
```
In a traditional GenAI application, you would make API calls inside the method body to perform the desired action, using the `self` keyword to access the necessary information. In jaclang, however, you can declare the method with the `by <your_llm>` syntax, leaving out the body and letting the MTLLM model handle the implementation. Here is an example:

```jac
obj Person {
    has name: str;
    can greet() -> str by <your_llm>(incl_info=(self));
}
```

In the above example, the `greet` method returns a `str`. The method is defined using the `by <your_llm>` syntax, which means the implementation of the method is handled by the MTLLM. The `incl_info=(self)` parameter includes the `Person` instance, and hence its `name` attribute, as an information source for the MTLLM.
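
As a sketch of how this might be called (assuming the OpenAI model from the other examples in place of `<your_llm>`), the method is invoked like a normal method, with `incl_info` giving the model visibility into the object's state:

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

obj Person {
    has name: str;
    # incl_info=(self) passes the Person's attributes to the model as context
    can greet() -> str by llm(incl_info=(self));
}

with entry {
    person = Person(name="Alice");
    # The generated greeting can draw on the object's name attribute
    print(person.greet());
}
```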

The example below defines an `Essay` object with three methods, each implemented by MTLLM with an OpenAI model. `get_essay_judgement` takes a criterion as input and returns a judgement of the essay against that criterion after a `Reason` step; `get_reviewer_summary` takes a dictionary of judgements and returns the reviewer's summary; and `give_grade` takes the summary and returns a grade for the essay. These methods can then be called like normal methods.

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

obj Essay {
    has essay: str;

    can get_essay_judgement(criteria: str) -> str by llm(method='Reason', incl_info=(self.essay));
    can get_reviewer_summary(judgements: dict) -> str by llm(incl_info=(self.essay));
    can give_grade(summary: str) -> 'A to D': str by llm();
}

with entry {
    essay = "With a population of approximately 45 million Spaniards and 3.5 million immigrants, "
        "Spain is a country of contrasts where the richness of its culture blends it up with "
        "the variety of languages and dialects used. Being one of the largest economies worldwide, "
        "and the second largest country in Europe, Spain is a very appealing destination for tourists "
        "as well as for immigrants from around the globe. Almost all Spaniards are used to speaking at "
        "least two different languages, but protecting and preserving that right has not been "
        "easy for them. Spaniards have had to struggle with war, ignorance, criticism and the governments, "
        "in order to preserve and defend what identifies them, and deal with the consequences.";
    essay = Essay(essay);
    criterias = ["Clarity", "Originality", "Evidence"];
    judgements = {};
    for criteria in criterias {
        judgement = essay.get_essay_judgement(criteria);
        judgements[criteria] = judgement;
    }
    summary = essay.get_reviewer_summary(judgements);
    grade = essay.give_grade(summary);
    print("Reviewer Notes: ", summary);
    print("Grade: ", grade);
}
```

## <span style="color: orange">Ability to Understand Typed Inputs and Outputs</span>

MTLLM represents typed inputs in a way that is understandable to the model and, at the same time, guides the model to generate output of the expected type without any additional information. Here is an example:

```jac
import:py from mtllm.llms, OpenAI;

glob llm = OpenAI(model_name="gpt-4o");

enum 'Personality of the Person'
Personality {
    INTROVERT: 'Person who is shy and reticent' = "Introvert",
    EXTROVERT: 'Person who is outgoing and socially confident' = "Extrovert"
}

obj 'Person'
Person {
    has full_name: 'Fullname of the Person': str,
        yod: 'Year of Death': int,
        personality: 'Personality of the Person': Personality;
}

can 'Get Person Information use common knowledge'
get_person_info(name: 'Name of the Person': str) -> 'Person': Person by llm();

with entry {
    person_obj = get_person_info('Martin Luther King Jr.');
    print(person_obj);
}
```

```python
# Output
Person(full_name='Martin Luther King Jr.', yod=1968, personality=Personality.INTROVERT)
```

In the above example, the `get_person_info` function takes a `name` parameter of type `str` and returns a `Person` object. The `Person` object has three attributes: `full_name` of type `str`, `yod` of type `int`, and `personality` of type `Personality`. The `Personality` enum has two values: `INTROVERT` and `EXTROVERT`. The function is defined using the `by <your_llm>` syntax, which means the implementation is handled by the MTLLM. The model understands the typed inputs and outputs and generates the output in the expected type.