Commit

chore: refactor documentation (#13)
* chore: add assets and joke

* chore: fix indent
henomis authored Apr 21, 2023
1 parent c0fbb3f commit b768930
Showing 8 changed files with 130 additions and 113 deletions.
1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
Expand Up @@ -12,6 +12,7 @@
*.out
bin/
.vscode/
.local/

# Dependency directories (remove the comment below to include it)
# vendor/
6 changes: 6 additions & 0 deletions README.md
Original file line number Diff line number Diff line change
@@ -1,9 +1,15 @@
![image](./docs/assets/img/lingoose-small.png)

# 🪿 LinGoose

[![Build Status](https://github.com/henomis/lingoose/actions/workflows/test.yml/badge.svg)](https://github.com/henomis/lingoose/actions/workflows/test.yml) [![GoDoc](https://godoc.org/github.com/henomis/lingoose?status.svg)](https://godoc.org/github.com/henomis/lingoose) [![Go Report Card](https://goreportcard.com/badge/github.com/henomis/lingoose)](https://goreportcard.com/report/github.com/henomis/lingoose) [![GitHub release](https://img.shields.io/github/release/henomis/lingoose.svg)](https://github.com/henomis/lingoose/releases)

**LinGoose** (_Lingo + Go + Goose_ 🪿) aims to be a complete Go framework for creating LLM apps. 🤖 ⚙️

> **Did you know?** A goose 🪿 fills its car 🚗 with goose-line ⛽!
![image](./docs/assets/img/lingoose.png)

# Overview
**LinGoose** is a powerful Go framework for developing Large Language Model (LLM) based applications using pipelines. It is designed to be a complete solution and provides multiple components, including Prompts, Templates, Chat, Output Decoders, LLM, Pipelines, and Memory. With **LinGoose**, you can interact with LLM AI through prompts and generate complex templates. Additionally, it includes a chat feature, allowing you to create chatbots. The Output Decoders component extracts specific information from the LLM's output, while the LLM interface lets you send prompts to various AI providers, such as OpenAI. You can chain multiple LLM steps together using Pipelines and store the output of each step in Memory for later retrieval.
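The prompt templating described above uses Go's standard `text/template` placeholder syntax (e.g. `{{.output}}`, `{{.language}}`). Here is a minimal, self-contained sketch of that interpolation mechanism using only the standard library — a generic illustration, not LinGoose's own API; the `renderPrompt` helper is hypothetical:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// renderPrompt fills placeholders such as {{.output}} and {{.language}}
// with the supplied values, mirroring how a pipeline step's prompt can
// reference a previous step's result. Hypothetical helper for illustration.
func renderPrompt(tmplText string, vars map[string]string) string {
	tmpl := template.Must(template.New("prompt").Parse(tmplText))
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, vars); err != nil {
		panic(err)
	}
	return buf.String()
}

func main() {
	// {{.output}} stands in for the previous step's output.
	out := renderPrompt(
		"Consider the following sentence.\n\nSentence:\n{{.output}}\n\n"+
			"Translate it in {{.language}}!",
		map[string]string{
			"output":   "I'm doing well, thank you. How about you?",
			"language": "italian",
		},
	)
	fmt.Println(out)
}
```

In a real pipeline, the framework supplies the `output` value from the previous step's memory rather than a hard-coded map.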

Expand Down
Binary file added docs/assets/img/lingoose-small.png
Binary file added docs/assets/img/lingoose.png
6 changes: 6 additions & 0 deletions docs/css/styles.css
Original file line number Diff line number Diff line change
Expand Up @@ -11047,4 +11047,10 @@ html {

.big-emoji {
  font-size: 10rem !important;
}

blockquote {
  border-left: 0.2rem solid #1abc9c;
  padding: 1rem;
}
226 changes: 115 additions & 111 deletions docs/index.html
Original file line number Diff line number Diff line change
Expand Up @@ -37,8 +37,8 @@
<header class="masthead bg-primary text-white text-center">
<div class="container d-flex align-items-center flex-column">
<!-- Masthead Avatar Image-->
<!-- <img class="masthead-avatar mb-5" src="assets/img/avataaars.svg" alt="..." /> -->
<div class="big-emoji">🪿</div>
<img class="masthead-avatar mb-5" src="assets/img/lingoose-small.png" alt="..." />
<!-- <div class="big-emoji">🪿</div> -->
<!-- Masthead Heading-->
<h1 class="masthead-heading text-uppercase mb-0">LinGoose</h1>
<!-- Icon Divider-->
Expand All @@ -59,6 +59,7 @@ <h2 class="page-section-heading text-center text-uppercase text-secondary mb-0">
<h1 id="🪿-lingoose">🪿 LinGoose</h1>
<p><a href="https://github.com/henomis/lingoose/actions/workflows/test.yml"><img src="https://github.com/henomis/lingoose/actions/workflows/test.yml/badge.svg" alt="Build Status"></a> <a href="https://godoc.org/github.com/henomis/lingoose"><img src="https://godoc.org/github.com/henomis/lingoose?status.svg" alt="GoDoc"></a> <a href="https://goreportcard.com/report/github.com/henomis/lingoose"><img src="https://goreportcard.com/badge/github.com/henomis/lingoose" alt="Go Report Card"></a> <a href="https://github.com/henomis/lingoose/releases"><img src="https://img.shields.io/github/release/henomis/lingoose.svg" alt="GitHub release"></a></p>
<p><strong>LinGoose</strong> (<i>Lingo + Go + Goose 🪿</i>) aims to be a complete Go framework for creating LLM apps. 🤖 ⚙️</p>
<blockquote><strong>Did you know?</strong> A goose 🪿 fills its car 🚗 with goose-line ⛽!</blockquote>
<h1 id="overview">Overview</h1>
<p><strong>LinGoose</strong> is a powerful Go framework for developing Large Language Model (LLM) based applications using pipelines. It is designed to be a complete solution and provides multiple components, including Prompts, Templates, Chat, Output Decoders, LLM, Pipelines, and Memory. With <strong>LinGoose</strong>, you can interact with LLM AI through prompts and generate complex templates. Additionally, it includes a chat feature, allowing you to create chatbots. The Output Decoders component extracts specific information from the LLM's output, while the LLM interface lets you send prompts to various AI providers, such as OpenAI. You can chain multiple LLM steps together using Pipelines and store the output of each step in Memory for later retrieval.</p>
<h1 id="components">Components</h1>
Expand Down Expand Up @@ -111,116 +112,119 @@ <h1 id="components">Components</h1>
<h1 id="usage">Usage</h1>
<p>Please refer to the <a href="../examples/">examples directory</a> to see other examples. However, here is an example of what <strong>LinGoose</strong> is capable of:</p>
<p><em>Talk is cheap. Show me the <a href="../examples/">code</a>.</em> - Linus Torvalds</p>
<pre><code class="language-go">package main

import (
	&quot;encoding/json&quot;
	&quot;fmt&quot;

	&quot;github.com/henomis/lingoose/decoder&quot;
	&quot;github.com/henomis/lingoose/llm/openai&quot;
	&quot;github.com/henomis/lingoose/memory/ram&quot;
	&quot;github.com/henomis/lingoose/pipeline&quot;
	&quot;github.com/henomis/lingoose/prompt&quot;
)

func main() {
	llmOpenAI, err := openai.New(openai.GPT3TextDavinci003, true)
	if err != nil {
		panic(err)
	}
	cache := ram.New()

	prompt1 := prompt.New(&quot;Hello how are you?&quot;)
	pipe1 := pipeline.NewStep(
		&quot;step1&quot;,
		llmOpenAI,
		prompt1,
		nil,
		decoder.NewDefaultDecoder(),
		cache,
	)

	prompt2, _ := prompt.NewPromptTemplate(
		&quot;Consider the following sentence.\n\nSentence:\n{{.output}}\n\n&quot;+
			&quot;Translate it in {{.language}}!&quot;,
		map[string]string{
			&quot;language&quot;: &quot;italian&quot;,
		},
	)
	pipe2 := pipeline.NewStep(
		&quot;step2&quot;,
		llmOpenAI,
		prompt2,
		nil,
		decoder.NewDefaultDecoder(),
		nil,
	)

	prompt3, _ := prompt.NewPromptTemplate(
		&quot;Consider the following sentence.\n\nSentence:\n{{.step1.output}}&quot;+
			&quot;\n\nTranslate it in {{.language}}!&quot;,
		map[string]string{
			&quot;language&quot;: &quot;spanish&quot;,
		},
	)
	pipe3 := pipeline.NewStep(
		&quot;step3&quot;,
		llmOpenAI,
		prompt3,
		nil,
		decoder.NewDefaultDecoder(),
		cache,
	)

	pipelineSteps := pipeline.New(
		pipe1,
		pipe2,
		pipe3,
	)

	response, err := pipelineSteps.Run(nil)
	if err != nil {
		fmt.Println(err)
	}

	fmt.Printf(&quot;\n\nFinal output: %#v\n\n&quot;, response)

	fmt.Println(&quot;---Memory---&quot;)
	dump, _ := json.MarshalIndent(cache.All(), &quot;&quot;, &quot; &quot;)
	fmt.Printf(&quot;%s\n&quot;, string(dump))
}
</code></pre>
<pre>
<code class="language-go">
package main

import (
	&quot;encoding/json&quot;
	&quot;fmt&quot;

	&quot;github.com/henomis/lingoose/decoder&quot;
	&quot;github.com/henomis/lingoose/llm/openai&quot;
	&quot;github.com/henomis/lingoose/memory/ram&quot;
	&quot;github.com/henomis/lingoose/pipeline&quot;
	&quot;github.com/henomis/lingoose/prompt&quot;
)

func main() {
	llmOpenAI, err := openai.New(openai.GPT3TextDavinci003, true)
	if err != nil {
		panic(err)
	}
	cache := ram.New()

	prompt1 := prompt.New(&quot;Hello how are you?&quot;)
	pipe1 := pipeline.NewStep(
		&quot;step1&quot;,
		llmOpenAI,
		prompt1,
		decoder.NewDefaultDecoder(),
		cache,
	)

	prompt2, _ := prompt.NewPromptTemplate(
		&quot;Consider the following sentence.\n\nSentence:\n{{.output}}\n\n&quot;+
			&quot;Translate it in {{.language}}!&quot;,
		map[string]string{
			&quot;language&quot;: &quot;italian&quot;,
		},
	)
	pipe2 := pipeline.NewStep(
		&quot;step2&quot;,
		llmOpenAI,
		prompt2,
		decoder.NewDefaultDecoder(),
		nil,
	)

	prompt3, _ := prompt.NewPromptTemplate(
		&quot;Consider the following sentence.\n\nSentence:\n{{.step1.output}}&quot;+
			&quot;\n\nTranslate it in {{.language}}!&quot;,
		map[string]string{
			&quot;language&quot;: &quot;spanish&quot;,
		},
	)
	pipe3 := pipeline.NewStep(
		&quot;step3&quot;,
		llmOpenAI,
		prompt3,
		decoder.NewDefaultDecoder(),
		cache,
	)

	pipelineSteps := pipeline.New(
		pipe1,
		pipe2,
		pipe3,
	)

	response, err := pipelineSteps.Run(nil)
	if err != nil {
		fmt.Println(err)
	}

	fmt.Printf(&quot;\n\nFinal output: %#v\n\n&quot;, response)

	fmt.Println(&quot;---Memory---&quot;)
	dump, _ := json.MarshalIndent(cache.All(), &quot;&quot;, &quot; &quot;)
	fmt.Printf(&quot;%s\n&quot;, string(dump))
}
</code>
</pre>
<p>Running this example will produce the following output:</p>
<pre><code>---USER---
Hello how are you?
---AI---
I&#39;m doing well, thank you. How about you?
---USER---
Consider the following sentence.\n\nSentence:\nI&#39;m doing well, thank you. How about you?\n\n
Translate it in italian!
---AI---
Sto bene, grazie. E tu come stai?
---USER---
Consider the following sentence.\n\nSentence:\nI&#39;m doing well, thank you. How about you?
\n\nTranslate it in spanish!
---AI---
Estoy bien, gracias. ¿Y tú


Final output: map[string]interface {}{&quot;output&quot;:&quot;Estoy bien, gracias. ¿Y tú&quot;}

---Memory---
{
&quot;step1&quot;: {
&quot;output&quot;: &quot;I&#39;m doing well, thank you. How about you?&quot;
},
&quot;step3&quot;: {
&quot;output&quot;: &quot;Estoy bien, gracias. ¿Y tú&quot;
}
}
</code></pre>
<h1 id="installation">Installation</h1>
<p>Be sure to have a working Go environment, then run the following command:</p>
<pre><code class="language-shell">go get github.com/henomis/lingoose
Expand Down
2 changes: 1 addition & 1 deletion examples/chat/main.go
Original file line number Diff line number Diff line change
Expand Up @@ -17,7 +17,7 @@ func main() {
},
chat.PromptMessage{
Type: chat.MessageTypeUser,
Prompt: prompt.New("Write a joke about a cat"),
Prompt: prompt.New("Write a joke about a goose"),
},
)

Expand Down
2 changes: 1 addition & 1 deletion examples/pipeline/chat/main.go
Original file line number Diff line number Diff line change
Expand Up @@ -87,7 +87,7 @@ func main() {

values := map[string]string{
"role": "joke writer",
"animal": "cat",
"animal": "goose",
}
response, err := pipe.Run(values)
if err != nil {
Expand Down
