M5 final Website (#272)
* m5 first version website

* pages updated

* images, features, future updated

* future bloc text updated

* small update and draft set to false

* update feature page

* add embeddings screenshot

* Update features_embeddings.png
f3lixh authored Aug 1, 2023
1 parent b0a4338 commit ffdf626
Showing 10 changed files with 40 additions and 12 deletions.
34 changes: 31 additions & 3 deletions content/ss23/master/m5-quizzer-ai/features.md
@@ -3,21 +3,47 @@
title = "Features"
weight = 1
+++

{{<section title="❓ Generate Quizzes">}}

With Quizzer AI, generating quiz questions using AI models is effortless. Simply input a topic or term and submit. You have the option to use various models, including **GPT-4**, **GPT-3.5-Turbo**, or even your own trained models and knowledge bases. Additionally, you can adjust the answers using optional parameters such as creativity, logic, and size.

{{<image src="features_home.png" alt="Generate Quizzes" caption="Generate Quizzes">}}

After the questions have been successfully generated by the AI and processed by Quizzer AI, they can be reviewed in more detail. In this **preview window**, the user can perform **various actions** on the generated questions.

* In case the answer to the question is wrong or multiple answers are correct, this can easily be changed here.
* If the answers do not match the question, they can be generated again.
* In addition, more detailed explanations for the quiz questions can be generated.
* Furthermore, unnecessary questions can be deleted, and new questions can be added manually.

If the users are happy with the generated quiz questions, they can be exported and imported into a target application such as [Quizzer 2.0](https://ml-labs.com/).

{{<image src="features_quiz_preview.png" alt="Generate Quizzes" caption="Preview Quizzes">}}


{{</section>}}


{{<section title="📕 Embeddings">}}

Because the knowledge of OpenAI's models is limited by a training cutoff date and a lack of very specific information, more recent or specialized information is unavailable to the AI. To counter this, we use **embeddings** to inject our own information into the AI prompts. Chunks of **information**, such as paragraphs, are transformed into a **vector space**. These vectors capture **semantic relationships** and **patterns** that make it possible to classify their textual content. If the AI is supposed to create quiz questions on a certain topic, it can compare the prompt embedding with the embeddings in the application database via **semantic search** to check which sections of text match the topic semantically and how closely they match. The information found in those sections is then used as the basis for question generation.

{{<image src="features_embeddings.png" alt="Generate Quizzes" caption="Embeddings">}}

In our application, embeddings can be generated from all kinds of documents, such as factual **books**, **internal documentation**, **exam preparation material**, and so on. Our tool accepts .docx Word documents, which are cleaned up by another AI function so that only relevant information is extracted from the document before embedding. For example, page numbers are left out and line breaks within paragraphs are removed. Excel files with already cleaned-up information can also be used for fast import of user-compiled data.
Users can then specify which slice of embedded knowledge to use, or use all of the saved embeddings in the application database for question generation.
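
The retrieval step can be pictured with a short Python sketch. It is only an illustration, assuming the legacy pre-1.0 `openai` SDK and the `text-embedding-ada-002` model; the chunking, storage, and model choice in Quizzer AI may differ.

```python
import numpy as np
import openai  # legacy pre-1.0 SDK

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of text chunks with OpenAI's embedding endpoint."""
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return np.array([item["embedding"] for item in resp["data"]])

def semantic_search(query: str, chunks: list[str], top_k: int = 3) -> list[str]:
    """Return the chunks whose embeddings are most similar to the query."""
    chunk_vecs = embed(chunks)   # in the application these would be precomputed and stored
    query_vec = embed([query])[0]
    # Cosine similarity between the query and every stored chunk.
    sims = chunk_vecs @ query_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    best = np.argsort(sims)[::-1][:top_k]
    return [chunks[i] for i in best]

# The top-ranked chunks are then pasted into the quiz-generation prompt as context.
```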

{{</section>}}


{{<section title="🎛 Fine Tuning">}}
{{<section title="⚙️ Fine Tuning">}}


Users are given the opportunity to train their own model, tailored to their specific queries and responses. Fine-tuning, in our case, is all about anticipating the user's input and generating suitable responses. To accomplish this, a dataset is needed; the larger, the better. This dataset is saved in a JSONL file, where each entry consists of a prompt and a completion. Afterwards, it is possible to train one of the standard language models provided by OpenAI.
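
As a rough illustration of what happens behind the scenes, the following sketch builds such a JSONL dataset and starts a fine-tuning job with the legacy pre-1.0 `openai` SDK. The file name, the prompt/completion markers, and the chosen base model are assumptions for the example; Quizzer AI's wizard wraps these steps in a UI.

```python
import json
import openai  # legacy pre-1.0 SDK; uses the legacy fine-tuning endpoints

# 1. Write the training data as JSONL: one prompt/completion pair per line.
examples = [
    {"prompt": "Create a quiz question about the water cycle ->",
     "completion": " What drives evaporation in the water cycle? ... END"},
    # ... the larger the dataset, the better the resulting model
]
with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# 2. Upload the file and start a fine-tuning job on a base model.
upload = openai.File.create(file=open("training_data.jsonl", "rb"), purpose="fine-tune")
job = openai.FineTune.create(
    training_file=upload["id"],
    model="davinci",   # example legacy base model
    n_epochs=4,        # example hyperparameter, adjustable in the wizard
)
print(job["id"])  # job id used to monitor training progress
```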



To streamline this process, Quizzer provides a management section that oversees all your personalized models and more. Additionally, an Installation Wizard is available to facilitate the training of a new model. This **Wizard** is divided into four steps.

* **Basis Model**
@@ -37,4 +63,6 @@
In the following step, you may optionally adjust hyperparameters to modify the training process.

Finally, all selected settings will be summarized before initiating the training process. It is essential to review these settings, as costs will be incurred from this point onwards.

{{<image src="features_finetuning.png" alt="Generate Quizzes" caption="Example of Fine Tuning Wizard">}}

{{</section>}}
6 changes: 3 additions & 3 deletions content/ss23/master/m5-quizzer-ai/future.md
@@ -5,17 +5,17 @@
weight = 0

{{<section title="🧊 bloc integration">}}

We're planning to expand our Flutter application by integrating **Bloc** for state management. **Bloc** is a state management pattern and architecture used in Flutter to handle the flow of data and manage an application's state; it is specifically designed to simplify and structure the way an app's state is handled and updated. One of its main advantages is that it helps maintain a clear separation of concerns: developers can decouple the user interface from the business logic, making the codebase more organized and easier to maintain. This pattern also improves testability, as the logic can be tested independently of the UI.

{{</section>}}

{{<section title="🤖 future models">}}

The models currently available for fine-tuning from OpenAI will be **discontinued** and no longer accessible in the near future. Instead, new models like **GPT-3.5-Turbo** or **GPT-4** will be made available for fine-tuning in the upcoming months. With the introduction of these new models, there may be changes in the procedures, and **Quizzer AI will need to be adjusted** accordingly to accommodate these changes.

{{</section>}}

{{<section title="ℹ️ support">}}
{{<section title="👋 support">}}

We implemented the main requirements of the project tender and contributed our own features and ideas in the process. Handing the project back to Mobile Learning Labs means that we will act as consultants and supporters should questions or difficulties arise in the future.

8 changes: 4 additions & 4 deletions content/ss23/master/m5-quizzer-ai/process.md
@@ -3,13 +3,13 @@
title = "Process"
weight = 1
+++

{{<section title="Process">}}
{{<image src="process_trello.png" alt="Process Trello" caption="Trello board">}}
{{<section title="📋 Process">}}
In our project, we managed the process using a **Trello** board, a helpful tool for organizing tasks and responsibilities. The Kanban board allowed us to efficiently distribute tasks among the team members and track the progress of each assignment. This method facilitated **clear communication** and coordination throughout the project, ensuring that everyone knew their roles and deadlines. We also held meeting sessions every Monday via **Microsoft Teams**. This helped us stay on track, kept us in discussion, and we were able to plan ahead for the current week. Regular communications and text chat happened through **Discord**.
{{<image src="process_trello.png" alt="Process Trello" caption="Trello board">}}

{{</section>}}

{{<section title="📋 Project start">}}
{{<section title="📅 Project start">}}

As the **OpenAI API** tools and AI in general were relatively new to us, we dedicated the first two weeks to really understanding and researching what embeddings and fine-tuning do and how to improve our prompts. We decided on specialists for each big task, with Dustin being our **Flutter** expert and principal developer, Felix being responsible for **fine-tuning**, training our model, and creating an easy-to-use fine-tuning assistant, and Michael being responsible for **embeddings** with a semantic search functionality, figuring out ways to import and clean up documents for embeddings, and doing general project management to keep the project on track. Designating specialists and having only three members in the team was a huge advantage for us. This way **everybody knew their craft**, could teach the others about their findings, and work in a really focused manner. This also made project management easier, as dividing tasks was trivial, which gave us clear communication and more opportunities for precise timekeeping.

@@ -27,7 +27,7 @@
Our result is an easy-to-use Question Generator, with advanced features like the

{{</section>}}

{{<section title="🟰 Project Results">}}
{{<section title=" Project Results">}}

We implemented the main requirements of the project tender and contributed our own features and ideas in the process. At the same time, we were always in communication with Mobile Learning Labs to create a **common vision**. Extensive **documentation** was provided, which can be used by the customers of Mobile Learning Labs.

Binary file modified content/ss23/master/m5-quizzer-ai/team_dustin.png
Binary file modified content/ss23/master/m5-quizzer-ai/team_felix.png
4 changes: 2 additions & 2 deletions content/ss23/master/m5-quizzer-ai/techstack.md
@@ -10,10 +10,10 @@
The project was built upon an existing prototype, reducing the need for starting

{{<section title="🧠 AI & API">}}
{{<image src="techstack_api.png" alt="openai_api" >}}
For the generation of quiz questions, we use the OpenAI API and utilize its powerful models such as **GPT-3.5-Turbo** and even **GPT-4**. In addition to the typical prompts used in ChatGPT, we have the flexibility to pass additional parameters to further refine the desired answers and improve the overall quality of the generated quiz questions.
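As an illustration of such additional parameters, a request might be tuned like this (a sketch assuming the legacy pre-1.0 `openai` SDK; the concrete values and the parameter set Quizzer AI exposes are assumptions):

```python
import openai  # legacy pre-1.0 SDK

response = openai.ChatCompletion.create(
    model="gpt-4",                      # or "gpt-3.5-turbo"
    messages=[{"role": "user", "content": "Write one quiz question about HTTP."}],
    temperature=0.4,       # lower values give more deterministic answers
    top_p=1.0,             # nucleus sampling cutoff
    presence_penalty=0.5,  # discourages repeating the same topics
    max_tokens=300,        # caps the length of the generated answer
)
print(response["choices"][0]["message"]["content"])
```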
{{</section>}}

{{<section title="Miscellaneous">}}
{{<section title="Miscellaneous">}}
{{<image src="techstack_misc.png" alt="Miscellaneous" >}}
Since the development environment was already available, we followed these guidelines and used **Android Studio**, which is perfectly suited for Flutter development. **BitBucket** was also already predefined. This was a small challenge, since our experience had been limited to GitHub and GitLab. **Trello** was helpful in distributing tasks and making it easier to keep track of the project's progress.
{{</section>}}
