Customized ParlAI for Empathic Conversation 2.0

What's ParlAI

ParlAI (pronounced "par-lay") is a framework developed by Facebook AI Research (FAIR), part of Meta (previously Facebook, Inc.), for dialogue research and for training AI models in conversational settings. It aims to give the research community a consistent platform for experimenting with dialogue models, fostering reproducibility and collaboration. Since its introduction, ParlAI has become an influential tool in conversational AI research.

ParlAI Core Concepts

The introduction gives a comprehensive overview of the core concepts employed by ParlAI, and you should at least read through that documentation. Pay particular attention to the notions of Worlds and Agents, because that is where most of the customization happens. Further, one can read this tutorial about parameter sharing (across multiple chat sessions) to understand how a chatbot is initialized when a chat server spins up. At a high level, when we start a server, a "parent" chatbot is initialized according to the configuration, e.g., model weights are loaded into memory. When the server receives an incoming connection (ParlAI handles this with tornado), it instantiates a copy from this initial parent, and one can customize this init-from-share process by overriding the logic of the Agent.share(self) method and the shared parameter of Agent.__init__(self, opt, shared).
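To illustrate this pattern, here is a minimal sketch of a custom agent that loads its model once in the parent instance and reuses it in per-connection copies. The class name MyCustomAgent and the _load_model helper are hypothetical; only the parlai.core.agents.Agent base class and the share()/shared convention come from ParlAI itself.

```python
from parlai.core.agents import Agent


class MyCustomAgent(Agent):
    """Hypothetical agent illustrating ParlAI's share()/shared pattern."""

    def __init__(self, opt, shared=None):
        super().__init__(opt, shared)
        self.id = 'MyCustomAgent'
        if shared is None:
            # "Parent" instance: load heavy resources (e.g., model weights) once.
            self.model = self._load_model(opt)
        else:
            # Per-connection copy: reuse the parent's already-loaded resources.
            self.model = shared['model']

    def share(self):
        # Called when the server clones the parent agent for a new chat session.
        shared = super().share()
        shared['model'] = self.model
        return shared

    def _load_model(self, opt):
        # Placeholder for expensive initialization (weights, tokenizer, ...).
        return object()

    def act(self):
        # Echo-style reply; a real agent would run self.model here.
        obs = self.observation or {}
        return {'id': self.id, 'text': f"echo: {obs.get('text', '')}"}
```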

What's in This Repository

ParlAI Customization

An archive of ParlAI with the customized chatbot implemented for the EC2 project, stored under packages/ParlAI. Specifically, one can go directly to this folder for the customized GPT-based style-transfer implementations.

Simple Terminal Chatbot Server

We modified the original example from ParlAI into a simple terminal-based chatbot server that can process incoming connections from chat clients or other websocket connections. Please read the README.md for more details about how to use the code.
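As a quick way to exercise such a server, here is a minimal sketch of a websocket chat client. The host, port, endpoint path, and the JSON message format with a "text" field are assumptions modeled on ParlAI's chat-service examples; adjust them to match the actual server configuration.

```python
# Minimal websocket chat client sketch (assumed URL and message format).
import json
import websocket  # pip install websocket-client

# Assumption: the terminal server exposes a websocket at this address.
ws = websocket.create_connection("ws://localhost:35496/websocket")
try:
    while True:
        text = input("you> ")
        ws.send(json.dumps({"text": text}))
        reply = json.loads(ws.recv())
        print("bot>", reply.get("text", ""))
finally:
    ws.close()
```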

Local Search Engine

The chatbot used by the EC2 project is based on BlenderBot2, which requires a search engine as an augmented knowledge base for response generation. You may refer to the source code for implementation details, or simply to this script for how to start the search engine. The documents are already pre-indexed under indexdir. One may refer to this notebook for how to rebuild the index.
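For orientation, below is a minimal sketch of the HTTP interface such a search server typically exposes to BlenderBot2: the model POSTs form fields q (the query) and n (the number of results) and expects a JSON object whose "response" field is a list of {url, title, content} records. The Flask stub, port, and canned result are illustrative assumptions; the real server in this repository answers queries from the pre-built index under indexdir.

```python
# Sketch of a BlenderBot2-compatible search-server stub (assumed protocol).
from flask import Flask, request, jsonify

app = Flask(__name__)


@app.route("/", methods=["POST"])
def search():
    query = request.form.get("q", "")
    n = int(request.form.get("n", 5))
    # Hypothetical placeholder: a real server would query the local index here.
    results = [{
        "url": "local://indexdir/doc0",
        "title": f"Placeholder result for '{query}'",
        "content": "Document text retrieved from the local index would go here.",
    }][:n]
    return jsonify({"response": results})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```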

Browser-Based Chat Client

One may use a browser-based chat client during development to check that the chatbot responds properly. Please read the README.md for more details about how to use the code.

Scripts

All scripts assume they are run from the project folder, i.e., the folder where this README.md file resides.

Start BlenderBot2

Please refer to this script for how to host BlenderBot2 via a terminal_server.
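Independently of that script, the snippet below sketches how BlenderBot2 can be loaded programmatically through ParlAI and pointed at a local search server. The zoo model path, the search_server override, and the server address are assumptions taken from the public BlenderBot2 documentation, not from this repository's configuration.

```python
# A minimal sketch, not the project's actual hosting script. Assumes the
# public 400M BlenderBot2 checkpoint and a search server on localhost:8080.
from parlai.core.agents import create_agent_from_model_file

agent = create_agent_from_model_file(
    "zoo:blenderbot2/blenderbot2_400M/model",
    opt_overrides={"search_server": "http://localhost:8080"},
)

# One exchange: observe a message, then generate a reply.
agent.observe({"text": "Hello there!", "episode_done": False})
print(agent.act()["text"])
```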

Environment Creation

We assume environment management with conda. To create the environment for hosting BlenderBot2, one can refer to this environment creation script, which installs the dependencies for running BlenderBot2 and its accompanying local search engine. You may encounter version conflicts during installation, but they do not seem to substantially affect normal usage. We ran the terminal_server with Python 3.10 and it works fine.
