Feat: Add uv python enviorment, dspy chatbot, ollama, an easy frontend by flask #355
Conversation
…ulnerability detection and challenge generation
…port an investigator chatbot
…las-World chatbot
… modules, and add history management. Use dspy to replace whole old ollama logic
I forgot to turn off the reloader, so I'm closing this for now.
Pull request overview
This PR introduces a chatbot system for Atlas-World, featuring two AI personas (Investigator and Opposition) built with DSPy, supporting multiple LLM backends (Ollama, OpenAI, Gemini, Claude), and providing a Flask-based web interface for user interaction.
Key changes:
- Added DSPy-based chatbot modules with two distinct AI personas (Investigator and Opposition)
- Implemented Flask web server with REST API endpoints and a polished frontend interface
- Created comprehensive user documentation in Traditional Chinese with setup and troubleshooting guides
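For orientation, here is a minimal sketch of the DSPy persona pattern this PR describes. The field names (system_prompt, user_query, history, response) follow the snippets quoted in the review below; the signature docstring and the backend configuration line are assumptions for illustration, not the PR's actual code.

```python
import dspy

# Hedged sketch of one persona module; the real modules live under
# script/llm_module/{investigator,opposition}/ per the file table below.
class InvestigatorSignature(dspy.Signature):
    """Respond as the Investigator persona, grounded in the given context."""
    system_prompt: str = dspy.InputField()
    user_query: str = dspy.InputField()
    history: str = dspy.InputField()
    response: str = dspy.OutputField()

class InvestigatorModule(dspy.Module):
    def __init__(self):
        super().__init__()
        self.investigate = dspy.ChainOfThought(InvestigatorSignature)

    def forward(self, system_prompt, user_query, history):
        result = self.investigate(
            system_prompt=system_prompt,
            user_query=user_query,
            history=history,
        )
        return result.response  # return the text, per the review suggestion below

# Backend selection works the same way for Ollama / OpenAI / Gemini / Claude, e.g.:
# dspy.configure(lm=dspy.LM("ollama_chat/llama3", api_base="http://localhost:11434"))
```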
Reviewed changes
Copilot reviewed 15 out of 18 changed files in this pull request and generated 13 comments.
Summary per file:
| File | Description |
|---|---|
| use_chatbot_guide.md | User guide documentation for installing and using the chatbot system |
| templates/index.html | Frontend interface with Markdown/MathJax rendering and loading animations |
| script/llm_module/opposition/signature.py | DSPy signature definition for the opposition chatbot persona |
| script/llm_module/opposition/prompt.md | Detailed 482-line prompt defining the opposition persona's behavior and skills |
| script/llm_module/opposition/module.py | DSPy module implementation for the opposition logic |
| script/llm_module/investigator/signature.py | DSPy signature definition for the investigator chatbot persona |
| script/llm_module/investigator/module.py | DSPy module implementation for the investigator logic |
| script/llm_config.py | Interactive LLM configuration with support for multiple providers |
| script/chatbot.py | Flask application with REST endpoints for chat functionality |
| pyproject.toml | Project dependencies including dspy, flask, and ollama |
| .python-version | Python version specification (3.13) |
| .vscode/settings.json | VS Code workspace configuration |
| .idea/.gitignore | JetBrains IDE ignore patterns |
| .gitignore | Git ignore patterns for Python cache and environment files |
Files not reviewed (1)
- .idea/.gitignore: Language not supported
```python
return self.investigate(
    system_prompt=system_prompt,
    user_query=user_query,
    history=history
)
```
Copilot AI · Jan 7, 2026
The InvestigatorModule's forward method returns the full result object from dspy.ChainOfThought, but chatbot.py line 50 accesses result.response on the value returned when investigator() is called (line 47), which triggers forward(). The forward method should explicitly return result.response to match that usage, or the calling code should be updated.
Suggested change:

```diff
-return self.investigate(
-    system_prompt=system_prompt,
-    user_query=user_query,
-    history=history
-)
+result = self.investigate(
+    system_prompt=system_prompt,
+    user_query=user_query,
+    history=history
+)
+return result.response
```
```markdown
- Original Author: Skywind5487, viola7310
- Version: 1.0
- Language: 台灣在地的繁體中文、或在語言學上的台灣華語
- Description: 一個忠誠的反對派,將使用者在戰略上視為需要大幅修正、在戰術上將使用者試為自己的好朋友,試著比其他大語言模型都更甜言蜜語,來贏取使用者的認同。在細節層面上保持自己態度方式,基於事實或理由,指出使用者想法上的的錯誤。一切都要足夠現實。
```
Copilot AI · Jan 7, 2026
The opposition prompt contains a grammatical error on line 7. The phrase "在戰術上將使用者試為自己的好朋友" has a typo - "試為" should be "視為" (meaning "regard as" rather than "try to be").
Suggested change:

```diff
-- Description: 一個忠誠的反對派,將使用者在戰略上視為需要大幅修正、在戰術上將使用者試為自己的好朋友,試著比其他大語言模型都更甜言蜜語,來贏取使用者的認同。在細節層面上保持自己態度方式,基於事實或理由,指出使用者想法上的的錯誤。一切都要足夠現實。
+- Description: 一個忠誠的反對派,將使用者在戰略上視為需要大幅修正、在戰術上將使用者視為自己的好朋友,試著比其他大語言模型都更甜言蜜語,來贏取使用者的認同。在細節層面上保持自己態度方式,基於事實或理由,指出使用者想法上的的錯誤。一切都要足夠現實。
```
```markdown
- <想法>是否有隱藏且沒有明講的前提
- 以實用主義的角度思考
- <想法>對現狀是否有幫助
- 已過去歷史的角度思考
```
Copilot AI · Jan 7, 2026
On line 154, there's a typo: "已過去歷史的角度思考" should be "以過去歷史的角度思考" (using "以" meaning "from" rather than "已" meaning "already").
Suggested change:

```diff
-- 已過去歷史的角度思考
+- 以過去歷史的角度思考
```
```markdown
2. You engage in casual conversations and have feelings.
3. Don't make up facts or hallucinate
4. 對使用者的評價必須有根據
5. emojy: 使用者要求才開啟
```
Copilot AI · Jan 7, 2026
The word "emojy" on line 464 is misspelled. It should be "emoji".
Suggested change:

```diff
-5. emojy: 使用者要求才開啟
+5. emoji: 使用者要求才開啟
```
```python
print("正在啟動 LLM 配置引導...")
configure_lm()
investigator = InvestigatorModule()
opposer = OppositionModule()
```
Copilot AI · Jan 7, 2026
The variable name "opposer" on line 19 is inconsistent with the module name "OppositionModule". For consistency and clarity, consider renaming it to "opposition" to match the naming pattern used for "investigator" (which matches "InvestigatorModule").
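A minimal illustration of the rename Copilot is suggesting (names taken from the quoted snippet):

```python
investigator = InvestigatorModule()
opposition = OppositionModule()  # renamed from "opposer" to mirror OppositionModule
```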
```diff
@@ -0,0 +1,108 @@
+# Atlas-World Chatbot 使用說明
```
Copilot AI · Jan 7, 2026
The title contains a spelling error: "enviorment" should be "environment".
```python
chat_history.messages.append({"user_query": "[監察反對請求]", "response": result})

return jsonify({'response': result})
```
Copilot AI · Jan 7, 2026
There's an inconsistency in how history is updated between the two endpoints. In /investigate (line 50), the code appends a dictionary with both "user_query" and "response" keys, accessing result.response. In /oppose (line 73), it appends a dictionary with "user_query" and "response" keys, but uses result directly (not result.response). This inconsistency will likely cause issues since OppositionModule.forward returns result.response (a string), while the code treats result as if it were already the string.
Suggested change:

```diff
-chat_history.messages.append({"user_query": "[監察反對請求]", "response": result})
-return jsonify({'response': result})
+chat_history.messages.append({"user_query": "[監察反對請求]", "response": result.response})
+return jsonify({'response': result.response})
```
```python
base_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
template_dir = os.path.join(base_dir, 'templates')
app = Flask(__name__, template_folder=template_dir)
CORS(app)
```
Copilot AI · Jan 7, 2026
The CORS configuration uses CORS(app) without any restrictions, which allows requests from any origin. This could be a security risk in production environments. Consider restricting CORS to specific origins or at minimum documenting that this is intentional for development purposes only.
Suggested change:

```diff
-CORS(app)
+allowed_origins = os.getenv("ALLOWED_ORIGINS", "http://localhost:3000,http://127.0.0.1:3000").split(",")
+CORS(app, resources={r"/*": {"origins": allowed_origins}})
```
```python
readme_path = os.path.join(base_dir, 'README.md')
if os.path.exists(readme_path):
    with open(readme_path, 'r', encoding='utf-8') as f:
        readme_content = f.read()
    system_prompt += "\n--- 背景資訊 ---\n" + readme_content
```
Copilot AI · Jan 7, 2026
The README.md file content is being read and appended to every single API request (lines 37-41 and 63-67). This is inefficient and could cause performance issues. Consider reading the README.md content once during application initialization and storing it in a module-level variable, then reusing it for all requests.
```python
readme_path = os.path.join(base_dir, 'README.md')
if os.path.exists(readme_path):
    with open(readme_path, 'r', encoding='utf-8') as f:
        readme_content = f.read()
    system_prompt += "\n--- 背景資訊 ---\n" + readme_content
```
Copilot AI · Jan 7, 2026
The same README.md reading logic is duplicated in both /investigate and /oppose endpoints. This code duplication violates the DRY (Don't Repeat Yourself) principle. Consider extracting this into a helper function or loading it once at application startup.
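A sketch of one way to address both comments: read README.md once at startup and reuse it from a small helper in both endpoints. Names such as README_CONTEXT and build_system_prompt are illustrative, not the PR's code, and base_dir is assumed to be defined as in the quoted snippet.

```python
import os

# Load the background context once at import time instead of on every request.
README_CONTEXT = ""
_readme_path = os.path.join(base_dir, 'README.md')
if os.path.exists(_readme_path):
    with open(_readme_path, 'r', encoding='utf-8') as f:
        README_CONTEXT = "\n--- 背景資訊 ---\n" + f.read()

def build_system_prompt(persona_prompt: str) -> str:
    """Append the cached README context to a persona's system prompt."""
    return persona_prompt + README_CONTEXT
```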
A branch based on #308.
In short, this adds some chatbot code, a usage document (use_chatbot_guide.md), and three auto-start scripts.
The frontend was quickly put together with Gemini.
dspy and uv are nice to work with.
By default, the entire README.md is already fed to the AI as input up front.
Usage: run the one-click install first, then the one-click start.