2023-12-13: Support for multimodal search
- Feature Enhancement:
Adds multimodal search to the RAG sample, so responses can draw on image content as well as text.
Powered by the Azure OpenAI GPT-4 Vision (preview) model.
- Integration:
Integrates with Azure OpenAI GPT-4 Vision to process both text and image content.
- Documentation Update:
New documentation in docs/gpt4.md explains how to set up and use the multimodal capability.
Reviewing it first is recommended for a smooth onboarding.
- How to Get Started:
Follow the instructions in docs/gpt4.md to configure and enable the multimodal features; an illustrative request is sketched below.
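
To illustrate the kind of request the new capability issues, here is a minimal sketch of a multimodal chat completion against an Azure OpenAI GPT-4 Vision (preview) deployment using the openai Python SDK (v1). The environment variable names, API version, deployment name, and image URL are illustrative assumptions, not values taken from this repository; follow docs/gpt4.md for the supported configuration.

```python
# Minimal sketch (not the sample's actual code): send a text prompt plus an
# image to an Azure OpenAI GPT-4 Vision (preview) deployment.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # placeholder env vars
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-12-01-preview",  # a preview API version with vision support
)

response = client.chat.completions.create(
    model="gpt-4-vision-preview",  # your Azure deployment name may differ
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe the chart on this slide."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/slide-3.png"},
                },
            ],
        }
    ],
    max_tokens=300,
)
print(response.choices[0].message.content)
```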
What's Changed
- Update python-test.yaml with 3.12 by @pamelafox in #1030
- Bump vite from 4.4.11 to 4.4.12 in /app/frontend by @dependabot in #1034
- Bump the python-requirements group with 5 updates by @dependabot in #1029
- Bump the node-packages group in /app/frontend with 14 updates by @dependabot in #1038
- Integrate GPT4-vision support by @srbalakr in #1056
Full Changelog: 04-12-2023...12-13-2023