
Hinglish AI Translator Project #162

@HariBhuvana

Description


I explored the byteom/hinglish-AI-translator extension. It is well put together overall, with thoughtful features like multiple translation styles, AI explanations, and a clear modular architecture. I also reviewed the existing GitHub issues before proposing the additions below.

ADDITIONAL IMPROVEMENTS:

  1. Batch Translation of Full Page Content - When users request full-page translation, the extension currently fires off many sequential API calls. This could be optimized by combining multiple text segments and sending batch requests to the API, reducing latency and cost.
  2. Context-Aware Glossary or Terminology Memory - Introduce a small persistent glossary that learns user-preferred translations (e.g., “API key” → “एपीआई कुंजी”) and reuses them consistently. This improves translation quality and consistency over time, especially for technical or personalized terms (a minimal sketch follows this list).
  3. User-Editable Custom Rules - Allow users to define custom translation rules or substitutions (e.g., always keep “meeting” in English, or always render it as “बैठक”, depending on the domain) via a simple JSON file or list in settings, giving more control for domain-specific usage.
  4. Translation History & Copy/Export Feature - Currently, there is no way to revisit past translations. A history panel with copy-to-clipboard and export options (e.g., CSV or JSON) would make the extension useful beyond the immediate translation.
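To make item 2 concrete, here is a minimal sketch of how a persistent glossary could sit on top of chrome.storage.local. The function names (loadGlossary, rememberTerm, applyGlossary) and the hinglishGlossary storage key are hypothetical, not part of the current codebase, and the actual integration point in the translation flow may differ.

```js
// Hypothetical glossary helpers for background.js (all names are illustrative).
const GLOSSARY_KEY = "hinglishGlossary";

// Load the user's saved term mappings, e.g. { "API key": "एपीआई कुंजी" }.
async function loadGlossary() {
  const stored = await chrome.storage.local.get(GLOSSARY_KEY);
  return stored[GLOSSARY_KEY] || {};
}

// Remember a preferred translation for a term.
async function rememberTerm(source, target) {
  const glossary = await loadGlossary();
  glossary[source] = target;
  await chrome.storage.local.set({ [GLOSSARY_KEY]: glossary });
}

// Escape characters that have special meaning in a regular expression.
function escapeRegExp(s) {
  return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

// Apply saved terms to a translated string before it is shown to the user.
function applyGlossary(text, glossary) {
  let result = text;
  for (const [source, target] of Object.entries(glossary)) {
    // Whole-word, case-insensitive replacement; real matching may need to
    // handle multi-word or inflected terms more carefully.
    const pattern = new RegExp(`\\b${escapeRegExp(source)}\\b`, "gi");
    result = result.replace(pattern, target);
  }
  return result;
}
```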

NEW ISSUE:

Title: Batch Full-Page Translation via Combined API Calls

Description:
When users select an option like “Translate full page,” the extension currently splits the page content into many individual selections and makes a separate API call to Groq for each one. This slows down translation and increases API cost.

Current Behavior:

  • Numerous API calls per paragraph/selection
  • Longer response times and slower UI
  • Higher usage of credits/cost

Proposed Enhancement:

  • Aggregate selected text or full-page content into chunks (e.g., combined paragraphs) and send each chunk in a single batched API call (sketched below).
  • On the API side (Groq), rely on input arrays or long-form translation so that one request covers many segments.
  • Optionally, fall back to smaller chunks if the input size exceeds model limits.
  • Provide a progress indicator and error handling for large requests.
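As a rough sketch of this flow (under assumptions, since I have not traced the exact shape of the current translateText()): segments are packed into chunks under a size limit, each chunk goes out as one request to Groq's OpenAI-compatible chat completions endpoint, and a progress callback drives the UI. The helper names (chunkSegments, translateChunk, translatePage), the prompt format, and the model name are placeholders.

```js
// Hypothetical batching logic for background.js. The endpoint is Groq's
// OpenAI-compatible chat completions API; the model name, prompt format,
// and helper names are assumptions, not the extension's current code.
const GROQ_URL = "https://api.groq.com/openai/v1/chat/completions";
const MAX_CHARS_PER_REQUEST = 6000; // fallback threshold, tune against model limits

// Group page segments into as few chunks as possible without exceeding the limit.
function chunkSegments(segments, maxChars = MAX_CHARS_PER_REQUEST) {
  const chunks = [];
  let current = [];
  let size = 0;
  for (const segment of segments) {
    if (current.length && size + segment.length > maxChars) {
      chunks.push(current);
      current = [];
      size = 0;
    }
    current.push(segment);
    size += segment.length;
  }
  if (current.length) chunks.push(current);
  return chunks;
}

// Translate one chunk with a single API call. Segments are numbered so the
// reply can be split back into per-segment translations.
async function translateChunk(segments, apiKey) {
  const numbered = segments.map((s, i) => `${i + 1}. ${s}`).join("\n");
  const response = await fetch(GROQ_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "llama-3.1-8b-instant", // placeholder model name
      messages: [
        { role: "system", content: "Translate each numbered line to Hinglish. Reply with the same numbering, one line per item." },
        { role: "user", content: numbered },
      ],
    }),
  });
  if (!response.ok) throw new Error(`Groq request failed: ${response.status}`);
  const data = await response.json();
  const text = data.choices[0].message.content;
  // Split the reply back into segments using the leading numbers.
  return text.split(/\n(?=\d+\.\s)/).map((line) => line.replace(/^\d+\.\s*/, ""));
}

async function translatePage(segments, apiKey, onProgress) {
  const chunks = chunkSegments(segments);
  const results = [];
  for (let i = 0; i < chunks.length; i++) {
    results.push(...(await translateChunk(chunks[i], apiKey)));
    if (onProgress) onProgress((i + 1) / chunks.length); // drive a progress indicator
  }
  return results;
}
```

The numbered-line prompt is just one way to keep segment boundaries recoverable; asking the model to return a JSON array would work equally well.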

Benefits:

  • Reduced API calls → faster responses, lower costs.
  • Cleaner popup UX with combined results.
  • Better scalability for long documents or pages.

Files to Update:

  • background.js (adjust translateText() logic)
  • content.js / popup.js (update the UI to reflect batch progress)
  • Add unit tests/simulations for batching (a small test sketch follows this list)
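For the unit-test item, a minimal Node-style check of the hypothetical chunkSegments() helper from the sketch above could look like the following (the helper is re-declared here only to keep the example self-contained; a real test would import it from wherever the batching logic lands):

```js
// Minimal test for the hypothetical chunkSegments() helper, using Node's built-in assert.
const assert = require("node:assert");

// Re-declared here only so the example runs on its own.
function chunkSegments(segments, maxChars) {
  const chunks = [];
  let current = [];
  let size = 0;
  for (const segment of segments) {
    if (current.length && size + segment.length > maxChars) {
      chunks.push(current);
      current = [];
      size = 0;
    }
    current.push(segment);
    size += segment.length;
  }
  if (current.length) chunks.push(current);
  return chunks;
}

// Segments that fit within the limit stay in a single chunk.
assert.deepStrictEqual(chunkSegments(["ab", "cd"], 10), [["ab", "cd"]]);
// Segments that would exceed the limit are split across chunks.
assert.deepStrictEqual(chunkSegments(["abcd", "efgh"], 5), [["abcd"], ["efgh"]]);
console.log("chunkSegments tests passed");
```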

Effort Level: Medium. I’m happy to help with a PR.
