Canopywave deepseek v3.2 speciale #5406

Open

cynthia-lixinyi wants to merge 8 commits into Helicone:main from cynthia-lixinyi:canopywave-deepseek-v3.2-speciale

Conversation

@cynthia-lixinyi
Contributor

Ticket

Link to the ticket(s) this pull request addresses.

Component/Service

What part of Helicone does this affect?

  • Web (Frontend)
  • Jawn (Backend)
  • Worker (Proxy)
  • Bifrost (Marketing)
  • AI Gateway
  • Packages
  • Infrastructure/Docker
  • Documentation

Type of Change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation update
  • Performance improvement
  • Refactoring

Testing

  • Added/updated unit tests
  • Added/updated integration tests
  • Tested locally
  • Verified in staging environment
  • E2E tests pass (if applicable)

Technical Considerations

  • Database migrations included (if needed)
  • API changes documented
  • Breaking changes noted
  • Performance impact assessed
  • Security implications reviewed

Dependencies

  • No external dependencies added
  • Dependencies added and documented
  • Environment variables added/modified

Deployment Notes

  • No special deployment steps required
  • Database migrations need to run
  • Environment variable changes required
  • Coordination with other teams needed

Context

Why are you making this change?

Screenshots / Demos

Before | After

Misc. Review Notes

@vercel

vercel bot commented Dec 11, 2025

@cynthia-lixiny07 is attempting to deploy a commit to the Helicone Team on Vercel.

A member of the Team first needs to authorize it.

@greptile-apps
Contributor

greptile-apps bot commented Dec 11, 2025

Greptile Overview

Greptile Summary

Added support for the DeepSeek V3.2 Speciale model through the Canopy Wave provider, with a standard configuration matching other similar integrations.

  • Added a new model definition for deepseek-v3.2-speciale with a 163K context length and 65K max output tokens
  • Configured the Canopy Wave provider endpoint with fp8 quantization and $0.27/$0.41 per 1M token pricing (see the sketches below)
  • Updated test snapshots to reflect the new model (the fourth DeepSeek V3 variant, bringing Canopy Wave's total to four models)
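
For reviewers unfamiliar with the registry layout, the model.ts entry looks roughly like the sketch below, reconstructed from the diff hunks quoted in the review comments further down; the object key and anything not visible in those hunks are assumptions rather than the actual registry schema.

    // packages/cost/models/authors/deepseek/deepseek-v3/model.ts (reconstructed excerpt; key name assumed)
    "deepseek-v3.2-speciale": {
      author: "deepseek",
      description:
        "DeepSeek 3.2 Speciale is an advanced AI model engineered for superior reasoning and problem-solving, " +
        "particularly in technical domains like coding and mathematics. This specialized version offers greater " +
        "accuracy and nuanced understanding for complex tasks while remaining versatile for general conversation.",
      contextLength: 163_840,              // questioned in review: public sources list 131_072
      maxOutputTokens: 65_536,             // questioned in review: reviewer suggests 128_000
      created: "2025-09-22T00:00:00.000Z", // questioned in review: release date is December 1st, 2025
    },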

Confidence Score: 5/5

  • Safe to merge: a clean provider integration that follows established patterns
  • The changes follow the same pattern as previous Canopy Wave integrations; configuration values are consistent with similar models, test snapshots are properly updated, and there are no breaking changes or logic issues
  • No files require special attention

Important Files Changed

File Analysis

Filename | Score | Overview
packages/cost/models/authors/deepseek/deepseek-v3/model.ts | 5/5 | Added deepseek-v3.2-speciale model configuration with 163K context and 65K max tokens
packages/cost/models/authors/deepseek/deepseek-v3/endpoints.ts | 5/5 | Added Canopy Wave provider endpoint for deepseek-v3.2-speciale with fp8 quantization and standard pricing
packages/tests/cost/snapshots/registrySnapshots.test.ts.snap | 5/5 | Updated test snapshots to reflect new model and endpoint configuration
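
The endpoints.ts change itself is not quoted anywhere in this review, so the following is only an illustrative sketch assembled from the values in the summary above; the object key, field names, and pricing units are assumptions and may not match Helicone's actual endpoint schema.

    // packages/cost/models/authors/deepseek/deepseek-v3/endpoints.ts (illustrative sketch; names assumed)
    "deepseek-v3.2-speciale:canopywave": {
      provider: "canopywave",
      providerModelId: "deepseek-v3.2-speciale", // assumed upstream model id
      quantization: "fp8",
      pricing: {
        prompt: 0.27,     // USD per 1M input tokens, per the summary above
        completion: 0.41, // USD per 1M output tokens, per the summary above
      },
    },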

Sequence Diagram

sequenceDiagram
    participant User
    participant Registry
    participant ModelConfig
    participant EndpointConfig
    participant CanopyWave

    User->>Registry: Request deepseek-v3.2-speciale
    Registry->>ModelConfig: Lookup model definition
    ModelConfig-->>Registry: Returns model specs (163K context, 65K max tokens)
    Registry->>EndpointConfig: Lookup provider endpoint
    EndpointConfig-->>Registry: Returns Canopy Wave config (fp8, pricing)
    Registry->>CanopyWave: Route request to inference.canopywave.io
    CanopyWave-->>User: Return model response
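
To make the routing above concrete, a request for this model through an OpenAI-compatible gateway might look like the following. Only the model id comes from this PR; the gateway URL, auth header, and prompt are placeholders, not values taken from Helicone's configuration.

    // Hypothetical client call; the URL and API key are placeholders.
    async function main() {
      const response = await fetch("http://localhost:8080/v1/chat/completions", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer <your-api-key>", // placeholder
        },
        body: JSON.stringify({
          model: "deepseek-v3.2-speciale", // resolved via the registry to the Canopy Wave endpoint
          messages: [{ role: "user", content: "Explain fp8 quantization in one paragraph." }],
        }),
      });
      console.log(await response.json());
    }

    main();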


@greptile-apps greptile-apps bot left a comment


2 files reviewed, no comments


"DeepSeek 3.2 Speciale is an advanced AI model engineered for superior reasoning and problem-solving, particularly in technical domains like coding and mathematics. This specialized version offers greater accuracy and nuanced understanding for complex tasks while remaining versatile for general conversation.",
contextLength: 163_840,
maxOutputTokens: 65_536,
created: "2025-09-22T00:00:00.000Z",
Collaborator

The release date is December 1st, 2025.

author: "deepseek",
description:
"DeepSeek 3.2 Speciale is an advanced AI model engineered for superior reasoning and problem-solving, particularly in technical domains like coding and mathematics. This specialized version offers greater accuracy and nuanced understanding for complex tasks while remaining versatile for general conversation.",
contextLength: 163_840,
Collaborator

Can you double-check this context length? Online I'm seeing a context length of 131_072.

    description:
      "DeepSeek 3.2 Speciale is an advanced AI model engineered for superior reasoning and problem-solving, particularly in technical domains like coding and mathematics. This specialized version offers greater accuracy and nuanced understanding for complex tasks while remaining versatile for general conversation.",
    contextLength: 163_840,
    maxOutputTokens: 65_536,
Collaborator

And the maxOutputTokens above looks like 128_000.

