
Update dependency @huggingface/transformers to v3.2.4 #102

Merged
merged 1 commit into main on Dec 28, 2024

Conversation

renovate[bot]
Contributor

@renovate renovate bot commented Dec 28, 2024

This PR contains the following updates:

Package: @huggingface/transformers
Change: 3.2.3 -> 3.2.4

Release Notes

huggingface/transformers.js (@huggingface/transformers)

v3.2.4


What's new?

  • Add support for visualizing self-attention heatmaps in https://github.com/huggingface/transformers.js/pull/1117

    [Image grid: the "cats" example image alongside self-attention heatmaps for attention heads 0 through 5]
    Example code
    import { AutoProcessor, AutoModelForImageClassification, interpolate_4d, RawImage } from "@huggingface/transformers";
    
    // Load model and processor
    const model_id = "onnx-community/dinov2-with-registers-small-with-attentions";
    const model = await AutoModelForImageClassification.from_pretrained(model_id);
    const processor = await AutoProcessor.from_pretrained(model_id);
    
    // Load image from URL
    const image = await RawImage.read("https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/cats.jpg");
    
    // Pre-process image
    const inputs = await processor(image);
    
    // Perform inference
    const { logits, attentions } = await model(inputs);
    
    // Get the predicted class
    const cls = logits[0].argmax().item();
    const label = model.config.id2label[cls];
    console.log(`Predicted class: ${label}`);
    
    // Set config values
    const patch_size = model.config.patch_size;
    const [height, width] = inputs.pixel_values.dims.slice(-2); // NCHW layout: last two dims are height, width
    const w_featmap = Math.floor(width / patch_size);
    const h_featmap = Math.floor(height / patch_size);
    const num_heads = model.config.num_attention_heads;
    const num_cls_tokens = 1;
    const num_register_tokens = model.config.num_register_tokens ?? 0;
    
    // Visualize attention maps
    const selected_attentions = attentions
        .at(-1) // we are only interested in the attention maps of the last layer
        .slice(0, null, 0, [num_cls_tokens + num_register_tokens, null])
        .view(num_heads, 1, h_featmap, w_featmap);
    
    const upscaled = await interpolate_4d(selected_attentions, {
        size: [height, width],
        mode: "nearest",
    });
    
    for (let i = 0; i < num_heads; ++i) {
        const head_attentions = upscaled[i];
        const minval = head_attentions.min().item();
        const maxval = head_attentions.max().item();
        const image = RawImage.fromTensor(
            head_attentions
                .sub_(minval)
                .div_(maxval - minval)
                .mul_(255)
                .to("uint8"),
        );
        await image.save(`attn-head-${i}.png`);
    }
  • Add min, max, argmin, argmax tensor ops for dim=null
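    A minimal sketch of these reductions: called without a dim argument (dim=null), each op now reduces over all elements of the tensor. The `new Tensor(type, data, dims)` constructor form below follows the library's public Tensor API; the sample values are illustrative.

    ```javascript
    import { Tensor } from "@huggingface/transformers";

    // A 2x4 tensor; with dim=null the ops reduce over all 8 elements
    const t = new Tensor("float32", [3, 1, 4, 1, 5, 9, 2, 6], [2, 4]);

    console.log(t.min().item());    // smallest element
    console.log(t.max().item());    // largest element
    console.log(t.argmin().item()); // flat index of the smallest element
    console.log(t.argmax().item()); // flat index of the largest element
    ```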

  • Add support for nearest-neighbour interpolation in interpolate_4d
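    The attention-heatmap example above already uses this mode; as a standalone sketch (assuming a rank-4 NCHW input, with `size` giving the output height and width, per the call shape in the example above):

    ```javascript
    import { Tensor, interpolate_4d } from "@huggingface/transformers";

    // Upscale a 1x1x2x2 tensor to 1x1x4x4; "nearest" repeats each source value
    // instead of blending neighbours as bilinear/bicubic modes do
    const input = new Tensor("float32", [1, 2, 3, 4], [1, 1, 2, 2]);
    const output = await interpolate_4d(input, { size: [4, 4], mode: "nearest" });
    console.log(output.dims); // [1, 1, 4, 4]
    ```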

  • Depth Estimation pipeline improvements (faster & returns resized depth map)
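    A hedged sketch of the pipeline usage; the model id is an assumption chosen for illustration, and the input image reuses the URL from the example above. Per this release, the returned `depth` is a RawImage resized to match the input image's dimensions.

    ```javascript
    import { pipeline } from "@huggingface/transformers";

    // NOTE: model id is an assumption for illustration
    const depth_estimator = await pipeline("depth-estimation", "onnx-community/depth-anything-v2-small");

    const url = "https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/cats.jpg";
    const { predicted_depth, depth } = await depth_estimator(url);

    // `depth` is a RawImage, resized to the input image's dimensions
    await depth.save("depth.png");
    ```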

  • TypeScript improvements by @ocavue and @shrirajh in https://github.com/huggingface/transformers.js/pull/1081 and https://github.com/huggingface/transformers.js/pull/1122

  • Remove unused imports from tokenizers.js by @pratapvardhan in https://github.com/huggingface/transformers.js/pull/1116


Full Changelog: huggingface/transformers.js@3.2.3...3.2.4


Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Enabled.

Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot enabled auto-merge (squash) December 28, 2024 14:02
@renovate renovate bot merged commit b1e03e7 into main Dec 28, 2024
1 check passed
@renovate renovate bot deleted the renovate/huggingface-transformers-3.x branch December 28, 2024 14:02
@ainoya-bot ainoya-bot bot mentioned this pull request Dec 28, 2024