
github-actions[bot] edited this page May 25, 2023 · 21 revisions

# distilbert-base-cased

## Overview

Description: DistilBERT is a smaller, faster Transformer-based language model distilled from BERT, with 40% fewer parameters and 60% faster run time while retaining 95% of BERT's performance on the GLUE language understanding benchmark. This English-language model achieves an F1 score of 87.1 on SQuAD v1.1 for question answering and was developed by Hugging Face under the Apache 2.0 license. Training the model requires significant computational power: for example, 90 hours on 8 V100 GPUs (16 GB each). Intended uses include fine-tuning on downstream tasks, but the model should not be used to create hostile or alienating environments, and its limitations and biases should be taken into account.
Please note: this model accepts masks in `[MASK]` format. See the sample input below for reference.

> The above summary was generated using ChatGPT. Review the original model card to understand the data used to train the model, evaluation metrics, license, intended uses, limitations and bias before using the model.

### Inference samples

Inference type|Python sample (Notebook)|CLI with YAML
--|--|--
Real time|fill-mask-online-endpoint.ipynb|fill-mask-online-endpoint.sh
Batch|fill-mask-batch-endpoint.ipynb|coming soon

### Finetuning samples

Task|Use case|Dataset|Python sample (Notebook)|CLI with YAML
--|--|--|--|--
Text Classification|Emotion Detection|Emotion|emotion-detection.ipynb|emotion-detection.sh
Token Classification|Named Entity Recognition|Conll2003|named-entity-recognition.ipynb|named-entity-recognition.sh
Question Answering|Extractive Q&A|SQUAD (Wikipedia)|extractive-qa.ipynb|extractive-qa.sh

### Model Evaluation

Task|Use case|Dataset|Python sample (Notebook)|CLI with YAML
--|--|--|--|--
Fill Mask|Fill Mask|rcds/wikipedia-for-mask-filling|evaluate-model-fill-mask.ipynb|evaluate-model-fill-mask.yml

### Sample inputs and outputs (for real-time inference)

#### Sample input

```json
{
  "inputs": {
    "input_string": ["Paris is the [MASK] of France.", "Today is a [MASK] day!"]
  }
}
```

#### Sample output

```json
[
  { "0": "capital" },
  { "0": "beautiful" }
]
```
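The request and response schemas above can be exercised locally before wiring up an endpoint call. Below is a minimal sketch; the helper names (`build_fill_mask_request`, `top_predictions`) are illustrative, not part of any Azure ML SDK, and the only assumption is the JSON shape shown in the sample input and output.

```python
import json

# Build a request body in the schema shown under "Sample input".
# Illustrative helper, not an Azure ML API.
def build_fill_mask_request(sentences):
    return json.dumps({"inputs": {"input_string": sentences}})

# Parse a response in the schema shown under "Sample output":
# a list of {"0": top_prediction} objects, one per input sentence.
def top_predictions(response_body):
    return [item["0"] for item in json.loads(response_body)]

body = build_fill_mask_request(
    ["Paris is the [MASK] of France.", "Today is a [MASK] day!"]
)
print(body)

sample_response = '[{"0": "capital"}, {"0": "beautiful"}]'
print(top_predictions(sample_response))  # ['capital', 'beautiful']
```

The same body can be saved to a file and passed to the endpoint invocation samples listed in the tables above.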

Version: 5

## Tags

Preview · license: apache-2.0 · task: fill-mask

View in Studio: https://ml.azure.com/registries/azureml/models/distilbert-base-cased/version/5

License: apache-2.0

## Properties

SHA: 4dc145c5bd4fdb672dcded7fdc1efd6c2bc55992

datasets: bookcorpus, wikipedia

evaluation-min-sku-spec: 2|0|7|14

evaluation-recommended-sku: Standard_DS2_v2

finetune-min-sku-spec: 4|1|28|176

finetune-recommended-sku: Standard_NC24rs_v3

finetuning-tasks: text-classification, token-classification, question-answering

inference-min-sku-spec: 2|0|7|14

inference-recommended-sku: Standard_DS2_v2

languages: en
