BERT
Model name: bert_local
About BERT
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based model used for natural language processing tasks such as text classification, question answering, and sentence embedding. It is pre-trained on a large corpus of text data and can be adapted to a wide range of natural language processing tasks.
Read more about BERT on Wikipedia and at HuggingFace's SentenceTransformers page.
Supported aidb operations
- encode_text
- encode_text_batch
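As a hedged illustration (exact call signatures can vary between aidb versions), these operations are invoked as SQL functions that take the name of a previously created model; model creation itself is covered in the sections below:

```sql
-- Sketch: encode a single string with a model created from this provider.
-- The model name 'my_bert_model' is a placeholder defined later on this page.
SELECT aidb.encode_text('my_bert_model', 'The quick brown fox jumps over the lazy dog');

-- Sketch: encode several strings in one call, one embedding per input.
SELECT aidb.encode_text_batch('my_bert_model', ARRAY[
    'The quick brown fox',
    'jumps over the lazy dog'
]);
```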
Supported models
- sentence-transformers/all-MiniLM-L6-v2 (default)
- sentence-transformers/all-MiniLM-L6-v1
- sentence-transformers/all-MiniLM-L12-v1
- sentence-transformers/msmarco-bert-base-dot-v5
- sentence-transformers/multi-qa-MiniLM-L6-dot-v1
- sentence-transformers/paraphrase-TinyBERT-L6-v2
- sentence-transformers/all-distilroberta-v1
- sentence-transformers/multi-qa-MiniLM-L6-cos-v1
- sentence-transformers/paraphrase-multilingual-mpnet-base-v2
- sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
Creating the default model
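A minimal sketch, assuming the standard `aidb.create_model` call: omitting the `options` object creates the default `sentence-transformers/all-MiniLM-L6-v2` model.

```sql
-- Sketch: create a BERT model with the default settings.
-- 'my_bert_model' is an arbitrary name; 'bert_local' is the provider name from this page.
SELECT aidb.create_model('my_bert_model', 'bert_local');
```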
Creating a specific model
You can specify a model and revision in the `options` JSONB object. In this example, we are creating a `sentence-transformers/all-distilroberta-v1` model with the name `another_bert_model`:
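A hedged sketch of that call, assuming the same `aidb.create_model` signature as above; the `"revision": "main"` value is an illustrative assumption, not a documented default:

```sql
-- Sketch: create a named model from a specific sentence-transformers checkpoint.
-- The revision value 'main' is assumed for illustration; adjust as needed.
SELECT aidb.create_model(
    'another_bert_model',
    'bert_local',
    '{"model": "sentence-transformers/all-distilroberta-v1", "revision": "main"}'::JSONB
);
```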
Model configuration settings
The following configuration settings are available for BERT models:
- `model`: The BERT model to use. The default is `sentence-transformers/all-MiniLM-L6-v2`.
- `revision`: The revision of the model to use. The default is `refs/pr/64`. This is a reference to the model revision in the HuggingFace repository; it specifies which version of the model to use, in this case a branch.
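For reference, an `options` object that spells out both documented defaults explicitly is equivalent to creating the default model; this sketch again assumes the `aidb.create_model` signature used above:

```sql
-- Sketch: both configuration settings set to their documented defaults.
SELECT aidb.create_model(
    'default_bert_explicit',
    'bert_local',
    '{"model": "sentence-transformers/all-MiniLM-L6-v2", "revision": "refs/pr/64"}'::JSONB
);
```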
Model credentials
No credentials are required for the BERT models as they run locally.