  1. On the Sentence Embeddings from Pre-trained Language Models

    Nov 2, 2020 · Pre-trained contextual representations like BERT have achieved great success in natural language processing. However, the sentence embeddings from the pre-trained …

  2. Extracting Sentence Embeddings from Pretrained Transformer Models

    Oct 2, 2024 · After providing a comprehensive review of existing sentence embedding extraction and refinement methods, we thoroughly test different combinations and our original extensions …

  3. On The Role of Pretrained Language Models in General-Purpose …

    Jul 28, 2025 · The study enabled us to identify the appropriate embedding model for each task, identify the main challenges faced by embedding models, and propose effective solutions to …

  4. Pretrained language models based on bidirectional encoders can be learned using a masked language model objective where a model is trained to guess the missing information from an …
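
    A minimal sketch of that masked-language-model objective, using the Hugging Face transformers fill-mask pipeline; the model name "bert-base-uncased" and the example sentence are illustrative assumptions, not details from the result above.

    ```python
    # Sketch: a bidirectional encoder guesses the token hidden behind [MASK],
    # i.e. the "guess the missing information" objective described above.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model="bert-base-uncased")  # assumed model

    # Each prediction is a candidate filler for [MASK] with a probability.
    for prediction in unmasker("Paris is the [MASK] of France."):
        print(prediction["token_str"], round(prediction["score"], 3))
    ```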

  5. On the Sentence Embeddings from Pre-trained Language Models

    2 days ago · However, the sentence embeddings from the pre-trained language models without fine-tuning have been found to poorly capture semantic meaning of sentences. In this paper, …

  6. Pretrained Models - Sentence Transformers documentation

    We provide various pre-trained Sentence Transformers models via our Sentence Transformers Hugging Face organization. Additionally, over 6,000 community Sentence Transformers …
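
    A minimal usage sketch for these pre-trained models; "all-MiniLM-L6-v2" is one assumed example from the Sentence Transformers organization.

    ```python
    # Sketch: load a pre-trained Sentence Transformers model and encode text.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed example model
    sentences = [
        "Pre-trained language models encode sentences.",
        "Sentence embeddings aim to capture semantic meaning.",
    ]
    embeddings = model.encode(sentences)
    print(embeddings.shape)  # (2, 384) for this particular model
    ```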

  7. Unlocking Sentence Embeddings in NLP - numberanalytics.com

    Jun 11, 2025 · Pre-trained language models like BERT provide a powerful means of generating sentence embeddings. By fine-tuning these models on specific NLP tasks, it's possible to …
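
    A hedged sketch of the fine-tuning idea in this result, using the classic sentence-transformers training loop; the base model, toy pairs, and hyperparameters are illustrative assumptions.

    ```python
    # Sketch: fine-tune a sentence-embedding model on labeled sentence pairs.
    from torch.utils.data import DataLoader
    from sentence_transformers import SentenceTransformer, InputExample, losses

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed base model

    # Toy pairs with similarity labels in [0, 1] (illustrative data).
    train_examples = [
        InputExample(texts=["A man is eating food.", "A man is eating a meal."], label=0.9),
        InputExample(texts=["A man is eating food.", "A plane is taking off."], label=0.1),
    ]
    train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
    train_loss = losses.CosineSimilarityLoss(model)

    # One pass over the toy data; real runs need far more pairs and epochs.
    model.fit(train_objectives=[(train_dataloader, train_loss)],
              epochs=1, warmup_steps=10)
    ```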

  8. On the Sentence Embeddings from Pre-Trained Language Models

    Do they carry too little semantic information, or just because the semantic meanings … language model pre-training objective and the semantic similarity task theoretically, and then analyze the …

  9. In this paper, we propose an efficient framework for probabilistic sentence embedding (Sen2Pro) from PLMs that represents a sentence as a probability density in an embedding space to …
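
    A speculative sketch of the probabilistic idea in this result: encode the same sentence repeatedly with dropout active and summarize the samples as a Gaussian. Whether this matches Sen2Pro's actual procedure is an assumption, and the model name is illustrative.

    ```python
    # Sketch: turn one sentence into a density (mean + variance) by sampling
    # several stochastic encodings from a PLM with dropout left on.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.train()  # keep dropout active so repeated encodings differ

    inputs = tokenizer("A sentence as a probability density.", return_tensors="pt")
    samples = []
    with torch.no_grad():
        for _ in range(10):
            hidden = model(**inputs).last_hidden_state     # (1, seq, 768)
            samples.append(hidden.mean(dim=1).squeeze(0))  # mean-pooled sample

    stack = torch.stack(samples)                   # (10, 768)
    mu, var = stack.mean(dim=0), stack.var(dim=0)  # Gaussian parameters
    print(mu.shape, var.shape)
    ```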

  10. Extracting Sentence Embeddings from Pretrained Transformer Models

    Background/introduction: Pre-trained transformer models shine in many natural language processing tasks and therefore are expected to bear the representation of the input sentence …
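
    A minimal sketch of one common way to extract the sentence representation this result refers to: mean pooling a pretrained transformer's last hidden states while ignoring padding tokens. The model name is an illustrative assumption.

    ```python
    # Sketch: extract sentence embeddings by mean-pooling token states.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed
    model = AutoModel.from_pretrained("bert-base-uncased")

    sentences = [
        "Pre-trained transformer models shine in many NLP tasks.",
        "Their token states can be pooled into one sentence vector.",
    ]
    inputs = tokenizer(sentences, padding=True, return_tensors="pt")

    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (batch, seq, 768)

    # Average only over real tokens, masking out padding positions.
    mask = inputs["attention_mask"].unsqueeze(-1).float()
    embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    print(embeddings.shape)  # (2, 768)
    ```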