AI Research Wiki

Tag: pretraining

6 items with this tag.

  • BERT (Bidirectional Encoder Representations from Transformers), Apr 11, 2026
    Tags: architecture, encoder-only, pretraining, natural-language-understanding
  • Pretraining, Apr 11, 2026
    Tags: pretraining, language-modeling, self-supervised-learning
  • Transfer Learning, Apr 11, 2026
    Tags: transfer-learning, pretraining, nlp
  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Apr 11, 2026
    Tags: pretraining, bidirectional, masked-language-model, fine-tuning, transformer
  • Improving Language Understanding by Generative Pre-Training, Apr 11, 2026
    Tags: pretraining, fine-tuning, transfer-learning, language-model, transformer
  • Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, Apr 11, 2026
    Tags: transfer-learning, encoder-decoder, pretraining, nlp-benchmark
