Breakthroughs in AI
Contrastive learning
- SimCLR (A Simple Framework for Contrastive Learning of Visual Representations): a self-supervised method that pulls the representations of two augmented views of the same image together while pushing apart those of other images; see the NT-Xent sketch after this list.
- MoCo (Momentum Contrast for Unsupervised Visual Representation Learning): maintains a queue of negative keys encoded by a slowly updated momentum encoder, which decouples the number of negatives from the batch size; see the momentum-update sketch below.
- Why combine cross-entropy with softmax: the gradient of the combined loss with respect to the logits reduces to softmax(z) minus the one-hot target, which is cheap to compute and numerically stable; see the check below.
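
A minimal sketch of the NT-Xent loss at the core of SimCLR, assuming PyTorch and a batch of N images with two augmented views each; the function name `nt_xent_loss` and the default temperature are illustrative, not the reference implementation:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor,
                 temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: (N, D) projections of two augmented views of the same N images.
    """
    N = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit norm
    sim = z @ z.t() / temperature                       # (2N, 2N) similarities
    # Mask out self-similarity so a sample is never its own negative.
    sim.fill_diagonal_(float('-inf'))
    # For row i, the positive is the other augmented view of the same image.
    targets = torch.cat([torch.arange(N, 2 * N), torch.arange(0, N)])
    # Softmax + cross-entropy per row: the positive pair vs. all other pairs.
    return F.cross_entropy(sim, targets)
```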
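A sketch of MoCo's key ingredient, the momentum-updated key encoder, assuming PyTorch modules; `encoder_q`, `encoder_k`, and the momentum value are placeholders. In MoCo, keys produced by this slowly moving encoder are pushed into a fixed-size queue and reused as negatives for later batches:

```python
import torch

@torch.no_grad()
def momentum_update(encoder_q: torch.nn.Module,
                    encoder_k: torch.nn.Module,
                    m: float = 0.999) -> None:
    """Move the key encoder a small step toward the query encoder.

    With m close to 1 the key encoder changes slowly, so keys already
    sitting in the negative queue stay consistent with newly encoded ones.
    """
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.data.mul_(m).add_(p_q.data, alpha=1.0 - m)
```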
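To make the derivative claim concrete: for logits z and true class y, the softmax + cross-entropy loss has gradient dL/dz = softmax(z) - one_hot(y). A small NumPy check of that identity (the logit values here are arbitrary):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def loss(z, y):
    # Cross-entropy of softmax(z) against the one-hot label y.
    return -np.log(softmax(z)[y])

z = np.array([2.0, -1.0, 0.5])
y = 0

# Analytic gradient of the combined loss: softmax(z) - one_hot(y).
grad = softmax(z).copy()
grad[y] -= 1.0

# Numerical check by central differences.
eps = 1e-6
num = np.array([(loss(z + eps * np.eye(3)[i], y) -
                 loss(z - eps * np.eye(3)[i], y)) / (2 * eps)
                for i in range(3)])
print(np.allclose(grad, num, atol=1e-6))  # True
```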
XLNet vs. RoBERTa vs. BERT
- XLNet: Generalized Autoregressive Pretraining for Language Understanding. Avoids BERT's [MASK] token by maximizing the expected likelihood over permutations of the factorization order; see the sketch after this list.
- RoBERTa: A Robustly Optimized BERT Pretraining Approach. Keeps BERT's architecture but trains longer on more data, with dynamic masking and without the next-sentence-prediction objective.
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Masks a fraction of input tokens and predicts them from bidirectional context.
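
A toy illustration of the two pretraining objectives; the token list, masking rate, and print statements are made up for this example, and there is no model here, only the input/target construction (real BERT also uses an 80/10/10 mask/random/keep split):

```python
import random

tokens = ["the", "cat", "sat", "on", "the", "mat"]

# BERT-style masked LM: replace a fraction of tokens with [MASK] and
# predict the originals from the full bidirectional context.
masked = [t if random.random() > 0.15 else "[MASK]" for t in tokens]
print("BERT input:", masked)

# XLNet-style permutation LM: sample a factorization order and predict
# each token autoregressively from the tokens that precede it in that
# order, so no artificial [MASK] symbol appears during pretraining.
order = random.sample(range(len(tokens)), len(tokens))
for step, i in enumerate(order):
    context = [tokens[j] for j in order[:step]]
    print(f"XLNet step {step}: predict {tokens[i]!r} from {context}")
```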