Lilian Weng 11/10/2019

Self-Supervised Representation Learning

This technical article provides an in-depth overview of self-supervised representation learning: constructing supervised pretext tasks from unlabeled data so that a model learns useful feature representations, in contrast to fully supervised and purely unsupervised learning. The post covers motivations, key methods such as contrastive learning, and major models (BERT, MoCo, SimCLR, BYOL), with a focus on computer vision and NLP applications.
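As a concrete illustration of the contrastive idea mentioned above, here is a minimal sketch of an NT-Xent (InfoNCE-style) loss in PyTorch, in the spirit of SimCLR. The function name and the toy inputs are hypothetical, not from the original post: each image is augmented into two views, the matching view is the positive, and every other sample in the batch serves as a negative.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss (a sketch, in the spirit of SimCLR).

    z1, z2: (N, D) embeddings of two augmented views of the same N inputs.
    Positive pairs are (z1[i], z2[i]); all other batch samples are negatives.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit norm
    sim = z @ z.t() / temperature                       # (2N, 2N) cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # mask self-similarity
    # For row i, the positive is the other view of the same input:
    # rows 0..N-1 pair with N..2N-1, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage: random tensors standing in for encoder outputs.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())
```

Minimizing this loss pulls embeddings of the two views of the same input together while pushing apart embeddings of different inputs, which is the core mechanism behind methods like SimCLR and MoCo.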
