Showcasing Semi-Supervised Learning on CIFAR-10

Abdulkader Helwan
6 min read · Feb 23, 2024

In this article, we showcase semi-supervised learning with a convolutional neural network (CNN) trained using a contrastive loss. The article is purely technical: we give a brief introduction to semi-supervised learning (SSL) and then show how to build, train, and evaluate a CNN on the CIFAR-10 dataset.
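Before diving in, it may help to see what a contrastive loss looks like in isolation. The sketch below implements a classic pairwise contrastive loss (the margin-based formulation of Hadsell et al.); it is a minimal illustration on toy embeddings, not necessarily the exact loss variant used in the training code later. The function name and toy values are ours.

```python
import numpy as np

def contrastive_loss(z1, z2, same, margin=1.0):
    """Pairwise contrastive loss: pull similar pairs together,
    push dissimilar pairs apart up to a margin."""
    d = np.linalg.norm(z1 - z2)              # Euclidean distance between embeddings
    if same:
        return 0.5 * d ** 2                  # similar pair: penalize any distance
    return 0.5 * max(0.0, margin - d) ** 2   # dissimilar pair: penalize only if closer than margin

# Toy 2-D embeddings at distance 1.0
a = np.array([0.0, 0.0])
b = np.array([0.6, 0.8])

print(contrastive_loss(a, b, same=True))     # 0.5  (similar pair far apart is penalized)
print(contrastive_loss(a, b, same=False))    # 0.0  (dissimilar pair already beyond the margin)
```

The key intuition: the loss shapes the embedding space so that semantically similar images cluster together, which is what lets the unlabeled data contribute useful structure.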

What is Semi-Supervised Learning?

Semi-supervised learning is a fascinating machine learning approach that, as its name implies, acts as a bridge between two established methods: supervised learning and unsupervised learning. Unlike supervised learning, which relies solely on data with pre-defined labels, and unsupervised learning, which operates on unlabeled data without any predetermined categories, semi-supervised learning leverages the power of both. It combines a small amount of labeled data, providing valuable guidance, with a much larger pool of unlabeled data, offering additional information and insights. This combination allows the learning process to extract valuable knowledge from the unlabeled data, even without explicit labels, ultimately enriching the model’s understanding and potentially enhancing its performance.
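To make this concrete, the snippet below simulates the kind of labeled/unlabeled split a semi-supervised setup starts from. CIFAR-10 has 50,000 training images across 10 classes; here we pretend labels are available for only a small fraction (the exact fraction is our illustrative choice, not taken from the article) and treat the rest as unlabeled.

```python
import numpy as np

rng = np.random.default_rng(0)

n_train = 50_000                              # CIFAR-10 training set size
labels = rng.integers(0, 10, size=n_train)    # stand-in for the real CIFAR-10 labels

labeled_fraction = 0.08                       # illustrative: keep labels for ~8% of images
n_labeled = int(n_train * labeled_fraction)

indices = rng.permutation(n_train)
labeled_idx = indices[:n_labeled]             # small labeled pool: supplies supervision
unlabeled_idx = indices[n_labeled:]           # large unlabeled pool: supplies structure

print(len(labeled_idx), len(unlabeled_idx))   # 4000 46000
```

In training, the labeled pool drives a supervised objective while the unlabeled pool feeds an unsupervised (e.g. contrastive) objective, and the two are optimized jointly.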

Delving Deeper into Semi-Supervised Learning: Leveraging Unlabeled Data for Enhanced Learning

Building upon the understanding of semi-supervised learning as a bridge between…
