Showcasing Semi-Supervised Learning on CIFAR-10

Abdulkader Helwan
Feb 23, 2024

In this article, we showcase semi-supervised learning with a convolutional neural network (CNN) trained with a contrastive loss. The article is purely technical: we give a brief introduction to SSL and then show how to build, train, and evaluate a CNN on the CIFAR-10 dataset.

What is Semi-Supervised Learning?

Semi-supervised learning is a fascinating machine learning approach. This technique, as its name implies, acts as a bridge between two established methods: supervised learning and unsupervised learning. Unlike supervised learning, which relies solely on data with pre-defined labels, and unsupervised learning, which operates on unlabeled data without any predetermined categories, semi-supervised learning leverages the power of both. It combines a small amount of labeled data, providing valuable guidance, with a much larger pool of unlabeled data, offering additional information and insights. This strategic combination allows the learning process to extract valuable knowledge from the unlabeled data, even without explicit labels, ultimately enriching the model's understanding and potentially enhancing its performance.
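To make the "small labeled pool, large unlabeled pool" setup concrete, here is a minimal NumPy sketch of how such a split might look for a CIFAR-10-sized training set. The numbers (400 labeled examples per class) are an illustrative assumption, not a value taken from this article:

```python
import numpy as np

# Hypothetical illustration: split a CIFAR-10-sized training set
# (50,000 examples, 10 classes) into a small labeled pool and a
# large unlabeled pool, as semi-supervised setups typically do.
rng = np.random.default_rng(0)

n_total, n_labeled_per_class, n_classes = 50_000, 400, 10
labels = rng.integers(0, n_classes, size=n_total)  # stand-in labels

# sample a fixed number of labeled indices per class
labeled_idx = np.concatenate([
    rng.choice(np.flatnonzero(labels == c), n_labeled_per_class, replace=False)
    for c in range(n_classes)
])
# everything else is treated as unlabeled
unlabeled_idx = np.setdiff1d(np.arange(n_total), labeled_idx)

print(len(labeled_idx), len(unlabeled_idx))  # 4000 labeled, 46000 unlabeled
```

The model then sees labels only for the 4,000 indices in `labeled_idx`; the remaining 46,000 images contribute through an unsupervised objective.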

Delving Deeper into Semi-Supervised Learning: Leveraging Unlabeled Data for Enhanced Learning

Building upon the understanding of semi-supervised learning as a bridge between supervised and unsupervised approaches, let’s delve deeper into its core mechanism. At its heart, semi-supervised learning hinges on treating data points differently based on whether they possess labels or not.

Labeled Data: Guiding the Learning Process

For data points fortunate enough to have labels, the algorithm employs traditional supervised learning techniques. These labeled examples serve as valuable guides, enabling the model to adjust its internal parameters (weights) in a way that aligns with the desired classifications. This traditional approach ensures the model learns effectively from the explicitly provided information.
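As a sketch of that supervised part, the loss on labeled examples is typically a cross-entropy between the model's logits and the true classes. The NumPy implementation below is an illustrative stand-in, not the article's actual training code; in practice a deep learning framework computes this loss and its gradients for the CNN:

```python
import numpy as np

def cross_entropy(logits, targets):
    # softmax log-probabilities, with a max-shift for numerical stability
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # negative log-likelihood of the correct class, averaged over the batch
    return -log_probs[np.arange(len(targets)), targets].mean()

# toy batch: two examples, three classes (stand-in values)
logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 3.0, 0.2]])
targets = np.array([0, 1])
loss = cross_entropy(logits, targets)
```

Minimizing this loss pushes the model's weights toward the desired classifications, which is exactly the "guidance" the labeled examples provide.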

Unlabeled Data: Unlocking Hidden Potential

However, semi-supervised learning doesn’t stop there. It takes advantage of the vast amount of unlabeled data, which, while lacking explicit labels, still holds valuable information. For these unlabeled points…
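The article's title points to a contrastive loss as the way to learn from these unlabeled points. One common form is the NT-Xent loss used by SimCLR-style methods: two augmented views of the same image form a positive pair, and the other images in the batch serve as negatives, so no labels are needed. The NumPy sketch below is an assumption about that general recipe, not the article's exact implementation:

```python
import numpy as np

def nt_xent(z1, z2, temperature=0.5):
    # z1[i] and z2[i] are embeddings of two augmented views of image i
    z = np.concatenate([z1, z2], axis=0)               # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize
    sim = z @ z.T / temperature                        # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    n = len(z1)
    # index of each embedding's positive partner (the other view)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

rng = np.random.default_rng(0)
z1 = rng.normal(size=(8, 16))              # stand-in embeddings, view 1
z2 = z1 + 0.05 * rng.normal(size=(8, 16))  # view 2: slight perturbation
loss_aligned = nt_xent(z1, z2)
loss_random = nt_xent(z1, rng.normal(size=(8, 16)))
# aligned views incur a lower contrastive loss than unrelated pairs
```

Minimizing this loss pulls the two views of each unlabeled image together in embedding space while pushing different images apart, which is how the unlabeled pool shapes the representation without any labels.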
