Mobile Style Transfer With Image-to-Image Translation

Abdulkader Helwan
4 min read · Jan 11, 2024

In this article, we discuss the concepts behind conditional generative adversarial networks (cGANs) and give a brief overview of image-to-image translation and generative adversarial learning.

This is part of a series of articles discussing image-to-image translation using CycleGAN. Find the next article here.

Introduction

In this series of articles, we’ll present a Mobile Image-to-Image Translation system based on a Cycle-Consistent Adversarial Network (CycleGAN). We’ll build a CycleGAN that can perform unpaired image-to-image translation, and we’ll show you some entertaining yet academically deep examples.

In this project, we’ll use:

We assume that you are familiar with the concepts of Deep Learning as well as with Jupyter Notebooks and TensorFlow. You are welcome to download the project code.

Image-to-Image Translation

Style transfer builds on image-to-image translation, a technique that maps images from a source domain A to a target domain B. What does that mean, exactly? To put it succinctly, image-to-image translation lets us take properties from one image and apply them to another.
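To make the domain-to-domain idea concrete, here is a minimal sketch of the mechanics behind unpaired translation with cycle consistency, assuming TensorFlow 2.x. The tiny generators and random "images" are placeholders for illustration only, not the networks we build later in this series.

```python
import tensorflow as tf

def make_generator(name):
    # A toy encoder-decoder that maps a 128x128 RGB image to another 128x128 RGB image.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(128, 128, 3)),
        tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
        tf.keras.layers.Conv2D(3, 3, padding="same", activation="tanh"),
    ], name=name)

G = make_generator("G_A_to_B")   # translates domain A -> domain B
F = make_generator("F_B_to_A")   # translates domain B -> domain A

# Unpaired training: we only need *some* images from A and *some* from B,
# not matching pairs. Cycle consistency ties the two mappings together:
# F(G(a)) should reconstruct a, and G(F(b)) should reconstruct b.
real_a = tf.random.uniform((1, 128, 128, 3))   # stand-in for an image from domain A
real_b = tf.random.uniform((1, 128, 128, 3))   # stand-in for an image from domain B

fake_b = G(real_a)       # A -> B
cycled_a = F(fake_b)     # back to A
fake_a = F(real_b)       # B -> A
cycled_b = G(fake_a)     # back to B

cycle_loss = tf.reduce_mean(tf.abs(real_a - cycled_a)) + \
             tf.reduce_mean(tf.abs(real_b - cycled_b))
print("cycle-consistency loss:", float(cycle_loss))
```

In the articles that follow, this cycle-consistency term is what lets the CycleGAN learn a mapping between two domains without ever seeing paired examples.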
