11th Place Solution for the Kaggle competition "I'm Something of a Painter Myself"
This project implements a CycleGAN-based solution to transform conventional photographs into paintings that emulate Claude Monet's distinctive impressionist style. The system was developed for the Kaggle challenge "I'm Something of a Painter Myself" and achieved 11th place.
This repository contains the code for the final project of the course Deep Generative Neural Networks: Fundamentals and Problem Solving at the Facultad de Ingeniería of the Universidad de la República.
The implemented CycleGAN architecture enables style transfer without requiring paired images (photograph-painting), making it possible to learn transformations between two domains using only unpaired examples from each domain. Key features (a minimal sketch of the main building blocks follows this list):
- Generators based on ResNet with residual blocks to maintain structure while transferring style
- Discriminators using PatchGAN that classify whether portions of an image are real or generated
- Bidirectional training (photo→painting and painting→photo) to improve transformation consistency
- MiFID metrics for quantitative performance evaluation
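The snippet below is a minimal sketch of the two building blocks named above, assuming a PyTorch implementation; class names, layer choices, and hyperparameters are illustrative and do not necessarily match the code in src/models.

```python
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Residual block used in the ResNet-style generator: the skip connection
    preserves image structure while the convolutions re-style its content."""

    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x):
        return x + self.block(x)


class PatchDiscriminator(nn.Module):
    """PatchGAN discriminator: outputs a grid of real/fake scores, each
    covering a local patch of the input image instead of a single scalar."""

    def __init__(self, in_channels: int = 3, base: int = 64):
        super().__init__()
        widths = [in_channels, base, base * 2, base * 4, base * 8]
        layers = []
        for i in range(4):
            layers += [
                nn.Conv2d(widths[i], widths[i + 1], kernel_size=4,
                          stride=2 if i < 3 else 1, padding=1),
                nn.InstanceNorm2d(widths[i + 1]) if i > 0 else nn.Identity(),
                nn.LeakyReLU(0.2, inplace=True),
            ]
        layers.append(nn.Conv2d(base * 8, 1, kernel_size=4, stride=1, padding=1))
        self.model = nn.Sequential(*layers)

    def forward(self, x):
        return self.model(x)
```

On a 256×256 input this discriminator produces roughly a 30×30 grid of patch scores, which is what the PatchGAN bullet above refers to.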
To get started, clone the repository:

```bash
git clone https://github.com/juanpablosotelo/cyclegan-style-transfer.git
cd cyclegan-style-transfer
```

This project uses uv for dependency and virtual environment management, offering faster and more efficient package installation and management than traditional tools.
If you don't have uv installed yet, you can install it with:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Or with pip:
```bash
pip install uv
```

To create a virtual environment and install dependencies:
```bash
# Create a virtual environment
uv venv
source .venv/bin/activate # On Linux/macOS
# Install all project dependencies using uv sync
uv sync
# Alternatively, for development with additional tools:
uv sync --dev
```
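After `uv sync`, the `train` and `predict` commands used in the following sections become available as console scripts declared in pyproject.toml. The exact entry points are not reproduced here; assuming they point at `main` functions in src/train.py and src/predict.py, the declaration would look roughly like this:

```toml
# Hypothetical [project.scripts] entries -- the actual entry-point paths in
# this repository's pyproject.toml may differ.
[project.scripts]
train = "src.train:main"
predict = "src.predict:main"
```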
To train the model locally:

```bash
# Using the train script defined in pyproject.toml
train --config configs/local.yaml
# Or directly with the module
python -m src.train --config configs/local.yaml
```
To generate images with a trained model:

```bash
# Using the predict script defined in pyproject.toml
predict --config configs/local.yaml
# Or directly with the module
python -m src.predict --config configs/local.yaml
```
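Both entry points read the YAML file passed with --config: configs/local.yaml for local runs and configs/kaggle.yaml for Kaggle (used below). The actual schema is defined by src/config.py and is not reproduced here; the sketch below is purely illustrative of the kind of settings a CycleGAN run needs, and every field name is an assumption rather than the real configuration:

```yaml
# Hypothetical configuration sketch -- field names are illustrative only and
# do not necessarily match the schema expected by src/config.py.
data_dir: data/            # unpaired Monet paintings and photos
output_dir: results/       # checkpoints and generated images
image_size: 256
batch_size: 1
epochs: 30
lr: 2.0e-4                 # Adam learning rate commonly used for CycleGAN
lambda_cycle: 10.0         # weight of the cycle-consistency loss
lambda_identity: 0.5       # weight of the identity loss
```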
This project is optimized to run in Kaggle environments with GPUs. To train on Kaggle:

- Create a Kaggle notebook
- Clone this repository into the notebook
- Install the dependencies with uv
- Change into the repository directory with %cd cyclegan-style-transfer (in a notebook, !cd does not persist across cells)
- Use the provided Kaggle configuration:
!python -m src.train --config configs/kaggle.yaml
- Run the prediction script
!python -m src.predict --config configs/kaggle.yaml
The notebook cyclegan-style-transfer.ipynb contains the code to train and predict on Kaggle.
The repository is organized as follows:

```
.
├── configs/           # YAML configuration files
├── data/              # Training and validation data
├── results/           # Training results and predictions
├── src/               # Source code
│   ├── models/        # Models (Generator, Discriminator)
│   ├── config.py      # Configuration management
│   ├── dataset.py     # Data loading and preprocessing
│   ├── metrics.py     # Evaluation metrics
│   ├── predict.py     # Prediction generation
│   ├── train.py       # Training script
│   ├── trainer.py     # Training process logic
│   └── utils.py       # General utilities
└── pyproject.toml     # Project configuration and dependencies
```
This project is licensed under the MIT License. See the LICENSE file for details.
