Deep Learning

From Classic Datasets to Modern CNNs: Deep Learning Experiments in Vision and Classification
Deep learning lab series exploring neural networks from simple perceptrons on tabular data to convolutional architectures on CIFAR and MNIST image datasets. The work progresses from scikit-learn baselines to Keras/TensorFlow implementations, covering model design, training, evaluation, and comparison of architectures on increasingly complex tasks.

PYTHON - NUMPY - PANDAS - SCIKIT-LEARN - TENSORFLOW / KERAS - MATPLOTLIB / SEABORN - MNIST - LOGISTIC REGRESSION - MLPS - CNNS

Project Overview

On structured data, I implemented logistic regression and multilayer perceptrons for the Iris dataset, including one‑hot encoding, 75/25 train–test splits, and systematic comparison of training vs. test accuracy. I then reimplemented a similar perceptron in Keras, controlling the architecture (hidden units, activations) and loss/optimizer choices, and extended this design to MNIST by flattening 28×28 images, normalizing pixel values, and adapting the output layer to 10 digit classes. On images, I built a custom CNN for CIFAR‑10 with multiple convolutional–pooling blocks and dropout, achieving strong performance, and then implemented a VGG16‑inspired network plus a more efficient BatchNorm‑based CNN, training and evaluating each while comparing accuracy and overfitting behavior.
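As a minimal sketch of the structured-data baseline described above, the following trains logistic regression on Iris with a 75/25 train–test split and compares training and test accuracy. Hyperparameters (`max_iter`, `random_state`, stratification) are illustrative assumptions, not the notebook's exact configuration.

```python
# Illustrative baseline: logistic regression on Iris, 75/25 split.
# Exact settings (solver, seed) are assumptions for this sketch.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# Comparing train vs. test accuracy exposes over/underfitting.
print(f"train accuracy: {clf.score(X_train, y_train):.3f}")
print(f"test accuracy:  {clf.score(X_test, y_test):.3f}")
```

The same split and evaluation pattern carries over to the Keras perceptron and MNIST experiments, with the model swapped out.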

The objective of this deep learning work was to build an end‑to‑end understanding of neural networks by starting with classical classifiers (logistic regression, shallow MLPs) and gradually moving to deeper architectures tailored for images. The focus was on implementing and adapting architectures such as custom CNNs and VGG16‑like networks to CIFAR‑10/100 and MNIST, analyzing their training dynamics, generalization performance, and trade-offs between depth, capacity, and efficiency.
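To make the architectural side concrete, here is a hedged sketch of a compact BatchNorm-based CNN for CIFAR‑10 in the spirit of the networks compared above. Filter counts, dropout rate, and optimizer are illustrative assumptions, not the project's exact configuration.

```python
# Hypothetical compact CNN for CIFAR-10 with BatchNorm and dropout.
# Layer sizes are assumptions for illustration only.
from tensorflow import keras
from tensorflow.keras import layers

def build_batchnorm_cnn(input_shape=(32, 32, 3), num_classes=10):
    model = keras.Sequential([
        keras.Input(shape=input_shape),
        # Convolutional-pooling block 1
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        # Convolutional-pooling block 2
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        # Classifier head with dropout for regularization
        layers.Flatten(),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

model = build_batchnorm_cnn()
```

Swapping the two blocks for deeper stacks of 3×3 convolutions yields the VGG16-inspired variant; the BatchNorm layers are what let this smaller network train quickly and stably by comparison.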

The overall philosophy was to learn deep learning by implementation and comparison rather than treating architectures as black boxes. Each experiment is designed to isolate one idea—activation choice, depth, regularization, or dataset complexity—and observe its impact on accuracy, overfitting, and training time. By starting from simple baselines and iteratively scaling up to complex CNNs on challenging vision benchmarks, the work emphasizes reproducible notebooks, clear visualizations, and critical reflection on when added model complexity is actually justified.

SUCCESS RATE: 100%

SATISFACTION: 93%

Have a project?

Schedule a Call.

Let's talk!
Available for a new position from

4th August 2026