Publication Date

Spring 5-3-2026

Document Type

Student Project

First Advisor

Krislock, Nathan

Degree Name

B.S. (Bachelor of Science)

Department

Department of Mathematical Sciences

Abstract

Handwritten character recognition remains a challenging problem in machine learning due to the high variability of handwriting across individuals and the visual similarity between certain character classes. This project explores whether Singular Value Decomposition (SVD)-based dimensionality reduction can serve as an effective preprocessing step for a fully connected neural network trained on the EMNIST Balanced dataset, a 47-class benchmark of handwritten digits and letters. By projecting 784-dimensional pixel inputs onto the top 70 principal components, approximately 90% of the total variance is preserved while reducing input dimensionality by 91%. The resulting SVD-based model achieves approximately 94% test accuracy, outperforming a baseline Convolutional Neural Network (CNN) (∼85%) trained under identical conditions, despite CNNs being the standard approach for image classification tasks. Additional models, including a reduced SVD variant and a flattened input baseline, highlight a clear tradeoff between parameter count, input representation, and classification accuracy. Overall, the results show that mathematical preprocessing can effectively replace architectural complexity in this setting, and that strong performance does not always require the most complex model.
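The preprocessing step described above can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: it uses randomly generated data as a stand-in for the flattened 28×28 EMNIST images, and computes the top-70 projection with NumPy's thin SVD.

```python
import numpy as np

# Stand-in for flattened 28x28 EMNIST images (784 features per sample).
# A real pipeline would load the EMNIST Balanced training set here.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 784))

# Center the data, then take the thin SVD: X_c = U @ diag(S) @ Vt.
X_c = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_c, full_matrices=False)

k = 70  # number of principal components kept, as in the project
X_reduced = X_c @ Vt[:k].T  # project onto the top-k right singular vectors

# Fraction of total variance captured by the top k components
# (roughly 90% on the actual EMNIST data, per the abstract).
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(X_reduced.shape)  # -> (1000, 70)
```

The reduced matrix `X_reduced` would then serve as the input to the fully connected network in place of the raw 784-dimensional pixel vectors.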
