CIFAR-10 and CIFAR-100 models: LeNet-5 and more
We created multiple variations of the LeNet-5 model, for example by changing the activation function, adding dropout, or adding an extra convolutional layer.
This repository contains the base implementation of the LeNet-5 architecture adapted for CIFAR-10. The model consists of two convolutional layers followed by three fully connected layers, with ReLU activations and Kaiming uniform initialization. It was trained with a batch size of 32 using the Adam optimizer (learning rate 0.001) and CrossEntropyLoss. In our experiments, this model achieved a test loss of 0.0539 and a top-1 accuracy of 58.52% on CIFAR-10.
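The architecture described above can be sketched in PyTorch as follows. The exact channel counts and hidden sizes are assumptions based on the classic LeNet-5 layout (6 and 16 conv filters; 120, 84, and 10 units in the fully connected layers), since the card only specifies the layer counts, activations, and initialization:

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """LeNet-5 variant for 3x32x32 CIFAR-10 inputs (layer sizes are assumptions)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 6, kernel_size=5),   # 3x32x32 -> 6x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                  # -> 6x14x14
            nn.Conv2d(6, 16, kernel_size=5),  # -> 16x10x10
            nn.ReLU(),
            nn.MaxPool2d(2),                  # -> 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Linear(16 * 5 * 5, 120),
            nn.ReLU(),
            nn.Linear(120, 84),
            nn.ReLU(),
            nn.Linear(84, num_classes),
        )
        # Kaiming uniform initialization, as stated in the card.
        for m in self.modules():
            if isinstance(m, (nn.Conv2d, nn.Linear)):
                nn.init.kaiming_uniform_(m.weight, nonlinearity="relu")
                nn.init.zeros_(m.bias)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Training setup from the card: Adam (lr=0.001) with CrossEntropyLoss.
model = LeNet5()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.CrossEntropyLoss()
```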
Load this model in PyTorch to fine-tune or evaluate on CIFAR-10 using your training and evaluation scripts.
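A minimal evaluation sketch is shown below. The stand-in model and synthetic batch are placeholders: in practice you would load the repository's checkpoint (e.g. via `model.load_state_dict(torch.load(...))`) and use torchvision's CIFAR-10 test split as the dataloader.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def evaluate(model, loader, device="cpu"):
    """Return (average test loss per sample, top-1 accuracy) over a dataloader."""
    model.eval()
    criterion = nn.CrossEntropyLoss()
    total_loss, correct, seen = 0.0, 0, 0
    with torch.no_grad():
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            logits = model(images)
            total_loss += criterion(logits, labels).item() * labels.size(0)
            correct += (logits.argmax(dim=1) == labels).sum().item()
            seen += labels.size(0)
    return total_loss / seen, correct / seen

# Placeholder model and data, so the sketch runs standalone; swap in the
# real checkpoint and a torchvision CIFAR-10 DataLoader for actual use.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
data = TensorDataset(torch.randn(64, 3, 32, 32), torch.randint(0, 10, (64,)))
loss, acc = evaluate(model, DataLoader(data, batch_size=32))
```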
Feel free to update this model card with further training details, benchmarks, or usage examples.